Practical 5 - Elements of solution

This page provides elements of solution for the Texturing practical.

Exercise 1: Wrapping and filtering

The interpretation of these texturing parameters (wrapping and filtering modes) was discussed in class; contact us if you need more details.
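As a reminder, the wrapping mode decides how texture coordinates outside [0, 1] are mapped back into the image. The behaviour of the main OpenGL modes can be sketched in plain Python (a simplified model on normalized coordinates, ignoring texel centers):

```python
def repeat(u):
    """GL_REPEAT: keep only the fractional part of the coordinate."""
    return u % 1.0

def clamp_to_edge(u):
    """GL_CLAMP_TO_EDGE: coordinates are clamped to [0, 1]."""
    return min(max(u, 0.0), 1.0)

def mirrored_repeat(u):
    """GL_MIRRORED_REPEAT: the image is flipped on every other period."""
    period = int(u // 1)        # which tile the coordinate falls in
    frac = u % 1.0
    return 1.0 - frac if period % 2 else frac
```

For instance, with a coordinate of 1.25, GL_REPEAT samples at 0.25, GL_CLAMP_TO_EDGE at 1.0, and GL_MIRRORED_REPEAT at 0.75.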

Exercise 2: Load models with explicit texture coordinates

First take a look at the cube.obj file:

o cube
mtllib cube.mtl

v -0.500000 -0.500000 0.500000
v 0.500000 -0.500000 0.500000
v -0.500000 0.500000 0.500000
v 0.500000 0.500000 0.500000
v -0.500000 0.500000 -0.500000
v 0.500000 0.500000 -0.500000
v -0.500000 -0.500000 -0.500000
v 0.500000 -0.500000 -0.500000

vt 0.000000 0.000000
vt 1.000000 0.000000
vt 0.000000 1.000000
vt 1.000000 1.000000

vn 0.000000 0.000000 1.000000
vn 0.000000 1.000000 0.000000
vn 0.000000 0.000000 -1.000000
vn 0.000000 -1.000000 0.000000
vn 1.000000 0.000000 0.000000
vn -1.000000 0.000000 0.000000

g cube
usemtl cube
s 1
f 1/1/1 2/2/1 3/3/1
f 3/3/1 2/2/1 4/4/1

Eight vertex positions (v) are defined, as well as four vertex texture coordinates (vt) and six vertex normals (vn). Faces (f) are then defined using three indices per vertex, referencing the position, texture coordinate and normal, in that order (OBJ indices are 1-based; only the first two triangles are shown above). For instance, the second vertex of the second triangle is defined at (0.5, -0.5, 0.5), with the texture coordinate (1.0, 0.0) and the normal (0.0, 0.0, 1.0).
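To make the v/vt/vn indexing concrete, here is a minimal, hypothetical parser for the records above (it only handles the v, vt, vn and f lines of the excerpt, not the full OBJ format):

```python
def parse_obj(text):
    """Minimal OBJ reader: one (position, tex_coord, normal) triple per face corner."""
    positions, tex_coords, normals, vertices = [], [], [], []
    for line in text.splitlines():
        fields = line.split()
        if not fields:
            continue
        if fields[0] == 'v':
            positions.append(tuple(map(float, fields[1:4])))
        elif fields[0] == 'vt':
            tex_coords.append(tuple(map(float, fields[1:3])))
        elif fields[0] == 'vn':
            normals.append(tuple(map(float, fields[1:4])))
        elif fields[0] == 'f':
            for corner in fields[1:]:
                v, vt, vn = (int(i) - 1 for i in corner.split('/'))  # OBJ is 1-based
                vertices.append((positions[v], tex_coords[vt], normals[vn]))
    return vertices
```

Running it on the cube excerpt confirms the interpretation given above: the corner 2/2/1 of the second triangle resolves to position (0.5, -0.5, 0.5), texture coordinate (1.0, 0.0) and normal (0.0, 0.0, 1.0).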

Look at the load function: when normals and texture coordinates are defined in a mesh file, in addition to positions, all these data are automatically added to the attributes dictionary. The loader will thus send the following attributes:

in vec3 position;
in vec3 normal;
in vec2 tex_coord;

Make sure you modify your vertex shader to declare the tex_coord attribute provided by the loader.
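A minimal vertex shader declaring this attribute and forwarding it to the fragment stage could look like this (a sketch: the model, view and projection uniform names follow the previous practicals and may differ in your code):

```glsl
#version 330 core

uniform mat4 model, view, projection;

in vec3 position;
in vec3 normal;
in vec2 tex_coord;

out vec2 frag_tex_coords;

void main() {
    gl_Position = projection * view * model * vec4(position, 1.0);
    frag_tex_coords = tex_coord;  // interpolated across the triangle, per fragment
}
```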

Regarding the texture file, the image filename (here cube.png) is defined in the cube.mtl material file referenced by cube.obj.

Note that you can easily overload the texture image (with potentially artistic results when texture coordinates are not matched with the image…):

load(file, shader, tex_file="flowers.png")

Exercise 3: Phong and Texture

No solution for this exercise. The aim is to combine previous solutions for Phong illumination (see Practical 4 - Elements of solution) and what you have just done with texturing.

Tips would be:

  • write a new shader merging phong.[vert|frag] and texture.[vert|frag].

  • the illumination color and texture can simply be multiplied (an addition could induce saturation, i.e. color components exceeding 1.0 and thus being clipped at 1.0). Other options use mix, see the next exercise.

  • do not forget to add a light in your scene (yes, some of you did forget!)
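As a sketch of the second tip, the Phong result and the texture color can be multiplied component-wise in the fragment shader (here phong_color is an assumed vec3 holding the illumination computed as in the previous practical):

```glsl
vec4 tex_color = texture(diffuse_map, frag_tex_coords);
out_color = vec4(phong_color, 1.0) * tex_color;  // component-wise product
```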

Exercise 4: Multi-texturing

When loading a mesh file, texture coordinates are usually already defined. If you define your own object, you have to provide them yourself and add them to your attributes dictionary so that they are sent to the vertex shader. Also, set up two textures and send them as two uniforms to your shaders.

import numpy as np  # indices are stored as a numpy array

# Mesh, Texture and Textured come from the practicals' code base
class MultiTexturedPlane(Textured):
    """ Simple multi-textured object """
    def __init__(self, shader, tex_file1, tex_file2):
        # setup plane mesh to be textured
        base_coords = ((-1, -1, 0), (1, -1, 0), (1, 1, 0), (-1, 1, 0))
        indices = np.array((0, 1, 2, 0, 2, 3), np.uint32)
        tex_coords = ((0, 0), (1, 0), (1, 1), (0, 1))
        mesh = Mesh(shader, attributes=dict(position=base_coords, tex_coord=tex_coords), index=indices)

        # setup & upload two textures to GPU
        texture1 = Texture(tex_file1)
        texture2 = Texture(tex_file2)
        super().__init__(mesh, diffuse_map=texture1, second_texture=texture2)

Note that the texture coordinates are here shared by the two texture images; this is an example, not a rule.

Also note that here the texture images are exactly mapped to the square: the image corners are mapped onto the square vertices. Try different texture coordinates, and test different images to understand the texture coordinate system (where is up and down?).

In the fragment shader, one texture sampler is defined for each image. There are many ways to combine the two texture colors to compute the output color of the fragment:

  • test simple operators like addition (watch out for overflows) or multiplication

  • a common combination when one of the images contains transparent texels (as in flowers.png) is linear interpolation with the GLSL mix function. Find out its definition.

#version 330 core

uniform sampler2D diffuse_map;
uniform sampler2D second_texture;
in vec2 frag_tex_coords;
out vec4 out_color;

void main() {
    vec4 color1 = texture(diffuse_map, frag_tex_coords);
    vec4 color2 = texture(second_texture, frag_tex_coords);

    // out_color = color1 * color2;
    out_color = mix(color1, color2, color2.a);  // analyse what is done here!
}
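As a hint for the last line, GLSL's mix(x, y, a) computes the linear interpolation x * (1 - a) + y * a, component-wise. A plain Python sketch of what that shader line does per channel (the RGB values below are illustrative):

```python
def mix(x, y, a):
    """GLSL-style linear interpolation: returns x where a == 0, y where a == 1."""
    return tuple(xi * (1.0 - a) + yi * a for xi, yi in zip(x, y))

# Where color2's alpha is 0 (transparent texels, as in flowers.png),
# the first texture shows through untouched; where it is 1, color2 wins.
background = (0.2, 0.4, 0.6)   # color1
flower = (1.0, 0.0, 0.0)       # color2's RGB
```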