Since revision 3220 in bzr, some plugins will not build anymore because there were not enough development resources to update them for the new batch-mode OpenGL API. This is a guide on how to port those old plugins over and get them building (and running) again.


Generally speaking, the compiz team prefers code that is functional on both the OpenGL and OpenGL|ES APIs. Since compositors are relatively simple, we can do this for the vast majority of our codebase. The way this is done is to use Barker and Frantzis' "OpenGL subset" approach, where only the subset of the API supported by both OpenGL and OpenGL|ES is used by the code (see Frantzis, A. and Barker, J., 2012, "A Subset Approach to Using OpenGL and OpenGL|ES", in OpenGL Insights). This means that when porting, one should have a preference for using the GLVertexBuffer object.

Immediate Mode Usage

Immediate mode usage is legal in desktop builds, but will not compile in OpenGL|ES builds. In addition, it means using more of the fixed function pipeline, which we are trying to eliminate (see below), and it is incompatible with shader objects. It's also a lot slower because of the overhead in the driver.

glBegin (GL_TRIANGLES);
glColor4f (1.0f, 1.0f, 1.0f, 1.0f);
glVertex3f (0.0f, 0.0f, 0.0f);
glVertex3f (1.0f, 0.0f, 0.0f);
glVertex3f (0.5f, 1.0f, 0.0f);
glEnd ();

Compiz provides a static GLVertexBuffer object with a pass-through vertex and fragment shader, using vertex buffer objects with GL_STREAM_DRAW as their storage type. These are optimized for the case where data is copied to the GPU on every frame. This is less efficient, however, than providing your own GLVertexBuffer with GL_STATIC_DRAW and pre-calculated vertices.

GLVertexBuffer *stream = GLVertexBuffer::streamingBuffer ();
stream->begin (GL_TRIANGLES);
stream->color4f (1.0f, 1.0f, 1.0f, 1.0f);

GLfloat vertices[] = {
    0.0f, 0.0f, 0.0f,
    1.0f, 0.0f, 0.0f,
    0.5f, 1.0f, 0.0f
};
stream->addVertices (3, vertices);
if (stream->end ())
    stream->render ();

There are a few things to notice about the new API:

  1. The API provides similarly named  ::begin ()  and  ::end ()  methods which mirror  glBegin ()  and  glEnd () . They are not quite the same as regular glBegin and glEnd, which cause OpenGL to enter an error state if a state change is issued between rendering commands, but they are necessary in order to use GLVertexBuffer. GLVertexBuffer is effectively a stateful object: either all of the vertex data has been saved into GPU vertex buffer storage, or we are still gathering data in the system memory buffers. Calling  ::render ()  before calling  ::end ()  will result in the last transferred geometry being rendered, which may be no geometry at all.

  2. It is preferable to assemble an array of vertex data and call  ::addVertices ()  once, rather than multiple times. It's also much easier to read.

  3. The client should check the return value of  ::end () : if it returns  false , then there is no geometry to be rendered, so calling  ::render ()  would result in a redundant call to  glDrawArrays () .

  4. Be careful with client states. Whenever using GLVertexBuffer, ensure that the ported code does not use  glEnableClientState ()  with either  GL_TEXTURE_COORD_ARRAY ,  GL_VERTEX_ARRAY ,  GL_COLOR_ARRAY ,  GL_INDEX_ARRAY  or  GL_NORMAL_ARRAY . As noted in the DRIVERS file in the source tree, using client states alongside vertex buffer objects, where we set the array data to NULL, is a fatal error in some drivers and undefined behaviour in others.

  5. Vertex data is three-part, and we specify the number of vertices we are adding from the array as array_size / 3 (e.g. 9 floats, 3 vertices).

Handling texture co-ordinates, colors, normals

Most code won't just be drawing white triangles on-screen, as that is not very interesting. Instead, it will probably be using immediate mode to draw textured or colored polygons.

If the client code is only using one color for a whole primitive, then you can get away with using  ::color4f ()  on GLVertexBuffer as indicated earlier. If not, then colors, normals and texture co-ordinates are all per-vertex attributes, so you need one of each per vertex. For example:

glActiveTexture (GL_TEXTURE0);
glBindTexture (GL_TEXTURE_2D, texture);
glBegin (GL_TRIANGLES);
glTexCoord2f (0.0f, 0.0f);
glColor4f (0.0f, 1.0f, 1.0f, 1.0f);
glVertex3f (0.0f, 0.0f, 0.0f);
glTexCoord2f (0.0f, 1.0f);
glColor4f (1.0f, 0.0f, 1.0f, 1.0f);
glVertex3f (1.0f, 0.0f, 0.0f);
glTexCoord2f (0.5f, 1.0f);
glVertex3f (0.5f, 1.0f, 0.0f);
glEnd ();

GLVertexBuffer *stream = GLVertexBuffer::streamingBuffer ();
stream->begin (GL_TRIANGLES);
GLfloat vertices[] = {
    0.0f, 0.0f, 0.0f,
    1.0f, 0.0f, 0.0f,
    0.5f, 1.0f, 0.0f
};

const GLushort maxColor = 0xffff;

GLushort colors[] = {
    0, maxColor, maxColor, maxColor,
    maxColor, 0, maxColor, maxColor,
    maxColor, 0, maxColor, maxColor
};

GLfloat texCoords[] = {
    0.0f, 0.0f,
    1.0f, 0.0f,
    0.5f, 1.0f
};
stream->addVertices (3, vertices);
stream->addTexCoords (0, 3, texCoords);
stream->addColors (3, colors);

if (stream->end ())
    stream->render ();

The API for texcoords and colors is very similar to the one for vertices. Color data is 4-part (RGBA), and texture co-ordinate data is 2-part (st). When adding texcoords, you also need to specify which sampler unit will be using them; this is zero-indexed, and will generally be zero.

Color data is stored as unsigned shorts, with 0xffff being the highest value and 0 the lowest.

You'll also notice that even though there were two calls to glColor4f, there are three specifications in the color data. Color data is per-vertex, so the color the third vertex inherited in immediate mode must be specified explicitly.

Changed compiz APIs

A few compiz APIs were changed or dropped.


GLWindow::Geometry removed

This class is now gone completely. It was redundant: it was basically a client-side buffer of vertex data, which GLVertexBuffer already provides.

Editing the vertex mesh

Plugins are now given direct access to a window's internal GLVertexBuffer.

GLWindow::Geometry *geometry = gWindow->geometry ();
geometry->vertices ...
geometry->vertexStride ...
geometry->vCount ...

replaced with

GLVertexBuffer *vb = gWindow->vertexBuffer ();
vb->getVertices ()...
vb->getVertexStride ()...
vb->countVertices ()...

Technically speaking, the fact that client code can directly edit the vertex mesh internals is not such a good thing, but the compiz team has left it this way for ease of porting old plugins. This API may change in the future, though client code that has already been ported to the new API will be updated if it does.

Generating the vertex mesh

Most plugins will just use glAddGeometry to generate a vertex mesh for a particular texture matrix and paint/clip region. All the client code needs to do here is clear and refresh the client-side vertex buffer held by  GLWindow .

gWindow->glAddGeometry (texMatrices, drawRegion, clipRegion, maxMesh, minMesh);
if (gWindow->geometry ().vCount)
    gWindow->glDrawTexture (texture, attrib, mask);

which now looks like:

GLVertexBuffer *vb = gWindow->vertexBuffer ();
vb->begin (); // defaults to GL_TRIANGLE_STRIP
gWindow->glAddGeometry (texMatrices, drawRegion, clipRegion, maxMesh, minMesh);
if (vb->end ())
    gWindow->glDrawTexture (texture, attrib, matrix, mask);

Things to note:

  1. Since glAddGeometry has a vertex clipping function, it is really important that vb->end () is checked: there will be paint passes where the specified geometry does not intersect the clip region, and drawing that geometry would result in overdraw in the backbuffer, which causes bleeding in the worst case and a redundant call to  glDrawArrays  in the best case.

  2. We now need to pass a transformation matrix to glDrawTexture.
  3. Checking  ::end ()  is effectively the same as checking whether any vertices were generated.

GLFragment::Attrib removed

GLFragment::Attrib was removed. It was used to specify shader programs (now replaced by the specification mechanism in GLVertexBuffer) as well as opacity, brightness and saturation. GLWindowPaintAttrib already handled the latter, so just replace instances of GLFragment::Attrib with GLWindowPaintAttrib.

ARB Assembly Program Usage

As  GL_ARB_fragment_program  is not part of the OpenGL core profile or OpenGL|ES, it was removed in favor of GLSL shaders. Unfortunately, that means that any plugin that uses  GLFragment  to generate GPU assembly shaders needs to have that part rewritten. That warrants another article as it is a much larger subject.

Fixed Function Usage

Some plugins will use the fixed function vertex and fragment transformation pipeline in order to do their work. This was removed in OpenGL|ES, and its use is discouraged in compiz. Generally speaking, all of the fixed-function pipeline can be replaced either with custom-made shaders or with client-side code. Here's a list of areas where fixed function usage can be replaced:

  1. Matrix functions:  glPushMatrix, glPopMatrix, glLoadMatrixf, glRotatef, glTranslatef, glScalef . These should all be replaced client side using the  GLMatrix  class, with the result passed to GLVertexBuffer's  ::render ()  function, where it will be used as a uniform in the shader. Plugins that change the  GL_PROJECTION  matrix will need to write their own vertex shader and use GLVertexBuffer with that.

  2. Texture co-ordinate generation ( glTexGenfv ): these can be replaced with a client-side generator.

  3. Texture environment and convolution filters ( glTexEnvi ,  glConvolutionFilter2D ) should be replaced with fragment shaders.

  4. Fixed function clip planes are not available in OpenGL|ES. Unfortunately, most methods of simulating clip planes are slow in vertex shaders, as there is no way to discard a vertex; instead, fragments must be discarded. Most plugins use the built-in software vertex clipping done in glAddGeometry, which is much faster. If one must have clip planes, then it is recommended to use stencil buffers instead: fill the stencil buffer with data by wrapping  ::glBufferStencil ()  in  GLScreen .

Primitive Specification

 GL_QUADS  and  GL_POLYGON  are not supported in OpenGL|ES, so their usage is not recommended. Generally speaking, if a plugin is just rendering a single-quad texture using either immediate mode or a single vertex array, then you should be able to get away with changing the primitive type to  GL_TRIANGLE_STRIP , unless two of the vertices cross each other. In that case, the vertices need to be rearranged so that they wind clockwise, with the final point opposite the second point.

In the case where multiple quads are rendered using a single buffer, you will either need to change the plugin's vertex generation function to emit triangles, or only support that codepath on desktop OpenGL.

Index buffer objects

While indexed rendering is supported in both OpenGL|ES and OpenGL, it is not yet supported by GLVertexBuffer, because its semantics depend on the primitive type. Even though it will be slower, you can simulate it when filling the vertex buffer:

std::vector<GLfloat> vertexData;
vertexData.reserve (nIndices * 3);

for (unsigned int i = 0; i < nIndices; ++i)
{
    unsigned short vertexBase = indices[i] * 3;
    vertexData.push_back (vertices[vertexBase]);
    vertexData.push_back (vertices[vertexBase + 1]);
    vertexData.push_back (vertices[vertexBase + 2]);
}

vertexBuffer->addVertices (nIndices, &vertexData[0]);

Development/zero-nine/GLESPorting (last edited 2012-12-26 14:27:46 by 58-7-107-230)