Saturday, May 16, 2009

OpenGL ES: Video texturing for augmented reality

In augmented reality (AR) applications, incoming video frames must be rendered as the background. On desktop PCs this is easy: we can simply call the glDrawPixels function. On mobile phones, however, glDrawPixels is no longer supported in the OpenGL ES specification, so we have to use a texture instead of sending pixels directly to the framebuffer.

So, the steps for video texturing are as follows:
  1. Create a texture whose width and height are powers of two (2^n).
  2. Copy pixel data from the current frame image.
  3. Update the texture data partially (depending on the video resolution).
  4. Render the texture on the screen.
In step 1, the texture must have 2^n width and height, since OpenGL ES does not support other sizes. The texture can be rectangular, but each dimension must be a power of two. If you use 320x240 video, the texture resolution therefore becomes 512x256. A rough sketch of this step follows.
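As a sketch of step 1 (the function and variable names are placeholders, not from my actual code, and I assume an RGBA texture and the iPhone OS header; include <GLES/gl.h> on other platforms), the 512x256 texture can be created once at startup:

#include <OpenGLES/ES1/gl.h>

static GLuint g_videoTexture = 0;

void createVideoTexture(void)
{
    glGenTextures(1, &g_videoTexture);
    glBindTexture(GL_TEXTURE_2D, g_videoTexture);

    /* Allocate a 512x256 power-of-two texture without uploading any pixels yet. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 256, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    /* Linear filtering; no mipmaps are generated, so the minification
       filter must not require them. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}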

The glTexSubImage2D function is used for step 3.
The problem is that updating the texture takes a lot of time, so the application can become too slow. This is critical in AR applications, since real-time video rendering is required, so the performance should be measured.
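As a minimal sketch of steps 2 and 3 (assuming the camera delivers tightly packed 320x240 RGBA pixels; framePixels and the function name are placeholders), the per-frame update becomes a single glTexSubImage2D call into the top-left corner of the texture:

/* Called once per frame with the pixels of the current camera image. */
void updateVideoTexture(const GLubyte *framePixels)
{
    glBindTexture(GL_TEXTURE_2D, g_videoTexture);
    glTexSubImage2D(GL_TEXTURE_2D, 0,
                    0, 0,        /* offset inside the 512x256 texture */
                    320, 240,    /* size of the region being updated  */
                    GL_RGBA, GL_UNSIGNED_BYTE, framePixels);
}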

Rendering the texture can be done in two ways. One is drawing a full-screen quad and applying the texture to it, and the other is using the glDrawTexiOES function, which is an OpenGL ES extension. After testing several times on a few mobile phones, both methods showed almost the same performance. Sketches of both approaches are given below.
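In these sketches, the texture coordinates 0.625 (= 320/512) and 0.9375 (= 240/256) select the video region of the texture; the orthographic projection, the screen size parameters, and any vertical flip of the image are assumptions that depend on your setup:

/* Method 1: full-screen quad. Assumes an orthographic projection that maps
   [0,320]x[0,240] to the screen; flip the t coordinates if the image appears
   upside down. */
void drawBackgroundQuad(void)
{
    static const GLfloat vertices[]  = { 0.0f, 0.0f,     320.0f, 0.0f,
                                         0.0f, 240.0f,   320.0f, 240.0f };
    static const GLfloat texCoords[] = { 0.0f, 0.0f,     0.625f, 0.0f,
                                         0.0f, 0.9375f,  0.625f, 0.9375f };

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, g_videoTexture);

    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, vertices);
    glTexCoordPointer(2, GL_FLOAT, 0, texCoords);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);

    glDisable(GL_TEXTURE_2D);
}

/* Method 2: GL_OES_draw_texture extension. Requires <OpenGLES/ES1/glext.h>
   (or <GLES/glext.h>) for the OES symbols. The crop rectangle selects the
   320x240 video region; screenWidth/screenHeight are placeholders. */
void drawBackgroundDrawTex(GLint screenWidth, GLint screenHeight)
{
    const GLint crop[4] = { 0, 0, 320, 240 };  /* x, y, width, height */

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, g_videoTexture);
    glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_CROP_RECT_OES, crop);
    glDrawTexiOES(0, 0, 0, screenWidth, screenHeight);
    glDisable(GL_TEXTURE_2D);
}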

6 comments:

  1. Hello,

    Currently I'm exploring how to create augmented reality on the iPhone using OpenGL ES. I wonder if you can send me your code so that I can try it myself... Thank you so much, and have a nice day.

    my email : d.nasura@gmail.com

  2. Hello Wonwoo,

    Google just showed me your blog :)
    Currently I'm trying the same thing you described (video rendering on mobile using OpenGL ES 1.1). I'm using OpenGL ES 1.1 and I tried the Vincent 3D OpenGL ES implementation; it's open source, but it's just a software renderer.

    Did you get it working? And which implementation of OpenGL ES did you use?

    I forgot to introduce myself: my name is Krishnan, living in India.

    My personal e-mail id : Krishnadevan at gmail.com

  3. Hello, KD. For me the method in my post works well.
    I tried the OpenGL ES implementations of iPhone OS and Rasteroid (a software renderer).

  4. Thank you. Sorry for the very late reply. I got it all working. I used RGB565, so there is less copying while updating the texture.

  5. Hi Wonwoo,

    What is your opinion about using the iPhone's Media Player framework + AVFoundation classes, instead of the OpenGL ES video texture method, to play video clips in an AR application?

    Thanks !!
    Deep.

  6. Hello Deep.

    Well, if possible I'd prefer to avoid using OpenGL ES for video texturing, because it affects the overall performance of an app (although it depends on the resolution of the video).

    Apple said in a WWDC 2010 video that they improved the performance of texture updates, but I didn't try to measure the improvement. Still, managing the video background in my OpenGL code is sometimes annoying.

    So, in my opinion, it might be better to use other methods for the video background instead of OpenGL ES, if you can.

    Please refer to my other post on video texturing without an OpenGL ES texture update:

    http://mobile-augmented-reality.blogspot.com/2009/09/simple-and-nice-way-to-video-background.html
