Saturday, May 16, 2009

OpenGL ES: Video texturing for augmented reality

In augmented reality (AR) applications, incoming video frames must be rendered as the background. On desktop PCs this is quite easy because we can just call the glDrawPixels function. On mobile phones, however, glDrawPixels is no longer part of the OpenGL ES specification, so we have to use a texture instead of sending pixels directly to the framebuffer.

So, what we have to do for video texturing is as follows:
  1. Create a texture with 2^n width and height.
  2. Copy pixel data from the current frame image.
  3. Update the texture data partially (depending on the video resolution).
  4. Render the texture on the screen.
In step 1, the texture must have 2^n width and height since OpenGL ES does not support other sizes. The texture can be rectangular, but the width and the height must each be a power of two.
If you use 320x240 video, the texture resolution becomes 512x256.
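The power-of-two texture size can be computed from the video resolution; here is a minimal sketch (`next_pow2` is just a helper name I made up):

```c
#include <assert.h>

/* Smallest power of two that is >= n. */
static unsigned next_pow2(unsigned n)
{
    unsigned p = 1;
    while (p < n)
        p <<= 1;
    return p;
}
```

For a 320x240 frame this gives a 512x256 texture, and the video then occupies texture coordinates up to 320/512 = 0.625 horizontally and 240/256 = 0.9375 vertically.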

The function glTexSubImage2D is used for step 3.
The problem is that updating the texture takes considerable time and can make the application too slow. This is critical in AR applications, where real-time video rendering is required, so the performance of this step should be measured.
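A sketch of steps 1 and 3, assuming the texture object `tex_id` already exists and `frame_pixels` points to the current 320x240 RGB frame (both names are placeholders):

```c
/* Step 1 (once): allocate a 512x256 RGB texture; the data pointer may be NULL. */
glBindTexture(GL_TEXTURE_2D, tex_id);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 512, 256, 0,
             GL_RGB, GL_UNSIGNED_BYTE, NULL);

/* Step 3 (every frame): overwrite only the 320x240 region used by the video. */
glBindTexture(GL_TEXTURE_2D, tex_id);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 320, 240,
                GL_RGB, GL_UNSIGNED_BYTE, frame_pixels);
```

Updating only the sub-region keeps the per-frame cost proportional to the video size rather than the full texture size.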

Rendering the texture can be done in two ways. One is rendering a full-screen quad and applying the texture to it; the other is using the glDrawTexiOES function, which comes from an OpenGL ES extension. In my tests on a few mobile phones, both methods showed almost the same performance.
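For the glDrawTexiOES path, a crop rectangle selects the 320x240 video region inside the 512x256 texture. A sketch, assuming the GL_OES_draw_texture extension is available (`tex_id`, `screen_w`, and `screen_h` are placeholders):

```c
/* Crop rect {x, y, w, h} in texels: draw only the video region of the texture. */
GLint crop[4] = { 0, 0, 320, 240 };
glBindTexture(GL_TEXTURE_2D, tex_id);
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_CROP_RECT_OES, crop);

/* Draw at window position (0, 0), scaled to fill the screen. */
glDrawTexiOES(0, 0, 0, screen_w, screen_h);
```

With the full-screen-quad method, the same effect is achieved by clamping the maximum texture coordinates to 320/512 and 240/256.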

Friday, May 15, 2009

iPhone: Using PLCameraController


The PLCameraController class controls the iPhone's built-in camera and belongs to one of Apple's private frameworks. If you want to get a preview of the incoming video stream from the camera, it is quite easy to do. Here are the steps.

1. Get the header of PLCameraController. It can be obtained by dumping the iPhone OS frameworks with class-dump-X, or you can find it online.

2. Add the PhotoLibrary framework from the SDK directory. With SDK 2.2.1, the framework is in /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS2.0.sdk/System/Library/PrivateFrameworks/PhotoLibrary.framework

3. Import the PLCameraController header to your application delegate. 

4. Add the following code to 'applicationDidFinishLaunching' 


- (void)applicationDidFinishLaunching:(UIApplication *)application {

    // Hide the status bar
    [[UIApplication sharedApplication] setStatusBarHidden:YES animated:NO];

    // Get the view for the preview and start the preview
    PLCameraController *cam = [PLCameraController sharedInstance];
    UIView *view = [cam previewView];
    [cam startPreview];

    // Add the preview view to the window
    [window addSubview:view];

    // Override point for customization after app launch
    [window makeKeyAndVisible];
}


Now run the application and you will see something like the screenshot below.


Basically, the resolution of the video coming from the camera is 304x400, which is a little odd. The preview video stream is resized in the preview view. If we do not set a new frame size, it is resized with a fixed aspect ratio, which leaves a white area at the bottom of the screen.


For a full-screen preview, set the frame size of the preview view like this:

view.frame = CGRectMake(0, 0, 320, 480);


Update: I ran several more tests, and it turns out that the preview is actually 304x400.



 

iPhone: How to remove status bar in my application

By default, the status bar is always displayed in an application.
Sometimes the status bar is obtrusive.
To remove it, add the following code to the application delegate.

- (void)applicationDidFinishLaunching:(UIApplication *)application {

    [[UIApplication sharedApplication] setStatusBarHidden:YES animated:NO];

    // Then do what you want...
}


Tuesday, May 5, 2009

How to save a UIImage to my iPhone or iPod touch

A UIImage can be saved as a JPEG or a PNG by using its NSData representation.
First, create an NSData object with the function 'UIImagePNGRepresentation' or 'UIImageJPEGRepresentation'. Then write the data with the writeToFile:atomically: method.

The file will be saved in the Documents directory of your application.
The Documents directory path can be obtained with the 'NSSearchPathForDirectoriesInDomains' function.

The whole code may look as follows.

// Make a UIImage
UIImage *myImage = [UIImage imageNamed:@"myfilename"];

// .... Do what you want ....

// Get the location of the Documents directory
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsPath = [paths objectAtIndex:0];
NSString *filepath = [documentsPath stringByAppendingPathComponent:@"test.png"];

// Save the image as a PNG
NSData *imageData = UIImagePNGRepresentation(myImage);
[imageData writeToFile:filepath atomically:YES];