Friday, July 31, 2009

Using PLCameraController on iPhone OS 3.0


Many people want to use the iPhone's camera directly instead of going through UIImagePickerController. So people have been using the PLCameraController class, which is one of the private framework classes. After the update to iPhone OS 3.0, however, the PLCameraController class was modified and the old method of getting the preview no longer works.

There is a nice thread that discusses how to use PLCameraController on iPhone OS 3.0. See it here.
I succeeded in displaying the camera preview on my iPhone by following the thread. The code works well on both the iPhone 3G and the iPhone 3GS. On the 3GS, the focus rectangle is automatically displayed, as shown in the captured image.

However, the raw image data, which AR developers are probably more interested in than the preview, is still not accessible.
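Just for reference, a minimal sketch of putting the preview view on screen looks like the code below. This is only a sketch: it assumes PLCameraController still responds to sharedInstance, previewView and startPreview (the same selectors I use in the video background post further down) and that the private framework containing the class is loaded; the thread above covers the details of what actually changed in 3.0. Since the class is private, it is looked up at runtime with NSClassFromString here instead of being linked against a header.

// Minimal sketch -- assumes PLCameraController still responds to these selectors on 3.0.
// The class is private, so look it up at runtime instead of importing a header.
Class camClass = NSClassFromString(@"PLCameraController");
id cam = [camClass performSelector:@selector(sharedInstance)];

UIView *previewView = [cam performSelector:@selector(previewView)];
[cam performSelector:@selector(startPreview)];

previewView.frame = CGRectMake(0, 0, 320, 480);
[window addSubview:previewView];   // 'window' is the application's UIWindow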




First impressions of the iPhone 3GS


Recently I got a chance to try the iPhone 3GS.

I launched the Camera application to see how the iPhone's camera module has been improved.
Well, I noticed no change in still image capture mode except auto focusing. The preview is still 15 fps and there is no great improvement in its quality.

In video recording mode, the frame rate of the preview increases to 30 fps (at least it looks like 30 fps). The video quality is not bad, and I think the preview quality is better in video recording mode.
The video resolution is 640x480 and the frame rate is 30 fps.

When I captured a video holding the iPhone vertically, the video's specification in QuickTime Player was 480x640 and it was played back vertically. But when I played the same file in MPlayer or VLC, the video was played horizontally and the resolution was 640x480 (see the screen capture images below; click them to enlarge).

QuickTime Player --->


MPlayer -->

Apple seems to embed the orientation information, which is available from the built-in accelerometer, in the video file. So the recorded video is originally 640x480, and QuickTime probably rotates it based on the orientation information. Apple already does this for still image capture.

In my opinion, auto focusing is not welcome in computer vision work, since it changes the focal length of the camera and breaks the fixed-focal-length assumption that is very common in computer vision papers. I hope there is a way to turn it off for augmented reality applications.

Hey, Apple. Why don't you open up the interface for video camera control? It would allow developers to make much more interesting applications...

Tuesday, July 28, 2009

Link error with sio2Init and sio2Shutdown when using SIO2 engine v1.4

The SIO2 engine has been updated to v1.4. Since I have been using it as a static library (see this post), I tried to build v1.4 the same way. The library built without any problem, but when I built my application I got link errors for the sio2Init and sio2Shutdown functions, which had never occurred with the previous versions.

Undefined symbols:
"_sio2Shutdown", referenced from:
templateShutdown() in template.o
"_sio2Init", referenced from:
-[EAGLView createFramebuffer] in EAGLView.o
ld: symbol(s) not found
collect2: ld returned 1 exit status
"_sio2Shutdown", referenced from:
templateShutdown() in template.o
"_sio2Init", referenced from:
-[EAGLView createFramebuffer] in EAGLView.o
ld: symbol(s) not found
collect2: ld returned 1 exit status

I asked about this problem on the forum, and the answer is that we need to link the pre-built static library 'libsio2_dev.a' or 'libsio2_sim.a', depending on the target platform. Those libraries contain the implementations of 'sio2Init' and 'sio2Shutdown'. Linking one of them solved my problem.
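For reference, the platform-dependent linking can also be expressed in an xcconfig file, something like the sketch below. The library search path is an assumption and depends on where the SIO2 libraries are placed in your project.

// Sketch of an .xcconfig fragment; adjust LIBRARY_SEARCH_PATHS to your SIO2 location.
LIBRARY_SEARCH_PATHS = $(SRCROOT)/sio2/lib
OTHER_LDFLAGS[sdk=iphoneos*] = -lsio2_dev
OTHER_LDFLAGS[sdk=iphonesimulator*] = -lsio2_sim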

Monday, July 27, 2009

A simple and nice way to get a video background without OpenGL ES texturing on the iPhone

When we make a mobile augmented reality application, a video background is required. As we all know, a video background requires tedious work. On desktop PCs we can do it simply by calling glDrawPixels, but on mobile phones we have to go through OpenGL ES textures, and uploading texture data is quite slow on most mobile phones since their GPUs are not fast enough to update textures in real time. In AR applications, the video background is just there to display what we see through the camera on the screen.

While trying to make the video background faster, I found a simple way to do it without OpenGL ES texturing. The idea is to use two views: one for the video background and another for OpenGL ES rendering.

The Cocoa API allows us to add multiple child views to a window object. What I did was add two views to the window: one is the 'previewView' of PLCameraController, and the other is the OpenGL ES view. The 'previewView' of PLCameraController is a subclass of UIView and it displays the video preview coming from the camera.

In the applicationDidFinishLaunching: method of your application's AppDelegate.m (or whatever it is called), add code something like the following. All it does is add the two views to the window.

// Get the shared camera controller (private class) and its preview view.
PLCameraController *cam = [PLCameraController sharedInstance];
UIView *cam_view = [cam previewView];
[cam startPreview];
cam_view.frame = CGRectMake(0, 0, 320, 480);

// Create the OpenGL ES view and make it non-opaque so the preview
// can show through wherever the GL scene is transparent.
glView = [[EAGLView alloc] initWithFrame:CGRectMake(0, 0, 320, 480)];
glView.opaque = NO;
glView.alpha = 1.0;

// Add the camera preview first, then the OpenGL ES view on top of it.
[window addSubview:cam_view];
[window addSubview:glView];

Then we need to do one more thing. When rendering the OpenGL scene, we need to clear it with a color whose alpha value is 0.0. By making the OpenGL ES view transparent at the beginning of the rendering process, we will see the live video preview in the background.
So, just change the alpha value of the clear color like this.

glClearColor(0.0f, 0.0f, 0.0f, 0.0f);   // fully transparent clear color
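For context, here is a sketch of where that call sits in the render method of the standard EAGLView template; the context / framebuffer variable names come from Apple's OpenGL ES application template, so adapt them to your own view.

// Sketch of the beginning of EAGLView's drawView method (names from the standard template).
- (void)drawView
{
    [EAGLContext setCurrentContext:context];
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);

    // Clear with zero alpha so the camera preview behind this view shows through.
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // ... draw the augmented scene here ...

    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}

Note that the EAGL drawable also needs a color format with an alpha channel, such as kEAGLColorFormatRGBA8, for the transparency to take effect.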

Run your application, and you will see live video behind your OpenGL scene, like the video below.




The pros and cons of this method are:
  • Pros: You don't need to manage textures or other OpenGL ES machinery for the video background. You don't even need to take care of updating the background. Useful for developers who just need a video background behind their OpenGL ES rendering.
  • Cons: The background video does not seem to be synchronized with the OpenGL ES rendering. I have not dug into this, but if you do image processing on the video (like object tracking), the background video may not be synchronized with your rendering result.
The important thing is the performance of this method as a video background for augmented reality applications. I tested my method with the Oolong Engine on the iPhone 3G (OS 2.2.1). The simple skeleton example of the Oolong engine runs at 60 fps as it is (before I added PLCameraController's previewView).

The rendering speed decreased to about 28 fps after I added the video background. The rendering became slower, and I expect it is due to rendering with full-screen transparency. Note that this fps is not the video stream's frame rate, which is just 15 fps; it is the OpenGL ES rendering speed. It is much slower than without the video background, but I think the performance is not bad for AR applications.
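For reference, a naive counter like the sketch below, dropped into the render loop, is enough to get such fps numbers. The variable names are hypothetical and not taken from the Oolong example.

// Naive fps counter for the render loop (hypothetical variable names).
static int    frameCount = 0;
static double lastTime   = 0.0;

frameCount++;
double now = CFAbsoluteTimeGetCurrent();
if (now - lastTime >= 1.0) {
    NSLog(@"OpenGL ES rendering: %d fps", frameCount);
    frameCount = 0;
    lastTime = now;
}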

I tested this method on the iPhone 3G with OS 2.2.1, but it may also work on the iPhone 3GS with OS 3.0 or higher.


Update (2009. 08. 11)

I tested the same code on the iPhone 3GS (OS version 3.0.1). The rendering speed improved to 41 fps, which is +13 fps compared to the old iPhone 3G. I think this is because of the better CPU and graphics chip. The preview is still 15 fps, which is the rate used for still image capturing. If we could get the preview used for video capturing, we would be able to use a 30 fps preview on the 3GS, but that interface is not known yet.