Friday, December 19, 2008

iPhone camera access - an alternative way with official SDK

iPhone developers have been looking for alternative ways to use the camera with the official SDK, since Apple does not expose direct access to the built-in camera and only provides an interface for capturing a still image. One approach is to customize UIImagePickerController to get a full-screen video preview.

There is a discussion at

A video showing the implementation : 

Building OpenCV on Windows Mobile 6

Open Computer Vision Library (OpenCV) is a computer vision and image-processing library widely used in the computer vision community. It supports Microsoft Windows, Apple Mac OS X, and Linux systems.

Recently, I needed to port my marker tracking module to the Windows Mobile 6 (WM6) platform, and thus I had to build OpenCV on WM6. I searched for articles on the web, but there are not many posts on this issue, so I decided to do it on my own.
Fortunately, OpenCV is written in C and is platform-independent except for highgui and cvcam. What I need are only the cv and cxcore libraries, which are pure C code.

To build OpenCV on Windows Mobile, we need to modify some of the code.

1. cxtypes.h

(1) cvRound( double value ): keep the code below and comment out the rest of the function, as follows.

  Cv64suf temp;
  temp.f = value + 6755399441055744.0;
  return (int)temp.u;
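For the curious, this is the classic "magic number" rounding trick: adding 2^52 + 2^51 = 6755399441055744 shifts the fractional bits out of the double's mantissa. A standalone sketch of the same idea, with Cv64suf approximated using fixed-width types (an assumption about how OpenCV declares it):

```c
#include <stdint.h>

/* Approximation of OpenCV's Cv64suf union with fixed-width types. */
typedef union {
    int64_t  i;
    uint64_t u;
    double   f;
} Cv64suf;

/* Round-to-nearest without floor()/ceil(): after adding the magic
   constant, the low 32 bits of the double's bit pattern hold the
   rounded integer. Valid for IEEE-754 doubles in round-to-nearest
   mode and |value| < 2^31. */
static int fast_round(double value)
{
    Cv64suf temp;
    temp.f = value + 6755399441055744.0;
    return (int)temp.u;
}
```

Note that, like cvRound, this rounds ties to even rather than away from zero.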

2. cxerror.cpp : 

An error occurs because TLS_OUT_OF_INDEXES is not defined. It is defined in the desktop version of windows.h, but not in Windows Mobile. So, I just borrowed the definition from the desktop version of windows.h.

// Before
#if defined WIN32 || defined WIN64
    static DWORD g_TlsIndex = TLS_OUT_OF_INDEXES;   // error occurs here
#else
    static pthread_key_t g_TlsIndex;
#endif

// After
#ifndef TLS_OUT_OF_INDEXES
#define TLS_OUT_OF_INDEXES ((DWORD)0xFFFFFFFF)     // from the desktop windows.h
#endif

#if defined WIN32 || defined WIN64
    static DWORD g_TlsIndex = TLS_OUT_OF_INDEXES;
#else
    static pthread_key_t g_TlsIndex;
#endif

3. cxswitcher.cpp 

Many errors occur in this file. Almost all of them seem to be related to IPP. I just commented out the bodies of the functions that caused errors, leaving them empty.
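For illustration, stubbing out an IPP-related function looks like the following. The function chosen here, cvUseOptimized, is one of the IPP switcher entry points, but which functions actually fail depends on your OpenCV version, so treat the exact name and body as an assumption:

```c
/* Illustrative stub: cvUseOptimized normally loads IPP-optimized
   function pointers. With IPP unavailable on Windows Mobile, an
   empty body that reports zero optimized functions keeps the
   linker happy. */
int cvUseOptimized(int on_off)
{
    (void)on_off;   /* unused: there is nothing to switch without IPP */
    return 0;       /* zero optimized functions loaded */
}
```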

4. cxmisc.h

Errors occur here because the build environment is not detected. Including malloc.h is enough to avoid the compile errors.


// Before
/* get alloca declaration */
#ifdef __GNUC__
    #undef alloca
    #define alloca __builtin_alloca
#elif defined WIN32 || defined WIN64
    #if defined _MSC_VER || defined __BORLANDC__
        #include "malloc.h"
    #endif
#elif defined HAVE_ALLOCA_H
    #include "alloca.h"
#elif defined HAVE_ALLOCA
    #include "stdlib.h"
#endif

// After
#include "malloc.h"

5. WinMain(..) function

There is a WinMain function in the OpenCV source code for building the library. Just comment it out; we do not need it any more since the Windows Mobile project already has one.

After you make the modifications above, create a Windows Mobile library project in Visual Studio and build OpenCV. I have not carefully examined what problems my changes might cause; however, there has been no problem so far in my case. Below is a screenshot of a simple OpenCV application running on the WM6 emulator. Of course, it works well on the device too.

Friday, September 26, 2008

First marker tracking example

I tested my marker tracker on iPod Touch. 
The current frame rate is about 6 fps. Marker detection and decoding are fast enough and do not reduce performance much.

The major bottlenecks are (1) converting the RGB image to a binary image and (2) updating the background texture every frame. As I posted previously, the biggest problem is the texture update. Still, there may be a way to improve the performance.

glTexSubImage2D is too slow on iPod Touch

I tested the performance of updating texture data every frame using glTexSubImage2D.
The video image is 320x240. Since OpenGL ES 1.x does not support non-power-of-two textures, I created a 512x512 texture and update only part of it.
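The padding size can be computed per dimension with a small helper (my own, not an OpenGL function):

```c
/* Smallest power of two >= x, e.g. for padding a 320x240 video
   frame into the power-of-two texture OpenGL ES 1.x requires. */
static unsigned next_pow2(unsigned x)
{
    unsigned p = 1;
    while (p < x)
        p <<= 1;
    return p;
}
```

Per dimension this gives 512x256 for a 320x240 frame; I simply used a square 512x512 texture.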

The texture is created as: 

glGenTextures(1, &bg_texture);
glBindTexture(GL_TEXTURE_2D, bg_texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
             tex_size, tex_size, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);  // allocate storage only


and updated in renderScene() function : 

glBindTexture(GL_TEXTURE_2D, bg_texture);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
                320, 240,
                GL_RGBA, GL_UNSIGNED_BYTE, image_data);  // image_data: the new 320x240 frame


However, the performance is very poor: calling glTexSubImage2D makes the program hardly able to run in real time. Some applications on mobile phones show more than 20 fps while rendering video, so I think the problem may be the texture format. There are several texture formats, such as GL_RGBA, GL_RGB, and GL_LUMINANCE, combined with types like GL_UNSIGNED_BYTE and GL_UNSIGNED_SHORT_5_6_5.

Maybe there is a format that the iPhone's hardware prefers.
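One candidate worth trying is GL_RGB with GL_UNSIGNED_SHORT_5_6_5, which halves the upload size versus 32-bit RGBA; whether the GPU actually prefers it is exactly what would need measuring. A packing helper (my own illustration, not part of OpenGL):

```c
#include <stdint.h>

/* Pack one 8-bit-per-channel RGB pixel into the
   GL_UNSIGNED_SHORT_5_6_5 layout (RRRRRGGG GGGBBBBB). */
static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}
```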

Monday, September 8, 2008

OpenCV on iPhone OS - 2

I ran the simple application on the iPod touch. 
I added three image-processing steps to an OpenGL ES sample application provided by Apple.

What the application does is : 
- Create OpenGL ES surface 
- Load a color image 
- In the rendering loop 
   1. Gray scale conversion from the original color image 
   2. Simple thresholding 
   3. Canny edge detection. 
- Display 4 resulting images with texture mapping. 
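The first two steps boil down to simple per-pixel arithmetic. A plain-C sketch of what the OpenCV calls compute (cvCvtColor for grayscale, cvThreshold with CV_THRESH_BINARY), ignoring OpenCV's image structures:

```c
#include <stddef.h>

/* Step 1: grayscale conversion using the standard luma weights
   (0.299 R + 0.587 G + 0.114 B), the same ones OpenCV applies. */
static void rgb_to_gray(const unsigned char *rgb, unsigned char *gray,
                        size_t num_pixels)
{
    for (size_t i = 0; i < num_pixels; ++i) {
        const unsigned char *p = rgb + 3 * i;
        gray[i] = (unsigned char)(0.299 * p[0] + 0.587 * p[1] + 0.114 * p[2]);
    }
}

/* Step 2: simple binary thresholding, as cvThreshold with
   CV_THRESH_BINARY would do. */
static void threshold_binary(unsigned char *gray, size_t num_pixels,
                             unsigned char thresh)
{
    for (size_t i = 0; i < num_pixels; ++i)
        gray[i] = (gray[i] > thresh) ? 255 : 0;
}
```

Step 3, Canny edge detection, is considerably more involved (gradients, non-maximum suppression, hysteresis), which is why cvCanny dominates the per-frame cost.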

Well, after adding the image processing with OpenCV functions, the OpenGL ES application became much slower than before. Unfortunately, I couldn't measure the exact frame rate of my application, but it feels like about 1 frame/sec.

Too slow huh ? 

Update on 09/26/2008: Well, the low frame rate is due to glTexSubImage2D, which I call every frame to render the result, and the size of the image (I used a 512x512 image).

Friday, September 5, 2008

OpenCV on iPhone OS - 1

Today, I tried to use OpenCV library on iPhone OS.
OpenCV is a famous computer vision library used by many people working on image processing and computer vision. It is written in C and is platform-independent (Windows, Linux, and Mac OS).

I included the two main libraries (cv and cxcore) in my project and wrote small image-processing code (thresholding, edge detection, and blur) using OpenCV functions. The project built successfully without errors.

I tried it on the simulator only; the performance on the real device still needs to be checked. I hope OpenCV works in real time on the iPhone/touch. ;-)

Tuesday, September 2, 2008

Running iPhone/Touch applications on the devices without the developer registration

Well, Apple blocks installing and running applications developed by anyone who has not registered for Apple's developer program. This may be one of their policies to control the market as they want. In addition, no application without a developer signature can be uploaded to the App Store.

For me, Apple's policy is quite odd, since I don't want to sell my application or let other people use it; I just want to use it for my research and for fun. Thus, I had to find some other way to do it.

Some smart people familiar with Mac OS X and Unix programming have already found a way to run applications without a developer signature. The steps are as follows.
  1. The iPhone/Touch should be jailbroken and something like OpenSSH should be installed. Cydia makes this easy.
  2. Write your own code and build it for the iPhone platform. (The build target architecture is armv6.)
  3. Apply the known tips for bypassing the code-signing checks.
  4. Copy your application to the iPhone/Touch's 'Applications' folder.
  5. If you don't see your application on the springboard, reboot your device.
  6. Now, do you see your application on the springboard? Tap it to run.

Hacking iPod touch

Today, I tried to hack an iPod touch with the QuickPwn tool released by the iPhone-Dev team. QuickPwn was much faster since it does not build a custom hacked firmware; the hacking took only 5 minutes. One thing I didn't like is that QuickPwn changes the boot logo from Apple's to their pineapple.

After hacking iPod touch, I installed Quake4iPhone from Cydia for testing and it works well. 

Thank you iPhone-Dev team ! 

Sunday, August 3, 2008

OpenGL ES Benchmark on Mobile Phones

There is a nice web site called GL Benchmark that provides benchmark results for many mobile devices. The website also provides tools for benchmarking.

According to the results, the iPhone's floating-point performance is much better than that of other devices such as the Nokia N95. This is a nice property, since augmented reality applications require a lot of floating-point computation for linear algebra and 3D rendering.

Maybe with the iPhone we can be freed from complex fixed-point calculation, which requires careful optimization of the program for fast performance.
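For context, "fixed-point calculation" means representing fractions in plain integers, e.g. the common 16.16 format, to avoid slow floating-point hardware (or its absence). A minimal sketch of what that looks like:

```c
#include <stdint.h>

typedef int32_t fixed16;                 /* 16.16 fixed-point value */

#define FIX_ONE (1 << 16)                /* the value 1.0 */

static fixed16 fix_from_int(int x)   { return (fixed16)(x << 16); }
static int     fix_to_int(fixed16 x) { return x >> 16; }

/* Multiply: a 64-bit intermediate keeps the full product before
   shifting back down to 16.16 scale. */
static fixed16 fix_mul(fixed16 a, fixed16 b)
{
    return (fixed16)(((int64_t)a * b) >> 16);
}
```

Every multiplication needs this widen-and-shift dance, and range/precision must be tracked by hand, which is exactly the kind of work good floating-point hardware lets you skip.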

iPhone result from

Tuesday, July 29, 2008

Oolong Engine for iPhone

Oolong is a game engine based on OpenGL ES for the iPhone.
It provides many useful things for developers, such as a math library, touch-screen support, accelerometer support, and text rendering. It can be integrated with the Bullet SDK (a physics library) for physics simulation.
It also supports sound using OpenAL.

Are you curious about the origin of the name Oolong ? It is the name of a famous Chinese tea.

More information is here : Oolong Engine

Thursday, July 24, 2008


AugmentedNav is a cool application using iPhone and AR techniques for car navigation. It seems that the frame rate of the iPhone camera is not so bad.

Augmented Reality Navigation from Robert Winters on Vimeo

Wednesday, July 23, 2008

Camera control on iPhone ?

Unfortunately, Apple does not provide a low-level API for camera control on the iPhone. Apple provides UIImagePickerController, but it does not allow low-level camera access either. To do what they want, some people found that there are private frameworks that are not publicly available.

The PhotoLibrary framework, one of these private frameworks, has a class called CameraController that provides an interface for camera control. Since it is not available in the iPhone SDK, anyone who wants to use it has to do some work on it.
I have to dig more into this topic....

iflickr, found on Google Code, is one example on the web that uses the CameraController class.

Tuesday, July 22, 2008

ARToolkit comes to iPhone !!

ARToolWorks demonstrated their ARToolKit implementation on the iPhone. However, judging from the video, the performance is quite low; it looks like it runs under 10 frames/sec. Maybe it will get faster in the near future.

Look at the video on Youtube:

Drawing primitives on OpenGL ES

Unlike OpenGL, OpenGL ES does not provide glBegin and glEnd for drawing primitives such as triangles and quads. Instead, one has to use vertex arrays.

First we have to specify the vertices in an array. 

// Four vertices of a simple rectangle
GLfloat vertices[] = {0,0, 1,0, 0,1, 1,1};
// Texture coordinates
GLfloat tex_coords[] = {0,0, 1,0, 0,1, 1,1};

// Set vertex pointer and texture coordinate pointer
glVertexPointer(2, GL_FLOAT, 0, vertices);
glTexCoordPointer(2, GL_FLOAT, 0, tex_coords);

glVertexPointer specifies the array of vertices: the 1st argument is the number of coordinates (dimensions) per vertex, the 2nd is the type of the coordinate values, the 3rd is the stride in bytes between two consecutive vertices, and the 4th is the pointer to the array of vertex coordinates. glTexCoordPointer works in the same manner.

// Then, enable the client state
glEnableClientState(GL_VERTEX_ARRAY) ;
glEnableClientState(GL_TEXTURE_COORD_ARRAY) ; 

In a drawing function, glDrawArrays(GLenum mode, GLint first, GLsizei count) renders the primitives defined in the vertex array. mode is the type of primitive, such as GL_TRIANGLES or GL_TRIANGLE_STRIP (note that OpenGL ES has no GL_QUADS); first specifies the starting index in the enabled arrays; count is the number of vertices to be rendered. Since we are going to render a rectangle as two triangles from 4 vertices, the call is:

glDrawArrays(GL_TRIANGLE_STRIP, 0, 4) ;

If there are two or more objects defined in different vertex arrays, we have to call glEnableClientState and glDisableClientState before and after setting the arrays and calling glDrawArrays.

Some tutorials about vertex arrays can be found at:

The 1st example on iPhone

Apple provides some sample code to show how to use OpenGL ES on the iPhone. I tried the first one, 'GLSprite', on the iPhone simulator. The sample code shows:

  1. How to initialize OpenGL ES view using UIView 
  2. How to read an image and create OpenGL ES texture. 
  3. How to render the scene. 

I changed the image and tried some OpenGL ES functions. The screenshot is on the right. 

Most of the sample code is understandable, but there is one thing I couldn't figure out. The example seems to render the OpenGL scene into a framebuffer, but I'm not sure this is the right way to use OpenGL ES on the iPhone... Or is there a reason to do it this way?

Well, it may be clearer after I learn how Objective-C framework works on iPhone.