Tuesday, July 29, 2008

Oolong Engine for iPhone

Oolong is a game engine based on OpenGL ES for the iPhone.
It provides many useful features for developers, such as a math library, touch-screen support, accelerometer support, and text rendering. It can be integrated with the Bullet SDK, a physics library, for physics simulation.
It also supports sound using OpenAL.



Are you curious about the origin of the name Oolong? It is the name of a famous Chinese tea.

More information is here: Oolong Engine

Thursday, July 24, 2008

AugmentedNav

AugmentedNav is a cool application that uses the iPhone and AR techniques for car navigation. Judging from the video, the frame rate of the iPhone camera is not bad.




Augmented Reality Navigation from Robert Winters on Vimeo

Wednesday, July 23, 2008

Camera control on iPhone?

Unfortunately, Apple does not provide a low-level API for camera control on the iPhone. Apple provides UIImagePickerController, but it does not allow low-level access to the camera either. To do what they want, some people have found private frameworks that are not publicly available.

The PhotoLibrary framework, one of those private frameworks, has a class called CameraController that provides an interface for camera control. Since it is not available in the iPhone SDK, anyone who wants to use it has to do some work on it.
I have to dig more into this topic....

iflickr, found on Google Code, is one example on the web that uses the CameraController class (http://code.google.com/p/iflickr/).

Tuesday, July 22, 2008

ARToolkit comes to iPhone!!

ARToolWorks demonstrated their ARToolkit implementation on the iPhone. However, judging from the video, the performance is quite low; it looks like it runs under 10 frames/sec. Maybe it will get faster in the near future.

Look at the video on YouTube:
http://www.youtube.com/watch?v=5M-oAmBDcZk

Drawing primitives on OpenGL ES

Unlike OpenGL, OpenGL ES does not provide glBegin and glEnd for drawing primitives such as triangles and quads. Instead, one has to use vertex arrays.

First we have to specify the vertices in an array. 

// Four vertices of a simple rectangle
GLfloat vertices[] = {0,0, 1,0, 0,1, 1,1};
// Texture coordinates
GLfloat tex_coords[] = {0,0, 1,0, 0,1, 1,1};

// Set the vertex pointer and the texture coordinate pointer
glVertexPointer(2, GL_FLOAT, 0, vertices);
glTexCoordPointer(2, GL_FLOAT, 0, tex_coords);

glVertexPointer specifies the array of vertices. The 1st argument is the number of coordinates (dimensions) per vertex, and the 2nd is the type of the coordinate values. The 3rd argument is the stride in bytes between two consecutive vertices (0 means the data is tightly packed). The 4th is a pointer to the array of vertex coordinates. glTexCoordPointer works in the same manner.

// Then, enable the client state
glEnableClientState(GL_VERTEX_ARRAY) ;
glEnableClientState(GL_TEXTURE_COORD_ARRAY) ; 

In a drawing function, glDrawArrays(GLenum mode, GLint first, GLsizei count) renders the primitives defined in the vertex array. mode is the type of primitive, such as GL_TRIANGLES or GL_TRIANGLE_STRIP (note that OpenGL ES has no GL_QUADS), first specifies the starting index in the enabled arrays, and count is the number of vertices to be rendered. Since we are going to render a rectangle using two triangles and have 4 vertices, the function call will be:

glDrawArrays(GL_TRIANGLE_STRIP, 0, 4) ;

If there are two or more objects defined in different vertex arrays, we have to call glEnableClientState and glDisableClientState before and after setting the arrays and calling glDrawArrays.

Some tutorials about vertex arrays can be found at:
  • http://www.songho.ca/opengl/gl_vertexarray.html
  • http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=45 

The 1st example on iPhone


Apple provides some sample code to show how to use OpenGL ES on the iPhone. I tried the first one, 'GLSprite', on the iPhone simulator. The sample code shows:

  1. How to initialize an OpenGL ES view using UIView 
  2. How to read an image and create an OpenGL ES texture 
  3. How to render the scene 

I changed the image and tried some OpenGL ES functions. The screenshot is on the right. 

Most of the sample code is understandable, but there is one thing I couldn't figure out. The example code seems to use a framebuffer for rendering the OpenGL scene, but I'm not sure this is the right way of using OpenGL ES on the iPhone... Or is there some reason to do it this way?

Well, it may be clearer after I learn how Objective-C framework works on iPhone.