Please note that you read this post and try the code below at your own risk.
Recently, I found a workaround for grabbing raw image data from the iPhone's camera on OS 3.x.
As I previously posted, using PLCameraController does not provide a workable solution for AR, since the captured image contains the overlaid contents (a related post here). The problem had remained unsolved.
But a commenter suggested that using two windows can solve the problem (I don't know his/her name, because the comment was posted anonymously). The idea sounds odd at first, because we usually create only one window for an iPhone application, but other people on the Internet also mention two windows as a workaround. So I did some digging with the idea and eventually figured out how to do it.
As you may know, iPhone OS 4 may allow full camera access for developers. I'm using the iPhone for my research, so this method is useful until OS 4 arrives.
Here is how to do it.
First, we need two windows. Just add one more UIWindow to your application delegate; we will add PLCameraController's previewView to this previewWindow.
UIWindow *previewWindow ;
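For reference, the surrounding declarations in the application delegate might look like the sketch below. The ivar names follow this post; PLCameraController is a private class, so its interface has to come from a class-dumped header or at least a forward declaration.

```objc
// Sketch of the application delegate's ivars (names assumed from this post).
// PLCameraController is private: forward-declare it or use a dumped header.
@class PLCameraController ;

UIWindow *window ;            // main (transparent) window, holds the GL view
UIWindow *previewWindow ;     // second window, holds the camera previewView
EAGLView *glView ;            // transparent OpenGL view
PLCameraController *camController ;
```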
Then, go to the 'applicationDidFinishLaunching' method. Make the main window transparent, so that the view added to the previewWindow stays visible underneath it.
// Make the window background transparent
window.opaque = NO ;
window.backgroundColor = [UIColor clearColor] ;
Initialize the main view. In my case, the view is an OpenGL view, which I make transparent so the video background shows through. If you don't know why, please refer to my previous post about making a video background without texture mapping.
glView = [[EAGLView alloc]
    initWithFrame:CGRectMake(0, 0, video_width, video_height)] ;
glView.opaque = NO ;
glView.backgroundColor = [UIColor clearColor] ;
[window addSubview:glView] ;
Initialize PLCameraController and the previewWindow.
if(camController != nil)
[camController release] ;
// Initialization of PLCameraController
camController = ..... ;
previewWindow = [[UIWindow alloc] init] ;
previewWindow.autoresizesSubviews = YES ;
Add the previewView of PLCameraController to the previewWindow and make the previewWindow visible. The size of 'preview.frame' determines the size of the image we will get from the camera; in this post, I just used 320×426.
UIView * preview = [camController previewView] ;
preview.frame = CGRectMake(0,0, 320, 426) ;
[previewWindow addSubview:preview] ;
[previewWindow makeKeyAndVisible] ;
Finally, make the main window visible.
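In code, this is just the usual call on the main window. The order matters: the previewWindow was made visible first, so the transparent main window now sits on top of it.

```objc
// Bring the transparent main window to the front, above the previewWindow
[window makeKeyAndVisible] ;
```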
OK, run your application and you will see a video preview. To retrieve RGBA data from the camera, the well-known CoreSurface method is enough, something like this.
// The preview layer's contents holds a CoreSurfaceBufferRef (private API)
CoreSurfaceBufferRef coreSurfaceBuffer = [[[camController previewView] layer] contents] ;
if (!coreSurfaceBuffer) return ;
Surface *surface = [[Surface alloc] initWithCoreSurfaceBuffer:coreSurfaceBuffer] ;
[surface lock] ;
previewHeight = surface.height ;
previewWidth = surface.width ;
previewBytesPerRow = previewWidth * 4 ;
pixelDataLength = previewBytesPerRow * previewHeight ;
void *pixels = surface.baseAddress ;
int surfaceBytesPerRow = surface.bytesPerRow ;
// Copy the pixels to your buffer
memcpy(Your_buffer, pixels, pixelDataLength) ;
[surface unlock] ;
[surface release] ;
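If you see misaligned scanlines, a likely cause is that the surface's bytesPerRow is larger than previewWidth*4 (row padding). In that case, copying row by row with the surface's own stride, instead of one big memcpy, should fix the alignment. A sketch, using the same variables as above:

```objc
// Copy row by row, honoring the surface's stride (sketch)
int stride = surface.bytesPerRow ;
uint8_t *src = (uint8_t *)pixels ;
uint8_t *dst = (uint8_t *)Your_buffer ;
for (int y = 0 ; y < previewHeight ; y++) {
    memcpy(dst + y * previewBytesPerRow, src + y * stride, previewBytesPerRow) ;
}
```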
Here is a screenshot of my simple AR application, in which a virtual character is synthesized onto a real scene.
Note that the workaround I explained is not perfect. One limitation is that the retrieved image sometimes has misaligned scanlines. Another problem is auto-focus on the 3GS: the focus rectangle that appears when the camera focuses is captured in the retrieved image data, and I just couldn't disable it.
So the iPhone is getting closer to being a good device for AR, and with OS 4 there will be plenty of AR apps in the store.
Update: To disable the focus rectangle, you need to implement PLCameraController's delegate method cameraControllerReadyStateChanged:. Set a delegate object on the PLCameraController and implement the method like this:
- (void)cameraControllerReadyStateChanged:(NSNotification *)aNotification
{
    [camController setDontShowFocus:YES] ;
}
Then you will not see the focus rectangle. Thanks for your comment Arrix !