Tuesday, April 27, 2010

iPhone camera access on OS 3.x: A workaround


Please note that if you try the code below, you do so at your own risk.


Recently, I found a workaround for grabbing raw image data from the iPhone's camera on OS 3.x.

As I previously posted, using PLCameraController does not provide a clean solution for AR, since the captured image contains the overlaid UI contents (a related post here). The problem had been left unsolved.

But then someone left a comment saying that using two windows can solve the problem (I don't know his/her name, since the comment was posted anonymously). The idea seemed somewhat odd, because we usually create only one window for an iPhone application, but other people on the Internet also mention two windows as a workaround for this problem. So I did some digging into the idea and eventually figured out how to do it.

As you may know, iPhone OS 4 may allow full camera access for developers. I'm using the iPhone for my research, so this method is useful until OS 4 arrives.

Now, let me explain how to do it.

First, we need two windows. Just add one more UIWindow to your application delegate. We will add PLCameraController's previewView to the previewWindow.

UIWindow *window;
UIWindow *previewWindow ;
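
For reference, here is a minimal sketch of how the application delegate's interface could look with the second window. The class name is just a placeholder, and the extra ivars simply match the snippets used throughout this post.

// Hypothetical application delegate header for this setup (the class name is
// a placeholder; the ivars match the snippets in this post)
#import <UIKit/UIKit.h>

@class EAGLView;

@interface ARAppDelegate : NSObject <UIApplicationDelegate> {
    UIWindow *window;           // main window, made transparent, holds the GL view
    UIWindow *previewWindow;    // second window, holds the camera preview view
    EAGLView *glView;           // transparent OpenGL ES view for the 3D content
    id        cameraController; // PLCameraController instance (private class, typed as id)
}
@end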

Then, go to the 'applicationDidFinishLaunching' method. Make the main window transparent so that the preview view we will add to the previewWindow, which sits behind the main window, remains visible.

// Make the window background transparent
window.opaque = NO ;
window.backgroundColor = [UIColor clearColor] ;


Initialize the main view. In my case, the view is an OpenGL view. I make the OpenGL view transparent so that the video background shows through. If you don't know why, please refer to my previous post about making a video background without texture mapping.

glView = [[EAGLView alloc] initWithFrame:CGRectMake(0, 0, video_width, video_height)
                           pixelFormat:kEAGLColorFormatRGBA8
                           depthFormat:GL_DEPTH_COMPONENT16_OES
                           stencilFormat:0
                           preserveBackbuffer:NO];
glView.opaque = NO;
glView.backgroundColor = [UIColor clearColor];
[window addSubview:glView];
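
For reference, the transparency has to reach down to the layer level: the CAEAGLLayer backing the view must not be opaque, and each frame must be cleared with zero alpha. Here is a rough sketch of what that can look like inside an EAGLView implementation; apart from layerClass, the method names are placeholders, and the details follow the approach from my earlier post.

// Sketch of the transparency-related parts of a CAEAGLLayer-backed EAGLView
// (method names other than layerClass are placeholders)
#import <QuartzCore/QuartzCore.h>
#import <OpenGLES/EAGLDrawable.h>
#import <OpenGLES/ES1/gl.h>

+ (Class)layerClass
{
    // Back the view with a CAEAGLLayer so OpenGL ES can render into it
    return [CAEAGLLayer class];
}

- (void)setupTransparentLayer
{
    CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
    eaglLayer.opaque = NO; // let the previewWindow behind this window show through
    eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithBool:NO], kEAGLDrawablePropertyRetainedBacking,
        kEAGLColorFormatRGBA8,        kEAGLDrawablePropertyColorFormat, nil];
}

- (void)beginFrame
{
    // Clear with zero alpha so everything except the rendered 3D content stays transparent
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
}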


Initialize PLCameraController and the previewWindow.

if (cameraController != nil)
    [cameraController release];
// Initialization of PLCameraController
cameraController = ..... ;

previewWindow = [[UIWindow alloc] init];
previewWindow.autoresizesSubviews = YES;
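
Since PLCameraController lives in the private PhotoLibrary framework, there is no public header to import. Below is a hedged sketch of the kind of runtime lookup that was commonly used; whether alloc/init is the right way to obtain the controller is an assumption, and the startPreview selector is taken from the comments at the bottom of this post.

// Hypothetical initialization of the private PLCameraController.
// The class is resolved at runtime because there is no public header;
// obtaining it via alloc/init is an assumption.
Class plCameraControllerClass = NSClassFromString(@"PLCameraController");
cameraController = [[plCameraControllerClass alloc] init];

// Start the camera preview (selector name mentioned in the comments below)
if ([cameraController respondsToSelector:@selector(startPreview)])
    [cameraController performSelector:@selector(startPreview)];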


Add the previewView of PLCameraController to the previewWindow and make the previewWindow visible. The size of 'preview.frame' determines the size of the image we will get from the camera. In this post, I simply used (320, 426), which is roughly a 3:4 portrait aspect ratio.

UIView * preview = [camController previewView] ;
preview.frame = CGRectMake(0,0, 320, 426) ;
[previewWindow addSubview:preview] ;
[previewWindow makeKeyAndVisible] ;


Finally, make the main window visible.

[window makeKeyAndVisible];


OK, run your application and you will see a video preview. To retrieve RGBA data from the camera, the well-known CoreSurface-based method is enough, something like this.

// Grab the current preview frame as a CoreSurface buffer (private API)
CoreSurfaceBufferRef coreSurfaceBuffer =
    [cameraController _createPreviewIOSurface];
if (!coreSurfaceBuffer) return;

Surface *surface =
    [[Surface alloc] initWithCoreSurfaceBuffer:coreSurfaceBuffer];
[surface lock];
previewHeight = surface.height;
previewWidth = surface.width;
previewBytesPerRow = previewWidth * 4;
pixelDataLength = previewBytesPerRow * previewHeight;
void *pixels = surface.baseAddress;
int surfaceBytesPerRow = surface.bytesPerRow;

// Copy the pixels to your buffer (see the row-by-row variant below if
// surfaceBytesPerRow differs from previewBytesPerRow)
memcpy(Your_buffer, pixels, pixelDataLength);

// Clean up; _createPreviewIOSurface looks like a 'create' method, so the
// buffer is released here as well
[surface unlock];
[surface release];
CFRelease(coreSurfaceBuffer);
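
One caveat, which is probably what causes the misaligned scanlines mentioned below: surface.bytesPerRow can be larger than previewWidth * 4 because of row padding. In that case the single memcpy above skews the image, so a row-by-row copy that respects the surface stride (done while the surface is still locked) is safer. A sketch, reusing the variable names from the snippet above:

// Row-by-row copy that respects the surface stride; use this instead of the
// single memcpy when surfaceBytesPerRow differs from previewBytesPerRow
unsigned char *src = (unsigned char *)pixels;
unsigned char *dst = (unsigned char *)Your_buffer;
for (int y = 0; y < previewHeight; y++) {
    memcpy(dst + y * previewBytesPerRow, // tightly packed destination row
           src + y * surfaceBytesPerRow, // possibly padded source row
           previewBytesPerRow);
}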


Here is a screenshot of my simple AR application, where a virtual character is synthesized onto a real scene.



Note that the workaround I explained is not perfect. One limitation is that the retrieved image sometimes has misaligned scanlines. Another problem is auto-focus on the 3GS: the focus rectangle that appears while the camera is focusing is captured in the retrieved image data, and I just couldn't disable it.

So, the iPhone is getting closer to being a good device for AR, and with OS 4 there will be plenty of AR apps in the App Store.


Update: To disable the focus rectangle, you need to implement PLCameraController's delegate method cameraControllerReadyStateChanged:. So, set a delegate object on the PLCameraController and implement the method like this:

- (void)cameraControllerReadyStateChanged:(NSNotification *)aNotification
{
    [cameraController setDontShowFocus:YES];
}
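
For completeness, the delegate also has to be attached to the controller after it is created; since the class is private, the setter name below is an assumption:

// Hypothetical: make this object the PLCameraController's delegate so that
// cameraControllerReadyStateChanged: gets called (setDelegate: is an assumption
// for this private class)
if ([cameraController respondsToSelector:@selector(setDelegate:)])
    [cameraController performSelector:@selector(setDelegate:) withObject:self];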

Then you will not see the focus rectangle any more. Thanks for your comment, Arrix!

6 comments:

  1. Apple says you shouldn't have more than one window. So... :/

    If you're not going to listen to them, then you might as well use private APIs.

  2. Hello. Apps using this method will never be accepted by Apple. This is just for research purposes.

  3. I used the same technique in one of my AR demos http://arrix.blogspot.com/2010/04/augmented-reality-on-iphone-marker.html

    Actually it's easy to get rid of the annoying focus box. Try
    [(PLCameraController *)cameraController setDontShowFocus:YES];
    after calling startPreview.

    To reliably hide the focus box for iPhone OS later than 3, you'll need to put the line in cameraControllerReadyStateChanged: of the PLCameraControllerDelegate.

    I've already tried the new API in OS 4 and what Steve promised seems to be true.

    Your blog posts are very informative and I always enjoy reading them.

  4. Hi Arrix.
    I'll try the methods and update the post.
    Thanks for your comment and your interest in my posts.

  5. Hi! Do you have some sample code to start with? A complete simple project, perhaps? Thanks! Great work!
