Update: This method is no longer useful, since Apple now provides official camera APIs.
Here is a nice example showing how to use a new feature, cameraOverlayView, for AR applications on the iPhone. It comes with example code, so you can easily learn how to do AR on the iPhone.
In the post, the author grabs a screenshot through UIGetScreenImage() and analyzes it. Since UIGetScreenImage() simply captures the whole screen, whatever was drawn on the overlay view remains in the screenshot. The author then removes the content drawn on the overlay view by simple interpolation.
I downloaded the example and tested it. The image below is a screenshot obtained by UIGetScreenImage(); you can see the green edge marks drawn on the overlay.
After interpolation, the code produces the following image. It looks acceptable, though you may notice seams where the pixels were interpolated.
However, this approach breaks down when we draw large objects such as a cube, since a large region cannot be recovered by interpolating from its neighbors, so it is clearly unsuitable for tracking. Another problem is the cost of the interpolation itself, which may take too much time on a mobile phone.