Thursday, September 24, 2009

Annotation overlay on iPhone 3GS

Recently I made a new demo showing how the iPhone's compass and accelerometer can be used for an information overlay. (iPhone 3GS, iPhone SDK 3.1)

Enjoy this!


22 comments:

  1. Hi,

    It's a very nice app! Did you use OpenGL to draw over the iPhone video image?

  2. Hi!

    Could you point me in the right direction (resources, tips) as to how I can overlay geolocated objects on video? I've been struggling with trying to build my own AR app.

    cool app!

  3. Hi, JP. Do you mean something like what 'Layar' does? If so, you have to use the built-in compass of the 3GS. You can arrange objects depending on their direction by using a cylindrical coordinate system (a sketch follows below).

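    A minimal sketch of that cylindrical-coordinate idea (an illustrative helper, not the demo's actual code): each annotation sits on a cylinder around the viewer at its compass bearing, and the current heading decides where it appears on screen. The field-of-view and screen-width values below are assumptions.

    ```objc
    #include <math.h>

    // Assumed values for illustration: a ~40-degree horizontal field of
    // view and a 320-point-wide portrait screen.
    static const float kFovDegrees  = 40.0f;
    static const float kScreenWidth = 320.0f;

    // Hypothetical helper: map an annotation's compass bearing (degrees)
    // to a horizontal screen position, given the current heading.
    float screenXForBearing(float bearingDeg, float headingDeg)
    {
        // Signed angular offset between the annotation and the viewing
        // direction, wrapped into [-180, 180).
        float delta = fmodf(bearingDeg - headingDeg + 540.0f, 360.0f) - 180.0f;

        // 0 degrees maps to the screen center, +/- kFovDegrees/2 to the
        // screen edges; values outside [0, kScreenWidth] are off-screen
        // and can simply be skipped when drawing.
        return kScreenWidth * 0.5f + (delta / kFovDegrees) * kScreenWidth;
    }
    ```
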
  4. Hi Wonwoo, the annotation demo looks great. I am trying to implement similar functionality; could you maybe give me a hint as to how to use the cylindrical coordinate system for this purpose? I can't seem to work out the formula for using the compass reading (heading) for a smooth display of a graphical object that falls within the field of view. You seem to have accomplished this very nicely. Any hints or tips would be greatly appreciated.

  5. Hi, uptownben.
    In this example, I didn't use the cylindrical coordinate system. The annotations are 3D objects displayed on an OpenGL ES view, so I changed the camera's view depending on the compass and accelerometer.

  6. Hi Wonwoo,
    Do you have any links to tutorials or code snippets showing how to
    create and position OpenGL ES views based on the accelerometer and compass as a camera overlay?

    You have clearly mastered this concept, and any help would be greatly
    appreciated.

  7. Hello Shai.
    Controlling an OpenGL ES view is not that difficult. You can change the camera's viewing direction (I mean its rotations) depending on the values coming from the accelerometer and compass. The accelerometer values correspond to the pitch and roll of the camera, and the compass angle corresponds to the heading.

    In the video above, I put the virtual camera at the center of the scene and changed the camera's view. Only the rotations are changed (see the sketch below).

    Hope this helps.

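    A rough sketch of the approach described above, assuming an OpenGL ES 1.1 view and pitch/roll/heading angles already computed from the accelerometer and compass. This is an illustration, not the demo's actual code, and the exact signs and rotation order depend on your axis conventions:

    ```objc
    #import <OpenGLES/ES1/gl.h>

    // Called once per frame. pitchDeg and rollDeg come from the
    // accelerometer, headingDeg from the compass (CLHeading).
    static void applyCameraRotation(float pitchDeg, float rollDeg,
                                    float headingDeg)
    {
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();

        // The virtual camera stays at the origin; only its orientation
        // follows the device, so world-fixed annotations swing into and
        // out of view as the phone turns.
        glRotatef(rollDeg,    0.0f, 0.0f, 1.0f);  // roll about the view axis
        glRotatef(pitchDeg,   1.0f, 0.0f, 0.0f);  // pitch about x
        glRotatef(headingDeg, 0.0f, 1.0f, 0.0f);  // heading about "up" (y)

        // ... then draw each annotation at its fixed world position,
        // e.g. on a circle of radius R in the x/z plane.
    }
    ```
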
  8. Hi Wonwoo Lee, could this app work on the iPhone 3G? Is there a method to make it work without the compass?

    Thanks, and great work.

  9. Hello Julio.
    You can do a similar thing by tracking feature points in the video from the camera (a rough sketch of the idea follows below).
    Check this video out:
    http://www.youtube.com/watch?v=HgrJ3gwwP94

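    A crude sketch of that idea (assuming OpenCV is available on the device; the constants are guesses, and this is not necessarily what the linked video does): track points between consecutive grayscale frames with pyramidal Lucas-Kanade and turn their average horizontal shift into a heading change.

    ```objc
    #include <opencv/cv.h>

    #define MAX_POINTS 100

    // Crude heading-change estimate between two consecutive grayscale
    // frames. kFovDegrees is an assumed camera field of view.
    float estimateYawDeltaDeg(IplImage *prevGray, IplImage *currGray)
    {
        static const float kFovDegrees = 40.0f;
        CvPoint2D32f prevPts[MAX_POINTS], currPts[MAX_POINTS];
        char status[MAX_POINTS];
        int count = MAX_POINTS;

        IplImage *eig = cvCreateImage(cvGetSize(prevGray), IPL_DEPTH_32F, 1);
        IplImage *tmp = cvCreateImage(cvGetSize(prevGray), IPL_DEPTH_32F, 1);
        cvGoodFeaturesToTrack(prevGray, eig, tmp, prevPts, &count,
                              0.01, 10, NULL, 3, 0, 0.04);

        cvCalcOpticalFlowPyrLK(prevGray, currGray, NULL, NULL,
                               prevPts, currPts, count, cvSize(15, 15), 3,
                               status, NULL,
                               cvTermCriteria(CV_TERMCRIT_ITER | CV_TERMCRIT_EPS,
                                              20, 0.03), 0);

        // Average the horizontal displacement of tracked points.
        float sumDx = 0.0f;
        int good = 0;
        for (int i = 0; i < count; i++) {
            if (!status[i]) continue;
            sumDx += currPts[i].x - prevPts[i].x;
            good++;
        }
        cvReleaseImage(&eig);
        cvReleaseImage(&tmp);
        if (good == 0) return 0.0f;

        // A pure rotation shifts the whole image roughly uniformly, so
        // the mean pixel shift maps linearly to a change in heading.
        return -(sumDx / good) * kFovDegrees / prevGray->width;
    }
    ```
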
  10. Would you have a tutorial you could share on how to set this up for those newer to iPhone programming and augmented reality?

  11. Does it work at any angle/orientation of the device?
    I just can't see it in this video. What about when the device is upside down?

  12. Hello Gergely.
    Well, I didn't check it in all cases.
    As far as I know, the compass works well in both portrait and landscape modes, but you need to handle the accelerometer values differently in each mode, because the pitch angle corresponds to a different component (see the sketch below).

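    To illustrate the point about the pitch angle coming from a different accelerometer component in each mode, a small sketch inside a view controller acting as the UIAccelerometer delegate (the sign conventions here are assumptions and may need flipping):

    ```objc
    #import <UIKit/UIKit.h>
    #include <math.h>

    // UIAccelerometerDelegate callback (SDK 3.x era), implemented in a
    // view controller so self.interfaceOrientation is available.
    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acc
    {
        float pitchDeg;
        if (UIInterfaceOrientationIsPortrait(self.interfaceOrientation)) {
            // Portrait: tilting forward/back swings gravity between the
            // y and z axes.
            pitchDeg = atan2f(-acc.y, -acc.z) * 180.0f / M_PI;
        } else {
            // Landscape: the x axis takes over the role of y.
            pitchDeg = atan2f(-acc.x, -acc.z) * 180.0f / M_PI;
        }
        // ... feed pitchDeg into the camera update ...
    }
    ```
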
  13. Yeah, as far as I can imagine, some angular adjustments are needed in different quadrants of the gravity vector. For the compass, maybe we need to project only the direction of the raw magnetic vector onto different planes in different gravity-vector quadrants. I just can't see the whole picture yet, but those are my suspicions for now.

    Anyway, how did you make the OpenGL view transparent? I have set layer.opaque = NO, and glClearColor is also 0,0,0,0, but it still results in a black background. Is there some blend function missing?

  14. @Gergely / Right, clearing the GL view with (0,0,0,0) makes its background transparent. But you have to check whether the GL view you are using is initialized with a pixel format that has an alpha component, i.e., RGBA, like kEAGLColorFormatRGBA8 (see the sketch below). Examples usually initialize the view with RGB only, and glClearColor(0,0,0,0) doesn't work in those cases.

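    For reference, a minimal sketch of the setup described above, based on Apple's standard EAGLView template (the surrounding init/render code is assumed):

    ```objc
    #import <QuartzCore/QuartzCore.h>
    #import <OpenGLES/EAGL.h>
    #import <OpenGLES/EAGLDrawable.h>
    #import <OpenGLES/ES1/gl.h>

    // Inside the EAGLView subclass (whose +layerClass returns
    // [CAEAGLLayer class]):
    CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
    eaglLayer.opaque = NO;  // let the camera preview show through
    eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithBool:NO], kEAGLDrawablePropertyRetainedBacking,
        kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat,  // RGBA, not RGB565
        nil];

    // ... later, in the render loop:
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);  // fully transparent clear color
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    ```
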
  15. OK, thanks, I'm going to look into it. That's the only thing missing to get it working.

    By the way, I've just finished the XYZ-360 implementation. I wrote about the details at http://gotoandplay.freeblog.hu/categories/xCode_-_augmented_reality/

  16. @Gergely / Hi, I saw your implementation on your blog. It's really cool.

  17. Thanks.

    Looking ahead: judging from the feature-point-based rotation detection linked above, the output could be stabilized with those tracked points, and the jitter could be wiped out entirely.

  18. Hi,

    Congrats on your work.
    I think I've misunderstood something about the positioning of your labels. Did you position them according to a geographical location?

  19. Hello Laurie.

    Yes, the positions of the annotations are determined based on the geographical locations.

  20. Thanks for your answer.

    So you use geographical locations directly to place them in your OpenGL view? I am trying to understand what plane you work in and how the OpenGL view makes things simpler.

  21. @Laurie / Well, a geolocation like a GPS position is not used directly for rendering. What I meant was that the relationships between the annotations are used for rendering. Anyway, you need to make a rule to map between the geolocations and the screen coordinates; one possible rule is sketched below.

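    One possible mapping rule, as a sketch (illustrative code, not the demo's actual implementation): compute the compass bearing from the user's location to each annotation's location, then place the annotation on a circle around the OpenGL camera at that bearing.

    ```objc
    #import <CoreLocation/CoreLocation.h>
    #include <math.h>

    // Initial great-circle bearing (degrees, clockwise from north) from
    // the user's location to an annotation's location.
    static float bearingBetween(CLLocationCoordinate2D from,
                                CLLocationCoordinate2D to)
    {
        double lat1 = from.latitude * M_PI / 180.0;
        double lat2 = to.latitude * M_PI / 180.0;
        double dLon = (to.longitude - from.longitude) * M_PI / 180.0;

        double y = sin(dLon) * cos(lat2);
        double x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon);
        double bearing = atan2(y, x) * 180.0 / M_PI;
        return (float)fmod(bearing + 360.0, 360.0);
    }

    // Place the annotation on a circle of radius R around the camera
    // (camera at the origin, -z pointing north before the heading
    // rotation is applied).
    static void annotationPosition(float bearingDeg, float radius,
                                   float *outX, float *outZ)
    {
        float rad = bearingDeg * M_PI / 180.0f;
        *outX = radius * sinf(rad);
        *outZ = -radius * cosf(rad);
    }
    ```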