Hi, it's a very nice app! Did you use OpenGL to draw over the iPhone video image?
Yes, I used OpenGL.
Hi! Could you point me in the right direction (resources, tips) as to how I can overlay geolocated objects on video? I've been struggling with trying to build my own AR app. Cool app!
Hi, JP. Do you mean something like what 'Layar' does? If so, you have to use the built-in compass of the 3GS. You can arrange objects depending on its direction by using a cylindrical coordinate system.
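A rough sketch of that idea in Swift (the function names, axis conventions, and field-of-view handling are my own assumptions, not code from the app): each annotation sits on a cylinder around the viewer at its compass bearing, and the current heading decides whether and where it appears on screen.

```swift
import Foundation

// Place an annotation on a cylinder around the viewer, given the
// bearing to the target in degrees clockwise from north.
func cylindricalPosition(bearingDegrees: Double,
                         radius: Double,
                         height: Double) -> (x: Double, y: Double, z: Double) {
    let theta = bearingDegrees * .pi / 180.0
    // OpenGL-style axes: +x east, +y up, -z north.
    return (x: radius * sin(theta), y: height, z: -radius * cos(theta))
}

// Horizontal screen position of that bearing for a given compass
// heading and camera field of view; nil when outside the view.
func screenX(bearingDegrees: Double, headingDegrees: Double,
             fovDegrees: Double, screenWidth: Double) -> Double? {
    // Signed angular difference wrapped to (-180, 180].
    var delta = (bearingDegrees - headingDegrees).truncatingRemainder(dividingBy: 360)
    if delta > 180 { delta -= 360 }
    if delta <= -180 { delta += 360 }
    guard abs(delta) <= fovDegrees / 2 else { return nil }
    return screenWidth / 2 + (delta / fovDegrees) * screenWidth
}
```

The wrap-around step matters: a target at bearing 350° with the device heading 10° is 20° to the left, not 340° to the right.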
Hi Wonwoo, the annotation demo looks great. I am trying to implement similar functionality; could you maybe give me a hint as to how to use the cylindrical coordinate system for this purpose? I can't seem to work out the formula for using the compass reading (heading) for a smooth display of a graphical object that falls within the field of view. You seem to have accomplished this very nicely. Any hints or tips would be greatly appreciated.
Hi, uptownben. In this example, I didn't use the cylindrical coordinate system. The annotations are 3D objects displayed in an OpenGL ES view, so I change the camera's view depending on the compass and accelerometer.
Hi Wonwoo, do you have any links to tutorials or code snippets on how to create and position OpenGL ES views based on the accelerometer and compass as a camera overlay? You have clearly mastered this concept, and any help would be greatly appreciated.
Hello Shai. Controlling an OpenGL ES view is not that difficult. You can change the camera's viewing direction (I mean its rotations) depending on the values coming from the accelerometer and compass. The accelerometer values correspond to the pitch and roll of the camera, and the compass angle corresponds to the heading. In the video above, I put the virtual camera at the center of the scene and changed the camera's view; only the rotations are changed. Hope this helps.
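A minimal sketch of that mapping in Swift, under assumed device-axis conventions (portrait orientation, +x right, +y up, +z out of the screen); this is my own illustration, not the author's code:

```swift
import Foundation

// Turn sensor readings into camera rotation angles (degrees).
// gx, gy, gz: normalized gravity vector from the accelerometer.
// The compass supplies the heading directly as the yaw angle.
func cameraAngles(gx: Double, gy: Double, gz: Double,
                  headingDegrees: Double) -> (pitch: Double, roll: Double, yaw: Double) {
    let pitch = atan2(-gz, -gy) * 180 / .pi  // tilt forward/back
    let roll  = atan2(gx, -gy) * 180 / .pi   // tilt left/right
    let yaw   = headingDegrees               // compass heading
    return (pitch, roll, yaw)
}
```

With the device held upright in portrait, gravity lies along -y and both pitch and roll come out as 0; laying it flat (screen up) moves gravity to -z and the pitch becomes 90°. The three angles would then be applied as rotations of the fixed virtual camera, as described above.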
Hi Wonwoo Lee, could this app work on the iPhone 3G? Is there a way to make it work without the compass? Thanks, and great work.
Hello Julio. You can do a similar thing by tracking feature points in the video from the camera. Check this video out: http://www.youtube.com/watch?v=HgrJ3gwwP94
Would you have a tutorial you could share on how to set this up for those newer to iPhone programming and augmented reality?
Does it work at any angle/orientation of the device? I just can't see it in this video. What about when the device is upside down?
Hello Gergely. Well, I didn't check it in all cases. As far as I know, the compass works well in both portrait and landscape modes. But you need to handle the accelerometer values differently in each mode, because the pitch angle corresponds to a different component.
Yeah, as far as I can imagine, some angular adjustments are needed in different quadrants of the gravity vector. For the compass, maybe we need to project only the direction of the raw magnetic vector onto different planes in different gravity-vector quadrants. I just can't see the whole picture yet, but those are my suspicions for now. Anyway, how did you make the OpenGL view transparent? I have set layer.opaque = NO, and glClearColor is also (0, 0, 0, 0), but it still results in a black background. Are some blend functions missing?
@Gergely / Right, clearing the glView with (0, 0, 0, 0) makes the glView background transparent. But you have to check whether the glView you are using is initialized with a pixel format that has an alpha component, i.e., RGBA like kEAGLColorFormatRGBA8. Examples usually initialize the view with RGB only, and glClearColor(0, 0, 0, 0) doesn't work in those cases.
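For reference, here is a sketch of that setup in Swift using the old EAGL API (the original code was presumably Objective-C; this is an illustration of the fix described above, not the app's actual code). The two essentials are a non-opaque layer and an RGBA color format:

```swift
import UIKit
import OpenGLES

// Config fragment: a GL-backed view that can be cleared to transparent.
// Without the RGBA8 format, glClearColor(0, 0, 0, 0) still shows black.
final class TransparentGLView: UIView {
    override class var layerClass: AnyClass { CAEAGLLayer.self }

    override init(frame: CGRect) {
        super.init(frame: frame)
        let eaglLayer = layer as! CAEAGLLayer
        eaglLayer.isOpaque = false  // let the camera view behind show through
        eaglLayer.drawableProperties = [
            kEAGLDrawablePropertyRetainedBacking: false,
            kEAGLDrawablePropertyColorFormat: kEAGLColorFormatRGBA8  // alpha component
        ]
        // Create the EAGLContext and bind renderbuffers as usual, then
        // each frame: glClearColor(0, 0, 0, 0); glClear(GLbitfield(GL_COLOR_BUFFER_BIT))
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) not implemented") }
}
```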
OK, thanks, I'm going to look into it. That is the only thing missing to get it working. By the way, I've just finished the XYZ-360 implementation. I wrote about the details at http://gotoandplay.freeblog.hu/categories/xCode_-_augmented_reality/
@Gergely / Hi, I watched your implementation on your blog. It's really cool.
Thanks. For the future: having watched the feature-point-based rotation detection linked above, the tracking should be stabilized with those points, and the jitter could be wiped out entirely.
Hi, congrats on your work. I have misunderstood something about the positioning of your labels. Do you place them according to a geographical position?
Hello Laurie. Yes, the positions of the annotations are determined based on the geographical locations.
Thanks for your answer. So you use geographical locations directly to place them in your OpenGL view? I'm trying to understand what plane you work on and how the OpenGL view makes things simpler.
@Laurie / Well, geolocation like a GPS position is not used directly for rendering. What I meant is that the relationships between the annotations are used for rendering. In any case, you need to define a rule that maps geolocations to screen coordinates...
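One possible mapping rule, as a Swift sketch (my assumption of how such a rule could look, not the author's method): convert each annotation's geolocation to a local east/north offset from the viewer with a flat-earth approximation, which is accurate at typical annotation distances, then place the annotation in the OpenGL scene at that offset.

```swift
import Foundation

// Local east/north offset (meters) from the viewer to a target,
// using an equirectangular (flat-earth) approximation.
func localOffset(viewerLat: Double, viewerLon: Double,
                 targetLat: Double, targetLon: Double) -> (east: Double, north: Double) {
    let earthRadius = 6_371_000.0  // mean Earth radius in meters
    let dLat = (targetLat - viewerLat) * .pi / 180
    let dLon = (targetLon - viewerLon) * .pi / 180
    // Longitude lines converge toward the poles, hence the cos(lat) factor.
    return (east: dLon * earthRadius * cos(viewerLat * .pi / 180),
            north: dLat * earthRadius)
}
```

From the offset, the bearing to the target is atan2(east, north), which is exactly what a compass-based placement of the annotation needs.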