Modern smartphone platforms, like Apple's iPhone, come with a growing range of sensors: GPS, accelerometers, magnetometers, and more recently gyroscopes. They also have a (near-)ubiquitous data connection, whether via a local wireless hotspot or via carrier data, and user positioning via multiple methods including GPS.
The development of location-aware, and location-fenced, applications on these devices has led to an explosion in the use of location-aware, as opposed to marker-based, Augmented Reality interfaces.
Augmented Reality has become one of the killer applications for the iPhone platform. This workshop explores using the accelerometer, magnetometer, camera and GPS, along with the Core Location framework, to determine the location and orientation of an iPhone device, allowing you to build a simple location-aware AR toolkit. During the workshop you will be walked through building such an AR toolkit, which you can then extend and reuse in your own projects and iPhone applications.
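At the heart of any location-aware AR toolkit of this kind is a small piece of geometry: given the device's position (from GPS via Core Location) and its compass heading (from the magnetometer), compute the bearing to a point of interest and how far left or right of the camera's current heading that point lies. The following Python sketch illustrates that math only; on the device the inputs would come from Core Location, and the function names here are illustrative, not part of any Apple API.

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2,
    in degrees clockwise from true north (0 = north, 90 = east)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0

def relative_angle(device_heading, target_bearing):
    """Signed offset of the target from the device's heading,
    normalised to [-180, 180); negative means the target is to the left.
    This is what an AR view maps to a horizontal screen position."""
    return (target_bearing - device_heading + 180.0) % 360.0 - 180.0
```

A point due east of the device gives a bearing of 90 degrees; if the compass reports a heading of 350 degrees and the target bears 10 degrees, `relative_angle` returns 20, i.e. the overlay should be drawn slightly to the right of centre.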
19th–21st April 2011