Core Audio, the only media framework available since day one of the public iPhone SDK, offers extremely low latency and powerful access to the device's audio processing system... assuming you can handle what's renowned as one of the hardest APIs on the platform. In iOS 5, Core Audio gets even better, with great new features that had previously been burdensome, if not impossible, to develop on your own. Once the iOS 5 NDA drops, the shiny new bits will be available to all, and this talk will be one of your first chances to learn how they work. Attendees will learn the basics of Core Audio -- the engine APIs that process sound (Audio Queue, Audio Units, and OpenAL) and the helper APIs that get samples into and out of them -- and then look at where iOS 5 fills in some of the holes that have existed up to now.
AV Foundation -- introduced in iOS 4, ported to Lion, and enhanced further in iOS 5 -- delivers a comprehensive framework for audio and video capture and playback. The capture functionality is so good, it's now the preferred option for still photography applications. In this session, we'll focus squarely on AV Foundation as a media capture framework. Attendees will learn:
* How to get the most out of the device for still photography, by using AV Foundation to access the flash, white balance, and image resolution
* How to capture audio and video to the file system
* How to process incoming audio and video capture buffers in memory, to create real-time effects or pick out interesting parts of the scene on the fly
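The capture workflow those bullets describe can be sketched roughly as follows. This is a minimal, hypothetical example using modern Swift spellings of the AV Foundation capture classes (the session itself predates Swift and would have used the equivalent Objective-C API); `makeCaptureSession` is an illustrative helper name, not something from the talk.

```swift
import AVFoundation

// Hypothetical sketch: configure a capture session for still photography
// and attach a video-data output for in-memory frame processing.
func makeCaptureSession() throws -> AVCaptureSession {
    let session = AVCaptureSession()
    session.sessionPreset = .photo  // favor full still-image resolution

    guard let camera = AVCaptureDevice.default(for: .video) else {
        throw NSError(domain: "CaptureSketch", code: 1)  // no camera available
    }

    // Device settings such as white balance (and flash/torch) require an
    // exclusive configuration lock on the AVCaptureDevice.
    try camera.lockForConfiguration()
    if camera.isWhiteBalanceModeSupported(.continuousAutoWhiteBalance) {
        camera.whiteBalanceMode = .continuousAutoWhiteBalance
    }
    camera.unlockForConfiguration()

    let input = try AVCaptureDeviceInput(device: camera)
    if session.canAddInput(input) { session.addInput(input) }

    // A video data output delivers each frame to a delegate as a
    // CMSampleBuffer -- the hook for real-time effects or scene analysis.
    // (An AVCaptureMovieFileOutput would instead record to the file system.)
    let videoOutput = AVCaptureVideoDataOutput()
    if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }

    return session
}
```

In a real app you would set a `AVCaptureVideoDataOutputSampleBufferDelegate` on the video output and call `startRunning()` on the session; this sketch only shows the configuration steps the bullets enumerate.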
12th–13th November 2011