
Session 1: Your Brain in the Internet of Things
Session 2: Building a Thought-Controlled Drone

A session at IoT Conference, May 4-5, 2015, Santa Clara, CA, USA

Monday 4th May, 2015

5:20pm to 6:00pm (PST)

Your Brain in the Internet of Things (Abstract)
A new generation of wearable devices is coming out that detects our thoughts, feelings, and facial expressions via the electrical signals our bodies produce. These devices interface directly with mobile devices over Bluetooth and are designed to provide input all day long. What potential does the Brain-Computer Interface hold for wearables? What is the difference between passive monitoring and active detection? What sorts of input are possible? Is the Brain-Computer Interface one-way, or is direct input to the brain possible too? In this class, we will dive into the future of thought as input for wearable development, with real-world examples and code. Demonstrations will use the Emotiv EPOC, a high-resolution wireless neuroheadset that acquires and processes neuro-signals through a set of sensors tuned to the electrical signals produced by the brain, detecting thoughts, feelings, and expressions. Additional devices will be discussed and may be used as they become available. You will see the EEG neuroheadset and brain-computer interface in action, with examples of interfacing with desktop and mobile apps. We will dig into the roots of the technology, showing code and examples along with the big picture. You will walk away with an understanding of how this still-evolving and largely unknown technology really works, how it can be used, and its longer-term implications.

Building a Thought-Controlled Drone (Abstract)
This session dives into the intricacies of working with cognitive thought input and mapping it to real-world control of a semi-autonomous drone. It covers the SDK for input from the Brain-Computer Interface and the API for controlling the drone. Brain-Computer Interface devices started on the desktop, but mobile SDKs are now emerging. This class will look at the differences and similarities between desktop and mobile, and at options to expand the reach from desktop to mobile. Beyond cognitive input, we will also look at reading mental states, emotions, and facial expressions via the Brain-Computer Interface, and discuss uses for each input type.
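The core idea in the abstract above, mapping detected cognitive input to drone control, can be sketched in a few lines. This is an illustrative sketch only: the event names ("push", "lift", etc.), the confidence threshold, and the `map_to_drone_command` helper are assumptions for the example, not the actual Emotiv SDK or any real drone API.

```python
# Hypothetical mapping from detected mental commands to drone actions.
# Neither the event names nor the command strings come from a real SDK.
COMMAND_MAP = {
    "push": "forward",
    "pull": "backward",
    "lift": "up",
    "drop": "down",
}

CONFIDENCE_THRESHOLD = 0.6  # ignore weak detections to avoid jittery control


def map_to_drone_command(action, confidence):
    """Translate a detected mental command into a drone command.

    Falls back to 'hover' when the detection is too weak or the
    action is not one we have mapped, so the drone fails safe.
    """
    if confidence < CONFIDENCE_THRESHOLD:
        return "hover"
    return COMMAND_MAP.get(action, "hover")


# Simulated stream of (action, confidence) detections from a headset:
events = [("push", 0.9), ("push", 0.3), ("lift", 0.8), ("blink", 0.95)]
commands = [map_to_drone_command(a, c) for a, c in events]
print(commands)  # ['forward', 'hover', 'up', 'hover']
```

Failing safe to "hover" on weak or unmapped input matters in practice: cognitive detections are noisy, and a drone that keeps executing the last strong command on a false positive is harder to recover than one that simply holds position.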

About the speaker

Jim McKeeth

Lead Developer Evangelist

As developer relations lead evangelist at Embarcadero Technologies, Jim is a key part of Embarcadero's evangelism team and developer community outreach. Jim is a world-renowned and well-respected software development expert with more than 20 years of programming experience, and an active member of the global developer community. He created the popular podcast interview program at Delphi.org. He speaks at industry and company conferences and events across the globe, is a regular speaker at user group meetings, and runs the local Google Developer Group in Boise, Idaho. Jim holds a patent for the swipe-to-unlock and pattern-unlock gestures used on both iPhone and Android phones, plus a number of other computer- and software-related patents. He is passionate about helping developers move their ideas and applications forward to new levels.




Session Hashtag

#bigdataconf

Short URL

lanyrd.com/sdmrgy

Official session page

globalbigdataconference.com/…tml


