Saturday 24th October, 2015
3:30pm to 4:00pm
Have you heard the hype about Apache Spark with Python and would like to learn more?
Distributed Computing is becoming more and more prevalent with the rise of big data, multicore processors and scale-out architecture.
This talk will give an introduction to parallel programming using Apache Spark and Python: how you can leverage it in your day-to-day programming, and the core functional principles that make it scale.
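Spark's programming model builds on functional primitives such as map, filter, and reduce: pure functions with no shared state, which is what lets the work be distributed across a cluster. As a taste of that style, here is a minimal word-count sketch in plain Python (no Spark required); the sample `lines` data is made up for illustration.

```python
from functools import reduce

# Toy input standing in for a distributed dataset of text lines.
lines = ["spark makes parallel", "programming simple", "spark scales"]

# flatMap analogue: split every line into words.
words = [w for line in lines for w in line.split()]

# map analogue: pair each word with a count of 1.
pairs = map(lambda w: (w, 1), words)

# reduceByKey analogue: fold the pairs into a dict of totals.
counts = reduce(
    lambda acc, kv: {**acc, kv[0]: acc.get(kv[0], 0) + kv[1]},
    pairs,
    {},
)

print(counts["spark"])  # "spark" appears twice in the sample lines
```

Because each step is a pure transformation of its input, the same pipeline can be re-expressed almost line-for-line on a Spark RDD, with the framework handling the partitioning.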
"- 35 years old German living in Cork, Master in Computer Science, University of Karlsruhe (Cryptography, Robotics, Compiler Construction)
- My ""passion"" is functional programming (in particular haskell)
- I have been programming in python since about 2002, and have used it for many personal and semi-public projects (NLTK, django, ...)"