by Mark Levy
Last.fm is well known as an early adopter of Hadoop as a way of storing billions of scrobbles and computing charts from them. We also increasingly use Hadoop to scale more algorithmic tasks to large datasets, and I'll talk briefly about three of our projects in this area:
* topic analysis with Latent Dirichlet Allocation
* graph-based recommendations
* audio analysis
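To give a flavour of the second item, here is a minimal sketch of graph-based recommendation: score new artists by counting two-hop paths through the user-artist bipartite graph. The data and scoring rule are illustrative assumptions, not Last.fm's actual data model or algorithm.

```python
from collections import defaultdict

# Toy listening data: user -> set of artists (hypothetical example).
listens = {
    "alice": {"Radiohead", "Portishead", "Massive Attack"},
    "bob": {"Radiohead", "Muse"},
    "carol": {"Portishead", "Massive Attack", "Tricky"},
}

def cooccurrence_recommend(listens, user, top_n=3):
    """Walk the bipartite graph two hops (user -> artist -> other user
    -> new artist) and score candidates by path count."""
    scores = defaultdict(int)
    seen = listens[user]
    for other, artists in listens.items():
        if other == user:
            continue
        overlap = len(seen & artists)   # paths into this neighbour
        if overlap == 0:
            continue
        for artist in artists - seen:   # paths out to unseen artists
            scores[artist] += overlap
    return sorted(scores, key=lambda a: (-scores[a], a))[:top_n]

print(cooccurrence_recommend(listens, "alice"))  # → ['Tricky', 'Muse']
```

At Last.fm scale the same path counting becomes a join over the listening graph, which is exactly the kind of task that maps naturally onto Hadoop.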
by Sean Owen
Apache Mahout is a scalable machine learning library built on top of Hadoop MapReduce. This presentation gives an introduction to the Mahout project and how it achieves collaborative filtering at scale.
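To illustrate the idea behind item-based collaborative filtering, the approach Mahout distributes over MapReduce, here is a single-machine sketch: compute cosine similarity between item rating vectors, then score unseen items by similarity-weighted ratings. The toy data and helper names are assumptions for illustration, not Mahout's API.

```python
import math
from collections import defaultdict

# Toy user-item ratings (hypothetical; at scale Mahout would read this
# from HDFS and compute the similarities with MapReduce jobs).
ratings = {
    "u1": {"A": 5.0, "B": 3.0, "C": 4.0},
    "u2": {"A": 4.0, "B": 4.0},
    "u3": {"B": 2.0, "C": 5.0},
}

def item_similarity(ratings):
    """Cosine similarity between item rating vectors, keyed by
    the sorted item pair."""
    vectors = defaultdict(dict)
    for user, items in ratings.items():
        for item, r in items.items():
            vectors[item][user] = r
    sims = {}
    for a in vectors:
        for b in vectors:
            if a >= b:
                continue
            common = vectors[a].keys() & vectors[b].keys()
            dot = sum(vectors[a][u] * vectors[b][u] for u in common)
            na = math.sqrt(sum(v * v for v in vectors[a].values()))
            nb = math.sqrt(sum(v * v for v in vectors[b].values()))
            sims[(a, b)] = dot / (na * nb) if dot else 0.0
    return sims

def recommend(ratings, user, sims, top_n=2):
    """Score each unseen item by similarity-weighted ratings of the
    user's seen items."""
    seen = ratings[user]
    scores = defaultdict(float)
    all_items = {i for r in ratings.values() for i in r}
    for item in all_items - seen.keys():
        for s_item, r in seen.items():
            key = (min(item, s_item), max(item, s_item))
            scores[item] += sims.get(key, 0.0) * r
    return sorted(scores, key=lambda i: -scores[i])[:top_n]

sims = item_similarity(ratings)
print(recommend(ratings, "u2", sims))  # recommends the unseen item 'C'
```

The item-item similarity step is embarrassingly parallel over item pairs, which is why it maps cleanly onto MapReduce.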
14th April 2011