Companies are collecting more data today than ever before, but as volumes grow from terabytes to petabytes, many companies are struggling to make good use of this data.
This session will show how the latest analytic databases and traditional data-warehousing design can be combined in this new data-driven world to provide customers with on-demand reporting over very large data sets.
How can we build flexible on-demand reporting systems over huge data sets?
How can we scale up existing systems as the volumes of data we are all collecting increase so quickly?
How can modern analytic systems be used with big-data processing tools such as Hadoop, Hive and Pig to provide on-demand, custom reporting over very large data sets?
by Mat Morrison
Facebook Insights is a severely limited data environment. Learn how to query the Insights API to get proper data, and how to spider your Pages to get the information that Facebook forgot to give you. Then learn how to apply this to your business.
Questions answered:
* When is the best time to post (day part, week part)?
* As we build fans, how does that increase our ability to reach an earned audience and a paid audience with sponsored stories?
* As we build our fan base, to what extent do fans remain active?
* What's the optimal post frequency for this audience? How does post frequency affect reach, hides and unsubscribes?
* What is the interplay between reach and engagement? How does engagement impact reach?
* What is the shape of the community, and how should this affect communications planning?
* Who are the most frequent commentators, and for what share of UGC do they account?
* What does the communication flow look like in terms of one-to-many, many-to-one, one-to-one, and peer-to-peer?
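As a taste of the kind of querying the session covers, here is a minimal sketch of building a request against the Facebook Graph API Insights endpoint. The page ID, metric name and token below are illustrative placeholders, not values from the session; the exact endpoint shape and available metrics depend on the Graph API version in use.

```python
# Sketch: build a Graph API Insights request URL for one Page metric.
# "PAGE_ID", "page_impressions" and "ACCESS_TOKEN" are placeholder values;
# a real call needs a valid Page ID, metric name and access token.
from urllib.parse import urlencode

GRAPH_BASE = "https://graph.facebook.com"

def insights_url(page_id, metric, access_token, period="day"):
    """Return the request URL for one Insights metric on one Page."""
    params = urlencode({
        "metric": metric,
        "period": period,          # e.g. "day", "week", "days_28"
        "access_token": access_token,
    })
    return f"{GRAPH_BASE}/{page_id}/insights?{params}"

# The resulting URL would be fetched with any HTTP client and returns JSON.
print(insights_url("PAGE_ID", "page_impressions", "ACCESS_TOKEN"))
```

Fetching that URL with an HTTP client returns a JSON payload of metric values, which can then be aggregated by day part or week part to answer the posting-time questions above.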
21st May to 1st June 2012