Wednesday 19th November, 2014
2:30pm to 3:30pm
So, you're building responsive and resilient applications, scaling to deal with an ever-expanding firehose of events arriving at your front door. You're filling storage by the terabyte without even trying, and that storage needs to be resilient, responsive, and scalable too. So obviously you're storing your data using... well... what? Is there really a single technology that meets all your needs for persistence? And are the 'conventional' technologies really a lost cause?
In this talk we'll look at some of the successful - and less successful - strategies for managing high-frequency, high-volume data. We will explore what is technically possible when you need to durably record millions of messages per second without a bottomless budget, review the common storage options and what they are capable of, and look at what becomes possible when you're willing to roll up your sleeves and write your own storage engine.
High Performance Specialist
Andrew Stewart currently leads the business analysis team at LMAX. He has spent more than 20 years in software development, in organisations ranging from three-person start-ups to global consultancy firms, on engagements spanning banking, insurance, pharmaceuticals, utilities, oil & gas, telecommunications, gambling and, most recently, foreign exchange trading systems. The bulk of his experience is in the data, modelling and MI space, specialising in high-volume, high-performance and inevitably high-technology-risk projects.