In the contemporary world, petabytes of information pour at us from everywhere. We build Big Data pipelines to try to make sense of it all. But how do we make sure we can ingest data into our systems at the rate it arrives?
With AWS Kinesis the answer is simple. In this talk you will learn how to use Kinesis as an elastic and reliable buffer for your incoming Big Data streams. You will see an example architecture of a real-life system capable of storing thousands of events per second, learn about the similarities and differences between Kinesis and Kafka, and, of course, enjoy a live demo.
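As a rough sketch of what the producer side of such a buffer looks like (the event shape and the `user_id` partition key are illustrative assumptions, not details from the talk), records are serialized and pushed to a stream with the Kinesis `PutRecords` API, where the partition key decides which shard absorbs each record:

```python
import json


def build_records(events):
    """Serialize events into the record shape Kinesis expects.

    Each record carries a Data blob (up to 1 MB) and a PartitionKey,
    which determines the shard the record lands on.
    """
    return [
        {
            "Data": json.dumps(event).encode("utf-8"),
            # Illustrative choice: partition by user_id to spread
            # load across shards while keeping per-user ordering.
            "PartitionKey": str(event["user_id"]),
        }
        for event in events
    ]


def send_batch(stream_name, events):
    """Send records in chunks of 500, the PutRecords per-call limit."""
    import boto3  # needs AWS credentials and network access at runtime

    client = boto3.client("kinesis")
    records = build_records(events)
    for i in range(0, len(records), 500):
        client.put_records(StreamName=stream_name, Records=records[i:i + 500])


events = [
    {"user_id": 1, "action": "click"},
    {"user_id": 2, "action": "view"},
]
records = build_records(events)
print(len(records), records[0]["PartitionKey"])
```

Because each shard accepts a fixed write rate, scaling ingest is a matter of adding shards and choosing a partition key with enough cardinality to spread records evenly across them.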
The talk will be most useful to anyone who is interested in the design of Big Data pipelines and is new to AWS Kinesis.