Lessons learned handling 150M events with a serverless pipeline on AWS
2021-11-20 21:20 ~ 2021-11-20 21:40
In recent years, the amount of data generated by brands has increased dramatically, thanks to affordable storage costs and faster internet connections. In this article, we explore the advantages serverless technologies offer when dealing with large amounts of data, and the common pitfalls of these architectures.
We are going to outline tips everyone should know before starting their next big data project. At Neosperience, building our SaaS cloud on AWS, we have managed to leverage a number of AWS services.
This talk is a deep dive into the choices we made, and the reasons behind them, that led us to evolve a standard API Gateway + Lambda + DynamoDB pipeline into an architecture able to process hundreds of events per second.
In this journey, we’ll discover some unexpected behavior, tips, and hidden gems of AWS services and how to use them in a real-life use case. Basic knowledge of AWS services is required.
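For context on the baseline mentioned above, here is a minimal sketch of what such a standard pipeline could look like using the AWS CDK in TypeScript: an API Gateway REST endpoint invoking a Lambda function that writes events to a DynamoDB table. This is an illustration, not the actual Neosperience stack; the construct names, the `lambda/ingest` asset path, and the `eventId` key are hypothetical.

```typescript
// Minimal sketch (AWS CDK v2) of a baseline serverless pipeline:
// API Gateway -> Lambda -> DynamoDB. Names and paths are hypothetical.
import { Stack, StackProps, RemovalPolicy } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as apigateway from 'aws-cdk-lib/aws-apigateway';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as dynamodb from 'aws-cdk-lib/aws-dynamodb';

export class EventsPipelineStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Table keyed by event id; on-demand billing absorbs spiky traffic.
    const table = new dynamodb.Table(this, 'EventsTable', {
      partitionKey: { name: 'eventId', type: dynamodb.AttributeType.STRING },
      billingMode: dynamodb.BillingMode.PAY_PER_REQUEST,
      removalPolicy: RemovalPolicy.DESTROY, // demo only
    });

    // Lambda that receives the incoming event and writes it to the table.
    const ingestFn = new lambda.Function(this, 'IngestFunction', {
      runtime: lambda.Runtime.NODEJS_14_X,
      handler: 'index.handler',
      code: lambda.Code.fromAsset('lambda/ingest'), // hypothetical path
      environment: { TABLE_NAME: table.tableName },
    });
    table.grantWriteData(ingestFn);

    // REST API with a single POST /events route proxied to the Lambda.
    const api = new apigateway.RestApi(this, 'EventsApi');
    api.root
      .addResource('events')
      .addMethod('POST', new apigateway.LambdaIntegration(ingestFn));
  }
}
```

A pipeline like this only scales so far; the talk covers how to evolve it as per-second event volume grows.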
I’m Luca Bianchi, passionate about serverless and machine learning. I serve as CTO at Neosperience, and I am the founder of ServerlessMeetup Italy and an organizer of ServerlessDays Milano.
SNS Information
- Twitter : @bianchiluca
- GitHub : @aletheia
- LinkedIn : lucabianchipavia
AWS Community Programs
- AWS Hero
- AWS Community Builder
Organization
- Neosperience
Presentation Movie (YouTube)
Presentation Materials