Efficiently write 10B+ events/day using Kafka

DevAfterWork powered by CrowdStrike

Processing hundreds of thousands of events per second and optimizing for both availability and cost is not an easy task.

This talk will showcase how we used a data-driven approach to leverage Kafka and build a generic batching framework that enabled us to solve a real-world use case in one of our products.

3 reasons why you should participate in the keynote

  • Learn how to use Kafka as a transaction log in the context of distributed systems.
  • Understand how to optimize database writes using batching (see the sketch after this list).
  • Learn how to design reusable components in a microservice ecosystem.
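
To make the batching idea concrete, here is a minimal sketch using the official Kafka Java consumer client. It is not the framework presented in the talk: the topic name "events", the size and age thresholds, and the writeBatch helper are placeholder assumptions standing in for whatever the real product uses. The point it illustrates is that offsets are committed only after a batch has been written, so Kafka acts as the durable log that lets the writer replay events after a failure, while the database sees one bulk write instead of thousands of single inserts.

    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    import java.time.Duration;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.Properties;

    public class BatchingWriter {

        private static final int MAX_BATCH_SIZE = 5_000;    // flush when this many events are buffered
        private static final long MAX_BATCH_AGE_MS = 2_000; // ...or when the oldest buffered event is this old

        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "db-batch-writer");
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            // Commit offsets manually, only after the batch is persisted,
            // so Kafka serves as the transaction log for the writer.
            props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("events")); // hypothetical topic name

                List<String> batch = new ArrayList<>();
                long batchStartedAt = System.currentTimeMillis();

                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(200));
                    if (batch.isEmpty() && !records.isEmpty()) {
                        batchStartedAt = System.currentTimeMillis(); // age is measured from the first buffered event
                    }
                    for (ConsumerRecord<String, String> record : records) {
                        batch.add(record.value());
                    }

                    boolean sizeReached = batch.size() >= MAX_BATCH_SIZE;
                    boolean ageReached = !batch.isEmpty()
                            && System.currentTimeMillis() - batchStartedAt >= MAX_BATCH_AGE_MS;

                    if (sizeReached || ageReached) {
                        writeBatch(batch);     // one bulk write instead of one insert per event
                        consumer.commitSync(); // offsets advance only after the write succeeds
                        batch.clear();
                    }
                }
            }
        }

        // Placeholder for the actual bulk insert (e.g. a multi-row INSERT or a COPY-style load).
        private static void writeBatch(List<String> events) {
            System.out.printf("writing %d events in one database call%n", events.size());
        }
    }

Tuning the two thresholds (batch size versus maximum age) is where the data-driven part comes in: larger batches cut database cost, while the age cap bounds write latency.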


