Businesses value data more and more, and the key to unlocking its potential is often said to lie in Big Data processing. However, Big Data needs input. How do you cover the critical business processes of a system with data-gathering points? How do you make sure data can be easily processed after it is generated? How do you manage changes to the data being produced? One possible answer is designing event-driven systems.
During this workshop you will learn:
· how to design events
· how to communicate using events
· what the pitfalls are when writing services that communicate via events
· how to create and enforce contracts for data
All this while building a simple event-based system with Kafka.
1. Feeding Big Data: event-driven architecture.
- Events: introduction. Are microservices obligatory?
- Designing meaningful events.
- Kafka as a message bus.
- Producing and consuming events - good practices.
- From Kafka to data store.
- Contracts and schemas - how to manage changes.
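To give a flavour of what the agenda builds toward, here is a minimal sketch in Java of a versioned event and a trivial in-memory bus standing in for Kafka. All names (OrderPlaced, InMemoryBus) are illustrative only and not part of the workshop materials; a real system would publish to a Kafka topic and manage the schema in a registry.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// An event carries a schema version so consumers can detect and handle
// contract changes - one of the topics covered in the workshop.
record OrderPlaced(int schemaVersion, String orderId, long amountCents) {}

// A stand-in for a message bus: every subscriber receives every published event.
class InMemoryBus {
    private final List<Consumer<OrderPlaced>> subscribers = new ArrayList<>();

    void subscribe(Consumer<OrderPlaced> subscriber) {
        subscribers.add(subscriber);
    }

    void publish(OrderPlaced event) {
        subscribers.forEach(s -> s.accept(event));
    }
}

public class EventSketch {
    public static void main(String[] args) {
        InMemoryBus bus = new InMemoryBus();
        List<String> received = new ArrayList<>();
        bus.subscribe(e -> received.add(e.orderId()));
        bus.publish(new OrderPlaced(1, "order-42", 1999));
        System.out.println(received); // prints [order-42]
    }
}
```

The same shape - produce an immutable, versioned event, let independent consumers react to it - carries over directly when the bus is replaced by Kafka.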
The workshop will run for 8 hours from 9AM until 5PM. There will be a few coffee breaks and one 1-hour lunch break (on your own).
Prerequisites:
- Solid knowledge of Java
- Laptop with preinstalled tools: IDE, Git, Docker 1.10+ with Docker Compose
Trainer - Adam Dubiel
Adam is a developer, team leader and product owner. Successful deployments and satisfied customers are what he finds most rewarding in his line of work. He loves programming for the power to create usable software out of nothing and to see the effects immediately. He values working code and hard numbers over declarations and promises. He currently works at allegro.pl, where he leads a team responsible for internal technical services.
Ticket price includes:
- Full-day workshop
- Coffee & Tea
- Wi-Fi access
- Workshop attendance certificate
What makes us different?
- Over 4000 participants
- 98% satisfied clients
- 9 years' experience
- Unique offer of over 200 specialised training courses
- Over 100 active coaches and consultants