Apache Kafka® x ClickHouse Kuala Lumpur Meetup
Details
Hello everyone! Join us for an IN PERSON Apache Kafka® x ClickHouse meetup on December 11th from 5:30 PM in Kuala Lumpur!
📍 Venue:
WORQ @ Glo Damansara
***
🗓 Agenda:
- 5:30 PM: Registration, dinner & networking
- 6:30 PM: Welcome and Introductions
- 6:40 PM: Derek Chia, Principal Support Engineer @ ClickHouse
- 7:00 PM: Karthikayan Muthuramalingam, Solutions Engineer, Confluent
- 7:20 PM: Talk 3
- 7:50 PM: Q&A, networking & light dinner
- 8:30 PM: Wrap-up
If you would like to speak at or host a meetup, please let us know: community@confluent.io
***
💡 Speaker:
Derek Chia, Principal Support Engineer @ ClickHouse
Talk:
Maximising Analytics with ClickHouse and Kafka Integration
Abstract:
ClickHouse, a high-performance columnar database, excels in handling large-scale data inserts and fast analytical querying, making it ideal for building real-time reporting solutions. Kafka, a distributed streaming platform, complements ClickHouse by providing high-throughput data ingestion, message ordering, and fault tolerance. It acts as a data pipeline, efficiently handling real-time data streams between systems.
By integrating ClickHouse and Kafka, you create a powerful, scalable analytics solution that combines real-time data ingestion with fast, efficient querying. In this talk, we'll explore how these two tools work together to build high-performance analytics services, and show a live demo of how to set up ClickPipes for Kafka.
Bio:
With over 6 years of experience in software engineering and tech, Derek is a Principal Support Engineer at ClickHouse, where he provides technical assistance, triage, and swift resolution to customers and users globally. Prior to ClickHouse, Derek served in engineering roles at DSTA, AI Apprentice, and Ernst & Young. Derek graduated from the National University of Singapore with a degree in Information Systems, and is a certified HashiCorp Associate and an AWS Machine Learning Specialist.
-----
💡 Speaker:
Karthikayan Muthuramalingam, Solutions Engineer, Confluent
Talk:
Apache Kafka® 101
Abstract:
Apache Kafka is an event streaming platform used to collect, process, store, and integrate data at scale. It has numerous use cases including distributed logging, stream processing, data integration, and pub/sub messaging.
To make complete sense of what Kafka does, we'll explore what an "event streaming platform" is and how it works. Before diving into Kafka's architecture or its core components, we'll first discuss what an event is. This will help explain how Kafka stores events, how to get events in and out of the system, and how to analyze event streams.
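The core idea the abstract builds on is that Kafka stores events in an ordered, append-only log. A toy sketch of that idea in Python (this is an illustrative model, not Kafka's actual API; the `Event` fields and `produce` helper are hypothetical):

```python
from dataclasses import dataclass, field
import time

@dataclass
class Event:
    """A simplified event record: a key, a value, and when it happened."""
    key: str
    value: dict
    timestamp_ms: int = field(default_factory=lambda: int(time.time() * 1000))

# An ordered, append-only list stands in for one topic partition.
log: list[Event] = []

def produce(event: Event) -> int:
    """Append the event to the log and return its offset."""
    log.append(event)
    return len(log) - 1

# Events are never modified in place; consumers read from an offset onward.
offset = produce(Event(key="user-42", value={"action": "login"}))
```

Real Kafka adds partitioning, replication, and retention on top of this log abstraction, which is what the talk covers.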
Bio:
Karthikayan is a Solutions Engineer at Confluent, specializing in Apache Kafka® and Confluent Platform. He leverages his expertise in IBM, Red Hat, and open-source technologies to help organizations build real-time data pipelines and derive actionable insights.
***
DISCLAIMER
BY ATTENDING THIS EVENT IN PERSON, you acknowledge that risk includes possible exposure to and illness from infectious diseases including COVID-19, and accept responsibility for this, if it occurs.
As the classroom is a mask-on setting, please be reminded that masks should be worn at all times unless you are actively eating or drinking.
NOTE: We are unable to cater for any attendees under the age of 18.