- ClickHouse + Confluent Kuala Lumpur Meetup | WORQ @ Glo Damansara, Kuala Lumpur
Calling all data enthusiasts in Kuala Lumpur: ClickHouse and Confluent are headed to your city! Join us for an evening of sharing, learning, and connecting with fellow data professionals and enthusiasts around use cases where real-time data analytics comes alive.
Agenda
- 5:30 PM: Registration & networking
- 6:30 PM: Welcome and Introductions
- 6:40 PM: Maximising Analytics with ClickHouse and Kafka Integration
- 7:00 PM: Apache Kafka® 101
- 7:20 PM: Talk track 3
- 7:50 PM: Q&A, networking & light dinner
- 8:30 PM: Wrap-up
👉🏼 RSVP to secure your spot!
______________________________________________
🎤 Session Details: Maximising Analytics with ClickHouse and Kafka Integration
ClickHouse, a high-performance columnar database, excels at handling large-scale data inserts and fast analytical queries, making it ideal for building real-time reporting solutions. Kafka, a distributed streaming platform, complements ClickHouse by providing high-throughput data ingestion, message ordering, and fault tolerance. It acts as a data pipeline, efficiently handling real-time data streams between systems.

By integrating ClickHouse and Kafka, you create a powerful, scalable analytics solution that combines real-time data ingestion with fast, efficient querying. In this talk, we'll explore how these two tools work together to build high-performance analytics services and show a live demo of how to set up ClickPipes for Kafka.
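To give a rough feel for the pattern the talk covers, here is a minimal Python sketch that consumes JSON events from a Kafka topic and batch-inserts them into a ClickHouse table. This is an illustration only, not the ClickPipes setup demoed in the session; the broker address, topic and table name (`page_views`), schema, and credentials are all hypothetical placeholders, and the `confluent-kafka` and `clickhouse-connect` client libraries are assumed to be installed.

```python
# Minimal sketch: Kafka -> ClickHouse ingestion loop.
# Assumptions: local Kafka broker, local ClickHouse server with default credentials,
# hypothetical topic/table "page_views"; pip install confluent-kafka clickhouse-connect
import json
from datetime import datetime

import clickhouse_connect
from confluent_kafka import Consumer

ch = clickhouse_connect.get_client(host="localhost")
ch.command("""
    CREATE TABLE IF NOT EXISTS page_views (
        ts DateTime,
        user_id UInt64,
        url String
    ) ENGINE = MergeTree ORDER BY ts
""")

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "clickhouse-loader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["page_views"])

batch = []
try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        batch.append([datetime.fromisoformat(event["ts"]), event["user_id"], event["url"]])
        # ClickHouse favours fewer, larger inserts, so flush in batches.
        if len(batch) >= 1000:
            ch.insert("page_views", batch, column_names=["ts", "user_id", "url"])
            batch.clear()
finally:
    consumer.close()
```

In practice, a managed pipeline such as ClickPipes handles this consume-and-batch loop for you; the sketch above just shows the moving parts the session will walk through.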
Speaker: Derek Chia, Principal Support Engineer @ ClickHouse
With over 6 years of experience in software engineering and tech, Derek provides technical assistance, triage, and swift resolution to ClickHouse customers and users worldwide. Prior to ClickHouse, Derek served in engineering roles at DSTA, AI Apprentice, and Ernst & Young. He graduated from the National University of Singapore with a degree in Information Systems and is a certified HashiCorp Associate and an AWS Machine Learning Specialist.
______________________________________________
🎤 Session Details: Apache Kafka® 101
Apache Kafka is an event streaming platform used to collect, process, store, and integrate data at scale. It has numerous use cases, including distributed logging, stream processing, data integration, and pub/sub messaging.

To make complete sense of what Kafka does, we'll look at what an "event streaming platform" is and how it works. So before diving into Kafka's architecture or its core components, let's discuss what an event is. This will help explain how Kafka stores events, how to get events in and out of the system, and how to analyze event streams.
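To make the idea of an "event" concrete ahead of the session, here is a small Python sketch using the `confluent-kafka` client that produces one keyed event and reads it back. The broker address and topic name (`orders`) are hypothetical placeholders; the session itself goes well beyond this.

```python
# Small sketch of an event as a keyed record on a Kafka topic.
# Assumptions: local broker at localhost:9092, hypothetical topic "orders";
# pip install confluent-kafka
from confluent_kafka import Consumer, Producer

# An event pairs a key (which entity it concerns) with a value (what happened).
producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce("orders", key="order-42", value='{"status": "created"}')
producer.flush()  # block until the broker acknowledges the event

# A consumer reads events from the topic in order, tracking its position via offsets.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "kafka-101-demo",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])
msg = consumer.poll(10.0)
if msg is not None and not msg.error():
    print(msg.key(), msg.value(), msg.timestamp())  # key, payload, and event time
consumer.close()
```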
Speaker: Karthikayan Muthuramalingam, Solutions Engineer, Confluent
Karthikayan is a Solutions Engineer at Confluent, specializing in Apache Kafka® and Confluent Platform. He leverages his expertise in IBM, Red Hat, and open-source technologies to help organizations build real-time data pipelines and derive actionable insights.
______________________________________________
Interested in sharing a talk at this meetup or future events? Complete this CFP form or email [email protected]; we'll be in touch.