Viktor Gamov <viktor@confluent.io>, Sandon Jackobs <sjacobs@confluent.io>
v1.0, 2025-02-17
:toc:

Deadlines were looming, and the team needed a breakthrough.

And then you learned about Apache Flink.

We will explore how the *DataStream API* allowed efficient real-time data processing and how the *Table API* and *SQL* features simplified complex queries with familiar syntax.
Testing became more straightforward, and managing the application state was no longer a headache.

You’ll learn:

- *Harnessing the DataStream API* – Process unbounded data streams efficiently to make your applications more responsive.
- *Unlocking Table API and SQL* – Use SQL queries within Flink to simplify data processing tasks without learning new languages.
- *Effective Testing Strategies* – Implement best practices for testing Flink applications to ensure your code is robust and reliable.
- *Stateful Stream Processing* – Manage application state effectively for complex event processing and real-time analytics.

By the end of this talk, you will be equipped to tackle real-time data challenges.
Whether you're building analytics dashboards, event-driven systems, or handling data streams, Apache Flink can be the game-changer you’ve been searching for.

== 💻 Technical Prerequisites

. *Basic Programming Knowledge* – Familiarity with **Java** or **Scala** (Flink supports both, though the Scala APIs are deprecated).
. *Understanding of Stream Processing Concepts* – Awareness of real-time data pipelines, event-driven architectures, and streaming frameworks.
. *SQL Proficiency* – Basic understanding of SQL for working with Flink’s **Table API**.
. *Linux/macOS Command Line Experience* – Ability to execute terminal commands and navigate a Unix-like environment.

== 🔧 Required Software and Setup

=== 1️⃣ Docker & Docker Compose (Mandatory)

- *Why?* Flink and Kafka components will be containerized.
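A minimal local Flink cluster can be sketched with Docker Compose along these lines (the image tag, ports, and service names here are assumptions for illustration — the workshop will provide its own Compose file):

```yaml
services:
  jobmanager:
    image: flink:1.20          # assumed version tag
    command: jobmanager
    ports:
      - "8081:8081"            # Flink web UI
    environment:
      - JOB_MANAGER_RPC_ADDRESS=jobmanager
  taskmanager:
    image: flink:1.20
    command: taskmanager
    depends_on:
      - jobmanager
    environment:
      - JOB_MANAGER_RPC_ADDRESS=jobmanager
```

Running `docker compose up` against a file like this gives you a JobManager with the web UI on port 8081 and one TaskManager to execute jobs.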

=== 2️⃣ Basic Understanding of Terraform and IaC (Infrastructure as Code)

- *Why?* If Terraform scripts are used in the workshop, a basic understanding of how Terraform provisions infrastructure will be helpful.
- *Terraform Getting Started Guide:* https://developer.hashicorp.com/terraform/tutorials[Terraform Tutorials]
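As a flavor of what such scripts do, a minimal Confluent provider configuration might look like the following (the resource and names are illustrative assumptions, not the workshop's actual Terraform):

```hcl
terraform {
  required_providers {
    confluent = {
      source = "confluentinc/confluent"
    }
  }
}

# Illustrative only: provisions a Confluent Cloud environment for the workshop.
resource "confluent_environment" "workshop" {
  display_name = "flink-workshop"
}
```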

=== 3️⃣ Confluent CLI

- *Why?* The workshop uses Confluent CLI commands to inspect the newly provisioned Confluent Cloud infrastructure.
- *Download and Install:* https://docs.confluent.io/confluent-cli/current/install.html[Confluent CLI Installation Instructions]

=== 4️⃣ jq

- *Why?* The workshop uses jq to build the configuration files that demonstrate the Confluent Flink Table API.
- *Download and Install:* https://jqlang.org/download/[jq Download Instructions]
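A command of the following shape is the kind of thing jq is used for here — assembling a JSON config file from individual values. The property keys, endpoint, and file name are illustrative assumptions, not the workshop's actual configuration:

```shell
# Hypothetical sketch: build a JSON config file for a Flink Table API client.
# FLINK_API_KEY / FLINK_API_SECRET are placeholder variable names.
jq -n \
  --arg key "$FLINK_API_KEY" \
  --arg secret "$FLINK_API_SECRET" \
  --arg endpoint "https://flink.us-east1.gcp.confluent.cloud" \
  '{
     "client.flink-api-key": $key,
     "client.flink-api-secret": $secret,
     "client.rest-endpoint": $endpoint
   }' > table-api-config.json
```

`jq -n` starts from a null input, and `--arg` injects shell values as properly escaped JSON strings, which avoids hand-quoting mistakes.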

== 📌 Pre-Workshop Setup Tasks

. *Sign up for Confluent Cloud & Configure API Keys* – Ensure access credentials are available before the workshop.
. *Clone the Workshop Repository* – The repo will include pre-built examples and configuration files (GitHub link will be shared before the workshop).
. *Set Up Environment Variables* – Configure `JAVA_HOME` and authentication variables for Confluent Cloud.
. *Run a Simple Docker-Based Flink Job* – Validate that the environment is correctly configured.
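The environment-variable step can look like the following in a shell profile or session setup script. All values, and every variable name other than `JAVA_HOME`, are placeholders — use the names the workshop repository actually expects:

```shell
# Placeholder values -- substitute your own paths and credentials.
export JAVA_HOME="/usr/lib/jvm/java-21"           # example path
export PATH="$JAVA_HOME/bin:$PATH"
export CONFLUENT_CLOUD_API_KEY="your-api-key"     # assumed variable name
export CONFLUENT_CLOUD_API_SECRET="your-secret"   # assumed variable name

# Fail fast if anything is missing before the workshop starts.
: "${JAVA_HOME:?JAVA_HOME must be set}"
: "${CONFLUENT_CLOUD_API_KEY:?API key must be set}"
```

The `: "${VAR:?message}"` idiom makes the script exit with the given message if a required variable is unset, so misconfiguration surfaces immediately rather than mid-exercise.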