
Commit 1eb7b2d

migrated to the latest shadowJar plugin; fixed Gradle warnings

1 parent b7022de

File tree (5 files changed: +109 -85 lines)

- README.adoc
- build.gradle.kts
- common/utils/build.gradle.kts
- common/utils/src/main/java/io/confluent/developer/utils/ConfigUtils.java
- flink-data-generator/build.gradle.kts

README.adoc

Lines changed: 60 additions & 47 deletions
@@ -1,5 +1,5 @@
 = Apache Flink for Java Developers Workshop
-Viktor Gamov <viktor@confluent.io>
+Viktor Gamov <viktor@confluent.io>, Sandon Jackobs <sjacobs@confluent.io>
 v1.0, 2025-02-17
 :toc:

@@ -13,87 +13,100 @@ Deadlines were looming, and the team needed a breakthrough.
 
 And when you learned about Apache Flink.
 
-We’ll explore how the DataStream API allowed efficient real-time data processing and how the Table API and SQL features simplified complex queries with familiar syntax.
+We will explore how the *DataStream API* allowed efficient real-time data processing and how the *Table API* and *SQL* features simplified complex queries with familiar syntax.
 Testing became more straightforward, and managing the application state was no longer a headache.
 
 You’ll learn:
 
-Harnessing the DataStream API: Process unbounded data streams efficiently to make your applications more responsive.
-Unlocking Table API and SQL: Use SQL queries within Flink to simplify data processing tasks without learning new languages.
-Effective Testing Strategies: Implement best practices for testing Flink applications to ensure your code is robust and reliable.
-Stateful Stream Processing: Manage application state effectively for complex event processing and real-time analytics.
+- Harnessing the DataStream API: Process unbounded data streams efficiently to make your applications more responsive.
+- Unlocking Table API and SQL: Use SQL queries within Flink to simplify data processing tasks without learning new languages.
+- Effective Testing Strategies: Implement best practices for testing Flink applications to ensure your code is robust and reliable.
+- Stateful Stream Processing: Manage application state effectively for complex event processing and real-time analytics.
 
-By the end of this talk, you’ll be equipped to tackle real-time data challenges.
-Whether you are building analytics dashboards, event-driven systems, or handling data streams, Apache Flink can be the game-changer you’ve been searching for.
+By the end of this talk, you will be equipped to tackle real-time data challenges.
+Whether you're building analytics dashboards, event-driven systems, or handling data streams, Apache Flink can be the game-changer you’ve been searching for.
 
 == 💻 Technical Prerequisites
-1. *Basic Programming Knowledge* – Familiarity with **Java** or **Scala** (ugh) (Flink supports both but not for long).
-2. *Understanding of Stream Processing Concepts* – Awareness of real-time data pipelines, event-driven architectures, and streaming frameworks.
-3. *SQL Proficiency* – Basic understanding of SQL for working with Flink’s **Table API**.
-4. *Linux/macOS Command Line Experience* – Ability to execute terminal commands and navigate a Unix-like environment.
+
+. *Basic Programming Knowledge* – Familiarity with **Java** or **Scala** (ugh) (Flink supports both but not for long).
+. *Understanding of Stream Processing Concepts* – Awareness of real-time data pipelines, event-driven architectures, and streaming frameworks.
+. *SQL Proficiency* – Basic understanding of SQL for working with Flink’s **Table API**.
+. *Linux/macOS Command Line Experience* – Ability to execute terminal commands and navigate a Unix-like environment.
 
 == 🔧 Required Software and Setup
 
 === 1️⃣ Docker & Docker Compose (Mandatory)
-*Why?* Flink and Kafka components will be containerized.
-*Download & Install:* https://www.docker.com/get-started[Docker Website]
-*macOS Alternative:* https://orbstack.dev/[OrbStack] (recommended for better performance)
+
+- *Why?* Flink and Kafka components will be containerized.
+- *Download & Install:* https://www.docker.com/get-started[Docker Website]
+- *macOS Alternative:* https://orbstack.dev/[OrbStack] (recommended for better performance)
 
 === 2️⃣ Confluent Cloud Account (Mandatory)
-*Why?* Required for Kafka-based streaming exercises.
-*Sign Up & Get API Keys:* https://www.confluent.io/confluent-cloud/[Confluent Cloud]
+
+- *Why?* Required for Kafka-based streaming exercises.
+- *Sign Up & Get API Keys:* https://www.confluent.io/confluent-cloud/[Confluent Cloud]
 
 === 3️⃣ Java 21 Installed
-*Why?* Apache Flink will run on Java 21.
-*Download Java 21 (Adoptium Temurin):* https://adoptium.net/[Adoptium Temurin]
-*Verify Installation:* Run `java -version` in the terminal.
+
+- *Why?* Apache Flink will run on Java 21.
+- *Download Java 21 via SDKMAN:* https://sdkman.io[SDKMAN]
+- *Verify Installation:* Run `java -version` in the terminal.
 
 === 4️⃣ Git Installed
-*Why?* For cloning repositories and working with project files.
-*Download Git:* https://git-scm.com/downloads[Git Website]
 
-=== 5️⃣ IDE with Flink Support
-*Why?* Recommended for Flink development.
-*Download IntelliJ IDEA (Recommended):* https://www.jetbrains.com/idea/download/[IntelliJ IDEA]
-*Download VS Code (Alternative):* https://code.visualstudio.com/download[VS Code]
+- *Why?* For cloning repositories and working with project files.
+- *Download Git:* https://git-scm.com/downloads[Git Website]
+
+=== 5️⃣ IDE with Gradle Support
+
+- *Why?* Recommended for Flink development.
+- *Download IntelliJ IDEA (Recommended):* https://www.jetbrains.com/idea/download/[IntelliJ IDEA]
+- *Download VS Code (Alternative):* https://code.visualstudio.com/download[VS Code]
+- *Confluent Extension for VSCode:* https://marketplace.visualstudio.com/items?itemName=confluentinc.confluent-vscode[link]
 
 === 6️⃣ Gradle Installed via Wrapper (No Need for Local Installation)
-*Why?* The workshop will use the **Gradle Wrapper**, eliminating the need for manual installation.
-*Gradle Docs:* https://docs.gradle.org/current/userguide/gradle_wrapper.html[Gradle Wrapper Documentation]
+
+- *Why?* The workshop will use the **Gradle Wrapper**, eliminating the need for manual installation.
+- *Gradle Docs:* https://docs.gradle.org/current/userguide/gradle_wrapper.html[Gradle Wrapper Documentation]
 
 === 7️⃣ Quick Setup for macOS Users with Homebrew
-*Why?* Simplifies installation of all required dependencies.
-*How?* This project includes a `Brewfile` for managing dependencies.
-*Setup:* Run `make setup-mac` to install all required dependencies using Homebrew.
-*Update:* Run `make update-brew-deps` to update dependencies.
+
+- *Why?* Simplifies the installation of all required dependencies.
+- *How?* This project includes a `Brewfile` for managing dependencies.
+- *Setup:* Run `make setup-mac` to install all required dependencies using Homebrew.
+- *Update:* Run `make update-brew-deps` to update dependencies.
 
 == 🌐 Network & System Requirements
 
-1. *Stable Internet Connection* – Required for downloading dependencies and connecting to Confluent Cloud.
-2. *8GB+ RAM Recommended* – Running Flink, Kafka, and other services may require significant memory.
-3. *Sufficient Disk Space (At Least 10GB Free)* – For Docker images, logs, and data processing.
+. *Stable Internet Connection* – Required for downloading dependencies and connecting to Confluent Cloud.
+. *8GB+ RAM Recommended* – Running Flink and other services may require significant memory.
+. *Sufficient Disk Space (At Least 10GB Free)* – For Docker images, logs, and data processing.
 
 == ⚙️ Optional but Recommended
 
 === 1️⃣ Terraform Installed
-*Why?* Useful for automated infrastructure setup.
-*Download Terraform:* https://developer.hashicorp.com/terraform/downloads[Terraform Website]
+
+- *Why?* Useful for automated infrastructure setup in Confluent Cloud.
+- *Download Terraform:* https://developer.hashicorp.com/terraform/downloads[Terraform Website]
 
 === 2️⃣ Basic Understanding of Terraform and IaC (Infrastructure as Code)
-*Why?* If Terraform scripts are used, a fundamental knowledge of how it works would be beneficial.
-*Terraform Getting Started Guide:* https://developer.hashicorp.com/terraform/tutorials[Terraform Tutorials]
+
+- *Why?* If Terraform scripts are used, a fundamental knowledge of how it works would be beneficial.
+- *Terraform Getting Started Guide:* https://developer.hashicorp.com/terraform/tutorials[Terraform Tutorials]
 
 === 3️⃣ Confluent CLI
-*Why?* The workshop will use the commands in the Confluent CLI to get useful information about new Confluent infrastructure.
-*Download and Install:* https://docs.confluent.io/confluent-cli/current/install.html[Confluent CLI Installation Instructions]
+
+- *Why?* The workshop will use the commands in the Confluent CLI to get useful information about new Confluent infrastructure.
+- *Download and Install:* https://docs.confluent.io/confluent-cli/current/install.html[Confluent CLI Installation Instructions]
 
 === 4️⃣ jq
-*Why?* The workshop will use jq to build configuration files used to demonstrate the Confluent Flink Table API.
-*Download and Install:* https://jqlang.org/download/[jq Download Instructions]
+
+- *Why?* The workshop will use jq to build configuration files used to demonstrate the Confluent Flink Table API.
+- *Download and Install:* https://jqlang.org/download/[jq Download Instructions]
 
 == 📌 Pre-Workshop Setup Tasks
 
-1. *Sign up for Confluent Cloud & Configure API Keys* – Ensure access credentials are available before the workshop.
-2. *Clone the Workshop Repository* – The repo will include pre-built examples and configuration files (GitHub link will be shared before the workshop).
-3. *Set Up Environment Variables* – Configure `JAVA_HOME` and authentication variables for Confluent Cloud.
-4. *Run a Simple Docker-Based Flink Job* – Validate that the environment is correctly configured.
+. *Sign up for Confluent Cloud & Configure API Keys* – Ensure access credentials are available before the workshop.
+. *Clone the Workshop Repository* – The repo will include pre-built examples and configuration files (GitHub link will be shared before the workshop).
+. *Set Up Environment Variables* – Configure `JAVA_HOME` and authentication variables for Confluent Cloud.
+. *Run a Simple Docker-Based Flink Job* – Validate that the environment is correctly configured.
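
A note on item 3 (Java 21) above: the build enforces this prerequisite itself. The root build.gradle.kts in this commit (see its diff below) pins a Java 21 toolchain for every subproject, so Gradle will resolve a matching local JDK or fail fast before compiling:

    // Toolchain pin, quoted from the context lines of the build.gradle.kts diff below:
    java {
        toolchain {
            languageVersion.set(JavaLanguageVersion.of(21))
        }
    }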

build.gradle.kts

Lines changed: 22 additions & 27 deletions
@@ -1,10 +1,11 @@
 import org.gradle.api.tasks.testing.logging.TestExceptionFormat
 import org.gradle.api.tasks.testing.logging.TestLogEvent
+import com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar
 
 plugins {
     java
     application
-    id("com.github.johnrengelman.shadow") version "8.1.1" apply false
+    id("com.gradleup.shadow") version "8.3.6" apply false
 }
 
 allprojects {

@@ -21,30 +22,30 @@ allprojects {
 
 subprojects {
     apply(plugin = "java")
-
+
     val flinkVersion = "1.20.0"
     val confluentVersion = "7.9.0"
     val junitVersion = "5.10.2"
     val logbackVersion = "1.4.14"
     val slf4jVersion = "2.0.11"
-
+
     dependencies {
         // Logging
         implementation("org.slf4j:slf4j-api:$slf4jVersion")
         implementation("ch.qos.logback:logback-classic:$logbackVersion")
-
+
         // Testing
         testImplementation("org.junit.jupiter:junit-jupiter-api:$junitVersion")
         testImplementation("org.junit.jupiter:junit-jupiter-params:$junitVersion")
         testRuntimeOnly("org.junit.jupiter:junit-jupiter-engine:$junitVersion")
     }
-
+
     java {
         toolchain {
            languageVersion.set(JavaLanguageVersion.of(21))
        }
     }
-
+
     tasks.test {
         useJUnitPlatform()
         testLogging {

@@ -55,7 +56,7 @@ subprojects {
            showStackTraces = true
        }
     }
-
+
     tasks.withType<JavaCompile> {
         options.encoding = "UTF-8"
         options.compilerArgs.add("-parameters")

@@ -65,25 +66,25 @@ subprojects {
 
 // Configuration for the main application modules
 configure(subprojects.filter { it.name == "flink-streaming" || it.name == "flink-sql" }) {
     apply(plugin = "application")
-    apply(plugin = "com.github.johnrengelman.shadow")
-
+    apply(plugin = "com.gradleup.shadow")
+
     val flinkVersion = "1.20.0"
     val confluentVersion = "7.9.0"
-
+
     dependencies {
         // Common modules
         implementation(project(":common:models"))
         implementation(project(":common:utils"))
-
+
         // Flink Core
         implementation("org.apache.flink:flink-streaming-java:$flinkVersion")
         implementation("org.apache.flink:flink-clients:$flinkVersion")
         implementation("org.apache.flink:flink-runtime-web:$flinkVersion")
-
+
         // Flink Connectors
         implementation("org.apache.flink:flink-connector-kafka:3.4.0-1.20")
         implementation("org.apache.flink:flink-connector-files:$flinkVersion")
-
+
         // Confluent
         implementation("io.confluent:kafka-schema-registry-client:$confluentVersion")
         implementation("io.confluent:kafka-json-schema-serializer:$confluentVersion")

@@ -95,26 +96,26 @@ configure(subprojects.filter { it.name == "flink-streaming" || it.name == "flink
 project(":flink-sql") {
     dependencies {
         val flinkVersion = "1.20.0"
-
+
         // Flink Table API & SQL
         implementation("org.apache.flink:flink-table-api-java-bridge:$flinkVersion")
         implementation("org.apache.flink:flink-table-planner-loader:$flinkVersion")
         implementation("org.apache.flink:flink-table-runtime:$flinkVersion")
     }
-
+
     tasks.jar {
         manifest {
             attributes["Main-Class"] = "io.confluent.developer.sql.FlinkSqlMain"
         }
     }
-
-    tasks.named<com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar>("shadowJar") {
+
+    tasks.named<ShadowJar>("shadowJar") {
         archiveBaseName.set("flink-sql")
         archiveClassifier.set("")
         archiveVersion.set("")
         mergeServiceFiles()
     }
-
+
     application {
         mainClass.set("io.confluent.developer.sql.FlinkSqlMain")
     }

@@ -127,21 +128,15 @@ project(":flink-streaming") {
            attributes["Main-Class"] = "io.confluent.developer.streaming.FlinkStreamingMain"
        }
     }
-
-    tasks.named<com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar>("shadowJar") {
+
+    tasks.named<ShadowJar>("shadowJar") {
         archiveBaseName.set("flink-streaming")
         archiveClassifier.set("")
         archiveVersion.set("")
         mergeServiceFiles()
     }
-
+
     application {
         mainClass.set("io.confluent.developer.streaming.FlinkStreamingMain")
     }
 }
-
-// Gradle wrapper task
-tasks.wrapper {
-    gradleVersion = "8.5"
-    distributionType = Wrapper.DistributionType.BIN
-}
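
The paired blank -/+ lines above are whitespace-only cleanups. The substantive change is the Shadow plugin's move to its new GradleUp coordinates: the plugin ID becomes `com.gradleup.shadow`, while the `ShadowJar` task class keeps its historical `com.github.jengelman` package, which is why the new top-level import shortens the `tasks.named<ShadowJar>(...)` call sites. A minimal standalone sketch of the migrated setup, with an illustrative module and a hypothetical main class (names not from this repo):

    // build.gradle.kts: minimal sketch of the migrated Shadow setup.
    // The plugin ID moved to com.gradleup.shadow; the ShadowJar task class
    // still lives in the original com.github.jengelman package.
    import com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar

    plugins {
        application
        id("com.gradleup.shadow") version "8.3.6"
    }

    application {
        // Hypothetical main class, for illustration only.
        mainClass.set("com.example.Main")
    }

    tasks.named<ShadowJar>("shadowJar") {
        archiveBaseName.set("example-app")
        archiveClassifier.set("")  // emit example-app.jar rather than example-app-all.jar
        archiveVersion.set("")
        mergeServiceFiles()        // merge META-INF/services entries from all dependencies
    }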

common/utils/build.gradle.kts

Lines changed: 17 additions & 4 deletions
@@ -1,11 +1,24 @@
+plugins {
+    `jvm-test-suite`
+}
+
 dependencies {
     // Test dependencies
-    testImplementation("org.junit.jupiter:junit-jupiter-api:5.10.0")
-    testImplementation("org.junit.jupiter:junit-jupiter-params:5.10.0")
-    testRuntimeOnly("org.junit.jupiter:junit-jupiter-engine:5.10.0")
+    testImplementation(platform("org.junit:junit-bom:5.10.0"))
+    testImplementation("org.junit.jupiter:junit-jupiter")
+    testImplementation("org.junit.jupiter:junit-jupiter-params")
+    testRuntimeOnly("org.junit.jupiter:junit-jupiter-engine")
     testImplementation("org.assertj:assertj-core:3.24.2")
 }
 
-tasks.test {
+testing {
+    suites {
+        val test by getting(JvmTestSuite::class) {
+            useJUnitJupiter()
+        }
+    }
+}
+
+tasks.named<Test>("test") {
     useJUnitPlatform()
 }
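
Two details are worth noting in this migration: the JUnit version now flows from the `junit-bom` platform, so the individual artifacts no longer repeat `5.10.0`, and `useJUnitJupiter()` already wires the default suite's `test` task to the JUnit Platform, which makes the retained `tasks.named<Test>("test") { useJUnitPlatform() }` block redundant (though harmless). A leaner sketch that leans entirely on `jvm-test-suite`, offered as one possible simplification rather than the project's actual file:

    // build.gradle.kts: hedged sketch letting jvm-test-suite manage JUnit itself.
    plugins {
        java
        `jvm-test-suite`
    }

    repositories {
        mavenCentral()
    }

    testing {
        suites {
            val test by getting(JvmTestSuite::class) {
                // Adds the Jupiter dependencies and configures the test task
                // for the JUnit Platform; no separate useJUnitPlatform() needed.
                useJUnitJupiter("5.10.0")
                dependencies {
                    // String coordinates in suite dependencies need Gradle 7.6+.
                    implementation("org.assertj:assertj-core:3.24.2")
                }
            }
        }
    }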

common/utils/src/main/java/io/confluent/developer/utils/ConfigUtils.java

Lines changed: 6 additions & 5 deletions
@@ -3,10 +3,11 @@
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
-import java.io.File;
-import java.io.FileInputStream;
 import java.io.IOException;
 import java.io.InputStream;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
 import java.util.Properties;
 
 /**

@@ -36,9 +37,9 @@ public static Properties loadProperties(String filePath) {
         }
 
         // Then try to load from file system
-        File file = new File(filePath);
-        if (file.exists()) {
-            try (FileInputStream input = new FileInputStream(file)) {
+        Path path = Paths.get(filePath);
+        if (Files.exists(path)) {
+            try (InputStream input = Files.newInputStream(path)) {
                 properties.load(input);
                 LOG.info("Loaded properties from file system: {}", filePath);
                 return properties;
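
The swap from `java.io.File`/`FileInputStream` to `java.nio.file` is behavior-preserving but yields clearer error types (for example `NoSuchFileException`) and a single `InputStream` abstraction. For context, here is a self-contained sketch of the classpath-then-filesystem lookup this method appears to implement; the class name, the classpath branch, and the failure behavior are inferred from the surrounding code and should be treated as assumptions:

    // PropertiesLoader.java: hedged sketch of ConfigUtils.loadProperties's
    // apparent lookup order (classpath first, then file system via NIO).
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.UncheckedIOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.Properties;

    public final class PropertiesLoader {

        public static Properties loadProperties(String filePath) {
            Properties properties = new Properties();

            // First try the classpath (inferred from the "Then try ..." comment above).
            try (InputStream input =
                    PropertiesLoader.class.getClassLoader().getResourceAsStream(filePath)) {
                if (input != null) {
                    properties.load(input);
                    return properties;
                }
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }

            // Then try the file system, using NIO as in this commit.
            Path path = Paths.get(filePath);
            if (Files.exists(path)) {
                try (InputStream input = Files.newInputStream(path)) {
                    properties.load(input);
                    return properties;
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            }

            // Failure behavior is an assumption; the real method may log and return.
            throw new IllegalArgumentException("No properties found for: " + filePath);
        }
    }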

flink-data-generator/build.gradle.kts

Lines changed: 4 additions & 2 deletions
@@ -1,6 +1,8 @@
+import com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar
+
 plugins {
     application
-    id("com.github.johnrengelman.shadow")
+    id("com.gradleup.shadow")
 }
 
 val flinkVersion = "1.20.0"

@@ -35,7 +37,7 @@ tasks.jar {
     }
 }
 
-tasks.named<com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar>("shadowJar") {
+tasks.named<ShadowJar>("shadowJar") {
     archiveBaseName.set("flink-data-generator")
     archiveClassifier.set("")
     archiveVersion.set("")
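
This file gets the same migration as the root build: the new `com.gradleup.shadow` plugin ID plus a top-level `ShadowJar` import that shortens the typed task reference. With `archiveClassifier` and `archiveVersion` blanked, the fat jar should land at `build/libs/flink-data-generator.jar`, buildable through the wrapper with something like `./gradlew :flink-data-generator:shadowJar` (the module path is assumed from the file layout above).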
