**CHANGELOG.md** (4 additions & 0 deletions)

@@ -4,6 +4,10 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/), and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
**README.md** (20 additions & 4 deletions)

@@ -143,7 +143,11 @@ The service account needs to have [OrganizationAdmin](https://docs.confluent.io/
6. Make note of the API key and secret in the output, which you will assign to the `confluent_cloud_api_key` and `confluent_cloud_api_secret` environment variables in the `.env` file. Alternatively, you can securely store and retrieve these credentials using AWS Secrets Manager.
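If you opt for AWS Secrets Manager, retrieval could be sketched as follows. This is a hypothetical illustration, not this project's code: `get_secret_value` is the real boto3 Secrets Manager call, but the secret name and the JSON field names are assumptions.

```python
import json

# Hypothetical sketch: fetch the Confluent Cloud credentials from AWS
# Secrets Manager instead of hard-coding them in the .env file.
# The secret name below is an assumption for illustration:
#
#   import boto3
#   client = boto3.client("secretsmanager")
#   secret_string = client.get_secret_value(
#       SecretId="confluent/cloud-api"
#   )["SecretString"]

def parse_confluent_secret(secret_string: str) -> dict:
    """Map a SecretString (JSON) onto the two variables the .env file expects."""
    data = json.loads(secret_string)
    return {
        "confluent_cloud_api_key": data["confluent_cloud_api_key"],
        "confluent_cloud_api_secret": data["confluent_cloud_api_secret"],
    }
```

Keeping the parsing step separate from the boto3 call makes it easy to unit-test without AWS access.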
#### **1.2.2 Create the `.env` file**
-Create the `.env` file and add the following environment variables, filling them with your Confluent Cloud credentials and other required values:
+Create the `.env` file and add the following environment variables, filling them with your Confluent Cloud credentials and other required values.
+
+<details>
+<summary>Example `.env` file content</summary>
+
```shell
# Set the flag to `True` to use the Confluent Cloud API key for fetching Kafka
# credentials; otherwise, set it to `False` to reference `KAFKA_CREDENTIALS` or
```
| Environment Variable Name | Type | Description | Example | Default | Required |
@@ -277,7 +283,10 @@ Then enter the following command to run the tool:
```shell
uv run python src/thread_safe_tool.py
```
-If `USE_SAMPLE_RECORDS` environment variable is set to `True`, the tool will sample records from each topic to calculate the average record size in bytes. For example, below is a screenshot of the tool running successfully:
+If the `USE_SAMPLE_RECORDS` environment variable is set to `True`, the tool will sample records from each topic to calculate the average record size in bytes.
+
+<details>
+<summary>Example log of the tool successfully sampling records</summary>
```log
2025-10-20 07:28:22 - INFO - fetch_confluent_cloud_credential_via_env_file - Retrieving the Confluent Cloud credentials from the .env file.
@@ -394,7 +403,12 @@ If `USE_SAMPLE_RECORDS` environment variable is set to `True`, the tool will sam
2025-10-20 07:31:22 - INFO - main - SINGLE KAFKA CLUSTER ANALYSIS COMPLETED SUCCESSFULLY.
```
-If `USE_SAMPLE_RECORDS` is set to `False`, the tool will use the Confluent Cloud Metrics API to retrieve the average and peak consumption in bytes over a rolling seven-day period. For example, below is a screenshot of the tool running successfully:
+</details>
+
+If `USE_SAMPLE_RECORDS` is set to `False`, the tool will use the Confluent Cloud Metrics API to retrieve the average and peak consumption in bytes over a rolling seven-day period.
+
+<details>
+<summary>Example log of the tool successfully using the Metrics API</summary>
```log
2025-10-20 06:37:47 - INFO - fetch_confluent_cloud_credential_via_env_file - Retrieving the Confluent Cloud credentials from the .env file.
@@ -527,7 +541,9 @@ If `USE_SAMPLE_RECORDS` is set to `False`, the tool will use the Confluent Cloud
2025-10-20 06:38:48 - INFO - __log_summary_stats - ====================================================================================================
2025-10-20 06:38:49 - INFO - _analyze_kafka_cluster - Kafka API key RMW7B3RB4J4WWXEE for Kafka Cluster lkc-5py812 deleted successfully.
-2025-10-20 06:38:49 - INFO - main - SINGLE KAFKA CLUSTER ANALYSIS COMPLETED SUCCESSFULLY.```
+2025-10-20 06:38:49 - INFO - main - SINGLE KAFKA CLUSTER ANALYSIS COMPLETED SUCCESSFULLY.
+```
+
+</details>
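The record-sampling mode described above can be sketched as a minimal helper. This is a hypothetical illustration of the size calculation only, not the tool's actual implementation; it assumes the sampled record payloads are available as `bytes`.

```python
def average_record_size(samples: list[bytes]) -> float:
    """Average size in bytes of a list of sampled record payloads.

    A real run would fill `samples` by polling each topic with a Kafka
    consumer and collecting the raw message payloads; here we show only
    the averaging step itself.
    """
    if not samples:
        return 0.0  # avoid division by zero when a topic yields no samples
    return sum(len(record) for record in samples) / len(samples)

print(average_record_size([b"ab", b"abcd"]))  # 3.0
```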
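The Metrics API mode boils down to a query like the one sketched below. The metric and field names (`io.confluent.kafka.server/received_bytes`, `resource.kafka.id`) follow the public Confluent Cloud Metrics API, but the exact query the tool issues, and the daily granularity chosen here, are assumptions for illustration.

```python
from datetime import datetime, timedelta, timezone

def build_consumption_query(cluster_id: str) -> dict:
    """Build a rolling seven-day query body for the Metrics API query
    endpoint (POST https://api.telemetry.confluent.cloud/v2/metrics/cloud/query).
    Hypothetical sketch, not the tool's actual request."""
    now = datetime.now(timezone.utc)
    start = now - timedelta(days=7)
    return {
        "aggregations": [{"metric": "io.confluent.kafka.server/received_bytes"}],
        "filter": {"field": "resource.kafka.id", "op": "EQ", "value": cluster_id},
        "granularity": "P1D",  # one data point per day
        "intervals": [f"{start.isoformat()}/{now.isoformat()}"],
        "limit": 7,
    }

query = build_consumption_query("lkc-5py812")
```

Averaging the seven daily data points gives the average consumption, and taking their maximum gives the peak.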
#### **1.3.1 Did you notice we prefix `uv run` to `python src/thread_safe_tool.py`?**
You may be asking yourself why. Well, `uv` is an incredibly fast Python package installer and dependency resolver, written in [**Rust**](https://github.blog/developer-skills/programming-languages-and-frameworks/why-rust-is-the-most-admired-language-among-developers/), and designed to seamlessly replace `pip`, `pipx`, `poetry`, `pyenv`, `twine`, `virtualenv`, and more in your workflows. By prefixing `uv run` to a command, you're ensuring that the command runs in an optimal Python environment.