
Commit be4ab6b

Merge pull request #292 from j3-signalroom/291-update-the-readmemd
Resolved #291.
2 parents: 915d121 + bd8e601

File tree

6 files changed: +26 / -6 lines changed


CHANGELOG.md

Lines changed: 4 additions & 0 deletions
```diff
@@ -4,6 +4,10 @@ All notable changes to this project will be documented in this file.
 The format is base on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/), and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
 
+## [0.12.06.000] - 2025-11-13
+### Changed
+- Issue [#291](https://github.com/j3-signalroom/kafka_cluster-topics-partition_count_recommender-tool/issues/291)
+
 ## [0.12.05.000] - 2025-10-20
 ### Changed
 - Issue [#242](https://github.com/j3-signalroom/kafka_cluster-topics-partition_count_recommender-tool/issues/242)
```

CHANGELOG.pdf

514 Bytes (binary file not shown)

README.md

Lines changed: 20 additions & 4 deletions
````diff
@@ -143,7 +143,11 @@ The service account needs to have [OrganizationAdmin](https://docs.confluent.io/
 6. Make note of the API key and secret in the output, which you will assign to the `confluent_cloud_api_key` and `confluent_cloud_api_secret` environment variables in the `.env` file. Alternatively, you can securely store and retrieve these credentials using AWS Secrets Manager.
 
 #### **1.2.2 Create the `.env` file**
-Create the `.env` file and add the following environment variables, filling them with your Confluent Cloud credentials and other required values:
+Create the `.env` file and add the following environment variables, filling them with your Confluent Cloud credentials and other required values.
+
+<details>
+<summary>Example `.env` file content</summary>
+
 ```shell
 # Set the flag to `True` to use the Confluent Cloud API key for fetching Kafka
 # credentials; otherwise, set it to `False` to reference `KAFKA_CREDENTIALS` or
@@ -210,6 +214,8 @@ KAFKA_WRITER_TOPIC_REPLICATION_FACTOR=<YOUR_KAFKA_WRITER_TOPIC_REPLICATION_FACTO
 KAFKA_WRITER_TOPIC_DATA_RETENTION_IN_DAYS=<YOUR_KAFKA_WRITER_TOPIC_DATA_RETENTION_IN_DAYS>
 ```
 
+</details>
+
 The environment variables are defined as follows:
 
 | Environment Variable Name | Type | Description | Example | Default | Required |
@@ -277,7 +283,10 @@ Then enter the following command below to run the tool:
 uv run python src/thread_safe_tool.py
 ```
 
-If `USE_SAMPLE_RECORDS` environment variable is set to `True`, the tool will sample records from each topic to calculate the average record size in bytes. For example, below is a screenshot of the tool running successfully:
+If `USE_SAMPLE_RECORDS` environment variable is set to `True`, the tool will sample records from each topic to calculate the average record size in bytes.
+
+<details>
+<summary>Example log of the tool running successfully sampling records</summary>
 
 ```log
 2025-10-20 07:28:22 - INFO - fetch_confluent_cloud_credential_via_env_file - Retrieving the Confluent Cloud credentials from the .env file.
@@ -394,7 +403,12 @@ If `USE_SAMPLE_RECORDS` environment variable is set to `True`, the tool will sam
 2025-10-20 07:31:22 - INFO - main - SINGLE KAFKA CLUSTER ANALYSIS COMPLETED SUCCESSFULLY.
 ```
 
-If `USE_SAMPLE_RECORDS` is set to `False`, the tool will use the Confluent Cloud Metrics API to retrieve the average and peak consumption in bytes over a rolling seven-day period. For example, below is a screenshot of the tool running successfully:
+</details>
+
+If `USE_SAMPLE_RECORDS` is set to `False`, the tool will use the Confluent Cloud Metrics API to retrieve the average and peak consumption in bytes over a rolling seven-day period.
+
+<details>
+<summary>Example log of the tool running successfully using the Metrics API</summary>
 
 ```log
 2025-10-20 06:37:47 - INFO - fetch_confluent_cloud_credential_via_env_file - Retrieving the Confluent Cloud credentials from the .env file.
@@ -527,7 +541,9 @@ If `USE_SAMPLE_RECORDS` is set to `False`, the tool will use the Confluent Cloud
 2025-10-20 06:38:48 - INFO - __log_summary_stats - ====================================================================================================
 2025-10-20 06:38:48 - INFO - _analyze_kafka_cluster - KAFKA CLUSTER lkc-5py812 TOPIC ANALYSIS COMPLETED SUCCESSFULLY.
 2025-10-20 06:38:49 - INFO - _analyze_kafka_cluster - Kafka API key RMW7B3RB4J4WWXEE for Kafka Cluster lkc-5py812 deleted successfully.
-2025-10-20 06:38:49 - INFO - main - SINGLE KAFKA CLUSTER ANALYSIS COMPLETED SUCCESSFULLY.```
+2025-10-20 06:38:49 - INFO - main - SINGLE KAFKA CLUSTER ANALYSIS COMPLETED SUCCESSFULLY.
+```
+</details>
 
 #### **1.3.1 Did you notice we prefix `uv run` to `python src/thread_safe_tool.py`?**
 You maybe asking yourself why. Well, `uv` is an incredibly fast Python package installer and dependency resolver, written in [**Rust**](https://github.blog/developer-skills/programming-languages-and-frameworks/why-rust-is-the-most-admired-language-among-developers/), and designed to seamlessly replace `pip`, `pipx`, `poetry`, `pyenv`, `twine`, `virtualenv`, and more in your workflows. By prefixing `uv run` to a command, you're ensuring that the command runs in an optimal Python environment.
````
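For context on the sampling mode the README changes describe: the per-topic average record size is simply the arithmetic mean of the serialized sizes of the sampled records. A minimal sketch of that calculation (illustrative only; `average_record_size_bytes` and the sample payloads are hypothetical, not the tool's actual code):

```python
def average_record_size_bytes(sampled_records: list[bytes]) -> float:
    """Return the mean size, in bytes, of a sample of serialized records."""
    if not sampled_records:
        return 0.0  # avoid division by zero when a topic yields no records
    return sum(len(record) for record in sampled_records) / len(sampled_records)

# Hypothetical sampled payloads from one topic:
sample = [b'{"id": 1}', b'{"id": 2, "name": "x"}']
print(average_record_size_bytes(sample))  # 15.5
```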

README.pdf

8.14 KB (binary file not shown)

pyproject.toml

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,6 +1,6 @@
 [project]
 name = "kafka_cluster-topics-partition_count_recommender-tool"
-version = "0.12.05.000"
+version = "0.12.06.000"
 description = "Kafka Cluster Topics Partition Count Recommender Tool"
 readme = "README.md"
 requires-python = ">=3.13"
```

uv.lock

Lines changed: 1 addition & 1 deletion
Some generated files are not rendered by default.
