From 0032a9c1c36963dc0e10c057fb50bb181851bace Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Fabiana=20=F0=9F=9A=80=20=20Campanari?= <113218619+FabianaCampanari@users.noreply.github.com>
Date: Wed, 8 Jan 2025 23:22:25 -0300
Subject: [PATCH] Update README.md
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Signed-off-by: Fabiana 🚀 Campanari <113218619+FabianaCampanari@users.noreply.github.com>
---
 README.md | 8 ++++++++
 1 file changed, 8 insertions(+)

diff --git a/README.md b/README.md
index 13794a9..e94581e 100644
--- a/README.md
+++ b/README.md
@@ -132,7 +132,15 @@ Feel free to explore, contribute, and share your insights!
 
 8. **Claude Shannon (1948)**
    * Although Shannon is primarily known for classical information theory, his definition of entropy plays a crucial role in both quantum computing and quantum information theory. Shannon's entropy measures the uncertainty of a random variable, and this concept extends to quantum systems, forming the foundation for quantum information theory.
+
+   **Formula for Shannon Entropy (used in quantum information theory):**
+   $\huge \color{DeepSkyBlue} H = -\sum p_i \log p_i$
+
+   Where:
+   - $H$ is the entropy of the system (quantifies uncertainty or information).
+   - $p_i$ represents the probability of the $i^{th}$ event or outcome.
+
 
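
A minimal Python sketch of the entropy formula added in the patch above, assuming a base-2 logarithm so the result is in bits; the helper name `shannon_entropy` is an illustrative choice and is not part of the patch:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)) of a probability distribution.

    Zero-probability outcomes contribute nothing, since p * log p -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```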