
The Evolution of Information in Philosophy and AI

Claude Shannon, often called the "father of information theory," developed a groundbreaking way to understand communication. His theory, published in the 1940s, showed how information could be transmitted efficiently, whether through telegraphs, radios, or computers. Shannon introduced the idea of entropy, which measures the uncertainty in a message: a completely random message has high entropy, while a predictable one has low entropy. His work also addressed how noise, or interference, can corrupt communication, and how redundancy can help correct the resulting errors. The formula for Shannon's entropy illustrates how the probability of each symbol contributes to the overall uncertainty, or "information," in a system. This foundational equation in information theory has broad implications in both technology and philosophy, raising questions about the nature of knowledge and reality. (Najera, Jesus. "Intro To Information Theory." Setzeus, 18 March 2020, https://www)
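In its standard form, Shannon's entropy for a source that emits symbols with probabilities $p_1, \dots, p_n$ is

$$H = -\sum_{i=1}^{n} p_i \log_2 p_i$$

measured in bits: each term weights the "surprise" of a symbol, $-\log_2 p_i$, by how often that symbol occurs, $p_i$. As a rough illustration of the high- versus low-entropy contrast above, here is a minimal Python sketch (the two message strings are invented examples) that estimates entropy from character frequencies:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Estimate entropy in bits per symbol from character frequencies."""
    counts = Counter(message)
    total = len(message)
    # Sum -p * log2(p) over every distinct symbol in the message.
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A predictable message concentrates probability on one symbol: low entropy.
print(shannon_entropy("aaaaaaaaab"))  # ~0.47 bits per symbol
# A message spread evenly over ten symbols is far less predictable.
print(shannon_entropy("abcdefghij"))  # ~3.32 bits per symbol
```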