
Information entropy calculation formula

Jun 10, 2019, 01:35 PM

Information is a very abstract concept. People often say there is a lot of information or little information, but it is difficult to say exactly how much information there is. For example, how much information does a 500,000-word Chinese book contain?


It was not until 1948 that Shannon proposed the concept of "information entropy" to solve the problem of quantitatively measuring information. C. E. Shannon borrowed the term from thermodynamics, where thermal entropy is a physical quantity expressing the degree of disorder of a molecular state. Shannon used information entropy to describe the uncertainty of an information source.

Claude Elwood Shannon, the father of information theory, was the first to use mathematical language to make the relationship between probability and information redundancy precise.

In his 1948 paper "A Mathematical Theory of Communication", Shannon pointed out that any message contains redundancy, and the amount of redundancy depends on the probability, or uncertainty, of each symbol (number, letter, or word) in the message.

Drawing on the thermodynamic concept, Shannon called the average amount of information remaining after redundancy is excluded "information entropy", and gave a mathematical expression for calculating it.
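As a rough illustration of redundancy (this example and the helper name `redundancy` are our own, not from the article; the formula 1 − H/H_max is the standard order-0 estimate), redundancy can be estimated by comparing a text's observed per-symbol entropy with the maximum entropy its alphabet allows:

```python
import math
from collections import Counter

def redundancy(text):
    """Estimate redundancy from single-symbol (order-0) statistics:
    1 - H_observed / H_max, where H_max = log2(alphabet size) is the
    entropy if all symbols were equally likely.
    Assumes the text uses at least two distinct symbols."""
    counts = Counter(text)
    total = len(text)
    # Observed per-symbol entropy in bits
    h = -sum(c / total * math.log2(c / total) for c in counts.values())
    h_max = math.log2(len(counts))
    return 1 - h / h_max

# Both symbols equally likely: no redundancy
print(round(redundancy("abababab"), 3))  # 0.0
# Highly skewed symbol frequencies: high redundancy
print(round(redundancy("aaaaaaab"), 3))  # 0.456
```

Real English text, with its uneven letter frequencies and inter-symbol dependencies, has substantially higher redundancy than this order-0 estimate captures.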

Information meaning

Modern definition

Information is the indication of matter, energy, information and its properties. [Inverse Wiener information definition]

Information is an increase in certainty. [Inverse Shannon Information Definition]

Information is a collection of things and their attribute identifiers. [2002]

Initial definition

Claude E. Shannon, one of the founders of information theory, defined information (entropy) in terms of the probability of occurrence of discrete random events.

Information entropy is a rather abstract mathematical concept; here we can think of it as a measure of how uncertain a source of information is. Information entropy and thermodynamic entropy are closely related. According to Charles H. Bennett's reinterpretation of Maxwell's demon, the destruction of information is an irreversible process, so destroying information is consistent with the second law of thermodynamics, while generating information introduces negative (thermodynamic) entropy into the system. Therefore, the sign of information entropy should be opposite to that of thermodynamic entropy.

Generally speaking, when a piece of information has a higher probability of appearing, it has been spread more widely or cited more often. From the perspective of information dissemination, then, information entropy can represent the value of information. This gives us a standard for measuring that value and a basis for reasoning about the circulation of knowledge.

Calculation formula

H(X) = E[I(xᵢ)] = E[log₂(1/P(xᵢ))] = −∑ᵢ P(xᵢ) log₂ P(xᵢ),  i = 1, 2, …, n

where X is a random variable whose possible outputs x₁, …, xₙ form the symbol set, and P(xᵢ) is the probability of output xᵢ. The greater the uncertainty of the variable, the greater its entropy, and the more information is needed to determine its value.
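The formula translates directly into code. A minimal sketch (the function name `entropy` is our own), with two classic checks: a fair coin carries exactly 1 bit of uncertainty, while a biased coin is more predictable and so carries less:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum P(x_i) * log2(P(x_i)), in bits.

    probs: probabilities of the possible outputs, summing to 1.
    Zero-probability outcomes contribute nothing, by the usual
    convention that 0 * log2(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0 — fair coin: maximum uncertainty
print(entropy([0.9, 0.1]))  # ≈ 0.469 — biased coin: more predictable
```

Using log base 2 gives the entropy in bits; natural log would give it in nats.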



