Information entropy is used to measure information. The greater the uncertainty of an event, the more information is needed to resolve it and the higher the entropy; the smaller the uncertainty, the less information is needed and the lower the entropy.
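As a minimal sketch of this idea (the helper name shannon_entropy and the coin example are my own, not from the text), Shannon entropy H(X) = -sum p(x) log2 p(x) can be computed directly, and it is largest when outcomes are most uncertain:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))    # 1.0

# A heavily biased coin is nearly certain: entropy close to 0.
print(shannon_entropy([0.99, 0.01]))  # ~0.08
```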
With respect to information entropy, entropy represents the uncertainty of an event, and the role of information is to reduce that uncertainty. The information gained from an input equals the reduction in the event's uncertainty, that is, the reduction in entropy. Entropy itself is therefore not a measure of information but a measure of event uncertainty; the decrease in entropy is the measure of information. Conversely, if some input does not reduce entropy, that input may simply be noise.
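A hedged sketch of this point, using an assumed toy scenario of my own: the information carried by an observation Y about a variable X is the entropy before observing Y minus the expected entropy afterward, H(X) - H(X|Y). A useful observation yields a positive entropy reduction; pure noise yields zero.

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed example: X is uniform over 4 outcomes (2 bits of uncertainty).
h_before = entropy([0.25, 0.25, 0.25, 0.25])

# Observing a binary signal Y splits X into two conditional distributions,
# each over 2 equally likely outcomes.
h_after = 0.5 * entropy([0.5, 0.5]) + 0.5 * entropy([0.5, 0.5])

# The information Y carries about X is the entropy reduction H(X) - H(X|Y).
print(h_before - h_after)  # 1.0 bit

# A signal independent of X leaves the conditional distribution unchanged,
# so the entropy reduction is 0 bits: the input is noise.
h_noise = 0.5 * entropy([0.25] * 4) + 0.5 * entropy([0.25] * 4)
print(h_before - h_noise)  # 0.0 bits
```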
Introduction:
The basic function of information is to eliminate people's uncertainty about things. The American founder of information theory, Claude Shannon, found that any message contains redundancy, and the amount of redundancy depends on the probability of each symbol in the message relative to the ideal (uniform) case.
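To make the redundancy claim concrete (a sketch under my own assumptions; the skewed distribution below is illustrative, not from the text), redundancy is commonly defined as R = 1 - H/H_max, where H_max = log2(N) is the entropy of a uniform distribution over N symbols:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """Redundancy R = 1 - H/H_max, where H_max = log2(N) for N symbols."""
    h_max = math.log2(len(probs))
    return 1 - entropy(probs) / h_max

# Four equiprobable symbols carry maximum entropy: no redundancy.
print(redundancy([0.25, 0.25, 0.25, 0.25]))  # 0.0

# A skewed symbol distribution (assumed for illustration) is redundant.
print(redundancy([0.7, 0.1, 0.1, 0.1]))      # ~0.32
```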
The same understanding of information entropy can be applied to games: a player's uncertainty about the state of play and about the opponent's strategy can be treated as entropy, and each observation that narrows the space of plausible strategies reduces it. Once redundant strategic possibilities are eliminated, the remaining uncertainty about the opponent is much smaller, and play can be planned against a sharper picture of what the opponent is likely to do.
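A hedged sketch of that application (the belief distribution and the elimination step are assumed scenarios of my own, not from the original text): model the player's belief over the opponent's strategies as a probability distribution, and measure how much an observation that rules out strategies reduces its entropy.

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed scenario: a uniform belief over 8 possible opponent strategies
# carries 3 bits of uncertainty.
belief = [1 / 8] * 8
print(entropy(belief))  # 3.0

# Suppose an observation rules out 6 of the 8 strategies as redundant or
# dominated; renormalizing leaves a uniform belief over the remaining 2.
belief = [1 / 2, 1 / 2]
print(entropy(belief))  # 1.0

# The 2-bit drop is the information the observation provided about the opponent.
```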