1. Carrier dependence: Information cannot exist on its own; it must be attached to a carrier, and the same information can be attached to different carriers. For example, traffic information can be displayed by signal lights or conveyed by a traffic police officer's gestures.
2. Value: Information is valuable. Just as air and water are indispensable, so is information; it is therefore often said that matter, energy and information are the three elements that make up the world, none of which can be dispensed with. The "sensory deprivation experiment" is strong evidence of this.
3. Timeliness: Information often reflects the state of things only at a specific moment and changes with the passage of time; examples include traffic information, stock quotes, weather forecasts and meeting notices.
4. Shareability: Another important way in which information differs from matter and energy is that the same information can be received by many users without being consumed or disappearing.
Extended data:
Although information is uncertain, there are ways to quantify it. From the concept of information, one can conclude that the amount of information has the following characteristics:
1. The greater the probability P(x) of a message x, the smaller the amount of information it carries; conversely, the smaller the probability of occurrence, the greater the amount of information. The amount of information (which we denote by I) therefore decreases as the probability of the message increases.
2. When the probability is 1, everyone already knows what will happen, so the amount of information is 0.
3. When a message consists of several independent smaller messages, the information it contains should equal the sum of the information contained in each smaller message.
Based on these characteristics, the relationship between the amount of information and the probability of the message can be expressed exactly with a logarithmic function: I = -log_a P(x).
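The three characteristics above can be checked numerically. The sketch below (an illustration, assuming base-2 logarithms so that information is measured in bits, with a hypothetical helper named `info`) verifies monotonicity, the zero-information case, and additivity for independent messages.

```python
import math

def info(p: float, base: float = 2) -> float:
    """Self-information -log_a(p) of a message with occurrence probability p."""
    return -math.log(p, base)

# Characteristic 1: a less probable message carries more information.
assert info(0.1) > info(0.9)

# Characteristic 2: a certain event (p = 1) carries no information.
assert info(1.0) == 0.0

# Characteristic 3: for independent messages, information adds:
# I(p * q) = I(p) + I(q).
p, q = 0.5, 0.25
assert math.isclose(info(p * q), info(p) + info(q))

# An equiprobable binary symbol (p = 1/2) carries exactly 1 bit.
print(info(0.5))  # → 1.0
```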
People's weight is measured in kilograms and their height in meters, so what unit should be used to measure the amount of information? The amount of information is usually measured in bits (taking a = 2 in the formula above), which is convenient because an equiprobable binary symbol carries exactly 1 bit of information.
According to these research results, the scientific concept of information can be summarized as follows: information is a reflection of the state of motion and change of things in the objective world, a representation of the interrelations and interactions between objective things, and a representation of the essence of the motion and change of objective things.
Bits are units of information, but in engineering they are also used as units of a signal; "100 bit" in that usage refers to the signal. Furthermore, the basic problem of communication is to reproduce at one point a message selected at another point, which describes the point-to-point case; the same holds for point-to-multipoint, because in an actual communication system a message consists of transmitted symbols.
How much information these symbols carry depends on their probabilities of occurrence, and for any receiver these probabilities are fixed: it is not one probability for this receiver and another probability for that receiver.
For example, consider the string of symbols 221234, which is made up of the four symbols 1, 2, 3 and 4. Assuming each of the four symbols occurs with probability 1/4, the symbol 2 appears three times in this string, so the information carried by these occurrences of 2 is -3 × log2(1/4) = 6 bits.
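The worked example can be reproduced in a few lines. This is a minimal sketch under the same assumptions as above: four equiprobable symbols, base-2 logarithm, and the string "221234".

```python
import math

# Each of the four symbols 1, 2, 3, 4 occurs with probability 1/4,
# so a single occurrence carries -log2(1/4) = 2 bits.
p = 1 / 4
per_symbol = -math.log2(p)

# The symbol 2 appears three times in the string "221234".
count = "221234".count("2")
total = count * per_symbol

print(per_symbol, count, total)  # → 2.0 3 6.0
```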