What does it mean to say something contains a large amount of information?

In everyday usage, saying that an expression carries a large amount of information means that it is rich in content, large in quantity, and wide in coverage.

In a more technical sense, the amount of information is the measure of information needed to select one event from n equally likely events; that is, the minimum number of "yes or no" questions required to identify a specific event among the n events, which comes to log2(n) bits.
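As a minimal sketch of this idea (the function name and the specific example values are my own, not from the original text), the following Python snippet counts the yes/no questions needed to single out one of n equally likely events:

```python
import math

def questions_needed(n):
    """Minimum number of yes/no questions needed to single out
    one event among n equally likely events: ceil(log2(n))."""
    return math.ceil(math.log2(n))

# Example: identifying one card out of 8 equally likely cards
# takes 3 yes/no questions, i.e. log2(8) = 3 bits of information.
print(questions_needed(8))   # 3
print(questions_needed(64))  # 6
```

For n that is not a power of two, the information content log2(n) is not an integer, but the question-counting picture still gives the right order of magnitude.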

In information theory, mutual information is defined as I(X; Y) = H(X) - H(X|Y). The second term on the right-hand side, H(X|Y), is called the conditional entropy; for discrete messages it can be written as a sum over the joint distribution, H(X|Y) = -sum over x,y of p(x, y) log2 p(x|y), and it represents the uncertainty that remains about X after Y is known. Therefore the mutual information I(X; Y) is the amount of information about the source X obtained when Y is received. Corresponding to mutual information, H(X) is often called self-information.
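To make the definition concrete, here is a small Python sketch that computes H(X), H(X|Y), and their difference I(X; Y) for a made-up joint distribution; the distribution values and function names are illustrative assumptions, not part of the original text:

```python
import math

# A small, made-up joint distribution p(x, y) for two binary variables,
# chosen only to illustrate the identity I(X; Y) = H(X) - H(X|Y).
p_xy = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.3,
}

def H_X(p_xy):
    """Entropy H(X) of the marginal distribution of X, in bits."""
    p_x = {}
    for (x, _), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
    return -sum(p * math.log2(p) for p in p_x.values() if p > 0)

def H_X_given_Y(p_xy):
    """Conditional entropy H(X|Y) = -sum p(x,y) * log2 p(x|y), in bits."""
    p_y = {}
    for (_, y), p in p_xy.items():
        p_y[y] = p_y.get(y, 0.0) + p
    return -sum(p * math.log2(p / p_y[y])
                for (x, y), p in p_xy.items() if p > 0)

# Mutual information: the information about X obtained by observing Y.
I_XY = H_X(p_xy) - H_X_given_Y(p_xy)
print(f"H(X)    = {H_X(p_xy):.4f} bits")
print(f"H(X|Y)  = {H_X_given_Y(p_xy):.4f} bits")
print(f"I(X; Y) = {I_XY:.4f} bits")
```

For this example the output shows H(X) = 1 bit, H(X|Y) of roughly 0.88 bits, and so observing Y removes about 0.12 bits of the uncertainty about X, which is exactly the mutual information.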