Because my foundation in mathematics and English is weak, I cannot read programming languages, and programming software is hard for me to understand, so I can only offer views drawn from the books on programming I have read. Although writing a good composition requires only about 800 to 12,000 Chinese characters, and that inventory is built up in primary school, the full set of Chinese characters is far larger: the rare characters alone exceed 12,000, and that is before counting their combinations into idioms and compound words. In everyday communication, an accumulation of one or two thousand characters is enough, yet we still study Chinese from junior high through senior high school in order to grasp the deeper meanings of words. And even with Chinese classes all through junior and senior high, many of us still cannot understand classical Chinese; it is hard even to guess at the meaning, let alone be accurate.
Back to the topic: at bottom, a circuit board recognizes only binary 0s and 1s. When early American engineers developed hardware and software, they defined binary encodings for the 26 English letters in upper and lower case, the digits 0 through 9, and the punctuation marks. When it came to writing software, the native-language advantage made assigning code to functions smooth, which was a great advantage.
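A minimal sketch of the point above, using Python only as an illustration: ASCII (the early American encoding mentioned here) already assigns a fixed binary pattern to each English letter, digit, and punctuation mark.

```python
# ASCII gives every English letter, digit, and punctuation mark
# a fixed 7-bit (here shown as 8-bit) binary pattern.
for ch in "Aa9!":
    print(ch, format(ord(ch), "08b"))
# A 01000001
# a 01100001
# 9 00111001
# ! 00100001
```

With only 26 letters in two cases, 10 digits, and a few dozen punctuation marks, the whole table fits comfortably within 128 codes.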
Now to the main point. If Chinese characters were to be turned into binary 0s and 1s that the circuit board recognizes directly, wouldn't we need to assign at least 800 to 12,000 distinct binary patterns? Compared with the 26 upper- and lowercase letters, the punctuation marks, and the 10 digits, isn't that dozens of times the base count? If Hong Kong, Macao, and Taiwan also defined their own traditional characters into the compiler's library, would it become a hundred times the base amount? And to turn idioms and compound words into binary identifiers, are we looking at combinations thousands or tens of thousands of times larger? With all this basic content, would defining the function library (database) for developing software be an enormous task? I would estimate it would take hundreds or even tens of thousands of experts to realize it step by step. Would the database alone need hundreds of gigabytes of storage just to complete the definitions? Only then could we have more advanced development software such as Java, or even design a software's preliminary drawing functions. And then, building a desktop operating system like Microsoft's on top of databases that size, would it reach the terabyte scale? I honestly don't know, but anyone who wants to overturn the status quo must do this groundwork well. Moreover, programmers might first have to become top students with strong Chinese and mathematics before they could learn such a new kind of programming.
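The scale these questions gesture at can be made concrete with Unicode and UTF-8, the encodings modern systems actually use for Chinese (a real, existing solution, not something specific to this post): an ASCII letter fits in one byte, a common Chinese character needs three bytes in UTF-8, and the basic CJK Unified Ideographs block alone defines over twenty thousand code points.

```python
# Byte cost per character in UTF-8: ASCII vs. a common Chinese character.
print(len("A".encode("utf-8")))    # 1 byte
print(len("中".encode("utf-8")))   # 3 bytes

# Size of the basic CJK Unified Ideographs block (U+4E00..U+9FFF),
# which covers only the most common characters.
print(0x9FFF - 0x4E00 + 1)         # 20992 code points
```

So the base count really is hundreds of times that of the ASCII letters and digits, though in practice this was solved once at the encoding-standard level (Unicode, GB18030) rather than separately inside every piece of software.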
That is as far as my thinking goes, because all I know is that code must pass through various binary transformations before it can be correctly recognized as electrical signals. Like the 26 upper- and lowercase English letters, each Chinese character would need its own binary definition, as if it were a newly invented letter.
To sum up: I think we can still narrow the gap with the American software industry step by step, through hard work in mathematics and success in the college entrance examination, but the United States will keep its advantage in foundational software, since English is its mother tongue.