Cloud computing, big data, software as a service (SaaS): there are so many concepts. How do we distinguish and understand them?
To understand what computing is, let us go back to around 3000 B.C., when number symbols first began to appear. Figure 2-4 shows numerical symbols that emerged one after another in ancient times. From the time of the earliest food gatherers, survival created a need to count; counting required numbers, and with numbers came addition and subtraction. At that time, the main computing power was the human brain.
Figure 2-4 Ancient numerical symbols
From counting beans, small sticks, and knots tied in string to the invention of numerical symbols and of addition and subtraction, ancient people were driven by survival. To avoid inventing a new symbol for every number, and to simplify calculation when numbers grew large, positional numeral systems (PNS) were invented: the familiar concepts of decimal, hexadecimal, and so on. Figure 2-5 shows numbers written in Roman numerals. Figure 2-6 shows the abacus, which appeared in the B.C. era and is a typical embodiment of the positional principle; it later reached China and evolved into the Chinese abacus. Figure 2-7 shows the decimal positional representation of 1125.
Figure 2-5 Roman numerals representing numbers
Figure 2-6 Ancient calculator: the abacus
Figure 2-7 Positional numeral system (PNS)
Figure 2-8 illustrates the positional numeral systems commonly used today. Decimal is a system suited to human characteristics: it originated from our ten fingers, which makes it convenient in practice, carrying one position forward whenever a digit reaches ten.
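The positional principle described above can be sketched in a few lines of code. This is a minimal illustration, not from the original text: the function name `positional_expansion` is my own, and it simply breaks an integer such as 1125 (the example from Figure 2-7) into digit-times-power-of-base terms.

```python
def positional_expansion(n: int, base: int = 10) -> list[tuple[int, int]]:
    """Return (digit, power) pairs from most to least significant.

    In a positional system the same digit symbol means different
    amounts depending on where it stands: each position is worth
    one power of the base more than the one to its right.
    """
    digits = []
    power = 0
    while n > 0:
        digits.append((n % base, power))  # rightmost remaining digit
        n //= base                        # shift right by one position
        power += 1
    return list(reversed(digits))

# 1125 = 1*10^3 + 1*10^2 + 2*10^1 + 5*10^0
terms = positional_expansion(1125)
print(terms)  # [(1, 3), (1, 2), (2, 1), (5, 0)]
assert sum(d * 10**p for d, p in terms) == 1125
```

Changing the `base` argument gives the same decomposition in any other positional system, which is why only a handful of digit symbols are ever needed.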
How did people calculate before the seventh century A.D.? Using a positional system, each digit position was computed separately and the result was then expressed in numerical symbols. For example, when calculating on an abacus, if a particular position held 0, that column was simply left empty; nothing was there.
The late emergence of the number zero is striking: for more than 3,000 years, from ancient Babylon through ancient Greece to the Roman numerals, zero was never discovered as a concept, let alone recognized as a number. Many mathematicians of the time reasoned that "nothing" could not possibly be a number. Even the Arabic numerals of that era consisted only of 1 through 9. Because there was no 0, performing calculations was itself a profession. As late as the 14th century A.D., the Romans still believed Roman numerals were so perfect that there was no need to introduce 0.
The number 0 was invented around 628 A.D. by the Indian mathematician Brahmagupta, who treated 0 as a number and discussed how it behaves in addition and subtraction, and even in division. The concept soon spread to Cambodia, China, and the Islamic world. Before that, among the other number symbols, no one regarded zero as a number; it was merely a placeholder, like an empty column on an abacus.
After the advent of the positional numeral system, the recognition of 0 as a number was one of the greatest inventions in human history; it made fast computation possible. Figure 2-9 shows the set of numerical symbols that includes 0, with 0 recognized as a number.
The computer uses the binary system, with only two digits, 0 and 1. It is precisely the positional numeral system together with the number 0 that makes machine calculation possible. When a computer is powered on, all registers must be initialized, again with the number 0, so that there is an initial value. No matter how advanced the technology becomes, the nature of computing does not change.
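As a small sketch of this last point (my own illustration, with a hypothetical function name `to_binary`), the code below writes a number in binary, where 0 serves both as a digit in its own right and as a placeholder for empty positions, and models the zero-initialization of a register:

```python
def to_binary(n: int) -> str:
    """Write a non-negative integer in base 2 using only the digits 0 and 1."""
    if n == 0:
        return "0"  # zero itself needs a symbol, not just an empty column
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # lowest-order bit first
        n //= 2
    return "".join(reversed(bits))

print(to_binary(1125))  # 10001100101

# After power-on, registers hold a defined initial value: all zeros.
# A toy 8-bit register, initialized the way hardware reset does it:
register = [0] * 8
```

Without the digit 0 marking the empty positions in `10001100101`, the remaining 1s would collapse together and the value would be unrecoverable; this is the placeholder role the abacus handled by leaving a column empty.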