The Language Before the Hardware


The universal language in which computers carry out processor instructions originated in the 17th century in the form of the binary numeral system. Developed by the German philosopher and mathematician Gottfried Wilhelm Leibniz, the system came about as a way to represent decimal numbers using only two digits, zero and one. His system was partly inspired by philosophical explanations in the classical Chinese text the “I Ching,” which understood the universe in terms of dualities such as light and darkness, male and female.
While there was no practical use for his newly codified system at the time, Leibniz believed that a machine might someday make use of these long strings of binary digits.
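
As a quick illustration of the idea (a sketch added here, not part of the original account), a few lines of Python show how any decimal number reduces to a string made only of zeros and ones, exactly the property Leibniz codified:

```python
# Minimal sketch: repeated division by 2 yields the binary digits
# of a non-negative decimal integer, least significant digit first.
def to_binary(n: int) -> str:
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # remainder is the next binary digit
        n //= 2
    return "".join(reversed(digits))

print(to_binary(13))  # prints "1101", i.e. 8 + 4 + 0 + 1
```
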

In 1847, English mathematician George Boole introduced a newly devised algebraic language built on Leibniz's work. His “Boolean algebra” was in fact a system of logic, with mathematical equations used to represent logical statements. Just as important, it employed a binary approach in which the relationship between quantities would be either true or false, 1 or 0. And though there was no obvious application for Boole's algebra at the time, another mathematician, Charles Sanders Peirce, spent decades expanding the system and observed in 1886 that its calculations could be carried out with electrical switching circuits.
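
To make the connection concrete, here is a small Python sketch (an addition for illustration, under the assumption that 1 stands for true and 0 for false) of the Boolean operations Boole formalized and Peirce later mapped onto switching circuits: AND behaves like two switches in series, OR like two switches in parallel.

```python
# Boolean algebra with 0 and 1 standing for false and true.
def AND(a: int, b: int) -> int:
    return a & b   # series circuit: both switches must be closed

def OR(a: int, b: int) -> int:
    return a | b   # parallel circuit: either switch closes the path

def NOT(a: int) -> int:
    return 1 - a   # inverts the value

# Print the full truth table for the two-input operations.
for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}  AND={AND(a, b)}  OR={OR(a, b)}  NOT a={NOT(a)}")
```
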
