Understanding the Matrix World/Machine Language

Yagmur Sahin

Crazy Simple Computer Science Series/2 (Understanding the Matrix World/Binary to Numbers and Symbols)

Welcome to the Crazy Simple Computer Science Series!

This series introduces readers to the basics of computer science in a way that anyone can understand. It aims to make computer science and its working principles fun, and to explain the core logic of computer science, along with helpful information you can use in daily life, in plain language.

In our Crazy Simple Computer Science/1 - How Computers Work article, we explained how computers work.

If you are reading this series for the first time, I recommend taking a look at that first article. :) That way, what is explained here will be easier to follow.

If you have watched the movie The Matrix, you must have seen that the world of the Matrix is created by a computer program and runs on 0s and 1s. In fact, the computer world really does work this way: all numbers and symbols, including the text you are reading, have equivalents made of 0s and 1s.

https://img.particlenews.com/image.php?url=4CXDBp_0cFEVqGL00
Matrix Movie Scene

So how can computers turn 0s and 1s into numbers and symbols and let you read these words as you do today? The simple answer is programming languages. But the primary language that works behind every programming language is machine language.

Machine language is the lowest-level language and is read and processed directly by the hardware. It consists of the binary system, that is, logical combinations of 1s and 0s, so it is a bit difficult to understand at the human level. Machine language varies according to the processor architecture and makes the processor work with the commands it receives.

Four number systems are commonly used in computing: the binary, octal, decimal, and hexadecimal number systems.
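
If you would like to see these side by side, here is a tiny Python sketch (Python is just a convenient choice for illustration, and the value 42 is an arbitrary example) that prints the same number in all four systems using Python's built-in conversion functions:

    # Print the same example value in all four number systems
    value = 42

    print(bin(value))  # binary:      0b101010
    print(oct(value))  # octal:       0o52
    print(value)       # decimal:     42
    print(hex(value))  # hexadecimal: 0x2a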

Languages we usually hear about, such as Java, C#, C, and C++, are high-level languages and are quite far from the machine language we are talking about. The code and programs we write in high-level languages reach machine language through assembly, which acts as the translator, and set the processor to work; if you examine the visual again, you can trace this path.
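
As a rough illustration of this translation idea, here is a small Python sketch. Python bytecode is not machine language, and CPython's translation path is different from a C or C++ compiler's, but it shows the same principle: a human-readable line becomes a list of simple, lower-level instructions for a machine to execute.

    # Show the lower-level instructions behind a single high-level line
    import dis

    def add(a, b):
        return a + b

    dis.dis(add)  # prints the bytecode instructions for "return a + b"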

Binary to Numbers

Now, with binary, which means “two,” you only have two digits to work with: 0 and 1. So instead of using powers of 10, we use powers of two: 2 to the 0 equals 1, 2 to the 1 equals 2, 2 to the 2 equals 4, and so on to 8, 16, 32, 64, and beyond.

https://img.particlenews.com/image.php?url=0FJczJ_0cFEVqGL00
Binary to Decimals

https://img.particlenews.com/image.php?url=2FLP9A_0cFEVqGL00
Binary to Decimals
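
The figures above show the idea visually; here is the same “powers of two” logic as a short Python sketch, using the example binary number 1011 (any string of 0s and 1s would work):

    # Add up a power of two for every 1 bit, working right to left
    bits = "1011"  # example binary number

    total = 0
    for position, bit in enumerate(reversed(bits)):
        if bit == "1":
            total += 2 ** position  # 2^0 + 2^1 + 2^3 = 1 + 2 + 8

    print(total)         # 11
    print(int(bits, 2))  # 11 again, using Python's built-in conversion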

Binary to Symbols

But what if I want to represent the number 8? I’ll need a bit more room to depict it. So I might need to add the eights place, so that I can have a 1, a 0, a 0, and a 0, with that new position being 2 to the 3. But I’ll need extra hardware to do so: a new switch, a new light bulb, a new physical component inside my computer. Fortunately, today’s computers have millions of these tiny transistors and switches, so counting up to 8 isn’t difficult at all. But does this imply that we need more physical storage to represent larger and larger values?
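
You can check this with a quick Python sketch (the bit_length method simply reports how many bits a number needs):

    # The number 8 needs a fourth bit; 7 still fits in three
    print(bin(8))            # 0b1000 -> a 1 followed by three 0s
    print((8).bit_length())  # 4
    print(bin(7))            # 0b111
    print((7).bit_length())  # 3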

Indeed, math and computer science are constrained in this way. If you only have a limited amount of memory, hardware, or switches, you can only do so much computationally. So that’s all there is to it in terms of numbers: we can count as high as we like and express as many values as we want with only 0s and 1s, as long as we have enough bits. However, numbers can only take us so far when it comes to computers.
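
For a concrete feel of “enough bits,” this short Python sketch prints how many different values a few common bit widths can represent:

    # With n bits you can represent 2**n values, from 0 up to 2**n - 1
    for n in (1, 4, 8, 32):
        print(n, "bits ->", 2 ** n, "values, highest:", 2 ** n - 1)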

So we need something else. And at this stage, we turn to the ASCII and Unicode standards, which map patterns of bits to symbols.
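
As a tiny preview of the next article (assuming the standard ASCII table), here is how a single letter maps to a number and then to 0s and 1s in Python:

    print(ord("A"))       # 65, the ASCII code for "A"
    print(bin(ord("A")))  # 0b1000001, the same value in 0s and 1s
    print(chr(65))        # "A" again, from number back to symbol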

Wondering what ASCII and UNICODE are and how machine language converts to symbols and numbers? If so, stay tuned, and see you in our Crazy Simple Computer Science/3 ASCII and UNICODE article.

https://img.particlenews.com/image.php?url=1MBFAi_0cFEVqGL00
US ASCII Code Chart
