If you’ve been anywhere near a computer lately, you’ve probably heard that computers think and communicate in binary (base 2) rather than the decimal (base 10) that we tend to use, and you have probably wondered why this is necessary.
We humans use base 10 because we have 10 fingers, which makes base 10 counting
super simple and natural. When we count, we have exactly 10 symbols to work with (0, 1, 2, 3,
4, 5, … 9). One consequence of our use of base 10 is that it’s super simple to write and work
with powers of 10. For example, 10² = 100, 10³ = 1,000, 10⁹ = 1,000,000,000. Notice that the
exponent is the same as however many zeroes are in the resulting number.
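As a quick illustration of that pattern, here is a minimal Python sketch (hypothetical, not from the original text) that has the computer count the zeroes itself:

    # For powers of 10, the exponent equals the number of zeroes
    # in the written-out number.
    for exponent in (2, 3, 9):
        value = 10 ** exponent
        print(exponent, value, str(value).count("0"))
    # Prints: 2 100 2, then 3 1000 3, then 9 1000000000 9
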
Computers work a little differently because, well, they don’t have fingers. Instead they
have transistors, which are super helpful devices that act like a switch. They can be switched
on so they conduct electricity like a wire, or they can be switched off. Computers perform all
their calculations by switching these on and off and later checking if they conduct electricity or
not. This means that rather than having 10 symbols to work with like we do, computers only
have two. We decided to name these symbols very creatively, calling them 0 (off) and 1 (on).
Each transistor stores a bit, which represents either a 0 or a 1. You may think that only using
two symbols might limit the computer somewhat, but the computer compensates by having a
ton of transistors at its disposal: an iPhone XS contains nearly 7 billion transistors.
Just like in decimal and its powers of 10, it’s super simple to work with powers of 2 in
binary. For example, 8 = 2³ = 1000₂ and 64 = 2⁶ = 1000000₂. (The 2 subscript just means I’m
writing a number in base 2 instead of base 10.) Just like we like using powers of 10 in day-to-
day life (100 dollars, 1000 batteries, etc.), computer scientists like to use powers of 2 such as 8
and 64 because it makes lots of things easy. For example, we grouped 8 (2³) bits together and
called it a byte, the same way you would group 12 donuts together and call it a dozen. To work
with larger amounts of storage, we came up with the kilobyte (KB), which is 1024 (2¹⁰) bytes.
A megabyte (MB) is 1024 kilobytes, and a gigabyte (GB) is 1024 megabytes. To put these in
perspective, an 8-page essay takes up about 15 KB, a photo takes up about 4 MB, and each
hour of high definition video on Netflix takes up about 3 GB.
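To make these 1024-based units concrete, here is a small Python sketch (the sizes are just the rough figures quoted above, used as assumed examples):

    # Powers-of-two storage units as defined above.
    KB = 1024            # 2**10 bytes
    MB = 1024 * KB       # 2**20 bytes
    GB = 1024 * MB       # 2**30 bytes

    essay = 15 * KB      # an 8-page essay, roughly
    photo = 4 * MB       # a single photo, roughly
    video_hour = 3 * GB  # an hour of HD video, roughly

    print(photo // KB)       # 4096 -- the photo is about 4096 KB
    print(video_hour // MB)  # 3072 -- the hour of video is about 3072 MB
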
However, this is not the only valid definition. The metric system, which is
heavily based on powers of 10, defines the prefix kilo- to mean 1,000, mega- to mean 1,000
times kilo-, and giga- to mean 1,000 times mega-. This decimal definition is mostly used by hard
drive manufacturers, who jumped on the opportunity to cut costs by shipping fewer actual bytes
for the same advertised capacity. This is largely why a thumb drive advertised as 32 GB shows up
as only about 29.8 GB in your computer.
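The gap comes entirely from the two meanings of “giga”; here is a quick Python sketch of the arithmetic, using the 32 GB figure from the example above:

    # 32 decimal (metric) gigabytes, as printed on the package.
    advertised_bytes = 32 * 1000**3

    # The same byte count measured in 1024-based gigabytes,
    # which is what the operating system reports.
    reported = advertised_bytes / 1024**3
    print(round(reported, 1))  # 29.8
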
Although computers find it easy to work in binary, programmers find it too clunky. For
example, try finding the difference between the following two numbers: 1101010100000111₂ and
1101011100000111₂. Rather than having to deal with super long sequences of bits, computer
scientists decided to use hexadecimal (base 16) instead. Instead of just counting up to 9 and
rolling over, in hexadecimal we count up to 9, then A, B, C, D, E, F, and only then roll over to 10, 11, …
1C, 1D, 1E, 1F, 20, etc. Hexadecimal is more convenient than binary because it’s more concise, and it’s
nicer than decimal because it converts very easily to binary through some neat math: since
16 = 2⁴, every set of four bits in binary converts exactly to one hex digit. Since each hex digit
represents four bits, each byte corresponds to two hex digits. The previous numbers would
convert to D507₁₆ and D707₁₆ respectively.
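Python can do this grouping for us; here is a small sketch using the two numbers from the example above (bin and hex are built-in functions):

    # The two binary numbers from the example, written as Python integers.
    a = 0b1101010100000111
    b = 0b1101011100000111

    print(hex(a), hex(b))   # 0xd507 0xd707
    print(bin(a ^ b))       # 0b1000000000 -- they differ in exactly one bit
    print(len(hex(a)) - 2)  # 4 -- 16 bits collapse into 4 hex digits
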
Q: How do computers switch transistors on and off?
A: This is a very good question; unfortunately there’s a huge amount of physics that goes into
this. I recommend looking it up on the internet.

Q: How does a computer know which transistors to turn on and off?


A: The computer reads binary code, which is a sequence of instructions in memory that
essentially tell the chip what operations to perform.
