Computers: How they communicate, internally
I was thinking about the time when I understood how computers work...
>And now you don't, right?
Right. So I decided to brush up ... just a mite. In particular, how information moves about inside the computer.
Information in a computer travels along wires that carry one of two DC voltages, say 5 volts and 0 volts.
In computerese, they're called 1s and 0s.
If you want to identify a particular memory location, say the location with address 1234, you'd identify it in binary: a bunch of 1s and 0s.
>And that's ... what?
I think it's 10011010010.
>Are you sure?
Well, let's see. A binary number like abcd can be changed to decimal like so: a*2^3 + b*2^2 + c*2^1 + d*2^0 = 8a + 4b + 2c + d.
So, for example, 1011 would be 1*8 + 0*4 + 1*2 + 1 = 11, in decimal.
That's just like the meaning of, say, 456, in decimal. It's 4*10^2 + 5*10^1 + 6*10^0 = 400+50+6.
See how the powers of 10 appear, and the digits are from 0 to 9? For binary, we have the powers of 2 and the digits are from 0 to 1. For numbers to base 7, we'd have powers ...
>Can you get back to 10011010010?
Yes. Do you know the powers of 2? They're 2^10 = 1024 and 2^9 = 512 and 2^8 = 256, etc. etc.
We change 10011010010 to decimal like so: 1*1024 + 0*512 + 0*256 + 1*128 + 1*64 + 0*32 + 1*16 + 0*8 + 0*4 + 1*2 + 0*1
= 1024+128+64+16+2 = 1234.
Okay, so a particular location in computer memory would be addressed with a gaggle of 1s and 0s.
These would live on a bunch of wires called the Address Bus.
If there were 32 wires on the bus, the size of the addressable memory would be 2^32 = 4,294,967,296 memory locations.
In computerese, that'd be 4 gigabytes, so ...
Oh, I forgot to mention that kilo, in computerese, is really 1024 'cause that's 2^10.
And mega is 1024*1024 = 1,048,576 and giga is 1024*1024*1024 = 1,073,741,824 and 4 giga would be 4*1,073,741,824 = 4,294,967,296.
Of course, if you wanted to address larger memory sizes, you could have a 64-bit Address Bus.
Then you'd be able to address 2^64 locations: 2^32 (that's over 4 billion) times larger than 2^32. In bytes, that's 16 exabytes.
Oh, I forgot to mention that data comes in bytes. That's 8 binary bits, like: 10110111. See?
>Okay, suppose you want to recognize when a particular address is on the memory bus. How'd you do that?
Excellent question. We need some so-called gates:
First we got a NOT gate. It just changes 1s to 0s and 0s to 1s.
Next we got an AND gate. The output is a 1 only if all inputs are 1s.
Next we got an OR gate. The output is a 1 if any input is a 1.
As you might imagine, there are also analogous multiple input AND and OR gates.
If the inputs for the AND are labelled a, b, c etc., then the output is a 1 if a = 1 and b = 1 and c = 1 etc.
>That's why it's called an AND gate, right?
You're gettin' schmarter!
The output of a multiple input OR is a 1 if a = 1 or b = 1 or c = 1 etc.
>That's why it's called ...
If we wanted to recognize when 10011010010 is on the address bus, we might connect a bunch of gates to the bus like Fig. 1
Note that the output of the AND is a 1 only when all inputs are 1s.
The output of the OR is a 0 only when all inputs are 0s
... so the NOT output will then be a 1
... and the final AND gate will then be 1 only for the address 10011010010.
In that way we can recognize when that particular address is on the bus.
I might point out that there are also NOR gates that combine OR and NOT:
One could replace two gates with one, like so:
The output is a 1 only when all inputs are 0s.
There are also NAND gates: The output is a 0 only when all inputs are 1s.
You can buy these guys at your local Radio Shack: You just have to supply 5 volts power.
>So what's inside the gate?
A NOR might look like this:
>And what would you do with that information? I mean ... the address information.
We might like to extract the information from that particular memory location.
Data that travels about a computer does so on a Data Bus ... which might be 32 bits or 64 bits.
If I wanted the number on the Data Bus when the memory address was 10011010010, I'd wait for the (final) 1 from Fig. 1.
Then I'd collect the number on the Data Bus.
>Huh? You would collect the data?
Well, not me ... the Central Processing Unit (CPU). It's got a jillion pins. Some are connected to the Address Bus and some to the Data Bus.
It's got pins to tell it when it's okay to read from the Data Bus and when it's okay to write to that bus. Pins to tell it when to do these things ... synchronized to a clock pulse that comes in on a pin.
Voltage and ground pins. Pins for 5 volts, pins for 3.3 volts. Pins that interrupt the CPU and pins that ...
>How many pins?
Here's a couple of examples.
The ancient 6502, which was in the original Apple computers, the Commodore PET and the Atari. It's pretty basic, eh?
On the latest-and-greatest, I found some Address and Data pins, but my eyesight ain't what it used to be so I undoubtedly missed a bunch ... and most pins, I have no idea ...
old fashioned (1970s) CPU
modern Pentium CPU
>Mamma mia! How things change, eh?
My sentiments exactly. Once upon a time I could actually write code for the (ancient) 6502.
It had an 8-bit accumulator register, two 8-bit index registers (X and Y),
an 8-bit processor status register, an 8-bit stack pointer, and a 16-bit program counter.
>All inside the chip?
>And a modern CPU, like a Pentium? What's inside?
Since you don't even know what's under the hood of a car, I doubt whether ...
I do know what's in a Pentium: There's a swimming pool, a football stadium and several graduate students playing chess.
Display "Hello, world!"
When you write a program (in some language understood by humans, like Basic or C or Fortran etc.), the commands must be translated into a language the CPU can understand.
This translation is called "compiling": changing a command to "machine language".
For example, such a command might be:
A = B + C
After translation to "machine language", a bunch of 1s and 0s will be stored in memory. **
The CPU will address the successive memory locations where the commands are stored, read the command bytes that arrive on the Data Bus, and execute the sequence of commands.
A = B + C might be executed as:
Load the X register with the byte called B (which will refer to a memory location holding the byte).
Load the Y register with the byte called C.
Add the two numbers in the X and Y registers, using the Accumulator (which has the capability to "do the math")
Stick the resulting sum (held in the Accumulator register) into a memory location identified by the label A.
Yeah, it "accumulates" the intermediate results when doing arithmetic operations.
In the above ritual, most CPUs will be able to load the Accumulator directly from some memory location (holding the number B), then add the number held in the memory location identified by the label C,
then store the resultant number (now in the Accumulator) into a memory location identified by the label A.
For example, these instructions (which translate to: stick memory location B into the Accumulator, Add memory location C, then store the result in memory location A)
... they may look like: LDA B, ADD C, STA A ... which would be the semi-machine language called "assembly".
Computers with Accumulators have been around since the 1960s. Programming computers has been around since ... uh ... I forget.
>Yeah, I expected that.
Well, I did know once-upon-a-time: Click!
** After compiling, nobody can understand the commands (except the CPU).
You need to get your hands on the original "source code", before compiling, in order to modify the set of instructions.
That makes it harder for others to tamper with your software: just hide the source code.
However, some code is "interpreted", not compiled.
The computer reads something like A = B + C and changes it to machine language on the fly.
With interpreted languages (like the original BASIC), you can easily modify the code. Just type A = B - C and run the program again.
Once upon a time (in the last century, when I was much younger and much schmarter) I wrote an interpreter for a language that I devised called display.
You write up a bunch of "display" commands with an ordinary text editor then save the text file as (for example): stuff
Then type: display stuff and the display software would "interpret" the lines of code in the stuff file.
I used this to generate a bunch of tutorials (C Tutor) on how to program in the C language.
Alas, it worked under the old DOS and is such a terrible ritual to run these days ... but it's described here.