The 3 Things Computers Actually Do
And why "convert" is not one of them
Joseph Flanigan
"Computers do 3 things" is the prompt I gave the AI Deep Seek. I was not surprised that the AI did not have a direct answer. I know the answer from years of experience. Computers do 3 things: add, compile, move data. The compile steps are source, syntax, semantics, intermediate form, semantics, syntax, target.
After a dialog with the AI, I had it write this blog post using my knowledge as the framework. The AI was able to do research to provide more detail.
As a test, I asked the AI to explain the mouse. It took a couple of passes for it to connect that every component of a computer contains its own small computer doing the same 3 things.
I reviewed the post material.
DeepSeek
Ask most computer science graduates what a computer does, and you get the standard line: "fetch, decode, execute" or "arithmetic, move, branch."
Computers do exactly three things.
Thing 1: They Add
At the rock bottom of every operation, every game rendered, every database queried, every neural network trained, lives the ALU (Arithmetic Logic Unit).
Subtraction? Addition with negation.
Multiplication? Repeated addition.
Division? Repeated subtraction.
Address calculation? Addition.
Array indexing? A multiply (repeated addition) plus an add.
Even logical operations (AND, OR, XOR) sit right next to the adder on the silicon.
Every impressive thing a computer does is, at the moment of execution, mostly just adding binary numbers.
What "add" includes:
Integer arithmetic (add, subtract, multiply, divide)
Floating-point operations
Bitwise logic (AND, OR, XOR, NOT)
Shifts and rotations
Comparisons (which are just subtraction with a discarded result)
What "add" does NOT include:
Any transformation that requires remembering previous state
Any operation that restructures data
Any translation between representations
Add is pure. Add is stateless. Add is the hammer.
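The reductions above can be sketched in a few lines of toy Python. This is illustrative only: a real ALU does all of this in parallel adder circuitry, not loops, and the function names here are mine, not any hardware interface.

```python
# Toy sketch: reducing other arithmetic to addition, as the section claims.

def negate(x: int) -> int:
    """Two's-complement negation: invert the bits, add one."""
    return ~x + 1

def subtract(a: int, b: int) -> int:
    """Subtraction is addition of the negation."""
    return a + negate(b)

def multiply(a: int, b: int) -> int:
    """Multiplication as repeated addition (assumes b >= 0)."""
    total = 0
    for _ in range(b):
        total += a
    return total

def less_than(a: int, b: int) -> bool:
    """Comparison: subtract, inspect the sign, discard the difference."""
    return subtract(a, b) < 0

print(subtract(7, 3))   # 4
print(multiply(6, 4))   # 24
print(less_than(2, 5))  # True
```

Every function bottoms out in `+`, which is the whole point: the hammer underneath is always the adder.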
Thing 2: They Compile
This is the operation nobody talks about, and it is everywhere. By "compile," the meaning here is not "run GCC."
| Feature | Converter | Compiler |
| --- | --- | --- |
| Static vs dynamic | Fixed transformation | Can be reentrant |
| State | Stateless (or fixed mapping) | Maintains internal state across invocations |
| Context awareness | None; same input always yields same output | Behavior can differ depending on previous inputs or external state |
| Example | ADC: 2.3V > always the same 8-bit value | Mouse DSP: compares current frame to previous frame to compute delta |
A converter is a pure function: $f(x) = y$. A compiler, in this article's sense, also carries state: $f(x, s) = (y, s')$.
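The table's distinction can be sketched in toy Python. The ADC scaling and the frame model (a frame reduced to one tracked point) are simplified illustrations of my own, not real device code.

```python
# Converter vs. "compiler" in this article's sense.

def adc_convert(voltage: float) -> int:
    """Converter: a pure function over a 0-5V range.
    The same input always yields the same 8-bit code."""
    return max(0, min(255, round(voltage / 5.0 * 255)))

class MouseDSP:
    """'Compiler': output depends on state carried across invocations."""

    def __init__(self):
        self.prev_frame = None

    def process(self, frame):
        """Compare the current frame to the previous one; emit a delta."""
        if self.prev_frame is None:
            delta = (0, 0)
        else:
            delta = (frame[0] - self.prev_frame[0],
                     frame[1] - self.prev_frame[1])
        self.prev_frame = frame   # state retained for the next call
        return delta

print(adc_convert(2.3))          # always the same code for 2.3V
dsp = MouseDSP()
print(dsp.process((10, 10)))     # (0, 0) — no previous frame yet
print(dsp.process((12, 9)))      # (2, -1) — same kind of input, different output
```

Note that `dsp.process` is called twice with the same kind of input and returns different results: that statefulness is exactly what separates the right column of the table from the left.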
Where Compilation Happens (Everywhere):
Fetching an instruction: The CPU compiles a bit pattern (machine code) into micro-ops.
Rendering a font: The system compiles a Unicode codepoint > glyph index > outline > rasterized pixels.
Playing an MP3: The decoder compiles a compressed bitstream > frequency coefficients > audio samples.
Running JavaScript: The VM compiles source > AST > bytecode > machine code (sometimes multiple times).
Querying SQL: The database compiles SQL > parse tree > query plan > row operations.
Mouse movement: The microcontroller compiles pixel frames > delta values > HID report.
The compiler pattern is always the same: Source > Syntax > Semantics > Intermediate Form > Semantics > Syntax > Target.
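The symmetric pipeline above can be sketched for a deliberately tiny expression language. Every stage name and function here is hypothetical, chosen only to mirror the pattern: syntax in, semantics checked, an intermediate form lowered, syntax back out.

```python
# Minimal pipeline sketch: source > syntax > semantics > IR > syntax > target.

def parse(source: str):
    """Syntax: text -> tree (here, 'a op b' only)."""
    left, op, right = source.split()
    return ("binop", op, int(left), int(right))

def check(tree):
    """Semantics: validate that the tree means something."""
    assert tree[1] in ("+", "-"), "unknown operator"
    return tree

def lower(tree):
    """Intermediate form: tree -> stack-machine operations."""
    _, op, a, b = tree
    return [("push", a), ("push", b), ("op", op)]

def emit(ir) -> str:
    """Syntax again: IR -> target text."""
    return "; ".join(f"{name} {arg}" for name, arg in ir)

target = emit(lower(check(parse("2 + 3"))))
print(target)   # push 2; push 3; op +
```

The same shape scales from this toy up to GCC, a font rasterizer, or an MP3 decoder; only the stages get fatter.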
Thing 3: They Move Data Around
This one seems obvious until you realize how much of computing is just this. Load from RAM to register. Store from register to cache. Copy from disk to memory. Send from memory to network interface. Receive from network to buffer.
Modern computers are not compute-bound. They are data-movement-bound.
CPU waits for cache (movement stalled).
Program waits for disk (movement stalled).
Database waits for network (movement stalled).
The entire field of high-performance computing is, at its heart, the art of hiding how long it takes to move data.
What "move" includes:
Load (memory > register)
Store (register > memory)
Copy (register > register)
DMA transfers (device > memory)
Network sends and receives
Cache line fills and evictions
What "move" does NOT include:
Any transformation of the data (that is compile or add)
Any stateful processing (that is compile)
Move is pure relocation. Move is latency. Move is the bottleneck.
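"Pure relocation" can be sketched with a toy memory-and-register model. The model is mine, a simplification for illustration: no real CPU is a Python dict, but the invariant it shows is real — after a move, the bytes are unchanged, only their location differs.

```python
# Toy sketch: move is relocation, never transformation.

memory = bytearray(16)
registers = {"r0": 0, "r1": 0}

def load(reg: str, addr: int) -> None:
    """Move: memory > register."""
    registers[reg] = memory[addr]

def store(reg: str, addr: int) -> None:
    """Move: register > memory."""
    memory[addr] = registers[reg]

def copy(dst: str, src: str) -> None:
    """Move: register > register."""
    registers[dst] = registers[src]

memory[0] = 42
load("r0", 0)      # RAM -> register
copy("r1", "r0")   # register -> register
store("r1", 8)     # register -> RAM, new address
print(memory[8])   # 42 — same value, new location
```

Notice that no function in this sketch computes anything: the moment one of them changed a value in transit, it would stop being a move and become an add or a compile.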
Case Study: How a Mouse Works (The Full Pipeline)
Inside the mouse:
Move: LED illuminates surface > photodiodes detect light > analog voltages.
Converter: ADC converts voltage to pixel value (static, not a compiler).
Compile (reentrant): DSP stores previous frame and compares current frame to previous.
Add: DSP computes delta-X, delta-Y (difference calculation).
Compile: Microcontroller packs delta + button state into HID report.
Compile: USB controller serializes report (framing).
Move: USB cable transmits bits (electrical movement).
Inside the computer:
Move: USB host controller receives bits into buffer.
Compile: HID driver parses report (unpacking).
Add: Window system adds delta to cursor position $(x+=dx)$.
Compile: Graphics stack compiles cursor position to pixels (transform).
Move: Framebuffer moves pixels to display.
Every step is add, compile, or move. Nothing else.
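The core of the mouse pipeline above can be simulated end to end in a few lines. The 3-byte layout (buttons, dx, dy) is modeled on the classic USB HID boot-protocol mouse report, but this is a toy simulation, not driver code, and the function names are mine.

```python
import struct

def dsp_delta(prev, cur):
    """Compile (stateful compare) + add: frame positions -> (dx, dy)."""
    return cur[0] - prev[0], cur[1] - prev[1]

def pack_report(dx: int, dy: int, buttons: int = 0) -> bytes:
    """Compile: pack delta + button state into a 3-byte HID-style report."""
    return struct.pack("bbb", buttons, dx, dy)

def parse_report(report: bytes):
    """Compile: the driver side unpacks the same bytes."""
    return struct.unpack("bbb", report)

cursor = [100, 100]
dx, dy = dsp_delta((10, 10), (13, 8))      # inside the mouse
report = pack_report(dx, dy)               # move: bytes cross the 'cable'
_, dx, dy = parse_report(report)           # inside the computer
cursor[0] += dx                            # add: x += dx
cursor[1] += dy                            # add: y += dy
print(cursor)   # [103, 98]
```

Trace any line of this and it lands in one of the three buckets: `dsp_delta` and the `+=` lines are add, `pack_report`/`parse_report` are compile, and handing `report` across the boundary is move.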
The Hidden Triad
Everything a computer does fits into exactly one of these categories.
Add: Calculating, comparing, shifting, masking.
Compile: Parsing, decoding, translating, encoding, optimizing, any stateful transformation.
Move: Loading, storing, copying, sending, receiving.
Decompose any task; no fourth thing appears.