- #1
aychamo
How do computers do math? I mean, it seems like doing math is the most basic function of a computer.
But I mean, how does it KNOW that 1+1=2?
To give some background: I understand binary (1+1=10), and I have a rudimentary understanding of assembly since I tried to program in it many years ago, so I can follow something like this:
mov ax, 1
add ax, 1
But how does that actually add 1 to ax so that it ends up holding 2 (or 10 in binary)?
I can almost see how instructions like shl and shr (are those square and square root?) could work, since they just shift the bits, but I can't see how adding would work.
I'm assuming the microchip (the x86?) has to be preprogrammed somehow to know the order of numbers, but I have no clue. Can someone shed some light on this?
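To show the level I'm asking about, here is a rough C sketch of the only way I can picture addition being built out of the bit operations I do sort of follow (XOR, AND, and shifts). This is just my guess at the idea, not a claim about what the chip actually does:

#include <stdio.h>

/* Toy ripple-carry style addition using only XOR, AND, and a left shift. */
unsigned add(unsigned a, unsigned b)
{
    while (b != 0) {
        unsigned sum   = a ^ b;         /* add each pair of bits, ignoring carries */
        unsigned carry = (a & b) << 1;  /* a carry appears wherever both bits were 1 */
        a = sum;
        b = carry;
    }
    return a;
}

int main(void)
{
    printf("%u\n", add(1, 1));  /* prints 2, i.e. binary 10 */
    return 0;
}

Is the real adder circuit inside the chip doing essentially that with logic gates, or is it something completely different?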