Thread: 0 vs. 00
Old Fri Dec 01, 2017, 10:50am
jTheUmp
TODO: creative title here
 
Join Date: Sep 2009
Location: Minneapolis, MN
Posts: 1,250
Quote:
Originally Posted by bucky View Post
Yea, I understand.

Now I must throw in my 2 cents regarding technology or lack thereof. Based on what you said, if computer programs can distinguish between 1 and 11 as being different numbers, that must mean the computer program can recognize a null value in the tens place. Therefore, why couldn't the same program do the same for the number 0? I hate computers. All they do is reflect the imperfections of humans

Depends on whether you're storing the values as a String (a sequence of characters) or as an Integer (a number).

Computers store everything in binary, as a combination of 1s and 0s. Assuming we're using 8-bit notation (that is, 8 slots per number) if we're storing just Integers, there's no difference between 0 and 00, because they're both stored as 00000000. 1 is stored as 00000001, 2 is stored as 00000010, 3 is stored as 00000011, 4 is stored as 00000100, etc... 11 is stored as 00001011.
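To make that concrete, here's a quick Python sketch (Python is just my choice for illustration; the same idea applies in any language) printing the 8-bit patterns for those integers:

```python
# As integers, 0 and 00 are literally the same value,
# so they get the same all-zero bit pattern.
for n in (0, 1, 2, 3, 4, 11):
    print(f"{n:>2} -> {n:08b}")  # :08b = zero-padded 8-bit binary
```

Running it shows 0 as 00000000 and 11 as 00001011, matching the list above.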

String values are handled differently... each "letter" has its own 8-bit value. Assuming we're using ASCII notation, the 8-bit value for 0 is 00110000, which appears twice for 00, so 0011000000110000. 11 would be 0011000100110001.
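Same idea as a Python sketch: encode each string to ASCII bytes and print the bit pattern of each character, showing why "0" and "00" are different as strings:

```python
# Each character in a string gets its own 8-bit ASCII code,
# so the strings "0" and "00" produce different bit sequences.
for s in ("0", "00", "11"):
    bits = " ".join(f"{b:08b}" for b in s.encode("ascii"))
    print(f"{s!r} -> {bits}")
```

The digit character "0" encodes to 00110000, so "00" is that pattern twice.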

Of course, there are many different character encoding schemes (UTF-8, UTF-16, UTF-32, ISO 8859, Windows-1252, and many many more); they all work similarly to ASCII, but the actual binary value of a particular character may differ between them.



You may commence shouting "Shut Up, Nerd!" at me at your earliest convenience.