
Letters

To process letters, a computer assigns each letter or character a number. Characters are now usually encoded using the possible numbers in an 8-bit byte, and there are several competing conventions for which number stands for which character. To illustrate the process, let us use the widely used ASCII convention, in which each character occupies one 8-bit byte.

With the ASCII conventions a computer will interpret the sequence of characters:

01000010,01000101,01000100

as the word BED. In addition, mathematical symbols can also be represented by strings of bits and manipulated logically by computers, for example to invert matrices algebraically or to construct mathematical proofs using software.
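The decoding described above can be sketched in a few lines of Python (a minimal illustration, not part of the original text): each 8-bit string is read as a base-2 integer, and that integer is looked up in the ASCII table.

```python
# Decode the three 8-bit ASCII codes from the text back into letters.
bits = ["01000010", "01000101", "01000100"]

# int(b, 2) reads the string as a base-2 number (e.g. "01000010" -> 66),
# and chr() maps the ASCII code to its character (66 -> "B").
word = "".join(chr(int(b, 2)) for b in bits)
print(word)  # -> BED
```

The same two steps run in reverse (character to code to bit string) turn text into the byte sequences a computer actually stores.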


norman@eco.utexas.edu
Thu Jun 8 16:37:44 CDT 1995