ASCII
EBCDIC
Computer data.
Early systems were mostly number crunchers.
BCD - binary coded decimal.
More easily checked by humans.
Each decimal digit easily stored in a nibble (4 bits).
Calculations more complex and slower than pure binary.
But much easier to convert to displayable values.
Don't have to convert between user-friendly and machine-friendly values.
Also, debugging dumps easier to understand.
Many CPUs support BCD math (not always correctly).
Requires extra status flags to signal overflow/carry out of a nibble.
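A minimal software sketch of that nibble-carry correction (plain C, not any
particular CPU's instruction; the bcd_add name is just for illustration). It
shows why hardware keeps a half-carry/auxiliary-carry flag: a nibble that
passes 9 has to be fixed up by adding 6.

    #include <stdint.h>
    #include <stdio.h>

    /* Add two packed-BCD bytes (two digits each), applying the classic
       "add 6" correction whenever a nibble exceeds 9 or carries out.
       Returns the corrected sum; *carry reports overflow past 99. */
    uint8_t bcd_add(uint8_t a, uint8_t b, int *carry) {
        unsigned sum = a + b;
        if ((sum & 0x0F) > 0x09 || ((a & 0x0F) + (b & 0x0F)) > 0x0F)
            sum += 0x06;                  /* fix the low nibble */
        if ((sum & 0xF0) > 0x90 || sum > 0xFF)
            sum += 0x60;                  /* fix the high nibble */
        *carry = sum > 0xFF;
        return (uint8_t)sum;
    }

    int main(void) {
        int carry;
        uint8_t s = bcd_add(0x38, 0x49, &carry);  /* 38 + 49 = 87 */
        printf("sum=0x%02X carry=%d\n", s, carry);
        return 0;
    }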
BCD numbers can be packed: 1 digit per 4-bit nibble, 2 digits per byte.
But packed values require special instructions to use.
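A rough sketch of packing/unpacking in C (the bcd_pack/bcd_unpack names are
just for illustration): two digits share a byte by simple shifting and masking.

    #include <stdint.h>
    #include <stdio.h>

    /* Pack two decimal digits (0-9) into one byte: high digit in the
       upper nibble, low digit in the lower nibble. */
    uint8_t bcd_pack(uint8_t hi, uint8_t lo) {
        return (uint8_t)((hi << 4) | (lo & 0x0F));
    }

    /* Unpack a packed-BCD byte back into two decimal digits. */
    void bcd_unpack(uint8_t packed, uint8_t *hi, uint8_t *lo) {
        *hi = packed >> 4;
        *lo = packed & 0x0F;
    }

    int main(void) {
        uint8_t b = bcd_pack(4, 2);   /* byte 0x42 holds decimal 42 */
        uint8_t hi, lo;
        bcd_unpack(b, &hi, &lo);
        printf("packed=0x%02X digits=%u%u\n", b, hi, lo);
        return 0;
    }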
System improvements allowed information to be labeled, creating the need for
better character representation.
So 4-bit BCD nibble values superseded by 8-bit bytes.
Computer/software developers creating own character sets.
New design comes out, new character set.
Existing data needs to be translated.
Move to standardization in the mid-1960s.
IBM -> EBCDIC, true 8-bit encoding.
General computing -> ASCII, 7-bit.
EBCDIC - (IBM 1963/4) Extended Binary Coded Decimal Interchange Code
Used an 8-bit character.
Expanded to handle upper and lower case characters without shift.
Layout evolved from the use of punch cards as data storage for
the 1890 U.S. Census.
Wikipedia: punch cards
Collating (sorting) more complex because even within the separate
A-Z and a-z sequences, there are gaps.
So, J (209) does not immediately follow I (201).
In part, because of how punch cards were designed.
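A small C sketch of the extra mapping a collator needs. It relies on the
standard EBCDIC uppercase ranges A-I = xC1-xC9, J-R = xD1-xD9, S-Z = xE2-xE9;
the function name is just for illustration.

    #include <stdio.h>

    /* Map an uppercase EBCDIC letter to its 0-based alphabet position.
       A plain byte comparison cannot do this because of the gaps
       between the three letter ranges. Returns -1 for non-letters. */
    int ebcdic_upper_ordinal(unsigned char ch) {
        if (ch >= 0xC1 && ch <= 0xC9) return ch - 0xC1;        /* A-I */
        if (ch >= 0xD1 && ch <= 0xD9) return ch - 0xD1 + 9;    /* J-R */
        if (ch >= 0xE2 && ch <= 0xE9) return ch - 0xE2 + 18;   /* S-Z */
        return -1;
    }

    int main(void) {
        /* I (0xC9) and J (0xD1) are 8 code points apart, but adjacent
           in the alphabet: ordinals 8 and 9. */
        printf("I=%d J=%d\n", ebcdic_upper_ordinal(0xC9),
                              ebcdic_upper_ordinal(0xD1));
        return 0;
    }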
EBCDIC table
IBM chose to put the character representation of decimal digits at the
very top of the byte range, xF0-xF9 or 1111 0000b - 1111 1001b, making
conversion between packed and unpacked BCD values very simple.
You simply strip the high nibble off to pack, or add it back on to make
a printable character. Signed values required some additional logic.
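A minimal C sketch of that zone-nibble trick for a single digit (sign handling
omitted; the function names are just for illustration).

    #include <stdint.h>
    #include <stdio.h>

    /* EBCDIC digits '0'-'9' are 0xF0-0xF9: the low nibble already holds
       the BCD value, so conversion just drops or restores the 0xF zone. */
    uint8_t ebcdic_digit_to_bcd(uint8_t ch) {
        return ch & 0x0F;              /* strip the zone nibble */
    }

    uint8_t bcd_to_ebcdic_digit(uint8_t digit) {
        return 0xF0 | (digit & 0x0F);  /* restore the zone nibble */
    }

    int main(void) {
        uint8_t ebcdic_seven = 0xF7;   /* EBCDIC '7' */
        uint8_t d = ebcdic_digit_to_bcd(ebcdic_seven);
        printf("digit=%u back=0x%02X\n", d, bcd_to_ebcdic_digit(d));
        return 0;
    }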
Several variations released over the years. Not always compatible.
ASCII