36-bit computing

In computer architecture, 36-bit integers, memory addresses, or other data units are those that are 36 bits (six six-bit characters) wide. Also, 36-bit central processing unit (CPU) and arithmetic logic unit (ALU) architectures are those that are based on registers, address buses, or data buses of that size. 36-bit computers were popular in the early mainframe computer era from the 1950s through the early 1970s.
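To make the "six six-bit characters" relationship concrete, here is a minimal Python sketch that models a 36-bit word as an ordinary integer and packs six 6-bit character codes into it. The codes and the packing order (first character in the high-order bits) are illustrative assumptions for the example, not a description of any particular historical machine or character set.

```python
# Minimal sketch: a 36-bit word modelled as a Python integer,
# holding six packed 6-bit character codes.

WORD_BITS = 36
CHAR_BITS = 6
CHARS_PER_WORD = WORD_BITS // CHAR_BITS  # 36 / 6 = 6 characters per word

def pack_chars(codes):
    """Pack six 6-bit codes (0..63) into one 36-bit word, first code in the high bits."""
    assert len(codes) == CHARS_PER_WORD and all(0 <= c < 2**CHAR_BITS for c in codes)
    word = 0
    for code in codes:
        word = (word << CHAR_BITS) | code
    return word

def unpack_chars(word):
    """Split a 36-bit word back into its six 6-bit codes."""
    assert 0 <= word < 2**WORD_BITS
    return [(word >> (CHAR_BITS * i)) & (2**CHAR_BITS - 1)
            for i in reversed(range(CHARS_PER_WORD))]

codes = [1, 2, 3, 61, 62, 63]           # six arbitrary 6-bit values
word = pack_chars(codes)
assert word < 2**WORD_BITS              # the packed value fits in 36 bits
assert unpack_chars(word) == codes
print(f"max unsigned 36-bit value: {2**WORD_BITS - 1}")  # 68,719,476,735
```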

Image: a Friden mechanical calculator. The 36-bit word length of early electronic computers was chosen, in part, to match its precision.
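The precision in question is the roughly ten decimal digits typical of such desk calculators (stated here as an assumption for the arithmetic below): a sign bit plus 34 magnitude bits, 35 bits in all, is the minimum needed to hold any signed ten-digit decimal integer, and 36 is the next size that also divides evenly into six-bit characters. A small Python check of that arithmetic:

```python
# Minimal check: bits needed for a signed ten-decimal-digit integer,
# assuming ten digits of precision as the target.

digits = 10                                  # assumed calculator precision
max_magnitude = 10**digits - 1               # 9,999,999,999
magnitude_bits = max_magnitude.bit_length()  # 34 bits for the magnitude
signed_bits = magnitude_bits + 1             # plus a sign bit -> 35 bits minimum

print(f"magnitude bits: {magnitude_bits}")   # 34
print(f"signed minimum: {signed_bits}")      # 35
print(f"36 divisible by 6: {36 % 6 == 0}")   # True: also fits six 6-bit characters
```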

Starting in the 1960s, and especially during the 1970s, the introduction of 7-bit ASCII and 8-bit EBCDIC drove a move to machines using 8-bit bytes, with word sizes that were multiples of 8 bits, notably the 32-bit IBM System/360 mainframes and the Digital Equipment Corporation VAX and Data General MV series superminicomputers. By the mid-1970s the conversion was largely complete, and microprocessors quickly moved from 8 bits to 16 bits to 32 bits over the course of a decade. The number of 36-bit machines fell rapidly during this period; those that remained were offered largely for backward compatibility, to run legacy programs.
