36-bit word length
36-bit word length describes the number of bits, 36, used in some early computers to represent data in the form of words—their basic units of addressing and calculation.
Many early computers aimed at the scientific market had a 36-bit word length. This word length was just long enough to represent positive and negative integers to an accuracy of ten decimal digits (35 bits would have been the minimum). It also allowed the storage of six alphanumeric characters encoded in a six-bit character encoding. Prior to the introduction of computers, the state of the art in precision scientific and engineering calculation was the ten-digit, electrically powered, mechanical calculator, such as those manufactured by Friden, Marchant and Monroe. These calculators had a column of keys for each digit, and operators were trained to use all their fingers when entering numbers, so while some specialized calculators had more columns, ten was a practical limit. Computers, as the new competitor, had to match that accuracy. Decimal computers sold in that era, such as the IBM 650 and the IBM 7070, had a word length of ten digits, as did ENIAC, one of the earliest computers.
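The word-length arithmetic above can be checked directly: the largest ten-digit integer needs 34 bits of magnitude, plus one sign bit, which is a brief sketch of why 35 bits is the minimum and 36 comfortably suffices.

```python
# Largest ten-decimal-digit integer.
max_ten_digits = 10**10 - 1            # 9,999,999,999

# Bits needed for the magnitude alone.
magnitude_bits = max_ten_digits.bit_length()   # 34

# One more bit for the sign gives the minimum word length.
min_word_bits = magnitude_bits + 1             # 35

print(magnitude_bits, min_word_bits)  # → 34 35
```

A 36-bit word also divides evenly into six 6-bit characters, which the article notes as the other motivation for the size.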
Computers with 36-bit words included the MIT Lincoln Laboratory TX-2, the IBM 701/704x/709x series, the UNIVAC 1103/1103A/1105 and 1100/2200 series, the General Electric 600 series (later the Honeywell 6000 series), and the Digital Equipment Corporation PDP-6 and PDP-10 (as used in the DECsystem-10/DECSYSTEM-20). Smaller machines, like the PDP-1, PDP-9, and PDP-15, used 18-bit words, so a double word was 36 bits. EDSAC had a similar scheme.
These computers used 18-bit word addressing, not byte addressing, giving an address space of 2^18 36-bit words, approximately 1 megabyte of storage. Many of them were originally limited to that much physical memory as well. The architectures which survived evolved over time to support larger virtual address spaces using memory segmentation or other mechanisms.
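The address-space figure can be worked out from the numbers given: 2^18 word addresses, 36 bits per word, converted to 8-bit bytes for comparison with modern units.

```python
words = 2**18          # 18-bit word addresses -> 262,144 addressable words
total_bits = words * 36
byte_equivalent = total_bits // 8   # 1,179,648 bytes, i.e. roughly 1 MB

print(words, byte_equivalent)  # → 262144 1179648
```

This is about 1.1 MiB, which matches the article's "approximately 1 megabyte".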
The common character packings included six 6-bit characters (ubiquitous in early usage), five 7-bit characters and 1 unused bit (the usual PDP-6/10 convention), four 8-bit characters (7-bit ASCII plus 1 unused bit, or 8-bit EBCDIC) and 4 unused bits, and four 9-bit characters (the Multics convention). Characters were extracted from words either using standard shift and mask operations or with special-purpose hardware supporting 6-bit, 9-bit, or variable-length characters. The GE-600 used special indirect words to access 6- and 9-bit characters; the PDP-6/10 had special instructions to access arbitrary-length byte fields. The C programming language requires that all memory be accessible as bytes, so C implementations on 36-bit machines use 9-bit bytes.
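The shift-and-mask extraction mentioned above can be sketched for the four-characters-of-9-bits packing (the Multics convention). This is a minimal illustration, not code for any real machine; the word value below is a made-up example built by packing four ASCII codes.

```python
def unpack_9bit(word):
    """Return the four 9-bit characters of a 36-bit word, leftmost first.

    Each character occupies 9 bits, so the shift amounts are 27, 18, 9, 0,
    and the mask 0o777 keeps the low 9 bits after each shift.
    """
    return [(word >> shift) & 0o777 for shift in (27, 18, 9, 0)]

# Pack four ASCII characters into a hypothetical 36-bit word, then unpack.
word = (ord('M') << 27) | (ord('u') << 18) | (ord('l') << 9) | ord('t')
print(''.join(chr(c) for c in unpack_9bit(word)))  # → Mult
```

The 6-bit and 7-bit packings work the same way with different shift amounts and masks; the PDP-6/10 byte instructions did this extraction in hardware for arbitrary field widths.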
By the time IBM introduced System/360, scientific calculations had shifted to floating point, and mechanical calculators were no longer a competitor. The 360 also included instructions for variable-length decimal arithmetic for commercial applications. The 360's practice of using word lengths that were a power of two quickly became universal.