

Anecdotes

Intel's 8086
Stanley Mazor
Editor: Dave Walden

Intel advertised its new CPU chip for PC and laptop computers in 1993 as the Intel Pentium processor. This chip was the fifth generation (586) evolved from the original 8086 chip, developed in 1976. The results of this 30-year lineage (e.g., 186, 286, 386, 486) have allowed Intel to dominate microcomputers ever since.1,2

In 1976, I had just returned from being Intel Europe's first applications engineer and was doing market development for the consumer and game marketplace. Having worked on three earlier microcomputer chips, I consulted with, but was not part of, the 8086 CPU design organization. I recall being in one architecture meeting where I suggested removing a single nonessential instruction. This was unusual, as most contributors add to an instruction set. In fact, instruction sets become unwieldy because of this design-by-committee effect. Mostly, I just observed from elsewhere in the company. Still, my recollections of the events surrounding the 8086 CPU chip development reveal that the project was not originally planned, but rather was a crash, catch-up project that strove to fill a gap in the Intel product line caused by problems with the 8800 and 8085 chip designs.

Intel 8800 CPU
Chip manufacturers offer new products as users' needs expand and semiconductor processing improves. Generally, there are two categories of new products: minor enhancements that extend the life of a product (mid-life kickers), and major product enhancements that offer a dramatic increase in capabilities. Marketing names (numbers) new products accordingly. For example, Intel's first 8-bit CPU chip was called the 8008.3-8 Two years later, the 8080 chip was introduced; shifting the last 8 in 8008 one position to the left reflected a major (10x) speed improvement over the earlier 8008.9,10 A year later, a slightly improved CPU was designated 8085. In 1976, the code name for Intel's intended major new 8-bit CPU chip (8800) would follow this shifted-8 tradition, with another expected large chip performance gain.11

It is the computer architect's job to choose the particular improvements for the new product. Justin Rattner was the 8800 CPU's chief architect, under project manager Bill Lattin, and he had no personal ties to the earlier Intel chips. On the contrary, he only witnessed their shortcomings. The 8800 was not designed for any specific application, as was the 4004 for the Busicom calculator and the 8008 for Datapoint's terminal, so Rattner had the luxury of taking a fresh look at microcomputer architecture.

Rattner told me that if an architect chooses some finite number of visible resources (e.g., eight or 16 registers), the programmer would always hit the limit and be unhappy. He said there should either be a seemingly infinite number of resources or none visible at all. He reckoned that although registers offer a speedup (compared to using main memory), they should be invisible to the programmer. Hence, the 8800 CPU was a radical departure from the existing microcomputers and most large-scale computers of the day, which provided programmer-visible registers, such as

MOV R1, R2 [move data to R1 from R2]

Cleverly, Rattner felt that microcomputer CPUs should be able to be ganged together to run multiple programs in parallel, and his team included multiprogramming within the CPU's microcode. Because customers could easily implement multiple processors to enhance performance and tackle heavier program loads, the 8800 architecture team wasn't worried about an individual CPU's performance. In hindsight, this was a mental trap. Normally, CPU architecture starts with a speed objective, and one would have expected the 8800 to be 5x to 10x faster than the 8080. However, with multiple CPUs planned, speed wasn't emphasized in the 8800 specification because the designers' attitude was "don't worry about it."

Another breakthrough on the 8800 CPU was its representations for multiple data types, including floating-point numbers, and its variable-length instructions. Every instruction could vary in size, in stark contrast to earlier machine instruction formats that rigorously obeyed memory word boundaries. Instructions were defined in terms of the number of bits occupied, and byte boundaries were not a constraint. Hardware shifters within the 8800 CPU would locate instructions within memory words.

The 8800 CPU architecture started with a blank slate, was unbounded, and had no compatibility constraints with earlier processors. Most designers would consider this an ideal environment for defining a new product. However, in my experience, constraints that define the scope and boundaries, and appear to be limiting, often lead to success.

Some of us on the periphery of the 8800 design worried that the architectural features were top heavy and ahead of the market. In particular, had I been working on the project after having worked on a high-level language computer at Fairchild,12,13 I might have been suspicious of firming software into microcode, but I was observing from a distance.

In 1981, five years and hundreds of millions of dollars later, Intel announced the 8800 as the iAPX 432.

Today (more than 25 years later), with more than one billion transistors on a chip, there might be a better match between the 8800 architecture and modern semiconductor processes,14 but let's get back to 1975.

Intel 8085 and competition
In 1975, a year after the 8080 CPU chip introduction, Intel engineers were working on two different projects: the 8800 and the 8085, a mid-life kicker for the 8080. The 8085 chip enhanced the 8080 by adding 12 new instructions, several integrated resources, and a simpler hardware interface.15 However, while Intel was struggling with its new chip designs, competition wasn't standing still.

After successfully launching Intel's 4004, 8008, and 8080, Federico Faggin founded a competing company, Zilog, and brought the 8080's chip designer, Masatoshi Shima, from Intel to join them. Their resultant Z-80 CPU chip was a great improvement over Intel's 8080. Zilog experienced considerable success, especially in the nascent personal and hobby computer marketplace. Many designers found the 8085 inferior to the Z-80, and Intel was losing design wins to the Motorola 6800 and the MOS Technology 6502. Intel also faced competition from TI's 16-bit CPU (TMS 9900). Intel argued that the microcomputer customer, unlike the minicomputer customer, didn't need 16 bits. (Actually, for signal processing 14 bits would be a nice choice, but that was not a dominant application in the late 1970s.) Clearly, Intel needed a breakthrough processor such as the 8800, but the project was languishing with a constant time to completion (always a year away), and the 8085 wouldn't be competitive.

By mid-1976, the manager of Intel's Microcomputer division, Bill Davidow, saw a poor prognosis for the 8085 even before it was completed, and promises of the 8800 that weren't materializing. Accordingly, he put together a crash program to develop an improved follow-on CPU chip to the 8080 and 8085. To save Intel's market share, this new chip would be compatible with the earlier generation; the designation 8086 emphasized this point.16 By taking advantage of the latest NMOS semiconductor process, it was believed that the 8086 would provide more features and higher performance.17

Table 1. Intel CPU chip metrics.

Chip   Year   Transistors   Density factor   Gate delay (ns)
8008   1972       2,000           1                30
8080   1974       4,500           0.9              15
8085   1976       6,500           0.7               5
8086   1978      29,000           0.3               3

Table 1 shows chip density and circuit speed for early Intel CPU chips. Moving from the 8008's PMOS process to the 8080's NMOS halved the on-chip circuit gate delay (30 to 15 ns) and more than doubled the transistor count (2,000 to 4,500). In just six years, transistor count went up 15x and circuit speed improved 10x. More transistors allow for more hardware that also improves instruction execution performance. Each generation of semiconductor process offered at least two dimensions of improvement: higher performance and higher density (Moore's law).18,19
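
The 15x and 10x figures quoted here follow directly from Table 1. As a quick check, here is a minimal C sketch that redoes that arithmetic with the table's values (the variable names are mine):

#include <stdio.h>

int main(void)
{
    /* Values taken from Table 1: the 8008 (1972) versus the 8086 (1978). */
    double transistors_8008 = 2000.0, transistors_8086 = 29000.0;
    double gate_delay_8008 = 30.0, gate_delay_8086 = 3.0;  /* nanoseconds */

    /* Transistor count grew roughly 15x; gate delay (circuit speed) improved 10x. */
    printf("transistor growth: %.1fx\n", transistors_8086 / transistors_8008);
    printf("speed improvement: %.1fx\n", gate_delay_8008 / gate_delay_8086);
    return 0;
}
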
The 8086 development
By 1976, Intel had a respectable software group that had developed an operating system, ISIS, for Intel's Intellec development system, and some experience in compiler development. Drawing from this broader spectrum, Davidow assembled a new design team: Steve Morse and Bruce Ravenel from software and architecture, and Bill Pohlman, James McKevitt, and John Bayliss as chip designers, with J.C. Cornet supervising. In May 1976, Steve Morse began specifying the instruction set for the 8086 CPU chip (see Figure 1). He had a speed objective, an aesthetic objective (a more complete instruction set for both 8- and 16-bit data), and more memory than the 8080's 64 Kbytes. Customers always wanted larger memory systems.

Figure 1. Intel 8086 die photo. (Courtesy of Intel)

CPU compatibility
Computer users are concerned about preserving their investment in software and engineering and are reluctant to change computers. On the other hand, they often need the speed improvements offered by a new CPU. High-level language programming avoids ties to a particular CPU's instruction codes, but microcomputers are often programmed using assembler language. Accordingly, a CPU chip designer is burdened with compatibility constraints from earlier and inferior products when designing a new CPU. There are several levels of CPU compatibility for both the designer and user to consider:

- Upward versus downward. Will old software run on a new CPU? Will software written for a new CPU run on an old CPU?
- Instruction machine code. Is the new CPU's object code identical to the old CPU's? Will an old object program run on the new CPU?
- Symbolic code. Is the symbolic assembler programming language the same? Can old symbolic programs be reassembled for the new CPU?
- Translation 1:1 or not 1:1. Can an old symbolic source program be converted to a new format? For each old instruction, are one or more instructions required?

Morse decided that the 8086 CPU's instructions would be multibyte in length, just as in the 8080 and 8085, but he soon gave up on machine code compatibility. With no such compatibility, customer ROM chips would need to be scrapped, but he reckoned this would be an acceptable trade-off. Maintaining object program size was not an objective either. These key decisions gave Morse some freedom in selecting the new instruction set. On the other hand, he would design the 8086 instruction set so that in most cases there would be a single instruction mapping 1 to 1 from the older 8080. In a few cases, it might be necessary to generate more than one 8086 instruction to perform what the 8080 did in a single instruction. He recognized that a software conversion tool could convert 8080 source programs to an equivalent 8086 source program. After assembling, the new program's object code would probably occupy more memory but would function the same as the old 8080 program.
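
Morse's 1-to-1 source mapping and the conversion tool can be pictured as a table-driven rewrite of 8080 assembler lines into 8086 assembler lines. The C sketch below is only an illustration of that idea: the three mnemonic pairs follow the conventional 8080-to-8086 register correspondence (A to AL, HL to BX), and none of it is taken from Intel's actual conversion tool.

#include <stdio.h>
#include <string.h>

/* Toy source-level converter: look up an 8080 assembler line and emit an
   equivalent 8086 line. A real converter also handled operands, flags, and
   the cases that need more than one 8086 instruction. */
struct mapping { const char *line_8080; const char *line_8086; };

static const struct mapping table[] = {
    { "MOV A,M",     "MOV AL,[BX]"  },  /* load A from memory addressed by HL */
    { "LXI H,1234H", "MOV BX,1234H" },  /* 16-bit immediate load into HL/BX   */
    { "INX H",       "INC BX"       },  /* increment the HL/BX pointer        */
};

static const char *translate(const char *line)
{
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
        if (strcmp(line, table[i].line_8080) == 0)
            return table[i].line_8086;
    return "; no 1-to-1 mapping: emit a short 8086 sequence instead";
}

int main(void)
{
    printf("%s\n", translate("MOV A,M"));      /* -> MOV AL,[BX]  */
    printf("%s\n", translate("LXI H,1234H"));  /* -> MOV BX,1234H */
    return 0;
}
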

Another constraint for the new 8086 CPU resulted from the 12 new instructions implemented in the yet-to-be-announced 8085. Although 8085 users would benefit from these new instructions, they would burden the 8086 instruction set. Davidow made a surprising and important decision: leave all 12 instructions on the already designed 8085 CPU chip, but document and announce only two of them! A CPU chip is a monolithic silicon structure that doesn't easily allow adding or removing logical functions, whereas a CPU's paper reference manual is easily modified.

16 versus 8 bits
The older 8080 provided a few 16-bit operations, including load, add, and shift. In concert, these would let the programmer reasonably code multiply and divide operations. A clear enhancement for the 8086 was to provide a complete set of 16-bit operations as well as multiply and divide hardware. All the older chip's 8-bit data capabilities were also maintained. However, if the 8086 had only a 16-bit data bus interface to memory, it would have lost compatibility with the earlier 8-bit processor and the versatility to perform single-byte operations. Worse than this, Intel sales and marketing had made many disparaging remarks about 16-bit CPUs, and Intel would have to eat its words, all 16 bits of them. So the mandate was for a CPU with single-byte memory addressing that could operate on 2-byte data as well. The 8086 was given a 16-bit data bus to memory, and earlier 8-bit bus memory systems wouldn't work with it.

Fortuitously, the 8086 CPU chip design evolved into two separate products: the 8088 was instruction compatible with the 8086 but featured a one-byte-wide data bus and could thus be easily placed into existing 8-bit computer systems, saving customers from having to redesign their hardware platform and memory system. The 8088 executed the same code but was just a little slower than the 8086, which featured a 16-bit-wide data bus. Hence, Intel could advertise the 8086 family as both 8- and 16-bit processors. The eventual IBM PC used Intel's 8088 CPU chip, and IBM's Displaywriter word processor used the 8086 CPU chip.

Memory size
The 8008 could address only 16 Kbytes of memory, and the 8080 could address up to 64 Kbytes. Most applications of their day fit within these parameters, but computer users' needs for memory have historically always increased. Memory addressing needed to be expanded in the new 8086 CPU, and Intel's primary business was DRAM memory chips; Intel wanted to sell more, bigger memories.20,21 Regrettably, 16-bit addresses don't provide an easy way to address more than 64 Kbytes. (For larger 24- or 32-bit computers, their larger word sizes provide larger addresses and consequently enable a lot more memory.)

The IBM 360 designers in 1960 were also concerned with memory addressing.22 That machine's instructions used three components to generate a memory address:

- a 12-bit displacement in the instruction,
- a base register to hold a 16-bit address for the start of a data region, and
- an index register to select a unique datum.

These three values were added together to generate a memory address. The benefit is that instructions were smaller, holding only 12 bits of address. In a typical program, a subroutine call uses a branch and link register (BALR) instruction to load a base register with the subroutine's data region address. Improving on this idea, RCA's CPU shifted the index register value to the left depending on the data type being referenced. This saved the programmer from having to correct an index value for the referenced data type. These two mainframe CPUs provided two important ideas: using registers to augment a memory address, and shifting a register prior to adding its value. However, because any one of the 16 general-purpose registers could be chosen in both these systems, 4 bits of the instruction were used to designate an index or base register.
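
As a rough illustration of these two ideas, the C sketch below computes an effective address as base + (index shifted by the operand size) + 12-bit displacement. It is schematic only, not a faithful model of the IBM 360 or RCA instruction formats, and the function and variable names are mine.

#include <stdio.h>
#include <stdint.h>

/* Effective address from three components: a base register (start of a data
   region), an index register scaled by the operand width (RCA's refinement),
   and a 12-bit displacement carried in the instruction itself. */
static uint32_t effective_address(uint32_t base, uint32_t index,
                                  unsigned scale_shift, uint16_t disp)
{
    return base + (index << scale_shift) + (disp & 0x0FFF);
}

int main(void)
{
    uint32_t data_region = 0x8000;  /* loaded into a base register, e.g. by BALR */
    uint32_t element = 5;           /* element number held in an index register  */

    /* 4-byte elements: shift the index left by 2 before adding. */
    printf("0x%X\n", effective_address(data_region, element, 2, 0x010));  /* 0x8024 */
    return 0;
}
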
Morse extended these two ideas in his 8086 CPU design.
He partitioned memory into regions for code, data, and stack and used specialized 16-bit segment registers: code segment (CS), data segment (DS), and stack segment (SS). Segment register selection was implicit, saving instruction coding bits. These registers were chosen automatically; the instruction pointer (IP) register (program counter) was always in the CS, the stack pointer was always within the SS, and data references were to the DS. To enable data transfer between two different segments, one extra segment (ES) register was provided, which was chosen with an explicit instruction prefix byte when needed.

Extending RCA's idea, Morse called for shifting a 16-bit segment register value 4 bits to the left before adding the address displacement, forming a 20-bit address. Therefore, although the 8086 was a 16-bit CPU, every address was 20 bits wide and memory could total 1 Mbyte. Typically, an 8086 programmer organized memory with numerous segments, less than 64 Kbytes each. The long subroutine call instruction loaded both the IP and the CS register to set the context for a subroutine, similar to the IBM 360's use of the BALR instruction.
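
Morse's scheme is compact enough to state in a single line of code: physical address = (segment << 4) + offset, kept to 20 bits. The C sketch below illustrates the calculation; the function name and example values are mine, not Intel's.

#include <stdio.h>
#include <stdint.h>

/* 8086 address formation: a 16-bit segment register (CS, DS, SS, or ES)
   shifted left 4 bits, plus a 16-bit offset, giving a 20-bit address
   that can reach anywhere in 1 Mbyte. */
static uint32_t physical_address(uint16_t segment, uint16_t offset)
{
    return (((uint32_t)segment << 4) + offset) & 0xFFFFF;  /* keep 20 bits */
}

int main(void)
{
    /* CS = 0x1234, IP = 0x0010 -> instruction fetched from 0x12350. */
    printf("0x%05X\n", physical_address(0x1234, 0x0010));

    /* Offsets within one segment span at most 64 Kbytes, but segments can
       start on any 16-byte boundary within the 1-Mbyte space. */
    printf("0x%05X\n", physical_address(0xF000, 0xFFFF));  /* 0xFFFFF */
    return 0;
}

Because many different segment:offset pairs alias the same physical address, programmers usually treated each segment as an independent region of at most 64 Kbytes, as described above.
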
8086 legacy
Although Intel had originally intended the 8800 to be its third-generation CPU, the 8086 played that role by default in 1978. Constrained to be compatible with the 8080 family, it influenced its precursor (the 8085) and dropped a few 8080 features. Although it was decreed to not have floating-point arithmetic, an honor reserved for the 8800, the 8086 was soon joined by the 8087 floating-point coprocessor. Over the following generations (the 286, 386, 486, and Pentium), enhancements provided memory protection, dynamic memory management, and 32-bit data.

Even though it was a crash, stop-gap program, the 8086 CPU provided a solid base for successive and remarkable generations of CPU chips at Intel (see Figure 1). Its talented computer chip architects were versatile and creative in response to the challenge of building a new, but compatible, computer chip.

References
1. S. Mazor, "The History of the Microcomputer," Readings in Computer Architecture, M. Hill, N. Jouppi, and G. Sohi, eds., Morgan Kaufmann, 2000, p. 16.
2. R. Noyce and M. Hoff, "A History of Microprocessor Development at Intel," IEEE Micro, vol. 1, no. 1, 1981, pp. 8-21.
3. S. Mazor, "Micro to Mainframe," IEEE Annals of the History of Computing, vol. 27, no. 2, 2005, pp. 82-84.
4. MCS-4 Micro Computer Set, Data Sheet #7144, Intel, 1971.
5. M.E. Hoff, S. Mazor, and F. Faggin, "Memory System for a Multichip Digital Computer," US Patent #3,821,715, Intel Corp., June 1974.
6. S. Mazor, "8-Bits of Irony," IEEE Annals of the History of Computing, vol. 28, no. 2, 2006, pp. 73-76.
7. MCS 8 User Manual, Intel, 1972.
8. G. Bylinsky, "Here Comes the Second Computer Revolution," Fortune, Nov. 1975.
9. S. Mazor, "Intel 8080 CPU Chip Development," IEEE Annals of the History of Computing, vol. 29, no. 2, 2007, pp. 70-73.
10. F. Faggin, M. Shima, and S. Mazor, "Single Chip CPU," US Patent #4,010,499, Intel Corp., 1977.
11. T. Jackson, Inside Intel, Dutton, 1997.
12. S. Mazor, "Fairchild Symbol Computer," IEEE Annals of the History of Computing, vol. 30, no. 1, 2008, pp. 92-95.
13. M. Wilkes, "The Genesis of Microprogramming," IEEE Annals of the History of Computing, vol. 8, no. 2, 1986, pp. 115-126.
14. S. Mazor and S. Wharton, "Compact Code: iAPX 432 Addressing Techniques," Computer Design, May 1982, pp. 249-253.
15. 8085 User Manual, Intel, 1977.
16. S. Morse et al., "Intel Microprocessors 8008 to 8086," Computer, Oct. 1980, pp. 42-60.
17. F.G. Heath, "Large-Scale Integration in Electronics," Scientific American, Feb. 1970, p. 22.
18. G.E. Moore, "Cramming More Components onto Integrated Circuits," Electronics, vol. 38, no. 8, 19 Apr. 1965, pp. 114-117.
19. "The Technical Impact of Moore's Law," IEEE SSCS Newsletter, vol. 20, no. 3, 2006.
20. G. Bylinsky, "Little Chips Invade the Memory Market," Fortune, Apr. 1971, pp. 100-104.
21. L. Berlin, The Man Behind the Microchip, Oxford Univ. Press, 2005.
22. B.O. Evans, "System/360: A Retrospective View," IEEE Annals of the History of Computing, vol. 8, no. 2, 1986, pp. 155-179.

Readers may contact Stanley Mazor at stanmazor@sbcglobal.net.

Contact department editor David Walden at annals-anecdotes@computer.org.

