Tuesday 17 June 2008

Stephen Morse: Father of the 8086 Processor

The engineer whose 30-year-old architecture still influences PCs today talks about how Intel took the SEX out of the pioneering CPU, and other tales from the beginning of the x86 era. Part 1 of a special five-part series.

In honor of the 30th anniversary of Intel's 8086 chip, the microprocessor that set the standard that all PCs and new Macs use today, I interviewed Stephen Morse, the electrical engineer who was most responsible for the chip. Morse talks about how he ended up at Intel (his interest in Volkswagens helped), his freedom to innovate at what was otherwise a buttoned-down company, and the importance of a brand-new style of chip development--as well as the 8086 feature that he called SEX.
PC World: For history's sake, when were you born?
Stephen Morse: I was born in Brooklyn in May 1940.
PCW: What inspired you to get involved in electronics?
SM: I was always fascinated by electricity, as far back as I can remember. I recall in sixth grade taking out whatever books I could find in the library on the topic. In junior high I asked my mother what field I could go into that would combine my love of electricity with my strong ability in mathematics. "Electrical engineering," she replied. From then on my career course was charted, and I never deviated--I went on to get a bachelor's, a master's, and a PhD, all in electrical engineering.
PCW: What was the first computer you ever used?
SM: Computer courses were unheard of when I did my undergraduate work, and even in graduate school the only exposure that I had to programming was an after-hours noncredit course on Fortran. At the conclusion of that course, we were allowed to run just one program on the school's IBM 650. Of course we didn't run it ourselves--we punched the program onto a deck of cards and handed the deck to a computer operator. We never even got to see the machine. That was in 1962.
PCW: When did you begin working at Intel?
SM: I began at Intel in May 1975.
PCW: How did you get the job?
SM: Prior to Intel, I had been working for the General Electric R&D Center in Schenectady, where I had single-handedly designed and implemented a complete software support system for a new innovation at that time: a computer on a card. When I decided I could no longer take the cold winters of upstate New York and wanted to move back to sunny California (I had taught at UC Berkeley prior to joining GE), I looked into what companies in the Bay Area were doing related work. I found a relatively unheard-of company called Intel that was involved with computers-on-a-chip, so I decided to send my resume. The company had a whole team of engineers doing the same things that I was doing by myself at GE, so we had a lot in common, and they made me an offer. I don't know if it was my microprocessor experience that got me the job, or the fact that I was very involved with Volkswagen engines in those days and the hiring manager blew the exhaust valve on the number-three cylinder of his VW bus a week after he interviewed me.
Intel 8086 Development
PCW: How did the 8086 project get started?
SM: The state-of-the-art product at the time was the Intel 8080. But Zilog was eating Intel's lunch with a processor called the Z80. The Z80 was compatible with the 8080 in every sense of the word, but it also filled in the 12 unused opcodes with some useful string-processing instructions. So it did more than the 8080, and Zilog captured the 8-bit market.
Intel wasn't too concerned because it was working on a new high-end processor called the 8800 (which would change names a few times before finally coming to market as the 432), and it fully expected that chip to be the future of the company. The schedules for the 8800 kept slipping uncontrollably, however, and management finally realized that they needed to come out with a midrange processor to counter the Zilog threat. But nobody expected it to be anything more than a stopgap measure, because once the 8800 came out there would be no need for such a midrange solution.
In the meantime I had just completed an evaluation of the 8800 processor design and written a report on it. My report was critical, showing that the processor would be too slow. Because of that report, management decided that I would be the ideal person to design the architecture for the stopgap measure. If management had had any inkling that this architecture would live on through many generations into today's Pentium processors, and become by far the most widely executed instruction set in computing history, they never would have entrusted the task to a single person. Instead it would have been designed by committee, and it would have come out much later than it did.
The person I worked for, Terry Opdendyk, was in charge of the software group. He walked into my office one day and asked if I would design the instruction set for the new processor that Intel needed. This was a complete break with tradition, because up until that time hardware people did all the architectural design, and they would put in whatever features they had space for on the chip. It didn't matter whether the feature was useful or not, as long as the chip real estate could support it. Now, for the first time, we were going to look at processor features from a software perspective. The question was not "What features do we have space for?" but "What features do we want in order to make the software more efficient?" (At that time the 8800 was also being designed by software people, but that processor was many years away from coming out the door.)
So there I was, a software person who would be chartered with what was normally considered a hardware task. Although Terry remained my boss, for the work on the 8086 I would report to Bill Pohlman, who was the project manager for the new stopgap processor.
PCW: Was there a specific goal in mind for the 8086?
SM: The only requirements that management gave me were to make it somehow 8080-compatible (so Intel could tell customers that they could run their existing assembly-language programs) and that it address at least 128KB of memory (one of Intel's customers had an application that was exceeding the 64KB limit of the 8080).
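For context, the shipping 8086 met that 128KB requirement with room to spare: it forms a 20-bit physical address (up to 1MB) by combining a 16-bit segment with a 16-bit offset. A minimal C sketch of that calculation, with illustrative values only:

    #include <stdio.h>
    #include <stdint.h>

    /* 8086 segment:offset addressing: the 16-bit segment register is
       shifted left 4 bits and added to a 16-bit offset, producing a
       20-bit physical address--far beyond the 8080's 64KB limit. */
    static uint32_t physical_address(uint16_t segment, uint16_t offset) {
        return ((uint32_t)segment << 4) + offset;
    }

    int main(void) {
        printf("0x%05X\n", (unsigned)physical_address(0x1234, 0x0010)); /* 0x12350 */
        return 0;
    }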
PCW: When did development start on the 8086?
SM: The last revision of my 8800 evaluation was dated April 14, 1976. I believe I started working on the 8086 in May, and on August 13 (three months later) I published Rev. 0 of the instruction set. It was actually more than just the instruction set, since it covered the register structure, I/O space, interrupt mechanism, memory addressing modes, etc. So we quickly started talking about the architecture rather than just the instruction set.
PCW: How big was the 8086 development team, and who else was involved?
SM: When it started, it was just Bill Pohlman and myself. Bill was the project manager and I was the engineer. After I finished the first cut of the architecture specs, Bill brought on board a logic designer named Jim McKevitt. Jim and Bill were my primary points of contact in the hardware group. Many other people (a chip designer and such) were added to the project later, although I didn't interact with them directly.
After I finished two revisions of the architecture specs, Terry enlisted Bruce Ravenel, a second software person I could bounce ideas off of, and together we kept refining the specs.
PCW: Describe the atmosphere and general feeling in your office during the creation of the 8086.
SM: Company culture at Intel varied depending on the level you looked at. On the group level, things were great--the software team at Intel was top-notch, and we were all proud of what we were doing and how we were doing it. But at a higher level, the corporate culture often got in the way. Andy Grove was famous in those days for implementing a "late list." Regardless of how late you worked the previous evening, if you were not in by 8:05 a.m. the next day you had to sign the "late list" when you entered the building. At first we all laughed about it, because nothing was done with the list. And Opdendyk even advised us that if we could not get in by 8:05, we should simply not show up until after 9, when the list was taken down. But sometime later upper management started insisting that the list be used in employee evaluations, and it was no longer something we could laugh about. The culture had other similar aspects, and pretty soon Intel stopped being a fun place to work.
PCW: What was your official role in the 8086's development? Your biggest contribution to the project?
SM: My official role was the chief architect, although that title wouldn't exist at Intel until several processors later. Actually, until Ravenel came on the scene, I was the only architect. McKevitt would try to design the logic to implement the architecture, and we had a lot of give-and-take between us as he would point out things that were very expensive to implement and I would then try to come up with alternate specifications.
The spec I was writing was at a high level. It specified the register set, but it didn't talk about the bus structure used to pass data between the registers or about the machine cycles during which the data actually got passed. That was all in McKevitt's area. I was writing and revising a document titled "8086 Architectural Specifications," and he was writing a companion document called "8086 Device Specifications."
PCW: What constitutes the "architecture" of a microprocessor?
SM: The architecture, as I see it, is the high-level spec of the processor. It consists of the instruction set, memory structure, memory segmentation, register structure, I/O space, interrupt mechanism, instruction operands, memory addressing modes, and so on.
PCW: Take us into the design process a little, step by step. With what tools and methods did you design the 8086?
SM: Although physical paper was still in fashion, computer files were starting to catch on, and I wrote the specs by typing them directly into a file. Personal computers were still a few years off, so I did my work on a remote terminal connected to a PDP-11 minicomputer. I wrote the document using TECO, a primitive text editor we had at the time. I included diagrams in the spec using ASCII characters--that is, drawing boxes made up of dashes, vertical bars, and plus signs.
There was a simulator program that somebody wrote, but I never used it. Instead the hardware group used it to verify the logic design and microcode to make sure that it correctly implemented the specs. The software group did other simulations to make sure that the addressing modes the processor provided would allow for an efficient implementation of high-level languages.
I had no direct involvement with testing the 8086, since the hardware department handled that entirely.
PCW: What obstacles, if any, did you face while developing the 8086?
SM: Very few. Because nobody expected the design to live long, no one placed any barriers in my way and I was free to do what I wanted.
The 8086 in Context
PCW: What are some of the distinguishing characteristics of the 8086 that made it stand out from other microprocessors of the day?
SM: Its most distinguishing characteristic was that it was a 16-bit microprocessor. I believe it was the first commercial 16-bitter in the microprocessor field. But the characteristics that I liked the most, and had the most fun designing and unifying, were the decimal arithmetic instructions and the string-processing instructions.
PCW: Why did Intel start making backward-compatible CPUs, and why did the company do it so well compared with other CPU manufacturers?
SM: The reason that Intel was concerned about backward compatibility (and the reason everyone is) is that you have a captive customer base that you don't want to lose. If you have customers all using the 8008, when you come out with your 8080 processor you want your customers to be able to migrate their existing applications easily. If they had to rewrite all their applications anyway, they would also be free to consider a new processor from the competition.
That's a lesson that Zilog learned the hard way. Zilog made its first splash with the Z80; that chip was compatible with Intel's 8080, so Zilog was able to steal Intel's customers easily. And it became a significant player in the marketplace. Then when the 16-bit race started, Zilog figured it had made a name for itself and could afford to do its own incompatible design for a 16-bit product, called the Z8000. But once Zilog's own customers discovered that programs could no longer be migrated from the Z80 forward, those customers became free to look around at the 16-bit marketplace, and they chose the 8086. Had Zilog gone with a 16-bit compatible upgrade of the Z80, history might have been different.
PCW: Was the 8086 designed with future backward-compatibility in mind?
SM: Backward-compatibility was certainly an issue when the 8086 was being designed. There were some instructions that were implemented and then hidden because we couldn't see a logical upgrade path for them in future processors. These instructions were actually on the chip, but we never documented them so that we would not be constrained by them in the future.
PCW: Can you share any funny, interesting, or unusual anecdotes about the 8086 that we haven't covered already?
SM: I always regret that I didn't fix up some idiosyncrasies of the 8080 when I had a chance. For example, the 8080 stores the low-order byte of a 16-bit value before the high-order byte. The reason for that goes back to the 8008, which did it that way to mimic the behavior of a bit-serial processor designed by Datapoint (a bit-serial processor needs to see the least significant bits first so that it can correctly handle carries when doing additions). Now there was no reason for me to continue this idiocy, except for some obsessive desire to maintain strict 8080 compatibility. But if I had made the break with the past and stored the bytes more logically, nobody would have objected. And today we wouldn't be dealing with issues involving big-endian and little-endian--the concepts just wouldn't exist.
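Morse's point is easy to demonstrate on today's x86 machines, which still store the low-order byte first. A short C sketch (the printed byte order assumes a little-endian host):

    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    int main(void) {
        uint16_t value = 0x1234;
        uint8_t bytes[2];
        memcpy(bytes, &value, sizeof value);  /* copy out the raw bytes */
        /* On x86 this prints 0x34 then 0x12: low-order byte first,
           the 8080 convention Morse describes. */
        printf("byte 0: 0x%02X, byte 1: 0x%02X\n", bytes[0], bytes[1]);
        return 0;
    }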
Another thing I regret is that some of my well-chosen instruction mnemonics were renamed when the instruction set was published. I still think it's catchier to call the instruction SIGN-EXTEND, with the mnemonic SEX, than to call it CONVERT-BYTE-TO-WORD with the boring mnemonic CBW.
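The renamed instruction survives in x86 today: CBW widens the signed byte in AL to a word in AX by replicating the sign bit. A minimal C sketch of the same operation:

    #include <stdio.h>
    #include <stdint.h>

    /* Sign extension, as performed by the 8086's CBW instruction:
       copy bit 7 of the byte into every bit of the new high byte. */
    static uint16_t sign_extend(uint8_t b) {
        return (b & 0x80) ? (uint16_t)(0xFF00 | b) : b;
    }

    int main(void) {
        printf("0x%04X\n", sign_extend(0xFB)); /* 0xFFFB, i.e. -5 */
        printf("0x%04X\n", sign_extend(0x05)); /* 0x0005 */
        return 0;
    }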

After the Launch
PCW: Besides the 8086, did you work on any other major CPU projects at Intel?
SM: No, that was the only one for which I was involved in the design. Although I had nothing to do with the design of the 286 and 386 successor processors, I was very familiar with them, and I wrote books about them as follow-ups to my 8086 book. I was also involved with those later processors at my next company, where I was a consultant to customers who were trying to design embedded systems using those processors.
PCW: From what I understand, the 8088 was the 8086's successor. What advantages, if any, did the 8088 offer over the 8086?
SM: The 8088 wasn't the successor--rather, it was a castrated version of the 8086. As of the day I left Intel the first time (in March 1979), I had never heard of the 8088, but a few weeks later I learned that the company was about to ship it. So you can see that it certainly wasn't a major design effort. What the company did was modify the data bus so that 16-bit data was sent out in two cycles, 8 bits at a time. That meant you could use the processor with all the 8-bit peripheral chips that were already in existence for the 8080, rather than waiting for a new set of 16-bit peripheral chips that were undoubtedly coming but weren't around yet.
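The bus change he describes is easy to picture: each 16-bit transfer simply becomes two 8-bit transfers. A hypothetical C sketch (the helper names are invented for illustration, and the low-byte-first ordering is an assumption consistent with the chip's little-endian layout):

    #include <stdio.h>
    #include <stdint.h>

    /* Stand-in for one 8-bit bus cycle. */
    static void bus_cycle(uint8_t byte) {
        printf("bus cycle: 0x%02X\n", byte);
    }

    /* 8088-style write: a 16-bit word goes out in two cycles,
       8 bits at a time, so existing 8-bit peripherals still work. */
    static void write_word(uint16_t word) {
        bus_cycle(word & 0xFF);         /* first cycle: low byte  */
        bus_cycle((word >> 8) & 0xFF);  /* second cycle: high byte */
    }

    int main(void) {
        write_word(0x1234);
        return 0;
    }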
That brings up an interesting story. I wrote a book called The 8086 Primer that turned out to be a best seller (it sold over 100,000 copies). I gave a copy to a friend of mine who knew nothing about computers but was proud to display it on his bookshelf. His son was studying computers in school at the time, and when he saw the book he made a comment to the effect that the 8086 was obsolete and had been replaced by the 8088. His son obviously didn't understand what the 8088 really was, but I realized that other people might be in the same situation and wouldn't buy my book because they thought that the processor was "obsolete." So in the next printing of the book, I changed the title to The 8086/8088 Primer, and suddenly everyone again thought that they were getting a book about the latest processor.
PCW: How has being the designer of the 8086 changed or influenced the course of your life?
SM: Not much at all. When people introduce me, they usually add something about my being the 8086 designer. But I usually cringe a bit, because I really don't think it was that great an accomplishment. Any bright engineer could have designed the processor. It would probably have had a radically different instruction set, but it would have had Intel's backing behind it and all PCs today would be based on that architecture instead. I was just lucky enough to have been at the right place at the right time.
PCW: What does the x86 architecture mean to you today? Do you think it is still relevant, or is it merely a vestige of the past?
SM: It's very relevant. There is an underlying instruction set that has propagated from the 8086 forward, such that any assembly-language code (or even machine-language code for that matter) that was written for the 8086 can still be executed on today's Pentium processors. Sure, now we have many new features and advanced caching that were never even imagined when I did the 8086 work. But the core instruction set that is inside every x86 processor is still the same as what was in the 8086.
PCW: Do you have any thoughts or comments you'd like to share about the 8086's 30th anniversary?
SM: That caught me by surprise. Until I received your e-mail, I wasn't even aware that this milestone was coming up. Happy Birthday!
PCW: How do you feel about the Apple Macintosh line's use of x86 architecture now? Were you surprised when you first heard about it?
SM: Yes, a little surprised, but it made a lot of sense. Suddenly I had no more excuse for not buying an Apple computer.
PCW: What are you up to these days?
SM: I'm retired now but still doing the same sorts of things that I've done throughout my career--namely applying computers to new applications. As a hobby I've started delving into genealogy, and I've discovered many ways in which I could use the power of the Internet and the computer to do genealogical searches in ways that were not possible before. So I put up a Web site with a collection of my Web-based tools, and I've developed a sort of cult following. My site now gets over 100,000 hits a day, and I've been invited to lecture on the topic worldwide. The Web site address is stevemorse.org.
PCW: Do you do any electronics tinkering or design in your spare time?
SM: I still enjoy tinkering with electronics as well, and every now and then I'll build something to amuse myself.
PCW: Where do you live now?
SM: San Francisco, in the same house that I lived in when I worked for Intel. I moved into this house in 1975, and except for a one-year house exchange when I lived in Paris, I have been in this house ever since.
PCW: Are you a Mac or a PC guy?
SM: I'm a PC guy. I long resisted the Mac because there were still programs that were written for the PC and would not run on the Mac. I felt it was like the Betamax/VHS story: Betamax was a better technology, but anyone buying a Betamax recorder would have a small selection of tapes available to rent and would be limited in who they could share tapes with. Now that you can get a Mac that executes x86 code, the situation has changed somewhat, but I've resisted a Mac for so long that it's hard to switch gears at this point.

Reference: http://www.pcworld.com/article/id,146917-pg,1/article.html
