When Exxon Wanted To Be a Personal Computing Revolutionary

An anonymous reader writes with this story about Exxon's early involvement with consumer computers. "This weekend is the anniversary of the release of the Apple IIc, the company's fourth personal computer iteration and its first attempt at creating a portable computer. In 1981, Apple's leading competitor in the world of consumer ('novice') computer users was IBM, but the market was about to experience a deluge of also-rans and other silent partners in PC history, including the multinational descendant of Standard Oil, Exxon. The oil giant had been quietly cultivating a position in the microprocessor industry since the mid-1970s via the rogue Intel engineer usually credited with developing the very first commercial microprocessor, Federico Faggin, and his startup Zilog. Faggin had ditched Intel in 1974, after developing the 4004 four-bit CPU and its eight-bit successor, the 8008. As recounted in Datapoint: The Lost Story of the Texans Who Invented the Personal Computer, Faggin was upset about Intel's new requirement that employees had to arrive by eight in the morning, while he usually worked nights. Soon after leaving Intel and forming Zilog, Faggin was approached by Exxon Enterprises, the investment arm of Exxon, which began funding Zilog in 1975."
  • Ah the Z-80

    My first (microprocessor) love. And one to which I would have remained faithful had I not been entranced by bigger silicon....

    • Re:Ah the Z-80 (Score:5, Insightful)

      by Crashmarik ( 635988 ) on Sunday April 26, 2015 @09:02PM (#49557217)

      I still remember hoping the successors would make some headway. The LSI-11, MC68000, and Z-80: all proof that evolution doesn't select for excellence.

      • Re:Ah the Z-80 (Score:5, Interesting)

        by LMariachi ( 86077 ) on Sunday April 26, 2015 @09:15PM (#49557259) Journal

        Z80s are still being manufactured and still in use all over the place, just not so's you'd see them.

        • Z80s are still being manufactured and still in use all over the place, just not so's you'd see them.

          The same is true of 68k-family cores, which are also still being used in pretty much the same places as Z80s.

        • Re:Ah the Z-80 (Score:5, Insightful)

          by jeffb (2.718) ( 1189693 ) on Sunday April 26, 2015 @11:41PM (#49557831)

          If you have a child in middle school, there's a very good chance they'll be required to use a TI calculator -- these days, most likely a TI-84. Those calculators run on a Z80. If your child is ambitious, they can still tinker with Z80 assembly on an actual physical host.

          This is a small tribute to the Z80 processor, and a huge, scathing indictment of TI's lock on the education market. ~US$100 for a Z80-based calculator? In 2015? It was a sweet chip in 1977, and it's clearly still useful. But at this point these calculators should be selling for well under $10.

          • Re: (Score:2, Informative)

            by Anonymous Coward

            The Z80 also had a radiation-hardened version, so it's used in a lot of special locations. Rabbit Semiconductor, now Digi, still makes them and includes support for current network technologies.

          • by Agripa ( 139780 )

            The crown jewels of the TI programmable calculators are the firmware, not the hardware. HP had a similar thing going: they wrote an emulator running on ARM to emulate the Saturn 4-bit CPU from their earlier calculators, maintaining backward compatibility.

        • Re:Ah the Z-80 (Score:4, Insightful)

          by TheRaven64 ( 641858 ) on Monday April 27, 2015 @05:41AM (#49558617) Journal
          They're increasingly hard to justify, though. Cortex-M cores are really, really cheap (the M0 and M0+ especially) and a modern 32-bit instruction set can be a significant win. You can't justify a 16-bit microcontroller on cost grounds anymore, let alone an 8-bit one. The main place Z80s are still used is in systems designed in the early '80s that would cost too much to change, but which need periodic repairs.

          I've seen a few things recently that take an amusing middle ground: they buy ARM cores and use them to run a Z80 emulator, because it was cheaper to get the associated peripherals to attach to the ARM core.
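
          To give a sense of how little code that takes: the heart of such an emulator is just a fetch-decode-execute loop. Here's a hypothetical minimal sketch in C -- the opcode values are real Z80 encodings, but the struct layout and function names are invented for illustration, and flags, timing, and interrupts are ignored:

            #include <stdint.h>

            /* Hypothetical, stripped-down Z80-style core: illustration only. */
            typedef struct {
                uint8_t  a, b;         /* two of the eight-bit registers */
                uint16_t pc;           /* program counter */
                uint8_t  mem[65536];   /* 64 KB address space */
                int      halted;
            } Z80;

            static void z80_step(Z80 *cpu) {
                uint8_t op = cpu->mem[cpu->pc++];               /* fetch */
                switch (op) {                                   /* decode + execute */
                case 0x00:                               break; /* NOP */
                case 0x3E: cpu->a = cpu->mem[cpu->pc++]; break; /* LD A,n */
                case 0x80: cpu->a += cpu->b;             break; /* ADD A,B (flags omitted) */
                case 0x76: cpu->halted = 1;              break; /* HALT */
                default:   cpu->halted = 1;              break; /* opcode not implemented here */
                }
            }

            int main(void) {
                static Z80 cpu;                      /* zero-initialized */
                cpu.mem[0] = 0x3E; cpu.mem[1] = 42;  /* LD A,42 */
                cpu.mem[2] = 0x76;                   /* HALT */
                while (!cpu.halted) z80_step(&cpu);
                return cpu.a;                        /* exit status 42 */
            }

          Even a low-end Cortex-M running a loop like that can comfortably outpace the original 4 MHz silicon, which is why pairing an ARM core with its cheap modern peripherals and emulating the Z80 on top can be the economical option.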

          • Re:Ah the Z-80 (Score:5, Informative)

            by AmiMoJo ( 196126 ) on Monday April 27, 2015 @07:53AM (#49558957) Homepage Journal

            8-bit MCUs are still very common and have many advantages over ARM. Cheap as ARM is, it doesn't tend to get down to the few-tens-of-cents range that 8-bit MCUs do, and the cores often require much more support hardware (such as voltage regulation, because they can't run from 5V, or need 1.8V to get the power consumption down). Developing for them is also much more involved, and particularly for high-reliability applications it can be harder to audit the code and guarantee safe operation.

            ARM has a lot of advantages too, but when you just need a cheap, easy-to-use (software- and hardware-wise) MCU that consumes next to no power, 8-bit is still king.

          • by Agripa ( 139780 )

            Where higher performance is not necessary, 8-bit and maybe 16-bit alternatives to ARM are still both cheaper and lower power, but the infrastructure advantage of ARM is making even that difficult to ignore, and I would not want to run an IP networking stack on anything smaller. I suspect the largest advantage 8- and 16-bit microcontrollers have at this point is that they are available in packages that ARM is not but should be. The ARM manufacturers seem to be playing a game of market segmentation which is hur

        • A big advantage of the Z-80 was the peripheral chips. Are they still available?

          In the days when a printer port needed a plug-in board 10 inches across, the Zilog chips allowed much smaller and simpler boards. We built a hardened system for RF transmitter control that was one quarter the size and cost of the competition.

          I was very disappointed that later chip sets still required so many extra "helper" chips. It was not until about 15 years ago that (almost) single-chip peripherals reappeared.

      • Re: (Score:3, Interesting)

        by Anonymous Coward

        The electronics industry isn't a natural process; it's part of human artifice. So I would say it's proof that the market doesn't select for excellence, something explored well in works like "The Century of the Self", and the reason why marketing departments, even if staffed by idiots, are well funded: people buy what they are told about more than what has the best functionality for developers. Even if the target market is developers... ARM is less well known than Intel despite leading by volume for de

      • The Z80 design was moribund for several years. If Zilog had quickly made a compatible successor to the Z80 with a 16-bit datapath and a multiply instruction, we'd be complaining about a Zilog monopoly instead of an Intel one.
    • Actually, Zilog was a really fun place to work, as I learned a lot of stuff while working there.

      • It's interesting that you say that. As somebody who works in fabs, I have always heard horror stories about the working conditions in Zilog's fabs. Pretty much all of them center on oil executives' expectations of process yields and cycle times that were completely unrealistic for semiconductor manufacturing. This was combined with managers who felt it was OK to stand on an employee's desk and scream at them.
    • by Agripa ( 139780 )

      I considered the Z-80 a big upgrade from the 8080 simply because the assembly syntax discerned type from the operands, like the later 8086; I hated having separate mnemonics for this on the 8080 (the Z80's single LD covers what the 8080 spelled as MOV, MVI, LDA, STA, and so on). The dedicated index registers and their addressing modes were nice as well. I considered the 8086 a further improvement over the Z-80.

  • Z80 was in TRS-80 (Score:4, Interesting)

    by ShanghaiBill ( 739463 ) on Sunday April 26, 2015 @08:48PM (#49557163)

    I remember learning Z80 assembly on the "Thrash 80". Great microprocessor. It had two register banks, so context switches and interrupts were really fast. There were also some undocumented instructions, and if you knew those you had a lot of street cred with the other teenage nerds. Fun times.
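
    For anyone who never used them: the two banks are flipped with the EX AF,AF' and EXX instructions, each a single short instruction, so an interrupt handler gets a fresh set of registers without pushing anything to the stack. A hypothetical C model of the idea -- the register names are real, but the struct and the bank-index trick are invented for illustration (the real EXX exchanges contents rather than selecting a bank, though the effect here is the same):

        #include <stdint.h>
        #include <stdio.h>

        /* Invented-for-illustration model of the Z80's two register banks. */
        typedef struct { uint16_t bc, de, hl; } Bank;

        typedef struct {
            Bank bank[2];   /* [0] = BC/DE/HL, [1] = BC'/DE'/HL' */
            int  active;    /* which bank EXX has selected */
        } Z80Regs;

        static void exx(Z80Regs *r) {  /* models the EXX instruction */
            r->active ^= 1;            /* constant time, no stack traffic */
        }

        int main(void) {
            Z80Regs r = {0};
            r.bank[r.active].hl = 0x1234;   /* main-line code state */

            exx(&r);                        /* interrupt entry */
            r.bank[r.active].hl = 0xBEEF;   /* handler scribbles on the shadow set */
            exx(&r);                        /* interrupt exit */

            printf("HL after return: %04X\n", r.bank[r.active].hl);  /* 1234 */
            return 0;
        }

    Compare that with an 8080-style handler, which has to PUSH and POP every register pair it touches.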

    • by Hovsep ( 883939 )

      I learned programming on a TRS-80 Model II myself. As I recall it was the "Trash-80," not the "Thrash-80" -- a derogatory name quickly picked up by its fans as its nickname. CLOAD A... and hope the tape wouldn't fuck up in some way or another. At least it worked, which is more than I can say for trying to save or read from tapes on a Timex 1000. Was anyone ever successful in getting a cassette tape to work with that thing?

      • My first summer job was at a music store, doing some simple coding and data management on a TRS-80 (with 16K, BASIC, and even a floppy drive!). At least that was my job when they didn't need me to man the till or help fill out rental contracts for students who were going to be taking band.

        There were some simple software packages available commercially for stuff like payroll, but my boss wanted custom stuff so I wrote it all myself.

        I remember staying late a lot, after the store closed, and playing games that

      • I still have my ZX81. I wrote an accounting program to track expenses on my 300-acre farm. I had no trouble loading the tape every month. All my entries were stored in variables. I could print reports sorted by month, vendor, or category on a 4-inch thermal printer. I used it for three years until I got a TRS-80 Model 4P in 1984. I used that for 10 years until it died.
      • I remember it well...my father worked for Univac at the time. He bought one of the first 100 TRS-80s off the line.

        Of course he opened it up, and our motherboard had hand-soldered wires to replace incorrect traces on the board.

        That winter (we lived in Tennessee at the time) we got so much snow that from Christmas to the end of January we had three school days.

        What I remember most was the "Learning Basic" book, with the TRS-80 with legs talking to you from the margins.....

        good times....good times....
  • by MrLogic17 ( 233498 ) on Sunday April 26, 2015 @09:09PM (#49557233) Journal

    Managers! Learn this lesson from history: Intel lost one of the world's greatest computer chip designers, and created its own competition, by imposing arbitrary work requirements and not recognizing work-life balance.

    Employees are people, not machines. Your greatest talent will, at some point, say "screw you" - and start competing with you. Unless you take care of them like human beings.

    • by Michael Woodhams ( 112247 ) on Sunday April 26, 2015 @09:25PM (#49557289) Journal

      There are three lessons here. One is about arbitrary work requirements, a point you've made well.

      Second is the problem that arises when vertical integration in your company means that one level's customers are another level's competitors. This conflict of interest is liable to drive away customers. (A company my father worked for many years ago had a similar issue: one branch manufactured and sold refrigeration equipment and spare parts. Another branch maintained and repaired refrigeration equipment, so its competitors were the manufacturing branch's customers. The maintenance branch was separated into a new company to avoid this problem.)

      Third is that when you have a large corporation with an innovative product, that product's potential can easily be crippled by being held hostage to the vested interests of other parts of the corporation.

      • Third is that when you have a large corporation with an innovative product, that product's potential can easily be crippled by being held hostage to the vested interests of other parts of the corporation.

        Wait, when did we start talking about Sony?

      • by Anonymous Coward

        This is what killed Eastman Kodak. I worked for them for many, many years. They invented a lot of cool stuff, including the digital camera, but buried it because they didn't want to hurt the film business. "Buried it" -- what a joke! I can't believe anyone in charge could possibly believe that it could be buried, but they refused to develop it so they could "save the film business." Now it's "Kodak who?" Sad. It was a great company to work for.

    • "Mangers! Learn this lesson from history: Intel lost one of the word's greatest computer chip designers, and created their own competition by making arbitrary work requirements, and not recognizing work-life balance."

      Minions! Learn this lesson from history: even the greatest computer chip designer can't build up competition against us, so you'd better welcome your management overlords.

      We are Intel, The Almighty, but where is Zylog, now?

      • by sjames ( 1099 )

        We are Intel, The Almighty, but where is Zylog, now?

        Zilog is doing just fine producing microcontrollers based on updates to the Z80.

    • by Anonymous Coward on Sunday April 26, 2015 @10:42PM (#49557613)

      When asked to attend an 8:00 AM meeting, the programmer responded that he didn't stay up that late.

    • If you didn't get in by 8, you had to talk to Andy. Some good engineers stayed away from Intel because of Grove's strictness. In retrospect, it was probably a bad choice. The brains of Silicon Valley chose silicon when they founded Fairchild Semiconductor, and when they moved on to found Intel a decade later, the best move was to follow them. They made some bad and distasteful choices, but overall they were just kind of brilliant and improved the world.

    • by AmiMoJo ( 196126 )

      Note to PHBs: The lesson is not "make sure they sign a non-compete agreement".

  • by michaelmalak ( 91262 ) <michael@michaelmalak.com> on Sunday April 26, 2015 @09:17PM (#49557265) Homepage

    First, a correction:

    the company's fourth personal computer iteration

    True only if you ignore the Apple I and the Apple ///, because there were the Apple ][, Apple ][+, and Apple ][e.

    Now, the Apple ][c came out during a brief time when I was trying to ignore computers, so I didn't pay much attention to it at the time, but this from the summary caught me by surprise:

    first attempt at creating a portable computer

    How can anything requiring an external CRT be considered portable? I mean, even by Compaq and Kaypro standards? Looking at Wikipedia, there was apparently a 1-bit LCD display available, but even that was external with no fixed mount. I mean, yeah, they shrunk the form factor, which I would hope they could do after seven years, but portable? No, regardless of their claims.

    • How can anything requiring an external CRT be considered portable? I mean, even by Compaq and Kaypro standards?

      An Apple II was more portable than, e.g., a Kaypro 4, because there was a TV you could use as a monitor pretty much anywhere, and it was half the weight to carry around... subjectively; I haven't compared numbers.

      • TVs just weren't laying around, even in 1985. They were bulky and expensive, and any TV probably had a computer or videogame hooked up to it already.
        • TVs just weren't laying around, even in 1985.

          What, they had an active social life? They were going to parties?

          They were bulky and expensive,

          What? CRTs barely shrank from the 1980s until they went out of fashion.

          and any TV probably had a computer or videogame hooked up to it already.

          Oh noes!!11!1!!

        • by AK Marc ( 707885 )
          TVs were just laying around. And it didn't matter if there was something there; you hooked up to it. One on channel 3 and the other on channel 4, or 100 things on channel 3, no problem, so long as only one was on at a time. I had a spare attached to a TV, so if someone brought over a C64 or IIc, it'd be a 2-second plug to be up and running. Took longer for the tube to warm up than to hook it up.
        • Yeah, because it's really hard to unplug the cable from that Atari console and plug it into the Apple II. It requires electricians with union cards and stuff.

          • Yeah, because it's really hard to unplug the cable from that Atari console and plug it into the Apple II. It requires electricians with union cards and stuff.

            Bah, sophisticated people had one of those combiners, where you plugged the aerial/antenna outputs from your aerial, your VCR, and several other devices like computers and game consoles into it, and had one cable going to your TV, either passive or with a switch to select the input. Advanced Electrician's Magic, knowledge now lost to the ages.

      • by gl4ss ( 559668 )

        By that logic the Sinclair Spectrum was more portable...

        If it's portable, it includes the monitor. Or rather, it's an all-in-one in one portable casing.

    • Re:TIL (Score:4, Insightful)

      by dissy ( 172727 ) on Sunday April 26, 2015 @10:39PM (#49557605)

      The Apple //c was only 7.5 pounds, making it FAR more portable than the original Compaq Portable, which was 28 pounds.

      I believe the term you're claiming this isn't is "laptop."
      But for the time, these were as portable as you got.

      You didn't need packaging material to keep the slightest shock from breaking something, they could be disconnected and moved by a single person without triggering any safety regulations (which usually apply to lifting 50 pounds or more), and they could be transported as a single unit.

      Of course adding extra peripherals limits that portability - just like now - but the most common hardware was built in and self contained.

      The only big downside for portability on the Apple //c was that the display was an option: you could choose between the attachable LCD and an external black-and-white (well, green) CRT that was much cheaper. The CRT was not very portable, although I remember being able to carry it by the built-in handle as a child, and it was just as fragile as any other CRT at the time.

        • Is it? 7.5 versus 28 pounds does not really make a huge difference. The portable use case for both is carrying it to the car, walking it to the neighbors', etc. Sure, it is easier to pick up and move a 7.5-pound device, but 28 pounds is easily movable as well. They both seem to be in the same stratum of portability to me.
        • by dissy ( 172727 )

          Personally, I do consider both earlier examples portables.

          But the only other comparison would be to non-portables, which covered most everything else available at the time.

          I would say both my PCjr and AT&T 4400 were pretty small and light compared to most microcomputers before them. But either of those was still three trips to the car, or five trips total for both by putting all the cables and such in a box together.

          The Compaq portable was a single trip, as was my first //c with LCD.

          Most older micro's,

      • The Osborne, the original "Compaq Portable", the original "IBM Portable", and the Hyperion were all classed as "luggable" by the real people who didn't work in marketing. It was roughly the 20-to-30-pound range (yes, we were still using pounds back then) with a fat-briefcase form factor. The Osborne product was first, but 8-bit general-purpose machines were doomed by then. To my mind the Hyperion was the best of them, but they were months too late in the "IBM BIOS compatibility" race and it cost them the business.
      • My housemate used a IIc and other systems by the simple expedient of having one monitor at work and another at home.

        He did the same thing years later with a tiny 386-based system, justifying it by pointing out that the entire setup was both more powerful than and half the price of comparable "laptops".

        System portability in the days before widespread Internet was _extremely_ useful.

    • You can ignore the Apple ///. Approximately everyone at the time did so.

  • by Anonymous Coward

    Hardly. Unless Apple had a secret mainframe department, the competition for the home computer market was Commodore, Atari, Radio Shack, TI....

    Enough with the revisionism already.

  • by Anonymous Coward

    My second PC language was Zilog assembler, on the Trash-80. The CPU got a new life in the Nintendo Game Boy and the second/third generation of digital organizers. IIRC, the clock speed was increased several times, with the Z-80H running at 8 MHz.

  • I spent several years in the early '80s programming the Z80 and 68000 families in assembler for embedded communications systems. I started with the hardware design and built custom firmware and a real-time OS. So much more rational and powerful than the Intel "equivalents." Best programming I ever did. Sometimes the good guys don't win.

    And as an aside, I had the Intel-based IBM industrial controller machines (960?) that formed the basis of the PC in my research lab in graduate school, built bare metal packe

  • by coming up with a company brand name that sounds like a Star Wars planet.
  • by stox ( 131684 ) on Sunday April 26, 2015 @10:46PM (#49557621) Homepage

    and ZEUS!

    Zilog was a real Unix contender for a while.

    • by sconeu ( 64226 )

      I worked with one of those for about 5 years. Loved it.

      We were developing an artillery control system and used a Z8000 as the CPU. We did development on the S8000 under ZEUS.

      The Z8000 was really a nice chip. Much nicer than the 286.

  • Exxon had the Qyx office systems which competed with Wang word processors before there were PCs. They had the nicest keyboards that I ever used on a typewriter. The feel of the keys was just right for fast, accurate typing. This was around 1979-80.

  • by Tablizer ( 95088 ) on Monday April 27, 2015 @02:26AM (#49558253) Journal

    PHB1: "We have too much money from oil. What are we going to do with it?"

    PHB2: "I got it, let's be IBM! Let's make those computer thingamajigs."

    PHB1: "Brilliant! I vote we both get a bonus for that idea."

  • So the navigation system on the Valdez was running a Zilog processor?

    Evil oil companies!!!

  • by advantis ( 622471 ) on Monday April 27, 2015 @03:25AM (#49558359)

    From TFS: "Faggin was upset about Intel's new requirement that employees had to arrive by eight in the morning, while he usually worked nights."

    I've heard both sides of the story:

    Side A: But if you're in the office while everybody else is in, you can work more efficiently, as everybody else is there to answer your questions.
    Side B: Some of the best engineers I've worked with worked nights. Some of them slept under their desks and rarely showered, but none of the 9-5 people came close to their performance.

    Basically, if people perform, don't mess with their schedule or their appearance.

    If you're on Side B, Side A's supposed positive is actually a negative: everybody else is there. (Sarcastic tone of voice) Yeah!! Great, if you want to not get any work done because of all the "quick" questions everybody has, while "headphones on" doesn't register with them as "leave me alone!"

    • Side A: But if you're in the office while everybody else is in, you can work more efficiently, as everybody else is there to answer your questions.

      The benefit of having everyone in the office at the same time is that you can be a more effective team. Engineering is (mostly) a team sport. You have to structure the work environment right so distractions and pointless meetings are minimized; otherwise you are setting yourself up for failure. But most important is that you need an environment where the team can work effectively together. For most tasks this requires a non-trivial amount of direct interaction with coworkers. While time shifted teams c

      • I believe that you have a bias about "ideal teams".

        The OP described the 2 profiles that you can find:
        Side A: collaborative type
        Side B: competitive type

        When you work in a collaborative team, everybody unconsciously reduces their effort to a rhythm that is comfortable for the team.
        When you work in a competitive team, everybody does their best, so they work at their own rhythm.

        I experienced these two extreme environments, and the competitive spirit is the most efficient, BUT the collaborative spirit is more focused on rel

        • I believe that you have a bias about "ideal teams".

          Disagree. The ideal team structure for a situation can vary greatly depending on the task at hand and the personalities involved. I make no judgement about what is ideal for a given situation and I've seen a wide variety of team structures work effectively. But what is VERY clear is that having a single individual, no matter how talented, doing something wildly different than the rest of the organization is almost always a recipe for failure. There are exceptions that prove the rule but they are rather

      • What most enlightened companies do is simply say "you must be here at least from 2pm to 5pm," and everybody just schedules meetings and face-to-face time during those hours. People telecommuting will often see demands like that in the job description.

      • In 40 years of electronic engineering design work, the time I spent in necessary communication with others was on the order of 1 hour a month. Generally, time spent talking or listening is time spent not working.
      • Engineering isn't usually done by intense collaboration for eight-plus hours a day. There is a need for solitary time, or at least small-group time, and so if we schedule all the all-hands collaboration at well-chosen times we can accommodate people coming in at a fairly wide range of times. This does not mean you want an engineer going off in some weird direction without supervision or collaboration, but that you don't need to have everybody on the exact same schedule. This is useful, since not all peo

    • by Agripa ( 139780 )

      Side A: But if you're in the office while everybody else is in, you can work more efficiently, as everybody else is there to answer your questions.
      Side B: Some of the best engineers I've worked with worked nights. Some of them slept under their desks and rarely showered, but none of the 9-5 people came close to their performance

      Side C: Being there after hours allows productive work to be done without interruption, whether by people or by noise.

      At one place I worked, the boss had the phone system set to

  • I have heard that at Intel's Penang (Malaysia) site, the director is known to take attendance after 9am.
