Tuesday, May 29, 2012

Basic: How to read Honda OBD I codes

All passenger vehicles sold in the United States since 1996 have been equipped with OBD II, short for "On-Board Diagnostics, second generation." It's a reliable system that should give you information on engine and other faults for the life of the vehicle. The ostensible reason for mandating this system was to monitor vehicle emissions. Anyone can pull trouble codes with an OBD II scanner, which every dealer has and your local Autozone should possess. The scanner plugs into a port underneath the dash, in the driver or passenger footwell. It's not hard to find.

The precursor to OBD II was, quite predictably, OBD I. It wasn't around for very long, it wasn't as reliable, and it wasn't as useful, since each manufacturer did whatever it saw fit for reading the trouble codes.

For Civic and CRX models from the 1992 through 1995 model years, the OBD I system was in place. For Accords it's probably the same (although I am unsure, not having worked with them). Honda's particular method of getting OBD I trouble codes is slightly more involved than simply plugging in a scanner.
  1. Pull up the carpet in the passenger footwell, around the right side (near the door).
  2. Notice on the right side: there is a service connector with at least two slots into which a wire can be inserted. With the ignition key in the OFF position, insert a wire or paper clip between these leads to jump them.
  3. Flick the key into the ON position, but do not crank the engine.
When you have completed step 3, the trouble codes will be displayed to you in a special form: via the flashing of the warning lights on the dashboard. It doesn't actually matter which of the lights (engine, transmission, etc.) is flashing; the codes live in the sequence of flashes only, not in which light does the flashing. N.b.:
  • If a status light flashes once very quickly, that counts as 1.
  • If a status light flashes once for a longer period of time, that counts as 10.
  • When the light goes dark for an extended pause, the code is done. Add the numbers together: for example, four long flashes followed by one short flash is code 41.
  • The sequence repeats continuously, so if you missed it, just wait for the pause to end and it will start over. If you have multiple codes, there will be a pause between each new code. Make sure you get all of them.
Got your codes written down? Find out what they mean from the table below.
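If you'd rather let a computer do the adding, here's a minimal sketch of the same arithmetic in Python. This is just my own illustration of the counting rules above (the function name and flash notation are made up), not any official Honda tool.

```python
def decode_flashes(sequence: str) -> list[int]:
    """Decode a transcript of dash flashes into OBD I trouble codes.

    Notation (made up for this sketch): "L" = long flash (worth 10),
    "S" = short flash (worth 1), "," = the extended pause between codes.
    """
    codes = []
    for group in sequence.split(","):
        if group:  # skip empty groups from stray commas
            codes.append(10 * group.count("L") + group.count("S"))
    return codes

# Four long flashes then one short flash is code 41, the primary O2
# sensor heater code my 1995 Honda threw.
print(decode_flashes("LLLLS"))    # -> [41]
print(decode_flashes("S,LSSSS"))  # -> [1, 14]
```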

Hopefully this helps if you are in a situation similar to mine. I had to learn this procedure a few days ago when my 1995 Honda's engine idiot light came on. I noticed that the engine would sometimes hesitate, so I wanted to figure out what was wrong as soon as possible. A friend referred me to this method for pulling the codes. It threw code 41: primary O2 sensor heater. So I turned to my Haynes manual, which told me to check the sensor by disconnecting its output connector and measuring the voltage across two of the terminals (check your service manual for the details for your vehicle). I should have seen about 0.9 V with the engine warmed up; instead it read a steady 0.04 V from cold to hot. Clearly the part was broken.
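For anyone following along with a multimeter, the logic of that test can be written out as a tiny sketch. The thresholds are illustrative, assuming a common narrow-band sensor (output swinging roughly 0.1 to 0.9 V when warm); use the figures from your own service manual, since they vary.

```python
def o2_sensor_verdict(volts_cold: float, volts_warm: float) -> str:
    """Interpret an O2 sensor output-voltage check (illustrative thresholds)."""
    if volts_cold < 0.1 and volts_warm < 0.1:
        # Pinned near 0 V from cold to hot, like my 0.04 V reading: dead sensor.
        return "replace the sensor"
    if 0.1 <= volts_warm <= 1.0:
        # Output in the plausible range: the fault is probably elsewhere.
        return "sensor output looks normal; check wiring and heater circuit"
    return "out-of-range reading; recheck probe points against the manual"

print(o2_sensor_verdict(volts_cold=0.04, volts_warm=0.04))  # -> replace the sensor
```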

This is a very easy fix, since the primary O2 sensor sits right in the exhaust manifold up front, well within reach as long as you have the correct socket (which I borrowed from another friend, but if you want one, they can be found at Carquest for about $16.99). I replaced the primary O2 sensor, disconnected the battery to clear the code, and the light did not come back on. On the test drive the car felt better. Problem solved: the OBD told me exactly what was wrong without my needing any detailed knowledge of the engine's electrical system.

OBD I has two main advantages that make it pretty nifty for a shadetree mechanic.
  1. It doesn't need a scanner. If you know how to do it, you can read codes yourself at home without buying any equipment. This means you can do true home maintenance with as little outside support as possible.
  2. It's less troublesome if you want to modify your car (no experience here, but reliable friends tell me so). OBD II is considered more invasive and troublesome if you want to work on your engine in ways the manufacturer did not originally intend.
However, compared to OBD II, it has some crippling disadvantages that led to it being abandoned after just a few years.
  1. The placement and design of the "port" varied from vehicle to vehicle; no two manufacturers did it the same way. OBD II consolidated all the different systems behind one standard connector, so there was no reason to make mechanics learn something manufacturer-specific. A better solution was possible, and quickly found.
  2. Customers balked at paying diagnostic fees simply to have their codes read. Some unscrupulous dealers and other shops billed customers for "dashboard disassembly" as part of the diagnostic fee, which vastly overstates the difficulty and time required to locate the port and read the codes. Customers may not have known any better; reading codes was still novel, and the only alternative was a detailed physical inspection to learn the same secret.
  3. OBD II scanners provide more detailed trouble codes, and have more codes available.
  4. Adding up the codes with pencil and paper or from memory is not hard, but if you're doing it 50 times a day, the extra time adds up.
Regardless of how and why OBD I was replaced, there are still plenty of vehicles made in the early and mid-1990s on the road and in the secondhand market, and it is worth every owner's while to understand the benefits and limitations of buying a car of this age. As of 2012, a mid-90s Honda in good shape can still yield a few years of reliable, cheap, fun motoring with just the slightest of effort. However, if you have no mechanical ability or interest whatsoever, I would personally suggest you ensure that your next ride is a 1996 or later.
Thanks to Honda-Acura.net and Infobarrel.com.

Joke of the Day #10

Straight man: You shouldn't park here.
Speed freak: Why not?
Straight man: Bad part of town. They'll probably steal your radio.
Speed freak: Seriously? That's awesome.
Straight man, puzzled: Why?
Speed freak: I spend a lot of time lightening my car, and all I have to do is park in the right part of town and somebody will remove weight for free?
Straight man: ...
Speed freak: I'll just pop the hood. They can take the AC compressor if they want: lot of weight and it doesn't even work anymore!

Monday, May 28, 2012

Memorial Day: 1000 views

Although I have never used this blog to point out days of interest, I would like to note that today is Memorial Day in the United States, a holiday on which Americans commemorate those who have died while serving in the US Armed Forces. The sacrifice made by these individuals will never be forgotten by American patriots, and their souls are the subject of our most fervent prayers. Readers from around the world are invited to share whether their nation has a similar custom.

Cumulative Barn-megaparsec readership hit 1000 views as of May 28, 2012. In internet terms, this is about as insignificant a quantum of readership as can be measured. Still, I'd like to take a moment to thank all readers who have shown dedication enough to follow this blog in its current rough form for the past six months. I also welcome readers to offer their opinions in the comments. Thank you for your time. I look forward to many exciting new pieces during the summer.

Saturday, May 26, 2012

Condensed History of Video Games: Nintendo Seal of Quality (1984-1989)


No joke: Atari buried tons of E.T. games
When we left the video game saga in 1984, the market was in shambles and there were serious doubts that it would ever return to its prior glory. The fear was that video games would pass away like the hula hoop and the Cabbage Patch doll. Nothing suggested that the best was yet to come, but as in 1971 when Computer Space flopped, and 1977 when Pong got boring, 1984 was to be the darkness before the dawn of something better.

Atari 800
We haven't even discussed one of the biggest aspects of the video game crash. Video games earned their first foothold in homes at a time when there were no personal computers. The 1981 introduction of the IBM PC did not have an immediate effect on this market, since it was too expensive for most households and was basically a business tool; graphics were a minimal priority from the start. But as cheaper offerings from Commodore, an improved Apple lineup (including the new Mac for 1984), and cheaper PC clones from Compaq and others hit the market towards mid-decade, video games became a harder sell, because they offered only entertainment and couldn't do business or education. This drawback was a frequent theme of negative advertising: why buy a game system that will ruin your children's minds, when you can nurture them with a home computer that can help them research, study, and learn how to use a computer? Ironically, nearly all of the video game makers, particularly Atari and Coleco, had cheap home computer offerings as well, in the interest of diversification. The Atari 8-bit series (400 and 800 models) could also play some games, but they were full-fledged (although basic) computers. In the UK, Sinclair's dirt-cheap and featureless ZX-80 and ZX-81 were prompting a British computer education boom that would inspire a new generation of British programmers.
Commodore VIC-20: A steal at $199 in 1982

The trend everywhere was toward greater home computer ownership. That doesn't seem to conflict with video game ownership by modern standards, but in the early 80s all of these were rather expensive propositions, and so it was a fairly well-off family that could afford both a home computer and a video game library in the slumping economy of the early 1980s. Moreover, the value of computer and video games in the home was still not proven; the idea was still novel. When faced with the choice between a modern video game system and a modern computer, the question was easy for most families. They chose the home computer because it was capable of some tasks besides gaming. However, the machines of the day were unable to do some of the simplest tasks that we take for granted today, and home office technology was largely unavailable unless you had an expensive high-end PC. Needless to say, the World Wide Web did not exist yet, although there were rudimentary online services like Prodigy.

The home computer makers knew their market: they emphasized the game-playing ability of their machines to the same kids who would then trot out the sensible arguments when trying to get their mommies and daddies to fork over the cash for a new PC. All this was more fodder for the folks who insisted that the video game fad had run its course and would never be so large again, at least not from the standpoint of dedicated consoles.

Atari 2600 "Jr": yours for $49.99
One might be forgiven for assuming that Atari faded into oblivion after the Crash of '83, and that all we have are memories. Far from the truth! Atari had strong sales of its 8-bit computers after the crash and was actually back in the black by the end of 1984. Throughout the 80s they eked out small profits on a smaller market share, no longer driving the industry. If the 7800 (1986) and Jaguar (1993) were earnest attempts to regain their former control, nobody seemed to notice. The 7800 was simply nothing special, and the Jaguar was downright terrible. The bright spot in Atari's gaming portfolio continued to be the old 2600, which was repackaged in 1985, when it was technically obsolete, and sold at a correspondingly low price of $50, since all the design and tooling had long been paid for. Atari didn't stop making the 2600 until 1992, making it to this date the longest-lived home video game console. (However, Sony has pledged to build the PlayStation 2 for as long as demand exists, which means the PS2 will beat the 2600 if it survives beyond 2014.)
Atari 7800: Ready in 1984, delayed until 1986

The makers of video games are different from other companies. It is hard to think of a company in as enviable a position as Atari in 1982, when it was the fastest-growing company in American history. It's quite impossible to reconcile that level of growth with the level of contraction that came just two years later, when video game sales took a nose-dive to mid-70s levels. This level of boom and bust is intolerable to modern corporations. But it's the way the video game market was.

This is part of the magic of the video game industry. Its product is imaginative in every single way. A developer attempts to make a world you can interact with, in which the controls make sense and the things you do onscreen are both pleasurable and meaningful. The appeal of video games is incredibly strong, and it doesn't require physical strength or massive intelligence to play most of the fun games available. Because such imagination is required to make games that click with the public, no company can hope to ride on past successes and stop producing challenging, exciting, innovative new games. Even if the market has collapsed, even if the very name of video games has been tainted by poor-quality releases, the developers still remain interested in their craft, and they will recombine into new companies and organizations to make new and better games. We have learned that video games are not a fad and will never die.
_________________________________________________________________________________

Aside: My favorite company

It may be gray now, but the original was red, and I prefer it!
I feel confident saying that I know the most magical company of them all, whose developers have produced more smiles, laughter, and hours of entertainment in the last 27 years than Disney and 20th Century Fox combined. A company whose mascots have remained globally popular for decades and are known in nearly every country on Earth. A company that fought tremendous adversity to achieve every milestone of its journey, and has clawed its way back to the forefront of its field even after falling into a prolonged, depressing second place for years. The famous names at this company have consistently produced masterful games, critically hailed by adults and gobbled up by children alike, for the past 30 years. Part of me still feels like there must be some kind of holy collaboration that makes 90% of their games shine eternally.

Mario TV shows...
...and Mario movies too.
Of course I am talking about Nintendo. Of all the companies in the entire world, not just in video games, I can think of none that has had such a steady stream of high-quality creative output. I could wax lyrical about the effects that Nintendo had on my youth until my face turned blue. If I wanted to chronicle all the Nintendo games that have touched my life in some way, I would need to write a novel. You might say that this is just fanspeak, that I am a video game geek who is susceptible to this level of adoration. But the rest of the public might agree. As of May 2012, Nintendo has sold over 676 million video game systems worldwide, and over 3.6 billion video games, and that counts only games developed in-house, not the billions of others made for Nintendo systems by other developers. What other video game company has successfully turned its mascots into the stars of television shows (admittedly lackluster ones in the late 80s and early 90s) and even a Hollywood movie (Super Mario Brothers: admittedly terrible)? I can think of none. Mario doesn't have quite the star power now that he did at his earlier peak, but he's probably nearly as globally recognizable as Ronald McDonald. Even 10 years ago I would have said that Pac-Man was more iconic, but he is fading from the minds of today's younger gamers. Mario has faded very little. Each new release of a flagship Mario game is met with high praise and commercial success; even rehashes of old Mario games for a new system somehow retain their luster.

The Legend of Zelda (1986) for NES.
For critical appeal, The Legend of Zelda might do even better. I dare say that when it comes to making an impact on people who couldn't care less about playing video games, Zelda has all the others beaten with its ridiculously catchy Hyrule overworld theme. Countless dads and grandmas and babysitters have caught an earful of their child's video game while doing something in the next room, and even my dad (who doesn't like Nintendo games) will say that Zelda has a pretty darn good tune.

Where did Nintendo begin? The story stretches back decades before electronics existed, and it is a story for another article. The topic of this post is Nintendo's legendary 1985 effort to launch its Famicom (Family Computer) in the United States. It chose to put its own company name proudly on the front and attach a generic "Entertainment System" to the back, resulting in the NES (Nintendo Entertainment System, if you're an alien).

_________________________________________________________________________________

Looking back from 1980, you'd never believe Japan lost WWII.

There is a long history of famous Japanese debut efforts in the United States, perhaps because starting prices have usually been so low that even a half-decent effort will sell. Expectations were also low, as stereotypes inflamed by WWII took decades to subside. But when Honda first started selling motorcycles stateside in the 1950s, they were soon competitive. When AT&T started licensing its Bell Labs-developed transistor for outside use in 1952, with a transistor radio following from Texas Instruments in 1954, the Japanese firms Sanyo and Sony were less than a year behind the Americans, and they had a major market presence within a decade. Starting with Hitachi and Sanyo in the 1950s, and later bolstered by Sony, JVC, and Mitsubishi, the Japanese captured the bulk of the American television market by the 1970s. GE gave up on the market in 1986 (Magnavox and RCA at about the same time), and Zenith, the last American maker of televisions (formerly the last American maker of transistor radios as well), went bankrupt in 1999. The market for personal radios was ultimately cornered by Sony with the Walkman, and that company was also at the vanguard of a successful Japanese assault on American stereo, speaker, and amplifier companies. Canon and other camera makers sold high-quality cameras stateside as early as the 1960s, and just recently (2012) drove the legendary US firm Eastman Kodak into bankruptcy as well. Basically every appliance in the American kitchen or living room, from microwave ovens to alarm clocks, was pioneered by American companies, but US production was overwhelmingly usurped by Japanese makers of the same products. It usually took a long time, and often the first efforts were laughed at, but eventually the Japanese learned how to make the right product.

Japan's original post-WWII strength was its cheap labor, a condition which let it undercut the rest of the world in the cost of certain manufacturing (like transistor radios) until about 1965, when it too began outsourcing, first to Hong Kong and ultimately to Taiwan and mainland China. At about this time, a remarkable thing started to happen: Japan rapidly increased the sophistication of its engineering, learned every lesson that America had to teach about business and organization, and then started to usurp the former master, even though it no longer produced anything at the lowest cost. It would seem as though the companies and the nation were engaged in a continuous, harmonious push to make the badge "Made in Japan" itself a considerable asset.

Some manufactured goods, like automobiles, are still produced in Japan, but for cheap products most of the labor behind Japanese-branded goods (as with most American-branded goods) has been outsourced to China or Taiwan. In some cases Korea has taken over from Japan as the new up-and-coming manufacturing power, although it too outsources much of the labor to China. China has regulations that are basically unenforced, and its wages undercut the entire world, with volume mattering much more than quality.

Unless you do a lot of research, you can't quite be sure how good the Chinese factory that makes a given product is. The badge "Made in China" appears on everything from test equipment by manufacturers like Tektronix, to countless OE-grade automotive parts, to utter crap at flea markets and 90% of everything on the shelves at Dollar Tree. I can't say that Chinese stuff is all bad, because clearly they can make good stuff, but with production over there so secretive, it's hard to verify whether an outsourcing manufacturer will actually give its chosen Chinese factory what it needs to compete, or will simply place an order and wait for the bidders to fight over it to the tune of a few cents saved. That means, to a grumpy fart like me, that when you see "Made in Japan" or "Made in the USA", you can be reasonably sure it's good, since if it weren't, the company surely wouldn't still be in business with the Chinese competition around.

Occasionally the Japanese tried too soon. Den Fujita, of Fujita, Inc., one of the first companies manufacturing transistor radios in Japan, soon had to learn that the American way of business meant that a contract was binding and that consistent quality was necessary if Japanese wares were ever to be more than gas station giveaways. His account is the subject of an interesting article in the Rutherford Journal.

As the earliest postwar Japanese entrepreneurs would themselves concede, Japan's pre-war methods of assembly and quality control were slightly updated versions of medieval production lines. The degree to which America outpaced Japan in production during WWII is damning evidence that Japan was many years behind in almost every category. American servicemen learned this firsthand from collecting Japanese small arms of WWII, whose production quality varied from acceptable to absolute junk depending on who made them. German manufactured goods of WWII were different: production quality could be poor, but German designs were often cutting-edge, even in the final days of the war. Japan produced few modern designs as the war dragged on, and most of those were developed with German support.

When Toyota began selling cars in the US in 1957, its offering (the first-generation Crown) was far too small, underpowered, and flimsy to cope with American highways, and the thing quickly became a laughingstock. The negative response to this attempt set the Japanese car export business back years. The car market was a tough nut to crack, but by the mid-70s the Japanese had a largely competitive set of offerings, anchored by the sensible Honda Civic and later the Accord. Although Japan became a semiconductor player by the 1970s, it did not experience the flourishing of independent PC builders that the United States did in that decade, and when a popular offering did appear in the early 80s (the MSX), it was a Microsoft design with an American Zilog processor. With the sole exception of the odd Sony laptop, no Japanese brand has made itself strongly felt in the American PC market. Japan also had a tough time exporting its taste in television shows and film to the United States; it would take massive hits like Pokemon in the 1990s to bring anime out of the fringes and into the American mainstream. Starting in the immediate aftermath of WWII, American soft power (cultural influence) has had a profound effect on Japan that continues to this day, while the Japanese have usually not attempted mainstream cultural exports to the United States.

Some manufactured goods markets have seen almost no Japanese penetration whatsoever: firearms, full-size pickup trucks, medical equipment, and mainframe computers. The first two (guns and trucks) are the haven of a certain type of American patriot, so it makes sense that the Japanese wouldn't be highly welcome there (Tundra and Titan sales are never class-leading). Still, I always found it strange that the United States was able to stay ahead of Japan in semiconductor technology, processor development, and concentration of computer science talent; Japan never fielded a full-scale competitor to Wintel for the home market or IBM for the professional and mainframe markets. As for medical technology, perhaps this is due to some kind of import regulation; I am unqualified to guess. Maybe in the year 2012 they've done all the conquering they ever plan to.

One horizon remained uncharted in 1985: the American video game market.
_________________________________________________________________________________

1985: Dawn of a New Era

Following the grand tradition of Japanese firms entering, under unknown names, markets that Americans had invented and mostly consumed, Nintendo decided to start selling its Famicom (Family Computer) in the USA. The dinky original name would have to go. In its place they gave it the rather generic moniker of Nintendo Entertainment System, using the same nomenclature as their predecessors Fairchild and Atari. But the NES was something truly special, definitely more so than any system that came before it. And amazingly, the traditional virtues of Japanese manufactured goods were NOT what made it special. It didn't sell well because it was the most technically advanced product, or the cheapest, or the most reliable. Come to think of it, the NES is a fairly unreliable system that requires special tricks to keep going, and because it was an 8-bit machine built around yet another derivative of the MOS 6502 that had powered video game systems for years, it didn't seem particularly advanced. At its $149 launch price (with a game and two controllers), it was actually slightly more expensive than the Atari 7800, and a whopping $100 more than the stale Atari 2600. So what are the virtues of this machine?

NES in original 1985 trim.

The paradigm shift here is that a Japanese firm had finally conquered a market without undercutting its competitors on price or beating them on quality. They did not cut costs to the bone to offer the greatest value. They did not simply focus on better construction. They didn't copy existing work and improve it. They were genuinely innovating, and they were selling a system on its desirability, not on its sensibility. They were poised to develop the best video games ever made. This machine was not made to fill a gap in the market. It was made to revitalize the market, and expand it to heights it had never known before.

The key strengths of the Nintendo Entertainment System are so well-known that it seems almost silly to rehash them. But for the benefit of younger readers, I have summarized them from my outlook.


1. Exclusive high-quality titles. Nintendo knocked it out of the park immediately with Super Mario Brothers in 1985. They followed it with Metroid, The Legend of Zelda, Super Mario Brothers 2, Super Mario Brothers 3, and a slew of popular sports games. Their subsidiary HAL Laboratory also brought the character Kirby to this system. Most of the content for the NES was exclusive, because Nintendo knew it had a winner and demanded that developers develop solely for Nintendo, or not at all. That degree of exclusivity ended all ambiguity about which video game system was worth buying. The classics for the NES are why it was popular and why it remains famous today. It was the launchpad for Mega Man, Castlevania, Contra, Metal Gear, Final Fantasy, Dragon Quest, and countless other series.

2.   Restoring confidence in the market. Perhaps the biggest change the customer noticed was that Nintendo gave you a basic guarantee on each licensed game you bought. This came in the form of the "Seal of Quality", which meant the game had been tested by Nintendo and was guaranteed to work and play somewhat well. It wasn't a guarantee that you'd love the game, but at least it always played and you'd get a few minutes of enjoyment. In truth, the Seal just meant that the game was the product of developers who wanted to play ball on Nintendo's terms and designed a game that would work with the NES's lockout chip (called 10NES). But this tactic was amazingly effective at getting buyers to trust video games again after the bad taste left by the flurry of third-party titles for the 2600. Since no such "guarantee" had ever been offered before, the Nintendo Seal would become famous in its own right as a symbol of how the new generation of games was superior to the old.

3.   Nothing but the best from the start. From the beginning, each NES had a pack-in game, which was a novel feature at the time. It was also a fine choice that it was Super Mario Brothers, a superbly fun and approachable game for the entire market. This feature let new console owners immediately gloat over how fun their game system was. It took the guesswork out of choosing games. Everybody knew that if you bought the system, you got a fun game right off the bat. It was a fantastic idea. Launching a game system with at least one "killer app" from the start was increasingly viewed as essential to the success of a new system, since buyers would wait until an anticipated game was available to buy the system in the first place. 
4.   The best marketing ever. Nintendo marketed its mascots ruthlessly, taking advantage of television commercials, radio, cereal boxes, magazines, billboards, TV shows, and an endless onslaught of merchandise. It was impossible to avoid hearing about Nintendo in the late 80s.

5.   A carefully cultivated image. Nintendo pulled an old Atari trick in positioning the NES as a family game system. The advantage over a PC was that it involved the whole family in good, clean fun, which is more than could be said for solitary PC gameplay. No PC had anything like the NES's stock of exclusive, high-quality video game titles. Moreover, unlike Atari, Nintendo could actually back up its image with fact: the lock-out feature made it very hard to produce bootleg games, so the fly-by-night adult game makers and blatant product-placement games were barred from the system, and never even made it onto shelves to sully the system's reputation.

6.   Better graphics and sound. It might have had a fairly old processor, but the NES was more advanced than anything that had previously used it, and it blew the Atari 7800 out of the water. There were more colors, significantly more detailed graphics, more memory in the machine, and more ROM on the cartridges for longer and more involving games. It would be easily eclipsed by its 16-bit successors, but for the first time reasonably accurate sound and pictures were possible. We were not quite to the point of the spoken voice, but background music became genuinely likable, not just a series of burps and beeps like it was on the 2600. Some basic tunes from the NES days still live with us today, like the Zelda and Mario theme songs.

7.   A perfect game controller. It has never really been bettered and it set the template for the future. A d-pad on the left, start and select buttons in the middle, and action buttons on the right. All controllers since then have followed a somewhat similar layout, at least to the extent of directional control on the left and action on the right. The d-pad was a great advance over earlier joysticks.

8.   A respectable appearance. The one benefit of the NES's front-loading slot was that it made the console look like a VCR, which at the time was a cool piece of technology. Compared to the top-loading Atari systems, the front-loading NES seemed more modern. With its boxy exterior and muted colors, it looked more like a serious addition to the living room than a full-on kid's toy. This was an improvement over the Famicom, which was sold in Japan in exactly the kind of Fisher-Price hues that would have ruined its appeal to adults in America.

9.   Battery-backed saves. Starting with The Legend of Zelda in 1986, it was possible to save a file and load it again for future play, greatly enhancing the appeal of longer, more ambitious games. The password save system was in use on many NES titles as well, but it eventually faded out because of how cumbersome it was; saving a file should be easy. The advent of saved files was an absolute revolution in gaming. Of course it was possible to save games on PCs earlier than this, especially for long "questing" games, which could be saved to disk. But the fact that Nintendo cartridges offered this feature meant that an entirely new type of home video game could be developed: the kind that took multiple days of gameplay to complete.

10.   Sheer good luck? Nintendo took advantage of a lot of recently-formed development firms that were hurting from the Crash and needed to get games published, and so would accept any terms available, including exclusive deals. Nintendo got Tetris as an exclusive even though it was not their own creation, and this was typical. A great many still-popular series were started on the NES over 20 years ago. It's just an incredible outpouring of creativity.

11.  Winning a few key cases. One of the few legal thorns in Nintendo's side was Tengen (an Atari subsidiary), which succeeded in cracking the 10NES lockout chip without making licensed games. Had this been done purely by reverse-engineering, it would have been fully legal under the Activision precedent (you can't stifle third-party developers). However, since Tengen had requested and viewed code for the 10NES chip from the US Copyright Office, they had no way to prove that it was honest reverse-engineering. The courts ruled that their workaround was copyright infringement and prohibited it. Nintendo also won the case against Tengen over Tengen's version of Tetris. Even though it was not at all based on the source code of Nintendo's version, and was built from scratch, it was disallowed because Nintendo held the exclusive console rights to a game by that name and with that type of gameplay. All of this is rather sad, because Tengen's titles for the NES, particularly their highly polished Tetris, are considered classics.

12.   Better retention of its best people. Nintendo was not a labor camp like Atari under Kassar, but even if it had been, the Japanese are seemingly not as inclined to shop around for a new job as Americans. Hiroshi Yamauchi, Nintendo's president from 1949 to 2002, pushed his engineers and developers to their limits but rewarded them appropriately. He spent almost his whole adult life, from age 22, running the company, having taken over upon the death of his grandfather, the previous president. He also displayed a remarkable amount of intuition as to what approach was best in game design, despite having no background in it. Whatever he picked was usually a winner. In part because of this nurturing management, Nintendo has retained its heroically prolific developer Shigeru Miyamoto to this day. The Miyamoto-Yamauchi relationship was crucial to the success of Nintendo.

13.   Some good peripherals. I'm looking at you, Light Gun. How many joyful hours I spent playing Duck Hunt as a youth, AMAZED at how well a gadget designed before I was even born could work.
14.   All of the expected technology. It did everything relatively well, as you'd expect from a 1980s Japanese product. It had a pause feature, which had been introduced three years earlier on the Atari 5200. Apart from the cartridge-loading system, it had no quirks that would put off a neutral observer. It could play on color or B/W televisions. It had ordinary power and reset buttons. It had RF connectivity as well as composite A/V connectivity. The power supply was a simple one-piece AC adapter. It was easy to open the case to make repairs if one so wished. Many of the historical video game systems we have already seen were plagued by some silly oversight that turned into a glaring fault. The NES had drawbacks, but none that drove anyone away to its competitors; by and large, the people who bought other systems either couldn't afford an NES or were dead-set in their buying habits already.


When the NES hit the market in 1985, most experts expected a frosty reception. In spring 1985, despite the fact that the Famicom had sold 2.5 million units in Japan, Electronic Games magazine noted that the "video game market in North America has virtually disappeared" and suggested that Nintendo was erring in introducing the NES to the US market at that time. The first limited release of the system came on October 18th, 1985. Full nationwide sales would not start until February 1986.

It's staggering to think how far off the mark they were! By the time NES production ceased in 1995, nearly 62 million systems had been made. Nintendo's record of 62 million consoles sold would stand as the world's best for years, until overtaken by the PlayStation in 1999. By comparison, the Atari 2600 had sold less than one-fourth of that during its period of dominance up to 1984, and it actually sold a surprising share of its units after the mid-80s re-release: the total came to about 30 million 2600s in 15 years, roughly half the number of NES systems sold in 10 years stateside (12 years in Japan). The fact that the old Atari 2600 was doing better business after Nintendo's arrival than before it suggests that Nintendo legitimized the video game once again. After the NES, there would be no more crashes. Video games would continue to sell well indefinitely.

NES-101, the updated version
The system was still selling well enough after eight years that Nintendo pulled an old Atari trick and re-released it in October 1993, two years after they had already started selling its replacement, the Super Nintendo. To keep the old system viable, the price was dropped to $49.99, exactly the same as the re-released 2600 of six years earlier. They gave it an updated controller (referred to as the "dog bone" by fans) which embraced the rounded style of the SNES controller. The system itself had a more reliable top-loading layout, also similar to the replacement SNES. Because the NES was quite old and there was little demand for pirated games, Nintendo omitted the lockout chip, so unlicensed games that had been sold earlier could be played on this version. Happy days: now any Tengen game is playable. More critically, this new system wasn't as likely to freeze or fail to load a game due to poor pin connections failing to verify the game's authenticity. However, these systems are not really in high demand. The internal code for this system is NES-101. For those who are interested, the original NES was coded NES-001.

The NES-101 is the more practical system, but it lacks the bulk, presence, and retro-chic appeal of the original NES-001; it's a bit bland to behold. But for the lower-middle-class families that had never considered buying video game systems before, the $50 NES of 1993 was doubtless a gateway for millions of video game players.

Why isn't the redesign considered "cool"? It comes from a time when the NES was fading from memory. The 16-bit generation, anchored by the Super Nintendo and Sega Genesis, made the old NES seem boring and uncomplicated; the "retro" appeal didn't gain momentum until years later, and the original 001 design was inevitably preferred because it was truer to the heart of the 80s. The 101 did the same thing better, but was from the wrong decade. For those who shun emulators and just want a reliable way to play old NES games without buying into the hype of the original classic design, the re-released 101 version remains appealing, and it's definitely cheaper than the original VCR-style model. Likewise for the dog-bone controllers: they are more ergonomic than the sharp-edged originals, and have better buttons, but they too are low on style and retro appeal. I'd like to say that style doesn't matter, but the original NES is the real deal, and it's hard to argue with its aesthetic appeal. Personally I'd prefer to have both, but since I don't, I'm glad the one I have is the original.


Of course, there are some drawbacks to the NES, but over time everyone's memories of the negativity have faded, and we now view these flaws as charming personality quirks.
  1. Stupid way of inserting cartridges. The push-in-and-push-down method puts undue stress on the pins connecting the game to the system. Over time the pins lose their flush fit, and the connection fails: your game crashes or won't even start in the first place. We have reclaimed this as a "cool thing" because of the art of getting it to work correctly. Always make sure you blow on the cartridge frequently! Even if the dust is invisible, we still pretend it's there. The better workaround is to buy a new connector to replace these pins; it swaps in easily with just one screwdriver, and the part is widely available and inexpensive. The rest of the system is pretty reliable.
  2. Stupidly big cartridges. When you pull one of these out today, compared to a DS or Game Boy Advance game or a Wii disc, you might be forgiven for thinking you've accidentally stolen one of the tablets of the Ten Commandments. It's huge, gray, and heavy. Even for the day, these were monsters. The chunkiness doesn't make them particularly tough, either; you still need to be careful about temperature and moisture with cartridge-based games.
  3. The Flashing Blue (or Gray or White or Green) Screen of Death. Nintendo's effective solution for excluding unlicensed developers was the 10NES lock-out chip in the console, which would refuse to run a game unless it had constant contact with all the pins of the cartridge, letting it communicate with the matching chip on the game board that proved the game was legitimate. If any pins were bent or broken and contact was lost, the console assumed the game was unlicensed and reset itself about once per second, which made the screen flash at that rate (see the sketch after this list). The blue screen was by far the most common, but other colors were possible. (Usually solid, not flashing. If you see a flashing green screen, take a video, because it's rare.)
  4. In Japan, the Famicom's controllers were hardwired to the machine. Ouch! Didn't they learn anything from Fairchild? But happily, for the US release, we got removable controllers.
  5. Silly peripherals. Some have credited the R.O.B. (Robotic Operating Buddy) as a "Trojan Horse" that showed an entirely new facet of video games and made the system appealing, even though Nintendo never intended this robotic peripheral to replace conventional controls. I have never used one, never seen one, and honestly I don't know why anyone was interested. It sounds and looks stupid. The Power Glove looks cooler, but holding your arm out at a 90 degree angle for hours sounds like an even worse idea. This is just my opinion...
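As promised in item 3, here is a toy sketch of how a lock-and-key scheme like the 10NES behaves. To be clear: this is not Nintendo's actual algorithm. I've substituted a seeded pseudo-random stream just to illustrate the agree-or-reset handshake, and the seeds and function names are invented.

```python
import random

def chip_stream(seed: int):
    """Stand-in for a lockout chip: a deterministic stream from a shared secret."""
    rng = random.Random(seed)
    while True:
        yield rng.getrandbits(4)

def console_boot(lock_seed: int, key_seed: int, checks: int = 8) -> str:
    """Console 'lock' chip compares its stream against the cartridge 'key' chip."""
    lock, key = chip_stream(lock_seed), chip_stream(key_seed)
    for _ in range(checks):
        if next(lock) != next(key):
            # Disagreement (unlicensed cart, or just a dirty/bent pin):
            # the console resets about once per second, flashing the screen.
            return "RESET: flashing screen of death"
    return "game runs"

print(console_boot(lock_seed=1985, key_seed=1985))  # matching chips -> game runs
print(console_boot(lock_seed=1985, key_seed=101))   # bad handshake -> reset loop
```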
_________________________________________________________________________________

The others

Nintendo did not operate in a vacuum, but very nearly. Atari was of course still present in the video game market, and in the early days of the NES the company hoped it could leapfrog Nintendo with its 7800 and get back in the game. That proved hopelessly optimistic. The Atari name held little appeal for modern gamers in the late 1980s. By 1991, when Atari was wrapping up the 7800 project, they held perhaps 12% of the North American game market, their home territory, while Nintendo held 80%. In Japan, the Atari 7800 was almost unknown, and Nintendo's margin of victory was even greater.

Atari "Flashback" from 2004.
I have never played a 7800, and not much is written about its unique content. It wasn't heavily mourned; the best birthday present it ever got was the 2004 "Atari Flashback", a re-release with only five 7800 games and controllers identical to the 7800's. Since that system didn't use the same internal CPU or graphics hardware, it was said not to "feel" like the original 7800, which dimmed its appeal. We've already been over the 2600 hits, and most of the 7800's main titles were just rehashes, so there's little more to talk about here. Atari managed a surprising degree of success with the 7800 considering how uninspired it was: they sold almost 4 million, and that was purely on the American market. I don't know if they even bothered to sell it anywhere else.

Master System: hefty and unique, but poor selection of games.
If we consider the heretofore unmentioned European market, a more serious contender to Nintendo in the mid-80s came from Sega. I've never been a fan of Sega systems or games, so it's hard for me to understand their appeal, particularly that of the Sega Master System. Interestingly, Sega sold a sizable majority of its Master Systems in Europe. While in the early days America's gaming culture influenced Japan, and in the latter days Japan's gaming culture influenced America, Europe's gaming culture is much harder to follow. Europeans picked the Master System for some reason I don't understand, and domestic European studios developed for it with much more eagerness than they showed for Nintendo. The Master System sold over 10 million units, of which nearly 7 million were sold in Europe. In the US it sold under 2 million, and in Japan slightly over 1 million. Not such great numbers for a system that was sold from 1985 to 1996, during a prodigious video game boom. It's important to note that Sega abandoned its home Japanese market in 1989 and the American market in 1992; the final four years of production were solely for Western Europe and Canada.

Master System II: So generic I'm falling aslee....zzzzz
I personally think the design of the original Master System was quite cool, with an interesting shape and a geeky, overly complicated graphic on the red front panel. It also looked and felt fairly high-quality, like an old Zenith VCR. However, the 1989 successor, the Master System II, designed as a cheaper, cut-down version, looked like a hideous Soviet knock-off made from melted-down Knex, and it stands out in my mind as the ugliest, most barren game console of all time. If McDonald's decided to design their own game system and hand it out free with the purchase of 5 Happy Meals, I can only imagine it would look like this. It makes the NES-101 look inspired in comparison.

Sega was definitely looking over Nintendo's shoulder at the Famicom when they selected a D-pad (well, more like a tiny joystick that was almost a D-pad) and a two-button layout for the Master System gamepad. Unfortunately, they didn't copy it thoroughly enough. Amusingly, the controller cord comes out of the right side of the controller, which is exactly where your right hand goes. I have to stand back in amazement that some Sega executive was idiotic enough to pass this design on to final production. Anyone who played it even for a short period of time would see that this design was complete garbage. Sega repeated the same dimwitted failure years later with the Dreamcast, whose controller cord comes out of the back, leaving you much less usable cord length. How can you make such a huge mistake twice? What were they smoking at Sega?

However, this system is fondly remembered by some. We should point out that it is technically superior to the NES, its Zilog Z80 beating out the older MOS 6502 design used in the NES. Still, potentially superior graphics were no substitute for third-party support, and Sega was entering a market where Nintendo had an overwhelming supply of hits from its own studios, plus exclusive rights to many third-party games. Sega had an uphill battle, and with a development team much smaller than Nintendo's, they could only churn out a couple of high-quality first-party titles, which wasn't enough to carry the system to success.

I have never personally owned the SMS and haven't logged much gameplay, so I can't comment in much detail. The best-selling game (and, by consensus, the highest-quality game) was Alex Kidd in Miracle World. In the pre-Sonic days, this was Sega's best platforming effort and nearly a competitor to Super Mario Bros. There is something to be said for variety, and if you ever just wished that folks would shut up about Mario and Mike Tyson's Punch-Out, the Master System was your choice for something a bit less antediluvian than Atari.

But although I will bash Sega, I have no doubt that they were utterly committed to taking on Nintendo on equal terms, which was a courageous objective. Their first effort was nowhere near good enough. But when 1989 rolled around, just halfway through the life of the NES and Master System, they would strike back with the wildly successful Genesis, their biggest-seller of all time. The "16-bit generation" was going to witness Nintendo remain the industry leader (helped by the 1989 release of the legendary Game Boy) but never with the degree of total control that it enjoyed from 1985-1989. Stay tuned to see it unfold.

Sunday, May 20, 2012

Joke of the Day #9


Meg: Did you hear that Walmart has started stocking logic gates in many of their stores?
Greg: Interesting, do they use the old store brand?
Meg: Nope, it’s a new brand called Great ALU.

Friday, May 18, 2012

Joke of the Day #8


Dan: Surely you empathize with the way our business partner is treating us after the merger.
Stan: Totally. A-ffili-ate sucks.

In the works

A preview of upcoming attractions (order tentative and subject to change)

1. CVCC technology by Honda
2. VTEC technology by Honda
3. Video game history part 3 (1985-1991)
4. How to buy a cheap car on craigslist (from a successful amateur)
5. Hall of Fame #3: Nintendo 64
6. Hall of Fame #4: Chevrolet Corvette C4
7. Hall of Fame #5: Old Lakewood box fans

This post will be removed when all of the above have been published.

Wednesday, May 16, 2012

Joke of the Day #7


Dan: Those maggots sure do attack that carcass quickly and in great numbers.
Stan: Oh yeah. Once a body hits the floor, it’s in the grub-lic domain.

Monday, May 14, 2012

Joke of the Day #6


What snack brand has a 50-50 chance of being empty?
Little Maybe’s

Condensed History of Video Games: The Rise and Fall of Atari (1975-1984)

This is part of a series. Please also see:
Condensed History of Video Games: From Tic-Tac-Toe to Pong (1952-1975)
Condensed History of Video Games: Nintendo Seal of Quality (1984-1989)

We ended our discussion of early video games just as markets for both home video game machines and arcade uprights had appeared. However, while the period between 1972 and 1977 witnessed a flourishing of home Pong machines and arcade video games, the market for home machines was necessarily limited by the appeal of the single home game available: Pong. It hit homes in 1975 and was an immediate success, but home video game innovation was caught up in the craze over this one game, and stagnated for a few years.

The missing link was a video game system that could easily swap game programs stored in some tangible form. The home Pong machines of the mid-70s were not capable of this; they played the same game every time. As addictive as Pong was, it could not stay popular indefinitely. By 1977, the public was thoroughly bored of Pong and all of the Pong clones.

Magnavox Odyssey with original packaging
Ralph Baer's Magnavox Odyssey was capable of switching onboard games by inserting different PCBs into the console. It is important to note that these did not constitute unique games; all games were pre-programmed into the logic of the machine, and the only effect of the PCB was to connect the appropriate jumpers linking the signal generators and logic for each game. No programming was done on these game "cartridges". A further weakness of the Odyssey was that it displayed only in black-and-white. In order to get "color graphics", the Odyssey came bundled with translucent colored sheets which could be taped onto the television screen. It sounds as clumsy as it was, and with only two TV sizes supported, it likely didn't add much weight to the cause of the old Odyssey. In the end, when the first home Pong machines hit the market in 1975, they forced the unloved Odyssey off the market.
Odyssey "active card" that was to be inserted into the machine

Baer had some very innovative ideas, such as "active cartridges", which contained additional logic and modules that could implement features such as sound and variable position and speed of onscreen objects. This thinking became archaic as soon as someone could release a complete video game system that did not require updates from the games. To build a successful home video game business, a company needed a complete system that could be sold at a loss, with the games themselves being simple programs stored in memory and read by the game system; each game would then be cheap to produce, and the real money would be made on game sales. The crucial step was the simultaneous invention of the microprocessor-based video game system and the ROM game cartridge.

Baer's idea of developing a machine's potential through hardware updates would be revisited later. Anyone remember the Expansion Pak for the N64? In today's world of constant development, it's often not good enough to put a system on the market and let it ride unchanged for six years. Updates are expected. In the current seventh generation of game consoles, the PlayStation 3 and Xbox 360 have modular designs featuring variable-sized HDDs. Both systems have undergone major revisions halfway through their lives and still sell well in 2012. Nintendo's Wii pursues a different course: it is considerably cheaper than its competitors, but it comes in only one form and is not designed to be upgraded or modified. Still, Nintendo does offer a bewildering array of peripherals. In this generation, based on sales, it seems Nintendo was right and Sony and Microsoft were wrong, since the Wii has significantly outsold both of its rivals, and at times in the past few years has outsold both of them combined, particularly outside the United States. As we shall see, the Odyssey approach of constant development has been adopted most heartily by Microsoft, while the Atari approach of making the best system possible and keeping it on the market until a full replacement is available is what Nintendo does. Happily, in the modern world there is room enough in the market for both approaches.

Fairchild F8 microprocessor
But back to the Seventies. The first company to nearly get the recipe for a modern console right was Fairchild Semiconductor (part of Fairchild Camera and Instrument), the legendary semiconductor firm formed by the "Traitorous Eight" who left the draconian Shockley Semiconductor Laboratory in 1957. Fairchild developed its own F8 microprocessor in 1975, a few years after Intel invented the market with the 4004. The F8 was (surprise, surprise) an 8-bit processor. It had a cost advantage in that it did away with a dedicated address bus, which made it cheaper, and it included 64 bytes of on-chip scratchpad memory, which meant that very small programs (such as simple games) did not need outside RAM. It was a great microprocessor for a video game system, and that's precisely what it was soon used for.

Fairchild VES/Channel F
The Fairchild VES (Video Entertainment System) was released in August 1976. It will be remembered as the first video game system to use a microprocessor, the first to use ROM cartridges, and the first with AI sophisticated enough for players to compete against the computer rather than against a human opponent. Thanks in part to the 250,000 sales of this system, the Fairchild F8 was the world's best-selling microprocessor from 1975 to 1977. When Atari launched its VCS (Video Computer System) in 1977, Fairchild immediately changed the name of its system from VES to Channel F. Fairchild knew that Atari was the better-known video game name and did not want to be unfairly perceived as a rip-off.

The system has more in common with the earlier Pong consoles than with the later Atari 2600, because of its unambitious games. Fittingly, the Fairchild Pong game was extremely good and remains one of the most recommended games for the system. Contemporary reviews found most of the games mediocre, but the educational titles were said to be very good and effective for children.

Nevertheless, it is surprising that the system only lasted a single generation, given that it was technically first. It did not enjoy much lasting success; few modern game players are familiar with the Fairchild VES or the later-renamed Channel F. Why is this?
  1. Timing. Fairchild entered the market after the Pong craze, and it had difficulty gaining popularity with a public largely unconvinced that video games had anything more to offer. Some refer to this period as the North American Video Game Crash of 1977; it was neither the first nor the last time that informed journalists and reviewers claimed that video games were a fad that had passed and should be abandoned.
  2. Technical limitations. The Channel F had half the memory of the Atari 2600, and it could only display eight colors: white, black, and two shades each of red, green, and blue. Mixed colors like orange and purple were not available.
  3. Shoddy controllers. The controller is one of the most recurring gripes from later reviewers who have looked at this historical oddity. The controllers were upright joysticks with no base, so they had to be held constantly. The "cap" atop the stick was the only moving part; it could be twisted, pushed in various directions, or pressed inwards. The controller had no other buttons. Most heinously, the controllers were wired directly into the machine, so if one or both broke, the whole system had to be replaced. What a design oversight!
  4. Name recognition. Fairchild was a parts supplier well-known to manufacturers but not to the general public. Atari, although a much younger company, had the advantage of marketing its name directly to the public. It was largely the same advantage that later made the IBM Personal Computer so successful against its competitors: American buyers preferred familiar names when trying new products.
  5. Scale of commitment. Fairchild was happy to launch when there were no competitors, but it was unwilling to take the ruinous losses Atari was inflicting during the price war of 1979. Fairchild concluded that video games were a fad that would die out, and left the business rather than take losses from the game division.
Given that Atari was already planning to launch the Video Computer System with microprocessor-based logic and ROM cartridges, Fairchild's slight head start does not mean that Atari copied anything from the semiconductor company. The main historical impact of the Channel F was not to push other game makers in the same direction (Fairchild lacked the clout to be the standard-bearer of the game industry) but rather to force Atari to hurry up and release the 2600 as soon as possible.


Atari hit the market with its Video Computer System (known by 1982 as the 2600 or 2600 Video Computer System) in October 1977.  They had some long-term advantages over all their competitors, especially the Channel F.
  1. More onboard memory. Atari used a separate RAM/I/O chip (the MOS Technology 6532) with 128 bytes, double the 64 bytes of the Channel F's on-chip scratchpad. That is still minuscule, and sounds ludicrously small by today's standards, but it was just enough to work.
  2. A better processor. Atari used the MOS 6507, a cheaper, cut-down version of the 6502 that powered many of the first microcomputers of the 1970s. Although it was clocked slower than the F8 (1.19 MHz versus 1.79 MHz), the 6507 proved capable, through hardware tweaks and continuous development, of running increasingly ambitious video games. Very soon after launch, 2600 games had surpassed the complexity of the most advanced Fairchild titles.
  3. Bigger games. Channel F carts were only 2 kB, while Atari carts could hold 4 kB, meaning more could be done in a game. The 6507 could actually address 8 kB in total, but to save money the cartridge slot was wired for only 4 kB; later in the 2600's life, even larger games appeared, using bank-switching workarounds (see the sketch after this list).
  4. Better peripherals. Atari joysticks were not bolted to the machine. They could also be replaced with twist-top "paddle" controllers or special driving controllers for a few games. All of them worked very precisely for the given game; controllers were a strong point of the 2600. Atari even released a wireless controller in the early 1980s, almost 20 years before it became common in consoles.
  5. More experienced programmers. Atari's best advantage came in the form of its people, who came from diverse backgrounds. In the pre-2600 era, many creative developers flocked to Atari for the opportunity to make it big in game development. The Atari 2600 was one of the most complicated devices to program for, because of its quirks, yet its developers kept finding effective solutions whenever a more ambitious goal was set. Unfortunately, Atari's management style did not allow individual programmers to receive recognition for their work or compensation proportional to the massive sales they created. Many of them became frustrated, left, and never returned, most famously those who founded Activision in 1980 and pioneered the very idea of third-party game development.
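As promised above, here is a minimal C sketch of how 2600 bank switching worked, modeled on the commonly documented Atari "F8" scheme (the hotspot addresses are the standard ones for that scheme; treat this as an illustration, not emulator-grade code). An 8 kB ROM is split into two 4 kB banks, and merely reading a magic address swaps which bank the console sees through its single 4 kB cartridge window:

    #include <stdint.h>

    static uint8_t rom[8192]; /* full 8 kB cartridge image    */
    static int     bank = 0;  /* currently selected 4 kB bank */

    /* Every cartridge access goes through the 4 kB window. */
    uint8_t cart_read(uint16_t addr) {
        uint16_t offset = addr & 0x0FFF;  /* fold into the window    */
        if (offset == 0x0FF8) bank = 0;   /* hotspot: select bank 0  */
        if (offset == 0x0FF9) bank = 1;   /* hotspot: select bank 1  */
        return rom[bank * 4096 + offset]; /* read through the window */
    }

The elegance of the trick is that the console itself never changed: the extra logic lived on the cartridge, quietly watching the address lines, which is how 8 kB and larger games eventually ran on a machine wired for 4 kB.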
Why the name 2600? The internal part number for the Atari VCS was CX2600, and after the release of the 5200 in 1982 it became necessary to differentiate the two, so the older system was called the 2600. But why that part number? I really hope such an iconic game system didn't just get it randomly. There is no plausible hardware explanation: 2600 does not describe the processor speed in kHz, the memory space on game carts, the number of games released by then, the number of pixels displayed, or anything like that. I've heard it's an homage to the phone phreakers who used a 2600 Hz tone to hijack long-distance Bell telephone lines. It's amusing to note that Steve Jobs worked at Atari during the 70s before the Apple I, and Steve Wozniak (himself a phreaker under the alias Berkeley Blue) helped him there on Breakout. But this explanation is unlikely, since Atari was never part of the hacker scene, and its management was rather conservative by the standards of tech startups. The true reason may never be conclusively known.

Amusingly, the successors of the 2600 kept the same numeric nomenclature, their product names (Atari 5200 and Atari 7800) being mere multiples of 2600. That's almost a whole decade as the number one game company without spending a single dime, or an extraneous thought, on names for its flagship consoles. (But have times changed that much? PS1, PS2, PS3, PS4...)

En route to its dominance of the early 80s, Atari had needed outside support: Nolan Bushnell sold the company to Warner Communications in 1976 to fund the VCS, and after clashing with the new regime he left Atari in 1978 and founded Chuck E. Cheese. His successor as CEO, Ray Kassar, was appointed by Warner and universally described as authoritarian by the old guard of Atari. He was utterly oblivious to the demands of his programmers, whom he dismissed as "high-strung prima donnas." In 1979, four of the most experienced engineers at Atari requested commissions on their million-selling video games, and Kassar replied memorably: "You are no more important to the game than the guy on the assembly line who puts it together."

Important as the manufacturing line is, and with no intent to denigrate line workers, only a few dozen people in the world had the background and technical skill to write good Atari games. For the CEO to show that much ingratitude was a slap in the face of stunning magnitude, and practically an invitation to resign.

The new hires brought in by Warner had backgrounds in business; they lacked the engineering and artistic grounding that Atari's original staff had possessed. As more outsiders were forced in, the original spirit was slowly crushed, even as management grew more eager than ever to squeeze every dollar out of the company's video games. Some have credited Kassar with single-handedly ruining Atari by prompting massive turnover among both programmers and management.

The 2600 launched at $199, though under Atari's business model such a high price was probably not necessary even at the start. By 1979 Atari was periodically slashing prices as low as $99, sacrificing per-unit profit for greater market presence and more game sales. The initial launch was not auspicious, but sales grew each year until the roof caved in.

  • 1977: Atari launched the 2600. Sales by year's end were 250,000, which already beat out the Channel F but fell below expectations. There were nine launch titles, of which the most popular was Combat. 2600s from the early 1977 production run were made in Sunnyvale, CA, and can be identified as "Heavy Sixers" by their six-switch form factor and heavy RF shielding. Heavy Sixers remain the most prized by collectors for their solid construction and bulky materials; they are well-made systems and the most likely to have survived. Even in the late 1970s it was becoming rare to see modern electronic devices assembled in the US, and this is one of the very few examples of a mass-produced, American-built video game system.
  • 1978: Atari sold 550,000 units on a production run of 800,000; the nation was still slow to come out of the crash of '77. Nolan Bushnell left the company amid disagreements with the new management regime. Some more distinctive content was released, like the popular Breakout arcade port. Production was outsourced to Hong Kong, and this run became known as the "Light Sixers" for their lighter components, while keeping the six switches and woodgrain of the original. The outsourced models are likely to still be reliable, but they are far more common and much less valuable. (By the way, this is the model I own.) Fourteen games were released this year.
Not my favorite, but here it is: Adventure
  • 1979: Atari outlasted Fairchild, which ceased developing new titles for the Channel F and left the market altogether. In the 1979 Christmas season, Atari's strategy of attrition finally paid off: they captured virtually all of a rapidly growing market and sold over 1 million units for the year. The 2600 was rumored to be the most popular Christmas gift of the year. Twelve new titles were released, including the famous Adventure, which I've never much cared for. Adventure contains the first known example of a video game "Easter egg": a hidden room displaying the name of its developer, Warren Robinett. Such high-quality exclusive content was a major factor in the increased sales.
  • 1980: Atari released Space Invaders in January as an exclusive home title; it was the major factor in selling more than 2 million units for the year. The systems were made cheaper with a redesign that included only four switches; the four-switch models are even more common and less valuable than the 1978 redesign. A total of eleven new first-party titles were released this year, including Night Driver, a port of the pioneering first-person arcade racer. The first third-party games also hit the market, spearheaded by Activision (founded by former Atari employees who had chafed under Kassar), which released popular titles like Dragster, Boxing, Skiing, and Fishing Derby. Atari promptly sued Activision over the right to develop games for a system Activision did not design. Although third-party game makers did help to sully the reputation of the 2600 later on, historians never include Activision in that characterization. Activision games from 1980-1984 are considered some of the finest available for the Atari 2600 (including the legendary Pitfall) and have been sold in compilations on PC and modern video game consoles up to the present day.
Warlords for 2600: one of my favorites
  • 1981: More than 3 million units were sold for the year. Only six new Atari titles were released, but among them were some former arcade blockbusters: Asteroids, Missile Command, Warlords, and Video Pinball, all of them well-received. More third-party games arrived from Activision and Apollo, including Tennis and Freeway. Spirits were high on all fronts, and Atari was predicting major success in 1982.
  • 1982: More than 4 million units were sold for the year, bringing the total of units sold to over 10 million. Few who had watched the video game landscape five years earlier would have predicted that the 2600 would make it so big. Atari's home version of Pac-Man finally arrived in March; its poor graphics and bad controls were a far cry from the joy of the arcade original, and it was critically panned. It still sold in huge numbers, but fell short of Atari's expectations. In a landmark outcome, Atari's case against Activision was resolved in Activision's favor, and the floodgates were opened for any third-party manufacturer to release games for any system it wished. Quality games from outside makers like Activision were actually the bright point of the year; the flip side was a new-found prevalence of shoddy third-party game makers, emboldened by the failed suit. Atari's biggest self-inflicted wound was releasing the poorly-conceived, rushed E.T. video game. On the positive side, Atari had minimized costs to the lowest level possible. Systems sold for an average of $125 nationwide but cost just $40 in raw materials. Games cost $4.50 to $6 in materials and $1 to $2 in advertising, yet were priced at an average of $18.95, and even at that price the popular ones often sold out (see the arithmetic sketch after this list). E.T. was not one of these. Atari anticipated strong demand and made 4 million cartridges before sales began; only half a million were ultimately sold. One J.C. Penney store noted that in order to move the product, it discounted the game five times, from its hyped-up $49.95 launch price down to just $1. Atari lost $100 million on unsold cartridges and the marketing blitz, earning just $25 million in sales. Atari also launched the ill-fated 5200, meant to compete with the more expensive ColecoVision and Intellivision consoles. To that end, it incorporated many more buttons (a full 0-9 keypad) and an analog stick, mounted on a controller shaped like a TV remote control. The 5200's biggest problem was that it never got enough unique treatment: it could not even play 2600 carts without a later adapter, and since the 2600 was still selling better, the more expensive 5200 was often left with warmed-over versions of old games. It was novel for incorporating a pause button, but it was otherwise critically panned for its thin exclusive library and its clunky controller.
  • 1983: Although Atari published more than twenty titles this year, including some previously popular licensed properties, the company lost money throughout the year. They were forced to bury over a million unsold E.T. cartridges in New Mexico. The most disastrous third-party game on the market was the infamous Custer's Revenge (actually released in late 1982), whose shockingly offensive object is to march a naked, erect Custer across the screen to rape a tied-up Indian woman. Crude mature games like Custer's Revenge caused untold damage to Atari's reputation, as the company was seen to be allowing illicit content on the machine it had advertised for years as clean family fun. Idiotic product-placement titles like Chase the Chuck Wagon didn't help either. The size of the video game market peaked at $3.2 billion in early 1983, but by Q4 Atari announced losses for the year exceeding half a billion dollars, and Warner began looking for a way out. The crisis is known as the North American Video Game Crash of 1983, although its full effects were not felt until 1984. Also in 1983, Atari made one of the stupidest business decisions of all time: after nearly completing a deal with Nintendo for the rights to release their Famicom (later the Nintendo Entertainment System) outside Japan, Atari myopically scuttled the deal over irrelevant improprieties committed by Nintendo's other partner, Coleco. Nintendo did not attempt to renegotiate, as they saw Atari's fortunes fading fast. This was Atari's last chance to remain relevant, and they blew it. Nintendo would go it alone in 1985. As with transistor radios, motorcycles, and finally cars, the Japanese makers of video games would independently seek to conquer the American market.
  • 1984: Atari reeled as the home video game market collapsed around it. The 5200 was pulled from sale. Total industry revenue from home video games shriveled to about $100 million, down 97% from the earlier peak. Warner Communications broke up Atari, selling the computer and home video game divisions (Atari Corporation) to Jack Tramiel, founder of Commodore, while keeping the arcade division (Atari Games), which it later divested. Neither entity would ever control the video game industry again. By 1984, about 13 million Atari 2600s had been sold worldwide. The successor 7800 was announced in the spring of 1984, but with video game sales still moribund, Atari postponed the project indefinitely; it would not launch until 1986. By then, there was a new and overwhelming competitor with which to contend.
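To make those 1982 figures concrete, here is the arithmetic sketch promised above, a rough check in C using only the numbers quoted in the 1982 entry (it ignores tooling, distribution, and the retailer's cut):

    #include <stdio.h>

    int main(void) {
        /* Figures quoted in the 1982 entry above, in dollars. */
        double cart_price   = 18.95;        /* average retail price    */
        double cart_cost_lo = 4.50 + 1.00;  /* materials + advertising */
        double cart_cost_hi = 6.00 + 2.00;

        printf("per-cartridge gross margin: $%.2f to $%.2f\n",
               cart_price - cart_cost_hi, cart_price - cart_cost_lo);

        /* E.T.: 4 million made, only half a million sold. */
        double unsold = 4.0e6 - 0.5e6;
        printf("scrapped E.T. materials alone: $%.1fM to $%.1fM\n",
               unsold * 4.50 / 1e6, unsold * 6.00 / 1e6);
        return 0;
    }

Each cartridge returned roughly $11 to $13.45 on well under $8 of cost, which is why a hit game could subsidize deep console price cuts; conversely, 3.5 million unsold E.T. carts burned roughly $16-21 million in materials alone before the marketing blitz and retail write-downs pushed the total loss toward the reported $100 million.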
April 2014 update: This article was written in 2012 with the best knowledge I had at the time. It made reference to E.T. games being buried in New Mexico because that is what my research indicated; I included a small screenshot of the Alamogordo Daily News in the next post of this series (I did not think I had license to publish the full paper). The recent rediscovery of the trashed E.T. games in New Mexico is being covered by news agencies as though the burial were merely a legend. I am stunned by this framing, since the dumping was reported at the time it occurred and should have been regarded as historical fact rather than some kind of myth. Locals of Alamogordo with long memories know the games were dumped there, and Howard Scott Warshaw's denials should never have been taken as fact. Will future generations regard the existence of The Wizard, a movie with a shameless tie-in to Super Mario Brothers 3, as a myth?