Breakout: How Atari 8-Bit Computers Defined a Generation
About this ebook

Atari 8-bit computers were the first machines to truly bridge the divide between video game players and home computer enthusiasts. The Atari 400 and 800 signaled the start of a new era in computing. Breakout: How Atari 8-Bit Computers Defined a Generation is the first book to cover what made Atari’s groundbreaking computer line great:

Language: English
Release date: March 16, 2017
ISBN: 9781732355286
    Breakout - Jamie Lendino

    Introduction

    My childhood revolved around video games in general, but specifically around one computer: the Atari 800. It’s impossible to overstate Atari’s impact on personal computers, and especially on gaming. While Apple and a few other companies delivered personal computing for the first time, Atari was the first to bring arcade-like graphics and sound into the home.

    This book serves as a celebration of Atari 8-bit computers and what made them special, with a heavy emphasis on gaming. It’s a look back at how the computers, peripherals, and software worked, and why the games were so good. In a world of always-on social media and ad-filled websites, where the idea of “just develop your own game” seems hopelessly complex, the simplicity and sophistication of a tightly coded, to-the-metal Atari program is intoxicating—and with such a low pixel count, you didn’t need dedicated artists.

    Hopefully, reading this book will trigger some pleasant memories of your own. Perhaps you had an Atari computer yourself and miss it. Maybe you even still have or use an Atari computer today. In the 1980s, Atari computers never really got their due. Let’s see if we can finally fix that.

    Who Cares About Old Computers, Anyway?

    Today’s obsession with retro computing is unmistakable—and it’s safe to say our memories are a little rose-tinted. With fast multicore processors, phones, apps, cloud storage, social networks, and streaming media, it may seem surreal to wax lyrical about a particular oversized hunk of plastic, metal, and chips. By today’s standards, any computer from the late 1970s had a tiny amount of unreliable storage and low-resolution graphics, and most of them took forever to load programs. If you were there, though, you know full well why these computers were special. If you weren’t, imagine the ability to control every last feature of the hardware, without layers of software abstraction on top, and in a way that rewards detailed study the more you work with it. The graphics were simple and distinct, and in its beautiful minimalism the machine was both easy to program and difficult to master. Early software sparked your imagination in a way even the most realistic high-definition graphics today can’t quite pull off. Many enthusiasts I’ve talked to over the years feel the same way.

    I was born in 1973 and was fully immersed in the home computer revolution as a kid. I got my first Atari computer, a 400, when I was eight years old. Until then, all of my entertainment options were too defined for me. I’d play a board game, and the rules were the same each time, and if I changed them, the basic structure still remained intact. My imagination may have run wild when I was playing with Lego, but the bricks were always the same each time, and thanks to the laws of physics, they always worked the same way. Beginning with the computer, suddenly I could make new games out of thin air, either by typing in programs from books or creating my own. Every cartridge and disk made the computer do something different than it had done a few minutes earlier. It was as if you could make your own bricks, and make your own physics.

    None of this sounds crazy now. But consider how unprecedented video games, and the ability to play lots of them or program your own, were back then. Remember that computers had already been around for several decades, but they used to take up entire rooms, or at least portions of rooms by the 1970s. They used to cost tens or even hundreds of thousands of dollars. The members of the informal Homebrew Computer Club, which met in Menlo Park, California, in the mid-1970s, had the right idea. They were the first to see the creative potential in writing your own code and running it on hardware on your desk, rather than having to line up at a university or work at a big company to use a mainframe system. They worked to make computing more accessible to everyone. Because of this, within a few years’ time, a kid like me could own and experience a real computer every single day, without having to share it with strangers.

    Computer memory was still extremely expensive, though—you could only store a few games or text-based documents on a fragile floppy disk, which meant no digital music, photos, or movies. This was even before CDs and VHS tapes. Music came on records and cassettes, you had to bring film from your camera to a store (and wait) to get photos developed and printed, and you could only see movies in the theater. Back then, just about anything digital and affordable was a revelation. Sure, we had our computer-tech-infused science fiction and space adventures—Star Wars, and soon after, Tron, Blade Runner, and William Gibson’s novel Neuromancer. The concept of cyberspace would soon become a thing. But what we didn’t have yet, at least until the late 1970s, was the ability to walk into a store, buy a complete personal computer system, and bring it home to learn to program or play video games.

    One of my favorite books, The Hitchhiker’s Guide to the Galaxy, starts off with a line about humans on Earth being so primitive they still think digital watches are a pretty neat idea. Imagine the jump from a digital watch to your own personal computer. If you’re younger than 30 and reading this, trust me: It was amazing.

    Why I Wrote This Book

    Obviously I wrote this book because I am a huge Atari computer fan, but there’s more to it than that. Putting this book together evoked many good memories. Throughout the writing process, I discovered that what was driving me was nostalgia for a wonderful time and place. They say you can’t go home again, and it’s true in some sense; life is different when you’re in your 40s with a family, a job, and responsibilities. A part of me misses having untold hours to disappear into Ultima IV: Quest of the Avatar (even if half of that time was spent waiting for the maps to load). But when I was 12, I also didn’t have much else going on aside from homework. Finding that kind of free time today would be impossible.

    In addition, I stumbled on some things that reminded me that it wasn’t cool to be a computer nerd in the 1980s. My general impression is that it’s more acceptable to be a geek today. I remember feeling lonely during a lot of the time I spent on the Atari. That said, I met some lifelong friends through the bulletin board system (BBS) I ran on the Atari 800, and also talked to countless other people online with whom I’ve since fallen out of touch. It’s rarely just about the thing (computer, kind of car, book club); it’s about the people, and often that becomes the most important part. And for me—I’d argue for some two million of us, if somewhat difficult-to-prove sales figures are to be believed—the Atari 8-bit was central to our lives.


    Figure I.1: Me in 1986, in front of my Atari 800, with an Ultima IV: Quest of the Avatar map on the wall. I still have the map.

    How This Book Is Organized

    This book is roughly divided into three sections. The first five chapters cover the mid 1970s to the late 1980s, spanning the birth of the Atari 8-bit platform, the challenges Atari faced, and how the platform evolved over time. Chapter 6 is entirely about the games, which for many people are the most important part of the Atari experience. If you want to skip straight ahead and read about those, go for it! Finally, the last four chapters cover the Atari 8-bit platform as it is today: what it’s like to collect the machines, which ones you should choose, what modifications are available, and the best emulators on today’s platforms.

    Finally, a few notes on conventions throughout this book. Generally speaking, I refer to 8-bit Atari computers as “the Atari 8-bit”; I wanted to continue in the tradition of the 20-plus-year-old Frequently Asked Questions list (FAQ), which has circulated seemingly forever thanks to the tireless efforts of Michael Current, its publisher. Saying just “Atari computers” would also bundle in the 16-bit ST, the 32-bit Falcon, and a rather sorry bunch of PC compatibles from the late 1980s and early 1990s that no one needs to remember. While I had and also loved a 520ST for many years, this book focuses entirely on the earlier 8-bit experience.

    Tense—as in the written-word sense—is an issue when you’re talking about computers that came out 30 years ago, but that people also use today in different ways, not to mention your thoughts about them then (in the 1980s) and now. People can play Atari 8-bit games quite easily on emulators or otherwise modified Atari hardware even now. There are multiple ways to approach tense, and none are perfect. As a general rule, I use the past tense for the first five chapters of the book covering the tenure of the Atari 8-bit, and then switch to present tense for the real meat of the book—the games—and the community, collecting, and hardware mods available today.

    A few other quick notes: I also refer to modem and transfer speeds as bits per second or bps, even though in the 1980s everyone (including me, and incorrectly) said baud. Finally, I edited some quotes very lightly for clarity and consistent style within the book, but otherwise left them intact.

    With that, let's look back at the start of an incredible home computer revolution, one that became intertwined with a golden age of arcade gaming.

    1 | Atari 400/800

    The history of Atari the company has been told and retold. Most of the time, it’s with a focus on one of two things: its coin-operated arcade machines like Breakout, Asteroids, and Missile Command; or its game console lineup, starting with home versions of Pong in 1975, but most notably with the Atari Video Computer System (VCS, later known as the 2600) in 1977. I won’t rehash every last thing about Atari and its various levels of corporate dysfunction and pot smoking in this book, as others have already done so. But we could do with a brief refresher on how we got the computer in the first place.

    In a nutshell, it was originally about succeeding the VCS with something better. But then it got complicated.

    Nolan Bushnell and Ted Dabney incorporated the Sunnyvale, California–based Atari on June 27, 1972. This was after the two launched Computer Space, the world’s first coin-op arcade game, under the name Syzygy Engineering, and several months before the release of the coin-op Pong. The following year, Bushnell bought out Dabney, and provided financial backing for a group of engineers working under the name Cyan Engineering, located a few hours away in Grass Valley. Atari purchased Cyan Engineering in 1975 and renamed it the Grass Valley Research Center.¹ This is important for our purposes, because four key players in the design of the Atari computer emerged from this group. Ron Milner and Steve Mayer developed the VCS prototype. Joe Decuir debugged it and created a new, gate-level prototype. Decuir apprenticed under Jay Miner, the lead chip designer for the VCS, who later went on to design the Commodore Amiga.

    Separately, in 1976, one of Atari’s early employees worked on the Breakout arcade game with his friend, and it turned out to be a huge success. Later, the two developed a home computer design using borrowed Atari parts. Bushnell turned down the design, wanting to stay focused on video games. The two friends, Steve Jobs (Atari employee number 40) and Steve Wozniak (who had been working for Hewlett-Packard), went on to form Apple.²

    Eventually, it became clear that Bushnell needed more capital to launch the VCS, so he sold Atari to Warner Communications for $28 million in a deal with Warner exec Manny Gerard. The transaction ensured there was enough money to finish developing, release, market, and distribute the VCS.³ Bushnell remained chairman and chief executive officer of Atari, but tensions between him and Warner remained high, and Bushnell was eventually forced out before the end of 1978.

    After Atari launched the VCS in 1977, the Cyan Engineering team at Grass Valley Research immediately got to work on its successor. The team believed the VCS had roughly three years of life before it would become obsolete,⁴ and wanted to fix its most obvious flaws: Make it faster, give it more memory, and vastly improve its graphics and especially sound capabilities.⁵ “[We knew we had to] support 1978 vintage arcade games,” Decuir said in a presentation at the first Classic Gaming Expo in 1999.⁶ “We knew we would need to leapfrog the 2600 before somebody else did. [We had to] support home computer character and bitmap graphics. We saw the Apple II, Commodore, and Radio Shack appliance machines coming.”

    The project was known internally as Oz, and Milner, Mayer, and Decuir headed it up. George McLeod designed what would become the Atari computer’s Color Television Interface Adapter (CTIA) chip, which, like the VCS’s Television Interface Adapter (TIA) chip, could generate two-dimensional, on-screen sprite animation in hardware for faster performance. And a new Pot Keyboard Integrated Circuit (POKEY) chip, designed primarily by another Atari engineer named Doug Neubauer, would deliver rich four-voice audio for complex music compositions and sound effects, as well as handle keyboard scanning and paddle controller input.
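    To get a sense of what POKEY’s four voices made possible, here’s a minimal sketch in Atari BASIC (my own later-era illustration, not period code; Atari BASIC itself enters this story a bit later). Each SOUND statement drives one POKEY voice with a pitch value, a distortion setting (10 is a pure tone), and a volume from 0 to 15:

        10 REM THREE POKEY VOICES SOUNDING AT ONCE
        20 SOUND 0,121,10,8:REM VOICE 0: MIDDLE C
        30 SOUND 1,96,10,8:REM VOICE 1: E ABOVE
        40 SOUND 2,81,10,8:REM VOICE 2: G ABOVE
        50 FOR I=1 TO 500:NEXT I:REM LET THE CHORD RING
        60 SOUND 0,0,0,0:SOUND 1,0,0,0:SOUND 2,0,0,0:REM SILENCE ALL

    Three simultaneous tones from a few lines of code, with a fourth voice still free for sound effects.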

    The First Trinity

    By the mid 1970s, it became clear that microprocessors were the way to go over the discrete logic design found in early arcade machines like Pong and Breakout. The MITS Altair 8800 jump-started the home computer revolution in 1975, thanks in part to its reasonable $399 price in kit form. Just about everything was extra; the Altair didn’t even come with a keyboard, much less a display. Still, for the first time ever, the Altair made it possible for anyone to own a real computer for not much money. Other companies soon joined in with their own kits. By the end of 1976, some 40,000 personal computers had been sold already, with MITS, IMSAI, and Processor Technology making up about half, and dozens of other smaller companies selling the rest.

    At this point, there were still no prepackaged computers available. Sure, you could buy the Altair 8800 or a competing kit, and check the box to pay extra for the company to assemble it at the factory for you. But there was nothing standalone and self-contained—nothing you could just bring home from a store, plug in, and start using.

    This all changed in 1977, thanks to the arrival of the first so-called trinity of personal computers: the Radio Shack TRS-80, the Commodore PET 2001, and the Apple II. Tandy launched the TRS-80 in New York City on August 3. It cost $399 on its own and $599 with a 12-inch monitor, with a $49 cassette recorder for storage.⁸ The TRS-80’s architecture was based around a Zilog Z80 microprocessor running at 1.77MHz. The first machines came equipped with 4KB of RAM. The TRS-80 benefited from Radio Shack’s thousands of existing retail stores, meaning it had a comprehensive dealer network from the get-go. The TRS-80 went on to see serious popularity over the first several years of its life.

    Commodore’s monster PET 2001 started at $795. The PET 2001 looked like something straight out of the movie 2001: A Space Odyssey. The machine contained a 1MHz MOS 6502 CPU, 4KB of RAM (though Commodore bumped it to 8KB by early 1978), a built-in cassette recorder for loading and saving programs, and an integrated monochrome display so you could see what you were doing and what the results were. Many people complained about the feel of the chiclet-style keyboard. I became quite acquainted with it in my fifth-grade computer class in 1983, as our elementary school’s lab was stocked with PET computers. I don’t remember caring at all how the keyboard felt, other than that it was different from the Atari I had at home, and therefore neat.

    Then there’s the Apple II, the one most people remember. It was the slowest seller in the beginning, thanks to its high price ($1,298, sans floppy drive or monitor). But it eventually became a juggernaut in home, business, and education environments. Steve Jobs and Steve Wozniak released the Apple II in June 1977. The brilliance of the Apple II’s design, which was based on that of the Apple I, can’t be overstated. Wozniak knew how to get as few chips as possible to do as much as possible, while Steve Jobs ensured the machine was wrapped in friendly, stylish packaging. The Apple II contained a 1MHz MOS 6502 and 4KB of RAM. Like the TRS-80 and Commodore PET, the Apple II output to a 40-by-24-character display, but unlike those machines, the Apple II also displayed color. This innovation—huge at the time, if you can believe it—turned out to be key to the machine’s popularity, and became vital for both gaming and educational software.

    Early personal computers delivered on the promise of a packaged, fully contained system. But all of the popular machines, like the Altair 8800, Apple II, TRS-80, and Commodore PET, had limited graphics, sound, and memory. They also lacked software libraries. Dan Bricklin’s VisiCalc, the first electronic spreadsheet program, became the killer app for the Apple II in businesses large and small. VisiCalc let executives and accountants ditch their calculators and pencils and play out fictional business scenarios to see what would happen before any money was spent. But most people buying these machines ended up writing their own software using BASIC (short for Beginner’s All-Purpose Symbolic Instruction Code, a popular programming language) for the first couple of years, itself an incredibly rewarding activity. Games with good graphics were few and far between.
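    If you never used one of these machines, it’s hard to convey how low the barrier to entry was. A first program, of the sort nearly every early manual opened with, looked something like this (any BASIC of the era, including the Atari’s, would run it):

        10 PRINT "HELLO FROM 1979"
        20 GOTO 10

    Two lines typed at the READY prompt, then RUN, and the screen filled with scrolling text. That instant feedback hooked a generation of self-taught programmers.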

    On the gaming front, the Atari VCS was the first console to bring cartridge-based arcade gaming home—pull out a cartridge, plug in a different one, and you have a different video game to play. But the VCS was even more limited in power, and since it wasn’t a full-blown computer, it was impossible to program unless you worked for Atari.

    Candy and Colleen

    Back at Atari in March 1978, amid conflict with Bushnell, Manny Gerard (the Warner Communications exec) installed Ray Kassar, a fabric industry executive with an eye on the home computer market, as president of Atari’s consumer division.⁹ Kassar ordered the engineering team to turn the planned VCS game console successor into a real home computer. This meant adding programmable BASIC, a keyboard, a character set, and support for external peripherals such as a disk drive and printer.¹⁰


    Figure 1.1: The Atari 800’s internal expansion riser with the MOS 6502B processor. Credit: Evan Amos/Wikipedia

    With its new marching orders, the engineering team developed the Alpha-Numeric Television Interface Circuit (ANTIC), a chip to control bitmapped graphics and character support in a variety of modes, with different levels of resolution and color. The chip would work in conjunction with CTIA’s video output. The engineers were well aware of the limitations of the Apple II, the PET, and the TRS-80. They wanted their computer to be just as good at gaming as consumers would expect from the Atari brand, while simultaneously delivering a real computing experience.
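    As a concrete illustration of what ANTIC’s mode system exposed (using Atari BASIC, which enters this story shortly, so treat this as my own sketch rather than anything the engineers had in 1978): each GRAPHICS mode selects a different ANTIC screen mode, trading resolution for colors. This draws a diagonal line in the four-color, 40-by-20 mode:

        10 GRAPHICS 3:REM 40 X 20 PIXELS, FOUR COLORS (ANTIC MODE 8)
        20 COLOR 1:REM DRAW IN PLAYFIELD COLOR 0 (ORANGE BY DEFAULT)
        30 PLOT 0,0:DRAWTO 39,19:REM DIAGONAL ACROSS THE SCREEN
        40 GOTO 40:REM LOOP FOREVER TO HOLD THE DISPLAY

    Change the 3 in line 10 to another mode number and ANTIC rebuilds the entire display, with no other code changes required.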

    The final design had five large-scale integration (LSI) parts: the MOS 6502, ANTIC, CTIA, POKEY, and the 6520-based Peripheral Interface Adapter (PIA, the auxiliary chip with two 8-bit I/O ports that delivers interrupt control for peripheral I/O and manages the joysticks). Atari employees also coded the operating system. “There [was] a period at Atari when there were no [VCS] games coming from Larry Kaplan, Alan Miller, Bob Whitehead, and myself,” said David Crane, an early Atari employee who eventually went on to cofound Activision with the other three men he named. “As the most senior designers at Atari we were tasked with creating the 800 operating system. This group, plus two others, wrote the entire operating system in about eight months.”¹¹

    “I’m very proud of the OS we created for the Atari 400/800,” said Alan Miller. “It was similar in complexity to QDOS, the OS that Microsoft licensed a couple of years later, renamed MS-DOS, and sold for the IBM PC. However, the Atari OS was much better designed in terms of its user-friendliness and it had a much, much richer graphics subsystem and many fewer bugs.”¹²

    Next, Atari needed a BASIC for the computer. The company signed a contract with Shepardson Microsystems to write a version of BASIC that could fit into 8KB, as well as a file management system.¹³ The way Atari got there is fantastic in retrospect. “A funny story from this time that Al Miller likes to tell has to do with the Atari BASIC cartridge that was to ship with the system,” Crane said. “Atari had contracted with a young programmer named Bill Gates to modify a BASIC compiler that he had for another system to be used on the 800. After that project stalled for over a year, Al was called upon to replace him with another developer. So, while Al is the only person I know ever to have fired Bill Gates, I suspect that rather than work on Atari BASIC, Gates was spending all his time on DOS for IBM. Probably not a bad career choice for him, do you think?”¹⁴

    Atari ultimately decided to make two computers, famously code-named Candy and Colleen after two attractive secretaries at the company. In December 1978, despite warnings from Bushnell, Atari announced that it was forming a Computer Division, separate from consumer electronics and coin-op.¹⁵ The next month, at the 1979 Winter Consumer Electronics Show (CES), Atari officially unveiled two 8-bit home computer models: the 400 (formerly Candy) and the 800 (Colleen). Both machines had cartridge slots, which immediately marked them as stand-ins for game consoles—instantaneous program loading!—along with four joystick ports and enhanced graphics and sound capabilities.


    Figure 1.2: The April 1979 issue of Creative Computing on Atari’s new computer line.

    On the low end, Atari pitched the 400 as a kind of hybrid game console and entry-level computer, albeit with non-upgradable memory. The 800 was considered the real computer, with modular RAM and ROM, a second cartridge slot, a monitor output (including separate luma and chroma pins), and a mechanical keyboard.¹⁶ The names 400 and 800 came from their initial base memory—4KB and 8KB—although Atari also bumped the 400 to 8KB by the time the first shipment of machines hit Sears stores in November. “The Atari 400 was a game machine with a flat keyboard,” Decuir said. “The Atari 800 was a full computer.”¹⁷

    Part of what made Atari computers so accessible was that it was possible to hook them up to a regular television, and not only to a dedicated computer monitor the way you had to with the Apple II. The company bundled a television switch box adapter with each machine—which meant that the machines needed to comply with FCC regulations for frequencies in the television range, unlike competing models. As a result, Atari built both computers with extremely heavy cast aluminum shields to minimize radio emissions from the hardware. The machines received FCC approval in June 1979.¹⁸

    Unfortunately, the shielding made it difficult to work with the insides of the computers. In lieu of expansion slots (aside from memory) like those the Apple II had, Decuir developed an ingenious, shielded serial input/output (SIO) interface for attaching peripherals in a daisy-chain configuration.¹⁹ Atari’s SIO was an early plug-and-play system for external peripherals. While the cables were huge and thick by today’s standards, and the connectors large, they worked securely and reliably, and were impossible to plug in the wrong way thanks to the connector’s trapezoidal shape. The SIO port made it extremely easy to connect powerful peripherals; anyone could set up the computer. The downside was that each peripheral had to have its own brains and internal interfaces, which drove the cost higher, and since the connection was proprietary, only Atari computers could use Atari peripherals (though this began to change when various interfaces and third-party adapters hit the market a couple of years later). The worst irony is that later in 1979, the FCC changed the rules to allow Class B electronics (those intended for residential or home use) to ship without the shielding.²⁰
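    For a flavor of how SIO worked under the hood: every command the computer sent down the daisy chain was a five-byte frame (device ID, command, two auxiliary bytes, and a checksum), and each peripheral’s built-in brains simply ignored frames addressed to other devices. The checksum was an 8-bit sum with any carry wrapped back in. Here’s a sketch of that arithmetic in Atari BASIC (my own illustration; in reality the OS handled SIO traffic in machine code), using a hypothetical read-sector command for the first disk drive:

        10 REM FRAME: DEVICE $31 (D1:), COMMAND $52 (READ SECTOR), SECTOR 1
        20 DIM B(4):B(1)=49:B(2)=82:B(3)=1:B(4)=0
        30 C=0
        40 FOR I=1 TO 4
        50 C=C+B(I):IF C>255 THEN C=C-255:REM WRAP THE CARRY BACK IN
        60 NEXT I
        70 PRINT "CHECKSUM IS ";C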

    Thanks to the combination of joystick ports and full QWERTY keyboards, game designers could develop complex simulations and role-playing games the likes of which had never been seen before.²¹ The first was Star Raiders, introduced concurrently with the Atari 400 and 800 and arguably the killer app for the platform; other famous original Atari 8-bit titles like Rescue on Fractalus! and M.U.L.E. came later.²²

    The Atari 400 was supposed to cost $499.95, but the price was bumped up to $549.95 by the time it launched. Atari positioned the 800 on the higher end, at $999.95. Both machines came with a manual, a power supply, the aforementioned TV switch box, a CXL4002 BASIC cartridge, and the book Atari BASIC: A Self-Teaching Guide. The 800 also included a 410 Program Recorder cassette drive. Sears received the first
