Sid Meier is famous for creating the video game Civilization. He’s also known for having his name on the box. Meier released Civilization thirty years ago this month, after developing it with Bruce Shelley, a veteran board-game designer. The pair were inspired by the illustrated history books you might find on a middle-school library shelf, and by titles like Seven Cities of Gold (1984), a video game of Spanish conquest created by the designer Danielle Berry. In Civilization, you start with a covered wagon on a map that is largely obscured. You found a city. You learn metalwork, horse riding, feudalism, democracy, and diplomatic relations. Eventually, the rest of the world is revealed—a patchwork of nations. You can dominate your neighbors or strive to outshine them. History rolls on.
Civilization didn’t mark the first time Meier’s name appeared on a box. In 1987, we got Sid Meier’s Pirates!, in which you sail your way across the Caribbean, evolving from a winsome privateer to a peg-legged Blackbeard. In 1990, Meier’s earlier collaboration with Shelley resulted in Sid Meier’s Railroad Tycoon, a construction simulator that spawned a slew of copycats. And then, in 1991, with little marketing fanfare, Civilization appeared. Players realized that they had found a gem. The Sid Meier stamp exploded, popping up on Sid Meier’s Gettysburg!, Sid Meier’s Alpha Centauri, and Sid Meier’s SimGolf. There were sequels to Civilization, which Meier had little to do with. We’re now on Sid Meier’s Civilization VI. His name is still on the box.
The latest rectangle to bear his name is not a game but a book: “Sid Meier’s Memoir!: A Life in Computer Games” (W. W. Norton). It provides a whistle-stop tour of the video-game industry as it evolved across Meier’s four-decade career. Today, we swim through a digital soup made by machines that were developed, in large part, to play games. Graphics processors designed in the nineteen-nineties for first-person shooters became useful, a decade later, to the developers of the neural nets that power our social-media platforms. Many of the people who helped create our virtual environments cut their teeth by making and playing video games. “I’ve been playing Civilization since middle school,” the Facebook C.E.O., Mark Zuckerberg, wrote in a post on his Web site. “It’s my favorite strategy game and one of the reasons I got into engineering.”
Like Meier, Zuckerberg signed his software: when thefacebook.com launched, in 2004, it was “a Mark Zuckerberg production.” Facebook is now a much larger enterprise, as is Civilization. The game is put together by hundreds of hands, from producers, designers, illustrators, and coders to marketers, play testers, and skin-modding fans. Even Meier’s memoir is the result of a collaboration with Jennifer Lee Noonan, a former sound designer for video games. A growing pile of video-game histories—such as Tristan Donovan’s “Replay” and the two-volume “The Ultimate History of Video Games,” by Steven Kent—suggests that the medium has always had collective effort at its heart, from its academic beginnings to its ascent into everyday life. In Meier’s memoir, we discover that he was a good game-maker when he fought this essential fact, but that he became great when he learned to embrace it.
Meier came to computers just as computers were coming to the world. In 1971, the year Meier entered the University of Michigan to study computer science, the first coin-operated video games appeared in American bars, stealing attention from pinball machines and pool tables. One of those games, a spaceship-fighting simulator created by a pair of electrical engineers, Nolan Bushnell and Ted Dabney, was dubbed Computer Space. Computer Space was a clone of Spacewar!, a 1962 game that a group of M.I.T. students devised for their university’s Buick-sized PDP-1 computer. One student suggested the concept, another prototyped the inputs, and the rest of the group critiqued, patched, and improved it. The code made its way from campus to campus, spreading the earliest seeds of video-game design. In 1975, Meier, a member of the game’s target audience, graduated. He found a job networking cash registers in Maryland, and started making a “Star Trek” game while he was bored at work. “I even added small beeping sound effects,” he writes. The game quickly caught on in the office. “Small beeps ricocheted through the halls as a sort of work-abandonment klaxon of shame.”
Bushnell and Dabney, meanwhile, formed a company in Los Gatos, California, and called it Atari. At the time, an arcade cabinet played exactly one game, but, in 1977, Atari released a home console system that let you play different games on a single machine. Two years later, the company released the Atari 800, a home computer that came with a keyboard, four joystick inputs, and, most important, a programmable cartridge that allowed Meier to make his own games. In 1980, he created Hostage Crisis, a shoot-’em-up inspired by the diplomatic standoff in Iran. A giant face dubbed the Ayatollah sat in one part of the screen and shot missiles at the player. Meier proudly showed the game to his parents, resulting in a scene ripe for psychoanalysis.
At that moment, Meier realized his passion. “If great literature could wield its power through nothing but black squiggles on a page, how much more could be done with movement, sound, and color?” he writes. “The potential for emotional interaction through this medium struck me as both fascinating and enticing.”
Meier’s offering wouldn’t have been welcome in the American big leagues. Atari, the leading name in video games, was extremely shy of controversy. The avatars onscreen were often just rectangles or blobs, and the concepts had to stay within the bounds of dinner-table politesse. In one game, a Cold War battlefield became a fictional alien planet; in another, the Holy Grail turned into an “enchanted chalice.” Atari also forbade its coders from signing their work, even though, as Donovan writes in “Replay,” its developers “were starting to see themselves as the artistic pioneers of a new form of entertainment.” In 1980, an Atari engineer named Warren Robinett decided to make a little mischief. In a secret room within his sword-and-sorcery game Adventure, he placed an inscription—“Created by Warren Robinett”—that appeared like a winking medieval acrostic.
Although Meier’s games were ill-suited for Atari, the cathedral was about to collapse, anyway. In 1979, several star Atari programmers, frustrated by their work not being publicly credited, created their own company, Activision, to make games for the Atari console. Atari sued Activision and lost, unleashing a torrent of independent, third-party video-game developers. (Donovan credits the flood of cheap Atari games, some of them poorly designed advertisements for things like dog food, with the industry’s decline in the United States.) Meanwhile, Meier was on the rise. Like a student sketching faces from the statuary at the Met, he learned his craft through imitation. He often made unauthorized clones of Space Invaders and Pac-Man, stuffing the games into plastic baggies and trying to sell them through his local electronics store.
Though Meier feigns remorse now (“ ‘Adaptation’ is such a flattering word. So much nicer than ‘copyright infringement’ ”), his early approach toward intellectual property echoed the attitudes of academically trained programmers. “Signing code was thought of as arrogant,” the digital librarian Brewster Kahle has said of his time at M.I.T. in the nineteen-eighties. And as the reporter Clive Thompson makes clear in “Coders,” his illuminating history of computer culture, anti-authoritarian hackers and coders of the era saw software as “an intellectual barn raising,” a collective effort. “Owning an algorithm you’d written,” Thompson writes, “seemed as nuts as ‘owning’ the concept of multiplication itself, or constitutional democracy, or rhyme.”
But Meier and his cohort were going a different way. In 1982, Meier created a driving game, Formula 1 Racing, and the publisher put his name on the box. This suited him. “It seemed to me that computer code was just as elegant as any literary prose,” Meier writes. That year, Meier founded a company with Bill Stealey, a former colleague and veteran Air Force pilot. They set up shop in Stealey’s basement and called their enterprise MicroProse. Stealey had a business acumen that Meier lacked. He schmoozed at video-game trade shows and ginned up demand by calling electronics stores, assuming a fake identity, and asking if they had MicroProse products in stock. He also pushed Meier to put out war games at a steady clip—Spitfire Ace (1982), NATO Commander (1983), F-15 Strike Eagle (1984)—which Stealey vetted for military accuracy.
Stealey’s big idea, though, was to turn Meier into a brand. In 1984, he staged a photo shoot, seating the mop-topped coder in front of a computer, putting a joystick in his lap, and surrounding him with cartoonishly labelled bags of money. “But Bill had decided that even this was too subtle,” Meier recalls. “Just before the photo was taken, he had climbed onto my desk to hang glittering golden dollar signs.” Stealey fashioned Meier as a folk hero for other programmers. “In order to lure the smartest and most creative talent in the industry,” Meier writes, “he wanted to promote the message that we treated our designers with the admiration and respect they deserved.” The video-game auteur had arrived.
Meier takes pains to prove that he in no way encouraged this self-promotion. But he did have one prima-donna-ish quirk: he wanted to do everything himself. Like many talented programmers at the time, he preferred to disappear into a fugue of coding and return with a gleaming piece of software. Until Pirates!, Meier, a music fanatic who owns dozens of guitars, had done the sound design on his games himself. He also rendered his own art until 1985, when Stealey hired an illustrator for Meier’s submarine-warfare game Silent Service. “I was, to be honest, a little offended,” Meier recalls. “Sure I was no Van Gogh, but I had been doing our game art for years and felt like I was pretty good.” Meier surrendered when he saw the illustrator’s results. “His 3D perspective was truer, his color contrast was livelier, and his captain looked human,” Meier writes. “It was better in basically every possible way a work of art could be better.”
By the late eighties, game design had become much more collaborative. A century-old Japanese toymaker, Nintendo, had entered the market, led by its own auteur: the banjo-playing Shigeru Miyamoto, who created the Legend of Zelda and Super Mario Bros. series. Even Miyamoto had help; his worlds were enriched by the Tolkienist storytelling of his colleague Takashi Tezuka, and, as Melissa Wood and France Costrel’s Netflix series “High Score” shows, by the many illustrators and sound designers who touched the games. Meanwhile, Meier had started working with Shelley, the board-game designer, who gently nit-picked Meier’s sprawling world-building. In early versions of Railroad Tycoon, for instance, bridges would randomly wash out in floods, just like in real life. The hazard turned out to be more punishing than interesting. “Bruce reminded me of one of my own axioms of game design,” Meier recalls. “Make sure the player is the one having fun.”
Game designers want to impress the players, Meier knew, but players want to impress themselves. “The game isn’t supposed to be about us,” Meier—of Sid Meier’s Civilization—writes, without irony. “The player must be the star, and the designer as close to invisible as possible.” But the player was about to ascend in ways that Meier did not anticipate. Soon after Meier and Shelley circulated their prototype of Civilization, other MicroProse developers began playing the game and hammering on Meier’s door. “What if aqueducts prevented fires, granaries prevented famine, city walls prevented floods?” he recalls them saying. “What if lighthouses increased your navy’s speed, but suddenly became obsolete after the development of magnetism?” Meier incorporated many of these ideas and shipped the game out, only to be buried again under fan letters with suggestions about Aztec bronze-working or the speed of trade caravans. “Civilization brought out the inner game designer in everyone,” Meier writes.
These video-game enthusiasts were about to leap into the car and drive. By 1996, the public Internet was in full swing, and Meier had handed over Civilization, and his name, to other coders. Brian Reynolds, one of the lead designers on Civilization II, followed the example of the first-person shooter Doom, and built a back door that allowed fans to pack new sounds, art, and even mechanics into the game. Meier fretted. “They would probably be terrible at it, I thought, and blame us for their uninspired creations,” he writes. “And if by chance they did happen to be good at it, then all we were doing was putting ourselves out of a job.” The older game designer was now the uneasy parent, looking on as his children ran amok. But Meier ended up seeing that he was “wrong on all counts,” he writes. “The strength of the modding community is, instead, the very reason the series survived.” And it’s why the label “Sid Meier” lives on, too.
These player communities transformed the rhythm of game design. The task of making today’s blockbuster games—the zombie-apocalypse saga The Last of Us, for example—doesn’t square with the one-man-band approach that Meier and his Atari peers used half a century ago. The journalist Jason Schreier recently catalogued this turn in his book “Blood, Sweat, and Pixels.” Again and again, Schreier watches platoons of illustrators, coders, sound designers, and producers struggle in the trenches for years before a game’s launch, and then scramble for years afterward to meet player demands with a fog of patches and updates.
This is the way of all software in the twenty-first century. It’s how social-media developers, in particular, approach their craft, with much of a product’s sculpture and polish arriving, for better and worse, after it has entered our view. In 2006, a team of developers at Facebook rolled out the News Feed, which made a mosaic of content out of your friends’ profile pages. As Thompson points out in “Coders,” many users, feeling overexposed, protested the change. Zuckerberg apologized, and the company made a privacy patch to address the concerns. Fifteen years later, the News Feed is still around, but, like the ship of Theseus, it has been patched so continuously that little of the original remains.
With painting or prose, it has been said that a work is completed, in some metaphorical sense, by its encounter with an audience. With code, this circuit is literal. Programs and platforms are put on display, then tweaked because of error reports and user data; in multiplayer games, the activity of other players animates the experience. It is easy to say that this diminishes the artist-developer. But what the history of video games reveals is the story of all art. Our accounts often thrive on the vision of singular minds, even though every great work relies on the sweat, luck, and talent of many people, each inflecting one another in a continuous loop. When Sid Meier began tinkering with a new game, thirty years ago, his hope was that players would see themselves in his version of our planet. It was when the audience could watch one another tinker, too, that the planet became a world.