Fan Nonfiction

Video games are haunted by the unseen. Fans lose themselves wondering what’s going on behind the tinted windows of offices in Bellevue, Washington, or Kyoto, Japan. The answer to how video games are made is never a match for the fantasy—oftentimes it’s a simple matter of déjà vu.

Such was the case when Jaime Griesemer visited Blizzard Entertainment in the late 2000s. Walking through the dim, beige corridors, he thought he was on the verge of seeing something unimaginable. Griesemer, a veteran of the games industry who’d helped design Halo and its many sequels, and his team at Bungie were looking for inspiration for their next game, the studio’s first original project in more than a decade. The designers at Blizzard had been occupied with their own secret project, a science-fiction successor to World of Warcraft, and Griesemer thought a collegial peek behind the curtain would help him come up with some new ideas. But when he began playing the demo Blizzard had prepared, he felt disbelief. “I was like, ‘Holy shit, you guys are working on the same game down to [the] character classes,’” he would recall. The two teams, with little contact between them, had somehow been working on the same essential idea. It was as if both studios had been hypnotized by the same mesmerist.

The story was related to Jason Schreier, who retells it in Blood, Sweat, and Pixels: The Triumphant, Turbulent Stories Behind How Video Games Are Made, an anthology of essays describing the development of 10 recent video games. “Every single video game is made under abnormal circumstances,” Schreier says, and while the games themselves often appear indistinguishable, it’s the story of how each was made that makes the difference. His mission is to reveal the particular and profound in what seems like a pastime of interchangeable parts. He shows just how much thought it takes to make something that works precisely to the degree that you don’t have to think about it. There is meaning in the relief from meaning that games provide, and each game must find its own way to that same end.

Many of the games in the book are sequels—The Witcher 3: Wild Hunt, Dragon Age: Inquisition, Diablo III, Uncharted 4: A Thief’s End. The ones that aren’t are such thorough mimic jobs that they make the distinction between original and serial meaningless. Stardew Valley, for instance, is an elaborate attempt to make an updated version of Harvest Moon, a popular series of chibi-farming games. “I just wanted to play another game that was exactly like the first two Harvest Moons, but just with different people and a different map,” developer Eric Barone says. The developers of Shovel Knight began with the idea to emulate an entire company instead of a single game. “I remember saying, ‘I got into the industry to make Nintendo games. Let’s make a Nintendo game,’” designer Dave D’Angelo says.

There’s an impression of insularity and self-reference in all of these stories. Each account is wreathed in the essential tautology of fan thinking: video games are important because people care about them, and people care about them because they’re important. It’s lava levels and nostalgia all the way down. This is an echo of the circular dynamic between consumption and creation that Roland Barthes described in Criticism and Truth: “How many writers have written, only because they have read? How many critics have read only in order to write?” One could say the reason fans play video games is so they’ll be better equipped to make them one day. And those who cross the border from fan to designer are often motivated, Barone admits, to make games for an audience of their imagined former selves.

Griesemer describes how this dynamic of autonostalgia became a creative impediment at Bungie. While he and other veterans were eager to make something other than a sci-fi shooter, the team had swelled from eight people to more than 300, and “the huge majority of them had been hired after Halo 3 shipped. These are all people who love Halo. They wanted to work at Bungie because of Halo. So of course they wanted to work on something that was like Halo.” This helps explain how the company’s next project, which had begun as a medieval fantasy game in which each player resided in their own custom tavern and went on quests with others online, would eventually become exactly the kind of sci-fi shooter (2014’s Destiny) the original team had wanted to get away from.

In the opening chapter, on Pillars of Eternity, another self-referential throwback, this time to the Dungeons & Dragons games of the 1990s that Obsidian’s founders had worked on, studio cofounder Darren Monahan describes how the choice to seek funding from fans through Kickstarter made it feel “like we had maybe three hundred or four hundred other people working on the game.” Writer and designer Chris Avellone was happy to be answering to fans instead of publisher executives. “I’d much rather have the players be my boss and hear their thoughts for what would be fun,” he tells Schreier, “than people who might be more distant from the process and the genre and frankly, any long-term attachment to the title.”

This porous boundary between player and producer isn’t an accidental byproduct of games culture. In Coin-Operated Americans: Rebooting Boyhood at the Video Game Arcade, games researcher and professor Carly A. Kocurek traces the sudden popularity of video games to inflation and the stagnant global economy of the early 1970s. “Adults who had grown up in an era of company men, Fordist production, and savings accounts were watching the young grow up in an age of rapid-fire career changes, flexible accumulation, and easy credit,” Kocurek argues.

Against a backdrop of oil embargoes, gas lines, escalating unemployment, and a 25 percent drop in real wages, “the high individualization of video-gaming competition and skills appear to look forward to the rise of freelance and contract labor as a dominant mode of work.” Games weren’t just a pastime but a generation’s first chance to tinker with the computational principles destabilizing American culture, the warm screen glow of personal achievement masking the cold indifference of the calculations making it possible. They taught a generation how to enjoy the feeling of being dominated by computers.

“You’ll probably note that most of the people who speak in this book are male,” Schreier writes in a short note about his reporting methods, “which is a depressing (and unintentional) reflection of an industry that has, for decades now, been dominated by men.” Schreier doesn’t elaborate further, but his choice of games to represent the industry reflects exactly the kind of gender bias he’s trying to disown.

In both economic and demographic terms, none of the games covered in Blood, Sweat, and Pixels have been as meaningful to the industry as Minecraft, Pokémon Go, League of Legends, Kim Kardashian: Hollywood, Clash of Clans, or Candy Crush—games played by women at much higher rates than those discussed in the book. In a fan document, this bias would be understandable, but from a journalist covering an industry in the presumptive public interest, it’s harder to accept. Stardew Valley and Pillars of Eternity are no less derivative of their respective subgenres than Candy Crush or Kim Kardashian: Hollywood. Yet assumptions about what counts as serious, creative game design and what is just a frivolous waste of time are inescapably gendered. Women accounted for around 44 percent of the 155 million people in America who played games in 2015 but just 22 percent of the industry’s overall workforce. Even developers who consciously strive to break away from their own gender biases often find them inescapable, so deeply are those biases embedded in notions of good, serious design.

In the chapter on Uncharted 4: A Thief’s End, Schreier captures, perhaps unintentionally, how these biases lead to hostility toward and mistrust of women’s creative work in the industry, something easily hidden behind the mask of best practices. Amy Hennig had helped lead development on the previous three Uncharted games, and when she began work on the fourth game in the series, she wanted to challenge the team to avoid any sort of gunplay for the first half of the game. She thought this omission would surprise players and challenge designers to think beyond the predictable arithmetic of cover points, ammo distribution, and enemy spawns. Some on the team were excited by the change, but others worried there would be no fun without regular intervals of shooting. Bruce Straley and Neil Druckmann, who’d been leading development on the studio’s other major game at the time, The Last of Us, were invited to look at the work the team had done so far.

Straley described Hennig’s work as “theorycraft . . . a bunch of ideas that work on paper or when you’re talking over lunch, ‘wouldn’t it be cool if’ moments, but when you try to test them in the game, they fall apart quickly.” Straley and Druckmann worked on a proposal to revise the game’s design and structure in a way that might settle some of the team’s unease. Not long after, Hennig left the company. Citing anonymous sources at the studio, IGN’s Mitch Dyer reported that Hennig had been forced out by Straley and Druckmann. Naughty Dog denied the characterization. The company also made her sign a nondisparagement agreement, preventing her from speaking about what actually happened.

Hennig did later speak about her overall work experience at Naughty Dog with designer Soren Johnson, on his Designer Notes podcast in 2016. She described her 10 and a half years at the company as “really hard,” saying her average workweek was 80 hours:

There’s people who never go home and see their families. They have children who are growing up without seeing them. I didn’t have my own kids. I chose my career in lots of ways, and I could be single-minded like that. When I was making sacrifices, did it affect my family? Yes, but it was primarily affecting me, and I could make that choice. But when I look at other people . . . I mean, my health really declined, and I had to take care of myself, because it was, like, bad. And there were people who, y’know, collapsed, or had to go and check themselves in somewhere when one of these games were done. Or they got divorced.

What’s most peculiar about these sorts of working conditions is that they’re often self-directed, the result of a culture in which being counted as an employee is a privilege, a symbol of personal achievement that one pays for through self-sacrifice. “It’s never mandated,” Naughty Dog’s copresident Evan Wells tells Schreier. “We never say, ‘OK, it’s six days a week; OK it’s sixty hours a week.’ We never [change] our forty-hour expectation or our core hours, which are 10:30AM to 6:30PM. . . . People put in a lot more hours, but it’s based on their own fuel, how much they have in the tank.” Romanticizing “crunch,” as the industry’s absurd hours and brutal overwork are known, is common among industry boosters.

Is any of this worth it? Schreier mostly avoids the question. In a way, he collaborates with the culture of obsessive overwork by treating his subjects not as individuals but as information nodes for industry gossip and office-park politics. He seems almost paralyzed by the prospect of having to describe his subjects’ lives outside of the games industry. Each new speaker is usually given only one or two sentences that hint at their physical presence or personality.

There’s the slightest touch of desperation to this delicately dissociative approach to the people who make games. In part this is because the games industry, at least the console-centric slice of it Schreier focuses on, has been both creatively and economically stagnant for almost a decade. Total video-game-industry revenue has grown from $77 billion in 2009 to $99.6 billion today, but console gaming has lost both market share and total profits (when adjusted for inflation). Most of the growth in the industry has come from phones and tablets (revenue from which jumped from $19 billion in 2009 to $38 billion in 2016), the only platforms where women players outnumber men. In 2009, investment bank IBIS Capital reported that most console game sales were coming from existing franchises, forcing publishers to compete by spending more on their titles to make them feel spectacular and new even as they were mostly sequels and remakes.

The strategy offered “short to medium term earnings but risk[ed] declining long term growth.” Activision, the biggest games publisher in the world, dealt with these pressures by closing studios and assigning those it kept open to work together on a small number of big-budget games. At the same time, the company was running money through a shell company in the Netherlands, to which it had transferred ownership of all its IPs, to avoid paying taxes on more than $2.3 billion in revenue. All of this was in support of a business that released four major new games in 2016, in addition to a series of downloadable expansions and map packs for existing games.

As the number of big games decreases and our options for participating in games culture slowly narrow, there’s an implicit anxiety about what any of us would be without all of these games to obsess over.

In these times, the most important task of game journalism isn’t to serve a public interest but to ensure that fans can continue to identify some version of themselves in the games they have played, and that future releases will allow them access to even deeper levels of self-expression and understanding. In playing the next game, owning the newest console, having an opinion on the latest patch, we feel like we can become stabler versions of ourselves, all at the cost of clearing out space—both mental and financial—for open-ended consumption of a form without any purpose beyond this increasingly tautological pleasure. This process is necessarily dehumanizing. Games matter because you are here to play them, and you remain here to play them because they matter.

Maybe this is why, with video games, we break from the tradition of identifying people with particular pastimes as lovers—bibliophile, cinephile, audiophile. To love video games is to become not a ludophile but a gamer, a claim on identity rather than a statement of personal interest. Every fact or feeling in our lives that doesn’t relate to games becomes an extraneous detail, so much so that it can feel like our whole lives might be beside the point.

Schreier closes his book with an image of self-immolation, in a passage so glib it borders on thoughtlessness. He describes how in Game Dev Story, a satire about managing a game studio, the individual developers on your team burst into flames when performing at optimal efficiency. When Schreier marvels at the advanced visual technology of games like Uncharted 4 or Destiny, he says, “that’s the image that comes to mind: a room full of developers setting themselves on fire. Maybe that’s how video games are made.” The weightlessness of the comparison captures something true about our desire to think about games in a vacuum, as if the horizon ends at the office parking lot, and everything happening in the world beyond is some kind of surplus reality that becomes relevant only if it can be squeezed back into a new game to add to our endless collections.

But like Schreier’s title, which echoes a phrase Winston Churchill used in a 1940 speech to describe the terrible sacrifice and carnage to come in World War II, self-immolation has a history. It has become the second-most-common form of suicide in Tunisia since the 2010 uprising, which was triggered in part by Mohamed Bouazizi, a street vendor who set himself on fire to protest state harassment. In 2016, the country’s biggest burn-treatment hospital admitted 104 people who had set themselves on fire. Many recognized the act not as a symbolic gesture but as a tactic in an ongoing labor conflict. One man, Imed Ghanmi, a teacher who’d lost his job and turned to selling things on the street, had attempted self-immolation several times before finally dying of it. “Imed used to pour gas on himself as a way to blackmail the police so they would give him back his merchandise,” his brother told New York Times reporter Lilia Blaise. “He had already done that as a last resort two or three times before and he told me it worked.”

Stripping images like this of context and history is part of the magic of video games. Games train us to ignore the world, to give everything we have to an industry whose primary purpose seems to be holding both its audience and performers captive. “People would ask me how things were going,” developer Sean Velasco tells Schreier, “and I was like, ‘Everything is getting worse except Shovel Knight.’ That’s getting better.”

Source: The New Inquiry.

Michael Thomsen is a writer in New York. His work has appeared in the New Yorker, the New York Times, Slate, the New Republic, and the Washington Post.
