Fifty years after their humble beginnings, video games have grown into the world’s leading entertainment industry. This year, roughly 3 billion people worldwide will play a video game, and analysts estimate that global industry revenue will surpass US$200 billion – almost double the size of the motion picture industry – with double-digit percentage growth expected for the rest of this decade. Further, while the entertainment industry in general was hit hard by the COVID-19 pandemic, the video game industry instead saw one of its biggest growth spurts in 2020, as video games became the entertainment of choice for people under lockdown around the world.
However, the gaudy market statistics belie longstanding issues in the industry. Several high-profile video game companies have recently come under fire for allegedly forcing their developers to work unethically long hours for months on end and for creating hostile work environments. Calls for greater employee protection have reached a crescendo in response, but despite years of such calls, substantive change has yet to materialize. This article provides an overview of employee overwork in the video game industry, why the issue may be approaching a boiling point, and the solutions that have been proposed.
History of crunch in the industry
Most if not all jurisdictions around the world today have some form of overtime labor law in recognition of the deleterious effects of overwork on an individual’s general well-being. The current medical consensus is that for the vast majority of the population, work efficiency starts to drop rapidly above 50-55 hours of work per week, and an employee working 70 hours a week may not be doing any more productive work than a colleague working 50 hours the same week. While short-term overwork generally does not cause significant issues, extended periods of overwork lasting several weeks or more carry a dramatically higher risk of serious chronic medical conditions, such as stress headaches, back and neck pain, and cardiovascular disease, as well as psychological issues such as depression, anxiety and emotional exhaustion. Those psychological issues can in turn cause potentially irreparable damage to relationships with friends and loved ones. Because an overworked workforce ultimately drags on the overall economy, governments have an interest in deterring employers from unnecessarily causing their employees to work excessive hours.
Video game development is not immediately thought of by the general public as a demanding job in terms of hours worked (“Aren’t you making games?”). In reality, the frequent need to work extra hours for a significant period of time to complete a project or meet a deadline – “crunch”, as it is commonly called – has long been something of an open secret. But with no social media outlets before the turn of the millennium, and with all the fears associated with being a whistleblower, awareness of the issue outside of the industry was virtually nil.
The first widely known complaint of crunch in video game development is often attributed to a blog post/open letter penned in 2004 by an individual under the name “ea_spouse”. In the blog post, “ea_spouse”, later identified as Erin Hoffman, complained that her spouse, an Electronic Arts game developer at the time, had been forced by management to work on an unnamed game project for 13 hours per day, seven days a week, with the possibility of a Saturday off at 6:30pm “for good behavior”, without any overtime pay or additional sick/vacation leave. Such work conditions over several continuous weeks were causing her spouse to develop chronic headaches and stomach problems. Worse yet, Hoffman claimed that the crunch was implemented not because the game had fallen behind or encountered a serious problem – it was deliberately planned as part of the production schedule. Electronic Arts eventually faced two class actions, Kirschenbaum v. Electronic Arts, Inc. (2004) and Hasty v. Electronic Arts, Inc. (2006), in which Electronic Arts was alleged to have intentionally (mis-)classified its game programmers under California labor law so as to keep them from receiving overtime compensation. As Electronic Arts settled both class actions for a total of about US$30 million, the legal issues were left undecided, but the “EA Spouse Letter” gave the wider video game community its first inkling of the labor conditions under which the most popular games from large studios/publishers were made.
In 2010, spouses of developers working at Rockstar Games’ San Diego studio on Red Dead Redemption posted a letter online in much the same fashion as Hoffman had several years earlier. The letter alleged a disturbingly similar work environment at Rockstar San Diego, including 90-hour weeks, project mismanagement and chronic health problems from the extended crunch, among them depression and suicidal tendencies. This matter was also settled out of court. Rockstar Games would again find itself under the spotlight after co-founder Dan Houser famously claimed (apparently as praise) in an October 2018 interview with New York Magazine that “we were working 100-hour weeks” in the run-up to the launch of Red Dead Redemption 2, a sequel to Red Dead Redemption. Houser later attempted to walk the statement back, saying he had been referring only to his own hours and those of a few people closest to him. But given the context of the interview – creating marketing hype for the game – even if Houser had the best of intentions regarding the work done by the developers, the fact that 100-hour weeks were needed in the first place raised serious questions about video game development practices.
As awareness increased, more people came forward to speak about their experiences with crunch. The sudden closure of Telltale Games in 2018 revealed how a once highly successful smaller studio had taken on too many projects at once, resulting in widespread crunch of 50 to 70-hour work weeks. In 2019, a number of (ex-)developers of Epic Games’ extremely popular free-to-play game Fortnite Battle Royale (commonly referred to as simply Fortnite) disclosed in interviews that they had worked more than 70 hours per week for months to support the game after its launch in September 2017. Finally, as a case in point that crunch is not exclusive to video game development in the United States, Polish video game studio CD Projekt RED announced in mid-2020 that its employees would be required to work overtime so that its new game Cyberpunk 2077 could meet the advertised launch date in the 2020 holiday season, even though the studio had previously promised that it would be able to avoid crunch.
For a statistical portrait of the state of crunch in the video game industry, the International Game Developers Association (IGDA), a nonprofit organization formed in 1994 to support game developers around the world, has been conducting the Developer Satisfaction Survey (“DSS”, initially annual but now biennial) since 2014 to compile information about the well-being and opinions of video game developers worldwide. According to the 2021 DSS, slightly over half of all respondent developers – including employees, freelance contractors and the self-employed – have experienced crunch in their jobs. Among those who experienced crunch, about 60% went through crunch sessions more than twice over the past two years, which the IGDA noted was a large increase over the 35% reported in the 2019 DSS. In addition, about 55% of employees who experienced crunch worked at least 50 hours a week during crunch sessions, a slight reduction from the 2019 DSS, which reported nearly 70% working those hours or more. Nevertheless, there are still reports of 80+ hour work weeks, and it is telling that about half of the respondents believe that crunch is “expected as a normal part of their job”.
Causes of crunch
Why is crunch such an endemic issue in the video game industry? Based on the IGDA’s data and observations of the video game industry from the 2010s onward, the causes of crunch may be categorized as follows:
Crunch attributable to the complexity of modern video games
While video games are by definition a visual medium, they are, unlike every other form of entertainment media, also interactive, and they depend on the technological capabilities of the gaming device/platform they run on. As computing technology improved and the first generation of children who played the earliest video games grew up, more came to be expected of what video games can do. Consider a soccer game on the Nintendo Entertainment System (NES) in the late 1980s to early 1990s: visually, the two teams and the soccer field were limited to more or less a group of colored pixels. In a modern soccer video game, such as the FIFA series made by Electronic Arts that runs on the latest platforms, the likeness of the players is digitally re-created through facial imaging and motion capture under licenses with the relevant professional clubs, and the high-resolution stadium and soccer pitch let gamers discern details in the turf and spectators in the stands. The gameplay has been vastly expanded thanks to sophisticated game device controllers: gamers now have access to an entire repertoire of attacking moves, defending moves, team tactics, off-ball controls, and set pieces (e.g., free kicks), just as the actual players or the coach would in a real match. There are also all the internal workings of the game that the gamer does not perceive on the screen – the calculations simulating the artificial intelligence of the opposing team, the physics of the soccer ball and the players, and so on. All of the above require careful coordination among specialist teams working on each element of the game, which is then implemented in software through programming, followed by play testing for quality assurance (QA) to find unintended software interactions and report them back to the rest of the development team to fix before the game is launched.
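To give a concrete, deliberately simplified sense of the per-frame physics calculations mentioned above, the following Python sketch advances a hypothetical soccer ball through one second of a 60-frames-per-second game loop. All names and constants here are invented for illustration; real game engines use far more elaborate integrators and collision systems.

```python
from dataclasses import dataclass

GRAVITY = -9.81  # m/s^2, downward acceleration (illustrative constant)
DRAG = 0.1       # simple linear air-drag coefficient (illustrative constant)

@dataclass
class Ball:
    y: float   # height above the pitch, in meters
    vy: float  # vertical velocity, in m/s

def step(ball: Ball, dt: float) -> Ball:
    """Advance the ball by one frame using semi-implicit Euler integration."""
    vy = ball.vy + (GRAVITY - DRAG * ball.vy) * dt  # update velocity first
    y = max(0.0, ball.y + vy * dt)                  # then position, clamped at ground
    return Ball(y=y, vy=0.0 if y == 0.0 else vy)    # ball stops when it lands

# Simulate one second of a 60 fps game loop for a ball kicked upward
ball = Ball(y=2.0, vy=5.0)
for _ in range(60):
    ball = step(ball, 1 / 60)
```

Even this toy loop hints at the coordination problem: the physics step must agree with the renderer’s frame rate, the AI’s decision timing, and the network code, and every one of those pieces is owned by a different specialist team in a large production.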
As a result, even with the aid of corresponding improvements in the various hardware and software tools for development, the amount of work and the range of talent required to create a modern video game – especially one designed for mass appeal through realistic graphics and easy-to-pick-up gameplay – have grown extremely large and diverse, and the whole development process has become a daunting challenge in project management due to the enormous number of “moving parts” involved. It is thus not surprising that 62% of respondents in the IGDA’s 2019 DSS attributed the occurrence of crunch to “poor/unrealistic scheduling”, by far the most prevalent cause.
In this connection, QA often gets the short end of the stick, with a slight majority (56%) of the DSS respondents who worked in QA answering that they felt crunch is a “normal part of the job”. Thorough QA is an extremely time-consuming process: the sophistication of games means there are many aspects to test, and the code is, for lack of a better description, often held together with “duct tape” because many people work on separate parts, so a fix to a bug reported by a tester may well break something else. In practice, for many large-budget games, it is no longer possible for QA to find and resolve all the bugs in the time allotted by the project schedule. Since the game should be in a reasonably “playable” state by the time it reaches QA, if launch is imminent and the developer studio is being pressed to finish the game as soon as possible, there is a temptation to skimp on QA and rely instead on software patches after launch. Many QA play testers, who are often contractors rather than full-time employees of the developer, report low job satisfaction because, in addition to forced crunch sessions, the bugs they found in their good-faith effort to catch them before release are often deferred to post-launch or even outright ignored.
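The “fixing one bug breaks something else” dynamic arises because a large codebase encodes many implicit rules that any change can silently violate; the standard mitigation is an automated regression suite that pins down existing behavior. A minimal, hypothetical sketch in Python – the combat rule and its tests are invented for illustration and come from no real game:

```python
def apply_damage(health: int, damage: int, armor: int) -> int:
    """Hypothetical combat rule: armor absorbs damage point-for-point,
    and health can never drop below zero."""
    effective = max(0, damage - armor)
    return max(0, health - effective)

# Regression tests lock in the current behavior, so a later "fix"
# to one rule cannot silently change another.
def test_armor_absorbs_damage():
    assert apply_damage(100, 30, 10) == 80

def test_health_never_negative():
    assert apply_damage(5, 50, 0) == 0

def test_heavy_armor_blocks_all_damage():
    assert apply_damage(100, 10, 50) == 100

test_armor_absorbs_damage()
test_health_never_negative()
test_heavy_armor_blocks_all_damage()
```

In an AAA codebase the same idea scales to thousands of automated checks run on every build – precisely the kind of QA infrastructure that gets squeezed first when the schedule slips.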
Barring unforeseen leaps in software development, there is a growing concern that the current model of game development is unsustainable if game complexity continues to grow alongside technology; there may come a point where development simply cannot go any faster or become more efficient no matter how many resources are thrown at it. That said, other human-based factors contributing to crunch are relatively more solvable.
Crunch attributable to management
The aforementioned increased complexity of games has also caused development costs to spike accordingly. While specific figures for individual games are often not disclosed to the public, big-budget, “AAA (triple-A)” video games that are designed first and foremost for commercial returns now regularly cost hundreds of millions (US$) to make and involve hundreds if not thousands of personnel over multiple years of development. For reference, the following is a list of the most expensive video games by total production cost (adjusted for inflation) with estimated totals based on relevant financial statements and/or third party analysts:
With the exception of Cloud Imperium Games (Star Citizen started out as a crowdfunded game in 2012 and is still under development, its costs sustained in part by purchases of playable and conceptual in-game assets), the developers of the other four games were all connected to publicly listed publishers at the time of release: Rockstar Games is a publisher under Take-Two Interactive, while CD Projekt (the publisher entity for CD Projekt RED, the studio) and Activision Blizzard are listed on the Warsaw Stock Exchange and NASDAQ, respectively. It is also worth noting that the four launched games all recouped their costs within the first few weeks (if not days or hours), with Grand Theft Auto V in particular being the No.2 best-selling video game of all time with an estimated 150-160 million copies sold as of the end of 2021.
For most of these publicly listed video game publishers around the world, sales of their most well-known AAA games are still a key driver of their top and bottom lines as well as mindshare, despite the move toward adding in-game purchases and continuously released free/paid downloadable content for a single big-budget game. From the publisher’s perspective, it has a duty to its investors to take reasonable measures to manage and control the commercial risk in these costly ventures, but such measures do not always align with the game’s development process and tend to introduce a significant amount of extra work for the personnel to handle.
It starts at the top. One potential culprit for project mismanagement and unnecessary crunch is the lack of individuals with game development backgrounds in executive/management positions at developer studios and publishers. According to the information on the publishers’ respective websites, almost all executive officers at Take-Two, Electronic Arts, Activision Blizzard and Ubisoft hold an MBA or a law degree and previously worked in “management” or “leadership” capacities in other industries; in contrast, there are no executive officers at Take-Two, Electronic Arts or Activision Blizzard, and only two “employee representatives” at Ubisoft, who possess any significant experience in actual game development work. On one hand, the skill set needed for business administration at those large publishers is understandably different from that of the personnel in the trenches of the development process; on the other, the conspicuous lack of people at the decision-making level who have experience in game development arguably can cause, and has caused, tone-deaf decisions that needlessly complicate development. It is also at this level that one is more likely, though not always, to find defenders of crunch practices, who point to an apparent correlation between the commercial success of the most ambitious titles and the amount of development time spent, leading to rather odd development philosophies such as “BioWare Magic” at BioWare, a prominent Electronic Arts studio. Ex-BioWare producer Mark Darrah has criticized this philosophy as no more than a belief that things will start to “magically click” at some point in the future, so that development work can then intensify to get the game completed, solely because it has worked in the past. Pointedly, Darrah claimed that BioWare is hardly alone in this development philosophy.
One commonly observed instance of questionable decision-making can be called “trend-chasing”. Every couple of years a new type of gameplay or sub-genre emerges as the next “popular fad” among gamers, and publishers who wish to ride the wave will either rush out new titles featuring such gameplay or incorporate it into their current and upcoming titles. However, dysfunction often results if such gameplay is merely grafted onto current projects without first understanding whether it is a good fit with the core of the game, both audience-wise and project management-wise, and rushing a new game may require diverting personnel already engaged in ongoing projects. For example, the sub-genre of “battle royale” first or third-person shooter games burst onto the gaming scene in 2017, with the aforementioned Fortnite by Epic Games and another game called PlayerUnknown’s Battlegrounds (now known as PUBG: Battlegrounds, or simply PUBG) gaining millions of players in a few short months. Named after the 2000 Japanese cult classic film Battle Royale, the gameplay features a “last man standing” mode in which typically 60+ players (individually or in small teams of 2-3 players) are dropped randomly into a large area and must scavenge weapons to eliminate all other players/teams, the twist being that the playable area shrinks over time and automatically eliminates all players caught outside, thereby encouraging conflict and reducing hiding spots as time goes on. Many of the largest publishers, such as Electronic Arts, Activision Blizzard and Ubisoft, swiftly announced in early 2018 that they were looking to develop new battle royale games or add battle royale game modes to their popular shooter games.
Treyarch, an Activision Blizzard studio responsible for developing Call of Duty: Black Ops IV, the then-latest entry in Activision Blizzard’s Call of Duty franchise, allegedly underwent nine months of 70+ hour-per-week crunch to put together a battle royale game mode in time for the game’s October 2018 launch. Electronic Arts had also announced a battle royale mode for Battlefield V, the then-latest entry in its Call of Duty-competitor Battlefield franchise, but the mode was not ready at the time of Battlefield V’s release in November 2018 and launched months later in March 2019. The critical response to both game modes was generally lukewarm, and despite the popularity of both franchises, neither new entry saw a material rise in sales attributable to the battle royale mode: fans of either franchise were primarily buying them for their existing gameplay, and the new wave of battle royale players was unlikely to spend US$60 on either game just to play its battle royale mode when Fortnite was free and PUBG cost US$29.99 at the time.
A variant of this issue is “feature creep”. The definition of the term is simple: the addition of features during production to the extent that the original project scope is no longer accurate. Unlike trend-chasing, the features added do not necessarily reflect what is popular in the market at the time, and they may be an objective improvement to the final product once implemented. The problem is that feature creep typically leads to production chaos when features are added for the sake of standing out from competing games in the same genre, and those responsible for managing the project fail to properly account for the new additions – whether because the original project scope was already unclear (i.e., “we had no clear idea what our game should be”), because the new additions significantly complicated the programming/QA process, or because no compromise could be reached on which other features to drop or scale down – leading to the need to crunch. An archetypal example would be making a sequel to a hit game. The initial expectation from above is a vague but ambitious “everything will be bigger and better”, but as the design starts to take shape, more concrete suggestions may start to come in on what to add: what if the main character can do this? How about adding this new level/environment? What about this game mechanic, which none of our competitors is doing? Without some kind of control on the types of new features to implement, based on a relatively solid sense of how the sequel will improve on the original, or at the very least some mechanism to test whether a new feature is worth pursuing before committing project resources, the sequel itself, if completed, will likely be a mere collection of half-baked feature bloat that obliterates the strengths of the original.
Finally, marketing deserves a mention as one of the indirect causes of crunch. In today’s crowded market, a lack of visibility and/or word-of-mouth online will severely cripple a game’s future commercial performance and reputation, regardless of critical praise or budget committed. A common video game marketing strategy is thus to build up anticipation through teaser story leaks or concept art years in advance, often before development has even started, and to continue making such leaks from time to time to keep the discussion going among gamers. Marketing efforts and associated expenses start to accelerate one or two years before the projected launch date, typically through a multimedia blitz of “gameplay” videos, trailers, presentations and/or interviews at one of the large annual video game trade shows, such as the Electronic Entertainment Expo (commonly called “E3”) and the Tokyo Game Show, as well as commercials and advertisements on a variety of media platforms. This culminates in the reveal of a launch date and the opening of pre-orders for fans sufficiently convinced by what they see to put down their money months in advance – thereby achieving the publisher’s actual objective. The operative phrase is “convinced by what they see”: the so-called “gameplay” videos and trailers are often tailor-made solely for the marketing occasion to wow the audience and may not in any way reflect the actual state of the game at the time (producing them may even have taken developer time away from work on the game itself), but the onus is nevertheless placed on the developer to deliver a final product matching what consumers have been told by marketing. If a release date is announced to the public at the same time, the development team now has to race against the clock as well.
Examples of overpromising in marketing causing troubled development are numerous, yet, as mentioned earlier, none crashed as spectacularly as Cyberpunk 2077. CD Projekt RED revealed to the gaming media in 2012, eight years before the game’s eventual launch, that it was working on a cyberpunk action game. News of the game was scant over the next few years save for a few trailers and occasional confirmations that the project was still alive, but the studio was meanwhile gaining accolades for its other big-budget game, The Witcher 3, in 2015, so there was already a sense of high expectations for the next big thing from the studio. At E3 2018, CD Projekt RED put out a gameplay video that exploded in popularity on social media for its depiction of a futuristic urban environment and faithful replication of the hallmarks of the cyberpunk genre, all done with graphical prowess that rivaled or even exceeded the bleeding edge at the time. Hype reached a frenzy the next year at E3 2019, when a new trailer unexpectedly revealed popular Hollywood star Keanu Reeves’ appearance in the game – Reeves also made a surprise personal appearance at the show to promote it – along with the announcement of an April 16, 2020 release date. The E3 2019 trailer became one of the most watched game trailers of all time on YouTube, and pre-orders for the game numbered in the millions. CD Projekt RED announced in January 2020 that the April release date needed to be moved back to September. COVID-19 was blamed in June for a further move to November, and a final round of QA was cited in October for another delay to December. The game finally released on December 10.
The game contained an unexpectedly large number of software bugs on every platform it was released on, with issues ranging from gameplay bugs to poor technical performance, such as disappearing parts of the environment and corrupted game saves, that no competent QA process would have allowed into the final release. While professional reviews were mixed, with a few journalists directly criticizing the game as not ready for launch, gamers online savaged the game and CD Projekt RED, not only for failing to live up to the lofty vision pitched in the game’s marketing, but also for lying to the public at every turn about the state of the game while reneging on its promise not to crunch just so the game could make it out for the 2020 holiday season. The massive online discontent forced Sony and Microsoft to apologize and offer refunds to dissatisfied customers, with Sony making the unprecedented move of pulling Cyberpunk 2077 from its digital PlayStation store on December 18. Between public apologies from those in charge and further miscommunications about how the refund procedure would work, the parent publisher CD Projekt’s stock tanked by as much as 30%.
Anonymous interviews with those involved in the project showed how dire the situation actually was at the studio. Despite the 2012 announcement that project planning was underway, development did not start until early 2016, when most of the work on The Witcher 3 was done and personnel could switch over to Cyberpunk 2077. Almost everything shown in the E3 2018 gameplay video was created for the show; many of the game events and mechanics depicted in the video were not present in the final game. While gamers worldwide were cheering the E3 2019 trailer, the interviews revealed that the studio knew at the time there was no way the game could be completed by April 2020, but management was allegedly fine with it for no other reason than that the studio had managed to complete The Witcher 3 in time. Crunch of 12-hour work days, six or seven days a week, lasted essentially all the way to launch, and a conservative estimate was that the game needed at least another 18 months to two years just to resolve the software stability issues. Incidentally, this estimate has largely been borne out: as of the date of this article, Cyberpunk 2077 has had most of its serious bugs eliminated and can reasonably be deemed a decent gaming experience, but it will likely remain forever marred by thoughts of what it could have been.
Perhaps the one positive impact is that the name Cyberpunk 2077 has become synonymous with game development run amok, and current developer/publisher PR statements have begun to assure fans that they are aware of what happened to Cyberpunk 2077. But the jury is still out on whether management has truly started to wake up with respect to project management.
Crunch attributable to job insecurity and peer pressure
One of the key questions asked in the 2021 DSS was how long the employee respondent expects to stay with the video game company they currently work for: just under half responded 1-3 years (27%) or 4-6 years (20%), with the second most common single answer being “don’t know” at 21%. This corresponds strongly with the average duration of a game project, because talent tends to be hired on a project basis. In other words, an employee is hired for the fit of their skills to an ongoing game project rather than “for” the developer studio. But this also means that when the project is finished, the additional hires the studio made during development need to be laid off to save on “operating expenses”, regardless of the actual commercial performance of the finished game. Hence, while job opportunities in the video game industry have generally been relatively plentiful given the sheer number of games in development at any given time, job security has been a persistent issue, and it plays an important role in the acceptance of crunch by the employees themselves.
Much of the video game industry is built on the backs of mostly young (age 25-40: 63%, per the 2021 DSS) and male (61%) developers who have nurtured a passion for video games since childhood. Given the increasing presence of video games in each generation since the 1980s, there is unlikely to be a shortage of individuals interested in a career in video game development any time soon, so an opportunity to work in video game development, especially at one of the larger developer studios, is very competitive, and being hired for such a position is regarded as a sort of privilege.
Suppose an entry-level programmer is hired by Rockstar Games to work on the next Grand Theft Auto game and is assigned to the programming team implementing car physics. Car physics in a game involving as much driving as Grand Theft Auto constitute a significant part of the “feel” of the game, so a success here would be a valuable accomplishment to list on a résumé. Now suppose it is decided, based on internal QA feedback, that the car physics need to be reworked, and the team is asked to crunch to redo a large part of the car physics code. Our programmer would face a great deal of nonverbal pressure to keep their head down and crunch together with the team. There is no need for the higher-ups at the studio to expressly tell the team to crunch: the studio is counting on the aforementioned unspoken understanding that it is a “privilege” to work on Grand Theft Auto, and on the camaraderie the team members have developed with their colleagues, to get the job finished properly. If our programmer leaves when faced with an upcoming crunch, there would be nothing on the résumé to show for their next position, and they may go uncredited for their work on the new Grand Theft Auto (which is an actual policy at Rockstar). If our programmer decides to continue working but leaves earlier each day than their co-workers, more likely than not someone else on the team will have to cover their responsibilities; do this enough times and, even if overall progress is not affected, our programmer would almost certainly be regarded as “not a team player” with “no passion for the job” by colleagues and superiors, thereby becoming an easy target for the next round of layoffs.
This “peer pressure” element of crunch is corroborated by the 2019 DSS results: 30% of respondents checked “people doing it voluntarily” as one of the reasons for crunch, even though only 4% thought crunch is an important creative part of game development. In other words, a few people who believe crunch is beneficial to development, or who otherwise frequently work long hours, can cause many others to crunch, because no one wants to appear less passionate about working in video games than others, given job security concerns. Without any labor-side protection mechanisms, such as collectively bargained employment terms, against the threat of seemingly arbitrary layoffs, our hypothetical programmer is left with no feasible recourse except to “voluntarily” crunch for the sake of their career.
Crunch outside of AAA game development and larger developers/publishers
While the above analysis has focused on the incidence of crunch at larger studios working on AAA games, neither smaller, “indie” studios nor developers of free-to-play games are immune from crunch. With free-to-play games, revenue depends on in-game purchases by players, but most players will stop spending money or quit the game altogether once they believe they have seen everything the game has to offer. Unlike the old monetization model, where the consumer pays a one-time fee for content made and packaged in advance, developers of free-to-play games must introduce new content nonstop to keep as many players engaged and spending money for as long as possible. Consequently, the ability of a free-to-play game to keep its players engaged is in some cases a more important measure of the game’s success than the revenue amount, and the need to keep pumping out content can result in extended or even indefinite crunch. In the case of Fortnite, the crunch may be partially attributed to the unexpected popularity of the new battle royale mode, which was actually an experiment by Epic to revive the then-stagnating game. One Epic staff member revealed that the number of new incoming support tickets for the game reportedly increased more than a hundredfold, to about 3,000 per day. Epic realized it had to act quickly to overhaul its entire strategy for the game lest the trend die down as suddenly as it started, so new hires were trained in a hurry by already harried senior personnel, while those at the top fumbled around trying to figure out what could keep the players interested. As mentioned above, when game development falls into chaos and things change constantly due to indecision and/or the lack of a clear direction, it is not surprising that crunch results, especially when deadlines are measured in days and weeks instead of months or years.
Given the growing number of free-to-play games that are being developed with higher production values and marketed to players as such, the production issues those free-to-play game studios encounter may become more similar to those faced by AAA game developers as well.
As for smaller or indie developers working on smaller games, project management does not necessarily get simpler with a smaller-scale project. The limited resources and manpower available to a smaller studio can cause its staff to work long hours all the same, QA being a prime example. Even if the studio self-publishes its games and is thus free from the external pressure of a parent publisher to meet a certain deadline, there is still internal pressure to complete and release games on time to keep the cash flow alive. While Telltale Games was not a small studio (~300 employees) at the time of its closure in September 2018, it published its own games, and its games were at most “AA”-level games – games with noticeably smaller production budgets and scale than “AAA” games. The studio’s main claim to fame was The Walking Dead in 2012. Developed by a team of fewer than 90 employees at the time and featuring innovative storytelling mechanics, the game was a critical and commercial success that subsequently attracted well-known media franchises such as Game of Thrones, Batman, and Marvel’s Guardians of the Galaxy to contract Telltale to develop story-based games in their universes. However, while Telltale was cranking out those games at a blistering pace, players were starting to notice that the new games were not only technologically lagging, as they were built on the same software engine used by The Walking Dead, but also falling behind competing games in storytelling. The decline in sales from the perceived lack of innovation put the studio, at nearly 400 employees by 2017, in financial difficulties; the final straw came in September 2018, when negotiations for another round of investor funding ended abruptly as two prospective investors coincidentally pulled out on the same day.
Telltale’s tale (pun intended) is illustrative because it is an example of self-inflicted pressure to crunch. It is not clear why Telltale chose and continued to follow a business model of taking on as many different projects as possible at the same time. One reasonable guess is that Telltale’s management concluded that as long as the studio kept releasing games, other franchises and investors would keep coming as well. But the rapid personnel expansion and heavy crunch needed to pump out those games affected their quality: Telltale only managed to release eight games in the three years from 2014 to early 2017 by essentially making the same The Walking Dead game with different characters and plots. For story-heavy games in particular, what was once innovative quickly becomes unpalatable when the audience can see the same tropes coming due to lazy writing, and the investors apparently saw the writing on the wall before Telltale did.
Since no developer is immune from crunch, and many of the causes of crunch seem extremely entrenched in the development process, how should this problem even be tackled? There have been attempts to provide both external and internal solutions:
Proposed solutions: Unionization
In the US, the Screen Actors Guild (SAG, now SAG-AFTRA after its 2012 merger with the American Federation of Television and Radio Artists) was formed in 1933 to provide collective bargaining power against Hollywood movie studios, which had been binding actors to long-term contracts with labor terms that would be deemed unconscionable today, such as no maximum work hours, near-total studio control over the public and private life of the actor, and no right for the actor to terminate the contract. Since then, almost every corner of the creative industries has developed some form of labor union – with one giant exception: video games.
The causes behind the disinterest in forming a labor union in the (US) video game industry are complicated and difficult to pin down. One explanation is that the video game industry still takes more of its cues from the technology industry than from the creative industries, and Silicon Valley has traditionally been very much anti-union. The tech industry’s aversion to unions is often attributed to the perception that its employees are sophisticated, very well compensated and able to easily find another job, and thus do not need someone else to bargain labor terms on their behalf; on the employer side, a union represents the “old” model of labor vs. management prevalent in blue-collar industries, one that hampers a tech company’s ability to “stay agile” in the ever-changing tech industry. Similarly, as mentioned, employees in the video game industry are conditioned to believe that they are in a position of privilege – you chose to work in video games because you are passionate about them, you don’t need anyone else looking out for your interests, and there are thousands of others who would love to be in your position. Other factors that have hampered unionization include the wide disparity between the most and least well-paid employees in the industry – a low-level QA tester making US$15,000 per year will have a difficult time finding common ground with a lead programmer making more than ten times that amount – and the fact that employers are far more numerous and diverse than in other industries with a union presence, which places a heavy burden on any union that would represent them.
So for unionizing to take shape, there must be a drive to recognize common issues that a union could improve. The aforementioned sudden closure of Telltale Games and the “100-hour week” comment from Dan Houser of Rockstar Games, which occurred within about a month of each other in late 2018, brought to the forefront the discussion over the need for union protection against sudden layoffs and excessive crunch. While the challenges of working during the pandemic kept the conversation going, the most recent catalysts for unionizing were the high-profile scandals at Ubisoft and Activision Blizzard in mid-2020 and late 2021, which revealed rife discrimination, sexual harassment and abuse among those in power at those publishers. This recent wave of support for unions is readily observable in the data. The 2019 DSS was the first time survey respondents were asked about unionization: about 59% would vote yes for a national union covering all developers, compared to 48% for unions at the individual workplace level. Those numbers increased to 78% and 58% respectively in the 2021 DSS.
But the current grand total of certified video game unions in North America is two, both of which are attributable to the efforts of a grassroots organization named Game Workers Unite (GWU). Starting off as a Facebook group for pro-union video game developers, GWU first made waves at the 2018 Game Developers Conference (GDC) in San Francisco by showing up en masse at a panel on whether unions are appropriate for the industry and engaging in a contentious debate with the panelists. The group subsequently received a great deal of positive feedback, and its membership swelled as a result. Since GWU is not a union itself, its primary activity is supporting groups organizing within a studio or connecting them to existing labor unions. GWU has been working with the Communications Workers of America (CWA), the largest telecommunications union in the US, which launched the Campaign to Organize Digital Employees (CODE-CWA) in 2020, dedicated to helping employees in the technology and video game industries unionize. After assisting in the establishment of a union at a small indie studio named Vodeo Games, CODE-CWA has been supporting nascent union organizing at Activision Blizzard since late 2021, including an official request for recognition as a union by 21 QA testers at Raven Software, a subsidiary studio of Activision Blizzard, in January this year. Despite Activision Blizzard refusing to voluntarily recognize the Raven Software QA testers as a union (and even apparently intentionally excluding them from an announced conversion of more than 1,000 QA testers to full-time employees), the US National Labor Relations Board authorized the group to conduct a union election on April 22, and the group voted 19-3 in favor of forming a union on May 23.
Unions are not a panacea for all the problems of the video game industry. A union will not be able to reshape the enormous workload that comes with game development today, nor will it save a studio from making questionable decisions in its game development strategy. Furthermore, due to the lack of actual cases, there is no track record to reassure employees that their interests will be competently represented by a union; the prevailing positive opinion among employees in the video game industry appears to be about the “concept” of a union rather than any union in practice, so substantial skepticism and lack of trust are likely to persist for the time being. On the other hand, union proponents would point to SAG-AFTRA as an example of how long it took for a union to gain sufficient influence in the film industry, and how effective a labor union can still be in an industry that is likewise predominantly staffed by people with “passion” for the work. In any case, unionization has the potential to materially shake up labor relations in the video game industry, so its current state of affairs should prove of great interest to all who follow the industry.
Proposed solutions: Alternative work schedules and co-ops
The current awareness of work environment issues in gaming has been driving smaller developers to experiment with alternative work schedules or mechanisms to improve the quality of life of their personnel. One proposed solution to reduce crunch is the four-day work week, which aims to improve employee productivity and quality of life while allowing the employer to cut costs. While the concept has existed for decades, interest had been minimal until the pandemic-driven need for flexibility in work hours prompted people to take another look, with countries such as Spain, New Zealand and Japan touting four-day week trial programs. There are many possible variations: a 4×10 work week, which compacts the original 5×8 work week into four days; a straight reduction to a 4×8 work week with reduced pay, or the same reduction in hours without reduced pay; the extra day off set on Friday or Wednesday, or left to the employee’s choice; and so on. In the video game industry, five indie studios shared their experiences with a four-day work week at a panel session at the 2022 GDC, and the general reception of the change was very positive, with all studios reporting better morale among employees. None were sure from the start that the change would work well – there were concerns that people might feel pressured to get five days’ worth of work done in four, and questions about how to stay in sync with the rest of the world still on a five-day week – but once employees became acclimated to the shorter work week, productivity and morale started to improve. Some larger studios have also been piloting four-day work weeks: the Japanese studios/publishers Bandai Namco Mobile and Game Freak (developer of the Pokémon games) announced four-day work week trial programs in March and April of 2022, respectively.
Another developer, Eidos, has been running four-day work weeks at its Montreal offices since October 2021 in an effort to retain personnel amidst the pandemic, with reportedly encouraging results.
Another alternative system looking to reduce labor-related conflict is to run the studio as a co-operative instead of a standard employer-employee company. The primary benefit of employees owning equal stakes in the studio is autonomy and independence in how the studio’s business is run: with everyone informed of and participating in the studio’s business decisions, it is unlikely for decisions to look doomed from the start, or to turn out badly because they lacked the perspective of the employees doing the actual work. Another claimed advantage is flexibility of structure: co-ops can still be hierarchical with a leadership body, or completely flat; there may be equal pay for all employees, or not. According to studios like Motion Twin, a France-based co-op studio since 2001, and KO_OP, a Canada-based co-op studio since 2012, the key mechanisms of running a co-op are implementing a robust accountability system and ensuring the transparency of all decision-making. However, these are also some of a co-op’s largest challenges: how to reach decisions when everyone has a voice, how to respond when members fail to “pull their own weight”, and how to handle dissent – all of which become more difficult once the co-op grows beyond a certain size. It is also worth noting that, at least compared to Europe and elsewhere in the world, the appellation “worker-owned co-operative” tends to conjure up socialist/communist imagery that is shunned by a significant portion of the US population, so developers in the US may not even be aware that it is a potential option. Even though large studios with hundreds of people making AAA games may never be run like a co-op, a number of indie studios that chose this path are enjoying remarkably long and stable lives, and their message to other video game studios is that this is an experiment worth trying if they are at all concerned about the welfare of their employees.
It may seem that finding out how the things one enjoys in life are made has become an ever more disheartening experience in today’s world. In addition to efforts within the industry to correct course, public sentiment will also be needed in the push for more equitable labor practices, as it has so often in the past. While it may be too much to ask for gamers worldwide to unite in boycotting the companies that normalize crunch, ordinary people can still send a message to the market that overwork should not be a badge of honor in the video game industry, and that delaying a game is infinitely preferable to pushing it out the door under crunch. It is hoped that this effort for a better workplace can revitalize developers to take video games to the next level of innovation.
(The authors’ opinions do not represent the position of this law firm.)