Video games are a profitable industry (Newzoo, 2019) for big companies like EA (Electronic Arts' report from October 2019 shows $3.8 billion in revenue: http://bit.ly/2LA7us1) and for indie developers brave enough to face a competitive endeavor. (Indie developers are hard to define precisely; generally they are small, self-funded companies that create games on a small scale. Out of scope of this paper but related is the essay: http://gamestudies.org/1601/articles/gardagrabarczyk. There are many examples of successful indie games; one of them is “Stardew Valley”, written by a single developer over four years, which made him a millionaire one day after launch (Schreier, 2017).) Successful games are plenty, although they usually come at a price. Indeed, video-game development is known for its management problems (Petrillo et al., 2009; Washburn et al., 2016), which translate into large numbers of issues in the final products. For example, 80% of the top 50 games on Steam (a digital video-game distribution platform, available at https://store.steampowered.com/) need critical updates (Lin et al., 2017).
Video games also form a competitive market in which knowledge is the main weapon against competitors. The lack of information about the process, the techniques, and the game engines used for games are but a few examples of how hard it is to understand video-game development. This lack of information also prevents new developers (and even veterans) from avoiding common problems by learning from previous mistakes.
One public source of information about video-game development is postmortems. Video-game postmortems are documents that summarize the developers’ experiences with the game development, often written right after the game is launched (Washburn et al., 2016). These documents are usually written by managers or senior developers (Callele et al., 2005). They often include sections about “what went right” and “what went wrong” during the game development.
“What went right” discusses the best practices adopted by developers, solutions, improvements, and project-management decisions that helped the project.
“What went wrong” discusses difficulties, pitfalls, and mistakes experienced by the development team in the project, both technical and managerial.
Thus, postmortems offer an “open and honest window into the development of games” (Washburn et al., 2016). However, these documents are written without formal structure and often provide disparate information, in particular with respect to the software engineering of the games. Consequently, complete, trustworthy information about game development is hard to find, which limits the number of studies about games from the point of view of software engineering. Yet, we believe that most software-engineering problems could be mitigated if developers had information beforehand.
In this paper, we propose a curated dataset describing software-engineering problems in video-game development. We compiled a large set of information about problems in game development related to software engineering: we analyzed 200 postmortems from the Gamasutra Website (the main and most complete source of video-game postmortems, http://www.gamasutra.com, a descendant of the Game Developer Magazine, which ended in 2013) and collected 1,035 problems. We categorized these problems into 20 different problem types and provide a structure to store this information. The dataset is available at: https://github.com/game-dev-database/postmortem-problems.
The paper is structured as follows. Section 2 discusses papers that used video-game postmortems as a knowledge base. Section 3 describes how the data was gathered. Section 4 presents how the metadata is organized. Section 5 presents an analysis of the dataset. Section 6 concludes the paper with future work.
2. Game Development Problems
Callele et al. (2005) analyzed 50 postmortems from the Game Developer Magazine and investigated how requirements engineering is applied in game development. They grouped the “What went right” and “What went wrong” items into five categories: (1) pre-production, problems outside of the traditional software development process; (2) internal, those related to project management and personnel; (3) external, those outside of the development team’s control; (4) technology, those related to the creation or adoption of new technologies; and, (5) schedule, those related to time estimates and overruns. They concluded that project management is the greatest contributor to the success or failure of video-game development.
Petrillo et al. (2009) analyzed 20 postmortems from the Gamasutra Website, searching for the most common problems, which they compared with traditional software problems. They concluded that (1) video-game development suffers mostly from management problems rather than technical problems; (2) the problems found in video-game development are also found in traditional software development; and, (3) the most common problems are related to scope, feature creep, and cutting features.
Kanode and Haddad (2009) used postmortems to discuss the challenges of traditional software engineering in video-game development. They reported differences between game development and traditional development and concluded that video-game development must adopt and adapt software-engineering practices.
Lewis and Whitehead (2011) used two previous papers (Blow, 2004; Tschang, 2005) to identify problems in games and whether/why these could be of interest to the software-engineering research community. They highlighted some areas to explore further and differences between games and traditional software.
Washburn et al. (2016) analyzed 155 postmortems, identified some characteristics and pitfalls, and suggested good practices. They reported the following main problems: obstacles (37%), schedule (25%), development process (24%), and game design (22%).
Politowski et al. (2016) extracted development processes from 20 postmortems. They concluded that the majority of the game industry uses agile instead of the waterfall method. In their next work, Politowski et al. (2018) used 55 postmortems to create a recommendation system for new video-game projects using previous mistakes and good practices.
These previous papers used postmortems to analyze video-game development, in particular Petrillo et al. (2009) and Washburn et al. (2016). We extend these papers with a more methodical approach and a study of a larger number of postmortems. We study postmortems from the point of view of software engineering using an iterative approach, which we summarize in the following section.
3. Method of Analysis
We designed the process of analysis to be iterative: the data evolves constantly as we add to and refactor it in each new iteration. Figure 1 shows the steps performed to analyze the data from the postmortems.
We start by randomly picking 20 postmortems from the Gamasutra Website, published between 1998 and 2018. Next, each author reads the postmortems, focusing on the “What went wrong” section and using a coding technique from Grounded Theory (Stol et al., 2016). Each author identifies all the problems reported in the postmortems, extracting quotes and grouping similar problems based on definitions from the previous literature (Petrillo et al., 2009; Washburn et al., 2016). These definitions were used to create a catalog of problems, which contains problem types, problem groups, and descriptions. We add each new problem type absent from the catalog, with its description.
In the next step, we discuss the findings, review any doubts, and analyze the correctness of the catalog. Any change at this point, to the catalog or to the list of problems, results in an update of both documents. We repeat this process until it reaches saturation, i.e., until no new types of problems appear.
To keep the distribution of the postmortems balanced by year, we eventually chose specific postmortems to analyze instead of picking them randomly. Also, not all postmortems are relevant for this study, as they might not contain useful information about the development process. When that happens, we discard the postmortem.
4. Metadata of the Dataset
The catalog of problems is a document listing the possible problems we could find while reading the postmortems. We started building it using problem types gathered from the literature, and we updated it during the analysis of the postmortems.
The catalog is updated for every new problem type we discover. The authors review each new problem type and reach a conclusion. The final version of the document contains 20 different problem types, covering 1,035 problems, divided into three groups.
Table 1 shows the final version of the catalog, where each Problem Type identifies the problem, has a short Description, and belongs to one Problem Group.
| Group | Type | Description of the problem |
| --- | --- | --- |
| Production | Design | Any problem regarding the design of the game, like balancing the gameplay; not a technical detail. |
| Production | Documentation | Not planning the game beforehand; not documenting the code, artifacts, or game plan. |
| Production | Tools | Any problem with tools like engines, APIs, development kits, third-party software, etc. |
| Production | Technical | Problems with the team’s code/assets infrastructure. |
| Production | Testing | Any problem regarding testing. |
| Production | Bugs | Too many bugs in the game/engine, whether failures in the game design or technical issues. |
| Production | Prototyping | Lack of a prototyping phase or of validation of the gameplay/features. |
| Management | Unrealistic Scope | Planning too many features that end up impossible to implement in a reasonable time. |
| Management | Feature Creep | Adding unplanned new features to the game during its implementation. |
| Management | Cutting Features | Cutting previously planned features because of other factors, like short deadlines. |
| Management | Delays | Problems regarding any delay in the production. |
| Management | Crunch Time | When developers continuously spend extra hours working on the project. |
| Management | Communication | Problems regarding communication with any stakeholder. |
| Management | Team | Problems in setting up the team, loss of professionals during the development, or outsourcing. |
| Management | Over Budget | The project cost more money than expected. |
| Management | Multiple Projects | When there is more than one project being developed at the same time. |
| Management | Planning | Problems involving too much time planning/scheduling, or the lack of it. |
| Management | Security | Problems regarding leaked assets. |
| Business | Marketing | Problems regarding marketing/advertising. |
| Business | Monetization | Problems with the process used to generate revenue from the game. |
To store the problems gathered from the postmortems, we defined a data model, which Figure 2 shows as a UML class diagram. Each Postmortem describes a Game, which has a collection of problems. Each Problem has a Type and a Group. Also, each Game must have: a Platform [1-3] (PC, Console, Mobile), a Genre [1-12] (Action, Adventure, RPG, Simulation, Strategy, Puzzle, Sports, Platformer, Shooter, Racing, Roguelike, Running), and a Mode [1-3] (Single-player, Multi-player, Online).
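Following this data model, a single dataset entry might look like the sketch below. The field names and nesting are illustrative, not the dataset’s exact schema; the real structure is defined in the repository’s JSON Schema.

```json
{
  "postmortem": {
    "title": "Baldur's Gate II – The Anatomy of a Sequel",
    "year": 2001,
    "game": {
      "name": "Baldur's Gate II",
      "platforms": ["PC"],
      "genres": ["RPG"],
      "modes": ["Single-player", "Multi-player"],
      "problems": [
        {
          "group": "Production",
          "type": "Testing",
          "quote": "Because of its immense size, Baldur's Gate II was a testers' nightmare..."
        }
      ]
    }
  }
}
```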
Table 2 shows an example of the dataset structure. In this case, the game “Baldur’s Gate II”, from 2001, was analyzed, and the entry shows a problem regarding “Testing”, which belongs to the “Production” group. The quote is an unmodified excerpt from the postmortem that describes the problem.
We began storing the data in JSON format (https://www.json.org/json-en.html), following the structure described in Figure 2, and validated it using JSON Schema (https://json-schema.org/). Later, we migrated the data to a table format (.CSV), where each entry (line) refers to one problem. This makes the data easy to analyze with scripts, especially in the R language.
| Title | Baldur’s Gate II – The Anatomy of a Sequel |
| Name | Baldur’s Gate II |
| Quote | Because of its immense size, Baldur’s Gate II was a testers’ nightmare - this was compounded by the fact that we didn’t do enough testing as areas were being developed. Baldur’s Gate II contains roughly 290 distinct quests - some of these are very small (20 minutes long) while others are quite large (a couple hours in length). Each quest needed to be tested both in single player and multiplayer modes. During testing we adopted a very sound task and bug tracking method introduced to us by Feargus Urquhart, the Director of Black Isle Studios and Chris Parker and Doug Avery, our Black Isle producers (all of whom helped the project in many different ways). We put a number of white-boards in the halls of the testing and design area and listed all of the quests on the boards. We then put an X next to each quest. We broke the designers and QA teams into paired subgroups - each pair (one tester and one designer) had the responsibility of thoroughly checking and fixing each quest. After they were certain the quest was bulletproof, its X was removed. It took about 2 weeks to clear the board (on the first pass). |
5. Dataset analysis
The dataset contains 200 video-game projects ranging from 1998 to 2018. There are, in total, 1,035 problems with their respective quotes. The median number of problems per title (video game) is 5 and the median number of problems per year is 48. Figure 3 shows the problems by group: management and production problems are the most common, with around 45% each, while business issues sum up to 10%.
Figure 4 shows the distribution of the problems by type. Design, Technical, and Team problems are the most frequent, with 35% overall. Problems with Planning, Tools, and Marketing come next with 23%; Communication, Testing, and Feature-Creep problems account for 15%; Cutting-Features, Bugs, and Scope for 12%; Delays, Crunch-Time, and Monetization for 9%; Documentation, Budget, Prototyping, and Multiple-Projects for 8%. Only two problems are related to Security.
Regarding platforms, the projects mainly target PC, with 787 problems, followed by Console with 475 problems and Mobile with 254 problems. 90 problems are related to multi-platform projects, i.e., PC, Console, and Mobile.
The dataset is hosted in an open repository on GitHub. This allows anyone to access it and contribute through pull requests. We chose this approach so that contributions can be reviewed before being accepted. Contributors can also extend the catalog of problems and other metadata, like game platform, game genre, and game mode, allowing the dataset to keep evolving over time. Lastly, this dataset covers only postmortem problems, that is, things that “went wrong”. We plan to also include the good practices gathered from the “what went right” section of postmortems in the future.
We presented a database of video-game development problems gathered from 200 postmortems. The database contains 1,035 problems categorized into 20 different types across three groups. It fills a gap between video-game development and software-engineering research by providing larger and more trustworthy information about video-game development problems. It thus provides a starting point for researchers and practitioners to understand and propose solutions to video-game development problems.
Future work includes: (1) keeping the database updated as more postmortems appear; (2) including other types of sources such as technical blog posts and conference presentations, for example from the Game Developers Conference (GDC); (3) expanding the database by using the “What went right” section to gather good practices instead of problems; and (4) making a larger comparison with the common problems in traditional software development.
The authors have been partly supported by the NSERC Discovery Grant program and Canada Research Chairs program.
- Blow (2004) Jonathan Blow. 2004. Game Development Harder Than You Think. Queue 1, 10 (Feb. 2004), 28. https://doi.org/10.1145/971564.971590
- Callele et al. (2005) D. Callele, E. Neufeld, and K. Schneider. 2005. Requirements engineering and the creative process in the video game industry. In 13th IEEE International Conference on Requirements Engineering (RE’05). IEEE, 240–250. https://doi.org/10.1109/re.2005.58
- Kanode and Haddad (2009) Christopher M. Kanode and Hisham M. Haddad. 2009. Software engineering challenges in game development. In 2009 Sixth International Conference on Information Technology: New Generations (ITNG 2009). 260–265. https://doi.org/10.1109/itng.2009.74
- Lewis and Whitehead (2011) Chris Lewis and Jim Whitehead. 2011. The whats and the whys of games and software engineering. In Proceedings of the 1st International Workshop on Games and Software Engineering (GAS ’11). ACM Press, 1–4. https://doi.org/10.1145/1984674.1984676
- Lin et al. (2017) Dayi Lin, Cor Paul Bezemer, and Ahmed E. Hassan. 2017. Studying the urgent updates of popular games on the Steam platform. Empirical Software Engineering 22, 4 (2017), 2095–2126. https://doi.org/10.1007/s10664-016-9480-2
- Newzoo (2019) Newzoo. 2019. 2019 Global Games Market Report. https://newzoo.com/insights/trend-reports/newzoo-global-games-market-report-2019-light-version/. [Online; accessed 1-October-2019].
- Petrillo et al. (2009) Fábio Petrillo, Marcelo Pimenta, Francisco Trindade, and Carlos Dietrich. 2009. What went wrong? a survey of problems in game development. Computers in Entertainment 7, 1 (Feb. 2009), 1. https://doi.org/10.1145/1486508.1486521
- Politowski et al. (2016) Cristiano Politowski, Lisandra Fontoura, Fabio Petrillo, and Yann-Gaël Guéhéneuc. 2016. Are the Old Days Gone?: A Survey on Actual Software Engineering Processes in Video Game Industry. In Proceedings of the 5th International Workshop on Games and Software Engineering (GAS ’16). ACM, 22–28. https://doi.org/10.1145/2896958.2896960
- Politowski et al. (2018) Cristiano Politowski, Lisandra M. Fontoura, Fabio Petrillo, and Yann Gaël Guéhéneuc. 2018. Learning from the past: A process recommendation system for video game projects using postmortems experiences. Information and Software Technology 100, March (2018), 103–118. https://doi.org/10.1016/j.infsof.2018.04.003
- Schreier (2017) Jason Schreier. 2017. Blood, Sweat, and Pixels: The Triumphant, Turbulent Stories Behind How Video Games Are Made. HarperCollins.
- Stol et al. (2016) K. Stol, P. Ralph, and B. Fitzgerald. 2016. Grounded Theory in Software Engineering Research: A Critical Review and Guidelines. In 2016 IEEE/ACM 38th International Conference on Software Engineering (ICSE). 120–131. https://doi.org/10.1145/2884781.2884833
- Tschang (2005) F Ted Tschang. 2005. Videogames as interactive experiential products and their manner of development. International Journal of Innovation Management 9, 01 (2005), 103–131.
- Washburn et al. (2016) Michael Washburn, Pavithra Sathiyanarayanan, Meiyappan Nagappan, Thomas Zimmermann, and Christian Bird. 2016. What Went Right and What Went Wrong: An Analysis of 155 Postmortems from Game Development. In Proceedings of the 38th International Conference on Software Engineering Companion (ICSE ’16, SEIP Track). https://doi.org/10.1145/2889160.2889253