Being deaf, I’ve often made jokes about bad subtitles that are too small to read. Just in the past few years we’ve seen examples of games stumbling when it comes to implementing proper subtitles, such as Wolfenstein II: The New Colossus with its tiny subtitles that blend into the background, or the walls of text in Fallout 4. But deep down, hidden by my humor, is a part of me that feels devastated that a game I’ve been really excited to play is going to be awkward to enjoy properly. Maybe I’m unable to see the tiny text from the couch, maybe that text blends into the darkness, or maybe there are no subtitles at all.
The point is, being deaf, I’m physically unable to follow instructions or a story that everyone else can follow easily. And it isn’t just me: there are other gamers out there who struggle just as much, with a range of reasons for wanting decent subtitles in video games. Yet subtitles are still a feature that seems neglected in video games, leading not only to upset consumers but even to lost sales.
Since the news broke about Activision’s statement regarding the lack of subtitles in Toys for Bob’s Spyro Reignited Trilogy, I’ve felt obliged to keep the conversation going, largely due to the public’s reaction, but also due to my own interest in the topic. The day after that statement was made, I looked at why the backlash that Activision received was an important discussion, with several players speaking out about why subtitles are important to them.
Scattered among the replies from players on our tweet were people with experience in subtitle creation joining in with the discussion. Having read their comments, I started to feel that creating subtitles could be a quick and simple process, which would not only give both Activision and Toys for Bob a bad image, but make them come across as insensitive and lazy.
I wanted to find out how difficult subtitle creation truly is, both because I’m genuinely interested in learning about the process and because I want to share with readers what it entails during the development of a video game. With Spyro Reignited Trilogy’s lack of subtitles in its cutscenes as the main point of discussion, I consulted developers at companies of various sizes to find out how the process works across the industry.
These conversations began with Todd Colby, who originally voiced his opinions on Activision’s statement through Twitter in response to my news post on Spyro Reignited Trilogy. Colby previously worked as a UI artist at Telltale Games and 343 Industries and agreed to answer some questions I had. I started by asking him how subtitles were discussed among his teams and what the process was behind creating them while at both Telltale Games and 343 Industries:
“Building a subtitle system is a cross-team collaboration typically managed by a team manager/producer. From a UI/UX perspective, a developer is designing the space where the subtitles will appear, the font size, the color of the font, etc. Some basic questions need to be answered before production: is that text rich and editable on the backend? Is it baked-in, rasterized, or in-world 3-D text? Can text presentation be customized? Can we get the license to the font? How much screen space can subtitles take up? Will they support other languages? How much contrast is there between the text and the game?
Once high-level design questions are answered, designers work to gather the resources required to build the feature by cross-collaborating with the engineering team, audio team, narrative team, and test [sic]. This process should take about 5-8 business days. However, that process is the best-case scenario in a constantly shifting production timeline.
On top of that you have people taking vacation, people having babies, people leaving the company, people getting sick…all the hard life stuff that deprioritizes work responsibilities and can cause this feature to slip. Every feature in a video game is a hot potato being tossed back and forth—and sometimes someone isn’t there to catch it. To reiterate: making video games is HARD — like ludicrously difficult, like Dark Souls hard; like Game Development is the Dark Souls of Software Development.
You work long hours, get paid far less than other software development jobs (and if you’re living in a metro area you’re barely scraping by), everyone wants your job (and will do it for less), you have to navigate tribal knowledge/ugly politics/culture across the whole studio, and so many more difficult aspects. All the pitfalls of game development aside, it is indescribable how rewarding it is to see a game you have developed touch someone so deeply and so profoundly. It’s sad when people get excluded from those same experiences because of an accessibility blunder.”
Clearly video games are hard to make, and there’s no doubt about that if you understand the process behind the entire development timeline. Yet, it’s still common to see people online shouting at developers to do things better or add something that they believe will be a simple task for the developers.
It’s easy to forget that there are people behind your favorite title with real-world problems, and at the end of the day, these people are working to deliver a product to the public. It feels great to read that Colby had an experience in which the team was very vocal about how subtitles were designed.
In one of his tweets, Colby stated that it took him 3–5 days to implement subtitles into a game he was working on. I wanted to know how subtitle creation time scales with the size of a game, to which Colby added:
“It depends on the title. For example, Telltale Games are heavily rooted in dialog systems and rely on dialog as an essential part of their gameplay. So dialog was ever changing, but we had already created systems to quickly implement subtitles and dialog as we worked on an extremely tight deadline, sometimes releasing multiple episodes during a month. At the end of the day, it really depends on the tools you’re using to implement subtitles.
While some studios are moving over to more open toolsets, most studios still largely use tools built internally. Adobe products, Maya, and the rogue’s gallery of software suites and middleware are also present and plugged into the process where appropriate. Working in tools like Unity and Unreal (which Spyro is being developed in) is a bit of a luxury because there are endless resources and communities to help build a subtitle system, or those systems come pre-built into the toolset.
If you’re on a small team, subtitles might be handled by one person who is learning ALL the disciplines listed above to make it work (try wearing an artist/engineer/narrative/audio designer hat and not having a sore back). Suddenly, a bug appears that crashes or breaks the UI and blocks the player’s progression. Said person must deprioritize their workflow on that feature and rally around fixing that game-killing bug. In fact, unless it’s specifically required for certification, indie teams typically must deprioritize accessibility features and patch them in later.”
So far, subtitle implementation seems to be a fairly straightforward process, especially given that popular engines such as Unity and Unreal not only have built-in tools, but also a vast amount of resources available. Of course, as Colby states, smaller teams do need to focus their attention on game-breaking bugs; however, patching in accessibility features at a later date comes at the risk of excluding players.
Colby’s tweets mentioned that subtitles are a “P3 task” that can sometimes get chucked aside for a “P1 task.” He mentioned that, if this was the case, the Activision situation could have been down to bad management. For more clarification, I asked him to explain what these terms mean, giving us a better understanding of some behind-the-scenes jargon and where tasks like subtitles fall within it:
“‘P’ stands for priority in task tracking tools monitored by managers, PMs, directors, etc. Studios prioritize features differently, but most of them are Priority 3 (lowest) to Priority 0 (highest.) Studios have different goals in terms of accessibility. I have worked on teams where devs have laughed in my face at the suggestion of a colorblind mode, while others have taken it seriously and implemented it as a feature.
Subtitles are one of those problems that can’t be solved until the game is almost finished. The writing is done, and the sounds are mixed — which unfortunately falls around a time of crunch and heavy feature cutting. When developers are trying to ship a product, there are features or bugs that fall into much higher categories of priority, so devs must switch gears based on priority levels. On Spyro, I’m going to take a wild guess and assume subtitles were a low-pri task that got cut to divert resources to more game-stopping bugs or demanded features.
In this case, I think Activision’s PR really fumbled by brazenly saying subtitles are not a requirement. They are essentially saying Activision doesn’t care about accessibility, which is bonkers to say publicly for any product. It’s a great way to make people upset. I deeply empathize with the frustration when something like this happens, but I must stress: developers are not lazy. Protest and passion are encouraged. It tells developers to prioritize features for a future update. If you call someone running a marathon lazy, they likely will not listen. When you’re loud, be kind.”
Colby made a strong point saying that developers are not lazy. While the situation with Spyro Reignited Trilogy and the straightforward subtitle implementation process could easily lead people to call developers “lazy” because an important feature wasn’t implemented, developers aren’t sitting around and twiddling their thumbs. It makes sense to learn that the playability of the game was rated a higher priority than subtitles, especially if subtitles can be implemented fairly quickly later on.
The trouble with Activision’s statement is that it was worded in a way that makes it feel like subtitles weren’t worth the extra effort and that implementing them would compromise staying true to the original games. It not only sends out a message that subtitles must be hard to implement, but the comment about subtitles not being industry standard could see other developers using the same excuse to avoid extra work.
My conversation with Colby ended with me asking him if he’s ever worked on a video game that didn’t require subtitles, bearing in mind that he worked with developers who pushed out dialogue-heavy games. His response made it clear that subtitles are usually required by most developers and publishers, so Activision’s “not an industry standard” comment seems to be an exception, not the rule:
“Every product I have worked on has required subtitles for certification to put on a platform (XBLA, PSN, Steam) by the publisher, or the studio has mandated that accessibility is a top priority (it always should be.) Think of the Nintendo Seal of Approval (if you can remember those stickers.) The Nintendo seal of quality was not just a sticker on the box; it meant the game went through rigorous requirements to get certified and ship. Those certifications are based on internal checklists that are required for usability, accessibility, etc.”
All in all, Colby’s answers led me to believe that subtitles are usually talked about amongst development teams nowadays, and that in bigger companies various teams must work together to pull them off. While my time editing in Premiere Pro and using YouTube’s subtitle feature painted their implementation as a pure timecoding exercise to me, in game development the process is actually much more in-depth, albeit somewhat straightforward, depending on factors like team size and the game engine being used.
To help me further understand how game engines work in relation to creating subtitles on a more indie level, I briefly spoke with David Jimenez, lead game designer and co-founder at 2Awesome Studio, about the process he uses to create subtitles:
“We used a custom coded setup on Unity. Text source is in an Excel file that we export to separate CSVs per language (easy for localization process). CSV files are then imported and read into Unity by our custom components. Biggest problem is always dealing with different font and font size for each language.”
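Jimenez’s actual components aren’t public, but the pipeline he describes — a master spreadsheet exported to one CSV per language, each read into a lookup table at import time — can be sketched in a few lines. Here is a minimal, engine-agnostic Python sketch; the two-column key/text layout, the keys, and the strings are all assumptions for illustration:

```python
import csv
import io

def load_strings(csv_text):
    """Parse one per-language CSV export (key,text rows) into a lookup table.
    The two-column schema is an assumption, not 2Awesome's actual format."""
    return {key: text for key, text in csv.reader(io.StringIO(csv_text))}

# One CSV file per language, as exported from the master spreadsheet.
catalogs = {
    "en": load_strings('intro_line,"Hello, traveler!"\n'),
    "es": load_strings('intro_line,"¡Hola, viajero!"\n'),
}

def subtitle(key, lang):
    """Fetch a subtitle by key, falling back to English if the translation is missing."""
    return catalogs.get(lang, {}).get(key, catalogs["en"].get(key, key))
```

The font and font-size problem Jimenez mentions sits outside this lookup entirely: the table only supplies text, and each language’s rendering still has to be handled at display time.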
This shows that subtitle implementation is an element indie developers pay attention to and know how to handle. Of course, what Jimenez is explaining is from his experience with the Unity engine. To look into the subtitle implementation process in Unity further, I talked with David Su, a researcher at MIT Media Lab who has just finished localization for his music video game titled Yi and the Thousand Moons.
Su is also working with the Unity engine for his game, so I once again asked how simple the task of adding subtitles in Unity was, even for his music-based game:
“Since Yi and the Thousand Moons was so centered around storytelling through music and lyrics, there was no question that subtitles would be necessary, although admittedly the primary motivation was not so much accessibility as simple understandability. That being said, it soon became apparent that subtitles would improve accessibility, and that there was potential for deaf or hard-of-hearing players to enjoy the game through gameplay, visual art, and narrative even without the auditory elements, despite the game’s emphasis on sound and music.
The process itself was pretty straightforward. I did have some stylistic discussions with our art director Dominique Star (who also voice-acts Yi!), and we decided on an almost karaoke-style subtitling, with musical notes surrounding the text — this also ended up helping convey the musicality of the narrative via non-auditory means.
I’d say the greatest challenge for Yi came from implementing localization — I had hard-coded a lot of the subtitle functionality to work only with English (i.e. doing silly things like storing lyric content directly in the source code instead of reading that data from text/JSON/CSV files), and it was a bit cumbersome to abstract things out to accommodate multiple languages. This was 100% due to my past self’s laziness, and through the process of adding localization, the subtitles implementation is much cleaner now. I suppose all of this goes to say that a little foresight goes a long way!”
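The refactor Su describes — moving text out of the source code and into data files so new languages become a data change rather than a code change — is the classic localization fix. A minimal Python sketch of the before/after; the lyric lines and languages here are invented examples, not Yi’s actual script:

```python
import json

# Before: lyric content hard-coded in source, English only
# (the pattern Su describes regretting).
HARDCODED_LYRICS = ["The moon rises", "A thousand lights"]

# After: the same content externalized as data, one entry per language.
# In a real project this string would live in a .json file on disk.
LYRICS_JSON = """
{
  "en": ["The moon rises", "A thousand lights"],
  "zh": ["月亮升起", "千盏灯火"]
}
"""

def load_lyrics(lang, default="en"):
    """Load the lyric lines for a language, falling back to the default."""
    data = json.loads(LYRICS_JSON)
    return data.get(lang, data[default])
```

Adding another language now means adding one more key to the JSON, with no changes to the code that displays the lines — exactly the “foresight” Su wishes he had applied earlier.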
Su and Star’s decision to surround the text with musical notes to visually signal that it is meant to be read musically is an interesting one. While some video games like Red Dead Redemption 2 do have moments where subtitles appear while characters sing, there isn’t much to visually distinguish singing from regular speech. Television subtitles, at least in the UK, usually place a “#” before the text to indicate that it is sung. Video games sometimes open a subtitle with “[singing],” but Su’s approach seems artier and less generic.
I also asked Su to go into more detail about what his experiences with video game engines were like, if they offered him any decent tools for subtitle creation, and how long the process actually took him:
“I’ve actually had a good time with subtitle creation in Unity — this basically consisted of creating a UI element with custom data structures and events to handle updating the text (based on gameplay as well as language settings). I’ve never used Unreal so I can’t speak to how easy it would be in that engine, but I would imagine something similar wouldn’t be too much effort to implement. Based on my experience, ease of adding subtitles probably has to do more with how much the development process had subtitles in mind to begin with, rather than the specific engine of choice. I’d say for Yi, subtitles have maybe taken up somewhere around 5% of the developmental process? Doing the initial English ones took maybe two to three days (based on commit history), plus some additional spots here and there of revision work, whereas implementing localization I’d say took over a week’s worth of work total; it was a bit on-and-off over several weeks at that point, so hard to say for sure.”
Going by Su’s comments, it certainly seems like if a company plans to include subtitles, and has a clear idea of how they’re presented, it should be a straightforward task to incorporate them. If roughly 5% of Yi‘s developmental process was subtitle creation, I wonder what this percentage would be for titles with larger scripts such as The Last of Us, or even titles with multiple story arcs such as Detroit: Become Human.
So far I’ve had a few developers discussing their time with the Unity engine, but what about the Unreal Engine? Keep in mind that this is the engine that Spyro Reignited Trilogy was developed in. I spoke with Chris Payne, a developer who previously worked at Telltale Games and EA, about subtitle implementation in Unreal, and this was his response:
“Unreal implements two types of audio object: SoundWave and the newer DialogueWave. Both have a subtitle field. However, only the DialogueWave subtitle field is localization-enabled.
I discovered this on a recent project (my first in Unreal) where all the VO had already been implemented as SoundWaves, so none of the subtitles had even been translated. Because the VO triggers were scattered all over the blueprints, we went for an inelegant but thorough solution – all the VO was duplicated as DialogueWaves which were then translated, and the code continued to play the existing SoundWaves but looked up the matching DialogueWave to find the translated subtitle.”
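The workaround Payne describes — keep playing the existing, non-localizable audio assets, but fetch each subtitle from a parallel, localized table keyed by the same asset — is essentially a keyed lookup. Here is a minimal, engine-agnostic Python sketch of the idea; the asset names, languages, and strings are all hypothetical, and this is not Unreal API code:

```python
# Parallel localized subtitle table, keyed by (audio asset name, language).
# In Payne's case this role was played by the duplicated DialogueWaves.
localized_subtitles = {
    ("vo_intro_01", "en"): "Welcome back.",
    ("vo_intro_01", "fr"): "Bon retour.",
}

def play_with_subtitle(asset_name, lang, play_audio):
    """Play the original audio asset unchanged, but look up the matching
    localized subtitle, falling back to English if the translation is missing."""
    play_audio(asset_name)  # existing playback path, untouched
    return localized_subtitles.get(
        (asset_name, lang),
        localized_subtitles.get((asset_name, "en")),
    )
```

The design choice is the same as in Payne’s account: rather than rewiring every scattered audio trigger, the trigger stays as-is and only the subtitle fetch is redirected through the localized table.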
It’s worth keeping in mind that Payne is describing in-game subtitles, not cutscene subtitles. He explained that in-game subtitles play when the voice dialogue plays, while additional sounds such as dogs barking in the background or crowds muttering usually get optional subtitles.
He also explained that cutscene subtitles would otherwise work the same way as in-game audio, except that pre-rendered video clips carry one continuous audio track/stream, so the game cannot distinguish the start and end points of each line of dialogue. These lines need manually assigned timecodes.
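In practice, manual timecoding amounts to a sorted list of cues that the game checks against the playback clock each frame. A minimal Python sketch; the timings and lines are invented for illustration:

```python
import bisect

# Manually timecoded cues for a continuous cutscene audio track:
# (start_seconds, end_seconds, text). All values are invented examples.
CUES = [
    (0.0, 2.5, "Spyro! We need your help."),
    (3.0, 5.0, "The dragons are trapped in crystal."),
]

def active_subtitle(cues, t):
    """Return the subtitle that should be on screen at playback time t, or None."""
    starts = [start for start, _end, _text in cues]
    i = bisect.bisect_right(starts, t) - 1  # last cue starting at or before t
    if i >= 0 and t < cues[i][1]:           # still inside that cue's window?
        return cues[i][2]
    return None                             # gap between lines: show nothing
```

This is exactly the work the engine cannot do for a baked video stream: someone has to supply the start and end times by hand, whereas real-time dialogue can show a subtitle for as long as its audio asset plays.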
Payne then directed me to the subtitle section of the Unreal manual. Surprisingly, the subtitles feature should be enabled by default and works in-game provided there is a SoundWave (a voice recording or sound effect) assigned to a DialogueWave (the line of dialogue spoken, including voiceover metadata). The developer needs to ensure the subtitle text is entered into the DialogueWave and that the DialogueWave is triggered to show the subtitle, rather than just the SoundWave, which is only an audio file.
Going back to David Su in Unity, I had asked how the process was different for creating subtitles in both in-game and cutscene elements.
“Pretty much the same for me, although I can imagine for games with more character customization or less tight coupling of gameplay and soundtrack there might be more of a difference there. The cutscenes in Yi were rendered directly in-game, and all the subtitle timings were tied to timeline positions in the soundtrack, so there was no need to treat gameplay and cutscenes separately.”
He also explained that players should be given the opportunity to alter subtitle sizes and that the text should have an outline of some sort to make it stand out against a bright background. This is something I agree with wholeheartedly. Life is Strange 2 totally surprised me (in a good way) when I performed my usual ritual of booting up a new game and locating the options to turn subtitles on. Instead of a simple on/off switch, I was presented with several options for text sizes, and even the type of background I wanted to see behind the text.
From these conversations with developers at various levels, it’s clear that subtitle implementation is a fairly well-documented and straightforward process that is seen as a standard by most studios. Both the Unity and Unreal engines have plenty of methods to create subtitles quickly and easily and, provided you have a script, I can only assume the process is even simpler.
While Activision’s comments are what initially sparked my investigation, other large companies do still fail to implement subtitles well. Bethesda, as I noted earlier, has had terrible subtitles in most of its games, such as Fallout 4. Despite the less effective implementations in some AAA releases, it’s good to see that subtitles are usually discussed in depth within many development teams. Font styles, borders, backgrounds, and more have come into the conversation, as they should.
It’s a shame that subtitles still aren’t largely adjustable for individual players, but we’re starting to see that change with recent titles such as Marvel’s Spider-Man, Madden NFL 19, Life is Strange 2, Red Dead Redemption 2, Assassin’s Creed Origins and Odyssey, and God of War. All of these titles are introducing tons of accessibility options that allow you to choose a text background, speaker names, and (sometimes) the ability to change text sizes.
Some of these titles launched with their accessibility options intact, while others, such as God of War, had to add options through a patch after players pointed out that the text was too small. Hopefully this post-launch approach to accessibility will also apply to Spyro Reignited Trilogy; Activision’s comment that they “care about the fans’ experience especially with respect to accessibility for people with different abilities, and will evaluate going forward” is a good sign.
It’s safe to say that subtitles are actually deemed an important part of video games, and developers do actively discuss how they look and whether they are serving their purpose well. It’s also important to remember what Todd Colby said about subtitles often being pushed aside to fix game-breaking bugs, and for us to keep discussing how important a feature they should be in the game development process going forward. Still, it’s always a punch in the gut when a game launches with terrible subtitles, or no subtitles at all, forcing players who need them to wait for a patch.