How Can Developers Build Accessibility Bridges for Deafness and Gaming?

Accessibility in gaming is always important; Ben looks at some of the ways that game developers could help deafness and gaming become one.

on June 25, 2018 12:00 PM

Editor’s Note: “Deafness,” as referred to in this article, crosses all forms and levels of hearing impairment.

As someone who has first-hand experience with deafness and gaming — someone who wears hearing aids — I’ve found myself cowering behind shelter in PlayerUnknown’s Battlegrounds as a friend instantly pinpoints the direction of a sniper from miles away; meanwhile, I can’t locate them from audio alone. I’ve played plenty of narrative-driven games like Wolfenstein II: The New Colossus and had to squint and move closer to the TV to read the puny subtitles. And I’ve had dialogue-heavy games that don’t offer subtitles at all, leaving me with little idea of what’s going on.

Recently, we’ve been hearing developers, publishers, players, and companies discussing accessibility more than ever before. The most notable example came when Microsoft revealed its Xbox Adaptive Controller, designed to let disabled gamers play their favorite titles and even try new ones, with charities and non-profit organizations such as SpecialEffect and AbleGamers contributing to its design and functionality.

The topic of accessibility in gaming has opened up more discussion online and, as a result, I’ve seen a fair number of developers take to social media asking how they can make their games more accessible.

I’ve not really seen many creators asking about deafness and gaming, though, until the other week, when a friend tagged me in one such tweet (bearing in mind I may just not be seeing the right discussions, so there could well be plenty of talk about deafness and gaming that passes me by). Nonetheless, this led to me creating a Twitter thread that has so far received an amazing reception from developers big and small, so I wanted to list some ideas on how developers can make video games more accessible for deaf gamers.


Directional Waveform HUD

This is a simple idea that would allow deaf players to pinpoint directional sounds coming from within the game, whether from objects in the environment, other players, or anything else. The ring could be toggled on or off by the player and would be useful in both first-person and third-person games. The idea is that the ring acts as a compass, and any in-game audio worth paying attention to disturbs the ring with a waveform.

The waveform can easily show how near or far the sound is by appearing more or less spiked. For example, a nearby door shutting behind you in Fortnite would show a fairly large, thin waveform, while a gunshot in the distance would show a smaller, wider one. The mock-up below shows a waveform-only idea, but it would probably be more beneficial to add icons to the inner ring so the player knows exactly what each sound is.

I’d also like to point out that this system wouldn’t really offer an “advantage,” nor would it be exploited by hearing players, because the ring would only illustrate the audio in the same way a hearing person would already perceive it through directional sound.


A mock-up of Epic Games’ Fortnite with a directional waveform, which would allow deaf gamers to pinpoint specific sounds from the environment.

 

While some games may implement this feature simply as an overlay around the character or crosshairs, it also leaves developers room to get artsy with its presentation. One example would be to implement it in a similar way to The Division, in which important HUD details are tracked to the character’s backpack so they’re locked into view. Replace that with the waveform ring, and it could usefully serve hearing-impaired users and hearing players alike.

Now, I have been informed that Fortnite already has a system like this in the game’s mobile version. While it’s roughly the same idea, its purpose isn’t necessarily to aid deaf players; it was intended so people who like to play on the go don’t have to whip out headphones or blast their phone speakers. The system was never brought to the console or PC versions, which is a shame, as it would go a long way towards making the game more accessible to hearing-impaired players on other platforms.
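
I’m not a programmer by trade, but to make the idea a bit more concrete, here’s a rough Python sketch of how a single sound might be mapped onto the ring. Everything in it is an assumption on my part: the function name, the flat 2D coordinates, and the rule that nearer, louder sounds produce taller spikes. It isn’t how Fortnite or any real engine actually handles this.

```python
import math

# A rough, engine-agnostic sketch (not Epic's implementation): map a world-space sound
# to an angle on a compass-style ring and a spike size based on distance and loudness.

def ring_spike(listener_pos, listener_facing_deg, sound_pos, loudness, max_range=100.0):
    """Return (angle_on_ring_deg, spike_height) for one sound, or None if out of range.

    listener_pos and sound_pos are (x, y) world coordinates, listener_facing_deg is the
    direction the camera faces, and loudness runs from 0 to 1. All names are illustrative.
    """
    dx = sound_pos[0] - listener_pos[0]
    dy = sound_pos[1] - listener_pos[1]
    distance = math.hypot(dx, dy)
    if distance > max_range:
        return None  # too far away to register on the ring at all

    # Angle of the sound relative to where the player is looking (0 = straight ahead).
    world_angle = math.degrees(math.atan2(dy, dx))
    relative_angle = (world_angle - listener_facing_deg) % 360.0

    # Nearer and louder sounds produce a taller spike on the waveform.
    proximity = 1.0 - (distance / max_range)
    spike_height = proximity * loudness
    return relative_angle, spike_height


if __name__ == "__main__":
    # A fairly loud gunshot roughly 60 units away from a player facing 90 degrees.
    print(ring_spike((0, 0), 90.0, (40, 45), loudness=0.9))
```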

Vignette Cues

In brief, this is a system that indicates nearby danger by running a color gradient along the edge of the screen, positioned according to the direction of the off-screen action. I saw a system like this used in Gran Turismo Sport, in which the edges of the screen grow darker when another racer is close to the body of your car. Why not take this approach and use it for audio cues in other games?

It’s subtle and can indicate danger or friendly noises, and color hues can help differentiate types of audio, such as red for enemies, blue for allies, and yellow for objectives. Let’s take this screenshot from The Vanishing of Ethan Carter and imagine that a creature is chasing the player from just behind them to the left, while to the right a friendly character is shouting out to the player.


Visual vignettes, such as this mock-up of The Vanishing of Ethan Carter, would also provide deaf gamers with more explicit visual representations of audio cues from their surroundings.
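
Purely as a sketch of my own (this isn’t how Gran Turismo Sport or The Vanishing of Ethan Carter actually work), here’s roughly how a game could pick which edge of the screen to tint, and in what color, based on where a sound sits relative to the player.

```python
# A rough sketch: choose a tint color per sound type and a screen edge based on the
# sound's direction. All thresholds and color values here are my own guesses.

CUE_COLORS = {
    "enemy": (255, 0, 0),        # red for enemies
    "ally": (0, 128, 255),       # blue for allies
    "objective": (255, 220, 0),  # yellow for objectives
}

def vignette_cue(relative_angle_deg, distance, sound_type, max_range=50.0):
    """Return (edge, rgba) describing which screen edge to tint and how strongly.

    relative_angle_deg: 0 = directly ahead, 90 = to the right, 180 = behind, 270 = to the left.
    """
    if distance > max_range or sound_type not in CUE_COLORS:
        return None

    angle = relative_angle_deg % 360.0
    if angle < 45 or angle >= 315:
        edge = "top"        # roughly ahead of the player
    elif angle < 135:
        edge = "right"
    elif angle < 225:
        edge = "bottom"     # behind the player
    else:
        edge = "left"

    alpha = int(255 * (1.0 - distance / max_range))  # closer sounds glow more strongly
    red, green, blue = CUE_COLORS[sound_type]
    return edge, (red, green, blue, alpha)


if __name__ == "__main__":
    # The Ethan Carter scenario above: a creature behind-left, an ally calling from the right.
    print(vignette_cue(210.0, 8.0, "enemy"))
    print(vignette_cue(95.0, 20.0, "ally"))
```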

 

Controller Vibration

I’m not a game developer; I may have dabbled, but I never chased the profession head-on, and I certainly don’t know how much control a developer gets over a controller’s vibration. However, I do know there are various levels of rumble and vibration in game controllers, and I’m certain I’ve experienced a heartbeat-style pattern in the past. Linking key audio moments to vibration patterns could help deaf players follow how the atmosphere of the game is playing out, or simply inform them of events they can’t hear or see off-screen.

The trouble with this idea is that controller vibrations are already widely used to signify what the player’s character is feeling, such as a rumble for every punch taken, every kick given, and the recoil from a machine gun. Having to adapt to a different way of feeling the vibrations would, I imagine, be a weird experience, but it’s always worth trying new things. Imagine a heavy rumble for a monster roar, or a tapping pattern that grows heavier as footsteps approach.
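
To illustrate, here’s a small, engine-agnostic sketch of what those patterns might look like as data. The set_rumble callback is a stand-in of my own, since every platform exposes controller vibration through its own API, and the values are just guesses at what might feel right.

```python
import time

# A sketch of how key audio events might be expressed as rumble patterns.
# set_rumble is a placeholder callback, not a real controller API.

PATTERNS = {
    # Each step is (intensity from 0 to 1, duration in seconds).
    "monster_roar": [(1.0, 0.6), (0.4, 0.3)],              # one heavy, sustained burst
    "footsteps_near": [(0.2, 0.1), (0.0, 0.2)] * 3         # light, repeating taps...
                      + [(0.5, 0.1), (0.0, 0.15)] * 3,     # ...growing heavier as they approach
}

def play_pattern(event_name, set_rumble):
    """Play the rumble pattern for an audio event through set_rumble(intensity)."""
    for intensity, duration in PATTERNS.get(event_name, []):
        set_rumble(intensity)
        time.sleep(duration)
    set_rumble(0.0)  # always stop the motor once the pattern ends


if __name__ == "__main__":
    # Stand-in backend that just logs what a real controller would be told to do.
    play_pattern("footsteps_near", lambda level: print(f"rumble -> {level:.1f}"))
```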

Audio Positional Icons

Let’s say a fantasy-themed game has a mystical pillar that emits a ringing tone throughout the world for you to follow, or maybe there’s a person in distress in a third-person superhero game. Hearing players could track the sound using directional audio, but deaf players would be lost.

Imagine icons popping up to signify what each noise is, floating at the edge of the screen in the direction the sound is coming from, or even placed within the scene itself, as shown in the mock-up below featuring Fade to Silence. You’d be able to clearly see the footsteps in some overgrowth, and the birds that could be signifying distress in the distance.


In-game icons depicting sounds, such as this mock-up of Fade to Silence, could provide players with a clearer depiction of where sounds are coming from in the game world.

 

I’m unsure how well this feature would work for deafness and gaming if it were constantly visible, so I’d assume it would work best with a “Hold to Focus” button, allowing the player to hold a key or button to have the character focus on the world around them.

As for how the icons could look, take a look at how State of Decay 2 shows icons on screen after you climb a tower to scout out the area. Just imagine doing that for “hearing,” with icons indicating important focus points in the environment such as creatures, odd sounds, and more.

State of Decay 2 displays icons on screen after the player scouts the surrounding area from a tower.
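
Pulling those two thoughts together, here’s a rough sketch of how off-screen sounds could be pinned to the edge of the screen as icons, and only shown while a hypothetical “Hold to Focus” button is held. The icon asset names and coordinate handling are placeholders of my own, not anything from Fade to Silence or State of Decay 2.

```python
# Illustrative sketch only: pin sound icons to the screen border and gate them behind
# a focus button. Assumes the engine already projects each sound's world position to
# normalised screen coordinates, where values outside 0..1 mean it's off-screen.

ICONS = {"footsteps": "icon_footsteps", "birds": "icon_birds", "voice": "icon_voice"}

def place_sound_icon(screen_x, screen_y, sound_type, focus_held, margin=0.05):
    """Return (x, y, icon_name), or None when the focus button isn't held."""
    if not focus_held:
        return None  # only surface the icons while the player holds "Hold to Focus"
    icon = ICONS.get(sound_type, "icon_unknown_sound")
    # Clamp the position so off-screen sounds sit just inside the nearest edge.
    x = min(max(screen_x, margin), 1.0 - margin)
    y = min(max(screen_y, margin), 1.0 - margin)
    return x, y, icon


if __name__ == "__main__":
    print(place_sound_icon(0.4, 0.7, "footsteps", focus_held=True))  # drawn in place
    print(place_sound_icon(1.6, 0.3, "birds", focus_held=True))      # pinned to the right edge
    print(place_sound_icon(1.6, 0.3, "birds", focus_held=False))     # hidden until focus is held
```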

Targeted/Directional Subtitles

Did you know Minecraft actually has a deaf mode? Switch it on and subtitles will show what sounds are nearby, with arrows indicating the direction each one is coming from. It’s a very handy feature, simply implemented and fitting to the style of Minecraft. Here, have a look:


Here’s how Minecraft uses the subtitle feature to inform deaf players of their surroundings. (Image from @_BluShine, Twitter.)

 

So that’s all well and good, right? But what about other games that don’t follow the art style of Minecraft? A global subtitle style would suffice for this idea, but I’d be concerned that modern games have so many key sounds that the subtitle area would just end up getting too busy. So really, I imagine this system would work best by only showing subtitles for key sounds, depending on which direction you’re facing.

Sure, this probably means you’ll not be able to tell what’s behind you, but maybe mixing in another system from this post would create the perfect combination?
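
For what it’s worth, here’s a small sketch of how that kind of directional, facing-dependent subtitle might be built. The arrow characters and the “front arc” cut-off are my own guesses at a sensible rule, not how Minecraft actually decides what to show.

```python
# A sketch of Minecraft-style directional subtitles; the arrow mapping and cut-off
# rule below are assumptions of mine, not Mojang's actual logic.

def directional_subtitle(label, relative_angle_deg, front_only=True, front_arc=180.0):
    """Return a subtitle line like '> Footsteps', or None if the sound should be hidden.

    relative_angle_deg: 0 = ahead, 90 = to the right, 180 = behind, 270 = to the left.
    front_only drops sounds outside the front arc to keep the subtitle area uncluttered.
    """
    angle = relative_angle_deg % 360.0
    off_front = min(angle, 360.0 - angle)  # how far the sound is from straight ahead
    if front_only and off_front > front_arc / 2.0:
        return None

    if off_front < 30.0:
        arrow = "^"    # roughly straight ahead
    elif angle < 180.0:
        arrow = ">"    # off to the right
    else:
        arrow = "<"    # off to the left
    return f"{arrow} {label}"


if __name__ == "__main__":
    print(directional_subtitle("Footsteps", 75.0))     # '> Footsteps'
    print(directional_subtitle("Door closes", 350.0))  # '^ Door closes'
    print(directional_subtitle("Gunfire", 200.0))      # None: behind the player, so hidden
```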

Custom Subtitle Sizes and Options

This is hugely important for deafness and gaming, and it’s often a feature that feels like it has been added to the game without much thought: developers frequently opt for small text so that it doesn’t take up the screen, and so that it’s simply there for the sake of it. Keeping text from filling the screen is reasonable from a visual standpoint, but it makes for a much harder experience when deaf gamers have to rely on subtitles to understand what is going on.

As I noted earlier, one particular example of this is Wolfenstein II: The New Colossus and its small, hard-to-read subtitle font. Let’s take a look at a screenshot from the game below – this is, I think, a prime example of keeping visual aids to a minimum in favor of showing off the game:

Wolfenstein II: The New Colossus keeps its subtitles small and hard to read.

Now let’s go to this example from Fallout 4, which is a bit more legible but comes off as a “wall of text.” There is far too much information being displayed here, and while the text is most likely timed to the in-game audio, the sheer amount of it is too much to process in such a short window. While the green character font sits well on the eyes, I can’t say the same for the white text beneath it, which struggles against the brighter portions of the screen.

Fallout 4’s subtitles are more legible, but arrive as a wall of text.

Life is Strange handled deafness and gaming well in this regard by allowing players to choose their font size and whether they wanted a black box behind the text to make it more legible. It wasn’t perfect, but it was more than most games offer. Deaf users would love a system in every game that lets them customize their subtitles. Most developers probably believe that white text with a thin black outline is enough, but really the contrast is terrible, and there are better routes to legible subtitles.

When games like Half-Life 2 and Life is Strange use semi-opaque boxes behind the text, it separates the text from the game world nicely. That isn’t to say developers should always use these boxes, because other options can work nicely too. Being able to choose from a variety of font options, such as different weights and colors for specific characters (as in Rise of the Tomb Raider), would be amazing.


Here’s a mock-up from Popcannibal’s Make Sail of a few subtitle styles that could work well. Nice simple fonts, nothing fancy: colors and sizes matter.
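
To round things off, here’s a sketch of the kind of subtitle options I’d love every settings menu to expose. The fields, defaults, and speaker names are placeholders of my own; the point is simply that text size, background opacity, and per-character colors should be the player’s choice rather than the developer’s.

```python
from dataclasses import dataclass, field

# Placeholder settings object; no real game's options menu is being described here.

@dataclass
class SubtitleSettings:
    font_size: int = 28                       # let the player scale the text up or down
    background_opacity: float = 0.6           # 0 = no box, 1 = solid box behind the text
    outline: bool = True                      # thin outline for when the box is turned off
    speaker_colors: dict = field(default_factory=lambda: {
        "Hero": "#7CFC00",                    # per-character colors, as Rise of the Tomb Raider does
        "Companion": "#FFD700",
    })

    def style_for(self, speaker: str) -> dict:
        """Resolve the final style for one line of dialogue from one speaker."""
        return {
            "size": self.font_size,
            "color": self.speaker_colors.get(speaker, "#FFFFFF"),
            "background_alpha": self.background_opacity,
            "outline": self.outline,
        }


if __name__ == "__main__":
    settings = SubtitleSettings(font_size=40, background_opacity=0.8)
    print(settings.style_for("Hero"))
```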


There you go: that’s everything I’ve thought of for deafness and gaming! There are already a few games out there that deaf gamers can enjoy thanks to their focus on visual elements, with audio being an additional (but not essential) part of the experience: a few include Throne of Lies by Imperium42 Game Studio, FAR: Lone Sails by Okomotive, and the Steve Jackson’s Sorcery! titles by Inkle.

While these and a few more offer experiences that are a bit friendlier to hearing-impaired gamers, there are still plenty of games that don’t yet offer ways to make them more accessible to deaf players. Hopefully, with this list of ideas from a deaf player himself, developers will find some inspiration to implement or expand on these ideas and make their games more accessible.

Ben Bayliss is a UK-based staff writer for DualShockers. He spent over ten years on media production projects and five years as a camera operator/director for live TV. He has been writing about games since 2015, as well as managing gaming websites. Now he’s here at DualShockers to provide the latest coverage in gaming news, and more.