All right. Hi, hello everyone. I am Sightless Kombat for those of you who don’t know who I am, I am an accessibility advocate from the UK, and I primarily focus on the accessibility of mainstream video games.
Now, I’ve been playing mainstream video games for a number of years, partly to engage in the same social experiences as my sighted peers, but also just to generally indulge in this unparalleled form of escapism. As well as being an accessibility advocate, I also write accessibility reviews for hardware and mainstream games. Partly, this feedback is for those who might be interested in purchasing these products before they do so, so that they’re aware of any accessibility issues I encountered during their use, whether that’s through unboxing or through actual usage in testing. But I also send this feedback to the developers of these products so that, all being well, they can actually improve during the next iteration of that product. Now, I will be using throughout the remainder of this talk the term Gamer Without Sight, or GWS. Now, some of you might be thinking, well, you’re blind, aren’t you?
And the answer is, yes, I am, that’s correct. But I prefer not to use that label simply due to the fact that legal blindness can actually include some residual and usable vision. So, I therefore take Gamer Without Sight to mean that a person is playing video games and cannot see at all whilst doing so.
This isn’t my first time actually being in the US as a tourist, as was briefly referenced in the introduction. The reason is, I was able to travel for five weeks in the previous year as part of a Churchill Travelling Fellowship provided to me by the Winston Churchill Memorial Trust, or the WCMT. Now, what is a Churchill Fellowship, you might be asking? Well, this is an overseas research grant that allows you to travel overseas and bring any findings you take from your project back to the UK, to hopefully positively impact the professions and communities they relate to. Now, I knew when I started this project that I was going to be focusing on game accessibility.
So, I realized that my findings would have far reaching, and possibly a global impact rather than just a UK wide reach. So, I entered my research with five questions in mind namely as follows. Number one, are developers aware of accessibility? Number two, are developers willing to listen to the challenges faced by GWS? Number three, does company infrastructure affect how implementation of accessibility features happens from studio to studio? Number four, what is the situation like across current platforms for GWS?
And number five, when do I think the first fully accessible game will come out in triple-A form, with full in-game menu accessibility and gameplay? For the remainder of this talk, I’m going to discuss these questions partly by answering them directly, but mostly through the findings themselves. So, as part of my five-week tour crossing four states, I actually had the chance to not only conduct a number of company visits, which I will discuss in a little more detail later on, but also to attend E3, the massive trade show that the industry puts on every year, where large numbers of products and games are revealed; Microsoft’s gaming and disability bootcamp, which I believe Ian mentioned in the very first talk of the day; and also the Xbox FanFest event which Microsoft runs alongside E3.
Now, my reactions at the expo were pretty much a mixed bag, in a sense, because on one hand I had some interactions with companies who were very accommodating and approachable. One even hooked me up with a haptic chair that was wired up to the trailers being exhibited, which was an interesting experience, and I also had very interesting discussions about game engines. And in the case of Crackdown, I was able to help solve an audio bug, or at least find said bug, on the spot.
Now, on the flip side, I was unfortunate enough to encounter companies with inaccessible booths, or booths staffed with individuals who either didn’t actually know anything about, or didn’t seem to be interested in, accessibility. Expos are a valuable, in fact invaluable, source of feedback and of building positive relationships with your gaming public, including those with disabilities. Please be as welcoming as you can to individuals with disabilities who come along to your conventions, as it may be their first time interacting with not only your products but your company as well, and first impressions can certainly go a long way in that regard. So, slide six. In amongst the expos and other events I’ve just mentioned came a number of company visits, and with those, the reactions were positive as well, but also very enlightening.
As much as I talked to a number of game studios, I also had conversations with Valve and Nvidia, which were interesting on both ends, from my perspective and from theirs. During these visits, I actually got the chance to have my first fully fledged experience with VR, as well as engaging in mortal combat, for want of a better phrase, against members of the development teams, or collaborators with said teams, in Injustice 2, as well as fighting against Ken Lobb, one of the people who was involved in the original Killer Instinct, though I fought against him in the reboot. Yes, there’s a picture there, for those of you who can’t see it, of me standing next to a Spartan, though it’s actually a statue, but I’ll come to that in a second. Because, whilst I was visiting, I actually got the chance to go and see 343’s Halo museum, which is a very interesting place for those of you who get the opportunity to attend. But as a Gamer Without Sight, I often get asked what I think video game characters look like.
This isn’t exactly the easiest of questions to answer, mainly because the “internal imaging process” can take a large number of forms and work with a fair few sources. Most of the source material for these internal images comes from the voice acting and sound design in the game, but it also sometimes comes from external media such as books, action figures, and other elements of the franchise. With that though, the value was clearly demonstrated to me of being able to stand toe-to-toe with a Spartan clad in full armor, as well as facing down the Covenant in the form of Elites and Brutes.
Up close, I mean, yeah, they’re just statues and costumes, but the point still stands. It gives a sense of scale and perspective that is really hard to replicate, given that the internal images you have as a Gamer Without Sight are almost certainly flawed in terms of, say, the height of the player character. But this visit was able to demonstrate to me that accessibility doesn’t just encompass most of what’s been talked about today, the features of the game you’re playing.
Accessibility reaches further out than that, to external elements: comic books, novelizations, animated media, action figures, and things like that. These positive experiences were carried forward into discussions I had with developers, and again, as I stated earlier, most of the reactions I had from these company visits were very positive. Some developers were surprised simply at the fact that not only was a Gamer Without Sight playing their games, and doing relatively well, but also that use cases that hadn’t been considered before were coming up, such as use with a screen reader. So, a screen reader, just as a recap for those of you who aren’t necessarily familiar with the term, is a piece of assistive tech that, put simply, can relay on-screen elements such as buttons, controls, and just text in general, when it’s formatted to standards, through synthesized speech.
But other companies appreciated the opportunity to have in-person, one-to-one interactions because, again, they highlighted use cases that hadn’t necessarily been seen before, and I was able to reference issues that may have come up previously, but not with the context of having a Gamer Without Sight in the room to help solve them. Speaking of problem-solving, there were companies who were able to solve issues on the spot, with new and potentially effective solutions being theorized right then and there. Above all though, it was very clear to me that once the companies had realized that accessibility was relatively easy to implement and certainly achievable, the reality had begun to dawn. So, as you can see, there are a number of bullet points here, and these are in part some of the recommendations I came back with from my five-week tour, which I then wrote up into my report.
So, contact: make it easy for those with accessibility concerns, regardless of the disability, to contact you. Please do not copy and paste generic PR messages. That is a rather frustrating thing that I’ve seen, because it can almost certainly turn gamers without sight off of watching your company for any future accessibility developments, regardless of how useful and large-scale those might be.
Use your accessibility features as a selling point. Now, this has been covered slightly with things like Uncharted 4 and other games, where you could say the accessibility base is being covered by working the difficulty level settings and so on into the marketing. But if I see a product that says it has accessibility features, and what those specific features are, that sounds like a really interesting prospect to look into. And the day I can buy a product without having to worry about what I sometimes term the accessibility-versus-price problem, namely that sometimes you’re not sure whether you can buy a game because it might not have the right accessibility features, is the day I will be more than happy to purchase more products from that company. Publicize your accessibility features.
We’re heading there: publicize your accessibility features as soon as they are approved, even if that’s massively before launch. Because then, gamers without sight like myself can rest easy, relatively so, with the knowledge that their experience at launch will be marginally, if not greatly, less painful than it would have been without those accessibility features in place. Communication: if your accessibility features are being hampered in terms of their implementation by higher management, company infrastructure issues, or even just the engine you’re working in, which can sometimes be an issue, let consumers know what’s going on, because at the very least they’ll have sympathy for what you’re going through, and sometimes they might even be able to suggest ways around it. And testing with the demographic.
It’s been highlighted earlier: who better to test with than the people who are actually going to use your features? I mean, it’s almost logic, really. If you’re adding in accessibility features for gamers without sight, get those players in and get them feeding back, because sometimes, as Michelle was highlighting, they might find even easier ways around what you’re currently trying to do, to, say, shorten the verbosity of the incoming text.
So, more recommendations here, starting with worldwide testing. As much as testing in-house, in studios, might seem the easiest solution, there are going to be testers like myself who are unable to actually come into your studios in the States to give you one-to-one feedback and put the time in, as it were, in terms of assistance. Now, bringing people on to test remotely: it’s not impossible. It might be tricky at times, but it’s certainly not impossible to do, and that feedback, coming from a large range of use cases, can be very useful.
Speaking of wide-ranging things: wider pre-release access as well. You see all the YouTubers, the big streamers and content creators, flying out to these pre-launch events at various HQs. I remember Battlefront 2 being a big thing, where you had a large number of streams covering even things that went into the open beta, as far as I can remember. But it would be interesting to see accessibility features, and the game in general, marketed by streamers with disabilities. And simulation testing, flipping back briefly to the testing sphere of things.
It’s been highlighted earlier as well. Even if you can get what you might loosely term subject matter experts in on the testing cycle, what you also want to do is get yourself testing it, because then you might actually find issues that you haven’t found before. So for example, if you have a product, blindfold yourself completely so you can’t see anything, try to unbox it from start to finish, and see if there are any parts where the unboxing process might be too complicated. Or if you have a game, turn it onto a competitive setting and see if there are any glaring audio cues that need to be put in, or any menu things that need to be revamped, but we’ll come on to those aspects a little later.
And the final one for this: multiple resources. So, even with narrated menus and accessibility features, plain HTML guides can be extremely useful when they’re released before launch. Especially given that sometimes you won’t have the sighted assistance that’s needed to get into the game, get into the right menus, and turn these accessibility features on on day one. So, having the ability to do that without any sighted assistance does ease the stress and frustration of the whole situation.
So now, to the practicalities of this. For years, the main barrier to entry for gamers without sight into a brand new game, say one they’ve just bought, has been the menus: navigating complex menu structures. Now, this has mostly been facilitated for a very, very long time by written guides.
So, these guides are often compiled by gamers without sight who have attempted to play the game before, whether they’ve succeeded or failed at actually getting anywhere. Sometimes that’s helped along by sighted assistance, other times by trial and error.
And these guides can often remain incomplete, to varying degrees in fact, for the simple reason that sometimes there are patches that change menu orders, or remove or add options as time goes on, or there’s a complete lack of sighted assistance to document all the processes you actually need to work through to get into the deeper layers of setting up specific scenarios. So, you might have how to get into a multiplayer lobby on a basic level, but you might not have how to get into a custom private match with five people and yourself, with only a certain set of loadouts permitted, I don’t know, but that’s the idea. But now, there are options for resolving this issue. In recent years, a main piece of tech that’s come onto the assistance field, if you will, is OCR, or Optical Character Recognition. Now, OCR basically involves extracting text from an image and relaying it through to a screen reader, which, as I said earlier, is a piece of software that can synthesize this text into computerized speech.
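To make that relay idea a bit more concrete, here’s a minimal sketch, not taken from any real OCR app, of the core logic a second-screen tool needs: each captured frame is OCR’d into lines of text, and only the lines that changed since the previous frame are sent on to the screen reader, so the user isn’t re-read the entire screen every frame. All function names here are my own illustration.

```python
# Sketch: relay only new/changed OCR'd text lines to a speech callback.

def changed_lines(prev_frame_text, curr_frame_text):
    """Return the lines that are new in the current OCR result."""
    prev = set(prev_frame_text)
    return [line for line in curr_frame_text if line not in prev]

def relay_frames(frames, speak):
    """Run the diff over a sequence of OCR'd frames, speaking new lines."""
    prev = []
    for frame in frames:
        for line in changed_lines(prev, frame):
            speak(line)  # in a real app: hand the text to the screen reader
        prev = frame

# Example: a menu where the highlighted item changes between frames.
spoken = []
relay_frames(
    [["NEW GAME", "OPTIONS"], ["NEW GAME", "OPTIONS"], ["> OPTIONS", "NEW GAME"]],
    spoken.append,
)
```

The second, identical frame produces no speech at all, which is exactly the behavior you want when the game is sitting idle on a menu.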
Now, OCR for console games often involves using an internet-connected app on a second screen, so a laptop or desktop, or sometimes even a phone, that will then read the screen for the user and relay the text. However, such apps can most certainly be impacted by performance issues, connectivity issues, operating systems, device health, any number of factors that can make the process extremely tedious. Then, sometimes even fonts can get in the way. Things like having a clear and readable font don’t just matter for those with residual usable vision; they can also impact Gamers Without Sight using OCR.
However, the landscape is certainly shifting and changing to accommodate the new trends, if you will, in the accessibility sphere. Last year, Microsoft announced at GDC that the same systems that underpin Narrator have been made part of the development kit, in a thing called the Microsoft Speech Synthesis API, or as I believe I termed it in my report, the MSSA. And this basically allows for spoken menus, so fully spoken menus, and in-game UI elements as well, things like health, pause menus, or just tutorial messages. As well as that, you have the Unity plug-in mentioned earlier today, which can provide full accessibility to Unity games.
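What a “fully spoken menu” means at the game level can be sketched quite simply: whenever focus moves, the newly focused item’s label (plus its current value) is sent to the speech layer, interrupting whatever was being spoken. This is an illustrative sketch only; the class and method names are mine, not the real SDK’s.

```python
# Sketch: a menu that announces the focused item through a speech channel.

class SpeechChannel:
    def __init__(self):
        self.log = []
    def speak(self, text, interrupt=True):
        if interrupt:
            self.log.append("[stop]")  # cut off the previous utterance
        self.log.append(text)

class NarratedMenu:
    def __init__(self, items, speech):
        self.items, self.speech, self.index = items, speech, 0
        self.announce()
    def announce(self):
        label, value = self.items[self.index]
        text = f"{label}, {value}" if value is not None else label
        self.speech.speak(text)
    def move(self, delta):
        """Move focus up/down (wrapping) and announce the new item."""
        self.index = (self.index + delta) % len(self.items)
        self.announce()

speech = SpeechChannel()
menu = NarratedMenu([("Subtitles", "On"), ("Narration", "On"), ("Back", None)], speech)
menu.move(+1)   # announces the next item
menu.move(-1)   # wraps back and announces again
```

Interrupting on focus change matters: without it, scrolling quickly through a long menu queues up a backlog of stale speech.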
And you also have the Tolk abstraction library, which was used to retrofit Skullgirls into an accessible PC fighting game, and PC games can definitely make use of that library. Now, these tools all work for interfaces. If you’re working with a game like, say, Hearthstone or Football Manager, then you’re going to, theoretically at least, be able to make these games almost fully accessible in a relatively short space of time. But that doesn’t necessarily mean that all games are this easy to work with in terms of making them fully accessible to gamers without sight. Once you start involving complex mechanics, which we will come on to in a second, things can get more difficult, but that doesn’t necessarily mean accessibility can’t be achieved. So, FPS games.
So, once you work with a 2.5D game, a fighting game for example, or a side-scroller, that is, in essence, relatively simple to work with in terms of stereo audio or proximity-based cues, though not all games manage this, and it’s normally on a case-by-case basis. However, once you add in the relatively mysterious Z-axis, the third dimension, developers often get kind of confused and uncertain as to how to implement these more complex mechanics, and particularly navigation of complex environments. Now, I’ve played a fair few different shooters: I’ve played Overwatch, I’ve played Gears, I’ve played a couple of iterations of Halo, but most of this play is facilitated by spatial audio cues, so where sound is coming from around me, and sometimes the volume of sounds and other cues, but also via the sighted assistance of others, whether that assistance comes through locally available cooperative systems like co-pilot, or through online services such as Xbox Live. So, if I’m teaming up with a number of people in Halo 5, I can either do that through the console itself locally, maybe not in Halo 5’s case, or I can do that online, should I need to.
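The spatial cues I just described boil down to two pieces of information per sound source: which side it’s on, and how far away it is. Here’s a deliberately simplified sketch of that mapping; real engines use proper 3D audio and HRTFs, and the numbers here are made up for illustration.

```python
# Sketch: derive a stereo pan and a volume from an enemy's position
# relative to the player. Pan: -1 = hard left, +1 = hard right.

import math

def spatial_cue(listener_pos, listener_facing_deg, source_pos, max_dist=40.0):
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    dist = math.hypot(dx, dy)
    # Angle of the source relative to the direction the player is facing
    # (0 degrees = straight ahead, positive = to the right).
    bearing = math.degrees(math.atan2(dx, dy)) - listener_facing_deg
    bearing = (bearing + 180) % 360 - 180        # normalise to [-180, 180)
    pan = math.sin(math.radians(bearing))        # sideways component only
    volume = max(0.0, 1.0 - dist / max_dist)     # linear falloff with distance
    return pan, volume

# Enemy directly to the player's right, fairly close: hard-right pan.
pan, vol = spatial_cue((0, 0), 0.0, (10, 0))
```

Even this crude version conveys the two things a Gamer Without Sight needs from a footstep or gunshot: direction and rough distance.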
So, here’s a clip of co-pilot in action. Now, co-pilot, I don’t believe it’s been explained, but for those who aren’t aware, is a system integrated into the Xbox One operating system that allows two sets of controller inputs to be rendered as a single set. Actually, I think Ian highlighted this earlier: the specific combination of inputs doesn’t necessarily matter, as long as two controllers are being used as one. So, in this next clip I’m about to show you, from Doom 2016, I am taking control of the weapons side of things, so the firing and the changing of weapons when needed, as well as possibly grenades, and my co-pilot is taking control of all of the navigational elements, so the jumping around, the looking, and the navigation generally. So here is a clip of Doom 2016. So, as you can see, there’s an audio cue to indicate we’re in battle, and that’s me firing the rifle there.
This game does, in essence, become a wall of noise at points, so a co-pilot can be helpful even in just navigating that. So, as you can see, there’s a fair amount of looking around going on, and that’s being facilitated by my co-pilot trying to find the enemies. And there are cues for picking up power-ups as well, and then that guy makes his presence very well-known by that audio cue. Yes, so that is co-pilot in Doom 2016. This works for other games as well, including this next one, Titanfall 2, but this clip actually comes from a slightly different scenario where, instead of using a local version of sighted assistance facilitated by co-pilot, myself and a fellow player are online together in the game, working to get what turns out to be my first Titan kill.
For those of you who can’t see, this other player is actually in a Titan, and this is through his POV, and you will see in a second that, yes, that’s going to hurt. Now, I’ve played a fair amount of Halo and the like independently and really enjoyed it, but it’s not necessarily just me who’s attempting to work with 3D mechanics. I’m aware of a Call of Duty player who has a visual impairment, and I’m aware of a number of individuals who are playing GTA 5 in first-person view as well; both of these cases are facilitated not only by the audio cues but by the assists available in their respective games. So, we’re going to go through a few top tips here. The first one: auditory and haptic cues for enemies or objectives at a given proximity to the crosshair, so basically being able to line things up and pull the trigger, quite useful in those kinds of games. Aim centering on walk or via button commands: for those of you who may have played previous Halo games, if you run forward and your gun is not pointing at the center, that will redirect your gun.
Gears has, I think, the same on sprint: if you sprint, your aim will re-center itself. Then cues for nearby weapon pickups and locational objectives; so, if say I have to run to an area, pull a lever, and then move to a different area, I should be able to make that happen fairly easily. Aim and camera assist, so essentially making your bullets more likely to hit the enemy and being able to lock on to the right direction to travel. And distinct cues for friendlies and enemies.
So, being able to tell whether the footsteps, or the gliding noises, or whatever audio cues a specific character uses, are friendly or enemy is quite useful. And a couple of things I failed to mention earlier, which apply to all the genre examples I’m going to go through: run all your menus and in-game messages through the text tools I mentioned earlier, and also menu navigation. So, if you have a cursor, like an analog cursor that you move over a menu item and then click, that should also be facilitated via D-pad-specific commands as well.
I can think of two fairly recent examples, namely Destiny 2 and Assassin’s Creed Origins, both of which failed to do this, which is very frustrating when you get into a game and you can’t even move around the menus. That can be quite annoying. So, we’re going to move from this onto racing games. Now, for over a decade and a half, racing games have been present in audio-only circles, so there are certainly a few lessons that can be learned by mainstream developers attempting to integrate various facets of accessibility into their own titles. Now, while discussing the concept of accessible racing games with various individuals connected with the Forza series in its various iterations, I discussed a theoretical concept called the audio racing line.
Now, this would be a cue separate from your car’s audio and any other audio in the game, and it would move to indicate which direction you need to go. So, in essence, if it’s in the center of your stereo field, you need to go straight ahead, and if it moves to your left or right, you then follow your audio racing line cue to make the turns. Sounds fairly straightforward, and that’s because the learning process would then be pretty much the same as for a sighted player: figuring out what speed you can take corners at without crashing into walls, which is very easy to do regardless of how well you can see. But also, having a time trial mode or a practice mode can be of great help, given that you want to try and get the best lap times you can.
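The audio racing line concept above can be sketched in a few lines: the cue’s stereo position tracks the angle between the car’s current heading and the direction of the ideal line, so centered means “straight on” and a drift to one side means “steer that way”. This is purely illustrative of the theoretical concept; no real racing game API is involved.

```python
# Sketch: position the racing-line cue in the stereo field based on how
# far the car's heading deviates from the line's direction.

def racing_line_pan(heading_deg, line_direction_deg):
    """Pan for the cue: -1 = steer hard left, 0 = straight on, +1 = steer hard right."""
    error = (line_direction_deg - heading_deg + 180) % 360 - 180
    # Clamp into the stereo field; beyond 90 degrees off-line it's hard-panned.
    return max(-1.0, min(1.0, error / 90.0))

# Car has drifted 10 degrees right of the line: cue moves slightly left,
# telling the player to steer left to get back on it.
pan = racing_line_pan(10.0, 0.0)
```

The nice property is that the feedback loop is continuous: as the player steers toward the cue, it glides back to center, exactly like visually re-acquiring a painted racing line.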
So, we’re going to go through a few top tips here as well. First, rally-style turn indications, as in Top Speed 3, a game I was going to recommend on the previous slide. Top Speed 3 is a game that is now no longer in active development, created by a studio called Playing in the Dark, and it’s basically what I would consider to be the best place to start for implementation ideas, given that this was an audio-only racing game.
Rally-style turning indicators are pretty much present in Top Speed 3, although they’re nowhere near as complicated as the so-called pace notes in actual rally driving. Audio cues, and haptics too of course, for distance from the racing line, so that if you’ve run off the edge of the track, you are immediately informed about it via two sources instead of just one.
Practice modes for cars and tracks, so that you know how your car will sound on different surfaces and how to navigate the tracks correctly. Distinct audio cues for cars you are or aren’t controlling; say you’re racing against one other person and you’re both using the same car, you can then tell which car is yours. And the ability to adjust ambiance and music independently of the rest of the game. Now, that would be useful, say, so that you can just hear the UI generally, or so that you can turn up the ambience to know that it’s raining, for example, or that it’s snowing, I don’t know.
Maybe you’re racing in the snow, or if it’s just sunny you’ll have nothing, I suppose. All right. So, fighting games, the genre that probably most of you here who are aware of my content creation were expecting. Now, fighting games are a relatively easy genre to work with in terms of accessibility, though Killer Instinct, the reboot that is, is still the game that I hold in very high regard, given how much it does right compared to its current competitors.
It has a very simple way of entering a one-v-one exhibition match, and its lobby system isn’t too bad either for multiplayer, for competitive tournaments for example. It also has a relatively simple menu structure in general. But the main key point of Killer Instinct, or KI as it’s sometimes known, is its very, very detailed audio design. Now, to put this in perspective, KI has a full range of movement cues for characters walking forwards and backwards, dashing forwards and backwards, and jumping forwards, backwards, and in neutral.
The jumping aspect is quite important, especially as, from the moment you leave the ground pretty much to when you come down, there is something going on in the audio for your character. This is in comparison to Street Fighter 5, where the only way you necessarily know a player has jumped is halfway through, when they’re at the apex of the arc, which is slightly annoying when you’re trying to anti-air a character. But given that KI is such an audio-heavy game, I think it’s better that I just demonstrate how much it does well in terms of providing information. What I’m going to use to demonstrate this is a clip from probably one of the hardest special boss fights I’ve had to face. I think those of you who know the game will know what I’m about to say next.
Boss Shadow Jago. This input-reading boss is very tricky to deal with, but this is a clip that demonstrates just how the fight works if you can’t see anything at all. He’s outputting lots of energy; I wonder what happened there. So, as you can see, there are near enough cues for pretty much everything. There are even cues for when he activates his instinct mode. So even when I’ve come up, sort of, guarding against his next attack, the cues indicate that I’ve managed it.
As I said, he reads your inputs. Here my execution fails me when I try to get the ultimate, even though I still won. Right. So that is, thank you. Thank you.
So, yeah, and with that we will go on to the top tips. So, as I said earlier, if there is any text, run it through the text tools, sorry. Say for instance you have, I don’t know, a crafting menu in your fighting game, which is not unheard of; Killer Instinct did do this for one of its modes, which is why I reference it. Stereo separation, no matter how close the characters are when they’re fighting, so that you can at least tell who’s on the left or right; that can be very crucial. Unique cues for all important actions.
So, if it has any significance to the player in terms of the outcome of a match, for example, give it a cue. Cues for resources, health gauges, et cetera: any important UI elements that need cues, give those cues as well, but also give them the ability to be adjusted independently.
Because sometimes players might learn to fight with those cues off once they’ve played the game for long enough. For character select, when a character is highlighted, make it clear for player one and player two which character they’ve highlighted, before they have to lock it into place permanently. That extends to mirror matches as well.
So, if I’m playing as the same character as my opponent, make it clear which one of us has actually won the match. So, yeah, we’re going to go into just a few final things. That was a couple of genre-specific examples, but if you have any questions about projects that you guys are working on, please do come and ask me, and ask other people as well, because there are likely things that can be done.
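On the stereo separation tip from a moment ago, here’s a small sketch of one way to handle the hard case: map each fighter’s x position to a pan, but enforce a minimum separation so that even when the characters are touching, you can still hear who is on the left and who is on the right. The numbers and names are made up for illustration; this is not how any particular fighting game actually does it.

```python
# Sketch: pan two fighters across the stereo field with a guaranteed
# minimum left/right separation, even when their positions coincide.

def fighter_pans(x1, x2, stage_width=100.0, min_sep=0.3):
    pan1 = (x1 / stage_width) * 2 - 1   # map [0, width] -> [-1, +1]
    pan2 = (x2 / stage_width) * 2 - 1
    if abs(pan2 - pan1) < min_sep:
        # Push the two pans apart around their midpoint, keeping order.
        mid, half = (pan1 + pan2) / 2, min_sep / 2
        if x1 <= x2:
            pan1, pan2 = mid - half, mid + half
        else:
            pan1, pan2 = mid + half, mid - half
    clamp = lambda p: max(-1.0, min(1.0, p))  # stay inside the stereo field
    return clamp(pan1), clamp(pan2)

# Characters standing on top of each other mid-stage still read as left/right.
p1, p2 = fighter_pans(50.0, 50.0)
```

The trade-off is a slight distortion of true positions at point-blank range, which is usually worth it for never losing track of which fighter is which.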
It matters how much it replicates the experience of a sighted gamer. But if that can’t be done for completely logical reasons, which will sort of be discussed as time goes on, then that can change elements of it. But with KI as well, just to highlight a thing that I should have highlighted before, I will be taking on challengers next door with a setup afterwards. So, if you guys want to find out whether I can play as effectively as a sighted player, then please feel free to come and- >> Just go for it. >> Yeah.
So, I’m happy to discuss any questions that you guys may have about projects you’re working on, even if they’ve already been covered in the genre specifics. I really hope this presentation has given you at least a little insight into how gaming without sight actually works in practice. Accessibility is achievable.
This is the front line. Where do we go from here? The answer, we’ll see.