
Kelly Shuster: Android is for Everyone

In this talk from 360|AnDev, Kelly Shuster talks about making apps everyone can use!

There are currently over 1.4 billion active Android devices worldwide, and as Android expands globally, that number is guaranteed to increase. As developers, we need to start thinking now about how to create applications that serve a diverse range of users. Specifically, we should think about the 1 billion people in the world with disabilities. We’ll talk about who these users are, common challenges they face with existing apps, and what it really means to create an inclusive user experience. Let’s create applications that everyone can use!

Introduction (0:00)

For those of you who don’t know me, I live in Colorado and I actually grew up in Colorado. To be able to speak at a conference in my hometown is really cool.

There are tons of cool things to do in Colorado, and around Denver there are lots of beautiful hikes. If you have any questions or want recommendations, definitely come ask me after the talk. There's a lot of really cool stuff to do here in Denver as well. We have tons of museums. The Colorado History Museum is brand new, and it's really cool. The Clyfford Still Museum in Denver has almost all of Clyfford Still's works from throughout his lifetime.

Next to the Clyfford Still Museum is the Denver Art Museum. This museum got a new addition about ten years ago called the Hamilton Building. It was designed by the architect Daniel Libeskind. He is most famous for his work on the Holocaust Museum in Europe, so we were really excited to have such a well-known architect do this addition for us on our art museum.

It’s really unique. A lot of custom work had to be done to get this structure. You really should visit it. It’s really impressive to stand on the street and have this huge cantilever hanging over the street and onto the other side of the sidewalk. It’s pretty brilliant. This building is definitely a sculpture, a work of art on its own, and it was not easy to build.

But here in Colorado we get a lot of snow. Here’s a picture of the Denver Art Museum in the spring, and you’ll notice where that arrow is, there’s a temporary barrier that’s been set up. This is because of the way that the building’s roof is slanted. Snow will fall off in massive chunks and hit people. This is less than ideal.

Unfortunately, the fact that it snows in Denver was not taken into account when the slant of this roof was designed right in front of the main entrance, so every year they have to put up this temporary barrier so people don’t get smacked by snow.

The Denver Art Museum is really fantastic and has a unique orientation on the outside. To support that structure, it also has a unique interior. In the atrium there’s a huge stairway that you can go up, and in this photo I’m standing about two-thirds of the way up overlooking a balcony.

There are not really orthogonal lines in this building. Everything is slanted. The stairways are slanted. The beams are slanted. The balconies are at awkward angles. While it looks really cool, it’s also kind of freaky.

When the museum opened, the headline in the Denver Post was "Denver Art Museum: It's Dizzying." Many people were literally falling over and getting vertigo if they spent too much time in the central atrium. In fact, the group most likely to feel sick in this museum was the elderly, which awkwardly enough happens to be the largest demographic of visitors to the art museum.

Finally, one of my favorite parts. Because of the awkward structure of the building, there are a lot of weird beams that are coming out at strange angles. Some of the beams become a hazard for hitting your head, so they put these large bump outs underneath the beam. Those bump outs become tripping hazards, as you can see by the scuff marks.

There are also awkward beams that jut out in front of you as you’re wandering around looking at the art and you can see by the drywall that’s chipped on this corner, lots of people have stumbled over this. I watched three people trip over it in the four minutes I was waiting to get a clear picture.

This museum is an impressive feat of design and engineering. No one has ever built a building like this before and no one probably ever will again.

Are you designing for your users? (4:59)

It took a lot of skill and talent to build this building, but you have to ask, “Was the architect designing for his users?”

I think that we do this a lot as software developers. We take a lot of pride in our craft and we’ll spend a lot of time building a custom view or a custom animation or something that just looks really frickin’ cool. It’s exciting to put something in our app that’s beautiful, that delights our users. But what if it’s actually hindering our users? What if it’s beautiful, but it’s causing our users to trip in our app?


As you may know, Material Design is the design language of Android. Material Design is all about bringing the material world to Android. There are pieces of paper floating on top of each other at different elevations, and it's really cool looking.

One of my favorite parts about Material Design is the floating action button. It’s the button that’s in the lower right hand corner of the screen and it floats above all the other content on the screen. It’s really bold to have a button floating above everything else on your screen, and it’s an obvious call to action. It’s hard to not want to push the floating action button, even if you don’t know what it’s going to do.

Now think about the visual hierarchy of this layout. When you look at this app, your eyes are first drawn to the tool bar at the top. Then they’re instantly drawn down to the floating action button, and third, they look at the content on the screen.

People who are blind use an assistive technology called a screen reader, which goes through each element on a screen view-by-view and reads the screen out to them. This is a visual representation of that process: the green rectangle is highlighting the cell that the screen reader will be announcing to the user.

If you’re blind, you’re relying on the screen reader to tell you what’s on the screen and it reads through each element one-by-one. The screen reader reads through the visual hierarchy looking at your layout, your code, from the top to bottom of the screen.

Take a look at this example. I have a list of Pokémon, and as you may know, there are 150 original Pokémon, so this list is pretty long. The person gets to the bottom of the screen as they’re going through on their screen reader, and instead of hitting that action button, the list auto-loads the next few items. 150 Pokémon is a lot of Pokémon.

What if you were looking at an email inbox and it was making an API call at the end of every list to pull in 25 more items? You might never reach the floating action button in your app if you’re using assistive technology. Even if you only have 10 or 15 items, every time somebody using assistive technology wants to access the floating action button, they will have to run through every one of the items in your list.

You’ve now successfully made the most important, the most prominent action in your entire application, completely invisible to a certain set of your users. That’s a bummer. The good news is that there’s a fix.

 fab.setAccessibilityTraversalBefore(R.id.pokemon_list);

You can set the accessibility traversal order so that all assistive technologies can read out items on the screen in the same order as the visual hierarchy for sighted users.

On the floating action button, I can call setAccessibilityTraversalBefore on the Pokémon list. Now if we look at this diagram, the screen reader will read out the tool bar text if there is any, then it will immediately jump to the floating action button and read out that. Then it will start running through the content on the screen.

Now you’ve matched the visual hierarchy of your screen with the assistive technologies’ order that it’s reading it out. This was a one line code fix, and it wasn’t really that difficult.

The problem was we didn’t know that we had users that were experiencing this difficulty using the floating action button. The big question is, who are your users?

Who are your users? (10:00)

Who are the people that might be using your application? Currently around the globe there are 1.4 billion active Android devices. That’s a lot of people. That’s a lot of different types of people using Android. In the world we know that there are one billion people with permanent disabilities. We also know that in the United States, 20% of our population has a permanent disability.

One of these people is Victor Tsaran. He works at Google creating accessible products. He’s also blind. He gave an excellent TED Talk in 2009 at Silicon Valley. At the beginning of the talk he said, “Technology helped me overcome society’s understanding and society’s assumptions about my abilities.”

I think that we make a lot of assumptions about what people with disabilities can and cannot do because we just don’t know them. Victor Tsaran was able to use technology to push him ahead and have an extraordinary career. He’s obviously an extremely brilliant person and technology was able to bridge that gap for him to let him push ahead, even when people assumed that he wasn’t capable of things that he could actually do.

I think that’s really powerful to think about because we, as technology creators, have the power to bridge this gap for one billion people around the world.

Speaking about assumptions, I was giving a talk about accessibility once and someone was talking to me afterwards and they said, “I really enjoyed your talk, but it’s not really relevant to me because I’m working on a fitness app, so I won’t have any blind users.” I think that is a really good example of an assumption that we make about peoples’ abilities.

This woman is Amelia Dickerson. I worked with her at the Blind Institute of Technology as an accessibility QA tester. We would run through our apps together and she would give me feedback on things that she was having issues with accessing, and then I would go fix it in the code.

She’s an extraordinarily brilliant woman. She’s a marathon runner. She’s also blind. Not only is she a marathon runner, but she’s a really talented runner. She currently holds the record for speed in America, of men and women, for the best time for a visually impaired runner. She’s also currently qualifying for the Paralympic Games in Rio this year.

A lot of people ask, “How is this even possible?” When Amelia runs, she runs with a partner. She holds one end and they hold the other end of a short rope, and her partner calls out any potential obstacles or asks people to step aside. “Blind runner coming through at a faster pace than you. Please move.”

If she’s running extremely long distances, she’ll have multiple partners. That way if her partners can’t maintain the same pace as her they’re not holding her back. She runs with a running club to practice before she goes on a race, and there’s a whole community around her that supports her running.

This is really cool. I never would have thought that this is something that’s possible, but it just goes to show if you have a desire and a passion for something, you will work to find a way to be able to do it. We should remember that and not place limitations on other people.

Another person who’s really destroying stereotypes and breaking down assumptions is Jillian Mercado. She is a very successful model and she also has muscular dystrophy and is in a wheelchair. She was the face of the Diesel Reboot campaign, she’s modeled for Target, and most recently, she modeled the merchandise for Beyonce’s Formation tour. This woman is truly one of my personal heroes.

This is Haben Girma. She's a graduate of Harvard Law. She's an attorney and a disability rights advocate. She's also deaf and blind. She was the first deaf-blind graduate of Harvard. This is a picture of her meeting the President of the United States. She has a lot of wonderful talks and writings about disabilities and disability advocacy.

One of my favorite quotes from her is: “Let’s change the narrative from ‘heroic disabled person overcomes obstacles’ to ‘society is inclusive, all can succeed’.”

This year at the Apple Developer Conference, WWDC, she gave a talk where she reiterated this point and she said that someone’s disability is not what holds them back. It’s the society that’s not inclusive of them that holds them back. I feel like that’s really powerful because it really flips the table on the way that I think most of us have thought.

It might be difficult to wrap your head around this at first, and so I’d like to share with you a really incredible example that I experienced of how society can be inclusive.

This April I was in Italy and I had the opportunity to visit the Cinema Museum in Turin. This was the most accessible museum I have ever been to in my life. Every display also had a display in braille. This was a museum about cinema, so a big portion of the museum was about visual content, and everywhere I turned they were able to translate some visual content into a way that would be meaningful to somebody who was visually impaired.

The best example of this was in the section of the exhibit about optics, something that I think a lot of people would assume that someone who is visually impaired might not care about, and might not be able to understand. The museum took the time to address this, and every single display on optics had physical 3-D components that you could touch to understand what was going on.

This specific example is talking about the differences between convex and concave lenses. There is a red building that’s the same size in front of each lens, and then when you look through the lens you would see the building smaller or larger depending on which lens you’re looking through. Behind each of the physical buildings there was also a clear 3-D building that represented how much the building would grow or shrink, so that you could go and feel each building and “see” with your hands how it was changing.

Temporary disabilities (17:38)

A lot of times when we talk about building inclusive and accessible technology, the first thing that people think of is populations that have permanent disabilities, but building inclusive software is so much more than that.

People often assume that the elderly aren’t high technology users so we shouldn’t think about them, but this is wrong. Especially in our modern age, folks are becoming more and more digitally aware because they have to be. Things are presented in digital-only formats.

It’s not just the elderly that we should be thinking about. Currently in the United States there are 150 million people over the age of 40. This matters because, unfortunately, for all of us at the age of 40 our eyes start deteriorating at a much faster rate. There are several different diseases that you can get, and it’s a normal part of life.

Back in the day some people would say, "Well, it doesn't really matter, because people over 40 aren't really high tech users." That's not the case anymore. These are the leaders of Alphabet, Google, Yahoo, Tesla, and Amazon: people I would personally consider to be some of the most tech savvy people in America, and every one of them is over the age of 40. We can no longer ignore the eye problems that people might be having once they're over the age of 40, because these are the most tech savvy users.

Sometimes we’re also not permanently disabled, but we experience a disability in our lives. This happened to me not too long ago. I was at an eye appointment and I got my eyes dilated. I was supposed to text my husband to come pick me up after the appointment. I looked at my phone and I couldn’t read a single thing. I tried holding it far away. I tried holding it really close. It didn’t really matter. Everything was completely blurry.

I was wandering around the mall panicking. I thought maybe I could tap someone on the shoulder and awkwardly ask, “Could you text my husband for me?” Then I remembered that Android has accessibility settings that can blow up the text for you. Through a lot of trial and error, I was able to find the settings after an embarrassingly long time, and I turned them on. Then I was able to text my husband.

This is an example of how all of us, 100% of the population of humans on Earth, are going to experience a temporary disability at some point in our lives. Why shouldn’t our software be able to help us?

Here’s another example of being temporarily disabled. This is my brother-in-law, Edgar. He’s father of the year, as you can see here, with my adorable nephew. He’s feeding my nephew, he’s also eating dinner himself, and you can also see that he has his phone out. He doesn’t really have his hands available to use his phone at all right now, but thank goodness for voice search and voice activated software.

At Google I/O this year they announced that 20% of all Google searches are now done exclusively through voice search. This goes to show that a lot of us want the convenience of hands free access because we don’t always have our hands accessible to us.

One interesting example of inclusive design is with Facebook. Facebook released a feature a while ago, and they were like, “We really want to entice people into staying on Facebook. Now when you scroll through your feed, we’re going to start automatically playing our videos, but we’re not going to turn the sound on because that’s kind of rude. So we’ll just start playing the video and we totally know we’re going to suck you in and then you’re going to start actually playing the video with sound.”

Interestingly enough, that was not what happened. 85% of all videos on Facebook are watched in complete silence. Whether it's because you're at work and you don't have headphones, or you just can't be bothered to turn the sound on, people will watch the entire video but they'll never turn the sound on. Facebook didn't realize that 85% of its user base was essentially deaf, but that's what's happening here.

Content providers are now changing the way they present videos to be more inclusive of this user base. People have started to put text captions in their videos because they know for a fact no one’s turning on the sound, but they still want to tell their really cool story.

Here’s an interesting thing that I experienced at Google I/O this year. The event was at an outdoor venue and a lot of the booths were set up in a parking lot. There was a ton of electrical wiring and cables running from one booth to another. So people don’t trip or accidentally unplug these wires, they had these devices set up on the ground, they’re called linebackers. These cords run through the linebackers so people don’t trip on them.

On your right you can see two regular linebackers side-by-side containing the cords, and on the left you can see an accessible ramp for a wheelchair. I noticed an interesting thing throughout the conference: everyone was tripping over the divot in between those two regular linebackers, and I saw a lot of people consciously walk around to use the accessible ramp instead.

A lot of times when we’re designing things both in the physical world and the digital world, the more accessible user experience is just better for everybody. We see this in Android.

A lot of the problems that we ran into when I was working with the Blind Institute of Technology and they were testing our app, interestingly enough, happened on screens where we had web views. The screen reader was having a difficult time jumping into the web view, then it had a difficult time jumping out. But we all know as Android developers that web views aren't a great solution; it's better to have native components anyway.

Android Nougat (24:49)

There are many really exciting things coming with Android Nougat. One of these exciting things that you really should be aware of is that when people get a new phone and they’re setting it up for the very first time, there are now a lot of really awesome prompts to help people know about the accessibility settings and help them turn them on from the beginning.

I happened to be lucky enough to know that there was an "increase text size" option in the accessibility settings, but I think a lot of people wouldn't even know it's there. That means that you as a developer need to be really conscious of this.

With this new operating system it’s going to be even more likely that people are using assistive technology with your app, so you’ll definitely want to test it to make sure that it’s working. One of my favorite things in this new operating system release is this new screen zoom. This really would have helped me out back when I had my eyes dilated. Instead of just increasing the text size, you can now zoom in through the whole screen.

You shouldn’t have to make any changes to your existing code for this to work, but you should test your app with this to make sure that it looks okay. You should have all of your layout dimensions in DP units and text in SP for this to work. If anything is in just straight pixels it’s not going to work. Make sure that your app still looks good in this setting.

I’ve mentioned some of the assistive technologies that we have on Android, but I’m going to take a little bit of time to show you each one.

Here is an example of me running through a list using the screen reader like I demonstrated earlier. TalkBack is the name of the Android screen reader and it will read out each view element on the screen. The user can swipe their finger left or right to move the cursor up and down and double tap to select any item.

Switch Access is a similar concept. There’s also a cursor that runs through each view, but instead of being a voice activated assistive technology or assistive technology that’s reading out content, Switch Access is controlled by a hardware switch. Someone who doesn’t have a lot of dexterity might have a simple switch that they’re using to move through your app. Someone who is paralyzed will have a switch on the head of their wheelchair, or maybe they’ll even have a straw that they’re sipping and puffing into, to control the access of their smartphone.

BrailleBack allows you to plug in a braille keyboard so that instead of hearing a voice read content to you, you can read the content on your own on your braille keyboard, and also easily type on it.

Voice Access is new. It’s really cool, you should check it out. This allows you to control your entire phone with your voice, so you can open up a specific app. You can go through the different items in your app and open up an email. It’s really powerful and really fun to play with so you should try that out.

Finally, Accessibility Scanner. This is a new app that was released by Google and it is really awesome. One of the big challenges of creating accessible technology is that we just don’t know what our users expect. It’s hard to know what they want. It’s hard to know where we’re going wrong. Sometimes it’s hard to learn how to use an assistive technology if you’ve never used it before.

Accessibility Scanner solves a lot of these problems for you. You download the app, you turn it on in accessibility settings, and then you start running your own app. For each screen, Accessibility Scanner will go through the whole screen and pick out any items that are designed inaccessibly.

For example, take these top two buttons; the title says "Buttons That Do Stuff." These buttons have been highlighted because they don't have what's called a content description. If I was using a screen reader and I got to that first button, it would say, "Unlabeled button 65." If I got to the second button it would say, "Unlabeled button 14." I would have no idea what these do, because they're just pictures and no one took the time to label them. Even worse, the next time I come into the app, those buttons will be given different numbers. It's not consistent, so you can't memorize a pattern like "74 is the trashcan."

What you want to do is put a content description on this image button. It's super easy: in XML, it's android:contentDescription= followed by your string. If you didn't know that's what you needed to do, no worries. Tapping on that orange highlight in Accessibility Scanner will tell you what's wrong and will lead you to the developer documentation that shows you how to fix it.
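
Here's what that looks like in a layout, including the decorative case where you explicitly set @null. The ids, drawables, and strings are hypothetical:

```xml
<!-- A meaningful action gets a real, localized description -->
<ImageButton
    android:id="@+id/button_delete"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/ic_delete"
    android:contentDescription="@string/delete_item" />

<!-- Purely decorative images are explicitly skipped by the screen reader -->
<ImageView
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/divider_flourish"
    android:contentDescription="@null" />
```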

Since we’re here chatting, I’ll tell you how to fix some of these others. If you take a look at those check boxes, example touch targets, they’re too small. In Android we should be following the 48DP rhythm, which means that your touch targets should not be any smaller in width and height than 48DP.

The next example has text contrast issues. It’s much too light. It’s really difficult to see that light gray text against this white background.
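
To make "too light" measurable: WCAG 2.0 defines a contrast ratio between two colors, computed from their relative luminance, and recommends at least 4.5:1 for normal body text. A small self-contained sketch (the class name and the sample colors are my own):

```java
public class ContrastChecker {

    // Convert one sRGB channel (0-255) to linear light, per WCAG 2.0.
    private static double linearize(int channel) {
        double c = channel / 255.0;
        return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
    }

    // Relative luminance of a packed 0xRRGGBB color.
    private static double luminance(int rgb) {
        double r = linearize((rgb >> 16) & 0xFF);
        double g = linearize((rgb >> 8) & 0xFF);
        double b = linearize(rgb & 0xFF);
        return 0.2126 * r + 0.7152 * g + 0.0722 * b;
    }

    // WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05), from 1:1 to 21:1.
    public static double contrastRatio(int rgb1, int rgb2) {
        double l1 = luminance(rgb1);
        double l2 = luminance(rgb2);
        double lighter = Math.max(l1, l2);
        double darker = Math.min(l1, l2);
        return (lighter + 0.05) / (darker + 0.05);
    }

    public static void main(String[] args) {
        // Black on white is the maximum possible contrast, 21:1.
        System.out.printf("black on white: %.1f%n", contrastRatio(0x000000, 0xFFFFFF));
        // A light gray such as #AAAAAA on white lands well under 4.5:1.
        System.out.printf("gray on white:  %.1f%n", contrastRatio(0xAAAAAA, 0xFFFFFF));
    }
}
```

For instance, #AAAAAA text on a white background comes out at roughly 2.3:1, which is exactly the kind of gray-on-white combination the scanner flags.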

Finally, this is another image button that does not have a content description with it. You should be putting content descriptions on any of your images or image buttons that have some meaningful action or convey meaning to your users. If there’s not a meaningful item, if it’s just there for decoration, then set your content description to @null so that the screen reader explicitly knows to skip this item.

Once you know what you’re supposed to do and you start learning from Accessibility Scanner the types of things you should be doing, it’s a good idea to enforce it through testing because a lot of times we just forget. We’re not experiencing the pain points when we run through our app on our own smoke tests, so we forget that other users might be experiencing these pain points.

The first thing I always do is I turn all accessibility lint warnings to errors so my build fails if I forget to add an accessibility piece. This isn’t super intelligent because it won’t tell me if my content description is meaningful or if the thing I have added makes any sense, but it’s a good first level check.
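
One way to do this is with a lint.xml at the project root that promotes the accessibility checks to build-failing errors. A sketch: ContentDescription and LabelFor are built-in lint issue ids, but verify them against the issue list for your lint version:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<lint>
    <!-- Fail the build when an ImageView or ImageButton has no contentDescription -->
    <issue id="ContentDescription" severity="error" />
    <!-- Fail the build when a text field has no label associated via labelFor -->
    <issue id="LabelFor" severity="error" />
</lint>
```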

A more intelligent way to check is through Espresso. This is the automated UI testing framework for Android. You want to make sure that your content descriptions are meaningful, and with Espresso you can make sure that the content description for an image is what you expect it to be.

Espresso is also great for testing dynamically changing content descriptions. For example, if your app has multiple states, maybe it's a media player and you have a play and pause button. As that image toggles between play and pause, so should the content description behind it. Using Espresso you can manage this and make sure that you are updating your content description appropriately.
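
As a concrete sketch of that kind of check, the test below asserts that a media button's description tracks its state. The view id button_play_pause and the "Play"/"Pause" strings are hypothetical stand-ins for your own app's names, and this runs on a device or emulator like any other Espresso test:

```java
import static android.support.test.espresso.Espresso.onView;
import static android.support.test.espresso.action.ViewActions.click;
import static android.support.test.espresso.assertion.ViewAssertions.matches;
import static android.support.test.espresso.matcher.ViewMatchers.withContentDescription;
import static android.support.test.espresso.matcher.ViewMatchers.withId;

import org.junit.Test;

public class PlayButtonAccessibilityTest {

    @Test
    public void contentDescription_tracksPlaybackState() {
        // Before playback starts, the button should announce itself as "Play".
        onView(withId(R.id.button_play_pause))
                .check(matches(withContentDescription("Play")));

        // After tapping, the same view should now announce "Pause".
        onView(withId(R.id.button_play_pause)).perform(click());
        onView(withId(R.id.button_play_pause))
                .check(matches(withContentDescription("Pause")));
    }
}
```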

Another thing that I hear from folks when we're trying to get an application to comply with accessibility standards is that it's going to look ugly. There are a lot of guidelines about contrasting colors, for people who have difficulty seeing contrast and for people who are color blind, and about colors that you shouldn't use together. That doesn't mean that your app has to be ugly.

This is a great website, spelled with A-one-one-Y because that's the abbreviation for accessibility. The site curates beautiful color combinations that work with all types of color blindness and that also work in terms of contrast. You can go to the site and up-vote or down-vote different color combinations, and also cycle through the most popular ones to get an idea of what to use in your app.

Conclusion (33:58)

Molly Watt is an accessibility advocate who lives in the UK. She’s also deaf-blind. She wrote a great blog about different accessibility features that she wishes that software would have and include automatically.

She said, “The accessibility features I speak of would help huge numbers of people, so why are these things not taken into account by developers right at the beginning?” Why aren’t they? They have the power to change peoples’ lives.

Jennison Asuncion, the lead accessibility developer for LinkedIn, was also asking the same question. Was it too technically difficult? Did it take too much time? Did people think that it wasn't important because it wasn't a large enough user base? He put out a survey and asked developers, "What is the number one thing keeping you from creating accessible software?"

The answer was this: Lack of awareness and education. It’s not technically difficult. It doesn’t take that much time. It’s just that you don’t know. You don’t know the problems your users might be experiencing, and you don’t know how to fix those problems in your own software.

Well, good for you guys. You already know because you came to this talk, so you can't use this as an excuse anymore. As I mentioned at the beginning of the talk, there are 1.4 billion active Android devices all over the world. Android truly is the world's phone, and something that's super exciting to me about being an Android developer is that a lot of people around the world are coming to Android as their first device, their first computer. Their first experience with the digital world is going to be Android.

That’s a lot of power for Android developers. It’s really exciting. How cool would it be if somebody’s first encounter with technology had a baseline of inclusive apps? An ecosystem of inclusive software, and that was their standard for how software should be written?

It doesn’t matter if you’re writing a game app or a banking app or a coupon app. All of them can be improved and be made more inclusive to create a better software ecosystem for the entire world. Please take this into account and use Accessibility Scanner and fix some things in your app to create a more inclusive society and a more inclusive digital ecosystem for Android. Because truly, Android is for everyone.

About the content

This talk was delivered live in July 2016 at 360 AnDev. The video was recorded, produced, and transcribed by Realm, and is published here with the permission of the conference organizers.

Kelly Shuster

Kelly Shuster is an Android Developer at Ibotta and a Google Developer Expert for Android. She holds a B.S. in Electrical & Computer Engineering, and worked as an embedded firmware engineer prior to her career in mobile development. Kelly is currently the director of Women Who Code Denver, and enjoys sharing technical knowledge wherever she goes, from local meetups to international conferences. When not programming, she can be found playing in the Rocky Mountains.
