
Android Developer Options Deep Dive

If you enjoy talks from 360 AnDev, please support the conference via Patreon!

Testing an app on your perfect Pixel device under perfect network conditions works great, but you’re getting reports of strange behavior from the wild and weird crashes that you can’t reproduce. This talk will walk through some of the more complex options on the Developer Options screen in the Android OS and discuss how a developer can use them to debug problems in their application. Topics covered include: showing surface updates to help you find layout loops in your app, showing layout bounds to get your views pixel perfect, and aggressively killing processes and activities to ensure that your app performs well under stress. We’ll cover these options and more as we dive deep into debugging apps.


Introduction

Your phone has a large number of debugging features; some of them you’ve carried around unknowingly for years. I will talk about the developer options section of the Android OS, and how you can use it to tackle a variety of problems in your app.

My name is Andrea Falcone. I am a software engineer at Google, where I work on Fabric. We were acquired by Google in January of this year, and we’re working with the Firebase team. I’ve been working on Crashlytics for five years; I had my five-year Crashiversary two days ago, and I joined before the acquisition and before the development of Fabric. I spent that time working on the original Crashlytics Android SDK, the Beta by Crashlytics Android app, as well as tooling to support Android developers, e.g. our Android Studio plugin and Gradle plugins. I also extended the Android support in our open source tool called fastlane, which automates your release process.

I believe in empowering developers to build better apps, in the work that I do professionally as well as in giving talks at conferences on that topic. I’ve done Android work, but not a ton of Android app work; the Beta by Crashlytics app is the only app I’ve worked on, and I have little experience in that area because I work on tools. I wrote this talk because I wanted to know more about debugging actual apps, and I thought that focusing on something available to everyone with an Android device or emulator was a good place to start.

My goal is to show other developers how a tool that you hold in your hand or have in your pocket all the time can help you solve problems, without needing a USB cable or whatever the latest tool is.

Developer Options

There are many options in the developer options screen in the OS. I’ll only cover some, since it’s easy to get overwhelmed. They do vary from OS version to OS version, so some things are new in N and O. My hope is that everyone will learn something new that makes development easier for you and apps better for everyone else.

“Take bug report”

Have you ever heard of a bug in your app from a user? Maybe someone’s told you about weird behavior, something that you can’t reproduce, and it wasn’t a crash, so you weren’t able to track it in any reporting system.

The Take Bug Report developer option is the “give me all the info” option. It’s been available in the OS for quite a while, but the interactive option is newer. When you pick this option you get a pop-up asking you to choose either an interactive report or a full report. What it’s going to do is collect a crapload of information about what’s happening on the device and email it to you.

It takes several minutes to grab all the stuff that it needs. Then it opens the email client so you can send off a text file containing about 200,000 lines, roughly 20 MB of interesting stuff.


There’s a screenshot button. A user can give you six screenshots of how they ran into a problem, as well as describe it. This gets all bundled up in a zip file that can be emailed off.

After you’ve asked the user to send you this bug report, you get an overwhelming amount of text. There are a lot of outputs; most are the results of running various Unix commands on the device. But let’s look at some interesting things.

You can see the uptime: how long has this device been running? You get stats about the internal storage, and about memory (how much total memory, free memory, cached and active memory). Maybe this problem only happens in low memory situations. You can learn about the processes and threads that are currently running.

There’s also app activities: a list of all of the activities on the device right now. Not just your app’s, but all the activities, as well as the active fragments and their entire view hierarchies, are laid out in this text file.

It can be hard to parse, but if you are good with find or other command-line search tools, you should be able to narrow it down to the bits that you care about. The same goes for services and content providers running on the device; you’re able to see all of that too.

There’s a system log, logcat, that gives you the full output. When I tested this, I got about an hour of backlog. You can see events that led up to the problem, as well as recent stack traces for any ANRs, with the VM traces. It’s a window into what’s happening on the system around the time a bug occurs; not necessarily a crash, just some behavior in your app that you can’t otherwise explain.

“Allow mock locations”

Have you ever needed to reproduce a bug that involved location? If your app is primarily location based, you’ve got more sophisticated tools than this one here. But this is great for a first pass at debugging and diving into GPS-aware components. It’s also great for stealing Swarm mayorships. ;) You can set your location to a different place, keep checking in, and get those crowns.

I thought “I’m going to turn this on and it’s going to work”. I was surprised that it was just a toggle; there was no prompt, nothing telling me where to put in my location. In Android N they renamed the option to “Select mock location app”, which is much more accurate to how this works. You click on it, and you get a list of apps that declare the ACCESS_MOCK_LOCATION permission in their manifest. These are apps that use a specific API to mock out location. You can download a variety of these apps from the Play Store, or you can make one yourself using the setTestProviderLocation API.

There are other ways to mock location. On an emulator it’s simpler: you can use the location console in DDMS, or you can telnet to the emulator and use the geo fix command. Those don’t work on actual devices, but if you’re willing to use an emulator for this, it’s easier than taking a road trip with your physical device.
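
If you’d rather build the mock-provider app yourself, the heart of it is only a few calls on LocationManager. This is a minimal sketch, not code from the talk; the MockLocationFeeder name and the provider flags are my own choices:

```java
import android.content.Context;
import android.location.Criteria;
import android.location.Location;
import android.location.LocationManager;
import android.os.SystemClock;

// Feeds a fake GPS fix to the OS. The app must declare
// android.permission.ACCESS_MOCK_LOCATION in its manifest and be chosen
// under "Select mock location app" in developer options.
public final class MockLocationFeeder {
    public static void setLocation(Context context, double lat, double lng) {
        LocationManager lm = (LocationManager)
                context.getSystemService(Context.LOCATION_SERVICE);

        // Register a fake GPS provider; the capability flags are illustrative.
        lm.addTestProvider(LocationManager.GPS_PROVIDER,
                false, false, false, false,   // network/satellite/cell/cost
                true, true, true,             // altitude/speed/bearing
                Criteria.POWER_LOW, Criteria.ACCURACY_FINE);
        lm.setTestProviderEnabled(LocationManager.GPS_PROVIDER, true);

        Location loc = new Location(LocationManager.GPS_PROVIDER);
        loc.setLatitude(lat);
        loc.setLongitude(lng);
        loc.setAccuracy(1.0f);
        loc.setTime(System.currentTimeMillis());
        loc.setElapsedRealtimeNanos(SystemClock.elapsedRealtimeNanos());

        lm.setTestProviderLocation(LocationManager.GPS_PROVIDER, loc);
    }
}
```

Pick the app under “Select mock location app”, call setLocation with the coordinates you want, and location-aware code on the device will see that fix.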

“Wait for debugger”

Who’s ever had a bug in the onCreate method, at the startup of your app? You have to keep launching Android Studio’s debugger to catch it at the beginning. You can’t just attach once the app is running, because your breakpoints are at the beginning of your app and have already been passed.

You can use the “wait for debugger” option. It becomes available once you’ve set a debug app. Similarly, in Android Studio you can pick which app you’re going to debug at the bottom of the Android Monitor, and at the top you’ve got attach to a running process. That’s similar, but this option gets you before the app starts up; once the app has started, you’ve possibly gone past the bug you’re trying to capture. If you turn this on, the app halts and waits for the debugger, so breakpoints set in those early parts are hit every time the app loads. It makes it easier to catch buggy behavior in your onCreate.

If you’re willing to change code in the app, you can call the waitForDebugger method as well, with the same behavior; you’d call it before the breakpoint that you’re looking to reach. It’s easier to do with the option on the device you have, but this gives you another way.
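
For reference, the call in question is android.os.Debug.waitForDebugger(). A minimal sketch of dropping it into an activity’s startup path:

```java
import android.app.Activity;
import android.os.Bundle;
import android.os.Debug;

public class MainActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Blocks here until a debugger attaches, so breakpoints in the
        // rest of onCreate are reliably hit. Remove before releasing.
        Debug.waitForDebugger();
        // ... the startup code you're trying to debug ...
    }
}
```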

“Show Taps”

Have you ever taken a video or screen capture, or maybe received one from a user, and found it hard to see what was happening? The “show taps” or “show touches” option gives you a white circle that shows where the user is currently touching. It works on the emulator as well as on a real device.

Let’s look at a video. In the first one, you don’t know where to look as a user; you can’t figure out how the user activated search or what they were doing to scroll. But in the second video there’s, subtly, this white circle showing where the touch position is, and you get a better sense of what the user is doing.

I also like this for making screen captures of an app to show people how it works. You can pitch it better this way, for putting a demo up somewhere or tweeting out a video.

“Pointer location”

In a similar vein, but with more technical details, is the pointer location option. You get crosshairs for the exact touch location. You can see if the element that you expect to be touched on screen is in that location, or if it’s not activating because either you’re not touching in the right place or the element’s not the size that you think it is. You also get a trail so you can see the movement that the OS captured for the touches.

The color of the trail changes with the speed of the movement. The red dots are the touch points, and the blue line is traced between them. You can tell the speed by how far apart the touch points are: the more red you see, the slower the user was going; the more blue, the faster they were going, and the fewer touch points were captured in that span of time.

At the top there’s even more densely packed information. You have the number of touchpoints that are currently active, and the number that have been captured since the last time you touched. That whole touch and move that I did there was one capture.

It would show you how many were active; I took this screenshot while none were active because I needed my hands to take the screenshot. You also see the x, y coordinates when a touch is active, or the dx/dy, the delta, the change in the x and y direction from where you started touching to where you ended. And the velocity in the x and y direction: how fast were you moving? How quickly did you change locations?

There’s also information about pressure and size on there. Those can’t be truly captured on devices that aren’t pressure-sensitive, so they’re estimated on most devices. You can use this information to find bugs in your sketching app: does what you’ve drawn on screen not look like the input the OS received? Or if you have custom gestures: why am I not seeing this thing that’s supposed to be a flick up? Maybe the user moved too slowly.
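
If you suspect a flick is being rejected because the user moved too slowly, you can log the same velocities the OS sees using VelocityTracker. A minimal sketch; the FlickProbe name and the logging are illustrative, not from the talk:

```java
import android.util.Log;
import android.view.MotionEvent;
import android.view.VelocityTracker;
import android.view.View;

// Logs the velocity of each gesture so you can compare what the OS
// captured against what your flick detection expects.
public class FlickProbe implements View.OnTouchListener {
    private VelocityTracker tracker;

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                tracker = VelocityTracker.obtain();
                tracker.addMovement(event);
                return true;
            case MotionEvent.ACTION_MOVE:
                if (tracker != null) tracker.addMovement(event);
                return true;
            case MotionEvent.ACTION_UP:
                if (tracker == null) return false;
                tracker.addMovement(event);
                tracker.computeCurrentVelocity(1000);  // pixels per second
                Log.d("FlickProbe", "x=" + tracker.getXVelocity()
                        + " y=" + tracker.getYVelocity());
                tracker.recycle();
                tracker = null;
                return true;
        }
        return false;
    }
}
```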

“Show layout bounds”

Have you ever needed to know where one view stops and another one starts? Or maybe you tried to figure out why two things are laid out or spaced apart from each other on screen when you think they should be closer together. Show layout bounds can help with that.

When you turn this on, for each view you’re able to see the margins (the pink area around them); some views don’t have margins. The red lines are the optical bounds. The blue lines are the clip bounds. The difference between optical and clip bounds is about things like shadows: optical lets you see the bounds of where the element visually is. If two elements have a lot of space between them where you’re not expecting it, you can figure out, for example, whether the margin is inside view A or view B. That tells you what you need to adjust; looking at it without the bounds turned on, you’re not necessarily able to get that information.

You can also see the touch area for a particular view – the area that activates the text fields. The individual rows are bigger than the text itself, but without this on you don’t necessarily see the area that you can touch to activate the view.

Something that’s new in O is that if you haven’t defined any focus state colors, you’re going to start getting default colors for focus states in your app. You can disable that by turning off the default focus highlight. But you can also use show layout bounds to tell which view has focus.

In the first one (see slides), the text field has the focus; in the bottom one, the OK button had the focus at that time. You could tell in that case because there is, subtly, a highlight color for that element, and for the text field you could see that the cursor was there. There are plenty of cases where you may not have a visual distinction and you need to know what has focus. Turn this option on and you’re able to find out.
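
If you want O’s default focus highlight off for a specific view rather than everywhere, there’s a per-view switch on API 26+. A minimal sketch; the helper class is hypothetical:

```java
import android.os.Build;
import android.view.View;

public final class FocusUtils {
    // Opt a single view out of O's default focus highlight (API 26+).
    public static void disableDefaultFocusHighlight(View view) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            view.setDefaultFocusHighlightEnabled(false);
        }
    }
}
```

The same switch is available in XML as android:defaultFocusHighlightEnabled.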

“Force RTL layout direction”

Confirming that your app is properly internationalized involves different components. But if you’d like to see whether you’ve properly supported right-to-left languages, you can turn on force RTL layout direction. This toggle forces things to lay out from right to left instead of left to right, even if the language that you’re using isn’t typically laid out right to left.

You could switch the device language to something that is laid out in that direction. However, I find it difficult to navigate a device in a language that I can’t read; not just in my app, but in the settings too, where there are no icons, I can’t tell the difference between items, and I wouldn’t know how to turn this option back off.

You can turn it on, and things start laying out right to left. I found at least one place in the developer options screen itself where the start and end padding hadn’t been used correctly; they had used right and left padding. The System header moved to the right, but the Debugging header stayed on the left.

The same thing could happen in your own app: you might find an element, or sections of elements, that aren’t properly using start and end padding because they’re using right and left padding instead. Turn this on and make sure your whole app looks the way you expect when it’s laid out right to left.
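
The usual fix is the relative padding APIs, which map start and end to the correct side for the current layout direction. A minimal sketch of the programmatic version; the helper class is hypothetical:

```java
import android.view.View;

public final class PaddingFix {
    // setPaddingRelative maps "start" to left in LTR and to right in RTL,
    // so spacing flips correctly under force RTL layout direction.
    public static void applyRelativePadding(View view, int startPx, int topPx,
                                            int endPx, int bottomPx) {
        view.setPaddingRelative(startPx, topPx, endPx, bottomPx);
    }
}
```

In XML, the equivalent is android:paddingStart and android:paddingEnd (plus android:supportsRtl="true" in the manifest) instead of android:paddingLeft and android:paddingRight.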

“Debug GPU overdraw”

If you’d like to find drawing hot spots in your app, you can use Debug GPU Overdraw. This is going to tell you when the same place on screen has been drawn over multiple times by different views.

This option overlays different colors when a view has been drawn over. When an element shows no tint, whatever color it’s supposed to be, it hasn’t been drawn over at all.

As you start to get to green and pink and red, those areas have been drawn over multiple times; you have multiple layers. Something is drawing a background in that green section, but when you look at the settings screen you don’t necessarily see that; they’re just two white backgrounds. But your app has had to do more work to render that green part, and the pink bits around those text fields and icons, than it did for the bottom half.

On one screen maybe that isn’t an issue, but think about your whole app: you could be doing more work than you need to. You can use other tools, like Hierarchy Viewer, to help you reduce layers. You want flatter hierarchies; in general, the lower the z-index the better. You want overlapping things to have transparent backgrounds; you don’t want to keep drawing backgrounds on top of backgrounds that you’re never going to see.
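
One common small win, assuming your layout already paints an opaque background, is dropping the theme’s window background so it isn’t painted underneath everything. A minimal sketch; the activity and layout names are hypothetical:

```java
import android.app.Activity;
import android.os.Bundle;

public class FlatActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_flat);  // hypothetical layout
        // The theme's window background is painted first; if this layout
        // already draws an opaque background, the window background is a
        // full-screen layer of pure overdraw. Dropping it removes a layer.
        getWindow().setBackgroundDrawable(null);
    }
}
```

Verify with the overdraw overlay afterwards that nothing on screen relied on that background.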

You can also check this with a lint check, to make sure you don’t have overdraw regressions in your app. You could run that lint check on every commit or during CI.

“Window animation scale”

If you have custom animations in your app and they don’t look right, take a closer look. You can scale the animation speed to see what they’re doing.

The scale is backwards from what you might expect: it scales the time an animation takes, not its speed. A 0.5x scale is half the time, so twice as fast; a 10x scale is ten times the time, so much slower.

Let’s look at normal window animations – different apps coming in and out, and how fast they typically come in.

Let’s look at them at 10x, which is going to be very slow. If something is moving differently than you expected, you’re able to see, slowed down, what at normal speed is maybe only a flicker.

You can also turn window animations off and see if you have buggy behavior. When I was working at Twitter, I found a bug in the Periscope app where they had bound some app behavior (e.g., navigating backwards through the back stack) to animation completion.

The animation didn’t happen because it was turned off on my device for whatever reason, so it never completed, and navigation never happened. I would click back, or was doing something, and couldn’t get out of a specific screen; I had no idea why. I kept writing bug reports. I had this option set to off; I didn’t have any window animations, and the app was broken under these conditions, which are perfectly reasonable user conditions.
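
One defensive pattern is to check whether animations are enabled before gating behavior on an animation-end callback. This is a sketch of my own, keying off Settings.Global.ANIMATOR_DURATION_SCALE, not the fix Periscope shipped:

```java
import android.content.Context;
import android.provider.Settings;

public final class AnimationUtils {
    // ANIMATOR_DURATION_SCALE is 0 when "Animator duration scale" is off,
    // so animations never run and end callbacks may never fire.
    public static boolean areAnimationsEnabled(Context context) {
        float scale = Settings.Global.getFloat(
                context.getContentResolver(),
                Settings.Global.ANIMATOR_DURATION_SCALE, 1f);
        return scale != 0f;
    }
}
```

If this returns false, skip the animation and run the completion logic directly.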

Window, transition, and animator duration

You may have noticed that there were a couple of different options on the main screen: window animation scale, transition animation scale, and animator duration scale. These are different types of animation. Window animations are different apps coming in and out.

You also have transition animations, which happen between activities in your own app. That is the transition we just saw slowed down.

There’s also the animator duration scale. This covers smaller animations that happen inside the app. If you watch the tabs while this one is running, watch how slow the ripple goes.

If you have custom animations and something’s wrong, or you’re trying to build out something cool and you can’t see it at normal speed, slow them down and take a look at the individual frames.

“Show surface updates”

Show surface updates is going to show you parts of your app that are doing work that isn’t visible to the naked eye.

In the first video, you’re going to see that the whole screen lays out, and eventually it settles and doesn’t flash pink anymore. Turn it on, it flashes a little bit, and then stops; it’s done and not doing any more work.

If you look at the second video, the dialog continues; it’s animating, it’s doing work, and the flashing keeps going while it animates. In the Beta by Crashlytics app, we had this option turned on and noticed that part of the screen kept flashing pink, but nothing was obviously animating.

In the app we had spinning loaders, but data had already filled in on the screen and the loading indicator had been set to invisible rather than gone. It was still there doing work; you just couldn’t see it. I’ve benefited from turning this on in my own apps, looking through them, and making sure they’re not doing work when they’re not supposed to.
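
The fix was a one-liner: use GONE, which removes the view from layout and stops the work, instead of INVISIBLE, which only hides it. A minimal sketch; the helper class is hypothetical:

```java
import android.view.View;
import android.widget.ProgressBar;

public final class LoaderUtils {
    // Once data has loaded, remove the spinner from the layout entirely.
    public static void hideLoader(ProgressBar loader) {
        // INVISIBLE keeps the view attached, and an indeterminate spinner
        // can keep animating (and doing work) while unseen.
        // GONE removes it from layout and stops that work.
        loader.setVisibility(View.GONE);
    }
}
```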

“Show hardware layers updates”

If you’re looking to debug hardware-backed views in your app, show hardware layers updates is similar to the previous option. You can see which elements on screen are rendered using the GPU. If your app is behaving properly, you should only see brief green highlights, not things that are constantly updating.

“Profile GPU rendering”

Have you ever had some jank in your app? If you don’t know where to start looking to improve performance, you can turn on the profile GPU rendering option. This will give you a good place to start.

It’s going to show you how long it takes to render the frames of your app. You have a choice: show it on screen as bars, or get it through the ADB shell. I wanted to keep you on the device; no USB cables for us. Time is laid out on the x-axis, and the amount of work done per frame is on the y-axis. As you scroll and start doing stuff, you can watch the bars change.

There’s this magic green line, which represents the 16 millisecond target. A frame that takes longer than 16 milliseconds to render falls outside 60 frames per second, the rate we consider smooth and not janky for users. Whenever a frame goes beyond that green line, you’ve missed a frame, and your users could be seeing stuttering images.

There are different colors indicating how the OS is handling the input and responding to it. There’s more information in the developer documentation, where a whole section is dedicated to using this option, reading the output, and figuring out how to fix those problems.
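
If you want the same frame timings in code rather than as on-screen bars, API 24+ exposes per-frame metrics on the window. A minimal sketch; the FrameWatcher class and the 16 ms threshold are my own framing:

```java
import android.app.Activity;
import android.os.Build;
import android.os.Handler;
import android.util.Log;
import android.view.FrameMetrics;
import android.view.Window;

// Logs frames that blow the 16 ms budget (API 24+).
public final class FrameWatcher {
    private static final double BUDGET_MS = 16.0;

    public static void watch(Activity activity, Handler handler) {
        if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) return;
        activity.getWindow().addOnFrameMetricsAvailableListener(
                (Window window, FrameMetrics metrics, int dropCount) -> {
                    // TOTAL_DURATION is reported in nanoseconds.
                    double totalMs = metrics.getMetric(
                            FrameMetrics.TOTAL_DURATION) / 1_000_000.0;
                    if (totalMs > BUDGET_MS) {
                        Log.w("FrameWatcher", "Janky frame: " + totalMs + " ms");
                    }
                }, handler);
    }
}
```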

My intention was to show you that it’s here, let you turn it on for your app or other apps and go, “I feel the jank, I see it,” and then have some empathy to want to fix it, with that green line as the benchmark to get down under.

“Simulate color space”

What if you wanted to test accessibility in your app? Simulating color space is a neat way to test whether people who are color blind can differentiate colors or textures within your app. It became available in Lollipop. There are several types of color blindness that affect people, where they can’t see the difference between certain shades or colors.

When you turn on simulate color space, you’re able to see that there are multiple different ways to transform your UI, to test your UI, for the ability to compare different elements.

I took my own home screen and turned on this option. You can’t capture the color output with the ADB screen capture tool; I had to take another device and take a picture of my phone. All the other developer options we’ve discussed show up in the built-in screen capture tools, but the layer at which these colors are applied is not transferred in screenshots.

If you look at some of the icons, you can see that red and green (which should be red and green here) are displayed as the same color. If those elements were next to each other, a user wouldn’t necessarily be able to see the difference between them. If you had buttons laid out that were red and green, the boundary between them wouldn’t be obvious. This is one quick on-device way of testing whether your app is accessible to all users, but there are many other ways.

“Don’t keep activities”

Have you ever tried to debug issues in your app that were maybe only impacting users who have lower memory devices or happening in low memory conditions?

As an Android developer, I often have higher-end devices with plenty of memory, so I don’t run into some of the problems that people with real devices out in the world do. One way you can simulate this, and give yourself some empathy for folks with lower-end devices, is to turn on the don’t keep activities option. It destroys every activity as soon as you leave it, as if the OS had killed it to reclaim that memory.

It can simulate unpredictable lifecycle behavior. Anywhere you’re dealing with onSaveInstanceState, you’re going to be able to find bugs with this option turned on. If you want to run through these cases more frequently or predictably, turn on “don’t keep activities”; things will start getting killed off and you’ll be able to test this behavior.
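
Here’s the kind of save/restore path this option exercises; with “don’t keep activities” on, it runs every time you leave the screen instead of only under memory pressure. The activity, layout, and ids are hypothetical:

```java
import android.app.Activity;
import android.os.Bundle;
import android.widget.EditText;

public class DraftActivity extends Activity {
    private static final String KEY_DRAFT = "draft";
    private EditText draftField;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_draft);           // hypothetical layout
        draftField = (EditText) findViewById(R.id.draft);  // hypothetical id
        if (savedInstanceState != null) {
            // Restore the draft after the activity was destroyed.
            draftField.setText(savedInstanceState.getString(KEY_DRAFT));
        }
    }

    @Override
    protected void onSaveInstanceState(Bundle outState) {
        super.onSaveInstanceState(outState);
        outState.putString(KEY_DRAFT, draftField.getText().toString());
    }
}
```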

“Background process limit”

Since we’re in the killing-things-off mindset, we can look at the background process limit option as well. The options here are somewhat arbitrary: the standard limit, which is 20 processes; none; or one through four. The reason you’d want to do this is similar to why you’d turn on don’t keep activities: you want to run through the OS lifecycle of your app, and the operating system can kill off background processes whenever it chooses. Is your app resilient to that? Run with no background processes and see what happens.

There may be behavior in your app that depends on a background process. If you’re trying to fetch data and it doesn’t happen because there are no background processes, that’s not necessarily a bug in your app; but does your UI respond to it in a way that makes sense? Is your zero state good? Do your error conditions make sense? You can test that.

“System UI demo mode”

Last but not least! Don’t take hundreds of screenshots without turning on the system UI demo mode.

This takes all the different icons at the top, and the time, and makes them consistent: same time, same icons on every screenshot. The only reason I saved this option for last is so you wouldn’t be looking at all of my screenshots throughout this whole presentation pointing it out. About halfway through, I turned it on.

Now you have some of the tools you need to debug and to develop better applications for everybody else, with no external tools necessary.


About the content

This talk was delivered live in July 2017 at 360 AnDev. The video was recorded, produced, and transcribed by Realm, and is published here with the permission of the conference organizers.

Andrea Falcone

Andrea Falcone is a Senior Software Engineer at Google, in the Developer Product Group. She has worked on many parts of Fabric, building the Android tooling, including Android Studio Plugin, Crashlytics Android SDK, Gradle plugin and Beta by Crashlytics. She is currently working on fastlane, an open source release process automation suite.
