Deep Dive into writing Performance Tests with @JankTest

Marcos Holgado
Published in ProAndroidDev
8 min read · Apr 12, 2019


I’ve always been interested in performance testing. Back in February I gave a talk at MobOS about how to write and automate performance tests. Today I want to show you what the majority of that talk was about: how to use and improve the jank test library that you probably didn’t know Android provides us.

You can find all the code from this article and much more at this GitHub repository which I created for this talk:

https://github.com/marcosholgado/performance-test

You should be able to follow the different approaches and enhancements step by step by looking at the tests in that repo, but I will point to specific files in this article so you don’t get lost. If you want to jump straight to the final solution, have a look at the sixthtest folder.

A very quick introduction to performance testing

In order to write these tests we need an app first. I have written a very simple app that uses a RecyclerView and an Adapter to display a bunch of rows. The adapter has some code to simulate work on the main thread to create jank, nothing special there.

You have probably already read about the magic 16ms, or 60 frames per second. The way this works is fairly simple: our brain continuously processes the signals our eyes send it. These are “static images”, but if we show enough of them quickly enough, we can trick the brain into perceiving motion. It turns out that 60 pictures per second (or, as we call them, frames) is the point at which everything looks smooth to our eyes (to our brain, actually).

If you do the math, 60 frames per second leaves around 16ms to render each of those frames. If a frame takes longer than 16ms to render, it will jank and create a sluggish, clunky user experience.
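The arithmetic behind that 16ms figure is just the frame rate turned into a per-frame budget. A minimal sketch (the function name is mine, not from any Android API):

```kotlin
// Per-frame render budget: at N frames per second, each frame
// gets 1000 / N milliseconds before the next one is due.
fun frameBudgetMs(fps: Int): Double = 1000.0 / fps

fun main() {
    // 1000 / 60 ≈ 16.67ms, the "magic 16ms" for 60fps displays
    println("60fps budget: %.2fms".format(frameBudgetMs(60)))
    // Higher refresh rates shrink the budget further
    println("90fps budget: %.2fms".format(frameBudgetMs(90)))
}
```

On a 90Hz or 120Hz display the budget is even tighter, which is why jank testing only gets more relevant over time.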

How to measure performance in Android

We mainly have two ways to measure the performance of our apps that we take advantage of to write performance tests: FrameMetrics and Dumpsys.

FrameMetrics was introduced in Nougat as a way to access graphics metrics in your app without having to use adb shell. With FrameMetrics you can get information about a number of metrics in real time, and by combining some of them you can eventually derive the number of janky frames and use that in your tests. However, this requires doing some calculations, and I personally think that most of the available metrics are not very helpful for a developer who just cares about janky frames and a very simple performance test.

The second option is using adb shell dumpsys gfxinfo. Dumpsys is an Android tool that runs on the device and dumps interesting information about the status of system services. Passing the gfxinfo command to dumpsys provides an output in logcat with performance information relating to frames of animation that are occurring during the recording phase. What’s even better is that, since Android 6.0 (API level 23), dumpsys prints out aggregated analysis of frame data collected across the entire lifetime of the process.

The JankTestHelper library

First things first, we need to import the jank test helper library into our project. If you look at the AndroidX migration website, the dependency looks like this:

androidx.test.jank:janktesthelper:1.0.1

But that doesn’t work. I raised a bug back in November with this problem and hopefully it will get fixed at some point. Until then you need to use this dependency instead:

androidx.test.janktesthelper:janktesthelper:1.0.1
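In a Gradle build file that would be declared along these lines (a sketch; I’m assuming the usual androidTestImplementation configuration for instrumentation test dependencies):

```groovy
dependencies {
    // Note the group id: androidx.test.janktesthelper,
    // not androidx.test.jank as the migration site suggests
    androidTestImplementation 'androidx.test.janktesthelper:janktesthelper:1.0.1'
}
```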

This library gives us a few things:

  • GfxFrameStatsMonitor annotation used to configure a gfx framestats monitor.
  • GfxMonitor annotation used to configure a gfx monitor.
  • JankTest annotation used to configure a jank test method.
  • IMonitor interface used to define a class that monitors test methods and collects metrics.
  • IMonitorFactory interface used to define a class that constructs IMonitors for test methods.
  • JankTestBase is the base test class for measuring jank.
  • UseMonitorFactory annotation used to override the default IMonitorFactory implementation.

In a nutshell it works as you can see in the diagram below.

Our test class has to extend JankTestBase, which uses a monitor to track jank. Before each test iteration the monitor’s startIteration() is called; then the test runs, and once it finishes stopIteration() is called and the results are sent back to the JankTestBase class, which fails or passes the test based on those results.

Using JankTestBase

Let’s begin by writing a simple test to scroll up and down several times. The only purpose of this test is to detect jank.

Like I said before, we start by extending JankTestBase and then write our test. You can see that I’m annotating the test with @JankTest and passing three values.

  • beforeTest is a method that we want to execute before each test runs.
  • expectedFrames are the total frames that we expect to render on this test, anything less than this number will fail the test.
  • defaultIterationCount is the number of times we want to repeat this test. Repeating helps avoid false positives.

I explained before that we have to provide a monitor to the JankTestBase class; we do that by using yet another annotation, in this case @GfxMonitor, also provided by the jank test helper library. For this monitor to work we have to pass it the process name (aka the package name) of the activity we are testing. The monitor takes a process name because internally it uses dumpsys gfxinfo to collect all the relevant data.

beforeTest = "launchApp"

In case you are wondering what this is and why do we need it, here is the explanation.

We are not using any activity test rules in our test and so the activity won’t be automatically launched for us. But we need a way to launch the activity and that is what this method is going to do. If we create that launch method the test now would look like this:

We get an instance of UiDevice and use the package name to launch the activity. This is obviously not great: it adds a lot of boilerplate and is a bit confusing. It’s also worth saying that I’m using UIAutomator rather than Espresso, but you could use Espresso for all the UI interactions instead.
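Putting the pieces above together, the test might look roughly like this. This is a sketch rather than the repo’s exact code: the class name, package name, frame count and iteration count are illustrative values.

```kotlin
import androidx.test.InstrumentationRegistry
import androidx.test.jank.GfxMonitor
import androidx.test.jank.JankTest
import androidx.test.jank.JankTestBase
import androidx.test.uiautomator.By
import androidx.test.uiautomator.UiDevice
import androidx.test.uiautomator.UiScrollable
import androidx.test.uiautomator.UiSelector
import androidx.test.uiautomator.Until

class MainActivityJankTest : JankTestBase() {

    private val device: UiDevice =
        UiDevice.getInstance(InstrumentationRegistry.getInstrumentation())

    // Called before every iteration (via beforeTest = "launchApp")
    // because there is no activity test rule to launch the activity.
    fun launchApp() {
        val context = InstrumentationRegistry.getTargetContext()
        val intent = context.packageManager.getLaunchIntentForPackage(PACKAGE_NAME)
        context.startActivity(intent)
        device.wait(Until.hasObject(By.pkg(PACKAGE_NAME).depth(0)), LAUNCH_TIMEOUT)
    }

    @JankTest(beforeTest = "launchApp", expectedFrames = 100, defaultIterationCount = 5)
    @GfxMonitor(processName = PACKAGE_NAME)
    fun testScrollingJank() {
        // Scroll up and down several times; the only goal is to
        // exercise the RecyclerView so the monitor can detect jank.
        val list = UiScrollable(UiSelector().scrollable(true))
        repeat(3) {
            list.flingToEnd(10)
            list.flingToBeginning(10)
        }
    }

    companion object {
        private const val PACKAGE_NAME = "com.marcosholgado.performancetest"
        private const val LAUNCH_TIMEOUT = 5000L
    }
}
```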

The problems with JankTestBase

Unfortunately the JankTestBase class has quite a few problems.

First of all we have all the boilerplate that we just saw. It is very annoying having to do something like that when we could be using an activity test rule instead.

Secondly, it only lets us assert on the total number of frames we expect to render. How do you calculate that number? Is it going to be accurate at all? Those are very legitimate questions and the answers are not straightforward. You could use the 16ms rule to calculate how many frames to expect in X seconds but, do you really want to do that? It would be better if you could use other metrics, like the different percentiles or the percentage of janky frames. An example would be “I don’t want to have more than 10% of janky frames” rather than “I want to render at least 450 frames”.

There are also problems with the monitors: in some cases, depending on the device or the OS version, you may get this exception:

junit.framework.AssertionFailedError: Failed to parse NUM_FRAME_DEADLINE_MISSED

And these monitors only return a single metric, even though they have an (unused) method that returns all the metrics in a Bundle.

In order to fix the problems mentioned above you could write your own monitor, avoiding the frame deadline missed issue or using a different metric. This should be easy enough, just copy and paste GfxMonitor and change whatever you want, but unfortunately it is not.

The main reason is that, depending on the metric the monitor reports, you may want to perform different comparisons. JankTestBase always does a greater-or-equals comparison, which doesn’t work in all cases, for example when we want less than X% of janky frames or when using percentiles:

assertTrue("Too few frames received...",
        numFrames >= annotation.expectedFrames());

So here we are, having to write yet another class. The last problem is that JankTestBase extends InstrumentationTestCase, which is deprecated. Do you really want to extend a deprecated class? Probably not.

What we really want is something like this:

An annotation that I can use to do performance testing, where I provide the process name (needed for gfxinfo), the type of metric I want to test against, the threshold I want to use and the type of assertion (although this last one could be inferred from the metric to avoid the extra parameter).

Note: I’m still using UIAutomator for the UI interactions, hence the device, UiScrollable, etc.
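Such an annotation might look roughly like this. The names below (PerformanceTest, JankMetric, AssertionType) are illustrative, a sketch of the idea rather than the exact classes from the repo:

```kotlin
// Illustrative sketch of the desired annotation; the real
// implementation lives in the repo's sixthtest folder.
enum class JankMetric { TOTAL_FRAMES, JANKY_FRAMES_PERCENT, P50, P90, P95, P99 }
enum class AssertionType { LESS_THAN, GREATER_THAN }

@Retention(AnnotationRetention.RUNTIME)
@Target(AnnotationTarget.FUNCTION)
annotation class PerformanceTest(
    val processName: String,     // needed for dumpsys gfxinfo
    val metric: JankMetric,      // which metric to assert on
    val threshold: Double,       // e.g. 10.0 for "at most 10% janky frames"
    val assertion: AssertionType // direction of the comparison
)
```

A test method would then be annotated with, say, `@PerformanceTest(processName = "com.marcosholgado.performancetest", metric = JankMetric.JANKY_FRAMES_PERCENT, threshold = 10.0, assertion = AssertionType.LESS_THAN)`.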

Final solution

With that in mind, let’s see what we can do. First of all we need to replace the JankTestBase class with a new Activity performance test rule. If you remember from the diagram above, there are 4 things the JankTestBase class does:

  1. Starts the monitor before the tests.
  2. Runs the tests.
  3. Stops the monitor and gets the results.
  4. Evaluates the results.

In our new activity test rule we first need to read the annotation to know which metric to use, the threshold, etc. We can do that in the apply method, since we can get the test method’s annotations from the description parameter that apply receives. We also create the monitor at this stage, although it could be chosen based on another parameter of the annotation.

Secondly, we need to start the monitor. Fortunately there is a method we can override just for that: beforeActivityLaunched().

Lastly, we stop the monitor, collect the results, analyze them and either fail or pass the performance test. As with starting the monitor, there is a method we can override to do this: afterActivityFinished().

If we put all the code together it looks like this:
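The four steps above can be sketched as a custom rule like the following. PerfMonitor is the class from the repo; the PerformanceTest annotation with its metric, threshold and assertion fields is an illustrative name for the idea described in this article, and the metric lookup assumes the monitor returns its results keyed by metric name:

```kotlin
import android.app.Activity
import android.os.Bundle
import androidx.test.rule.ActivityTestRule
import org.junit.runner.Description
import org.junit.runners.model.Statement

class PerformanceTestRule<T : Activity>(activityClass: Class<T>) :
    ActivityTestRule<T>(activityClass) {

    private var annotation: PerformanceTest? = null
    private var monitor: PerfMonitor? = null

    override fun apply(base: Statement, description: Description): Statement {
        // 1. Read the annotation off the test method so we know the
        //    process name, metric, threshold and assertion type.
        annotation = description.getAnnotation(PerformanceTest::class.java)
        annotation?.let { monitor = PerfMonitor(it.processName) }
        return super.apply(base, description)
    }

    override fun beforeActivityLaunched() {
        // 2. Start monitoring before the activity (and the test) runs.
        monitor?.startIteration()
        super.beforeActivityLaunched()
    }

    override fun afterActivityFinished() {
        // 3. Stop the monitor and collect the results,
        // 4. then evaluate them against the annotation's threshold.
        super.afterActivityFinished()
        val results: Bundle = monitor?.stopIteration() ?: return
        annotation?.let { spec ->
            val value = results.getDouble(spec.metric.name)
            val pass = when (spec.assertion) {
                AssertionType.LESS_THAN -> value < spec.threshold
                AssertionType.GREATER_THAN -> value > spec.threshold
            }
            if (!pass) throw AssertionError(
                "${spec.metric} was $value, expected ${spec.assertion} ${spec.threshold}")
        }
    }
}
```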

I won’t put the monitor code here but instead I’ll give you a few snippets. If you want, you can check out the full class here:

https://github.com/marcosholgado/performance-test/blob/master/app/src/androidTest/java/com/marcosholgado/performancetest/sixthtest/PerfMonitor.kt

When we start a new iteration the monitor clears out any previous data by executing:

val stdout = executeShellCommand(
String.format("dumpsys gfxinfo %s reset", process)
)

When it stops the iteration, it calls dumpsys again to fetch the performance data:

val stdout = executeShellCommand(String.format("dumpsys gfxinfo %s", process))

Then it reads from the output trying to parse the frame stats that we care about which are the ones you can see below:

Total frames rendered: ###
Janky frames: ### (##.##%)
50th percentile: ##ms
90th percentile: ##ms
95th percentile: ##ms
99th percentile: ##ms

If the line being parsed matches one of a set of regular expressions, the monitor stores the value in an EnumMap keyed by its metric type (JankStat).

private val accumulatedStats = EnumMap<JankStat, List<Number>>(JankStat::class.java)

Finally, it puts all those metrics in a Bundle and returns it to the activity test rule, which checks whether the test should fail or pass.
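The parsing step can be sketched in plain Kotlin. The regular expressions and JankStat entries below are my own illustrative versions, built to match the dumpsys lines shown above rather than copied from PerfMonitor:

```kotlin
// Sketch of parsing aggregated gfxinfo output into metrics.
enum class JankStat(val pattern: Regex) {
    TOTAL_FRAMES(Regex("""Total frames rendered: (\d+)""")),
    JANKY_FRAMES_PERCENT(Regex("""Janky frames: \d+ \((\d+\.?\d*)%\)""")),
    P90(Regex("""90th percentile: (\d+)ms"""))
}

fun parseGfxInfo(output: String): Map<JankStat, Double> {
    // EnumMap keyed by metric type, mirroring the accumulatedStats field
    val stats = java.util.EnumMap<JankStat, Double>(JankStat::class.java)
    output.lineSequence().forEach { line ->
        for (stat in JankStat.values()) {
            // First capture group holds the numeric value for each metric
            stat.pattern.find(line)?.let { match ->
                stats[stat] = match.groupValues[1].toDouble()
            }
        }
    }
    return stats
}
```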

The annotation class is very simple, so I’ll let you check the GitHub repository to see how it is implemented:

https://github.com/marcosholgado/performance-test

Remember that the final solution is in the sixthtest folder and you can also check the previous tests to follow all the steps and different solutions that I found before getting to this one.

That’s everything for this article. I hope you learned a bit more about performance testing and how you can use it in your integration tests. As always, if you have any questions, a better solution or other ideas, please leave a comment or reach out on Twitter.


Senior Android Developer at DuckDuckGo. Speaker, Kotlin lover and I also fly planes. www.marcosholgado.com