The Importance of Information in A/B Testing
“ipsa scientia potestas est” (knowledge itself is power)
– Sir Francis Bacon
We all make decisions every moment of every day of our lives. These decisions range from the banal, ‘I should have another cup of tea’, to the life-changing, ‘I’ve decided to ask Mary to marry me’, and everything in between.
When we make these decisions, consciously or not, we weigh a lot of different values in our minds; ‘Do I have milk or would I need to go to the shop?’, ‘Do I think Mary would say yes?’. Our ultimate decision is generally a judgement derived from these values, weighted by relevance and potential outcomes. After all, deciding to propose is usually a bigger decision than whether or not to have another cup of tea.
While emotions and potential outcomes play a role in our decision making, information is fundamental. Information is the foundation from which we build our options, the skeleton from which we sculpt our model.
In this post, I would like to give you a glimpse of collecting and utilising this information from A/B testing software, specifically mobile apps, and using it to influence our decisions. It is the information we gather through A/B testing that helps us form a good decision-making strategy.
An Informed Strategy
Strategy plays a part in most, if not all, decision making in business. And information plays a big part in knowing what strategies are available to you, and which strategies work for which situation.
So let’s say you want more engagement on your app and you decide to change your app icon. But before you do this, you want to test your new icon to see if it will have any impact on your engagement rate.
Changing an app icon is not a decision to be taken lightly. Your app icon is your brand ambassador. It acts as the entry point for your user. It’s the first thing they see on their home screen. It should do everything from neatly and concisely suggesting your design principles to inviting users to tap into it and keeping the app in your users’ minds when they aren’t using it. A/B testing is a really simple and useful tool when considering changing your app icon.
So how do we begin?
By gathering data of course.
A, or B?
For the purposes of this post, we will look at Android A/B testing. Google Play has an extremely powerful suite of tools for just this type of data gathering.
iOS A/B testing is a bit more involved but also possible.
A/B testing hinges on a simple concept: show some of your users one thing (A) and some of your users another thing (B), then collect data on both and compare the difference.
If your A group opens your app more than your B group, that suggests that your A changes resonated more with your users.
I use the term “suggests” because data can sometimes be viewed through a biased lens or be missing crucial context.
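One way to judge whether the gap between group A and group B is real, rather than noise, is a two-proportion z-test. The sketch below is an illustration rather than anything Google Play does for you; the user counts are made-up numbers, and in practice the Play Console reports significance on your behalf.

```python
import math

def two_proportion_z_test(opens_a, users_a, opens_b, users_b):
    """Compare open rates between group A and group B.

    Returns (z, p) where p is the two-sided p-value: roughly, the
    probability of seeing a gap this large if A and B truly perform
    the same.
    """
    p_a = opens_a / users_a
    p_b = opens_b / users_b
    # Pooled open rate under the null hypothesis (no real difference)
    p_pool = (opens_a + opens_b) / (users_a + users_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / users_a + 1 / users_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: 540 of 5,000 users in group A opened the app,
# versus 480 of 5,000 in group B.
z, p = two_proportion_z_test(540, 5000, 480, 5000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

A p-value below 0.05 is the conventional (if somewhat arbitrary) bar for calling the difference significant; with small audiences, even a winning variant can easily fail to clear it.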
Without further ado, let’s take a look at setting up our A/B store listing test in Google Play.
Experimenting Shouldn’t Finish at College
Generally, the first stage of A/B testing is deciding on what you would like to test. This sounds obvious, but remember that at least some of your users will be in the testing group. If your test is not well thought out you may end up driving them away from the app by testing an unneeded, unwanted, or poorly implemented idea.
A/B testing should begin with a hypothesis, in our case “We would get more engagement from users if we changed our app icon”. This helps us narrow our scope to just the app icon, rather than a huge overhaul of the app, which should be tested more carefully and with smaller testing groups.
“Start by identifying a problem that you want to solve. For example, have your conversions dropped off? Have traffic patterns changed? Have your demographics shifted? A close examination of trends in your Google Analytics behavior reports is a great place to start.”
Test Title Please Ignore
Now that we have our hypothesis, we should begin creating our test. On Google Play, this is under the term “Experiment”.
So in our test, we will be showing 50% of users who see our store listing our “A” app icon, and 50% our “B” icon. Tweak this as you see fit in your own experiment. It might be prudent in riskier experiments to have a lower percentage of the audience testing your variants.
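For a store listing experiment, Google Play handles the audience split for you. But it can help to see the underlying idea: each user is assigned to a variant deterministically, so the same person always sees the same icon. A minimal sketch of that mechanism, with hypothetical function and experiment names:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, percent_b: int = 50) -> str:
    """Deterministically assign a user to variant A or B.

    Hashing (experiment name + user_id) yields a stable, evenly spread
    bucket from 0 to 99, so the same user always lands in the same
    group, and different experiments split users independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return "B" if bucket < percent_b else "A"

# The same user always sees the same variant:
assert assign_variant("user-42", "icon-test") == assign_variant("user-42", "icon-test")
```

Lowering `percent_b` is how you would run a riskier experiment on a smaller slice of your audience, which mirrors the audience-percentage setting in the Play Console.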
The Results Are In
After a couple of days, maybe even a week or two, you can check the results.
The Google Play Store will tell you which variant is winning and even allow you to apply that variant to all users.
This is a high-level overview of testing one thing on the Google Play Store; it can get a lot more detailed and involved if needed.
When it comes to decision making, playing the odds can be relatively safe compared to trusting your gut (or your marketing team’s gut) but it also has downsides.
“If I had asked my customers what they wanted, they would have said a faster horse.”
– Henry Ford (possibly apocryphal)
Chasing data is akin to chasing trends: you will always be behind unless you can read the data in ways no one else can. In other words, read between the lines.
“There are three kinds of lies: lies, damned lies, and statistics.”
– Benjamin Disraeli (again, most likely apocryphal)
Another potential problem with data is that the conclusions drawn from it vary with interpretation.
One person might see a user spending a long time on a page and think the user is engaged, while another sees the same behaviour and wonders if the navigation is unclear.
There’s an art to data interpretation.
That said, A/B testing is a powerful tool in your information-gathering arsenal. It could put you ahead of competitors who rely on instinct or outdated preconceptions. Don’t be afraid to use it.
And remember: Information is power.