Learnings from 200+ App Store Conversion Tests

⛏️ Guest Miner: Sylvain Gauchet
💎 x 9

Sharath Kowligi (Director of Ad Monetization at GameHouse - game publisher - and advisor to RocketShip HQ) joins Shamanth Rao to share the learnings from having run over 200 app store conversion experiments.

Source: Learnings from 200+ App Store Conversion Tests
Type: Podcast
Publication date: March 4, 2020
Added to the Vault on: March 6, 2020
These insights were shared through the free Growth Gems newsletter.
💎 #1

There is a lot of evidence showing that the 2 main things to test on the app stores are:
1. Icon
2. Feature graphic (video thumbnail)

05:40

💎 #2

The way to approach testing is to be systematic: think about the testing cadence instead of trying to find THE big idea, because once a team gets into the rhythm of testing, that's when it gets better at it.

13:35

💎 #3

Putting testing cadence first is a great way to build a creative team. It lowers the ego threshold of being right or wrong and puts the focus on doing things that are meaningful.

15:30

💎 #4

For most B2C apps, you have to give a test at least 7 days. That allows you to run about 50 tests a year for one app.

18:17

💎 #5

If you want to run more tests, you can also split between markets that you know behave the same way for YOU (e.g. US/UK) and run tests in parallel.

18:32

💎 #6

At least once a week, report on experiments, whether they are on the paid advertising side, the product side, growth in general, or even the CRM side.

19:40

💎 #7

Once you have run the experiment 1, 2, or 3 times and tried on iOS what works on Android, you'll know if this is something you can do. If a killer creative on Android doesn't make an impact on iOS, then you can't carry results from Android to iOS.

21:20

💎 #8

For icon, header, copy, etc. testing, use Returned Users (vs. Installed Users).

23:22

💎 #9

For conversion testing and onboarding testing, use Google Play's 7-day lookback window.

23:49

Gems are the key bite-size insights "mined" from a specific mobile marketing resource, like a webinar, a panel or a podcast.
They allow you to save time by grasping the most important information in a couple of minutes, and each one includes the timestamp from the source.



The increase in tests

Historically it was very hard to track changes and perform A/B tests. Now:

  • Still tricky to test properly on iOS,
  • Possible on Google Play.

The Google Play Store has become much more important to publishers, and Google provides more data, more granularity, and better identification throughout the funnel.

→ even if tests are done only on Google Play, they can still have a huge impact.


Testing process

Visuals (vs. copy) are where you should start.

[💎@05:40] There is a lot of evidence showing that the 2 main things to test are: 1. Icon 2. Feature graphic (video thumbnail).


"The app stores really is just good packaging for your app"


For a Doctor game, they tried an out-of-the-box idea for the icon that nobody thought would work, and they saw a +40% increase for that variant.

In this case the variant touched a very emotional part of the game story, although they didn't see it at the time.

A/B testing, when done right (or when you get lucky), makes a huge difference. It changed the top of the funnel, and even the game story.


Cadence over ideas

The biggest thing is the testing process vs. generating ideas, because you never know which idea will be the big one.

[💎@13:35] The way to approach testing is to be systematic: think about the testing cadence instead of trying to find THE big idea, because once a team gets into the rhythm of testing, that's when it gets better at it.


Some variants that appear very different also make no difference at all; sometimes the audience just doesn't care.


[💎@15:30] Putting testing cadence first is a great way to build a creative team, because any good creative team has people who believe in what they're doing. It lowers the ego threshold of being right or wrong and puts the focus on doing things that are meaningful.


Process/Structure

The people most involved in the testing process are the growth team (selling the app) and the studio (making the app). Whoever has responsibility for a specific part is the one who does it.


[💎@18:17] For most B2C apps, you have to give a test at least 7 days. That allows you to run about 50 tests a year for one app.

[💎@18:32] If you want to run more tests, you can also split between markets that you know behave the same way for YOU (e.g. US/UK) and run tests in parallel.
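
A quick sanity check of the cadence math behind these two gems, as a minimal sketch (the 7-day minimum is from the episode; the market-group split counts and everything else here are illustrative assumptions):

```python
# Back-of-the-envelope A/B test capacity: sequential 7-day tests,
# optionally run in parallel across market groups that behave the
# same way for you (e.g. US/UK, per the episode).

DAYS_PER_TEST = 7
DAYS_PER_YEAR = 365

def tests_per_year(parallel_market_groups: int = 1) -> int:
    """Tests run back-to-back in each group, with groups in parallel."""
    return (DAYS_PER_YEAR // DAYS_PER_TEST) * parallel_market_groups

print(tests_per_year(1))  # 52 -> the "about 50 tests a year" for one app
print(tests_per_year(2))  # 104 if a US/UK split lets you run two at once
```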

Stop the test after 7 or 8 days if you have no result.
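
One way to make "no result" concrete is a two-proportion z-test on the variants' conversion rates; a minimal sketch, assuming you can pull per-variant visitor and conversion counts (the counts and the 0.05 threshold are illustrative, not from the episode):

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative counts after ~7 days:
z, p = two_proportion_z(conv_a=900, n_a=20_000, conv_b=980, n_b=20_000)
print("no clear result -- stop the test" if p > 0.05
      else "variant differs -- consider rolling it out")
```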


[💎@19:40] At least once a week, report on experiments, whether they are on the paid advertising side, the product side, growth in general, or even the CRM side.


Approaching iOS testing

Some companies see totally different results on iOS vs. Google Play.

For the US, Sharath (for the apps he's seen) doesn't see a huge difference between iOS and Android when it comes to what makes a killer creative.

  • A proxy/landing page (via 3rd-party tools) might work better for some,
  • Carrying Android results over to iOS works better for others.


[💎@21:20] Once you have run the experiment 1, 2, or 3 times and tried on iOS what works on Android, you'll know if this is something you can do. If a killer creative on Android doesn't make an impact on iOS, then you can't carry results from Android to iOS.


The cost of implementation on iOS is higher: developing, submitting, waiting, etc. Also, using a 3rd-party tool (a "fake app store page") adds a layer/step to the funnel.


For their casual segment, a huge win on Android typically translates into a win on Apple. So they deploy on iOS.


KPIs to evaluate on Google Play when testing

  • [💎@23:22] For icon, header, copy, etc. tests, use Returned Users (vs. Installed Users)
  • [💎@23:49] For conversion testing and onboarding testing, use Google Play's 7-day lookback window (a computation sketch for both follows below)
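
A minimal sketch of how those two KPIs could be computed from an experiment export (the CSV layout and column names are hypothetical, not Google Play's actual schema):

```python
import csv

# Hypothetical export: one row per variant.
# Assumed columns: variant, store_visitors, installed_users, returned_users

def experiment_kpis(path: str) -> dict:
    kpis = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            visitors = int(row["store_visitors"])
            installed = int(row["installed_users"])
            returned = int(row["returned_users"])
            kpis[row["variant"]] = {
                # Icon/header/copy tests: judge on Returned Users,
                # not raw installs (gem @23:22).
                "returned_conversion": returned / visitors,
                # With the 7-day lookback (gem @23:49), returned_users
                # only settles a week after install; compare against the
                # ~20% D7 benchmark mentioned below.
                "d7_return_rate": returned / installed,
            }
    return kpis
```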


A D7 benchmark of 20% is a good number for the kind of apps he works on.



