Recently, the Stardust app made the kind of growth leap most apps only dream about. In just six weeks, their revenue and paywall conversion rate grew by more than 400%.
Even more impressive is the speed of iteration that enabled the app to grow so quickly, all without engineering work.
Let’s dig into how Stardust turned that growth from pipe dream into reality!
Where this story begins
Stardust is an app that stands out in the market. It was founded by a group of women who weren’t happy with the cycle and pregnancy tracking apps they saw in the App Store. In their view, most apps in this space were too pink, too flowery, and missing the delightful experience they wanted.
So, they built one that they’d want to use. Turns out, others did, too. During the last two years, more than 1.5M users have downloaded Stardust.
In early 2023, someone suggested they add a paywall to the app. That paywall consistently converted installs into paying subscribers at a serviceable rate. More than a year later, though, the paywall and pricing hadn’t evolved from that first iteration, and conversion wasn’t matching the growth potential the team believed the app was capable of.
So, how did an app with consistent downloads grow its proceeds per user by more than 400% in only a couple of months? Let’s dig in!
The three growth levers Stardust used to kick-start growth
When an app starts its experimentation journey, certain growth levers tend to offer the most opportunity, and a few best practices help capture those opportunities faster. This is especially true if the app already has consistent traffic and baseline conversion data.
1. Start with pricing and packaging
In pricing and packaging optimization, a user’s perception of value has as much to do with price presentation design as with the actual prices or introductory offers. Presentation covers which product(s) you display first, how you show comparisons between products, and what each plan includes.
In Stardust’s case, they hadn’t done price testing and also wanted to learn whether a free trial would increase overall paying users. Additionally, their two plans had always been displayed plainly, showing price and duration without any relative savings or equivalent prices.
We identified two specific tests by asking the following questions:
Will adding a free trial increase conversion rate and proceeds per user, and if so, how does changing the price impact conversions?
Does optimizing the price presentation design by adding a discount percentage and equivalent pricing increase the conversion rate?
Free trial and higher price test
In the first experiment, Stardust tested the control against five price variants, including price sets with and without free trials.
The result was a conversion rate increase for every variant offering a free trial, including a stunning 131% increase in paywall conversion rate for the variant that added a free trial to the control’s yearly price of $24.99. Higher-price variants without a trial had a lower conversion rate and similar proceeds per user.
The first test was paused to let all trials conclude so the trial conversion rate data could mature, an essential step in running any price test with trials. After all trials concluded, it was clear that adding a trial increased average proceeds per user by 78% and produced 45% more paying subscribers. Additionally, the new variant increased the share of yearly subscriptions by more than 50% over the control.
The winning variant: $24.99 per year with a free 7-day trial.
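Why pause before declaring a winner? Because with trials, proceeds per user hinges on the trial-to-paid rate, which simply doesn’t exist until trials finish. Here’s a minimal sketch in Swift; all rates are hypothetical, not Stardust’s actual numbers:

```swift
// A minimal sketch of why trial data must mature before judging a price
// test. All rates below are hypothetical, not Stardust's actual numbers.

struct TrialVariant {
    let yearlyPrice: Double           // e.g. $24.99/yr
    let paywallConversionRate: Double // share of viewers who start a trial
    let trialToPaidRate: Double?      // unknown until trials conclude
}

// Proceeds per paywall viewer = trial starts x trial-to-paid x price.
func proceedsPerUser(_ v: TrialVariant) -> Double? {
    guard let trialToPaid = v.trialToPaidRate else { return nil } // trials still running
    return v.paywallConversionRate * trialToPaid * v.yearlyPrice
}

let midTest = TrialVariant(yearlyPrice: 24.99, paywallConversionRate: 0.12, trialToPaidRate: nil)
let matured = TrialVariant(yearlyPrice: 24.99, paywallConversionRate: 0.12, trialToPaidRate: 0.55)

print(proceedsPerUser(midTest) as Any)  // nil: too early to judge the variant
print(proceedsPerUser(matured) as Any)  // about $1.65 per paywall viewer
```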
Price presentation design test
While the first test was paused, the second was launched, which kept the control pricing for all variants ($24.99 per year and $2.99 per week) but changed the packaging design. These three variants were tested:
Add a discount percentage on the yearly plan and show weekly equivalent pricing
Lead with the yearly plan and make both options visible by tapping a “More Pricing” button, revealing a product drawer
Only show the yearly plan
The winning variant: add a discount percentage on the yearly plan and show weekly equivalent pricing.
As it turns out, showing a discount percentage on the yearly plan and weekly equivalent pricing increased the conversion rate by 50%. But more importantly, proceeds per user increased by 62.5% due to a 30% increase in the share of new yearly subscriptions.
The other variants in this experiment lost on both conversion rate and proceeds per user.
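If you’re curious what the winning presentation actually computes, here’s a quick sketch using the control prices from this test. The display copy is our illustration, not Stardust’s exact paywall text:

```swift
// The arithmetic behind the winning presentation, using the control
// prices from this test ($24.99/yr and $2.99/wk).
let yearlyPrice = 24.99
let weeklyPrice = 2.99
let weeksPerYear = 52.0

// Weekly-equivalent price: what the yearly plan costs per week.
let weeklyEquivalent = yearlyPrice / weeksPerYear              // about $0.48/week

// Discount: the yearly price vs. paying the weekly price for a full year.
let discount = 1 - yearlyPrice / (weeklyPrice * weeksPerYear)  // about 84%

print(String(format: "$%.2f/week, save %.0f%%", weeklyEquivalent, discount * 100))
// "$0.48/week, save 84%"
```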
2. Layer experiments and combine winners
If you noticed that the second test ran while the first test’s trials were still concluding and thought that was a smart idea, good for you!
Overlapping tests like this is called experimentation layering. It’s a key practice for accelerating the number of experiments you can run, which in turn determines how quickly you can grow. The idea is to run each test only as long as it takes to gather enough users into the cohort, then launch the next experiment while the first test’s trial conversion or retention data matures.
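Here’s a back-of-the-envelope sketch of the calendar time that saves; the durations are made up for illustration, not Stardust’s actual test windows:

```swift
// Hypothetical test windows, not Stardust's actual timelines.
let enrollWeeks = 2.0  // time to gather enough users into a cohort
let matureWeeks = 3.0  // time for trial/retention data to mature

// Sequential: run test 1 end to end, then test 2.
let sequential = 2 * (enrollWeeks + matureWeeks)  // 10 weeks

// Layered: launch test 2 the moment test 1 stops enrolling; test 1's
// trial data matures in the background while test 2 runs.
let layered = enrollWeeks + max(matureWeeks, enrollWeeks + matureWeeks)  // 7 weeks

print("Sequential: \(sequential) weeks vs. layered: \(layered) weeks")
```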
At this point, Stardust had two winning variants: a set of prices with a free trial and a new price presentation design.
The winning variant from test two became Stardust’s new control paywall, and the first variant in the new experiment combined the winners from tests 1 and 2.
Because there was enough traffic to run additional variants quickly, several more were added, including:
The combined winners variant with a monthly plan instead of weekly
Retesting a higher price variant with a trial but in the new price presentation design
A combined winners variant with a video background design instead of a text-heavy design
If you bet the combined winners variant would combine the growth from the first two tests, you’re right.
In Stardust’s third test, the combined winners variant netted another whopping 55.7% conversion rate increase over the new control and a 9% increase in proceeds per user. However, this was not the winning variant in this test.
The winning variant: the combined winners variant with a video background design.
With a 67% paywall conversion increase and a 12.5% proceeds per user increase, the combined winners variant with a video background showed that design changes were another lever for Stardust to pull. Enter the next phase of testing.
Note: The key to experimentation layering is picking the right tests to run, so that later tests answer questions that don’t depend on the results of earlier ones. One of the benefits of Superwall is the ability to rapidly launch experiments and take full advantage of experimentation layering.
3. Test paywall designs when you’ve optimized pricing and packaging
Test three showed the potential of combining design changes with optimized pricing and packaging. For test four, Stardust was eager to test a trial timeline design.
The winning variant: trial timeline design.
While the video background design signaled the app’s quality, the trial timeline design convinced even more users to start a trial (a 13.6% conversion rate increase) without impacting the trial-to-paid conversion rate.
Total results
So after several tests and compounding results, where did Stardust net out?
If you recall, the app’s original conversion rate was serviceable but didn’t match expectations. By the completion of the fourth test, that conversion rate had grown by 465%.
Install-to-paying subscriber rate increased by 252%. Additionally, average proceeds per user increased by 408%.
Finally, the increase in yearly subscriptions over the original paywall design was significant, and it will pay a nice revenue bonus one year from now when those subscriptions renew.
Unlocking Stardust’s growth potential
Stardust is a great product with a high-quality team behind it. With growth numbers like theirs, the question isn’t why they weren’t doing this before, but what was blocking it.
Stardust had plenty of ideas, but they’re a lean team, which doesn’t often leave a lot of room for design and engineering to build, launch, and iterate on experiments. The big unlock that allowed Stardust to test their backlog of ideas was using Superwall to remotely build paywalls without code and ship A/B tests without engineering resources.
They were free to run experiments at the pace of their users, not their sprint capacity.
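For context, here’s roughly what that integration pattern looks like with Superwall’s iOS SDK. This is a sketch based on the SDK’s v3-style register calls; the API key, event name, and unlockProFeatures() are placeholders, not Stardust’s actual code:

```swift
import SuperwallKit

final class PaywallGate {
    // Call once at app launch.
    static func setUp() {
        Superwall.configure(apiKey: "pk_your_public_key") // placeholder key
    }

    // Call wherever a premium feature is gated. Which paywall appears for
    // this event (pricing, packaging, design, A/B variant) is decided
    // remotely in the Superwall dashboard, so tests ship with no release.
    static func showPremiumContent() {
        Superwall.shared.register(event: "premium_feature_tapped") {
            unlockProFeatures() // runs if the user has, or gains, access
        }
    }

    static func unlockProFeatures() {
        // Hypothetical: navigate to the gated feature.
    }
}
```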
What’s next for Stardust?
The thing about experiment wins is they make you hungry for more. Stardust has several new experiments poised to answer new questions.
They’re also exploring Superwall for experiments on other experiences, such as the screens users see right before a paywall and the pregnancy tracking experience.