Optimizing the Conversion Rate

Project Overview
Sign-up conversion was a primary focus of our Optimization Squad. Our team was effective, collaborative, and lean. We built our own split-testing product, launched many winning tests, and cast vision for future iterations of Signup.

Contribution:
I collaborated closely with the Product Manager and developers to create a process for running split tests, brainstormed test ideas with the team, conducted qualitative research to understand how customers experience the funnel, and created the visual design for tests.
Sign-up funnel before optimization
Split Testing
A lot of people have strong opinions about split testing. Our view is this: if you don’t split test, you are missing out on potential conversion rate increases and the ability to safely ship new experiences.
Process
Our approach was simple but effective. We began by building our own split-testing platform and data management system, which allowed us to measure every aspect of the funnel. Once we understood our baseline, we identified the parts of the funnel with the largest drop-off and focused most of our energy on improving those areas.
Guidelines
We established guidelines to safeguard the funnel and force us to test responsibly. First, we decided that the most important metric to optimize was “Funded Accounts”, the last step in the funnel. Second, we would wait for results to be statistically significant before implementing winning test designs in our funnel. Third, we would test one variable at a time so we could understand what actually drove any change in conversion.
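To make “statistically significant” concrete, here is a minimal sketch of a two-proportion z-test, a standard way to check whether a variant’s conversion rate differs from control. The write-up doesn’t specify which test our platform actually used, so treat the method, function name, and thresholds below as illustrative assumptions:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.

    conv_a / n_a: conversions and visitors for control.
    conv_b / n_b: conversions and visitors for the variant.
    Returns (z, p_value).
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal-CDF via erf; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: 120/2000 control vs. 165/2000 variant
z, p = two_proportion_z_test(120, 2000, 165, 2000)
significant = p < 0.05
```

A rule like “ship only when `p < 0.05`” is what keeps an underpowered test from accidentally degrading funded accounts.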

Split Tests

1. Move "Account Creation" Step
We were asking leads to verify their email before creating an account. We tested switching the order and saw a 152% increase in conversion. It turns out that sending leads away from the funnel to their email inbox too early is not a happy path. Letting them create an account before verifying their email significantly improved the likelihood of completing sign-up.
Move the "Account Creation" Step
2. Redesign "Email Verify"
Our email verification screen hadn’t been updated in five years. The layout was complex, it had too much text and too many colors, and it didn’t match our other screens. So we tested a new design that brought clarity to the single action of verifying your email. This resulted in a 3.5% increase in on-page conversion and a 1.4% increase in funded accounts.
Redesign of the "Verify Email" screen
3. Reduce Copy on "Personal Info"
Our entire funnel had lots of copy on every screen. The assumption was that it helped users understand why we were asking for each piece of information. The results of our split test told another story. We ran this as a multivariate test: [Control] vs. [Reduced Copy] vs. [Reduced Copy + Gender Removed]. [Reduced Copy + Gender Removed] won, resulting in a 7.6% increase in funded accounts.
Reduce copy on the "Personal Info" screen
4. Remove "Your Money" Screens
The original funnel was created for signing up for mutual funds, not a cash management account. We asked very personal financial questions about things like estimated income, marital status, and investment experience. The thinking was that if we asked for this up front, we wouldn’t need to ask for it during cross-sell. However, qualitative data indicated people felt uncomfortable answering these questions. During a user test, one participant asked, “How do I calculate my Liquid Net Worth?” and “Will my income amount affect my eligibility for this account?”

We wanted to test removing these screens from the signup funnel.

It was a very complex test. We had to remove the screens from our current funnel without breaking our complex API structure, remove them from the progress indicators on mobile and desktop, and then make sure we added them to our cross-sell funnels.

Removing these two screens from our funnel resulted in a 14% improvement over baseline.
Remove Employment & Finances and Investment Experience screens
Optimizing Other Metrics
Not every change we made to the funnel increased conversion rate. Some of our efforts were dedicated to increasing the opening deposit. That said, we would not implement a test if it significantly decreased the number of funded accounts.
5. Move "Link Bank Account"
We asked for the Opening Deposit and PWIF (“Pay What Is Fair,” aka set your own bank fee) before users linked an external bank account. This was problematic in a few ways. User testing indicated that people expect some sort of confirmation after the opening deposit; they did not expect to see PWIF and Link Bank Account. The order also prevented us from using Plaid data (transactions and account balances) to serve personalized tests. We tested moving Link Bank Account before Opening Deposit, which changed the conversion rate by only +1%.
Moving our Link Bank Account earlier in the funnel
6. Introducing "Recurring Deposits"
In our investment funnels, we have a screen that allows users to schedule recurring investments. We wanted to test this idea in our cash management account funnel too. Our hypotheses were pretty pessimistic. We thought it would significantly decrease conversion and only slightly increase deposits. We were wrong. This is why I love testing.

Our recurring deposit screen decreased conversion by only 1.45%. However, it was adopted by 12.33% of users, and the average monthly deposit amount was $323.93 (up from $0, which is a big win).
Adding "Recurring Deposit" to our funnel
7. "Opening Deposit" Tests
Remove $10 Helper Text
The original Opening Deposit screen had an empty form field with helper text that said “$10 minimum deposit”. We removed the text and only showed it when users typed less than $10. This increased the opening deposit amount by 77%.

Dynamic Buttons
Then we created a test with suggested deposit buttons. The hypothesis built on the concept of anchoring: if you anchor with higher amounts, opening deposits will be higher.

How did we choose the amounts? We looked at historical data of the relationship between original balance and deposit amount. For example, if your external bank account had $500 in it, you were likely to transfer 20% of your balance. We would show you numbers like $100, $250, and $375 with an open form field for a custom amount. My role in this test was the ideation and conceptualization. Special thanks to Dana Calderone (PM) and Michael Traquir (Designer) for doing the data analysis and visual design for this concept.
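The mapping above can be sketched in code. This is a minimal illustration that assumes, hypothetically, the likely transfer amount (20% of the external balance, per our historical data) becomes the lowest anchor and the higher buttons are fixed multiples of it; the real analysis by Dana and Michael was data-driven and more nuanced:

```python
def round_to_step(amount: float, step: float = 5.0) -> float:
    """Round a suggestion to a tidy increment, with a small floor."""
    return max(step, round(amount / step) * step)

def suggest_deposits(balance: float) -> list[float]:
    """Anchored deposit suggestions from an external account balance.

    The 20%-of-balance heuristic comes from our historical data; the
    2.5x and 3.75x multipliers are illustrative assumptions chosen so a
    $500 balance yields the $100 / $250 / $375 buttons from the text.
    """
    likely = balance * 0.20                  # e.g. $500 balance -> ~$100 expected transfer
    low = round_to_step(likely)              # likely amount is the lowest anchor
    mid = round_to_step(likely * 2.5)
    high = round_to_step(likely * 3.75)      # higher anchors nudge deposits upward
    return [low, mid, high]

# Example from the text: a $500 external balance
print(suggest_deposits(500.0))  # -> [100.0, 250.0, 375.0]
```

An open form field for a custom amount would still accompany these buttons, so the anchors only suggest, never constrain.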

Showing users dynamic buttons increased average Opening Deposit amounts by 25% and maintained the same conversion rate for funded accounts.
Dynamic Opening Deposit Suggestion Buttons
Summary
We built a powerful split-testing product, increased conversion through an iterative process, and improved our signup funnel. We helped far more people sign up and saved Aspiration a ton of money on customer acquisition. And our rapid development of tests led us to envision an even better sign-up experience.

Thank you, Matt Lee (VP of Product) for protecting and nurturing our squad in the early days. Adam Stone (PM), your stalwart focus on repeatable and scalable processes was hugely impactful on our team and set us up for much success. Matthew Gonzales, your full stack abilities were incredible to behold. It was a pleasure creating with you all. Dana, Tiffany, Kevin, Vernie, Matt U, you guys carried the torch well!