Tuft & Needle / 2021

Improving conversion rate through a focus on usability

A graph of our projected versus actual revenue.
Problem

An increasingly competitive marketplace was lowering site conversion rate and decreasing company revenue.

Result

Through a series of A/B tests focusing on usability and conversion, we increased annualized revenue by $35M.

My Role

Product Management, Strategy, Leadership, UX Direction, Mentorship


Tuft & Needle was struggling to compete in the increasingly crowded mattress industry. Copycat products made differentiation incredibly hard, while competitors' shady marketing tactics drowned out our message.

We chose to focus on increasing the usability of our site as the primary way to increase conversion rate. I pulled together a team of UX designers, user researchers, engineering managers, and project managers across three product teams to work on improvements spanning the site funnel.

Identifying problem areas through a usability benchmarking study

To help identify problem areas (and serve as a means of measuring improvements), our user researcher conducted a usability benchmarking study across 15 metrics on nine major site pages. Meanwhile, I used Google Analytics to review page performance and identify the most significant areas of drop-off in our funnel.

Following our initial research, I facilitated three ideation workshops, one each with the Customer Acquisition, Product Discovery, and Purchase Experience product teams. Using the benchmarking study as our guidepost, we ideated ways to improve usability and increase conversion.

Benchmarking data from the PDP
Benchmarking for the product detail page (PDP). For each benchmark, we looked at our performance in the context of our competitors.
Benchmarking data from the PDP continued

An imperfect prioritization process

Because the teams had limited resources and many competing priorities, I worked with them to identify the ideas with the highest potential impact and lowest implementation effort. Admittedly, the decision process was mostly subjective, based largely on gut and past performance.
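The underlying mental model here was a simple impact-versus-effort ranking. A minimal sketch of that heuristic — the idea names and 1–5 scores below are purely illustrative, not the team's actual ratings:

```python
# Rank experiment ideas by impact-to-effort ratio (all scores illustrative).
ideas = [
    {"name": "Idea A", "impact": 4, "effort": 1},
    {"name": "Idea B", "impact": 5, "effort": 4},
    {"name": "Idea C", "impact": 2, "effort": 2},
]

# Highest impact per unit of effort goes first.
ranked = sorted(ideas, key=lambda i: i["impact"] / i["effort"], reverse=True)

for idea in ranked:
    print(idea["name"], round(idea["impact"] / idea["effort"], 2))
```

In practice the scores came from gut feel rather than a spreadsheet, which is exactly the weakness the revenue algorithm later addressed.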

Guiding strategy and removing roadblocks

The teams then got to work implementing the ideas as A/B tests. I met with them each week to guide strategy, provide feedback on designs, interface with data science, troubleshoot implementation issues, and unblock resources. Each month, I presented our progress and findings to the executive leadership team and addressed any problems while getting more buy-in for the work.

$8 million in Q1, but wanting to do more

In Q1 of 2021, we launched 12 experiments. We had some successes and, of course, some failures. Together, those experiments generated $7.58M in incremental annualized revenue for the company. While I was happy with our momentum, I wanted a more scientific way to prioritize what the teams worked on. And so, I went to work creating an algorithm to predict the potential revenue of each experiment idea.
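The case study doesn't spell out how an "incremental annualized revenue" figure is derived from an A/B test. A common approach — an assumption here, not necessarily T&N's exact method — is to extrapolate the winning variant's conversion-rate lift across a full year of traffic:

```python
def incremental_annualized_revenue(control_cr, variant_cr, aov, annual_sessions):
    """Extrapolate an A/B test's conversion-rate lift to a yearly revenue figure.

    control_cr / variant_cr: conversion rates observed in each test arm
    aov: average order value in dollars
    annual_sessions: expected sessions over a full year
    (All parameter names and values here are illustrative.)
    """
    lift = variant_cr - control_cr        # absolute lift in conversion rate
    return lift * aov * annual_sessions   # extra revenue if the lift holds for a year

# e.g. a 0.2-point lift at a $700 AOV across 5M yearly sessions:
print(incremental_annualized_revenue(0.031, 0.033, 700, 5_000_000))
```

The obvious caveat is that this assumes the observed lift persists year-round, which is why winning tests are usually re-validated over time.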

A closer look at conversion rate optimization (CRO)

To get started, I put together a general list of all the ways we might improve conversion by helping our customers.

I realized that I was not a good enough data scientist (or one at all ;) ) to predict the impact of each of these types of changes, and that we might make several of them at once, further complicating any model. So I tried another way of thinking about the problem.

A revenue algorithm is born

Zooming out a bit, I realized that the revenue generated by an experiment would generally be influenced by:

  1. How many people see the experiment
  2. How qualified the traffic is, or in other words, how far down the purchase path the person is
  3. How significant the change is from the baseline

Using Google Analytics as the data source, I took a stab at creating an equation that matched the revenue data we were collecting and then calibrated it using additional past tests.

The revenue algorithm that I created documented in a Confluence article
The revenue projection algorithm in Confluence.

Algorithm success: 132% increase in Q2!

After vetting the algorithm with the data science team, I rolled it out to the CRO team to use for prioritization in Q2. I'm excited to report that, using this method, we increased annualized revenue by 132% that quarter!

A few of our CRO experiments

A presentation slide showing the Bullets-in-buybox experiment hypothesis, supporting data, expected result, and actual result Another presentation slide showing the Bullets-in-buybox experiment desktop implementation
Adding bulleted details to the buybox. This experiment performed much better than anticipated and reinforced how important the buybox real estate is.
A presentation slide showing the cart-fees experiment hypothesis, supporting data, expected result, and actual result
Adding tax, shipping, and recycling fees to the cart. Even though we weren't giving the user any more information about how much they would be charged, simply indicating that we would calculate these fees for them at checkout set better expectations and reduced drop-off in checkout.
A presentation slide showing the Sale-link experiment hypothesis, supporting data, expected result, and actual result
Changing the position of the Sale link. People want to save money. And drawing more attention to our sales by moving the link to the left was well received.

The project was put on hold due to competing priorities and resource constraints during Q3. However, in total, we generated over $34.82M in incremental annualized revenue for the business during this time.


Thank Yous

Brooke Kao User Research

Ryan Evans Product Design

Cristian Valdes Product Design

Meaghan Socaciu Product Design

LoriAnne Carpenter Product & Project Management

Marilyn Cole Engineering & Project Management

Valery Peep Product & Project Management

Colin Darland Engineering Management

Jayson Virissimo Engineering Management
