
Discover How To Optimize Digital Customer Experience With A/B Testing


In the Age of the Customer, companies have unprecedented opportunities to expand their business. All you need is a proper digital customer experience strategy, a mobile-first (if not mobile-only) engagement strategy, and a set of tools such as a customer-facing app and a responsive website; then anyone can see – and buy! – whatever you’re selling.

But although it sounds easy, creating digital tools and content that entice people to spend is difficult. Luckily, there’s a surefire way to figure out what works: A/B testing.

In 2008 Bill Gates stated that “We should use the A/B testing methodology a lot more than we do today.” His words still resonate in today’s marketing world. We still do not A/B test as much as we should.

The principle behind A/B testing is simple: by showing different versions of your app and content to randomly selected test groups over a set period of time, you generate data on which option is the most effective. This data offers organizations huge opportunities to increase customer engagement, and this type of testing is one of the most effective ways to move from stale data to smart data.
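To make the mechanics concrete, here is a minimal sketch of what such a split test might look like in code. Everything in it (the bucketing rule, the visitor and conversion counts) is an illustrative assumption, not a reference to any specific testing tool:

```python
import hashlib
from math import sqrt

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a user into 'A' or 'B' so they always see the same version."""
    bucket = int(hashlib.md5(user_id.encode()).hexdigest(), 16) % 2
    return "A" if bucket == 0 else "B"

# Hypothetical outcomes collected over the test period.
results = {
    "A": {"visitors": 5000, "conversions": 410},
    "B": {"visitors": 5000, "conversions": 480},
}

p_a = results["A"]["conversions"] / results["A"]["visitors"]
p_b = results["B"]["conversions"] / results["B"]["visitors"]

# Two-proportion z-score: a rough check that the observed lift is not just noise.
pooled = (results["A"]["conversions"] + results["B"]["conversions"]) / (
    results["A"]["visitors"] + results["B"]["visitors"]
)
se = sqrt(pooled * (1 - pooled) * (1 / results["A"]["visitors"] + 1 / results["B"]["visitors"]))
z = (p_b - p_a) / se

print(f"A: {p_a:.2%}, B: {p_b:.2%}, relative lift: {(p_b - p_a) / p_a:.1%}, z-score: {z:.2f}")
```

As a rule of thumb, a z-score above roughly 1.96 corresponds to the usual 95 percent confidence level; below that, the difference could easily be chance and the test should keep running.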

Take Barack Obama, as an example. The last Obama campaign website allowed users to leave their email addresses if they wanted to sign up for a newsletter or contribute to the campaign. The page displayed a photograph of the candidate in an ocean of supporters waving “Obama” flags next to a field for entering email addresses, along with a sign-up button.

The Obama team wondered whether this was the best possible image and button combination. After a series of A/B tests using different images and texts, they found a winner: visitors were 40.6 percent more likely to share their email address when the website showed a photo of Obama surrounded by his family, next to a button that said “Learn more.” For the Obama campaign, this change translated into 2.8 million more email subscribers and an additional 57 million dollars in donations.

As you can see from this example, A/B testing offers huge advantages, and it is (or, at least, should be) an integral part of customer experience management. It’s also worth noting that although this testing technology used to be complicated and expensive, that is no longer the case.

In recent years, A/B testing has become an established method to improve web pages, and thanks to the Right-Time Personalization feature of the Neosperience Cloud we are now bringing this capability to the next level in the app domain.

To be clear: before you start implementing this critical improvement to your app and website, you need to consider exactly what you want to test, because without a defined goal and a hypothesis, A/B testing won’t tell you much. Every app is different; every website is different; every brand is different; and there will never be a ‘one size fits all’ solution.

Just like any other aspect of your business, you need a specific set of key performance indicators. So, start by clearly defining quantifiable success metrics – that is, metrics which measure whatever data is most relevant to you – in order to evaluate the testing later.

For instance, let’s say you are a retailer. It wouldn’t be a very good strategy to implement A/B testing that simply measured taps in your app, because taps don’t indicate what customers think about your products. Instead, purchases, products added to cart, product views, shares, comments and repeat visits to your app say far more about what kind of content resonates with your customers.
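As a rough illustration, the sketch below derives purchase-oriented metrics from a hypothetical event log instead of counting raw taps. The event names and users are made up for the example and are not tied to any particular analytics product:

```python
from collections import Counter

# Hypothetical in-app events; in practice these would come from your analytics pipeline.
events = [
    {"user": "u1", "type": "product_view"},
    {"user": "u1", "type": "add_to_cart"},
    {"user": "u1", "type": "purchase"},
    {"user": "u2", "type": "product_view"},
    {"user": "u2", "type": "share"},
    {"user": "u3", "type": "product_view"},
    {"user": "u3", "type": "add_to_cart"},
]

counts = Counter(e["type"] for e in events)
viewers = {e["user"] for e in events if e["type"] == "product_view"}
buyers = {e["user"] for e in events if e["type"] == "purchase"}

# Metrics tied to business value rather than raw interaction volume.
view_to_purchase_rate = len(buyers & viewers) / len(viewers)
add_to_cart_rate = counts["add_to_cart"] / counts["product_view"]

print(f"view-to-purchase rate: {view_to_purchase_rate:.0%}")
print(f"add-to-cart rate: {add_to_cart_rate:.0%}")
```

Metrics like these make the later evaluation unambiguous: whichever variant moves them the most is the winner.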

Although A/B testing is great for tweaking and refining your digital presence, it can also lead you to make major, widespread changes in how you structure your mobile apps and websites.


This was what Disney experienced when it performed an A/B testing experiment on the homepage of one of their TV networks, ABC Family. When they examined their search logs, the Disney team noticed that a lot of visitors were searching for specific shows.

So, instead of making minor tweaks, they decided to create a completely different homepage structure which listed all the shows, making them easier to find. Disney’s goal was to increase the number of clicks on the experiment page by 10 to 20 percent – which they easily surpassed. As a result of the A/B tests, in fact, engagement went up by 600 percent.

One more example: Netflix used A/B testing in this way when the company redesigned its user interface, starting in 2011. The original interface suggested just four titles, each with a star rating and a play button underneath. The video service then tested another variation, which showed near-endless rows of thumbnails with pictures and titles users could scroll through. This variation was hugely effective: it not only improved customer retention, it also increased engagement.

What conclusions can we draw from these case studies? When it comes to digital customer experience design, whether the focus is your app or your responsive website, less really is more. A/B testing has repeatedly shown that removing any fields that aren’t absolutely crucial has a huge impact on the engagement of the people who ultimately matter most: users, visitors, customers.

Every test and choice you make should serve to improve the experience of your customers. And a better customer experience always converts into revenue: by using a “hide” function (a feature that displays information only when it is asked for) for the promotion code and shopping options forms in its checkout dialogue, US retailer Cost Plus World Market increased revenue per visitor by 15.6 percent.

Of course, there are occasions when you can’t declutter or remove fields. In those cases, you should instead break a longer form into smaller sections, especially on mobile. This approach worked for Obama’s 2012 re-election campaign. The donation page simply couldn’t be pared down any more – every element was vital.

But then the campaign team had an idea: In order to make the form appear shorter, they broke it down into two separate pages – one with the amount donated, the other with personal details. Together with other similar alterations, this tiny change amounted to an extra $190 million in donations.
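For illustration only, a multi-step form of this kind can be modelled as a list of steps where each page renders just its own fields. The step and field names below are hypothetical, not the actual campaign form:

```python
# A minimal sketch of splitting one long form into two shorter pages
# (step and field names are illustrative assumptions).
FORM_STEPS = [
    {"title": "Choose your donation", "fields": ["amount", "frequency"]},
    {"title": "Your details", "fields": ["full_name", "email", "zip_code", "card_number"]},
]

def fields_for_step(step: int) -> list[str]:
    """Return only the fields to render on the current page, keeping each step short."""
    return FORM_STEPS[step]["fields"]

answers: dict[str, str] = {}
answers.update({"amount": "25", "frequency": "once"})  # collected on the first page
print(fields_for_step(1))  # only the personal details remain for the second page
```

The form looks shorter to the visitor even though the total number of fields is unchanged, which is exactly the effect the campaign was after.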

An image says more than a thousand words, as the saying goes. There’s a lot of truth to that, but A/B testing shows us that if you really want to engage people, you need to find the right design, the right images and the right words, and you cannot determine what is right without trying. That’s what testing is all about, after all.

So, stop wasting precious time on fragile assumptions and useless “what if” meetings. Start doing, fail fast, and let your customers show you the way to go.


To help you provide a strategic advantage to your organization, Neosperience has crafted the first DCX 7-Steps Checklist, with requirements and insights for a successful digital transformation. Download the free DCX 7-Steps Checklist here.
