Divide (with Data) and Conquer!

Your idea might be brilliant, but probably not for every customer. Experiment with customer samples to generate action and knowledge, thereby avoiding losses in the market.


Making small changes to your product or service can have a huge effect on customers. Kevin Systrom, founder of Instagram, often tells the story of how his girlfriend told him she didn’t like posting photos on the app because most of the photos there were taken by expert photographers and editors who made them look very professional. “You should add filters,” she told him. That small change gave Instagram a distinctive advantage in the market.

We are all searching for the change that will increase sales, guarantee customer loyalty, and perhaps even open markets we had never imagined. This search puts enormous pressure on managers, who often react impulsively to the market: an idea put forward by an executive, or a move by a direct competitor, turns into a hasty call to the media agency and an emergency campaign aimed at absolutely every customer... “anyone who will go along with it.”

We often receive offers to defer bank credit card payments, but they can be perplexing when they arrive after we have already paid 100% of the balance. Why do we receive offers that make no sense? Because most companies (including this hypothetical bank) assume that the more people receive the offer, the greater the chance of success. However, many customers will probably ignore any further messages from this bank if they perceive the offer as a desperate marketing ploy.

The deferment initiative may raise that month’s loan placement percentage, but we will never know exactly what triggered this behavior. Was it the email campaign, the ATM offer, or the one-to-one calls? Or perhaps the economy suffered a setback and customers would have chosen to defer with or without the campaign? More importantly, what types of customers accepted the offer? Who should this offer be sent to next month to generate more impact without overdoing it? Which key messages should be conveyed, and which ones avoided?

A concrete way to test an idea or campaign is through A/B testing. The technique is neither new nor shrouded in statistical mystery: it consists of testing an idea on a specific group of customers or users rather than on the entire population. In the case of the credit-deferment campaign, the bank could select only customers who are at least two weeks behind on their last monthly payment, or those of a certain age or geographic region.
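
To make the idea concrete, here is a minimal sketch of that kind of segmentation in Python. The customer table and its column names (days_past_due, age, region) are hypothetical, invented for illustration rather than taken from the article.

```python
import pandas as pd

# Hypothetical customer table; column names and values are illustrative only.
customers = pd.DataFrame({
    "customer_id": [101, 102, 103, 104, 105, 106],
    "days_past_due": [0, 16, 21, 3, 30, 14],   # days late on the last monthly payment
    "age": [34, 58, 41, 29, 63, 47],
    "region": ["North", "North", "South", "Center", "South", "North"],
})

# Segment: only customers at least two weeks behind on their last payment.
eligible = customers[customers["days_past_due"] >= 14]

# Split the segment at random into a test group (receives the deferment offer)
# and a control group (does not), so the offer's effect can be isolated later.
test = eligible.sample(frac=0.5, random_state=42)
control = eligible.drop(test.index)

print(len(test), "customers receive the offer;", len(control), "are held out as control")
```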

In the online world, an A/B test shows two versions of a website to different groups of users to find out whether specific variables (the font, or the placement and size of the buy button) have an effect on conversion (sales divided by the number of visitors). Large retailers use heatmap tools such as Hotjar to identify the most visited areas of their websites, observing the path users follow through the site and the moment when they buy or abandon the online store. Some examples of A/B testing actions are listed below (a short sketch of how two versions are compared follows the list):

  • Try a more visible search bar for a few weeks if the heatmap indicates that it is the first place customers go to or from which most purchases originate.
  • Move the buy button to the center of the page instead of the right-hand side.
  • Replace phrases such as “Get a discount on the first order” with “Get an additional discount.”
  • Test changes on the website before the app. 
  • Declutter the offers page.
  • Focus the website content on a single product.
  • Change the color and size of the buy button.
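
As a minimal sketch of the comparison itself, conversion and lift for two versions can be computed as below. The visitor and purchase counts are invented for illustration, not real retailer data.

```python
# Toy numbers for two page versions shown to comparable groups of visitors.
visitors = {"A": 10_000, "B": 10_000}    # A: buy button on the right, B: centered
purchases = {"A": 320, "B": 352}

for version in ("A", "B"):
    conversion = purchases[version] / visitors[version]   # sales / number of visitors
    print(f"Version {version}: conversion = {conversion:.2%}")

# Relative lift of version B over version A.
lift = (purchases["B"] / visitors["B"]) / (purchases["A"] / visitors["A"]) - 1
print(f"Lift of B over A: {lift:.1%}")
```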

Knowing which group you want a campaign or new product to target is crucial. If the website and app users are different, why put both types of customers in the same basket? Defining the expected results is just as important. Is a 3% lift in conversion (number of buyers / number of visitors) enough to justify changing the size of the buy button? If the objective has not been defined, it is very likely that no action will be taken after the test, or that you will give in to the temptation to change more variables (additional offers, website design, price) without knowing for sure what happened.
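
One way to keep the objective honest is to write it down before the experiment and check the result against it afterwards. The sketch below assumes invented counts and a hypothetical 3% target lift, and uses a simple two-proportion z-test to judge whether the observed difference is more than noise.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical results; the counts are invented and the 3% target mirrors
# the question in the text, not a real benchmark.
visitors_a, buyers_a = 20_000, 800    # control: current buy button
visitors_b, buyers_b = 20_000, 860    # variant: larger buy button

conv_a, conv_b = buyers_a / visitors_a, buyers_b / visitors_b
observed_lift = (conv_b - conv_a) / conv_a     # relative lift of the variant
target_lift = 0.03                             # objective defined BEFORE the test

# Two-proportion z-test (normal approximation): is the difference
# larger than what random noise alone would typically produce?
pooled = (buyers_a + buyers_b) / (visitors_a + visitors_b)
se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
z = (conv_b - conv_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

act_on_change = observed_lift >= target_lift and p_value < 0.05
print(f"lift = {observed_lift:.1%}, p-value = {p_value:.3f}, act on the change: {act_on_change}")
```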

The Harvard Business Review guide to analytics for managers describes a common mistake that leaders make: interrupting the experiment. As soon as they discover an insight, they want to act on it and, logically, this can lead to errors. It is like opening the oven door too often; the variation in humidity and heat can ruin the shape and texture of a cake. Another mistake is to consider too many metrics at once, either because of the number of variables we want to modify in the experiment (price, volume and color of the coffee cup used in a cafeteria) or because of the results we expect (preference at different times, incremental sales or sentiment analysis on social media).

In marketing, there is a joke that A/B stands for Always Be Testing. It might seem like nothing more than a funny reading of the acronym, but it captures the truth (and seriousness) of every innovative strategy that has gone through difficulties, mistakes, and many iterations. Testing allows you to embrace imperfect ideas, to resist the temptation of trying to be everything for everyone and, most importantly, to learn from your customers along the way. Take a deep breath… the world moves so fast that, if something goes wrong, you can always go back to the original idea.

Originally published in El Universal.
