Boost Power (OVO Energy)
Boost Power needed to improve customer retention but didn't know why customers were leaving. I led a discovery phase to help the team understand our customers' needs. Then, by facilitating collaborative design sessions, we quickly developed concepts and validated them through user testing before launching the 'Winter Wallet', the company's most innovative initiative to date. It delighted customers and significantly improved retention.
OVO Energy launched Boost Power, a Pay As You Go (PAYG) energy brand, in 2017. With PAYG, you need to keep your energy account topped up with credit to ‘keep the lights on’. Boost’s main product, ‘Smart PAYG+’, lets customers do this through a smartphone app linked to their smart meter.
PAYG customers are free to change suppliers whenever they want without paying exit fees. And they were leaving us in droves. The team had lots of ideas for improvements but as we’d never done in-depth customer research, it was impossible to judge if these ideas would solve real customer problems. We needed to take a step back to identify the problems PAYG users actually face. And then agree which problems to go after to improve customer retention.
To better understand customers' pain points, I reviewed customer feedback from social media and our leavers survey. Many complained about the cost of their energy, particularly in the winter, when they might fork out five times as much as in the summer.
The team had formed the following assumption to explain why people leave:
I was keen to use the research to help get to the bottom of this.
Lots of our customers are lower-income mums, and as most of the negative feedback was around energy costs, I was keen to explore:
I believed that interviewing PAYG users in their homes would help us build rapport, making them more comfortable talking about the sensitive subject of money. We’d also get a better sense as to how they lived their lives and what they really spend their money on — reducing any social desirability bias we may have got through interviews in a neutral location.
With a limited research budget, we decided to run a pilot study to help inform the interview guide for the more costly home visits. We ran online interviews through usertesting.com with PAYG users, experimenting with different questions as we went along.
At Boost, we strive to have a multidisciplinary team of problem solvers. It was therefore key that we involved different team members, from developers to copywriters, right from the beginning, to build a shared understanding around the customers’ problems.
I ran a workshop where we watched the usertesting.com interviews as a team and captured observations as we went, which we then grouped into pain points, behaviours, attitudes and understanding, and goals.
Through this activity, some key insights emerged.
These insights allowed me and my research partner to create a more focussed discussion guide for the home visits.
As we didn’t have any experience running interviews in people’s homes, we pulled in another researcher from the OVO group to help out. She trained us and joined for the first interview; I was then able to lead the following interviews, which we conducted with financially struggling mums around the country. These included both our customers and competitors’ customers, so we could learn how they’d solved the problems our customers faced.
As we were interviewing single mums in their homes, we wanted to ensure they felt comfortable, so we made sure a female team member joined every session. Choosing the right number of people to attend was also challenging. We needed to balance the benefits of letting team members experience the interviews first-hand against the risk of overwhelming the interviewee. We had 3 people attend one interview, but it felt too awkward, so we stuck to 1 interviewer and 1 note taker for the remaining interviews. Despite these constraints, we managed to have different members of the development team join all but one interview.
In parallel to running the home interviews, I ran phone interviews with customers who’d downgraded from our main PAYG+ product. I stayed away from the topic of how they manage their money as I felt they were unlikely to open up as much over the phone. Instead, I focussed on why they were unhappy with the PAYG+ product.
Following the interviews, I ran a workshop with the team to analyse the interview transcripts (which I’d written up in-between interviews).
We learnt that interviewees really struggled to save money generally, and with no reserves, paying for energy in the winter consumed most of their budget.
Some people had strategies like topping up more than they needed in the warmer months, building up a surplus, which would ease the pain in the winter. Not everyone was that savvy. We heard stories about others with no savings who had to choose between heating and eating.
Based on these insights, we came up with an alternative interpretation of the social media comments mentioned earlier.
We agreed to tackle this and to help focus the creative exercises, I reframed this as:
HOW MIGHT WE...
Help customers save money in the warmer months so they can cope better with energy costs in the colder months
Over the last half century, there’s been a growing body of psychology and behavioural economics research that’s helping us understand why we behave the way we do. To be successful, we’d need to help our customers change their behaviour (start saving for the winter) so it made sense to build our solutions on top of existing evidence of what works.
In our workshop the next day, I presented some of the relevant behaviour change research and then I facilitated a ‘design studio’. To avoid groupthink, where everyone’s thinking about solving the problem in one way, the design studio format allowed the team to generate a variety of ideas independently, receive critique from the group and then iterate on those ideas.
With the next round of interviews booked in, we had the opportunity to test these concepts so I quickly refined the most promising ideas into a basic prototype. We had 2 ways of helping people cope better with the additional winter energy costs:
You set a savings target and decide what percentage of each future energy top up you make should be transferred to a separate savings pot, which you’d be able to use in the winter.
You see a breakdown of your predicted energy usage across the next 6 months. You can set up a scheduled top up into your energy account, e.g. £25 a week. This would be more than you actually use in the warmer months, so credit would build up on your energy account, which you’d draw on in the winter months when your weekly payments on their own wouldn’t cover your energy usage.
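As a rough illustration of how the scheduled top up smooths costs over the year, the sketch below simulates a fixed £25 weekly payment against made-up seasonal usage figures (in practice, these would come from each customer's predicted usage):

```python
# Rough simulation of the scheduled top-up concept. The £25 weekly payment
# comes from the example above; the seasonal usage figures are made up.

WEEKLY_TOP_UP = 25.0  # fixed scheduled payment, in £ per week

# Hypothetical average weekly energy spend (£) by month
weekly_usage = {
    "May": 12, "Jun": 10, "Jul": 9, "Aug": 10, "Sep": 14, "Oct": 20,
    "Nov": 30, "Dec": 38, "Jan": 40, "Feb": 36, "Mar": 28, "Apr": 18,
}

balance = 0.0
balances = []
for month, usage in weekly_usage.items():
    balance += 4 * (WEEKLY_TOP_UP - usage)  # roughly 4 weeks per month
    balances.append(balance)
    print(f"{month}: account credit £{balance:.0f}")
```

With these illustrative numbers, the surplus built up over summer peaks around £300 in October and is then drawn down through the winter without the account ever going negative.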
I tested the concepts in one-to-one interviews with financially struggling mums held in our office. The idea of a separate savings pot got people really excited.
In contrast, the scheduled payments concept got a mixed reaction. Some felt like they’d lose one of the key benefits of being PAYG — the ability to keep track of how much energy they were using each week.
Something that we’d not previously considered was people’s maths literacy. Testers who struggled with maths found the percentages confusing in the savings pot concept, and the graphs confusing in the scheduled payments concept.
We also explored different reward mechanics to incentivise people to keep saving. Not surprisingly, the idea of rewarding people with a fixed bonus if they hit their target was also very positively received.
Through further user testing and peer feedback, I refined how you contribute to your savings pot. I ditched the fixed percentage idea, which was overcomplicated and didn’t suit those who weren’t confident with maths.
We still wanted to keep the incidental nature of contributing to your savings pot: if you’re topping up your energy account weekly, that’s a perfect opportunity to encourage a top up of your savings pot too.
With the next iteration, once you’ve chosen how much to top up your energy account, you’re asked to contribute to your savings pot. We added instant feedback, showing how close you are to reaching your savings target based on the amount you’ve chosen. And to reduce decision paralysis, we set a £20 default amount (spoiler alert: this came back to bite us when we launched).
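The instant feedback boiled down to a simple progress calculation against the savings target. A minimal sketch, with a hypothetical function name and illustrative copy:

```python
# Hypothetical sketch of the instant feedback shown when choosing a
# Winter Wallet contribution. The function name, message copy and
# figures are illustrative, not the production implementation.

def progress_feedback(saved_so_far, contribution, target):
    """Return a short message showing progress toward the savings target."""
    new_total = saved_so_far + contribution
    pct = min(new_total / target, 1.0)
    if pct >= 1.0:
        return "You've hit your winter savings target!"
    return f"This takes you to £{new_total:.0f}, {pct:.0%} of your £{target:.0f} target"

print(progress_feedback(saved_so_far=60, contribution=20, target=200))
```

Showing the running total and percentage as the customer picks an amount gives them the immediate sense of progress that the behaviour change literature suggests keeps people saving.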
As summer was coming to a close, we knew we had to launch the most basic version of the savings pot straight away so people still had time to build up savings before winter. We launched the initiative, which we rebranded the Winter Wallet, in the last week of August. The key features of the Winter Wallet were:
We’d hypothesised that helping customers save money for the winter using the Winter Wallet would lead to happy customers who want to stay with us for longer.
Based on this hypothesis, we created 2 key performance indicators (KPIs) that we’d track and try to optimise:
From tracking the opt-in rate in the first week, we could see that we were unlikely to hit that target so we needed to take action quickly.
I introduced a variety of feedback mechanisms to learn quickly what customers were doing and why.
Launching our on-site survey helped us learn whether customers who’d visited the Winter Wallet landing page intended to sign up in the future and, if not, what the barriers were. Creating a conversion funnel highlighted where people were dropping out of the sign-up flow.
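A conversion funnel is essentially a chain of step-to-step drop-off rates. A minimal sketch with hypothetical stage names and counts (our real figures came from analytics events on the sign-up flow):

```python
# Sketch of a conversion funnel analysis. Stage names and counts are
# hypothetical; real figures would come from analytics events.

funnel = [
    ("Saw homepage promo", 10000),
    ("Visited landing page", 1800),
    ("Started sign-up", 600),
    ("Completed sign-up", 420),
]

drop_offs = []
for (stage, count), (_, prev) in zip(funnel[1:], funnel):
    drop = 1 - count / prev  # fraction lost since the previous step
    drop_offs.append(drop)
    print(f"{stage}: {count} users ({drop:.0%} drop-off from previous step)")
```

The step with the biggest drop-off is the obvious place to aim the next experiment.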
I devised a series of A/B experiments to optimise the numbers opting in and hitting their target.
Were customers seeing the homepage promo but didn’t have the motivation to click through?
Returning to the behavioural economics literature allowed me to build on learnings from similar experiments. For example, an experiment run by the ‘Common Cents Lab’ found that using ‘claim your discount’ rather than ‘sign up for a discount’ had successfully increased open rates and click-through rates in an email experiment. The authors of the experiment reasoned that ‘claim’ signals a sense of ownership, which triggers feelings of loss aversion. Loss aversion is the tendency to prefer avoiding losses to acquiring equivalent gains. The word also triggers the scarcity bias (we unconsciously assume things that are scarce are more valuable than things that are abundant). We tried changing the language to ‘Claim your 5% savings bonus...With our Winter Wallet’ which led to a massive 56% improvement in sign up rates.
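To be confident an uplift like this isn't noise, a two-proportion z-test is a standard check. The sketch below uses hypothetical visitor and conversion counts chosen to give roughly the 56% relative uplift; only that uplift figure comes from the experiment itself:

```python
# Two-proportion z-test sketch for an A/B copy experiment. The visitor and
# conversion counts are hypothetical, chosen to match the ~56% relative
# uplift reported above; everything else is standard statistics.
from math import sqrt, erf

def normal_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) comparing conv_b/n_b to conv_a/n_a."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    return z, 2 * (1 - normal_cdf(abs(z)))

# Control: 'Sign up for a discount'  |  Variant: 'Claim your discount'
z, p = two_proportion_z_test(conv_a=90, n_a=3000, conv_b=140, n_b=3000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at the 5% level
```

A relative uplift only matters if the underlying conversion counts are big enough for the difference to clear significance; with small samples, a 56% swing can easily be chance.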
Were users even paying attention to the homepage promo?
We experimented with making the promo stand out more by changing its background colour. This created more contrast between the now blue promo and the rest of the page. This resulted in a 22% improvement in sign-ups to the Winter Wallet.
Or were they intrigued by the Winter Wallet promo but needed to focus on the more immediate goal of topping up their energy?
Hardly anyone was clicking the promos we had throughout the top up flow so I suggested we include a message once they had completed their top up task, with an animation to draw extra attention to it.
This created a huge spike in sign ups.
There’s a danger when you’re trying to optimise for a certain behaviour that you make the action too frictionless. Customers who sped through the top up flow, repeatedly clicking the Confirm button, accidentally added the default £20 contribution to their Winter Wallet. We learnt through feedback from the customer services team that this was driving calls from upset customers who couldn’t afford to lock that money away in their Winter Wallet.
In a test environment, it’s difficult to recreate real-life scenarios, such as someone desperate to top up their energy (possibly about to be disconnected) or someone for whom English is a second language. With user testing, these biases can creep in without you realising. For us, the fact that participants were recruited through usertesting.com meant their reading level was high enough to follow the onscreen instructions. User testers also tend to spend more time working through a prototype than they would in reality.
Fortunately, we had the right feedback mechanisms in place to quickly capture and address these issues. In this case, I removed any default so the user had to actively pick an option (including ‘Not now’ if they didn’t want to contribute anything to the Winter Wallet) before they could move onto the payment page.
Nearly 25,000 Boost Smart PAYG+ customers opted in to the Winter Wallet, and between them they saved a whopping £1 million.
7,000 of these customers reached their ambitious targets and received their 5% bonus credit, meaning we credited those customers with a total of around £35,000 towards their energy bills this winter.
To ensure we had the right data to support or reject this hypothesis, we kept a control condition of customers who weren’t exposed to the Winter Wallet. An analysis of the 2 conditions showed that customers in the Winter Wallet condition were significantly more likely to stay with us.
In addition to improving retention, our customers have been very positive about the Winter Wallet on social media, which should help attract new customers to the brand.