Homeworks Energy
Objective
We were charged with increasing the conversion rate of the sign-up flow for home energy assessments. Homeworks Energy is a company that makes most of its revenue by inspecting homes and advising homeowners on how to make them more energy efficient. The average monthly conversion rate of the flow was around 45%.
Analytics
Before the project began, there were no tools in place to measure the success of the online scheduler. No one was sure what the overall conversion rate was, much less the conversion rate of each individual page.
My first step was to implement Google Analytics to measure the number of users reaching each page of the sign-up flow. Analytics gave us a 10,000-foot view of what was happening in the flow.
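To make the instrumentation concrete, here is a minimal sketch of per-step page-view tracking with gtag.js. The case study doesn't describe the exact setup, so the measurement ID and step paths below are placeholders rather than the real configuration.

```ts
// Minimal sketch of per-step page-view tracking with gtag.js (GA4).
// Assumes the gtag.js snippet is already loaded; the measurement ID and
// step names below are placeholders, not the real ones.
declare function gtag(...args: unknown[]): void;

// Disable automatic page views so each step of the flow can be reported explicitly.
gtag('config', 'G-XXXXXXXXXX', { send_page_view: false });

// Call this whenever the user advances to a new step in the sign-up flow.
function trackStep(stepPath: string, stepTitle: string): void {
  gtag('event', 'page_view', {
    page_location: `${window.location.origin}${stepPath}`,
    page_title: stepTitle,
  });
}

// Example: hypothetical step names for the assessment scheduler.
trackStep('/schedule/eligibility', 'Eligibility');
trackStep('/schedule/contact-info', 'Contact info');
```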
It was clear that the contact info page was where we lost the most customers. Its conversion rate was only around 60%, compared to 80-90% for the other pages.
Session recording
I analyzed over 600 real session recordings of users attempting to sign up for a home energy assessment. The goal was to discover where users most often gave up or ran into difficulties.
I found that most people were giving up on the contact info page before answering any questions. What I gathered from the session recordings and the heuristic analysis was that users found the page overwhelming. They would simply look at the page and end the session.
The second biggest problem on the contact info page was the address field. Chrome's autofill appeared to be interfering with our address-autocomplete feature, and users were getting frustrated and giving up.
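The case study doesn't say how this was ultimately resolved, but a common workaround is to discourage the browser's own autofill dropdown on the field that hosts the custom autocomplete. A rough sketch, with a hypothetical field selector:

```ts
// Sketch of suppressing Chrome's native autofill on an address input that
// hosts a custom autocomplete dropdown. The selector is hypothetical, and
// this is a generic workaround rather than the fix that actually shipped.
const addressInput = document.querySelector<HTMLInputElement>('#street-address');

if (addressInput) {
  // Chrome has historically ignored autocomplete="off" on address fields,
  // so an unrecognized token is often used instead to keep the browser's
  // dropdown from appearing on top of the custom suggestions.
  addressInput.setAttribute('autocomplete', 'no-browser-autofill');
}
```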
Other notable issues included poor error handling: after clicking the “Next” button, users were not taken back up the page to incomplete or incorrectly filled-out fields. The size of the heating-type buttons on the eligibility page also caused problems for mobile users.
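For context, the missing error-handling behavior amounts to returning the user to the first problem field when they try to advance. A minimal sketch of that behavior, assuming a hypothetical form selector (illustrative, not the fix the team shipped):

```ts
// Minimal sketch: when "Next" is clicked on an incomplete form, scroll to and
// focus the first field that fails native HTML validation.
const form = document.querySelector<HTMLFormElement>('#contact-info-form');

if (form) {
  form.noValidate = true; // take over validation UI from the browser

  form.addEventListener('submit', (event) => {
    if (!form.checkValidity()) {
      event.preventDefault();
      // Find the first invalid control, bring it into view, and focus it.
      const firstInvalid = form.querySelector<HTMLElement>(':invalid');
      firstInvalid?.scrollIntoView({ behavior: 'smooth', block: 'center' });
      firstInvalid?.focus({ preventScroll: true });
    }
  });
}
```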
Heuristic analysis
I analyzed each element of the UI, drawing both on research from sources like Baymard and NNGroup and on observations learned on the job.
You can see all of my comments in the image below. Some of the big themes I noticed were overly wordy question phrasing, no indication of which questions were mandatory vs. optional, gray text fields that appeared disabled, and colors that were not accessible to colorblind users.
Using the findings from our research, I made a user journey map (seen below) outlining the good and bad points of going through the flow.
Some of the top issues were:
Gray text fields that looked like they were inactive and made the page overwhelming.
Copy that was not easily scannable. Text fields were labeled as “What is your first name?” instead of “First name”.
Several questions turned out to be unnecessary after an internal review.
User research
I wanted to learn exactly who the users were so we could make sure we were still catering to them. All of the users were homeowners who tended to skew age 50+. A strong majority, about 70%, were on desktop, while ~28% used mobile and only ~2% used tablets. Knowing that older customers tend to be less tech-savvy, we focused on simplifying the flow even further.
Internal research
Since our biggest goal was to simplify the UI as much as possible, we wanted to remove every unnecessary question. That required talking to all the internal departments at Homeworks to make sure we were not losing any crucial information. After speaking with members of the marketing, sales, and operations departments, we learned what we could remove, which was substantial, though not quite as much as I had expected. Some questions I had assumed were unimportant turned out to be crucial for certain departments.
Design
When I got to the design phase, the biggest focus was simplicity and making the flow more accessible to all users. Based on our internal interviews, we eliminated as many questions as we could, simplified the phrasing of the remaining questions, improved the error handling, and made the pages less visually overwhelming.
I created the new designs in Figma with a prototype and full user flow (seen below) to ensure the developers would have a reference for how the product should behave.
We conducted several usability tests of the new designs to make sure that we were not introducing any new usability issues into the flow. We were comfortable concluding that we were not introducing anything that would confuse users.
Results
As soon as we released our changes, the overall monthly conversion rate jumped from ~45% to ~65%, a gain of about 20 percentage points (roughly a 44% relative increase). The weekly conversion rate of the contact info page increased from ~60% to ~85%.
Next steps
Once we solved these problems, others cropped up. A bug on the schedule page that affected some users had not been apparent before, and the logic that determined which customers were within our service range also needed to be corrected after we ran into issues with it.
There were also some changes I wanted to validate with A/B testing. For instance, changing the fields from gray to white was a source of controversy, so to make sure we were making the right decision I used Google Optimize to A/B test the field color. Over the course of one month, the white fields performed ~12% better than the gray fields. Once we had enough data, we were confident we could switch over to white fields entirely.
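Google Optimize handled the statistics for us, but as a rough illustration of how a lift like this could be sanity-checked by hand, here is a simple two-proportion z-test. The visitor and conversion counts below are made-up placeholders, not actual experiment data.

```ts
// Rough sketch of a two-proportion z-test for an A/B test on field color.
// The counts below are hypothetical, not real experiment data.
function twoProportionZTest(
  conversionsA: number, visitorsA: number,
  conversionsB: number, visitorsB: number,
): number {
  const rateA = conversionsA / visitorsA;
  const rateB = conversionsB / visitorsB;
  // Pooled conversion rate under the null hypothesis of no difference.
  const pooled = (conversionsA + conversionsB) / (visitorsA + visitorsB);
  const standardError = Math.sqrt(
    pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB),
  );
  return (rateB - rateA) / standardError;
}

// Hypothetical counts: gray fields (A) vs. white fields (B).
const z = twoProportionZTest(540, 1200, 610, 1200);
// |z| > 1.96 corresponds to significance at the 95% level (two-sided).
console.log(`z = ${z.toFixed(2)}, significant: ${Math.abs(z) > 1.96}`);
```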
The changes were an overwhelming success. Once the changes for Home Energy Assessments had been released, we went to work on increasing the conversion rate for HVAC Sales. Applying the same practices as before, we increased that flow's overall conversion rate from 24% to 55%, a 129% increase.