A/B tests are an excellent way to find out what wording resonates with your audience, but if you aren't finding trends in both your positive and negative responses, you won't be able to fully optimize your campaign.
Looking for the subtleties in your prospects' responses can be time-consuming, but I've significantly improved my response rates by committing time to understanding the thoughts and feelings of the audience.
This test was run on a campaign targeted toward legal professionals who bill hourly.
Initially, the campaign had open rates of 30–40%, but the response rate was less than 1%.
After pouring an evening into reading through almost every response and sorting each one into a category, I was able to identify three major issues with my copy that I otherwise would have overlooked.
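If you have hundreds of replies to sort, a rough keyword tally can speed up the bucketing before you read the edge cases by hand. Here's a minimal sketch of that idea in Python; the category names, keywords, and the replies.txt export are hypothetical placeholders, not the exact buckets or tooling I used.

```python
from collections import Counter

# Hypothetical categories and keywords -- swap in the themes you actually see in replies.
CATEGORIES = {
    "security concerns": ["secure", "security", "privacy", "confidential"],
    "always listening": ["listening", "alexa", "recording"],
    "already has a tool": ["already have", "already use", "current tool"],
}

def categorize(reply: str) -> list:
    """Return every category whose keywords appear in the reply text."""
    text = reply.lower()
    return [name for name, keywords in CATEGORIES.items()
            if any(kw in text for kw in keywords)]

def tally(replies) -> Counter:
    """Count how often each sentiment category shows up across all replies."""
    counts = Counter()
    for reply in replies:
        counts.update(categorize(reply) or ["uncategorized"])
    return counts

# Example: replies exported from your outreach tool, one per line.
# with open("replies.txt") as f:
#     print(tally(f.read().splitlines()).most_common())
```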
Here is the original copy:
{FirstName},
Losing billable time is leaving revenue on the table. Instead of manually tracking your time, you can simply tell {Our Company}, your virtual time-tracking assistant.
{Our Company} helps you bill more time with less administrative effort.
To help you get started, we’ll send you a new Amazon Echo Dot with your subscription.
Can we carve out 5 mins for a demo?
The responses this copy generated were drenched in curiosity and fear.
Here are the most common response sentiments that I hadn't addressed in my first draft:
- It doesn’t seem like this is secure
- Isn’t Alexa always listening?
- I already have a tool to track my time
Each one of these is an easy objection to handle, but how many people had these same fears or assumptions when reading the first draft of this email and simply chose not to respond? (The answer is TOO MANY.)
Here’s the updated copy:
Hi {FirstName},
Have you considered how many more hours your firm could bill if your team didn’t have to manually record each task?
We’ve created a hyper-secure way for attorneys to track billable hours by voice.
Our digital assistant integrates fully with popular practice management software like {PARTNER 1}, {PARTNER 2}, and {PARTNER 3}.
Can we set aside 10 minutes for a demo?
What changed:
- I chose not to talk about technology that prospects find inherently scary (we can do that on the phone call, where objection handling is instantaneous).
- I emphasized security.
- I stopped talking about “getting started” and focused on integration instead.
Sure, with enough A/B testing all of these issues might have surfaced eventually, but by taking the time to LISTEN to my prospects, I was able to craft better messaging that achieved a better response rate faster.