How To Test Design Impact And List Quality

Target audiences can be fickle. An email campaign drives inbound calls to the sales reps all week, and then the next week: crickets. We know we are sending to recipients who need our services and products, but some messaging just doesn’t speak to their needs. Even how you present a call to action can elicit different responses from your readers. And that’s OK. This is an opportunity to drill down and learn more about your consumers so you can maximize your campaigns going forward.

Breaking your email lists into groups and testing content topics and types (articles, infographics, videos, etc.) can help you decide what to send to whom. In the simply laid-out article below, mediapost.com introduces A/B testing as a way to lift open rates and, ultimately, drive traffic to your organization.
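As a rough illustration of that kind of segmentation, here is a minimal Python sketch that shuffles a flat recipient list and deals it into equal test groups, one per content type. The function, addresses, and group names are our own illustrative assumptions, not anything prescribed by the article.

```python
import random

def split_list(recipients, group_names, seed=42):
    """Shuffle a recipient list and deal it round-robin into equal test groups."""
    rng = random.Random(seed)          # fixed seed keeps the split reproducible
    shuffled = recipients[:]           # copy so the original list is untouched
    rng.shuffle(shuffled)
    groups = {name: [] for name in group_names}
    for i, address in enumerate(shuffled):
        groups[group_names[i % len(group_names)]].append(address)
    return groups

# Hypothetical example: test three content types against the same list.
groups = split_list(
    ["a@example.com", "b@example.com", "c@example.com",
     "d@example.com", "e@example.com", "f@example.com"],
    ["article", "infographic", "video"],
)
print({name: len(members) for name, members in groups.items()})
# -> {'article': 2, 'infographic': 2, 'video': 2}
```

Shuffling before dealing matters: lists are often sorted by sign-up date or domain, and an unshuffled split would bake that bias into the test groups.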

Jim Anthony | So-Mark Founder

Source: mediapost.com | Re-Post So-Mark 5/3/2017

Testing is a time-honored direct marketing practice. To this day, marketers test ideas, both online and offline, using “math developed a century ago,” Stefan Thomke and Jim Manzi wrote a few years ago in the Harvard Business Review. But a shocking number of marketers don’t. For example, half do not test email, GetResponse reported in a study earlier this month.

Hard to believe — but it’s in line with other recent surveys. In effect, it means that email blasts go out with little insight into whether subject lines, offers, calls to action, headlines, body copy, landing pages and email lists will work.

Yes, you can rationalize that wasted emails don’t cost much (as opposed to misdirected direct mail packages). But what about the annoyance factor and the failure to engage with the customer and build a relationship?

You can’t afford to buy into this folly. Above all, you have to test creative and other elements to determine whether they work on a mobile device, said Jay Schwedelson, CEO of Worldata, in a recent Data & Marketing Association webinar. But he advises against a straight 50-50 split. “On the surface, A/B testing for email sounds like a great idea,” he says in an email. “The problem is that when you A/B test, you wind up sending 50% of the audience you are emailing the lesser-performing version of your campaign. I prefer to recommend a different setup. Take 20% of your overall audience and do an A/B test with that group. Then wait 24 hours and see the open and initial click rate on those drops. You will see statistically relevant data that should provide a clear path on which version is best. Then roll out to the remaining 80% with the ‘winner.’ This scenario allows you to send the best version of your campaign to 90% of your audience instead of just 50% of your audience.”
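To make the arithmetic of that setup concrete, here is a minimal Python sketch of the 20/80 procedure. The audience size, the open and click counts, and all function names are illustrative assumptions, not figures from Worldata.

```python
import random

def run_split_test(audience, test_fraction=0.20, seed=7):
    """Carve out a test pool (default 20% of the list) and split it into A and B."""
    rng = random.Random(seed)          # fixed seed keeps the draw reproducible
    shuffled = audience[:]
    rng.shuffle(shuffled)
    test_size = int(len(shuffled) * test_fraction)
    test_pool, holdout = shuffled[:test_size], shuffled[test_size:]
    half = test_size // 2
    return test_pool[:half], test_pool[half:test_size], holdout

def pick_winner(opens_a, clicks_a, sent_a, opens_b, clicks_b, sent_b):
    """After the 24-hour wait, compare open + initial click rates per recipient."""
    score_a = (opens_a + clicks_a) / sent_a
    score_b = (opens_b + clicks_b) / sent_b
    return "A" if score_a >= score_b else "B"

# 10% of the audience gets version A, 10% gets version B;
# 24 hours later the winner goes to the remaining 80% holdout.
audience = [f"user{i}@example.com" for i in range(1000)]   # hypothetical list
group_a, group_b, holdout = run_split_test(audience)
winner = pick_winner(opens_a=42, clicks_a=9, sent_a=len(group_a),
                     opens_b=31, clicks_b=5, sent_b=len(group_b))
print(winner, len(holdout))  # -> A 800
```

Because the winning version already went to its own 10% test group, rolling it out to the 80% holdout means the stronger creative reaches 10% + 80% = 90% of the list, which is the figure Schwedelson cites.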

Read the full article…
