You constantly hear about A/B testing, and the importance of testing your email. If you’re anything like me, you’re probably wondering “Why? How much difference could it really make?” I send out my emails and I get some leads – it’s great.
The truth is quite horrifying. A good email gets more than a 100% increase in open rates. That's twice as many potential leads, just because you tweaked your email campaigns a little (okay… quite likely a lot). Either way, it makes your email campaigns much more interesting all of a sudden.
MarketingSherpa does a good job in this article of breaking down the actual difference between two emails they ran. The user comments highlight a few factors you might also want to test, and this MarketingSherpa article provides an overview of how significant various factors are (there are some notable overlaps, highlighted in bold):
- Variance in link location performance (on one email)
- Quantity of images
- Subject line
- User profiling / targeting
- Landing page design & performance for conversion
The bottom line is that your bottom line could be improved by experimenting with some of these factors. Don't try to tackle them all at once; focus on a few at a time and check whether there's any improvement. Keep in mind that open rates aren't the end goal: you're still focused on leads and conversions. Repositioning solely to achieve a higher open rate isn't helpful if there's no change in the net performance of your leads.
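When you do check for improvement, it helps to know whether the difference between two variants is real or just noise. As a rough sketch (the numbers below are hypothetical, not from the MarketingSherpa tests), a two-proportion z-test can tell you whether variant B's open rate genuinely beats variant A's:

```python
import math

def open_rate_lift(opens_a, sent_a, opens_b, sent_b):
    """Compare the open rates of two email variants with a two-proportion z-test."""
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    # pooled open rate under the null hypothesis that the variants perform the same
    p = (opens_a + opens_b) / (sent_a + sent_b)
    # standard error of the difference between the two proportions
    se = math.sqrt(p * (1 - p) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    return p_b - p_a, z

# Hypothetical split: 1,000 recipients per variant
lift, z = open_rate_lift(opens_a=180, sent_a=1000, opens_b=240, sent_b=1000)
print(f"lift: {lift:.1%}, z: {z:.2f}")
```

A |z| above roughly 1.96 means the difference is significant at the 95% level; anything below that, and the "winner" may just be luck, so keep the test running. The same arithmetic applies to click-throughs or conversions, which, per the point above, are the numbers that actually matter.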