I ran the following split test on lead gen pages for a Jewish dating site I’m working on, MatchedByFriends.com.
I’d like to share not just what I did on the landing pages, but also some context on the traffic. Too often, traffic quality and source go undiscussed, leaving the reader’s understanding of an A/B test in a fair measure of fog.
I was using the Google Display Network, with site targeting, to drive traffic to my pages. I’m not going to share placements and banners, since that’s insight more valuable to competitors, but I will say that I focused on Israeli, English-speaking traffic. The one exception was when I tried to ramp up traffic towards the end of the test with less accurate targeting, and ended up with a load of visitors who spoke the wrong language, weren’t interested in dating Jews, and brought click fraud to boot. (I took screenshots of the click-fraud AdSense sites and showed them to Google, which, after repeatedly telling me I could simply exclude those sites from my targeting in the future, eventually gave in to my polite insistence and issued a credit.)
Anyway, with my initial control landing page, I sought to use an image of a happy couple to convey the benefit of what I was offering. Benefits, not features, right?
Here’s what the page looked like.
The next page tested the hypothesis that moving the CTA further up (on my laptop screen it was below the fold) would help:
Finally, I took Steve Krug’s advice and just made things super concise:
The data: I’ll admit that I cut the test slightly short.
The reason is that the trend was strongly in favour of landing page 3, which converted at around 9.3% vs. ~3% for the others, and that similar hypotheses and tests have produced similar results in the past. So while I can’t say with statistical certainty that the results weren’t a fluke, prior experience makes me feel strongly that this was a win.
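If you want to sanity-check a result like this yourself, a two-proportion z-test is the quick way to gauge whether a conversion-rate gap is likely a fluke. Here’s a minimal Python sketch; the per-variant visitor counts below are made up for illustration (the real traffic numbers aren’t in this post), and a z above roughly 1.96 corresponds to 95% confidence.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical numbers: 300 visitors per variant,
# 28/300 ~ 9.3% conversions vs. 9/300 = 3%.
z = two_proportion_z(28, 300, 9, 300)
print(round(z, 2))  # ~3.22, comfortably above the 1.96 threshold
```

The point of the sketch: at a few hundred visitors per variant, a 9.3%-vs-3% spread is already well clear of chance; with much smaller samples it wouldn’t be, which is why cutting a test short is normally risky.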
In the past, I did extensive testing on very similar landing pages (here’s a no-longer-functioning example) for my advanced SEO book. Those pages traded free chapter downloads for a name and email. I got the conversion rate on those up to 30%, and 50% on warm traffic from my blog. So the “cut the copy and get to the point” approach to lead generation (aka the “title, benefit bullets and a CTA” template shared by Tim Ash of SiteTuners at Affiliate Summit) has proven itself to me over and over.
p.s. I used Visual Website Optimizer to split the traffic, and measured conversions with some custom PHP code I wrote that sends conversion source/campaign/landing-page data to MailChimp (or any other email marketing service provider). Part of the reason for the custom code is that I was also split testing the brand name, based on an idea from The 4-Hour Workweek, where Tim Ferriss tested book titles by AdWords CTR; I wanted to test my brand name by conversion rate. Unfortunately, AdWords no longer allows more than one domain in an ad group, and my attempt to copy targeting across campaigns and ad groups resulted in dramatically uneven traffic levels between the domains (of which GoodIntros was one), making that aspect of the test pointless. I ultimately arrived at MatchedByFriends thanks to some very fun collective brainstorming I crowdsourced via Amazon Mechanical Turk. If people are interested, I can write about that process in another post.
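For the curious, the gist of that tracking code is simple: when a visitor converts, subscribe them to the list and stash where they came from in merge fields. Mine was PHP; here’s an illustrative Python sketch of the same idea against MailChimp’s v3 `POST /lists/{list_id}/members` endpoint. The merge-field tags (SOURCE, CAMPAIGN, LP), the data centre, and the list ID below are hypothetical, not the ones I actually used, and the merge fields would first need to be created on the list.

```python
import json

def build_subscribe_request(list_id, email, source, campaign, landing_page):
    """Build the URL and JSON body for a MailChimp v3 list-member request
    that records the lead's traffic source, campaign, and landing page.

    SOURCE, CAMPAIGN, and LP are hypothetical merge-field tags.
    """
    # "us1" is a placeholder data centre; MailChimp keys encode the real one.
    url = f"https://us1.api.mailchimp.com/3.0/lists/{list_id}/members"
    payload = {
        "email_address": email,
        "status": "subscribed",
        "merge_fields": {
            "SOURCE": source,        # traffic source / placement
            "CAMPAIGN": campaign,    # ad campaign name
            "LP": landing_page,      # which split-test variant converted
        },
    }
    return url, json.dumps(payload)

# Example: a lead converting on landing page variant 3.
url, body = build_subscribe_request(
    "abc123", "lead@example.com", "gdn", "israel-en", "lp3")
```

From there it’s one authenticated POST per conversion; segmenting the list by those merge fields later is what lets you compare lead quality, not just lead volume, across variants.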
p.p.s. What difference would it make to your business if you had, say, 30% higher conversion? Get in touch at ConversionRateOptimization.co and I’ll show you opportunities to improve your site live over the phone, free. (I’m not worried about abuse, because I figure if you like what I show you, then you’ll treat me fairly and hire me to consult or provide services.)