I have been, or can be if you click on a link and make a purchase, compensated via a cash payment, gift, or something else of value for writing this post. Regardless, I only recommend products or services I use personally and believe will be good for my readers.
Justin Rondeau, karaoke king and Chief Editor & Evangelist with WhichTestWon, presented on Day 1 of Affiliate Summit East 2012 with Top Testing Tips to Optimize Your Site. This was a session all about A/B testing, guiding the audience through what to test, followed by many examples.
Website design is traditionally guided by gut instinct, copying the competition, or following so-called best practices. Yet none of these methods will reliably give you the best conversion rates.
Email marketers have split-tested their mailings more and for a longer period than website owners have tested their designs. Google Website Optimizer was the tool to use for split testing website elements, until they shut it down. Some of the features have been moved into Google Analytics under Content > Experiments.
When deciding what page to test, don’t start with the pages with the highest abandon or bounce rates. Instead, start closest to the money (or even beyond the money – the thank you or receipt page). Just make sure the page gets enough traffic and that it directly affects conversion rates.
While testing, remember: 75-80% of tests give no definitive results. That means you’re testing the wrong stuff.
Test Smart! Choose the right elements:
- trust elements
- search boxes
- overlays (test overlays first)
When testing images, add people, and pay attention to where those people are looking (make them look at the call-to-action button). Avoid stock photography, as users begin to recognize the same models (especially across niches).
Including a Trust Element doesn’t always increase conversions. Adding a trust element puts the question of trust into the user’s mind.
When designing forms, required fields only force users to enter content, not necessarily truthful content. The example Justin gave showed a 31% lift in quality leads with optional form fields. If a user didn’t enter their email address, it wasn’t a “qualified lead.” If this field was required, users would simply enter fake data to move on (firstname.lastname@example.org).
Also, forms should not have a “Submit” button. Make the button say what the action really is: “Sign Up” or “Get More Info.”
When testing, segment your audience based on geolocation, device (mobile, iPad), traffic source, new vs. returning visitors, and site interaction.
Don’t fall in love with your idea – don’t leave money on the table
Measure Meticulously – Look for 95+% confidence from your data. How much conversion lift do you want? What’s the sample size going to be? How long will you test for?
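For readers wondering what that 95% confidence bar looks like in practice, a common way to check it is a two-proportion z-test comparing the control and variant conversion rates. The sketch below is my own illustration, not something from the session; the function name and the sample numbers are hypothetical. A z-score of about 1.96 or more corresponds to 95% confidence on a two-tailed test.

```python
import math

def ab_test_significant(conv_a, n_a, conv_b, n_b, z_crit=1.96):
    """Two-proportion z-test: did variant B's conversion rate change
    significantly from variant A's? z_crit=1.96 is the two-tailed
    threshold for ~95% confidence."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    # Standard error of the difference between the two proportions
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    lift = (p_b - p_a) / p_a
    return abs(z) >= z_crit, z, lift

# Hypothetical numbers: 200 conversions from 10,000 visitors (control)
# vs. 260 from 10,000 (variant) -- a 30% lift.
significant, z, lift = ab_test_significant(200, 10_000, 260, 10_000)
```

Note how sample size drives the answer: the same 30% lift on only 1,000 visitors per variant would not clear the 1.96 bar, which is why Justin's questions about sample size and test duration matter before you start.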
Justin included many real-world examples, many of which stumped the crowd (we voted before results were given, choosing which test we thought won). After the results were shown, Justin explained why one version beat another. Some examples are:
- Including “Join 14,752 others” above an email signup caused a decrease in signups. The theory is that 14k isn’t a big enough number.
- Rich HTML vs. text-only email went out to a B2B audience. The text-only version converted higher, as many users were reading on mobile devices.
Split testing is something I always say I need to do, but never seem to implement. While I enjoyed this session, Justin wasn’t able to stick around for Affiliate Karaoke later in the show, which was a disappointment.
I’ve covered split-testing before in a previous Affiliate Summit session, “How to Quadruple Revenue Using Existing Traffic.”