Landing Page Optimization Test Results: 25% Conversion Lift

Mar 11th, 2013

Simple A/B Split Test Result

At inlineVision, we like to prove to our clients (existing and future) that our experience and methods truly benefit their bottom line, and that what we do – and how we do it – makes a real and measurable difference.

Landing pages come in all shapes and sizes, but they all should be developed with one thing in common: focus on the desired goal – the conversion.

A/B Split Test on One Landing Page

While you might think that a total of 43 conversions is not a lot, you have to take into account that this is only one test on one element, which ran for one month on one landing page. There are many more pages and calls-to-action on the site, and we keep track of the results while we continue to test other variables.
What is also not shown here are any offline conversions (phone calls) this particular page generated during the testing period.

Overall Goal Conversion Rate

Visitors to our client’s website view an average of 3.2 pages per visit, so additional conversions might have occurred on other pages, or even by clicking a different element on the same page, during the same visit.

The site’s overall conversion rate we were able to track during the same time period was 5.07% (as shown in the screenshot above).
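An overall goal conversion rate like the one above is simply tracked conversions divided by tracked visits. A minimal sketch – the post reports only the 5.07% result, so the visit and conversion counts below are hypothetical:

```python
def conversion_rate(conversions: int, visits: int) -> float:
    """Return the conversion rate as a percentage of visits."""
    return 100.0 * conversions / visits

# Hypothetical figures chosen only to illustrate the arithmetic:
# 507 goal completions across 10,000 tracked visits.
print(round(conversion_rate(507, 10_000), 2))  # → 5.07
```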

Let the numbers do the talking

The real value of this experiment becomes apparent when we attach a monetary value to each conversion:
For this particular test and this particular service, we assigned a value of $600 to a successful conversion. We based this number on the average invoiced amount for the service and on the fact that not 100% of the people who request an estimate from our client end up committing to it.

Which means: this 25% increase in conversion rate generated $3,000 more revenue than the original version of this landing page in only one month. That’s $36,000 a year – all because we decided to test very simple variants on the page.
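The arithmetic is easy to check: at $600 per conversion, $3,000 of extra monthly revenue implies five additional conversions from the winning variant. A minimal sketch of that calculation (the five extra conversions are inferred from the stated figures, not reported directly in the post):

```python
def monthly_revenue_lift(extra_conversions: int, value_per_conversion: float) -> float:
    """Extra revenue attributable to the winning variant in one month."""
    return extra_conversions * value_per_conversion

# $3,000 / $600 per conversion = 5 extra conversions in the test month.
monthly = monthly_revenue_lift(5, 600)  # $3,000 per month
annual = 12 * monthly                   # $36,000 per year
print(monthly, annual)
```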

What we tested

We tested two fairly simple things at once: the color of the call-to-action button and the wording on it. (We could have isolated each change in its own variant, but chose to change both variables at the same time.)
* The statistical confidence of this experiment is only 77%, but it was part of a series of tests, all of which pointed to a very similar conclusion.
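A confidence figure like this typically comes from comparing the two variants’ conversion proportions. One common approach is a two-proportion z-test; the sketch below uses hypothetical per-variant counts, since the post does not publish the split of the 43 conversions between control and variation:

```python
import math

def ab_test_confidence(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """One-sided confidence (in %) that variant B outperforms variant A,
    using a two-proportion z-test with a pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF expressed via the error function.
    return 100.0 * 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Hypothetical split for illustration: 19 of 400 visitors converted on the
# control vs. 24 of 400 on the orange-button variant – roughly 78% confidence.
print(round(ab_test_confidence(19, 400, 24, 400), 1))
```

With small absolute conversion counts like these, a month of data often isn’t enough to reach the conventional 95% threshold, which is why running a series of tests that point in the same direction matters.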

Control (Original):

Green “Get” button – the losing variant

Variation 1 (New Version):

Orange “Request” button – the winning variant

The orange button was the winning variant in this test; everything else on the page was exactly the same.

Track, Measure, Analyze, Optimize

No website is ever “done” – or “perfect” – that’s what we believe. There’s always room for improvement — or, to be more on topic: optimization.

“But my website is great, (because) I have xxx number of hits on my website every month” says nothing about the site’s performance and – inherently – its conversion rate.

Only when you are willing to accept that your website can do better, and have proper tracking methods in place, can you start optimizing. Just because the website owner “likes” how the page looks does not mean users are equally fond of it or find it easy to use.

About the author:

Nina Khoury is a computer scientist, software engineer, data and information junkie, and online marketer. She taught at various universities for more than six years and has worked on projects for Fortune 500 companies including Cisco, Intel, and HP.
