Are Display Networks Full of Conversions, or Just Full of It?


There are hundreds of ad networks that promise to run your ads with the best chance of conversion, the hottest technology, and the most efficient algorithm.

So, digital marketing managers, with ad network reps beating the drum in one ear and upper management demanding measurable results in the other, are plagued with one key question: does display advertising really work?

By that, I mean, is there tangible, incremental conversion lift coming from the display ads you run with Rocket Fuel, Quantcast, Chango, and the like?

This post attempts to unpack how success is measured, what red herrings often crop up, and how you can get to the bottom of display advertising’s contribution to your company’s top line.

Definitions of the Jargon in Display Advertising

Click-through and view-through conversions, multi-touch attribution… What does it all mean?

Let’s quickly define our terms:

  1. Click-through conversion – Your prospect sees a display ad or banner ad, clicks on it, and converts.
  2. View-through conversion – Your prospect visits a page where your display ad is shown, but does not click on it. Then, later, the prospect converts.
  3. Viewability – Because view-throughs are measured based on whether the ad loaded in your prospect’s browser, we don’t really know whether the prospect actually saw the ad, consciously, subliminally, or at all. It might have loaded at the bottom of the page while your prospect never scrolled that far. To address this problem, some tools measure viewability in order to answer the question: was your ad actually viewed, or did it just load somewhere on the page unseen?
  4. Last-touch attribution – This is the traditional conversion path that is measured against simple click-through conversions. Whichever ad or channel had the last touch (or the last click) that led to your prospect converting is the one that gets the credit.
  5. Multi-touch attribution – This is the new(er) hypothesis about complex advertising programs that touch a prospect at many different points on the way to conversion. Let’s say your prospect saw a banner ad but didn’t click, then Googled your brand name, then searched for coupons, and finally converted on a newsletter click. This perspective considers all of these touches and gives some credit to each of them (a simple sketch contrasting last-touch and multi-touch credit follows this list).
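
To make the contrast between those last two models concrete, here is a minimal Python sketch using the hypothetical touch path from item 5. It assigns credit for a single conversion under last-touch and under a simple linear multi-touch rule; the channel names are illustrative, not from any real dataset.

```python
# Hypothetical touchpoint path for one converting prospect, in order.
touches = ["display_banner", "branded_search", "coupon_search", "newsletter"]

def last_touch_credit(path):
    """Give 100% of the conversion credit to the final touch."""
    return {path[-1]: 1.0}

def linear_multi_touch_credit(path):
    """Split the conversion credit evenly across every touch."""
    share = 1.0 / len(path)
    credit = {}
    for channel in path:
        credit[channel] = credit.get(channel, 0.0) + share
    return credit

print(last_touch_credit(touches))
# {'newsletter': 1.0}
print(linear_multi_touch_credit(touches))
# {'display_banner': 0.25, 'branded_search': 0.25, 'coupon_search': 0.25, 'newsletter': 0.25}
```

Real attribution platforms use more elaborate weighting than an even split, but the underlying trade-off is the same: last-touch concentrates all the credit on one channel, while multi-touch spreads it across the journey.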

So, what’s the right way to understand whether display ads are working? Should we spend thousands of dollars to implement multi-touch attribution technology, or apply Draconian rules that reward channels only for closing a deal on the final click?

It’s a confusing problem. But, you have options for measuring what’s going on in your conversion funnel.

Here are some things you can do to truly measure the impact of display:

A/B Test Your Ads against a Placebo

There is something called a Public Service Announcement (PSA) test. In this experiment, you serve your display ads to 50% of your audience – business as usual – but the other half of your audience is shown a placebo. They see a PSA. One of my clients, for example, used a “Smokey the Bear” ad about preventing forest fires.

This type of test can take some time to gather meaningful data. It can take a month or more and be quite costly. But, the results will demonstrate whether the potential customers who saw your real display ads have a higher probability of converting than those who saw Smokey the Bear.
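
In practice an ad partner like Quantcast handles the audience split for you, but the mechanics are easy to picture. Here is a minimal Python sketch, assuming cookie-level bucketing, that deterministically assigns each prospect to the real-ad arm or the PSA arm with a roughly 50/50 split; the salt and cookie IDs are invented.

```python
import hashlib

def assign_group(cookie_id: str, salt: str = "psa-test-1") -> str:
    """Deterministically bucket a cookie into the test (real ads)
    or control (PSA) group with a roughly 50/50 split."""
    digest = hashlib.sha256((salt + cookie_id).encode()).hexdigest()
    return "real_ads" if int(digest, 16) % 2 == 0 else "psa"

print(assign_group("cookie-abc-123"))
print(assign_group("cookie-def-456"))
```

Deterministic bucketing matters because the same prospect should see the same arm for the entire test; otherwise the two groups bleed into each other and the comparison is contaminated.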

Consider this example of a real PSA A/B test we ran on behalf of a client in the home decor ecommerce sector. Here’s what we uncovered by running the test with Quantcast, an advertising partner:

The “Brand” group of ads, meaning the client’s regular ads, yielded 980 conversions (these are view-throughs) while Smokey the Bear ads converted 871 of the prospects.

When you factor in the number of cookies – in other words, how many times each ad was shown – the regular ads brought in 30% more conversions than Smokey the Bear. For you marketing statisticians, the confidence level here was 95%. You can find more information on statistical significance and what it means for marketers here.
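
If you want to sanity-check that kind of result yourself, a two-proportion z-test is one way to do it. The post doesn’t disclose the cookie counts behind each group, so the denominators in this Python sketch are assumptions chosen only so the conversion rates roughly reproduce the ~30% lift described above; substitute your own numbers.

```python
from math import sqrt, erf

# Conversions come from the test above (980 brand, 871 PSA). The cookie
# counts per group are NOT reported in the post; these denominators are
# assumptions picked only to roughly echo the reported ~30% lift.
brand_conv, brand_cookies = 980, 500_000
psa_conv, psa_cookies = 871, 578_000

p_brand = brand_conv / brand_cookies
p_psa = psa_conv / psa_cookies

# Two-proportion z-test with a pooled standard error, two-sided p-value.
p_pool = (brand_conv + psa_conv) / (brand_cookies + psa_cookies)
se = sqrt(p_pool * (1 - p_pool) * (1 / brand_cookies + 1 / psa_cookies))
z = (p_brand - p_psa) / se
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

print(f"lift: {p_brand / p_psa - 1:.0%}, z = {z:.2f}, p = {p_value:.2g}")
# With these assumed counts: lift ~30%, z ~5.7, comfortably past 95% confidence.
```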

Is this a good result? On one hand, it’s nice to see 30% incremental conversions. On the other hand, what should we make of the fact that almost 900 people saw Smokey the Bear and converted anyway? Is the handful of incremental conversions enough to justify the display ad spending here?

For this particular client, it wasn’t.

We didn’t test only IAB-standard display ads in this experiment. IAB-standard is the term Quantcast uses to describe banners and display ads as we usually think of them, e.g. 300×250 and 728×90. The alternatives would be FBX (the Facebook Exchange), mobile, and so on.

We also tested ads on the Facebook Exchange. There, we found no statistically significant conversion difference between the client’s real ads and PSAs.

This type of testing is one way to assess value. The key advantage is its quant-driven honesty. It’s hard to argue with the numbers: if Smokey the Bear drives almost as many conversions as a real ad, that’s pretty black and white. If the ads outperform Smokey by 250%, you know you’re getting real value out of your display program.

Another way to think about display performance is to tease apart the different channels and their respective influence — a process called attribution. There are multi-touch attribution tools out there to help automate this method, and that’s what we’ll look at next.

Use Advanced Attribution Technology

If you’re touching prospects on many levels and via multiple channels, some experts argue you’ll benefit from an advanced attribution platform. This technology takes into account all the touches that contribute to a single conversion and algorithmically estimates how each channel fits into the larger whole.

Convertro is one such tool. It’s useful for estimating how many real conversions a given channel contributes based on all the touchpoints (possibly hundreds) in the sales cycle.

Also, it can show you data we found particularly interesting: how often a given ad network played the role of Introducer (meaning the ad network served the very first ad a prospect saw in your sales funnel), Closer (the ad network showed the last ad viewed or clicked by a prospect before converting), or Influencer (the ad network participated somewhere between introduction and conversion).

This breakdown can be useful in thinking about how important each ad partner is when it comes to guiding prospects to eventually buy.

In this client’s data, the ad networks we evaluated acted as influencers about 98% of the time. That means they weren’t likely to be sourcing brand new prospects or “closing the deal” with soon-to-be customers, but they probably contributed awareness along the way.
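
Convertro’s exact algorithm isn’t public, but the Introducer/Closer/Influencer breakdown itself is straightforward to tally from raw touch data. Here is a hypothetical Python sketch; the conversion paths and network names are invented.

```python
from collections import Counter, defaultdict

# Hypothetical conversion paths: each is the ordered list of ad networks
# (or channels) that touched a prospect before that prospect converted.
paths = [
    ["network_a", "network_b", "network_a", "email"],
    ["search", "network_a", "network_b"],
    ["network_b", "search", "email"],
]

roles = defaultdict(Counter)
for path in paths:
    for position, network in enumerate(path):
        if position == 0:
            roles[network]["introducer"] += 1      # first touch in the funnel
        elif position == len(path) - 1:
            roles[network]["closer"] += 1          # last touch before conversion
        else:
            roles[network]["influencer"] += 1      # somewhere in between

for network, counts in roles.items():
    print(network, dict(counts))
```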

Continue to Measure Last-click Attribution

The conservative play when it comes to evaluating many different channels is to give maximum weight to last-click conversions and revenue.

That’s because only one lucky ad or channel can be responsible for the final click or view before a conversion. If you credit each channel with view-throughs, you might be praising multiple channels for the same conversion. But, using a tool like Google Analytics to report on conversions and revenue from a given traffic source will, in the traditional view, tell you which channel drove the last click. Revenue resulting from each channel will be counted exactly once.
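
By contrast with the multi-touch sketches above, a last-click rollup credits each conversion’s revenue to exactly one channel: the final touch. Here is a minimal Python sketch with invented order data.

```python
from collections import defaultdict

# Hypothetical conversions: the ordered touch path plus the order value.
conversions = [
    {"path": ["display", "organic_search", "email"], "revenue": 120.0},
    {"path": ["display", "paid_search"], "revenue": 80.0},
    {"path": ["email"], "revenue": 45.0},
]

last_click_revenue = defaultdict(float)
for conv in conversions:
    # Each order's revenue is counted exactly once, against the last touch.
    last_click_revenue[conv["path"][-1]] += conv["revenue"]

print(dict(last_click_revenue))
# {'email': 165.0, 'paid_search': 80.0}
```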

Some experts argue that last-click attribution has gone out of style and that a more holistic view is in order to determine how prospects interact with your business on their journey through the conversion funnel. It’s true that tracking ROI has become more complex with the advent of drip campaigns, inbound marketing, and many kinds of interactive ads. Still, a more rigid reading of the analytics can offer checks and balances against overspending.

Conclusion

The best thing you can do for your business is to look at attribution in more than one way. If you can afford a PSA A/B test, it’s quite informative (and, in our example case, quite sobering). Otherwise, assessing ad network performance via multiple free tools will give you more insight than simply trusting the dashboard of your ad partner(s). With several perspectives at your disposal, it will be easier to make a call on which channels should be expanded and which should be cut.

About the Author: Igor Belogolovsky is Co-founder of Clever Zebo, a team of conversion rate optimization experts based in the San Francisco Bay Area.

