Pre-testing cannot be equated with click-through

by Nigel Hollis | June 12, 2017

In a recent post Faris Yakob concludes that metrics hold the answer to resolving the tension between short-term and long-term effects. Agreed. But you will not be surprised that my blood pressure went up when he put pre-testing on a par with click-through as a metric that responds well to rational advertising. Utter bollocks.

For readers not well versed in British terminology, “bollocks” is defined by Encarta as “a highly offensive term indicating strong disbelief or disagreement”. Why is it that normally intelligent, thoughtful ad agency commentators are so averse to pre-testing? And to equate pre-testing with click-through is just ludicrous. A pre-test is designed not only to predict what will happen in terms of sales, but also to give feedback on why people respond as they do so that, if needed, great creative can achieve its full potential. Try getting that from click-through.


Everyone agrees you have to understand the intuitive response to advertising, not just the reflective, so why not check whether your ad evokes a positive response and, if not, do something about it? Advances in methodology mean that pre-testing benefits from the systematic measurement of intuitive, non-conscious response to advertising. (Well, the Link test does; I can’t speak for some of the other systems that claim to measure System 1 responses but still use System 2 approaches.)

For the record, pre-testing can measure both the intuitive and the reflective response of an audience to an ad or campaign. If your ad is designed to evoke an intuitive emotional response or implicit impression, rest assured we can measure that response. (Although it would help if you could tell us how you expect the ad to work ahead of time, so we can tailor the test to the ad’s objectives.) Facial coding will identify the real-time intuitive emotional response to what is shown and said. And, if the ad is intended to deliver an implicit message, we can use implicit association testing to see whether that idea is conveyed effectively. In fact, this is now our headline measure of whether the ad makes the desired impression or not.

But here’s the real issue. The fundamental problem with pre-testing is not the research, it is the ads we are asked to test. Most of them are designed to work by engaging a deliberative mindset, not an intuitive one. Unfortunately, two out of three ads still set out to deliver explicit messages, in spite of all the evidence that people respond to impressions, not messages. You can talk all you like about System 1 responses, but if your ad insists on trying to appeal to a System 2 mindset then guess what? Easily appreciated, factual messages are most likely to move the needle.

OK, enough of that. I am not going to be able to burst the ad agency ‘pre-testing is bad’ bubble with a blog post. Fake news about pre-testing has been around for decades, and I guess we just need to keep setting the record straight. Pre-testing did not kill Cadbury’s “Gorilla”; we said go for it. Pre-testing helped unleash Dos Equis’ “Most Interesting Man In The World”. It is not pre-testing that is killing creativity, it is fear of failure, and pre-testing offers a chance to help ensure your advertising flies, not fails.

But what do you think? Why do intelligent people still doubt the value of pre-testing? Please share your thoughts.

4 comments

  1. Dominic, June 21, 2017

    Adrian, how do you think emotional response should be measured?

  2. Adrian Langford, June 20, 2017
    Apologies Nigel if Link has moved on - your competitor Ipsos was certainly using it very recently and making much of their 'interest trace'. Clients loved it of course
  3. Nigel, June 19, 2017
    Well Adrian, I can't help feeling that the fact that you are referring to a technique phased out years ago kind of undermines your argument. No knobs or levers any longer. However, I guess that your comment does prove how durable memories can be. 
  4. Adrian Langford, June 16, 2017
    Pre-testing is wonderful, done sensitively and interpreted carefully. When Faris asserts that pretesting favours advertising built around an IP (information processing) model, he's obviously referring to the two big beasts here - Millward Brown and IPSOS ASI style pretesting. It's no use claiming Link tests are now magically sensitive to emotional response - grasping a knob and moving it to show how 'interested you are in the advertising' is not the same thing as measuring emotional response. If the big quant pretest firms now incorporate 'emotional measures' into their pretesting, it is in a very mechanical and clumsy fashion. I'm sorry, but this is my practical experience. I've attended pretesting debriefs from the big two research agencies for well over thirty years, and have failed to find one that was remotely helpful in creative development despite being open for guidance. How on earth can a methodology that requires high scores on 'key communication points' not privilege rational work that communicates key points explicitly? This is before we get into the whole area of context for administering the tests — asking people to pay close attention to a film squirted down the line to a tiny screen on their laptop, watched at heaven knows what time of day, and then to pick their way through an extensive questionnaire requiring written and considered responses in a fixed order.
