The dogs that don’t bark in the night

Data and statistics will always be open to interpretation, writes Rory Sutherland, and what you don’t see is as important as what you do.

Many years ago, back when fax machines were a thing, a friend of mine was called in to evaluate the effectiveness of a luxury car-maker’s much-acclaimed recent advertising campaign.

He waited in a London meeting room for the latest sales figures to be sent from the UK Head Office. The fax machine rang a couple of times, whistled a bit then started to churn out the latest sales graph.

“I think it’s broken,” his companion said. “It seems to have got stuck.” But there was nothing wrong with the fax machine. The sales figures, covering the period both before and after the launch of the advertising campaign, appeared on the chart as an unbroken, straight and - worst of all - horizontal line.

Most of us, I think, would have seen this as decisive - and damning. Clearly the advertising hadn’t had any effect. All the other “soft” measures had been highly positive, but when it came to the one decisive measure - sales - there was nothing to show for it.

For most people, that would have been the end of the matter. But my friend was a complex-systems expert. He immediately got suspicious.

“You never see a straight line like that in any normal real-world environment,” he commented. There is always some noise - from seasonality, promotions, or just general randomness.

He immediately wanted to find out more.

One possibility was misreporting. Were the staff massaging the sales data to make things seem more consistent, or to exploit some weekly bonus target? In this case, it turned out, that wasn’t it.

So, imagine you were to stand beside a dual carriageway for an hour and, in that whole time, on the northbound carriageway, one car drove past every ten seconds at perfectly spaced intervals. What would you assume? Well, for one thing, you wouldn’t assume that this was a free-flowing road. Left to its own devices, traffic bunches. Normally there might be one minute in which you saw perhaps ten cars and another in which you saw none.

One explanation might be that, just south of your observation point, there was an obstruction in the road, beyond which was a traffic jam. Perhaps a lorry had jackknifed, and the remaining gap was just wide enough for one car to squeeze past every ten seconds.

Whenever a complex environment reveals an unusual consistency, it suggests a constraint: an artificial blockage or limitation of some kind in the system. This was what had happened here.

What emerged was that the car-maker had become so eager to sell car loans that it had mandated that no customer could take a test drive until he or she had first spoken to the dealership’s financial adviser. But there was typically only one such adviser per dealership. If the adviser was busy, you weren’t offered a test drive and couldn’t buy a car. Since the number of financial appointments each week was fixed, it imposed a constant ceiling on the rate at which cars could be sold. This made for elegant graphs but very ineffective advertising. Once the constraint was removed, sales began to surge (messily) upwards. Subsequent advertising worked well.
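To make the mechanism concrete, here is a minimal Python sketch (the weekly figures and the 12-appointment capacity are invented for illustration, not the car-maker’s real numbers): noisy demand, passed through a hard capacity cap, comes out as exactly the suspicious flat line the fax machine drew.

import random

random.seed(0)

weeks = 12
# Underlying demand is noisy: say roughly 25 serious enquiries a week,
# give or take. (These figures are invented for illustration.)
demand = [max(0, round(random.gauss(25, 5))) for _ in range(weeks)]

# The constraint: suppose a dealership's lone adviser can clear only
# 12 appointments a week, and no appointment means no test drive, no sale.
capacity = 12
sales = [min(d, capacity) for d in demand]

print("demand:", demand)  # bounces around from week to week, as real systems do
print("sales: ", sales)   # pinned flat at the capacity ceiling

Raise or remove the cap and the noise in demand flows straight through into sales, which is exactly the messy surge the dealerships saw once the test-drive rule was scrapped.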

There are a few lessons to be learned here. One is simply that statistics are always open to interpretation, and that the most obvious explanation, however convincing, may be wrong.

Another good principle is that, in any sales funnel, it makes sense to optimise things from the point of sale backwards. Just as there is no point in widening a road ahead of a bottleneck, there is no point in doing great advertising when your website conversion rate is terrible.
 
But the final, and most important, lesson is that null findings can be surprisingly revealing. A recent experiment in behavioural science, which used lottery prizes and a barrage of electronic reminders to encourage patients to take their heart medication, revealed no change in behaviour. To me, that failure is nevertheless telling. If you buy a colleague three alarm clocks and he’s still late for work, it doesn’t mean the alarm clocks don’t work: it means he hates his job. A more likely explanation was that the medicine had highly unpleasant side-effects - impotence, depression, anxiety - and that the failure to take it was intentional, not accidental.

What should you conclude, for instance, if four different creative treatments prove equally effective at generating internet engagement? That creative makes no difference? That all are equally effective? Perhaps. But if that finding emerges time after time, it starts to look suspicious. You would expect some degree of variation. A more likely explanation is that clickbots are responsible for a large part of your response. Clickbots, after all, do not distinguish between good and bad creative executions, because they are not human. Unfortunately, not being human, they don’t buy much either.

Statistics and data are, like crime scenes, open to interpretation. There is often an Inspector Lestrade explanation and a Sherlock Holmes explanation. And the deepest significance may lie not in what you see, but in what you don’t: the dogs that don’t bark in the night.


This piece was taken from the Marketing Society’s print title, Market Leader, which is exclusive to members.

 
