
Your statistics-based marketing campaign is wrong

You may well be aware of the marketing omnishambles clothing retailer Lyst have created with their recent “we’re selling dogs online – HA HA no we’re not, puppy farms are bad” campaign. It’s cynical, manipulative marketing at its worst, but it’s one more example of something which has been bugging me more and more recently: the tendency to take statistics out of context, especially on the web. There are three major issues that a lot of marketing people – particularly in social media marketing – and journalists are guilty of. Let’s have a look at all three of them.

“One billion Google results”

You’ll see lots of people backing up their arguments with the phrase “a search for x shows y million results”. How does this support their argument? It doesn’t. I just searched for the phrase “dogs have five legs” and got nearly 1.3 million results. This proves it! We’ve been counting wrong for years!

Of course not. A Google search without quotes omits context: the order of those words, and their proximity to each other, is ignored. Any page containing any of those words will appear in the results, as will duplicate versions of those pages.

The solution is to put the phrase in quotes; this brings the number of results down to 506. But there’s still an issue with this, and it’s also an issue with poorly phrased terms. For example, search for the term, with quotes, “vaccines cause autism” (they don’t) and you get 433,000 results. Firstly, there is no quality control – any old idiot like me can use that particular phrase in a post, and it doesn’t make it true – and secondly, you will still get hits for pages with the phrase “there is no proof vaccines cause autism” (which of course there isn’t). Google is smart, and context is used to some extent to order results, but the lazy exercise of Googling something and parroting the number of hits you get is foolish. It’s incredibly susceptible to confirmation bias – the tendency to twist whatever result you get to fit your pre-held belief. Sources need to be weighed up with a critical eye. The internet is great in its openness and democracy, but there is no editorial control: any idiot with a keyboard can create a web page saying any outlandish thing they want, and that doesn’t make it true.

“Only 1000 views”

I had a discussion with a very good, very old friend the other day about Boaty McBoatface. My argument: that while the whole thing was stupid and annoying, because internet “japesters” seemed to forget that real scientists would have to apply for real grants for real research based on that boat, it did have the positive effect of raising awareness of the organisation Nerc and their work. His argument: ditto stupid and annoying, and a terrible bit of communications work for the sciences, because only 50,000 of the millions of people who had engaged with the story had watched a video about the work of Nerc.

Now, that’s true to some extent: the engagement rate is tiny. But let’s put it in context: it’s a boring organisation (I can say that – I was nearly a marine biologist) doing research which is not smashing quarks together or anything sexy like that, and no-one has ever heard of them. Look at their YouTube channel: a video put up a year ago has 1,000 hits; the next one, about their new research vessel, 50,000 and growing. While the absolute numbers might not be great – and would be considered terrible by any kind of media-savvy FMCG site – within the industry, and given Nerc’s standing before this, it’s a phenomenal success.
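Both readings of the same numbers can be sketched as back-of-the-envelope arithmetic. In this sketch the 3 million story-reach figure is an assumption standing in for the post’s “millions”; the view counts are the ones quoted above:

```python
# Illustrative numbers only: story_reach is an assumption; the view counts
# are the ones quoted in the post.
story_reach = 3_000_000    # assumed reach of the Boaty McBoatface story
video_views = 50_000       # views of the research-vessel video
baseline_views = 1_000     # a typical earlier video on the channel

engagement_rate = video_views / story_reach   # tiny in absolute terms
uplift = video_views / baseline_views         # huge relative to the channel's norm

print(f"Engagement rate: {engagement_rate:.1%}, uplift: {uplift:.0f}x")
```

The same 50,000 views is both a sub-2% engagement rate (my friend’s reading) and a fifty-fold improvement on the channel’s baseline (mine). Neither number is wrong; they just answer different questions.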

One of the major issues – and we’ll discuss this more in a moment – is that the traditional measure of ROI doesn’t work in awareness raising. ROI is pretty meaningless when you boil it down to pure numbers if there’s no actual tangible return. What’s the monetary value of a kid getting excited by the idea of exploring the Arctic, then growing up to become a scientist? You may scoff, but that’s pretty much what happened to me – an underlying interest in nature and one decent TV series set my mind on a career which made my choices of GCSEs, A-levels and undergrad degree for me. Yes, my path changed, but who knows what seeing that amazing boat would have done if YouTube had been around back then?

I often build sites for people who have very, very modest aims when it comes to their websites. They want maybe 1000 extra visitors a year. They sell services, contracts, big ticket items: they need maybe one customer to make an entire web rebuild worthwhile. I’m the same: a few good leads through my site in a year can keep me nice and busy. This Lyst conversation has given me nearly 200,000 engagements on Twitter, but from a business point of view they’re utterly useless. None of them is going to pay me £5k plus for a website: they’re people who care about dogs. I’m happy with modest but valuable visitor numbers on the site; 50 people who are looking for a new website are infinitely more useful to me than 200,000 who aren’t. So low numbers are not necessarily bad: they need context.
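The value comparison is simple expected-value arithmetic. A rough sketch, with a hypothetical 10% close rate for visitors who are actually in the market (the rates here are assumptions, not data from the post):

```python
# Hypothetical close rates: 10% for people actively looking for a website,
# effectively 0% for people who came for the dogs.
def expected_value(visitors, conversion_rate, value_per_sale):
    """Rough expected revenue from a pool of visitors."""
    return visitors * conversion_rate * value_per_sale

qualified = expected_value(50, 0.10, 5_000)        # 50 people who want a website
unqualified = expected_value(200_000, 0.0, 5_000)  # 200,000 people who care about dogs

print(f"£{qualified:,.0f} vs £{unqualified:,.0f}")
```

Under those assumptions, 50 qualified visitors are worth £25,000 and 200,000 unqualified ones are worth nothing: raw traffic numbers tell you very little without the conversion context.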

“Over 10,000 engagements”

At the other end of the spectrum is the statistical faux pas we have with Lyst:

Their point here is that last week, few people were talking about this issue, and now lots of people are. Perhaps a good point, but let’s put this in context:

Twitter is a left-leaning echo chamber. We follow people who think the same things as us, who agree with us. We retweet what they say and join “mobs” against things we dislike. It’s not the place to change people’s minds. 10,000 mentions mean nothing: what are these 10,000 people going to do that they didn’t do before? Now, let’s combine this fact with the assertion that a change in the number of people talking about an issue is a good thing. Last week, 158 people were talking about puppy farming: well, of course. Who gets up every morning and tweets a list of every single thing they disagree with? This is not a demonstration of 9,842 people being made aware of puppy farms; it’s 9,842 people who had no reason to say anything about them (because they were just going about their lives) suddenly being given a reason to say what they think.
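The underlying problem is one of identifiability: the mention count is consistent with any mix of genuine persuasion and mere prompting, so it tells you nothing about which happened. A tiny sketch:

```python
# Why the raw count is uninformative: any split of the spike into
# "newly persuaded" and "merely prompted" produces the same observed number.
mentions_before = 158
mentions_after = 10_000
spike = mentions_after - mentions_before   # 9,842 extra mentions

# Assumption A: nobody actually changed their mind.
newly_persuaded = 0
merely_prompted = spike - newly_persuaded

# The observed count is identical under assumption A and under any other
# split, so the count alone can't tell them apart.
assert newly_persuaded + merely_prompted == spike
print(f"Spike of {spike:,} mentions; persuasion/prompting split unknowable")
```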

Let’s say 200 people today say that pushing children over is a bad thing. Then I go on Twitter and provocatively say that everyone should go out and push over a child right now. Tomorrow, 500 people say pushing children over is a bad thing. Have I cut down the number of children being pushed over? Honestly, there is no way to tell. Have I maybe made someone think about pushing over a child? Again, no way to tell, but yeah, possibly. How many children being pushed over is an OK number in order for me to make 300 people say out loud that a bad thing is, indeed, bad? Did anyone need to vocalise how bad pushing children over is? Those 300 people almost certainly thought so already – and, again, we have no way to tell.

Once again confirmation bias comes into play: we agree with people who say what we think, and we repeat what they say; people we disagree with we ignore, block or argue with emphatically, with no thought of changing our minds. This means there is no statistical usefulness in these numbers. They prove nothing about anything, not even the useful metric of brand reach – because the campaign creates an artificial situation unrelated to the brand in question.

What’s worse is that it’s really easy to use bad statistical methods to support what you think this campaign has achieved. Take a look at this article on the Telegraph, particularly the survey at the bottom:

This proves nothing: it’s a terrible statistical tool.

  1. It’s self-selecting, because it’s on the website of a newspaper with a very particular readership – only Telegraph website users are going to complete this, and they fit a particular profile. More than this, it’s a sub-set of Telegraph readers who care about this story and who fill in online questionnaires.
  2. The questions are biased: all support the hypothesis that “puppy farming is bad” or that “Lyst are in the wrong”. There is no real range of opinions offered by the questionnaire.
  3. It appears at the end of a rather emotive article, in which opinion is mixed with fact, and words like “outrage” are used.
  4. Of the three questions, the middle one is a positive statement while the others are negative.

Anyone presenting the results of this quick questionnaire as evidence of success of a campaign has no idea at all of statistics and should not be involved in any kind of numbers-based marketing activity. You cannot present quantitative results from this kind of activity without undertaking qualitative research, unless the methods of measurement are truly scientific: unbiased, reproducible, allowing for margins of error. Otherwise it’s lies, damned lies and statistics: numbers being bent to prove what you already think (coming back to confirmation bias again) because they’re so open to interpretation.
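Self-selection alone is enough to wreck a poll like this. A minimal simulation with made-up numbers – a population split 50/50 on the question, where people who agree with the article’s framing are assumed to be ten times more likely to respond:

```python
import random

random.seed(42)  # reproducible sketch

# Made-up numbers: the population is split 50/50, but people who agree with
# the article's framing are far more likely to fill in the poll.
population = [True] * 5_000 + [False] * 5_000   # True = "Lyst are in the wrong"
respond_prob = {True: 0.20, False: 0.02}        # assumed response rates

responses = [opinion for opinion in population
             if random.random() < respond_prob[opinion]]

poll_result = sum(responses) / len(responses)
print(f"True agreement: 50% | Poll says: {poll_result:.0%}")
```

Even with genuine opinion split down the middle, the poll reports overwhelming agreement – purely because of who chose to answer.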

The Lyst problem

The main reason for writing this long post is Lyst’s attempt to “raise awareness” of puppy farming and unscrupulous dog breeding programmes. I have huge issues with this campaign, and as you can see, bigger issues with any claims of its success.

My main issue is this: however well-meaning, any collateral damage from this campaign is real. It’s real dogs being affected. The campaign promoted dogs as fashion accessories. There was a long gap. Then it was revealed that this was a hoax. In the intervening period, how many people who wanted dogs were convinced to take the final step and buy one? How many people emailed the site asking for a dog, were frustrated by the lack of an immediate response, and then went to other (very real) sources? The BBC showed a programme last night called “Choose the Right Puppy” (you can watch it here). In it, a married mother of two who wanted a dog was being advised about buying one. She was travelling to see a breeder, who called to say the last puppy had gone. Despite being guided by the expert, and despite knowing it was wrong, she immediately started looking for another breeder who could get her a puppy as soon as possible. This is fairly obviously going to happen: once the buying decision is made, you need to reach a conclusion. How many of us have considered a big purchase, found it unavailable at our first-choice supplier, and spent over the odds getting it from somewhere else?

The risk with Lyst’s stunt is that exactly this will happen, and we get back to my previous question: how many children is it OK to push over in order to get 300 people saying it’s bad? How many puppies being bought badly are acceptable to get people who disagree with puppy farms talking about it? Of course there’s no way of knowing if any puppies have been bought because of this (or will be) but the risk is real.

There are many other flaws with the campaign: manipulation; not denying the involvement of organisations who knew little about the stunt; even now, a lack of transparency about who is responsible in terms of animal charities; drawing unrelated organisations into the conversation; taking email addresses under false pretences. Perhaps it would have been more acceptable had enquiries resulted in an auto-response raising awareness, but the attempt at satire is clumsy. It’s just such a bad, bad campaign.


All of this is part of a wider issue: a serious lack of understanding of statistics and how to interpret data. Jeremy Hunt is at it with the NHS, misinterpreting select bits of contradictory data from a range of studies in order to push his political agenda. Marketers are more guilty than most, presenting “successes” with no hard facts to support their claims – only misinterpretations of results, used to confirm their clients’ hopes that a campaign will be a success. It’s a symptom of an industry overrun with well-meaning, self-proclaimed experts with no formal training in science or maths. It’s not wilful or malicious, but this ignorance is costing millions in misdirected marketing money, spent on campaigns which – regardless of the true outcome – will always be hailed as a success.



James Asher

May 11, 2016 at 9:22 am

sounds like you don’t have a clue what you’re talking about


Adam Maltpress

May 11, 2016 at 11:53 am

Happy to discuss. What don’t you agree with?

