by Hubertus Hofkirchner -- Vienna, 14 Aug 2015
Response fraud may be a multi-billion dollar problem for the market research industry. So how can you recognise whether some expensive research results belong in the bin rather than on the decision maker’s desk?
Do you remember the Brothers Grimm story of “The Wolf and the Seven Young Kids”, where the silly duffers actually tell the wolf how they recognised him as a fraud? So he comes back with a white paw and a sweet voice, tricks them at last, and eats them all, skin and bones.
If this article now reveals how to detect response fraud, is there a danger that it will go viral in the fraudster community? An instant (anti-)social networking hit, liked and tweeted by cheaters everywhere, who then, armed with new criminal knowledge, will pocket even more survey incentives?
Maybe. Just for four weeks, though. Then there will be a third instalment in this mini-series, on how to avoid response fraud in the first place.
Here is the crux: the fraudster's profit is quite small, maybe a few dollars. In comparison, the indirect cost of wrong corporate or public policy decisions - the type which warrants the effort of a market research project - can be enormous.
From a researcher's point of view, this total mismatch is outright frightening. Imagine the monetary and reputational damage if your client finds out about fraudster activity before you do.
Further, there will be a real-world result in the end, and in hindsight we are all so much smarter. A botched decision may trigger an investigation, which may well reveal a previously invisible indicator of respondent fraud. Even the art and value of market research itself may be questioned after blatantly wrong results, like the recent poll failure in the UK.
It is quite difficult to recognise hidden response fraud when industry-standard safeguards were already applied according to best practice during fielding.
Your best chance lies in prior knowledge. Take a good look at the profiling questions. Where an approximate answer is known in advance, such as from sociodemographic statistics, but the data are out of whack, there may be something fishy. If product usership profiles do not tally with actual market shares, a closer look may be warranted.
In our previous example of a grocery study with a whopping 40% of academics in the respondent sample (while the real national incidence of “Bachelor Degree or More” is only 14%), you have a near-certainty of foul play. A relatively benign explanation could be simple overclaiming, rampant in certain countries as described in Jon Puleston’s ESOMAR 2012 paper on “Online Survey Data Quality”. If fraudster activity is present, the situation is graver still: low-incidence questions attract more cheaters, and cheaters lie more than legitimate respondents.
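How implausible is such a gap? A quick back-of-the-envelope check, sketched below in Python, treats the known population incidence as the expected proportion and computes a z-score for the observed sample share. The sample size of 500 is an assumption for illustration only:

```python
from math import sqrt

def overclaim_z(sample_share: float, population_share: float, n: int) -> float:
    """z-score for an observed sample proportion against a known
    population incidence: how many standard errors off is the sample?"""
    se = sqrt(population_share * (1 - population_share) / n)
    return (sample_share - population_share) / se

# 40% "Bachelor Degree or More" in a sample of 500, vs. 14% nationally:
z = overclaim_z(0.40, 0.14, 500)  # far beyond any plausible sampling error
```

Anything above a z of roughly 3 already deserves scrutiny; a gap like the one above is many times larger, so sampling error alone cannot explain it.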
Considering this broad mismatch, set against survey results that are often decided by margins of just a few percentage points, it becomes self-evident that using such data for big decisions would be grossly negligent.
An Explicit Check
If you or your panel supplier already apply the customary precautions and trick questions to detect lying, an alternative way to get a grip on the issue comes as a by-product of the arms race in the War on Click Fraud.
Understandably, advertisers do not like to pay for clicks of robots or users who syphon off advertising dollars. Therefore, the big publishers are expending an enormous amount of money and effort to catch cheaters out. Google is one of those. Unlike Grimms' seven young goats, the internet behemoth is quite tight-lipped about the precise workings of their various traps.
Their well-known Google Analytics tool for websites puts some of this power right at market researchers’ fingertips. Analytics is free and so easy that any kid can use it … sorry computer-age kids: that any adult can.
Fraudster Detection In 9 Steps
- Open a Google Analytics account, if you do not have one already. It’s freemium for up to 5 million calls per month. Let’s assume that you work with a smaller sample size than this.
- Get yourself the asynchronous tracking code from the Admin section, customised for a webpage of your own, under your full control. Important: use a different site from your survey platform.
- Add the tracker code to your own webpage. Now you are all set, for as many surveys as you like.
- Add a call to your own webpage at an appropriate point in your survey (the “Jump Page”).
- Send the respondent from there back to the survey at a point before a response is marked as “Complete”.
- Navigate to “Acquisition” in Analytics to look at your webpage’s traffic sources and drill into “Referrers”.
- Click on the line for your survey’s Jump Page to filter only those.
- Check the column “Sessions”. It should tally with the “Completes” reported by your sample provider.
- Last, look at the column “New Users”, which will only count a user’s first landing from your Jump Page, but not a repeated landing.
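The Jump Page itself can be nothing more than a tracked HTML page that bounces the respondent straight back. The Python sketch below builds such a page with the standard library; the snippet placeholder and return URL are illustrative, and the real tracking code must be pasted in from your Analytics Admin section:

```python
from string import Template

# Minimal Jump Page template. $ga_snippet is a placeholder for the real
# Google Analytics tracking code; the meta refresh returns the respondent
# to the survey one second after the tracker has fired.
JUMP_PAGE = Template("""<!doctype html>
<html>
<head>
$ga_snippet
<meta http-equiv="refresh" content="1;url=$return_url">
</head>
<body>One moment, please -- returning you to the survey...</body>
</html>""")

def render_jump_page(ga_snippet: str, return_url: str) -> str:
    """Build the HTML to serve at the Jump Page URL."""
    return JUMP_PAGE.substitute(ga_snippet=ga_snippet, return_url=return_url)
```

Host the result on your own site, not on the survey platform, and have the survey pass its re-entry point as the return URL.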
Gotcha! The difference between the two is indicative of the number of fraudsters who were able to trick both your and your sample provider’s checks and participate multiple times. Keep track of this ratio between studies and deal with the information as appropriate.
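That ratio can be sketched in a few lines of Python; the session and user counts below are made-up numbers for illustration:

```python
def repeat_rate(sessions: int, new_users: int) -> float:
    """Share of Jump Page sessions that were repeat landings, i.e. the
    same browser arriving from the survey more than once."""
    if sessions == 0:
        return 0.0
    return (sessions - new_users) / sessions

# 1,000 Jump Page sessions reported, but only 940 first-time landings:
rate = repeat_rate(1000, 940)  # 0.06, i.e. 6% repeat participations
```

A rate persistently above zero, and above what duplicate-cookie noise would explain, is your early-warning signal to raise with the sample provider.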
Prevention Is Best
In the next and last blog of this mini-series you will learn how, by changing the paradigm which causes response fraud in the first place, you can remove the issue with a new methodology. Like Mother Goat sewing rocks into the Big Bad Wolf’s belly to sink and drown him in the well, once and for all.