[su_highlight background="#fffe99"]Summary[/su_highlight]: David Cameron knows that public approval of RAF air strikes against ISIS in Syria has dropped.
We explain what this teaches Migros, Lidl and Tesco about new product research.
Some weeks ago I came across a report (see image) stating that just 29 percent of people feel confident measuring the ROI (return on investment) of display ads, a figure that drops to just 22 percent for social media marketing.
Accordingly, management is interested in using analyses and analytics to better understand its social media activities. But do managers or politicians understand what we are trying to communicate or convey to them?
If managers read blog entries like this one about how to do surveys, it’s no surprise that they believe it is all easy and cheap to do.
This is the fifth post in a series of entries about big data. Others so far are:
- How are management or politicians supposed to understand the difference between analytics, data and analysis?
- Can we trust polls, or should we learn from the Scottish disaster?
For instance, when we go to a dictionary of statistics and methodology from 1993 (Paul Vogt), neither analytics nor business analytics has an entry, never mind data analysis.
Kuhn: Unless we share a vocabulary, we are not a discipline
However, these days, some would claim data analytics is a science (e.g., Margaret Rouse). Still, if something can be called a science (e.g., physics or neuropsychology), its members share a certain set of beliefs, techniques and values (Gattiker 1990, p. 258).
Do people in data analytics or data analysis share a vocabulary and agree to the meaning of basic terms? Not that I am aware of. Therefore, Thomas Kuhn’s (1970) verdict would be: Not a science (yet).
In web analytics, data analytics or data science as well as social media marketing we agree to disagree. But maybe I can clarify some things.
Sign up for our newsletter; this post is the first in a series of entries on business analysis and analytics.
[su_box title="2 things business, data, financial and web analytics have in common" box_color="#86bac5" title_color="#ffffff"]
1. All analytics is an art that involves the methodical exploration of a set of data, with emphasis on statistical analysis.
2. All analytics includes the examination of qualitative and quantitative data.
Analytics gives you the numbers, but fails to provide you with insights. For that, we must move from analytics to analysis, and we only gain the necessary insights if we do the analysis correctly.
[su_custom_gallery source="media: 2649" limit="7" link="image" target="blank" width="508px" height="552px" Title="Diagram: Analysis versus Analytics versus Data – why the difference matters" alt="Diagram: Analysis versus Analytics versus Data – why the difference matters"]
The graphic above illustrates that proper data is the foundation for doing analytics that permit a thorough analysis. Accordingly, using a sample that is not representative of our potential clients or voters is risky.
Nobody would draw any conclusions about attendance at next season's football matches by asking a sample of baseball aficionados. So, go ahead and ask your social media platform users to vote for this season's favourite flavoured drink syrup. But such a poll won't give you an answer that is representative of your customer base.
Nevertheless, this is exactly what Migros did in 2015 (see Migipedia: few, mostly very young users participated in the poll, and fewer than 10 wrote a comment during January 2015). It then published a one-page ad (among many more, see below) in its weekly newspaper (e.g., November 30, 2015), claiming that the chai flavour was the winner.
Making such a decision based on this type of unrepresentative poll is a risky choice. You may actually choose to increase production of the wrong flavour!
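A toy simulation makes the risk concrete. All the numbers below are invented for illustration, not Migros data: if an online community skews young, and young customers' tastes differ from those of the rest, a community poll can crown a 'winner' that most customers do not actually prefer.

```python
import random

random.seed(42)

# Invented customer base: 20% young, 80% older. Young customers
# prefer chai far more often than older ones (made-up preference rates).
def customer():
    young = random.random() < 0.20
    prefers_chai = random.random() < (0.70 if young else 0.30)
    return young, prefers_chai

customers = [customer() for _ in range(100_000)]

# True share of ALL customers preferring chai (~38% by construction):
true_share = sum(c for _, c in customers) / len(customers)

# An online community poll that only reaches the young users (~70% chai):
community_votes = [c for young, c in customers if young]
poll_share = sum(community_votes) / len(community_votes)

print(f"all customers: {true_share:.0%}, community poll: {poll_share:.0%}")
```

Under these assumed numbers, the community poll roughly doubles the apparent support for chai. That is exactly the kind of gap that leads a company to scale up production of the wrong flavour.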
[su_custom_gallery source="media: 2781" limit="7" link="image" target="blank" width="520px" height="293px" Title="Polling online community members gives you data from a non-representative sample of your customers – is that good enough to launch a new product?" alt="Polling online community members gives you data from a non-representative sample of your customers – is that good enough to launch a new product?"]
Collecting data that is based on a representative sample of your customers is a costly exercise.
So why not use your online ‘community’ to do a ‘quick and dirty’ poll?
Surely a Twitter, Facebook or website / corporate blog poll is economical. It is fast and easy to do, and voilà, you have what you need, right? NOT.
Okay agreed, doing the above will strengthen your hand with a CEO. They might not grasp basic methodology issues of sampling or survey research. Plus, you got data from your online community, which is another reason to invest more money there.
In the Migros example above, having an online poll on your Migipedia platform achieves 3 things:
1. it allows your marketing folks and community managers to show the platform is useful for something;
2. regardless of which flavour wins and gets produced, you can always push it in your company newspaper. This way you reach 3 million readers in Switzerland, a country of 7.8 million inhabitants;
3. even if the new product turns out to be a flop, thanks to other marketing channels, you will still sell 150,000 to 300,000 (or more) 1-litre bottles of chai tea syrup during the Christmas season.
With its many resources and varied marketing channels (e.g., the weekly Migros Magazin), Migros can 'afford' to use shabby research. It is in the enviable position of succeeding in spite of it.
The company might never learn that its analysis actually led the team to choose the second- or even third-best flavour. Nonetheless, marketing clout ensures that the poll can be shown to management as an example of having done the right thing. Of course, we know it was done for the wrong reasons, but since management probably won't find out, who cares, right?
[su_custom_gallery source="media: 2793" limit="7" link="image" target="blank" width="530px" height="308px" Title="Polling: Opinion on RAF air strikes against ISIS in Syria – up and down each week" alt="Polling: Opinion on RAF air strikes against ISIS in Syria – up and down each week"]
One poll is worse than none
As the above image from last week regarding air strikes in Syria shows, poll results can change quite a bit within a week.
For starters, no pollster wanting to stay in business will use a non-representative sample to get opinions. Using such data is unlikely to give you the insights you need for Hillary Clinton or any other candidate to succeed during next year’s US election.
[su_custom_gallery source="media: 2801" limit="7" link="image" target="blank" width="485px" height="445px" Title="Polling: YouGov’s Will Dahlgreen never answered this question – so can you trust these results?" alt="Polling: YouGov’s Will Dahlgreen never answered this question – so can you trust these results?"]
I left the above comment at the end of the blog post (it has not been published by YouGov so far). I asked about things that a good pollster will always publish with the poll results.
For instance, I asked how the data were collected, whether the sample is representative, and what the margin of error was. I could not find any information on any of these. Of course, trust is not improved when a reader comment raising methodological issues about a poll goes unpublished.
“YouGov draws a sub-sample of the panel that is representative of British adults in terms of age, gender, social class and type of newspaper (upmarket, mid-market, red-top, no newspaper), and invites this sub-sample to complete a survey.”
How exactly YouGov does this we do not know, since the methodology outlined on its website is not very detailed.
But David Cameron knows that while 5 million people have joined the ranks of those opposed to air strikes in Syria in the past seven days, that could change next week. Polls are more interesting when they show a trend, so Mr Cameron can still hope that the opposition shrinks again.
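A rough margin-of-error check shows why single-week movements should be read with care. For a simple random sample (an assumption; quota samples like YouGov's behave somewhat differently), the 95% margin of error on a proportion is about 1.96 times the square root of p(1-p)/n:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion p with n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# Example: a poll of 1,700 respondents (an assumed, typical sample size)
# finding 48% support carries a margin of roughly +/- 2.4 points:
print(f"+/- {margin_of_error(0.48, 1700) * 100:.1f} percentage points")
```

So two polls a week apart that differ by three or four points may reflect real movement, or mostly sampling noise; only a trend across several polls separates the two.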
[su_box title="5 key pointers for explaining the analyst’s work to your management: The case of survey research or polling" box_color="#86bac5" title_color="#ffffff"]
Collecting quality data is followed by analytics, which subsequently require analysis to draw the proper insights. Analysis requires words in addition to looking at the numbers.
To tackle this challenge successfully, we need to do some preparation, as outlined below.
1. Do you have a strategy or a plan?
What is it you want to collect data for and why? This must be explained in a few sentences.
How will these data help you win the election, get the contract or sell more product?
2. How will data help you execute the plan?
You must know what data you need or the rationale for wanting them (see point 1).
What three steps will you take in the next quarter or six months to execute your strategy?
3. Are the numbers complete?
Most monitoring services can tell you everything about Facebook or Twitter.
But what about smaller websites from climate change activist groups, ISIS sympathizers or peace activists’ blogs?
Make sure you get the data you need. Is your sample representative of those whose opinion you must know?
4. Do you need social media monitoring?
Knowing what people say about your brand or company is a good thing. The Volkswagen emission scandal (remember #dieselgate) teaches us that in a crisis, simply monitoring the flood of tweets and status updates on Facebook or LinkedIn is of little use.
Like Volkswagen, you can decide to ignore the social media noise. What matters more is to change your behaviour and communicate openly and directly (click for German-language radio report).
Unless you use social media monitoring to take action after the data are in, why collect them?
5. Do you have data from your customers?
If you have fewer than 1,000 employees, don’t make a big fuss about social media monitoring.
Focus on things that matter, such as what your clients report regarding warranty service, and the quality of phone support or user manuals. A tweet matters little.
Feedback can be collected in many ways, including customer surveys, discussions with clients or comments on your corporate blog.
Analysing these data provides insights that help improve product, service and so forth.
What it means
Focus on collecting data that help you serve your customers better. Getting a daily digest about the most important key words regarding your brand (e.g., we use DrKPI, #DrKPI, DrKPI BlogRank, #metrics #socbiz) is probably all you need. Instant data may not be needed unless you are a FT Global 500 company.
Restrict yourself to collecting only those data you absolutely and definitely must have.
Make sure that they meet some minimum quality standards. Only this will enable you to trust the analytics and analysis resulting from that work.
Actionable metrics are what matters
Unreliable or invalid data from clients, social media monitoring and opinion polls are a waste of resources.
Please keep in mind, just collecting data without taking action is a navel-gazing exercise.
Always ensure that analytics leads to analysis that goes beyond navel-gazing metrics. Answer these questions truthfully:
A. What will be done with the findings: Unless you take action based on your data, why measure and collect information at all?
B. What kind of data was collected: Make sure you understand how data were collected. Can this polling data be trusted to be representative of the population (e.g., consumers in my country)?
How was something like influence (e.g., Klout) measured (what kind of proxy measure was used)?
If it is not transparent to you, move on and do not waste your time with such a measure or index.
Keep points A and B in mind before you collect data and / or use somebody else’s findings.
‘Total X’ combines xyz Labs’ proprietary Rambo social media measurement tool, and WalkBack®, the leading measurement source of WOM marketing from the Sambo Group, a Laughing Stock company.
Okay, what does the above mean? Who would want to trust this gobbledygook? If marketers or pollsters cannot explain things clearly and precisely, they tend to cover them up with jargon that tells you nothing.
Regardless, 2016 will be the year when Lidl, Migros and Tesco do more of these utterly useless polls to find another ‘winner’ for a new flavour of drink syrup, mustard or soft drink.
Even though social media, community and marketing managers will claim a victory this year, with so much additional marketing around, who is surprised? Put differently, regardless of which syrup Migros had produced, I dare claim it would have flown off the shelves anyway.
Combine all the ads and marketing push, and if it tastes okay, success is in the bag. Unfortunately, those that hate research will attribute part of this success to a useless online poll.
Next time you read something like the above, claiming to rank something, check the methodology. Cannot find anything? Just move on because it is probably hogwash.
Vogt, Paul W. (1993). Dictionary of statistics and methodology. Newbury Park, CA: Sage Publications. For information see https://uk.sagepub.com/en-gb/eur/dictionary-of-statistics-methodology/book233364 (5th edition 2016).
2 great reading lists for additional resources about research, polls, survey data and much more:
Join the conversation
- Do you have an example of a great poll / study?
- What is your favourite marketing measure?
- What research methodology would you recommend?
- Do you have other ideas or concerns about marketing research? Please state them here.
Of course, I will answer you in the comments. Guaranteed.