
The Problem with Privacy

Kristin Luck

Big data is gaining momentum. In 2013, the market reached $18.6 billion and is projected to reach $50 billion by 2017, a 38% compound annual growth rate over the six-year period from 2011, according to Wikibon’s Big Data Vendor Revenue and Market Forecast 2013-2017. Big data related services, which are most representative of those of us in the market research space, make up 40% of the total.

Major drivers of growth, according to the report, are:

  • Growing confidence in big data products and services among enterprise buyers
  • Growing maturity of those products and services
  • Better privacy, security and governance capabilities
  • Growing number of partnerships between big data and non-big data vendors

Wikibon also details some big barriers to growth. Most relevant to researchers are challenges presented by a lack of best practices for integrating big data analytics into existing business practices, plus concerns surrounding security and data privacy.

In the past year, I’ve spoken across the globe on data privacy, from Canada to Australia and, most recently, the United Kingdom. But the US public’s ire at the NSA, fed by continuing data privacy flubs (most recently the data breaches at Target and the data mining complaints lodged against Google and Facebook), has escalated conversations about privacy. From “not-so-private” social media rants to very real concerns about overall personal security, the U.S. consumer is on high alert. Those worried about privacy no longer fit into any neat demographics. A few years ago, Knowledge Networks released a study reporting that younger demographics are less concerned about privacy, and Facebook’s Mark Zuckerberg once declared that young people “don’t care” about privacy. I think that’s changing. Consumers (and, by extension, our research respondents) are getting smarter about, and more cautious with, their data, regardless of age. Data security and trust have steadily become the leading talking points of any big data discussion. How do consumers know their data is safe? Are marketers doing enough to safeguard sensitive information?

Beyond the Privacy Policy: Humanising Our Approach
As an industry, we haven’t historically done much to demystify the research process and give consumers greater clarity about what happens with their data. There is much we can do to simplify consumer consent to share data. We also need to take a hard look at whether privacy policies are still a valid way of obtaining compliance.

We should all think about how data privacy needs to be addressed moving forward. Annie Pettit of Peanut Labs recently wrote a great article on humanising research, an important component of the “new normal” of privacy and security in market research. In her article, Annie writes:

“Why can’t we simply write better surveys? And by better surveys, I don’t mean even more grammatically-correct and precise. I mean why can’t we use language that is simpler and more reflective of who we are today, language that reflects the current trends and styles we have of communicating with our fellow human beings?”

By refining and simplifying how we communicate with respondents, we have more opportunities to engage them and assuage any privacy concerns in a meaningful way, long before the waters get muddied with legalese. Annie and I have had more than a few conversations about the hypocrisy of designing a survey that you wouldn’t even take yourself. I have a similar issue with privacy policies. When was the last time you read your own firm’s privacy policy? Or any privacy policy, for that matter? We just click the box and move on, or we decide it isn’t worth it and drop out altogether.

As an industry, even with the influx of ancillary data sources, we remain largely reliant on respondents for our research. We have a long and rather sordid history of taking advantage of our most precious resource: by boring respondents with long, dull surveys and failing to incentivise them properly, we’ve used and abused them to the point that completion rates and engagement are ongoing challenges. Unfortunately (for us), respondents are wising up and realising the value of their data. It is paramount that we collectively present clear, understandable data privacy and security guidelines during the research process – and adhere to them.

Relevance and Value: Power Shifts to the People
Monetisation is clearly a major motivating factor for marketers in their ongoing collection of consumer data, and it’s no different for researchers. There’s a shift, though: consumers (and soon our respondents) are, for the first time, realising the value of their data and looking at monetising it. They’re deciding what to share (or not to share) and what value that data has to marketers.

We can learn a lot about privacy, monetisation and respondent satisfaction from looking at the leaders in big data outside our industry. Companies like Evolv, Ayasdi and The Weather Company are using big data to create meaningful change – from staunching employee turnover, to revealing genetic traits of cancer survivors, to letting shampoo brands target users in humid climates with a new anti-frizz hair product. These are real-world examples of how big data is informing the development of new products and services like never before, offering brands unique opportunities to increase consumer engagement. Greater personalisation (through data mining) makes it possible to target marketing campaigns with greater relevance to consumers – surely, as researchers, we can offer respondents the same?

Consumers feel the most generous with their data when:

  1.  What they share creates relevant products or experiences.
  2.  Privacy expectations are clearly outlined and respected.
  3.  They receive something in return (either directly or intrinsically).

We should take this to heart as researchers, because there is a direct correlation here with what respondents are looking for: relevance (will this research create more meaningful products or services for me personally?), privacy (do I understand what data I’m providing, and when?) and proper incentives (what am I getting back?).

Demystify Me: Marketing Profiles
There is much we can do to take privacy challenges head-on by demystifying research and the data we collect. Marketing firms like Acxiom are doing their part to take the mystery out of big data with their site (in beta in the United States only), AboutTheData.com. If you’ve ever wondered what your online “marketing profile” looks like and what determines the ads or offers you see online, this is the site to visit. By accessing your online “marketing profile” (the same profile that marketing firms use to target their ad campaigns), you can review and edit everything from basic demographic data to household purchases and interests, to ensure you see the offers and advertising that are most relevant to you. Fascinating! I will happily spend 20 minutes editing a profile that will limit the ads and offers I receive for cruises, PhD programs for family therapy, or maritime bath towel sets, and increase the offers I get for half-price Jimmy Choos or a surfing vacation in Bali.

It’s easy to see that there’s no simple answer for researchers looking to address privacy concerns. It is an issue that extends far beyond our industry, and is therefore gaining awareness and scrutiny across the board. As we shift from online to mobile platforms, these privacy and security issues will shift too – and we need to be ready. Proactively addressing issues like the ones outlined above is necessary if the market research industry is to avoid the public discord that organisations like the NSA have already attracted.

Kristin Luck is President at Decipher.

