Commentary

Privacy (and piracy?) in the era of AI and data – Guidance from Canada

January 28th was International Data Privacy Day. So it was fitting that ESOMAR, in collaboration with the AI & Data Analytics Thought Leadership Council of the Canadian Research Insights Council (CRIC), organized an expert panel to explore “Privacy in The Era of AI and Data Based Insights”. I suggest this topic embraces the issue of data piracy, which has become ubiquitous for most consumers in today’s tech-dominated environment. 

The impetus for this fascinating event was the implications of Canada’s (or any country’s) proposed Digital Charter Implementation Act (DCIA), Bill C-11, tabled on November 17, 2020. It intends to establish a new privacy law for the private sector and new powers for the Office of the Privacy Commissioner. 

The session also addressed the role of standards in data governance, the transparency of algorithms and decision systems, and the ethical implications of data collection and insights generation. It was moderated by Briana Brownell, CEO of Pure Strategy, who is a member of CRIC and its AI Council. 

Carole Piovesan, partner and co-founder of INQ Data Law, Toronto, put AI in the context of data innovation and privacy as reflected by six cornerstones, each of which has different dimensions depending on whether it is viewed from the perspective of the consumer, the brand, the researcher/AI model, or the data integrator/analyst. They are:

  1. Automated Decision Systems
    • Definition: “Any technology that assists or replaces the judgement of a human decision-maker using techniques such as rules-based systems, regression analysis, predictive analytics, machine learning, deep learning, and neural nets.”
  2. Beyond Consent
    • Under the proposed law, de-identification is permitted as a use of personal information without consent (see the brief sketch after this list).
  3. Right to Data Mobility
  4. Right to Data Deletion
    • New to the Canadian privacy landscape and aligns more closely with the GDPR.
  5. Code of Practice and Certification Schemes
    • The CPPA enables organizations to develop their own codes of practice or certification schemes for compliance. Will this come under heavy scrutiny? 
  6. Stricter Enforcement
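
Since “de-identification” can sound abstract, here is a minimal, purely illustrative sketch of what it can look like in practice. The column names, the choice of direct identifiers, and the salted-hash approach below are assumptions made for the example only; nothing here is prescribed by the CPPA, and real de-identification must also weigh indirect identifiers and re-identification risk.

    # Illustrative only: drop direct identifiers and replace a stable key with
    # a salted hash, so records can still be linked for analysis without
    # exposing who they belong to.
    import hashlib
    import pandas as pd

    def deidentify(df: pd.DataFrame) -> pd.DataFrame:
        out = df.drop(columns=["name", "email", "phone"], errors="ignore")
        salt = "replace-with-a-secret-salt"  # kept separately from the data
        out["customer_id"] = out["customer_id"].astype(str).map(
            lambda v: hashlib.sha256((salt + v).encode()).hexdigest()
        )
        return out

    raw = pd.DataFrame({
        "customer_id": [101, 102],
        "name": ["Ana", "Ben"],
        "email": ["ana@example.com", "ben@example.com"],
        "age": [34, 41],
        "spend": [250.0, 410.5],
    })
    print(deidentify(raw))  # no names or emails; hashed IDs; analytic fields intact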

Across all these areas she emphasized that the business implications are complex and substantial from each segment’s perspective. 

She also underlined that Part 1 of the existing Canadian Personal Information Protection and Electronic Documents Act (PIPEDA) would be replaced by the new Consumer Privacy Protection Act (CPPA), which is being introduced under Part 1 of the DCIA. Part 2 of the DCIA will create the Personal Information and Data Protection Tribunal, an administrative tribunal empowered to levy significant fines for non-compliance with the CPPA!

As the Sector Specialist for AI/Big Data as well as Health/Biosciences at the Standards Council of Canada (SCC), Marta Janczarski stressed the importance of companies being involved in standardization and conformity assessment in order to compete in global markets. Billions of products are certified annually to national, regional, and international standards. 

Marta referenced “the global nature of data, and its complex use across sectors, which provides a key opportunity for standards as well as for conformity assessment”. As part of the data concerns, “Safe, effective and accountable AI is fundamental to unlocking the value of that data.” In Canada this includes being CyberSecure Canada certified; complying with regulations via certifications that recognize the European General Data Protection Regulation (GDPR), such as ISO/IEC 27701; and fulfilling data procurement requirements. 

It would be fascinating to understand the similarities and differences between these Canadian initiatives and those in California under its California Consumer Privacy Act (CCPA), in view of California’s leadership in this arena in the US. Clearly the SCC is heavily influenced by the GDPR, which became enforceable back in May 2018, in its efforts to establish data governance standardization.  

Dr. Cody Dodd, co-founder and CEO of PigeonLine and a researcher in information systems at the London School of Economics’ Department of Management, is a great supporter of the power of digital data collection, integration, and AI to meet CMOs’ needs for faster, more reliable consumer insights. He identified the value of AI to “identify consumer segmentation opportunities, predict behaviours and uncover hidden customer issues.” 

However, he recognized the ethical and privacy implications of AI and machine learning and acknowledged customers’ rights to be forgotten, to see their data, and to data sovereignty. These extend to key transparency concerns: consumers’ rights not to be subject to biased decisions; to remain the main decision-maker, especially in areas of high discretion; to be given the explanation behind algorithmic decisions; to be given the opportunity to correct or ignore suggestions; and to be able to access an audit of decisions. Per this article’s headline, these matters are essentially about possible data piracy! As he explicitly cautioned, “Making unlimited data collection imperative is risky.” 

I believe he echoed Tim Cook, CEO of Apple, who said on January 28, 2021: “If a business is built on misleading users, on data exploitation, on choices that are no choices at all, then it does not deserve our praise. It deserves reform.” Are Facebook, Google, Amazon, and Microsoft listening? 

As with the previous speakers, data validity was not given the significance it surely deserves, something we in the research arena always hold front and center. How many of us, and how many times, have been clearly and completely mischaracterized by the digital communications we receive?  

The evolving challenges in data privacy protection and validity, notably for market researchers, were addressed by Reg Baker, North American Regional Ambassador for ESOMAR. He provided a disturbing but generally understood perspective: “Massive amounts of personal data on each of us is being collected and shared with varying levels of knowledge and consent — too often surreptitiously!”  

In considering “best practices”, he noted that two vastly different types of data, each governed by different ethical standards, are now the foundation of research – primary and secondary data. He proposes that the industry’s increased reliance on secondary data raises three key concerns: 

  • Ethical sourcing – the right to use. 
  • Data protection – privacy. 
  • Validity – accurate and fair. 

He added, “The research community understands the consumer is the foundation to what they do.” A cornerstone for ESOMAR and its members has always been, “Research participants are the lifeblood of the research industry.” The keys to progress? Trust; co-regulation; and buyer and supplier support. 

Reg suggested, “Structurally a good process leads to good outcomes”. Achieving such organizational accountability needs to embrace:

  • Privacy by design
  • Privacy Impact Assessments
  • Robust information security and governance practices:
    • Technology
    • Human resources
    • Training
  • Strict limits on data sharing

I believe he is right, and these various initiatives are critical. Otherwise, I can see a potential consumer revolt via evolving “anti-piracy” technologies that – quickly and very easily, even for non-techies – allow consumers complete and absolute anonymity in their digital activities, including even the most legitimate research. 

Reg announced that a new guideline for researchers and clients processing secondary data for research is being developed by ESOMAR and the Global Research Business Network (GRBN). Stay tuned! 

To view the entire webinar, feel free to register here.
