If you’ve read anything about privacy in the last few years, you’re certain to have come across the name Dr. Ann Cavoukian. And if you don’t recall her name, surely you’ve heard of her concept of Privacy by Design. With all the data breaches we’ve encountered over the last several years and the most recent debacle with Facebook and Cambridge Analytica, the value of privacy has never been more clear.
Privacy by Design is the idea that every piece of technology, every website, every tool and process ought to consider how to incorporate concepts of privacy from day one and throughout the entire development process. Historically, many products and services have been, and continue to be, built such that privacy is an afterthought – once the product or service has been fully developed, people try to figure out how to retroactively apply privacy components. This strategy can easily lead to unnecessary collection of data, awkward programming work-arounds, and privacy policies that are far too complex for regular people to understand. By accounting for privacy from the start, through Privacy by Design, many of these problems can be prevented or simplified.
Ann’s career is impressive. She had Privacy by Design in mind before serving three terms and 17 years as the Information and Privacy Commissioner for Ontario, the largest province in Canada. Now, she is a Distinguished Visiting Professor and Executive Director of Ryerson University’s Privacy and Big Data Institute. She is also a Senior Fellow of the Ted Rogers Leadership Centre at Ryerson University, and a Faculty Fellow of the Center for Law, Science & Innovation at the Sandra Day O’Connor College of Law at Arizona State University.
Her awards are numerous: she has been named one of the Top 25 Women of Influence in Canada, one of Canadian Business’s ‘Power 50’, and one of the Top 100 Leaders in Identity, and she received the Meritorious Service Medal from the Governor General of Canada for taking her Privacy by Design concept global.
What’s inspiring about Ann’s leadership is that she never wavered from her commitment to Privacy by Design. Twenty years ago, digital privacy wasn’t a thing. AOL Instant Messenger, Yahoo Messenger, MSN Messenger, and LiveJournal existed. Skype showed up in 2003, Facebook in 2004, and Reddit and YouTube in 2005. To the average person 20 years ago, privacy was boring and manifested as physically locked filing cabinets in locked rooms – impenetrable without two keys. Yet Ann had the foresight to realize that planning for digital privacy would become paramount. She’s held strong to this message for more than two decades.
Today, her Privacy by Design strategy has traversed the globe and been translated into 40 languages. In 2010, International Privacy Regulators unanimously passed a Resolution recognizing Privacy by Design as an international standard. As we progress with integrating artificial intelligence, machine learning, and deep learning with our marketing technologies, we must take care to implement Privacy by Design. Not because regulators say we should, but because Ann has repeatedly demonstrated that it’s the right thing to do.
You can find Ann on Twitter, LinkedIn, Wikipedia, at Ryerson University’s Privacy by Design Centre of Excellence where she is the Distinguished Expert-in-Residence, or at her foundation, Global Privacy and Security By Design.
You might like these posts too:
- Chemistry For The Greater Good: A leadership profile of Dr. Eugenia Duodu
- Why Love a Leader Anywhere Else: A leadership profile of Sleep Country Canada’s Christine Magee
- Leadership: Three Reasons to Believe in ‘the Why’
This post was written in my role as a consultant for Sklar Wilton & Associates. SW&A has worked for more than 30 years with some of Canada’s most iconic brands to help them grow their brand, shape corporate culture, build successful innovation, define portfolio strategies, and maximize research ROI. They offer strategic advice, business facilitation, research management, qualitative/quantitative research, and analytics. SW&A was recognized as a Great Workplace for Women in 2018, and the Best Workplace in Canada for Small Companies in 2017 by the Great Place To Work® Institute. Inquire about their services here.
From the front row of a packed room, I listened to a presenter discuss sharing YouTube videos with their clients in order to help them better understand consumers. As a power user of social media, and having extensive experience with social media listening, I completely understand the reasoning behind this. Qualitative researchers too have long understood the importance of bringing individual consumers into the boardroom using video evidence. Of course, as a huge proponent of privacy and ethical standards, I had a question for the speaker, one that I have posed many times before including earlier in the day to another speaker at the same conference.
Did you get permission from each person before sharing their videos with your clients?
Sadly, I got the full set of responses I expected.
1) “We don’t ask for permission when we’re sharing videos in the office or pointing things out to one person”
This, I completely understand. It’s not too different from seeing a funny cat video and calling your friends over to watch it with you. Any brand manager can go online while at work or in their leisure time, search for videos and comments related to their brand, and watch them ad nauseam. Social media, like YouTube, is there for random people to find and appreciate random bits of content.
2) “We like to act first and get permission later”
This thoroughly disappointed me. Do I want my consumers to know that’s how I feel about them? That I don’t care about their right to privacy? That I don’t care if they might be embarrassed, ashamed, or humiliated to find out that a video they made for their friends was inserted into an official report, passed around the office to be dissected for hours, and then permanently saved on multiple cloud servers?
Given that I am also a human being trying to understand other human beings, it is essential to me that I reinterpret everything I hear from the point of view of a regular human being. With that in mind, ‘getting permission later’ feels creepy and invasive. It feels like a violation of my personal space. It feels disrespectful. It erodes my trust in the market research profession. It makes me want to stop participating in market research activities like questionnaires and focus groups. Return to the perspective of the researcher and see how these consumer perceptions lead to increased recruitment costs, increased incentive costs, and increased data collection costs. Getting permission from consumers later is an extremely unwise decision. It invites negative and destructive press. Do you remember the PatientsLikeMe incident? It didn’t turn out well for them.
If the ethics of not obtaining prior permission don’t bother you, are you more easily convinced by a massive hit to your wallet?
3) “The market research industry doesn’t take enough risks”
This baffles me. You’re okay risking my privacy? You’re okay with the risk of humiliating me? You’re okay risking getting caught? Yes, it’s very easy to take a risk with someone else’s personal life.
If we think about taking risks from the innovation side of things, well, how does respecting and treating people ethically, and being concerned for their privacy and personal space conflict with innovation? How does waiting a day, or even seven days, to get permission to use a personal video stall innovation so much so that a business cannot be profitable? The market research industry does need to push innovation boundaries but it never needs to do so at the expense of the very human being we’re trying to understand.
Imagine you’re a giant, global brand. You’ve just spoken to 500 people at a conference. A journalist in the crowd takes down your words verbatim.
Would you be embarrassed, ashamed, or humiliated if a TV news anchor quoted your brand on live TV saying, “Use their videos first and get permission later”?
P.S. yes, the Venn diagram is vastly out of proportion. Discuss.
Live note taking at the 2016 MRIA annual conference in Montreal. Any errors or bad jokes are my own. If you think any of this is legal advice, turn off your internet right now and grab a colouring book and crayons instead.
Panelists: Patrick Cruikshank, Eric Dolden, Derrick Leue, Serge Solski
- What is cyberrisk – extortion, online wire fraud, identity theft
- Legal trends – this legal speaker sees 3 claims per month; Canada protects all aspects of a person, including which brand of pop they like and what TV shows they watch, not just their financial or medical records; it doesn’t matter whether the disclosure was knowing, careless, or preventable, you are liable; if you give away confidential information when you know it’s confidential, you are liable for the costs and profits
- Businesses don’t report every issue because it could put their reputation at risk
- Are market research companies too small for hackers to come after them? Absolutely not. Geography doesn’t matter. You are just a number on the Internet, crimes of opportunity. 80% of attacks are from external parties [yikes 20% are YOUR employees!]; They just need a door to get in and then they can figure out how to get $ from you.
- The newest legislation moved us closer to the American model. For snooping or taking of data without consent, there is an obligation to report to the privacy commissioner, whether provincial or federal. If there is a possibility of harm, you are obligated to notify the persons that their information was compromised. Not every unauthorized access requires notification because there may be no risk of harm, whether physical, emotional, identity theft, financial loss, loss of business, reputational harm, risk of humiliation, loss of relationship, public safety or health. Snooping without taking also counts.
- PIPEDA protects only PII.
- Breach of confidence is different – giving away information knowingly, trying to get paid twice for the same thing, maybe it’s careless such as an email with an unintended recipient and that would be negligence
- [listening to these speakers makes me really wonder about what I have in my emails, how much PII or confidential information is in there? How many unintended people have I emailed?]
- [really glad MRIA included this session right after the main keynote. This is massively important and business threatening information that we all must know]
- Someone could easily lock us out of our own systems unless we pay them $500,000. Would we tell the right people, given that disclosure would threaten current and future business? It can make more sense to pay up rather than report it.
- In every case, even when there was zero harm, judges have said consumers are owed damages because their privacy was compromised; awards are around $5,000, up to a high of $20,000 in cases of deliberate negligence
- Look at known vulnerabilities like firewalls and failing to update systems; employees need to know how to avoid creating holes in the firewall; need to constantly update systems; make sure the team doesn’t destroy evidence or you can’t prove that YOU didn’t do it
- Most Canadians don’t have adequate insurance for cyberrisk; we’re covered for fire, injury, financial loss, and liability, but these don’t cover information loss or denial of service attacks
- Better to have one insurance company that covers all the issues as opposed to one covering physical loss and another covering information loss
- Human error is one of the best arguments for buying cyberrisk insurance
- Directors and officers have been named in claims for not being efficient in dealing with issues or not ensuring they stay up to date with issues – e.g., not responding after two reminders, not heeding recommendations
- Small companies probably won’t survive cybercrime while big companies might make it through
- EXPECT to be attacked, this is a hard fact. Be prepared because people and technology have weaknesses. Someone WILL click on that link and download that virus.
Ten Emerging Privacy Challenges for Marketing Research & How to Navigate Them by Howard Fienberg and Stuart Pardau #ISC2015 #MRX
- This is not legal advice 🙂 [and along those lines, please assume my notes are completely wrong. do the research properly and that doesn’t mean perusing this blog post.]
- There are federal and state laws, then more laws segmented by the vertical, and by modality of how you collect information
- a data breach can cost millions; if one breached record costs $200, then thousands or millions of breached records is huge money
- be transparent about what you do and don’t do, accurately describe what you do
- data security breaches
- the PlayStation (Sony) cyber attack, Target, Home Depot, and Anthem all lost millions of records; most states have data breach notification laws, and when a breach occurs, you must report it
- states have different definitions of PII, different time frames, safe harbour for encryption so advisable to encrypt, build your policy for the most restrictive policy
- Do Not Track requests – you need to specify whether you honour these request though you aren’t required to honour them [wow, did not know that]
- Eraser Law – minors have a right to be forgotten, if you know they are minors or your site appeals to minors it applies to you
- 2 beacons and mobile tracking
- tracking in and around brick and mortar stores without cash register receipts
- where is data going and where is it being shared, can you opt out, how identifiable is the data
- are consumers notified when they’re being tracked? if you don’t like it you can turn off your device [that makes me uncomfortable – I have to change my device as opposed to you buggering off?]
- Nomi tracking – say what you do and do what you say, they didn’t let people opt out because people didn’t know they were being tracked
- emerging area with great potential but must be very careful
- Spokeo case – firm does deep web crawls and aggregates it into reports, you can look up yourself or your friends
- must it be concrete harm to bring forward a case? if information is wrong and you can’t prove it, do you have a case? this case could open floodgates. in this case, the information seemed to be better than reality. [better is in the eye of the beholder]
- international data transfer – if you focus on US domestic you’re generally ok, but if one project is outside, then it matters
- if you work with EU, make sure you know the data principles; you can self-certify but then you must adhere to those principles, must renew it every year; requirements for regular privacy assessments; need a privacy officer if you have 250 or more employees
- MR is a data broker; the FTC won’t rule out MR
- policy makers are concerned with brokering data for marketing purposes, and verification of respondents
- need option to be able to delete all the information they have about you, this is because we are sometimes lumped in with creepy businesses
- Internet of things – hypothetical security risks right now, unauthorized use of PII, attacks on systems, personal safety
- focus on privacy by design, select providers carefully, control access and monitor constantly
- how do you deliver notifications on a device with no readout
- American Community Survey – gets response rates around 95% because it’s a government survey and because it’s mandatory, but mandatory upsets people; voluntary would cut response rates to 50% and we wouldn’t get data from about 40% of the country
- BYOD – bring your own device
- employees can access sensitive company data on their own device – HR, health, financial, trade secrets, client lists, confidential information, or, employees use company devices at home
- if it’s the company’s own device, you can control it or lock it; otherwise you have no control
- you still must notify if data is breached even though it’s not your device
- Employers could say you’re not allowed to use your own device, but this is not realistic; it’s better to have a policy
- Policy – onboarding documentation, agreements to keep data secure, remote data deletion and limits on apps, data retention, termination process, be clear on who pays for what
- Federal Trade Commission – deceptive and unfair trade practices, polices data privacy and data security
- LabMD/Wyndham hotels cases – failed to institute reasonable and appropriate security measures, case is under appeal and suspect FTC will be allowed to police data privacy
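One practical takeaway from the notes above is the encryption safe harbour: in many states, properly encrypted data can fall outside breach-notification triggers, so stored identifiers should never sit around in the clear. Here is a minimal sketch of one piece of that idea, pseudonymizing direct identifiers with a keyed hash so a leaked file exposes no raw PII. It uses only the Python standard library; the field names and the `pseudonymize` helper are my own invention for illustration, not a compliance recipe.

```python
import hashlib
import hmac
import secrets

# Assumption: a per-dataset secret "pepper" kept somewhere other than
# the data store itself (e.g., a secrets manager).
PEPPER = secrets.token_bytes(32)

def pseudonymize(identifier: str, pepper: bytes = PEPPER) -> str:
    """Replace a direct identifier (email, panel ID) with a keyed hash.

    HMAC-SHA256 rather than a bare hash, so the mapping cannot be
    rebuilt by brute-forcing common emails without the pepper.
    """
    return hmac.new(pepper, identifier.lower().encode("utf-8"),
                    hashlib.sha256).hexdigest()

# The same input always maps to the same token, so records can still be
# joined across files -- but the token reveals nothing on its own.
record = {"email": "respondent@example.com", "q1": "agree"}
safe_record = {"respondent_token": pseudonymize(record["email"]),
               "q1": record["q1"]}
```

Note this is pseudonymization, not full encryption of the dataset; it only protects the identifier column, and the most restrictive state definition of PII should still drive the overall policy.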
Live blogged in Nashville. Any errors or bad jokes are my own. Any typos are purely the fault of the iPad.
by Peter Milla and Dave Christiansen
CASRO has seen an increase in requests from clients and regulators for data privacy and security compliance
– code of standards
– safe harbor program
Compliance means conforming to a rule, like a policy or law. Clients want operational transparency.
Companies will require 50 percent fewer business process workers and 500 percent more digital business jobs, especially regulatory analysts and risk professionals. These jobs are generally only in larger companies. This includes privacy officers.
Privacy and security are symbiotic. This can be a crisis for MR. Privacy is appropriate use of the data. security is the confidentiality and integrity of data.
– you can’t just destroy data. what about all the backups? the saved copies that everyone has from their piece of the work?
– availability of data could impact life or death in some cases
What drives compliance
– client wants it [i hope vendors want it too. why is because clients want it?]
– legislation or regulation like HIPAA, GLB, COPPA, FTC, PIPEDA. you could be accused of an unfair trade practice for discontinuing a poor responder.
– gain a competitive advantage
[wow, typing on an iPad keyboard is quiet and completely unobtrusive when you lay it flat! But I can’t put pictures or links easily. Sorry.]
ISO 27002 – you can’t be certified against it, but you can be compliant
HIPAA compliance case study
– business associates now face liability. Uses not in accordance with BAA. failure to limit PHI. failure to provide breach notification. failure to provide HHS access when required. failure to comply with security rule.
– many companies state one year but they keep it forever
– Protected Health Information PHI.
– employees don’t usually intend to make errors, they just don’t know
– no easy checklist of requirements
– does offer a set of principles. instruction is to take necessary steps to disclose minimum necessary information
– much is process based
HIPAA security rule compliance
– risk analysis – evaluate likelihood of risks, implement appropriate security measures, document those measures, maintaining continuous review and assessment, ensure access control and integrity control, ensure transmission security, keep documentation up to date
BLUE CROSS – just had a breach that affected 80 million US citizens, 25% of the population: names, social security numbers, birthdays. be sure to use your free annual credit report. Take advantage of free credit monitoring. monitor your children as well. be alert when filing your income taxes.
Top security trends
– cybercrime, privacy and regulation, third party provider threats and breaches, BYOx in the workplace – Bring Your Own Device [like i’m doing right now. are my office security systems on my personal tablet?]
[note to self and everyone. turn the GPS off on all of your devices. it is not necessary that every software program knows where you are, where you live, where you work, where your kids live]
Advanced Persistent Threat – APT
– China, Russia, and Iran have active cyber espionage programs, active in every industry to take whatever they can, causing the information security bar to be raised
Clients expect all their information is safe. need a dedicated person or team. CISSP, CISM, CISA, ISO, SDLC. [we have this person. they went to every single office in every country over the last couple weeks to remind every single person just how serious security issues are.]
[everyone should have come to this session. i don’t care if you think you’re doing fine. you need persistent reminders of just how worried you really ought to be.]
Information security is not IT security. it spans people, processes, and technology. it’s digital, written, and spoken. it’s being proactive. it’s an organizational discipline.
– best practices for information security: NIST, CSF, COBIT. can be audited and certified. ‘best practice’ is the policies, procedures, controls, and training.
– it is not industry specific. it is federal, state, industry, contractual relevant.
– vulnerability assessment annually or quarterly, penetration testing, gap assessment, awareness training, internal audit, risk assessment.
[Annie’s free public service announcement – do an internal audit today. if it looks like spam, it probably is. if it doesn’t look like what I usually email to you, I probably didn’t email it to you.]
I did it. Yes. I broke down and spent my Christmas money. Let’s put aside the fact that I still get Christmas money from the moms and move on to what I spent it on.
In just six to eight weeks, this pretty little plum coloured Fitbit will arrive at my door. (The “make it pink so girls will buy it” marketing scheme works on me but plum is just as good.)
Supposedly, it will monitor my heart rate all the time including when I am awake and asleep. It would have been cool to have it a few weeks ago when my four wisdom teeth were ripped out of my face but I’m sure some other quite unpleasant event will greet me soon enough.
I’m quite looking forward to learning:
– how consistent my sleep is, and how many times I wake up at night
– whether my heart rate speeds up or slows down when I get ready for work or leave work, or when I go to my awesomely fun ukulele class
– how incredibly nuts my heart rate is when I speak at conferences, show up at cocktail hour, plow through a crowded exhibit hall. Though I may seem calm and relaxed, it really takes a ton of mind games to turn quiet me into loud me.
And at the same time, I’ll be wondering… If someone gets their hands on my data, what will they do with it? What products will they develop as they learn about me? What heart rate medications will they need to sell to me? What fitness products will they need to sell to me? Will I need to buy the shirt version to measure electrical outputs? The sock version to measure sweat outputs? The earbud version to measure brainwaves? What will marketers and brand managers learn about me and my lifestyle?
Now that I think about it, this is MY form of gamification. I can’t wait to see charts, watch trends, and compare norms. And now that I’m learning Python and rstats, I would love to get my hands on a dataset of millions of people and millions more records. With permission, of course.
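Even a toy version of that dataset is fun to poke at. Here is a minimal Python sketch of the kind of summary I’m imagining, with completely invented numbers (these readings are made up, not real Fitbit output, and the 70/85 bpm thresholds are arbitrary guesses):

```python
from statistics import mean

# Hypothetical overnight heart rate readings in bpm -- invented data.
overnight = [52, 54, 51, 88, 53, 50, 49, 91, 52, 53]

# Crude resting estimate: average of the low readings.
resting = mean(hr for hr in overnight if hr < 70)

# Crude wake-up count: treat spikes as a proxy for waking moments.
wakeups = sum(1 for hr in overnight if hr >= 85)

print(f"estimated resting HR: {resting:.1f} bpm, wake-ups: {wakeups}")
```

Nothing here is how Fitbit actually computes its metrics; it’s just the shape of the questions — averages, thresholds, counts — that a night of minute-by-minute data would let me ask.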
Big Data and Privacy: The Legal Landscape Affecting Corporate Research by Shannon Harmon, JHC #CRC2014 #MRX
- our lives are a series of data points
- more opportunity vulnerability and the potential for greater abuse
- smaller entity might purchase data from 3rd party
- who owns the data, who has the right to access the data, what steps are taken to keep it secure
- the goal of any regulation is to protect personally identifiable information from breach and misuse
- you can identify people with very little information so keep in mind a lot of information is PII
- Notice and consent: need to provide notice of how the data will be used, and then obtain consent – this is the core of the law related to privacy, you need to make sure the right practices were followed to do this
- Where do we look for oversight? Right now, state attorney general, FTC, FCC, FDA
- Fair information practice principle – only collect what you need to collect and only retain it for as long as is necessary to fulfill the specified purpose
- FIPP – data quality and integrity – organizations should ensure that the PII is accurate, relevant, timely and complete and this is difficult if you’ve purchased the data, supplier should have a structure in place to ensure this
- Consumer privacy protection bill of rights – google search this – things corporations should do to protect privacy, this area will become increasingly more regulated so think ahead
- Fair Credit Reporting Act – example of what big data protection framework should look like, right to review your credit report and make sure it’s accurate and get it fixed if it’s not correct, this is where we’re headed, your digital dossier is being collected and you don’t know how decisions about you are being made, you can’t contest your big data points… right now
- special considerations for health data – apple has stated that any app developers cannot use any of the health data for advertising, or data-mining except to help an individual manage their health or for medical research. but is apple responsible for developer compliance? what if a data broker got the data from someone who wasn’t supposed to have it in the first place?
- considerations for researchers
- where is the data being obtained, what are the sources
- what practices are being used to obtain it and what is your confidence in your aggregator
- how is the data being used to arrive at conclusions? what algorithms? what human manipulation?
- think about the vendor/subcontractor relationship, is the contractor independent? a substandard contractor impacts you
- we need
- use restrictions – can’t use big data to discriminate on age, race, etc
- oversight – protect against unregulated digital dossiers
- KNOW YOUR INFORMATION SOURCE
- be intimately knowledgeable about your company’s data gathering practices – informed consent, opt-out, internal user access controls
- be ready to evolve as the law is only beginning to be developed in this area
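The point above that very little information is enough to identify someone is easy to demonstrate. A standard way to measure it is k-anonymity: group records by their quasi-identifiers (zip, birth year, gender) and find the smallest group. Here is a minimal sketch with a toy dataset; the records and field names are invented for illustration.

```python
from collections import Counter

# Toy records: no names, yet the quasi-identifier combination can be unique.
records = [
    {"zip": "M5V", "birth_year": 1979, "gender": "F"},
    {"zip": "M5V", "birth_year": 1979, "gender": "F"},
    {"zip": "M5V", "birth_year": 1983, "gender": "M"},
    {"zip": "K1A", "birth_year": 1979, "gender": "F"},
]

def k_anonymity(rows, quasi_ids):
    """Smallest group size when rows are grouped by the quasi-identifiers.

    k == 1 means at least one person is uniquely identifiable from
    supposedly 'anonymous' fields alone.
    """
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in rows)
    return min(groups.values())

k = k_anonymity(records, ["zip", "birth_year", "gender"])
```

In this toy set, two of the four records are unique on zip + birth year + gender, so k is 1 — exactly the situation where a dataset with no names in it is still PII.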
- The Oscars of Marketing Research: Peanut Labs’ Chief Research Officer wins ESOMAR’s Excellence Award for the Best Paper
- Why do people like marketing research surveys?
- In which I rant about showing data in presentations #MRX #CRC2014
- How marketing researchers can start being more ethical right now #MRX
- Discover the Science of Fascination by Sally Hogshead, Fascinate, Inc. #CRC2014 #MRX
Today, I was pleased and, more correctly, honoured to appear before a Senate Committee to speak with Kara Mitchelmore, the CEO of the MRIA, regarding Senate Bill S-4, the Digital Privacy Act. The official opinion will shortly be available but for those of you who can’t wait, here is the basic gist of it. Any inaccuracies here are my own.
1) Breach notifications should be mandatory, and the Privacy Commissioner should be the unbiased third party that determines what is a real risk of significant harm to an individual.
2) The MRIA supports the provisions in the bill which add clarity to what is valid consent. The committee may be interested in our code of conduct, which contains a section on the ethical issues in dealing with children and young people.
3) The MRIA is pleased that PIPEDA will be amended to allow the transfer of personal information from an organization to a prospective purchaser or business partner (think mergers and acquisitions).
4) The MRIA does not support allowing organizations to share personal information of individuals with other organizations without consent. It should follow due process, such as through a court order.
5) The MRIA would like to close a loophole which allowed organizations to share personal information without consent with an investigative body or government institution. It should follow due process, such as through a court order.
After we spoke, Michael Geist, a law professor at the University of Ottawa, made numerous excellent points (Michael’s website). Some of his comments are included here (any errors or misrepresentations are my own).
- desire for a lower standard of what constitutes a breach (i.e., it doesn’t need to be a real risk of significant harm, it can be less than that)
- increased reporting of breaches both major and minor, as well as breaches to unauthorized persons that may not have caused ‘harm’
- the expansion of warrantless disclosure must be removed
- order making powers are necessary
- public reporting of the number of disclosures without a warrant should be made on a quarterly basis and individuals should be notified within a certain period
- What is Vue magazine? #MRX (lovestats.wordpress.com)
- Canada’s Digital Privacy Act lets companies share customers’ personal info, privacy critics warn (blogs.vancouversun.com)
- Can Canada’s Likely New Privacy Commissioner Be Trusted to Watch the Watchers? (motherboard.vice.com)
- Why has the Canadian government given up on protecting our privacy? (thestar.com)
- Peanut Labs Ask-Me-Anything with special guest Jim Bryson (web.peanutlabs.com)
- Peanut Labs Ask-Me-Anything with special guest Tamara Barber (web.peanutlabs.com)
8:45 KEYNOTE: Data, Data Everywhere – The Need for BIG Privacy in a World of Big Data by Ann Cavoukian, Ph.D., Information and Privacy Commissioner of Ontario, Canada #FOCI14 #MRX #GreatTalk
- big data and privacy are complementary interests
- privacy by design is a win win proposition
- if you don’t address privacy concerns, there will be a backlash
- privacy = personal control, freedom of choice, informational self-determination, context is key
- in 2010, passed this landmark resolution to preserve the future of privacy, has been translated into 36 languages because people are so desperate for this information
- essence of it is to change the emphasis from a win-lose model to a win-win model, replace ‘vs’ with ‘and’
- you must address privacy at the beginning of a program, embed it into the code at the beginning
- 7 principles –
- be proactive not reactive, prevention not remedial
- default condition needs to be privacy
- privacy embedded into design
- full functionality, positive sum not zero sum
- end to end security, full lifecycle protection, from the outset, from collection to destruction at the end
- visibility and transparency, keep it open, tell customers what you’re doing, don’t let them find out afterward
- respect for user privacy, keep it user centric
- Big data will rule the world – honeymoon phase, everything else must step aside, forget causality, correlation is enough
- Then the honeymoon phase ends – found data… digital exhaust of web searches, credit card payments, mobiles pinging the nearest phone mast; these datasets are cheap to collect but they are messy and collected for disparate purposes
- Big data is now in the trough of disillusionment
- Google flu trends used to work and now doesn’t because Google engineers weren’t interested in context but rather selecting statistical patterns in the data – correlation over causation, a common assumption in big data analysis, imputed causality which is incorrect
- MIT professor Alex Pentland has proposed a New Deal on Data – individuals to own their data and control how it is used and distributed
- data problems don’t disappear just because you are working with big data instead of small data, you can’t just forget about data sampling
- Forget big data, what is needed is good data
- data analytics on context free data will only yield correlations, add context and then you might be able to impute causality
- once businesses have amassed personal information, it can be hard if not impossible for individuals to know how it will be used in the future – “A long way to privacy safeguards” New York Times Editorial
- privacy is not a religion – if you want to do nothing, you can do nothing. but let people choose to do something
- people now have to resign when data breaches happen, you must address them at the beginning
- privacy should be treated as a business issue, not a compliance issue. gain a competitive advantage by claiming privacy, lead with it
- proactive costs money but reactive costs lawsuits, brand damage, loss of trust, loss of consumer confidence
- privacy drives innovation and creativity
- privacy is a sustainable competitive advantage
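The second of the seven principles above, privacy as the default, translates directly into code: every setting that shares or retains data should start in its most protective state, so a user who does nothing gives up nothing. A minimal sketch of what that looks like; the setting names and the 30-day retention figure are invented for illustration, not anything prescribed by the framework.

```python
from dataclasses import dataclass

@dataclass
class TrackingPrefs:
    # Privacy by Design principle 2: the defaults protect the user.
    # Sharing anything requires an explicit opt-in, never an opt-out.
    analytics: bool = False
    ad_personalization: bool = False
    location_history: bool = False
    data_retention_days: int = 30  # the shortest retention, not the longest

prefs = TrackingPrefs()  # a user who does nothing shares nothing
```

Embedding the protective choice in the type’s defaults, rather than in a settings screen the user must discover, is the "proactive not reactive" idea in its smallest possible form.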