Tag Archives: privacy

Pitting ethics against innovation in marketing research

From the front row of a packed room, I listened to a presenter discuss sharing YouTube videos with their clients in order to help them better understand consumers. As a power user of social media, and having extensive experience with social media listening, I completely understand the reasoning behind this. Qualitative researchers too have long understood the importance of bringing individual consumers into the boardroom using video evidence. Of course, as a huge proponent of privacy and ethical standards, I had a question for the speaker, one that I have posed many times before including earlier in the day to another speaker at the same conference. 

Did you get permission from each person before sharing their videos with your clients?

Sadly, I got the full set of responses I expected.

1) “We don’t ask for permission when we’re sharing videos in the office or pointing things out to one person”

This, I completely understand. It’s not too different from seeing a funny cat video and calling your friends over to watch it with you. Any brand manager can go online while at work or in their leisure time, search for videos and comments related to their brand, and watch them ad nauseam. Social media, like YouTube, is there for random people to find and appreciate random bits of content.

2) “We like to act first and get permission later”

This thoroughly disappointed me. Do I want my consumers to know that’s how I feel about them? That I don’t care about their right to privacy? That I don’t care if they might be embarrassed, ashamed, or humiliated to find out that a video they made for their friends was inserted into an official report, passed around the office to be dissected for hours, and then permanently saved on multiple cloud servers?

Given that I am also a human being trying to understand other human beings, it is essential to me that I reinterpret everything I hear from the point of view of a regular human being. With that in mind, ‘getting permission later’ feels creepy and invasive. It feels like a violation of my personal space. It feels disrespectful. It erodes my trust in the market research profession. It makes me want to stop participating in market research activities like questionnaires and focus groups. Return to the perspective of the researcher and see how these consumer perceptions lead to increased recruitment costs, increased incentive costs, and increased data collection costs. Getting permission from consumers later is an extremely unwise decision. It begs for negative and destructive press. Do you remember the Patients Like Me incident? It didn’t turn out well for them. 

If the ethics of not obtaining prior permission don’t bother you, are you more easily convinced by a massive hit to your wallet?

3) “The market research industry doesn’t take enough risks”

This baffles me. You’re okay risking my privacy? You’re okay with the risk of humiliating me? You’re okay risking getting caught? Yes, it’s very easy to take a risk with someone else’s personal life.

If we think about taking risks from the innovation side of things, well, how does respecting and treating people ethically, and being concerned for their privacy and personal space conflict with innovation? How does waiting a day, or even seven days, to get permission to use a personal video stall innovation so much so that a business cannot be profitable? The market research industry does need to push innovation boundaries but it never needs to do so at the expense of the very human being we’re trying to understand.

Imagine you’re a giant, global brand. You’ve just spoken to 500 people at a conference. A journalist in the crowd takes down your words verbatim.

Would you be embarrassed, ashamed, or humiliated if a TV news anchor quoted your brand on live TV saying “Use their videos first and get permission later”?

P.S. yes, the Venn diagram is vastly out of proportion. Discuss.


Panel: Privacy Breaches – Blood in the water #MRIA2016 #MRIA16 #NewMR 

Live note taking at the 2016 MRIA annual conference in Montreal. Any errors or bad jokes are my own. If you think any of this is legal advice, turn off your internet right now and grab a colouring book and crayons instead.

Panelists: Patrick Cruikshank, Eric Dolden, Derrick Leue, Serge Solski

  • What is cyberrisk – extortion, online wire fraud, identity theft
  • Legal trends – this legal speaker sees 3 claims per month; Canada protects all aspects of a person, including which brand of pop they like and what TV shows they watch, not just their financial or medical records; it doesn’t matter whether it was knowing, careless, or preventable – you are liable; if you give away confidential information, even when you know it’s confidential, you are liable for the costs and profits
  • Businesses don’t report every issue because it could put their reputation at risk
  • Are market research companies too small for hackers to come after them? Absolutely not. Geography doesn’t matter. You are just a number on the Internet, crimes of opportunity. 80% of attacks are from external parties [yikes 20% are YOUR employees!]; They just need a door to get in and then they can figure out how to get $ from you.
  • Newest legislation moved us closer to the American model. For snooping or taking of data without consent, there is an obligation to report to the privacy commissioner, whether provincial or federal. If there is a possibility of harm, you are obligated to notify the persons that their information was compromised. Not every unauthorized access requires notification because there may be no risk of harm, whether physical, emotional, identity theft, financial loss, loss of business, reputational harm, risk of humiliation, loss of relationship, public safety or health. Snooping without taking also counts.
  • PIPEDA protects only PII.
  • Breach of confidence is different – giving away information knowingly, trying to get paid twice for the same thing, maybe it’s careless such as an email with an unintended recipient and that would be negligence
  • [listening to these speakers makes me really wonder about what I have in my emails, how much PII or confidential information is in there? How many unintended people have I emailed?]
  • [really glad MRIA included this session right after the main keynote. This is massively important and business threatening information that we all must know]
  • Someone could easily lock us out of our own systems unless we pay them $500,000. Would we tell the right people? Reporting could threaten your current and future business. It can make more sense to pay up rather than report it.
  • In every case, even when there was zero harm, judges have said consumers are owed damages because their privacy was compromised; awards are around $5,000 up to a high of $20,000 in cases of deliberate negligence
  • Look at known vulnerabilities like firewalls and failing to update systems; employees need to know how to avoid creating holes in the firewall; constantly update systems; make sure your team doesn’t destroy evidence or you can’t prove that YOU didn’t do it
  • Most Canadians don’t have adequate insurance for cyberrisk; we’re covered for fire, injury, financial loss, and liability, but these don’t cover information loss or denial of service attacks
  • Better to have one insurance company that covers all the issues as opposed to one covering physical loss and one covering information loss
  • Human error is one of the best arguments for buying cyberrisk insurance
  • Directors and officers have been named in claims for not being efficient in dealing with issues or not ensuring they stay up to date with issues – e.g., not responding after two reminders, not heeding recommendations
  • Small companies probably won’t survive cybercrime while big companies might make it through
  • EXPECT to be attacked, this is a hard fact. Be prepared because people and technology have weaknesses. Someone WILL click on that link and download that virus.

Rights Of Respondents #AAPOR

Live note taking at #AAPOR in Austin Texas. Any errors or bad jokes are my own.


Moderator: Ronald Langley, University of Kentucky

Examining the Use of Privacy Language: Privacy from the Respondent’s View; Kay Ricci, Nielsen; Lauren A. Walton, Nielsen; Ally Glerum, Nielsen; Robin Gentry, Nielsen

  • Respondents have concerns about the safety and security of their data; they want to know how data is collected and stored; we hear about breaches all the time now
  • In 2015, FCC issued TCPA rules re automatic telephone dialling systems, can’t use them for cell phones without consent
  • The existing page of legalese was terrifying and could affect key metrics
  • 3 steps – in depth interviews, implemented language into TV diary screener, analyzed key metrics to understand impact of new language
  • 30 English interviews, 16 Spanish interviews
  • Did people notice the language, read it, skim it, did they care? Did they understand the terms, was the language easy or difficult
  • Tested several versions, tested a longer version with softer language
  • Only one person understood what an autodialler was, people didn’t realize it was a live interviewer, people didn’t care how their number was dialled if they were going to talk to a live human anyways
  • 2/3 didn’t like the concept, thought they’d be phoned constantly, 1/3 didn’t mind because it’s easier to hang up on a machine
  • People liked we weren’t selling or marketing products, but many don’t see the difference
  • Many people don’t know who Nielsen is
  • People liked being reminded that it was voluntary, extra length was fine for this
  • The after version was longer with more broken up sentences
  • The test group had a slightly lower return rate and a lower mail rate
  • Higher return rate for 50 plus, and Hispanic
  • Contact number provision was much lower, drop from 71% to 66%
  • It’s essential to pretest so you know the impact
  • [simple language is always better even if it takes more space]

Allowing Access to Household Internet Traffic: Maximizing Acceptance of Internet Measurement; Megan Sever, Nielsen; Sean Calvert, Nielsen

  • How do we measure what people watch and buy online in their home
  • How do we access third party data, and then how do we get true demographic information to go with it
  • 22 semi-structured interviews – a mock recruit asking “please share your password”
  • Responses ranged from “absolutely yes” – those people believe it’s already being collected anyways
  • Sceptics wanted more information – what are you actually recording, how is my data secure
  • Privacy – security – impact on Internet performance
  • People seemed to think researchers would screencap everything they were doing, that they could see their bank account
  • Brought examples of real data that would be collected, what the research company will see, essentially lines of code, only see URL, not the contents of the page, start and stop times; at this point participants were no longer concerned
  • Gave a detailed description of encryption, storage, and confidentiality procedures
  • Explain we’re not marketing or selling and data is only stored as long as is necessary
  • Reputation of the research company builds trust, more familiar folks were okay with it
  • Script should describe purpose of measurement, what will and will not be measured, how it will be measured, data security privacy and protection policies, impact on Internet performance, reputation of company
  • Provide supplementary information if asked for – examples of data, policies that meet or exceed standards, examples of Internet performance, background and reputation of the company
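The speakers’ point that showing participants real examples of the collected data dissolved their concerns can be sketched concretely. Here is a toy Python illustration of the kind of metadata-only record the session described – a URL plus start and stop times, never the page contents. The field names and values are my own invention, not Nielsen’s actual format:

```python
# Hypothetical shape of a traffic-metadata record, as described in the talk:
# only the URL and start/stop times are kept, never the page contents.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TrafficRecord:
    url: str          # the address visited, not what was on the page
    start: datetime   # when the connection opened
    stop: datetime    # when it closed

record = TrafficRecord(
    url="https://example.com/news",
    start=datetime(2016, 5, 1, 19, 30, 0),
    stop=datetime(2016, 5, 1, 19, 42, 10),
)

# Duration is about all a researcher can derive; bank balances, message
# text, and page contents are simply not in the record.
print((record.stop - record.start).total_seconds())  # 730.0
```

Seeing that a record is only a line like this, rather than a screen capture, is what reportedly ended participants’ concerns.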

Informed Consent: What Do Respondents Want to Know Before Survey Participation; Nicole R. Buttermore, GfK Custom Research; Randall K. Thomas, GfK Custom Research; Jordon Peugh, SSRS; Frances M. Barlas, GfK Custom Research; Mansour Fahimi, GfK Custom Research

  • Recall the kerfuffle last year about what information should be told to respondents re sponsor or person conducting the research 
  • Recognized participants should be told enough information to give informed consent – but if we are concerned about bias, then we can tell people they won’t be debriefed until afterwards; but companies said sometimes they could NEVER reveal the sponsor and they’d have to drop out of #AAPOR if this happened
  • We worry about bias and how knowing the sponsors affects the results
  • Sponsor information is a less important feature to respondents
  • Do respondents view taking surveys as risky? What information do respondents want prior to completing surveys?
  • Topic, my time, and my incentive are thought to be most important
  • People were asked about surveys in general, not just this one or just this company
  • 6% thought an online survey could have a negative impact 
  • Most worried about breaches of privacy and confidentiality; less worried that the survey is a waste of time or boring, or might upset them
  • 70% said no risk to mental health, 2% said high risk to mental health
  • 23% said they stopped taking a survey because it made them uncomfortable – made them think more deeply about life, made them angry, made them feel worse about themselves, made them feel sad, or increased their stress
  • Politics made them angry, bad questions made them angry, biased survey and too long survey made them angry [That’s OUR fault]
  • Same for feeling stressed, but also add in finance topics
  • Feel worse about self is the finance topic or health, or about things they can’t have
  • Feel sad related to animal mistreatment
  • People want to know how personal information will be protected, survey length, risks, topic, how results will be used, incentives, purpose of survey – at least one third of people [1/3 might not seem like a lot but when your sample is 300 people that’s 100 people who want to know this stuff]
  • Similar online vs phone; incentives more important for online; on the phone people want to know what types of questions will be asked

Communicating Data Use and Privacy: In-person versus Web Based Methods for Message Testing; Aleia Clark Fobia, U.S. Census Bureau; Jennifer Hunter Childs, U.S. Census Bureau

  • Concern that different messages in different places weren’t consistent
  • Is there a difference between “will only be used for statistical purposes” and “will never be used for non-statistical purposes”?
  • Tested who will see data, identification, sharing with other departments, burden of message
  • Tested it on a panel of census nerds :), people who want to be involved in research, 4000 invites, 300 completes
  • People were asked to explain what each message means, broke it down by demographics
  • 30 cognitive interviews, think aloud protocol, reads sets of messages and react, tested high and low performing messages [great idea to test the low messages as well]
  • Focused on lower education and people of colour
  • Understanding is higher for in person testing, more misunderstanding in online responses, “You are required by law to respond to the census (technical reference)” was better understood than listing everything in a statement
  • People want to know what ‘sometimes’ means. And want to know which federal agencies – they don’t like the IRS
  • People don’t believe the word never because they know breaches happen
  • More negative on the web
  • Less misunderstanding in person
  • Easier to say negatives things online
  • In person was spontaneous and conversation
  • Focus on small words, avoid unfamiliar concepts, don’t know what tabulate means, don’t know what statistical means [they aren’t stupid, it’s just that we use it in a context that makes no sense to how they know the word]

    Respondent Burden & the Impact of Respondent Interest, Item Sensitivity and Perceived Length; Morgan Earp, U.S. Bureau of Labor Statistics; Erica C. YuWright, U.S. Bureau of Labor Statistics

    • 20 knowledge questions, 10 burden items, 5 demographic questions, ten minute survey
    • Some questions were simple, others were long and technical
    • Respondents asked to complete a follow up survey a week later
    • Asked people how hard the survey was related to taking an exam at school or reading a newspaper or completing another survey – given only one of these comparisons 
    • Anchor of school exam had a noticeable effect size but not significant 
    • Burden items – length, difficulty, effort, importance, helpfulness, interest, sensitivity, intrusive, private, burden
    • Main effects – only sensitivity was significant, effect size is noticeable
    • Didn’t really see any demographic interactions
    • The three factors: burden, length, and difficulty; effort, importance, helpfulness, and interest; sensitivity, intrusiveness, and privacy
    • Only first factor related to whether they would answer the second survey
    • Females more likely to respond a second time
    • More sensitive surveys were less likely to be answered again; more interesting ones would attract more women the second time

      Ten Emerging Privacy Challenges for Marketing Research & How to Navigate Them by Howard Fienberg and Stuart Pardau #ISC2015 #MRX

      MRALive blogged from the 2015 MRA Insights & Strategies Conference, June 3-5, 2015 in San Diego. Any errors or bad jokes are my own.

      • This is not legal advice 🙂  [and along those lines, please assume my notes are completely wrong. do the research properly and that doesn’t mean perusing this blog post.]
      • There are federal and state laws, then more laws segmented by the vertical, and by modality of how you collect information
      • a data breach can cost millions: if one breached record costs $200, then thousands or millions of breached records is huge money
      • be transparent about what you do and don’t do, accurately describe what you do
      • data security breaches
      • PlayStation, the Sony cyber attack, Target, Home Depot, and Anthem all lost millions of records; most states have data breach notification laws – when a breach occurs, you must report it
      • states have different definitions of PII, different time frames, safe harbour for encryption so advisable to encrypt, build your policy for the most restrictive policy
      • must have a conspicuous descriptive privacy policy
      • Do Not Track requests – you need to specify whether you honour these request though you aren’t required to honour them [wow, did not know that]
      • Eraser Law – minors have a right to be forgotten, if you know they are minors or your site appeals to minors it applies to you
      • beacons and mobile tracking
      • tracking in, around, and between brick and mortar stores without cash register receipts
      • where is data going and where is it being shared, can you opt out, how identifiable is the data
      • are consumers notified when they’re being tracked? if you don’t like it you can turn off your device [that makes me uncomfortable – *I* have to change my device as opposed to you buggering off?]
      • Nomi tracking – say what you do and do what you say, they didn’t let people opt out because people didn’t know they were being tracked
      • emerging area with great potential but must be very careful
      • Spokeo case – firm does deep web crawls and aggregates it into reports, you can look up yourself or your friends
      • must it be concrete harm to bring forward a case? if information is wrong and you can’t prove it, do you have a case; this case could open floodgates. in this case, the information seemed to be better than reality. [better is in the eye of the beholder]
      • international data transfer – if you focus on US domestic you’re generally ok, but if one project is outside, then it matters
      • if you work with EU, make sure you know the data principles; you can self-certify but then you must adhere to those principles, must renew it every year; requirements for regular privacy assessments; need a privacy officer if you have 250 or more employees
      • MR is a data broker; the FTC won’t rule out MR
      • policy makers are concerned with brokering data for marketing purposes, and verification of respondents
      • need option to be able to delete all the information they have about you, this is because we are sometimes lumped in with creepy businesses
      • Internet of things – hypothetical security risks right now, unauthorized use of PII, attacks on systems, personal safety
      • focus on privacy by design, select providers carefully, control access and monitor constantly
      • how do you deliver notifications on a device with no readout
      • American community survey – gets response rates around 95% because it’s a government survey and it’s mandatory, but mandatory upsets people; voluntary would cut response rates to 50% and we wouldn’t get data from about 40% of the country
      • BYOD – bring your own device
      • employees can access sensitive company data on their own device – HR, health, financial, trade secrets, client lists, confidential information, or, employees use company devices at home
      • if it’s your own device, you can control it or lock it; otherwise you have no control
      • you still must notify if data is breached even though it’s not your device
      • Employers could say you’re not allowed to use your own device but this is not realistic; it’s better to have a policy
      • Policy – onboarding documentation, agreements to keep data secure, remote data deletion and limits on apps, data retention, termination process, be clear on who pays for what
      • Federal Trade Commission – deceptive and unfair trade practices, polices data privacy and data security
      • LabMD/Wyndham hotels cases – failed to institute reasonable and appropriate security measures, case is under appeal and suspect FTC will be allowed to police data privacy
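The per-record figure mentioned above makes the “huge money” point easy to see with back-of-envelope arithmetic. A quick sketch – the $200 per breached record is the speakers’ example; the record counts are illustrative, and none of this is legal or financial advice:

```python
# Back-of-envelope breach cost at $200 per breached record,
# the figure mentioned in the session.
COST_PER_RECORD = 200  # USD

for records in (1_000, 100_000, 1_000_000):
    cost = records * COST_PER_RECORD
    print(f"{records:,} records -> ${cost:,}")
# 1,000 records -> $200,000
# 100,000 records -> $20,000,000
# 1,000,000 records -> $200,000,000
```

Even a small panel database leaked in full puts a research firm into six-figure exposure before reputational damage is counted.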

      Data Security… Don’t risk being the weakest link #CASRO #MRX

      Live blogged in Nashville. Any errors or bad jokes are my own. Any typos are purely the fault of the iPad.

      by Peter Milla and Dave Christiansen

      CASRO has seen an increase in requests from clients and regulators for data privacy and security compliance
      – code of standards
      – safe harbor program
      – ISO

Compliance means conforming to a rule, like a policy or law. Clients want operational transparency.

Companies will require 50 percent fewer business process workers and 500 percent more digital business jobs, especially regulatory analysts and risk professionals. These jobs are generally only in larger companies. This includes privacy officers.

Privacy and security are symbiotic. This can be a crisis for MR. Privacy is the appropriate use of the data. Security is the confidentiality and integrity of data.
– you can’t just destroy data. what about all the backups? the saved copies that everyone has from their piece of the work.
– availability of data could impact life or death in some cases

      What drives compliance
– client wants it [i hope vendors want it too. why is it only because clients want it?]
– legislation or regulation like HIPAA, GLB, COPPA, FTC, PIPEDA. you could be accused of unfair trade practice for discontinuing a poor responder.
      – gain a competitive advantage

[wow, typing on an iPad keyboard is quiet and completely unobtrusive when you lay it flat! But I can’t put pictures or links easily. Sorry.]

ISO 27002 – you can’t be certified, you can be compliant

      HIPAA compliance case study
      – business associates now face liability. Uses not in accordance with BAA. failure to limit PHI. failure to provide breach notification. failure to provide HHS access when required. failure to comply with security rule.
      – many companies state one year but they keep it forever

      HIPAA compliance
      – Protected Health Information PHI.
      – employees don’t usually intend to make errors, they just don’t know
      – no easy checklist of requirements
      – does offer a set of principles. instruction is to take necessary steps to disclose minimum necessary information
      – much is process based

      HIPAA security rule compliance
      – risk analysis – evaluate likelihood of risks, implement appropriate security measures, document those measures, maintaining continuous review and assessment, ensure access control and integrity control, ensure transmission security, keep documentation up to date

BLUE CROSS – just had a breach that affected 80 million US citizens, 25% of the population: names, social security numbers, birthdays. Be sure to use your free annual credit report. Take advantage of free credit monitoring. Monitor your children as well. Be alert when filing your income taxes.

      Top security trends
      – cybercrime, privacy and regulation, third party provider threats and breaches, BYOx in the workplace – Bring Your Own Device [like i’m doing right now. are my office security systems on my personal tablet?]

      [note to self and everyone. turn the GPS off all of your devices. it is not necessary that every software program knows where you are, where you live, where you work, where your kids live]

      Advanced Persistent Threat – APT
– China, Russia, and Iran have active cyber espionage, active in every industry to take whatever they can, causing the information security bar to be raised

Clients expect all their information is safe. need a dedicated person or team. CISSP, CISM, CISA, ISO, SDLC. [we have this person. they went to every single office in every country over the last couple weeks to remind every single person just how serious security issues are.]

      [everyone should have come to this session. i don’t care if you think you’re doing fine. you need persistent reminders of just how worried you really ought to be.]

Information security is not IT security. It spans people, processes, and technology. It’s digital, written, and spoken. It’s being proactive. It’s an organizational discipline.

      ISO27001
– best practice for information security (NIST CSF, COBIT). can be audited and certified. It’s the ‘best practice’ – the policies, procedures, controls, and training.
      – it is not industry specific. it is federal, state, industry, contractual relevant.

      Identify weaknesses
      – vulnerability assessment annually or quarterly, penetration testing, gap assessment, awareness training, internal audit, risk assessment.

      [Annie’s free public service announcement – do an internal audit today. if it looks like spam, it probably is. if it doesn’t look like what I usually email to you, i probably didn’t email it to you.]

      The Internet of Annie #MRX #IOT

      I did it. Yes. I broke down and spent my Christmas money. Let’s put aside the fact that I still get Christmas money from the moms and move on to what I spent it on.

      In just six to eight weeks, this pretty little plum coloured Fitbit will arrive at my door. (The “make it pink so girls will buy it” marketing scheme works on me but plum is just as good.)

Supposedly, it will monitor my heart rate all the time including when I am awake and asleep. It would have been cool to have it a few weeks ago when my four wisdom teeth were ripped out of my face but I’m sure some other quite unpleasant event will greet me soon enough.

      I’m quite looking forward to learning:
      – how consistent my sleep is, and how many times I wake up at night
– whether my heart rate speeds up or slows down when I get ready for work or leave work, or when I go to my awesomely fun ukulele class
      – how incredibly nuts my heart rate is when I speak at conferences, show up at cocktail hour, plow through a crowded exhibit hall. Though I may seem calm and relaxed, it really takes a ton of mind games to turn quiet me into loud me.

      And at the same time, I’ll be wondering… If someone gets their hands on my data, what will they do with it? What products will they develop as they learn about me? What heart rate medications will they need to sell to me? What fitness products will they need to sell to me? Will I need to buy the shirt version to measure electrical outputs? The sock version to measure sweat outputs? The earbud version to measure brainwaves? What will marketers and brand managers learn about me and my lifestyle?

Now that I think about it, this is MY form of gamification. I can’t wait to see charts, watch trends, and compare norms. And now that I’m learning Python and rstats, I would love to get my hands on a dataset of millions of people and millions more records. With permission of course.

      Big Data and Privacy: The Legal Landscape Affecting Corporate Research by Shannon Harmon, JHC #CRC2014 #MRX

Live blogging from the Corporate Researchers Conference in Chicago. Any errors or bad jokes are my own.

      • our lives are a series of data points
      • more opportunity, more vulnerability, and the potential for greater abuse
      • smaller entity might purchase data from 3rd party
      • who owns the data, who has the right to access the data, what steps are taken to keep it secure
      • the goal of any regulation is to protect personally identifiable information from breach and misuse
      • you can identify people with very little information so keep in mind a lot of information is PII
      • Notice and consent: need to provide notice of how the data will be used, and then obtain consent – this is the core of the law related to privacy, you need to make sure the right practices were followed to do this
      • Where do we look for oversight? Right now, state attorney general, FTC, FCC, FDA
      • Fair information practice principle – only collect what you need to collect and only retain it for as long as is necessary to fulfill the specified purpose
      • FIPP – data quality and integrity – organizations should ensure that the PII is accurate, relevant, timely and complete and this is difficult if you’ve purchased the data, supplier should have a structure in place to ensure this
      • Consumer privacy protection bill of rights – google search this – things corporations should do to protect privacy, this area will become increasingly more regulated so think ahead
      • Fair Credit Reporting Act – example of what big data protection framework should look like, right to review your credit report and make sure it’s accurate and get it fixed if it’s not correct, this is where we’re headed, your digital dossier is being collected and you don’t know how decisions about you are being made, you can’t contest your big data points… right now
      • special considerations for health data – apple has stated that any app developers cannot use any of the health data for advertising, or data-mining except to help an individual manage their health or for medical research. but is apple responsible for developer compliance? what if a data broker got the data from someone who wasn’t supposed to have it in the first place?
      • considerations for researchers
        • where is the data being obtained, what are the sources
        • what practices are being used to obtain it and what is your confidence in your aggregator
        • how is the data being used to arrive at conclusions? what algorithms? what human manipulation?
        • think about the vendor/subcontractor relationship, is the contractor independent? a substandard contractor impacts you
      • we need
        • use restrictions – can’t use big data to discriminate on age, race, etc
        • oversight – protect against unregulated digital dossiers
      • KNOW YOUR INFORMATION SOURCE
      • be intimately knowledgeable about your company’s data gathering practices – informed consent, opt-out, internal user access controls
      • be ready to evolve as the law is only beginning to be developed in this area
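The collection-limitation and retention principles above can be sketched in a few lines of code. This is a minimal illustration only; the field names and the 90-day retention window are assumptions for the example, not part of any standard or law:

```python
from datetime import datetime, timedelta

# Hypothetical policy: the fields this study actually needs, and how long
# records may be kept for the stated purpose (both values are assumed).
ALLOWED_FIELDS = {"respondent_id", "age_band", "responses", "collected_at"}
RETENTION = timedelta(days=90)

def minimize(record: dict) -> dict:
    """Keep only the fields the specified purpose requires (collection limitation)."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

def purge_expired(records: list, now: datetime) -> list:
    """Drop records older than the retention window (retention limitation)."""
    return [r for r in records if now - r["collected_at"] <= RETENTION]

record = {
    "respondent_id": "r-001",
    "email": "jane@example.com",   # not needed for analysis, so it is dropped
    "age_band": "35-44",
    "responses": [4, 2, 5],
    "collected_at": datetime(2014, 1, 1),
}
slim = minimize(record)
print(sorted(slim))  # ['age_band', 'collected_at', 'respondent_id', 'responses']
```

The point of the sketch is that minimization and retention are enforced in code at collection time, not remembered later, which matches the "only collect what you need" principle above.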

      Canadian Senate: Boring mumbo jumbo or fascinating privacy discussions? #MRX

      Bumped to first class!

      On my way to prep for our Senate meeting

      Today, I was pleased and, more correctly, honoured to appear before a Senate Committee to speak with Kara Mitchelmore, the CEO of the MRIA, regarding Senate Bill S-4, the Digital Privacy Act. The official opinion will shortly be available but for those of you who can’t wait, here is the basic gist of it. Any inaccuracies here are my own.

      1) Breach notifications should be mandatory, and the Privacy Commissioner should be the unbiased third party that determines what constitutes a real risk of significant harm to an individual.

      2) The MRIA supports the provisions in the bill which add clarity to what is valid consent. The committee may be interested in our code of conduct, which contains a section on the ethical issues in dealing with children and young people.

      3) The MRIA is pleased that PIPEDA will be amended to allow the transfer of personal information from an organization to a prospective purchaser or business partner (think mergers and acquisitions).

      4) The MRIA does not support allowing organizations to share the personal information of individuals with other organizations without consent. It should follow due process, such as through a court order.

      Senate Committee agenda

      5) The MRIA would like to close a loophole that allows organizations to share personal information, without consent, with an investigative body or government institution. This should follow due process, such as through a court order. After we spoke, Michael Geist, a law professor at the University of Ottawa, made numerous excellent points (Michael’s website). Some of his comments are included here (any errors or misrepresentations are my own).

      Data, Data Everywhere The Need for BIG Privacy in a World of Big Data by Ann Cavoukian, Ph.D., Information and Privacy Commissioner of Ontario, Canada #FOCI14 #MRX #GreatTalk

      Live blogging from the #FOCI14 conference in Universal City. Any errors or bad jokes are my own.

      8:45 KEYNOTE Data, Data Everywhere The Need for BIG Privacy in a World of Big Data 
      Ann Cavoukian, Ph.D., Information and Privacy Commissioner of Ontario, Canada

      • big data and privacy are complementary interests
      • privacy by design is a win win proposition
      • if you don’t address privacy concerns, there will be a backlash
      • privacy = personal control, freedom of choice, informational self-determination, context is key
      • www.privacybydesign.ca
      • in 2010, a landmark resolution was passed to preserve the future of privacy; it has been translated into 36 languages because people are so desperate for this information
      • essence of it is to change the emphasis from a win-lose model to a win-win model, replace ‘vs’ with ‘and’
      • you must address privacy at the beginning of a program, embed it into the code at the beginning
      • 7 principles –
        • be proactive not reactive, preventative not remedial
        • default condition needs to be privacy
        • privacy embedded into design
        • full functionality, positive sum not zero sum
        • end to end security, full lifecycle protection, from the outset, from collection to destruction at the end
        • visibility and transparency, keep it open, tell customers what you’re doing, don’t let them learn afterward
        • respect for user privacy, keep it user centric
      • Big data will rule the world – honeymoon phase, everything else must step aside, forget causality, correlation is enough
      • Then the honeymoon phase ends – found data… digital exhaust of web searches, credit card payments, mobiles pinging the nearest phone mast; these datasets are cheap to collect but they are messy and collected for disparate purposes
      • Big data is now in the trough of disillusionment
      • Google Flu Trends used to work and now doesn’t because Google engineers were interested not in context but in selecting statistical patterns in the data – correlation over causation, a common assumption in big data analysis, which imputes causality incorrectly
      • MIT professor Alex Pentland has proposed a New Deal on Data – individuals to own their data and control how it is used and distributed
      • data problems don’t disappear just because you are working with big data instead of small data, you can’t just forget about data sampling
      • Forget big data, what is needed is good data
      • data analytics on context free data will only yield correlations, add context and then you might be able to impute causality
      • once businesses have amassed personal information, it can be hard if not impossible for individuals to know how it will be used in the future – “A long way to privacy safeguards” New York Times Editorial
      • privacy is not a religion – if you want to do nothing, you can do nothing. but let people choose to do something
      • people now have to resign when data breaches happen, you must address them at the beginning
      • privacy should be treated as a business issue, not a compliance issue. gain a competitive advantage by claiming privacy, lead with it
      • proactive costs money but reactive costs lawsuits, brand damage, loss of trust, loss of consumer confidence
      • privacy drives innovation and creativity
      • privacy is a sustainable competitive advantage
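“Privacy as the default condition” from the principles above can be illustrated with a short sketch: a new account starts in its most protective state, and anything more permissive requires an explicit, user-initiated opt-in. The setting names here are invented for illustration, not from any real product:

```python
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    # Privacy by default: every switch starts in its most protective state.
    share_with_partners: bool = False
    public_profile: bool = False
    analytics_tracking: bool = False

@dataclass
class Account:
    name: str
    privacy: PrivacySettings = field(default_factory=PrivacySettings)

    def opt_in(self, setting: str) -> None:
        """Only an explicit, user-initiated action relaxes a default."""
        if not hasattr(self.privacy, setting):
            raise ValueError(f"unknown setting: {setting}")
        setattr(self.privacy, setting, True)

acct = Account("jane")
print(acct.privacy.share_with_partners)  # False: protective by default
acct.opt_in("analytics_tracking")        # explicit consent flips one switch
print(acct.privacy.analytics_tracking)   # True
```

Embedding the defaults in the data structure itself, rather than in later configuration, is the “embedded into design” idea: there is no code path that creates an account in a permissive state.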

      Hacking H(app)iness by John Havens #FOCI14 #MRX

      Live blogging from the #FOCI14 conference in Universal City. Any errors or bad jokes are my own.

      OPENING MORNING KEYNOTE
      Hacking H(app)iness – How to Give Big Data a Direction

      John Havens, Founder, The H(app)athon Project; Author, Hacking H(app)iness – Why Your Personal Data Counts and How Tracking It Can Change the World

      • http://happathon.com/survey/
      • What are YOU worth? What are WE worth? Money? Home life, family
      • I think therefore I am… I sync therefore I am – our identity is our data
      • It’s more than being on Facebook. Lots of people are on Facebook via photos and references even though they have never touched Facebook.
      • Lots of things happen without seeing them – sound waves, stress – but can be quantified regardless – facial recognition technology, MRIs
      • You can wear a device that measures your health or diet or fitness. Allows you to collect a lot of data without deciding exactly what you want to measure.
      • what is a data broker?  [I have no idea]
      • privacy should be considered as control, privacy is personal. do i have the right to see copies of data collected on me
      • the property that you collected, the data that you gathered, that’s ME.
      • get people to trust your use of their data and they will share more with you
      • people who are happier need less medication
      • hedonic happiness goes up and down as good and bad things happen; eudaimonic happiness is intrinsic well-being, such as altruism, which makes you feel like you have purpose
      • you can be choiceful about what you allow into your brain – you CAN turn off the 11 pm news; tell yourself three things you are happy about
      • “Do you want to go consume a movie?”   “Do you want to consume a barbie doll?”  This is not how people communicate with each other
      • Would you wear every wearable device if someone gave you $20 000? Only if you trust them [yeah, not happening for me!]
      • People think the word consumer is impersonal, commoditized, transactional, negative. Why do we keep on using this word?
      • Consumers WANT to be called guest, shopper, friend, client, valued, customer, person, partner, patron
      • Stop calling them consumers. the paradigm won’t change. the relationship won’t change.  [I’ve switched. I call people people now.]
      • What are you worth? Not money.
