Tag Archives: panels

Do panel companies bother to manage their panels? #AAPOR

One of yesterday’s #AAPOR sessions focused on the data quality of online panels. One of the speakers posited that panel companies perhaps don’t know or don’t care how their panels are managed. This could not be further from the truth.

I’ve been on the management team for several national and global panels, and have also worked with a number of panel managers from competitive panel companies.

The amount of care and expertise that these people put into managing their panels is astonishing. On a daily basis, these folks are analyzing and figuring out how to respond to things like:
– tenure: how long people have been on the panel as of today, and which demographics have been there shorter or longer
– response rates: the newest rates by survey, by demographic, by survey category, by client
– supplier health: depending on where a panelist was sourced, whether any suppliers deliver better or worse data, or panelists who stay on the panel longer
– data quality: which people are providing better or worse data, by source, by category, by everything
– invites: which demographics are getting more or fewer invites, and who is being ignored or bothered

And of course, all of these data, and many more, factor into panel rules dictating how many invites individuals are allowed to receive, whether a rule needs to change temporarily or permanently, and whether it needs to change by demographic or by source.
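
To make that concrete, here is a minimal sketch of the kind of daily health check described above. The field names, the invite cap, and the sample records are hypothetical illustrations, not any panel company’s actual schema or rules.

```python
# A minimal sketch of a daily panel-health check. All names and thresholds
# here are hypothetical illustrations, not any vendor's actual system.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Panelist:
    age_group: str      # e.g. "18-24"
    source: str         # recruitment supplier
    invites_30d: int    # invites received in the last 30 days
    completes_30d: int  # surveys completed in the last 30 days

MAX_INVITES_30D = 8     # an illustrative invite cap, not an industry standard

def panel_health(panelists):
    """Response rate by demographic, plus who is bothered or ignored."""
    invites, completes = defaultdict(int), defaultdict(int)
    for p in panelists:
        invites[p.age_group] += p.invites_30d
        completes[p.age_group] += p.completes_30d
    rates = {g: completes[g] / invites[g] for g in invites if invites[g]}
    bothered = [p for p in panelists if p.invites_30d > MAX_INVITES_30D]
    ignored = [p for p in panelists if p.invites_30d == 0]
    return rates, bothered, ignored

panel = [Panelist("18-24", "supplier_a", 12, 2), Panelist("65+", "supplier_b", 0, 0)]
rates, bothered, ignored = panel_health(panel)
print(rates)  # {'18-24': 0.1666...}; one panelist over-invited, one ignored
```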

You know what, perhaps it would just be easier to read the ESOMAR 28 Questions document that most panel companies have answered. The moral of the story is that just because you aren’t familiar with what the companies are doing doesn’t mean they aren’t doing it.

Location Panels: Opting in to sharing your every movement by Andrea Eatherly #Qual360 #QRCA

Live blogging from the Qual360 conference in Toronto, Canada. Any errors or bad jokes are my own.

Location Panels: Opting in to sharing your every movement 
Andrea Eatherly, Head of Operations, Placed
  • privacy is a major issue, so the panel is triple opt-in – 1) downloading the app, 2) agreeing that the app will measure your location, 3) registering for the panel
  • rewards are provided – lots of people are getting my data anyway, so people might as well be rewarded for it
  • insights are all aggregated by demographics etc
  • in-store tracking is not opt-in, if your wi-fi is on, you are trackable
  • Opt-out approaches to location – little to no notification to consumer, no value exchange
  • Opt-in means they get demographics, they can send survey questions to create a loop
  • Understand store traffic, by demographics, in real-time, makes it possible to predict ups and downs in retail sales
  • Connect mobile to in-store activity – watch people search the internet at home and then see which stores they actually go into to shop for the product
  • can figure out whether people using certain apps are more likely to visit certain grocery stores
  • People who watched The Biggest Loser were more likely to visit fast food chains – an aspirational market
  • People who watched Survivor are more likely to go to gyms and fitness stores – good for marketing purposes
  • Can see which shopping apps different retailers’ shoppers use – Walmart vs Target vs Kohl’s vs JCPenney shoppers all prefer different apps
  • What other stores are your Starbucks shoppers going into – competitive stores?
  • Monitor whether ads people see on a computer result in people going to the retail store
  • Information can be used to target ads to the right people – your consumers like these specific stores, these categories, and these price points

Respondent Identity Verification with Non-Panel, Real-time Samples: Is There Cause for Concern by Nancy Brigham and James Karr #CASRO #MRX

Live blogging from the CASRO Digital conference in San Antonio, Texas. Any errors or bad jokes are my own.

"Respondent Identity Verification with Non-Panel, Real-time Samples: Is There Cause for Concern?"


Nancy Brigham
As the research industry evolves toward non-panel sample sourcing and real-time sampling, questions have arisen about the quality of these respondents, especially in the area of respondent identity verification. This research addresses two key questions: Are fraudulent identities a concern with non-panel samples, and what are the research implications of assessing identity validation with these samples? This work examines identity verification and survey outcomes among five different levels of Personally Identifiable Information (PII) collection. In addition to the presenters, this paper was authored by Jason Fuller (Ipsos Interactive Services, Ipsos).

  • Nancy Brigham, Vice President, IIS Research-on-Research, Ipsos
  • James Karr, Director & Head of Analytics, Ipsos Interactive Services

James Karr
  • Do people whose identity cannot be confirmed provide bad data? Should we be concerned?
  •  What do we know about non-panel people? Maybe they don’t want to give PII to just take one survey. Will they abandon surveys if we ask for PII?  [I don’t see answering “none” as a garbage question. It’s a question of trust and people realizing you do NOT need my name to ask me my opinions.]
  • Is it viable to assess identity validation with non-panel sources?
  • In the study, PII was asked at the beginning of the survey [would be great to test end of survey after people have invested all that time in their responses]
  • Five conditions asking for combination of name, email, address
  • Used a third-party validator to check PII
  • 25% of people abandoned at this point
  • Only 4 out of 2,640 respondents gave garbage information at this point; 12 tried to bypass without filling it out and then abandoned. It’s so few people that this is hard to trust. [Hey people, let’s replicate]
  • Name and address caused 6% abandonment; name and email caused only 3% abandonment
  • Did people get mad that we asked this? Can we see anger in the concept test? No.
  • Didn’t lead to poor quality survey behaviours – used a 13-minute survey
  • When given a choice, people prefer to give less information – most people will choose to give name and email, and few people will give all information
  • Simply collecting PII didn’t appear to influence other aspects
  • Did their non-panel source give lower quality data? No. 82% passed the validation test across all conditions. Those who provide the most comprehensive data validate better, but that’s likely because it’s more possible to validate them.
  • Real-time sample gives just as good data quality, same pass rates, no data differences
  • Conclude that the screening question is necessary – a heads-up that the PII question will be coming
  • Younger ages abandoned more across all test conditions
  • This study only looked at the general population, not hard-to-reach groups like Hispanics, or different modes like mobile browsers or in-app respondents

Are There Perils in Changing the Way We Sample our Respondents by Inna Burdein #CASRO #MRX

Live blogging from the CASRO Digital conference in San Antonio, Texas. Any errors or bad jokes are my own.

“Are There Perils in Changing the Way We Sample our Respondents?”
Sample and panel providers are always looking to increase their active sample size. In recent years this has taken many companies out of email lists into real time sampling via ad banners or social networks. Research has revealed that panelists recruited by such methods are substantially different than the panelists that opt into online panels. This study addresses the various methods panels implement to generate additional sample, and the tradeoffs these methods require. While there is a clear short term gain of added panelists, there may be long term loss of data stability and panel tenure.

  • Inna Burdein, Director of Analytics, The NPD Group, Inc.
  • Are there differences between people who take several surveys in a row and people who take surveys off your website?
  • Tested data from website survey, email survey, and follow-up survey
  • 1400 completes per group
  • Website takers are younger and newer.
  • Website takers express more interest in surveys and incentives [or they just like clicking a lot]
  • Website takers are more online, google a lot, lots of free time
  • Completion rates are highest for website takers, then follow-on surveys; email takers are last.
  • Website takers are more satisfied – easy, reasonable, interesting
  • Website takers have more inconsistencies and more failures to follow instructions. Follow-ons are more likely to straightline and opt out of responding.
  • Website panelists report more purchases, more store visits, more browsing stores, more online purchases, make home improvements, redecorate, go on vacation, invest in stock market
  • [More likely to report purchases does not mean more likely to purchase]
  • One follow-on is kind of normal, but two follow-ons is where the differences happen: more unhappiness, more non-purchase, more straightlining, more use of ‘none of the above’
  • Significant differences do emerge [but I wonder how many are truly meaningful, would you run your business differently if you got number A vs number B]
  • Are there perils in changing the way you sample? It depends. You need enthusiastic responders and more representativeness. Tell people to answer on the website. Possibly balance on channel.
  • Follow-ons may hurt sample quality if no limit is set – time spent, number of surveys, what is the right rule?

Reaching for the Holy Grail: Validation Techniques for Online B2B Panels by Edward Paul Johnson #CASRO #MRX

Live blogging from the CASRO Digital conference in San Antonio, Texas. Any errors or bad jokes are my own.

“Reaching for the Holy Grail: Validation Techniques for Online B2B Panels”
When completing surveys online, respondents have the ability to claim false credentials in order to qualify for higher-paid surveys. This research seeks to apply to the online panel space the same methods used to keep telephone B2B sample clean. The presentation seeks to provide conference attendees with a better understanding of the importance of recruitment source to quality; which validation techniques are effective; and the legal and privacy pitfalls to watch out for when validating business sample.

Edward Paul Johnson, Director of Analytics, SSI

  • Our clients want business professionals, qualified decision makers, informed on the topic, engaged and interested
  • We know client lists are skewed
  • We want it all online because it’s faster and convenient
  • Looking for B2B people, it’s hard to first find them. Then you have to reach the boss. Then you have to weed out the fraud to get rid of people who just want the incentives. Then you need to create a lasting relationship because they are valuable for future research as well.
  • Many times, you don’t even know the market size of the sample of people you are looking for – computer hardware purchasers?
  • Fraud is normally very small – only 1% to 2% of the overall panel. But fraudsters have an advantage: they qualify for more surveys than honest people do. There might be 20% fraud among business samples simply because more fraudsters qualify.
  • What are our weapons? Existing relationships (hotels, airlines, credit cards), data mining the profile to find contradictions, social media linking, phone validation
  • Every weapon is a double-edged sword – a bigger panel means lower quality and a smaller panel means higher quality, because bigger panels let in more fraudsters while smaller panels screen out honest people
  • Better to have multiple tests, one with high specificity and one with high sensitivity (see the sketch after this list)
  • Data mining the profile removed about 15% of panelists – number of reports and company size were important variables. An unusual ratio of company size to number of computers was helpful. Being over 45 years old with less than 1 year in the industry helped somewhat.
  • LinkedIn validation – 600 people volunteered to connect to LinkedIn, but profiles were often incomplete and email addresses were different. The number of connections and skills was helpful, but individual skills were too varied to be useful. Fraudsters likely don’t volunteer to connect their accounts. It wasn’t a good method.
  • Phone validation – a good confirmation test, but it excluded good panelists. Some gave bad phone numbers, the number was disconnected, or they no longer worked at that number. Not a good entry test.
  • Tips for phone validation – let them know you will call them at work. Call very close to when they joined, within 2 days. Keep the validation short, to 2 minutes: name, company, title. Use trained interviewers who know how to bypass gatekeepers. The gatekeeper might even be able to validate for you.
  • Validation DOES improve data quality, but existing relationships aren’t enough. Be careful of excluding good people – false positives can do just as much damage as false negatives.
  • It will never be perfect. There is no holy grail but you can improve it all the time.
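
For the curious, here is a rough sketch of two of the ideas above: how 1–2% panel-level fraud can balloon to roughly 20% of a niche B2B sample, and how a high-sensitivity screen might be paired with a high-specificity confirmation test. All numbers and profile fields are made-up illustrations, not SSI’s actual rules.

```python
# (1) Why 1-2% panel-level fraud can become ~20% of a niche B2B sample.
# All rates below are illustrative assumptions, not measured values.
base_fraud = 0.015   # fraud rate in the overall panel
q_honest = 0.05      # honest panelists who genuinely qualify for the screener
q_fraud = 0.80       # fraudsters claim whatever credentials qualify
share = (base_fraud * q_fraud) / (
    base_fraud * q_fraud + (1 - base_fraud) * q_honest)
print(f"fraud share among qualifiers: {share:.0%}")  # ~20%

# (2) Pairing a high-sensitivity screen with a high-specificity confirmation,
# so a panelist is excluded only when both tests agree. The profile fields
# and rules are hypothetical.
def entry_test(profile: dict) -> bool:
    """High sensitivity: flag anything odd, tolerating false alarms."""
    return (profile["num_computers"] > 50 * profile["company_size"]
            or (profile["age"] > 45 and profile["years_in_industry"] < 1))

def confirmation_test(profile: dict) -> bool:
    """High specificity: exclude only on strong evidence."""
    return profile["phone_validation_failed"]

def exclude(profile: dict) -> bool:
    # Exclude only when the sensitive screen AND the specific check agree.
    return entry_test(profile) and confirmation_test(profile)
```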

The Best Panel is a Census Rep Panel, NOT! #MRX

When you’re commissioning a new survey project, it can be hard to select the best survey panel for the job. There are many criteria to judge including response rates, data quality processes, panel sizes, and panel make-up.

Response rates are clear. If panelists don’t answer surveys, you get no completes. Data quality is clear. If panelists are speeders or random responders, you get garbage data. If the panel isn’t large enough, you don’t get enough completes. But what about panel make-up?

What does the ideal panel look like? One of the most common misconceptions is that a survey panel should be census rep. Therefore, a Canadian panel should have the same demographics as the Canadian census and a US panel should have the same demographics as the US census. Unfortunately, this is NOT the ideal make-up of a survey panel.

Let’s think about the kinds of samples that we want to survey. Certainly, many survey projects are interested in census rep samples. Political surveys and social surveys absolutely need to understand how a census representative population feels. But think a bit more. How often do you need samples of 1) males 18 to 34, 2) mothers of teenagers, 3) mothers of babies, or 4) adults aged 65 and older? Those types of samples couldn’t be further from census rep, and yet they are more representative of the types of samples that researchers need – the types of people they need to have on a survey panel.

So here is what the ideal panel looks like: it looks like what survey researchers need. If researchers send 25% of their surveys to people aged 18 to 24, then 25% of their panel should be aged 18 to 24. (It’s actually more complicated than this, as we must take into account that young people have lower response rates, and therefore the panel should probably be 35% aged 18 to 24.) Similarly, since older people are less often the target of surveys, they should be underrepresented on a panel compared to census. (And even MORE underrepresented, because their response rates are much higher than average.)
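
Here is a minimal sketch of that allocation logic. The 25% demand share and the roughly 35% result come from the example above; the other demand shares and all the response rates are assumptions made up for illustration.

```python
# Ideal panel composition: a group's panel share is its share of survey
# demand divided by its response rate (low responders need extra headroom),
# then renormalized. All numbers except the 25% are illustrative.
survey_demand = {"18-24": 0.25, "25-44": 0.45, "45-64": 0.20, "65+": 0.10}
response_rate = {"18-24": 0.10, "25-44": 0.15, "45-64": 0.20, "65+": 0.30}

raw = {g: survey_demand[g] / response_rate[g] for g in survey_demand}
total = sum(raw.values())
for g, r in raw.items():
    print(f"{g}: {r / total:.0%} of the panel")
# 18-24s land near 37% - well above their 25% share of demand, much as the
# post suggests - while the 65+ group falls to roughly 5%.
```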

The reason for this comes down to the annoyance factor. If survey panels were census rep, we would have a lot of very annoyed, very frustrated younger people. They would be receiving far more surveys than other people, and the demand on their time would be much greater. On the other hand, older people, who aren’t the focus of as many research objectives, would receive far fewer surveys and could easily become disengaged and disappointed at the lack of involvement. Neither of these situations is ideal.

So the next time you’re considering a research panel, don’t ask the providers if it’s census rep. Ask instead about the average number of invitations each panelist receives. Find out if some people receive 5 survey invites every week while other people receive only 5 invites every year. Find out if they treat their panelists nicely.

Ethical Framework for SMR, Panel #SoMeMR #MRX #li

10.15 PANEL: Establishing an ethical framework for social media research

  • Tracking developments towards new standards for social media research in Europe and the US
  • Dealing with privacy issues: Assessing attitudes towards privacy in online environments
  • Evaluating ethical guidelines for blog and buzz mining
  • When is engagement with commenters necessary?
  • Evaluating appropriate codes of conduct for qualitative approaches to harvesting data via social media channels

Barry Ryan, Standards & Policy Manager, MRS
Jillian Williams, External Relations Team Leader, Highways Agency
Pete Comley, Chairman, Join the Dots (formerly Virtual Surveys)
Simon McDonald, Business Director, Insites Consulting

  • Barry – Data protection: there is identifiable data that must be protected. Copyright: blog posts, photos, and videos are all covered under the copyright act. Terms of service of individual sites must be respected. Internal issue: how MR codes are constructed – participation is based on voluntary informed consent; that is our heritage. (I’m waiting to hear about the stance on observational research, and qualitative researchers’ abilities to summarize text.)
  • Simon – Does research industry need rules?  We know it’s a work in progress. We don’t want to be restricted from doing things that other companies are able to do. (e.g., buzz companies don’t have to listen to MR ethics)
  • Pete – Believes in guidelines. Was part of ESOMAR guidelines committee.  Not happy with MRS stance. Bodies should be forward looking and represent us. Tone of consultation document does not do that. Document is like Pope and Catholic church. Applying to the letter. MRS won’t stop us from doing SMR. Seriously risks splitting entire MR industry in the UK. It is that serious. Solutions? We must be more inclusive and representative. Must be provisions for MR to do SMR. Long term, MRS code of conduct is the problem. Informed consent is the problem. We are NOT interviewing people here. This is analyzing public data. Concept of informed consent does not apply. We need to relook at code of conduct.
  • Jillian – As a research client, anonymity is important. Masking isn’t satisfactory. The client does not want to be on the front page of the Daily Mail. The client will take the flak, not the industry. Clients want to comply with guidelines. Clients pay the bill.
  • Barry – The Data Protection Act is the problem. Informed consent is the first thing in it. There is no distinction between public and private. The MRS Code reflects the rules of the legislation. MRS made it easy for researchers to comply with the Data Protection Act. “This data is accessible” is not sufficient, but you can work with the data provider directly and then it’s ok. “Subject to data protection rules” is important. If MRS interpreted the law incorrectly, please tell them. [Call to MR company internal legal counsel – does anyone see if there have been misinterpretations of the Data Protection Act?]
  • Pete – Data Protection Act is pre-internet. How do we survive as industry until and if there are changes?
  • Barry – Whatever comes next will not be better. “The legislator knows best.”  People want the right to be forgotten (drunken photos should be deleted if the person wants them deleted).
  • Simon – Self-regulation is important. Dialogue with respondents means better quality data. Consent is important, but the best research is also important. His company had a problem where they friended people for research purposes, with multiple layers of consent, but of course Facebook lets you see friends of friends, and those friends hadn’t given consent.
  • Jillian – SMR is not necessarily representative.
  • Annie – I asked a question about whether observational research is not research since it’s not informed consent and whether masking is an assault on qualitative researchers who mask for a living.
  • Barry – This is not an assault on qualitative research. There are separate guidelines for qual research. Maybe the MRS heard nothing back from qualitative researchers and it’s not reflected in the paper. In the online space, everything is data: video, photo. Under the Data Protection Act, processing data is engagement. Masking is good for privacy, but it doesn’t rectify the potential unlawfulness of the act of taking the data.
  • Pete – Does everything really need to be masked? “I love McDonalds,” maybe not. Anything Pete says here, he risks it being written down and shared. (Hmmmm… watch out!) Going out of your way to NOT quote something written in the online public space seems odd. What do you do with the gray area of semi-private? The inconsequential “This hotel is lovely” doesn’t need masking, but someone’s sexual preference does need masking. It is a minefield.
  • Jillian – Doesn’t like masking at all. We want the insight from the quote as opposed to masked verbatims that aren’t exactly correct and could be misunderstood.
  • Question – why are companies doing this if it’s all illegal under data protection act? [Great question – are we waiting for someone to get sued and go to court before we get an update to the legislation?]

Kees de Jong: Panels are People #MRIA

Welcome to the virtual MRIA 2011 annual conference! This post reflects my personal musings and interpretations of this presentation. It was written during the presentation and posted minutes afterward. Any inaccuracies and silliness are my own.


Keynote Speaker:  Kees de Jong, CEO Survey Sampling Int. “Panels are people?”

  • Everyone is having trouble keeping up with the quality of presentations. Kees is up for the challenge!
  • Kees suggests maybe panels are heading for extinction. If you put all your information – join rates, attrition rates, response rates –  into Excel, you get a decreasing line. Panels are sick, hurt. Only good thing about panels is the community based panel. Panels are from the past, we don’t want to think about it, people are dropping out, hard to maintain.
  • 8 dimensions of why panels are in bad shape.
  • Business/Money: Panels were just a big database and money came out of it. 13 euros per complete 10 years ago. That was attractive at the time. Recently, the price dropped to a couple of dollars because of competition. Enormous focus on the monetary component. Panel buying belongs to the procurement department, which just cares about price, not quality.
  • Quality: Are panels in bad shape because of quality issues? The topic of probability samples has just arrived. 🙂 The Netherlands did a study of 29 panels. Response rate had no effect on data quality. The discovery of multiple panel membership was a shock – the ocean for fishing was smaller than they thought. Inattentiveness and professional respondents became new terms. Kim Dedecker now mentioned. 🙂 The US over-reacted and everyone jumped on this, not because they were intrinsically motivated, but because the woman with the wallet had spoken. And then tech companies tried to jump in.
  • Discussions were from the wrong perspective. Whatever you focus on grows. If you focus on bad respondents, then bad respondents grow. We badmouth the millions of people doing the right thing and helping the industry.
  • Care: Will panels be extinct because we didn’t care enough? Panelists are most annoyed when they are invited to a survey and then get screened out, or they screen out after a few questions. Or when you’re English and get a German invite, etc. Broken links and sloppy work all show a lack of caring. We should have protected them better.
  • Experience: The survey itself. Robert Bain felt the environment of surveys was clunky and disrespectful to panelists. Survey length is the key, critical thing. Give people short surveys. It’s very simple. Did a study of 29 companies in the Netherlands. Response rate had no effect, but after 17 minutes, data quality decreased. We must decrease the length of surveys.
  • Rewards: We need to understand what drives people. Points, money, it’s different for everyone.
  • Are we headed for extinction? Maybe. Ten years ago, there was nothing else online but surveys. Surveys were fun. Now, the world has changed. Competition for attention has exploded. There are still people who love to take surveys, but outside that group, it’s hard to get people engaged. We should worry about this.
  • How do we solve this? We innovate. Not just technology but changing how we think and deal with issues.
  • What if there are inmates in a prison taking surveys to make money for the prison? Scary! Technology takes care of that. Phew!
  • There is no standard for validation even if some companies claim they have THE standard.
  • New council RVC, Research Validation Council, for standards of practice to rate respondents and surveys, to be officially announced next week. We need some standardization. Has support from ESOMAR and CASRO.
  • Rethink, redesign, rewrite surveys to get them shorter and better. Don’t screen people out! Route them to a secondary survey. This is the right solution.
  • Last advice: EVERYONE should take their own survey. Why such long attribute lists? That’s not how humans work. And certainly not 20 grids for 20 brands. Stick to one topic. Teach clients that data quality decreases after 17 minutes. Rethink what you do with data. We are used to thinking in records, but maybe we should start thinking in datapoints – not one 40-minute survey but two 20-minute surveys or four 10-minute surveys. Increase the price of long surveys. Reward people who create great surveys.
  • We focus on people who want to be in panels, and this group is getting more and more skewed every day. Normal people aren’t on panels. The solution is not to focus on these people but on innovation. Focus on the billion people who are not on panels. Make the person, topic, and sender relevant. (Notice he didn’t say incentive.) Create dozens of environments so people will answer surveys in a relevant space.
  • Social media and panels are different people. Sample from everywhere not just your panel, blend sample, but make sure your sampling stream is balanced.
  • The holy grail is to sample the widest range of people based on location and behaviour, regardless of modality – mobile, inbox, Facebook, fully mixed mode.

I’m going to scream if you mention panel quality one more time #MRX

I received my research organization’s magazine today. Inside were many lovely articles and beautiful charts and tables. I quickly noticed one particular article because of all the charts it had, but the charts are not what caused my fury.

The article was YET ANOTHER one on panel quality. Yes, random responding, straightlining, red herrings. The same topic we’ve been talking about for years and years and years.

Now, I love panel quality as much as the next person and it is an absolutely essential component for every research panel. We know what the features of low quality are and how to spot them and how to remove their effects. We even know the demographics of low quality responders (Ha! Really? We know the demographics of people who aren’t reading the question they’re answering?) But this isn’t the point.

Why do we measure panel quality? Because the surveys we write are so bad that we turn our valuable participants into zombies. They want to answer honestly but we forget to include all the options. They want to share their opinions but we throw wide and long grids at them. They want to help bring better products to market but we write questions about “purchase experience” and “marketing concepts.”

I don’t want to hear about panel quality anymore. It’s been done to death. Low panel quality is OUR fault.

Tell me instead how you’re improving survey quality. How have you convinced clients that shorter is better and simpler is more meaningful? What specific techniques have you used to improve surveys and still generate useful results? Tell me this and I’ll gaze at you with supreme admiration.

Survey Panel Questions – Enough Already!

Survey panels are big right now. If you want to launch a survey to hundreds or thousands of people across the country, chances are you go to a survey panel company. They have pre-identified, permission-based access to people who are ready and willing to take surveys at the drop of a hat. When it comes to panel companies, I’ve had to explain the following two issues so many times that I thought I’d just lay them out right here.

1) Your panel size is too small to meet our needs
Then how can one panel company have over 1 million panelists while another stays in business with only 200,000 panelists? How does that work?
Well, let’s look at an example.

Company A has 1 million panelists and their response rate is 5%. If this panel launched to every single panelist right now, they would get 50,000 completes.
Company B has 200,000 panelists and their response rate is 25%. If they launched to everyone, they would also get 50,000 completes.
So basically, these two panels are identical! Different panel sizes, exact same outcome.
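
In other words, the “real” size of a panel is its nominal size times its response rate – a one-liner, sketched below with the post’s own numbers.

```python
# The "real" size of a panel: nominal size times response rate.
def effective_completes(panel_size: int, response_rate: float) -> int:
    """Completes you'd get launching to every active panelist at once."""
    return round(panel_size * response_rate)

print(effective_completes(1_000_000, 0.05))  # Company A: 50000
print(effective_completes(200_000, 0.25))    # Company B: 50000
```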

Why is Company A so much larger? Well, panels recruit thousands and thousands of people every year. Technically, they could advertise the size of their panel to be anything they wanted. But there’s a little thing called the panel rule that determines the real size of a panel.

A company could count anyone as ‘active’ on their panel as long as they clicked (and didn’t even finish!) a survey in the last 12 months. This panel will be really big, but since many of its panelists never even finished a survey, its response rates will be pretty low. On the other hand, another company could use a much stricter rule – say, people are ‘active’ only if they finished a survey in the last 3 months. In this case, a lot fewer people meet the qualification, but they are all survey completers, which means the response rates are quite a bit higher.

So it comes down to your panel size AND your response rate. Together, those tell you the real size of a panel.
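
And here is a sketch of how the panel rule changes the advertised number – the same database looks big or small depending on which rule you apply. The data layout and dates are hypothetical.

```python
# How the "panel rule" changes a panel's advertised size. The records and
# rule thresholds below are hypothetical, matching the post's two examples.
from datetime import date, timedelta

panel = [
    {"last_click": date(2012, 3, 1), "last_complete": None},
    {"last_click": date(2012, 5, 20), "last_complete": date(2012, 5, 20)},
    {"last_click": date(2011, 9, 2), "last_complete": date(2011, 9, 2)},
]

def active_count(panelists, field, window_days, today=date(2012, 6, 1)):
    """Count panelists who are 'active' under a given panel rule."""
    cutoff = today - timedelta(days=window_days)
    return sum(1 for p in panelists if p[field] and p[field] >= cutoff)

# Loose rule: clicked any survey in the last 12 months -> a big panel
print(active_count(panel, "last_click", 365))      # 3
# Strict rule: completed a survey in the last 3 months -> a small panel
print(active_count(panel, "last_complete", 90))    # 1
```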

2) Your panel doesn’t look anything like what I need
This one usually comes to me as “I need to know the demographic profile of your panel so that I can determine whether you can run my survey.” Well, the reason panel companies have such huge panels is so that they can pick and choose from among their panelists to create the group that you need. Even if their panel is only 40% male, they can easily choose a sample for you that is 50% male or even 80% male.
