Tag Archives: polling

How the best research panel in the world accurately predicts every election result #polling #MRX 

Forget for a moment the debate about whether the MBTI is a valid and reliable personality measurement tool. (I did my Bachelor’s thesis on it, and I studied psychometric theory as part of my PhD in experimental psychology, so I can debate forever too.) Let’s focus instead on the MBTI because tests similar to it can be answered online and you can find out your result in a few minutes. It makes intuitive sense, and people understand the idea of using it to understand themselves and their reactions to our world. If you’re not so familiar with it, the MBTI divides people into groups based on four continuous personality characteristics: introversion/extroversion, sensing/intuition, thinking/feeling, and judging/perceiving. (I’m an ISTJ, for what it’s worth.)

Now, in the market and social research world, we also like to divide people into groups. We focus mainly on objective, easy-to-measure demographic characteristics like gender, age, and region, though sometimes we also include household size, age of children, education, income, religion, and language. We do our best to collect samples of people who look like the census based on these demographic targets, and oftentimes our measurements are quite good. Sometimes we try to improve our measurements by incorporating a different set of variables like political affiliation, type of home, pets, charitable behaviours, and so forth.

All of these variables get us closer to building samples that look like the census, but they never get us all the way there. We get so close, and yet we are always missing the one thing that properly describes each human being. That, of course, is personality. If you think about it, in many cases we’re only using demographic characteristics because we don’t have personality data. Personality is really hard to measure and target, so we use age and gender and religion and the rest to stand in for personality characteristics. Hence the MBTI: the perfect set of research sample targets.

The MBTI may not be the right test, but there are many thoroughly tested and normed personality measurement scales readily available to registered, certified psychologists. They include tests like the 16PF, the Big 5, or the NEO, all of which measure constructs such as social desirability, authoritarianism, extraversion, reasoning, stability, dominance, or perfectionism. These tests take decades to create and are kept under veritable lock and key to maintain their integrity. They can take an hour or more to complete and they cost a bundle to use. (Make it YOUR entire life’s work to build one test and see if you give it away for free.) Which means these tests will not and cannot ever be used for the purpose I describe here.

However, it is absolutely possible for a psychologist or psychological researcher to build a new, proprietary personality scale which mirrors the standardized tests, albeit in a shorter format, and performs the same function. The process is simple. Every person who joins a panel answers ten or twenty personality questions. When they answer a client questionnaire, they get ten more personality questions, and so on, until every person on the panel has taken the entire test and been assigned to a personality group. We all know how profiling and reprofiling works, and this is no different. And now we know which people are more or less susceptible to social desirability. And which people like authoritarianism. And which people are rule bound. Sound interesting given the US federal election? I thought so.
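The rotating profiling described above is mostly bookkeeping, and can be sketched in a few lines. Everything here is hypothetical – the item bank, batch size, and the toy scoring rule are invented placeholders, not a real psychometric instrument – but it shows the mechanics: serve ten unanswered items per survey and assign a group only once the full scale is complete.

```python
# Hypothetical sketch of rotating personality profiling on a panel.
ITEM_BANK = [f"item_{i:02d}" for i in range(1, 41)]  # a made-up 40-item short scale
BATCH_SIZE = 10

class PanelistProfile:
    def __init__(self, panelist_id):
        self.panelist_id = panelist_id
        self.answers = {}  # item -> 1-5 Likert response

    def next_batch(self):
        """The next BATCH_SIZE items this panelist hasn't answered yet."""
        remaining = [item for item in ITEM_BANK if item not in self.answers]
        return remaining[:BATCH_SIZE]

    def record(self, responses):
        """Store the answers piggybacked onto a client questionnaire."""
        self.answers.update(responses)

    def personality_group(self):
        """Toy assignment rule: classify only once the scale is complete."""
        if len(self.answers) < len(ITEM_BANK):
            return None  # keep profiling on the next survey
        mean_score = sum(self.answers.values()) / len(self.answers)
        return "high" if mean_score >= 3 else "low"
```

After four ten-question batches the panelist is classified and drops out of the rotation; quotas could then be filled on the resulting group the same way they are filled on age or region today.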

So, which company does this? Which company targets people based on personality characteristics? Which company fills quotas based on personality? Actually, I don’t know. I’ve never heard of one that does. But the first panel company to successfully implement this method will be vastly ahead of every other sample provider. I’d love to help you do it. It would be really fun. 🙂


The Mechanics of Election Polls #AAPOR 

Live note taking at #AAPOR in Austin Texas. Any errors or bad jokes are my own.

Moderator: Lisa Drew, two.42.solutions 
RAND 2016 Presidential Poll Baseline Data – PEPS; Michael S. Pollard, RAND Corporation; Joshua Mendelsohn, RAND Corporation; Alerk Amin, RAND Corporation

  • RAND is nonprofit private company
  • 3000 people followed at six points throughout the election, starting with a full baseline survey in December, before candidates really had an effect: opinions of political issues and potential candidates, attitudes towards a range of demographic groups, political affiliation and prior voting, and a short personality questionnaire
  • Continuously in field at first debate
  • Recruited via RDD, then offered laptops or Internet service if needed
  • Asked people to say their chance of voting, and of voting for democrat, republican, someone else, out of 100% 
  • Probabilistic polling gives an idea of how people might vote
  • In 2012 it was one of the most accurate popular vote systems
  • Many respondents have been surveyed since 2006, providing detailed profiles and behaviours
  • All RAND data is publicly available unless it’s embargoed 
  • Rated themselves and politicians on a liberal to conservative scale
  • Perceptions of candidates have changed: Clinton, Cruz, and the average Democrat seem more conservative now, Trump more liberal; Sanders, Kasich, and the average Republican didn’t move at all
  • Trump supporters more economically progressive than Cruz supporters
  • Trump supporters concerned about immigrants and support tax increases for rich
  • People who feel that people like them don’t have a say in government are more likely to support Trump
  • Sanders now rates higher than Clinton on “cares about people like me”
  • March – D was 52% and R was 40%, but we are six months away from the election
  • Today – Clinton is 46% and Trump is 35%
  • Didn’t support Trump in December but now do – older, employed white men born in the US
  • People who were less satisfied with life in 2014 are more likely to support Trump now
  • Racial resentment and white racism predict Trump support [it said white ethnocentrism, but I just can’t get behind hiding racism in pretty words]
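The probabilistic polling bullets above (chance of voting, and of voting for each party, out of 100%) can be turned into a forecast by weighting each respondent’s candidate probabilities by their own stated turnout probability. A minimal sketch of the general idea with made-up respondents – not RAND’s actual estimator:

```python
def expected_vote_share(respondents):
    """Turnout-weighted expected vote shares from probabilistic answers.

    Each respondent reports p_vote (their chance of voting, 0-1) and
    p_choice, their chances of picking each candidate if they do vote.
    """
    totals = {}
    turnout = 0.0
    for r in respondents:
        turnout += r["p_vote"]
        for candidate, p in r["p_choice"].items():
            totals[candidate] = totals.get(candidate, 0.0) + r["p_vote"] * p
    return {c: weight / turnout for c, weight in totals.items()}

# Three invented respondents: a likely D voter, a toss-up R leaner,
# and an unlikely voter who barely moves the estimate.
sample = [
    {"p_vote": 0.9, "p_choice": {"D": 0.8, "R": 0.2}},
    {"p_vote": 0.5, "p_choice": {"D": 0.2, "R": 0.8}},
    {"p_vote": 0.1, "p_choice": {"D": 0.5, "R": 0.5}},
]
shares = expected_vote_share(sample)  # D ~0.58, R ~0.42
```

Because every respondent contributes fractionally, there is no hard likely-voter cutoff; the low-probability voter simply counts for little.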

Cross-national Comparisons of Polling Accuracy; Jacob Sohlberg, University of Gothenburg; Mikael Gilljam, University of Gothenburg

  • Elections are really great [ made me chuckle, good introduction 🙂 ]
  • We’ve seen a string of failures in many different countries, but we forget about the accurate polls; there is a lot of variability
  • Are some elections easier than others? Is this just random variance? [well, since NO ONE uses probability sampling, we really don’t know what the MOSE and MONSE are]
  • Low turnout is a problem 
  • Strong civil society has higher trust and maybe people will be more likely to answer a poll honestly
  • Electoral turnover causes trouble, when party support goes up and down constantly
  • Fairness of elections, when votes are bought, when processes and systems aren’t perfect and don’t permit equal access to voting
  • 2016 data
  • Polls work better when turnout is high, civil society is strong, electoral stability is high, and vote buying is low [we didn’t already know this?]
  • Only electoral turmoil is statistically significant in the multivariate analysis

Rational Giving? Measuring the Effect of Public Opinion Polls on Campaign Contributions; Dan Cassino, Fairleigh Dickinson University

  • Millions of people have given donations, it’s easier now than ever before with cell phone and Internet donations
  • Small donors have given more than the large donors
  • Why is Bernie not winning when he has consistently outraised Hillary?
  • What leads people to give money
  • Wealthy people don’t donate at higher rates
  • It’s like free to play apps – need to really push people to go beyond talking about it and then pay for it
  • Loyalty-based donors give money to the candidate they like, and might give more if they see her struggling
  • Hesitancy-based donors only give if they know they are giving to the right candidate, so they wait
  • Why donate when your candidate seems to be winning
  • Big donors get cold called, but no one gets personal phone calls if they’re poor
  • Horse race coverage is rational: coverage goes to people doing well, and we don’t really know about their policies
  • Lots of coverage on Fox News doesn’t mean someone is electable
  • People look at cues like that differently
  • In 2012 sometimes saw 5 polls every day, good for poll aggregators not good for people wanting to publicize their poll
  • You want a dynamic race for model variance
  • Used data from a variety of TV news shows, Fox, ABC, CBS, NBC
  • Donations under $200 don’t HAVE to be reported, and there were many zero-dollar contributions – weirdness that needed to be cleaned out
  • Predict contributions will increase when Romney is threatened in the polls
  • Predict small contributions will increase in response to good coverage on Fox News
  • Fox statements matter for small contributors, doesn’t matter which direction
  • Network news doesn’t matter for small contributors
  • Big donors are looking for more electable candidates, so if Fox hates a candidate then we know they’re electable and they get more money
  • Romney was a major outlier though, the predictions worked differently for him

2016: The year of the outsider #PAPOR #MRX 

live blogged at #PAPOR in San Francisco. Any errors or bad jokes are my own.

The Summer of Our Discontent, Stuart Elway, Elway Research

  • regarding state of Washington
  • it’s generally democratic
  • between elections, more people are independents and then they drift to democrat
  • independents are more socially liberal
  • has become more libertarian
  • don’t expect a rebellion to start in Washington state
  • [sorry, too many details for me to share well]

Californian’s Opinion of Political Outsiders, Mark Baldassare, PPIC

  • California regularly elects outsiders – Reagan, Schwarzenegger
  • flavour is often outsider vs insider, several outsiders have run recently
  • blog post on the topic – http://ppic.org/main/blog_detail.asp?i=1922
  • they favour new ideas over experience
  • 3 things are important – approval ratings of elected officials, people who prefer outsiders give officials lower approval, negative attitudes of the two party system
  • majority think a third party is needed – more likely to be interested in new ideas over experience
  • [sorry, too many details for me to share well]

Trump’s Beguiling Ascent: What 50-State Polling Says About the Surprise GOP Frontrunner, Jon Cohen & Kevin Stay, SurveyMonkey

  • 38% of people said they’d be scared if trump is the GOP nominee
  • 25% would be surprised
  • 24% would be hopeful
  • 21% would be angry
  • 14% would be excited
  • List is very different as expected between democrats and republicans, but not exactly opposite
  • quality polling requires scale, heterogeneity, and correctable self-selection bias
  • most important quality for candidates is standing up for principles, strong leader, honest and trustworthy – experience is lowest on the list
  • Views on Trump’s Muslim statement change by the minute – at the time of this data: 48% approve, 49% disapprove, split as expected by party
  • terrorism is the top issue for republicans; jobs AND terrorism are top for independents; jobs is top for democrats
  • for republicans – day before paris 9% said terrorism was top, after paris 22%
  • support for Cruz is increasing
  • half of trump voters are absolutely certain they will vote for trump; but only 17% of bush voters are absolutely certain
  • among republicans, cruz is the second choice even among trump voters
  • trump has fewer voters who go to religious services weekly, least of all candidates; carson and cruz are on the high end
  • trump voters look demographically the same but carson has fewer male voters and cruz has fewer female voters
  • trump voters are much less educated, rubio voters are much more educated

The impact of social #ESOMAR #MRX 

Live blogged from Esomar in Dublin. Any errors or bad jokes are my own.

When democracy fails to deliver by Ijaz Shafi Gilani and Jean-Marc Leger

  • what explains satisfaction and dissatisfaction with democracy
  • democracy is the worst form of government except for all the others – Winston Churchill
  • Failed as a norm? no
  • Failed in specific cases? yes
  • 75% of people believe democracy is the best 
  • 50% believe they are ruled by the will of the people
  • 35% of upper income americans believe a good way to govern is to have the army rule
  • Nat rep, 52 countries, n=50,000, surveys 10 years apart
  • countries who’ve practiced democracy the longest are most disillusioned
  • correlates of dissatisfaction include:
  • macroeconomic factors – economy, inequality, size of country
  • demographic factors – gender, age, education
  • identity factors – nationalism, patriotism, attitudes towards globalization
  • identity factors seemed to be most relevant for countries practicing democracy the longest
  • political rights and civil liberties have taken a back seat; now it’s become the flight of jobs and immigration
  • linked to the inability of government to cope with the “encroachment of globalization”; these people are most dissatisfied
  • does democracy fail to deliver in a globalized world?
  • democracy might need to reinvent itself

Ireland and same sex marriage by Eric Meerkamper and Aengus Carroll

  • Bill Gates says he is struck by how important measurement is to the human condition
  • we have a unique skillset and tools to measure
  • we have relied too heavily on the same respondent for too long – Dan Foreman
  • Random Domain Intercept Technology, based on typos people make in the browser address bar
  • 51 countries, 51 000 respondents
  • should same sex marriage be legal
  • seems like a safe question but in many parts of the world, this is a death penalty for you and even your family, people need anonymity to answer this question
  • across 8 other countries with marriage equality, only about 50% of the population wanted it, so it is still risky
  • about three quarters of people disagree with marriage equality in countries where sexual orientation can be a crime [naturally, you’ll be killed if you say otherwise!]
  • yes campaign: what kind of country do you want to grow up in, it’s about human rights, inclusion
  • no campaign wanted a civil partnership, not marriage, and said that kids need a mom and a dad
  • 72% of young voters wanted same sex marriage which matched the campaign they used, focus on young people
  • young people brought older people to come and vote
  • marriage was not the issue; the issue was discrimination and exclusion
  • this method allows safe measurement

Advancements of survey design in election polls and surveys #ESRA15 #MRX 

Live blogged from #ESRA15 in Reykjavik. Any errors or bad jokes are my own.

I decided to take the plunge and choose a session in a different building this time. The bravery isn’t much to be noted as I’ve realized that the campus and buildings and rooms at the University of Iceland are far tinier than what I am used to. Where I’d expect neighboring buildings to be a ten minute walk from one end to the other, here it is a 30 second walk. It must be fabulous to attend this university where everything and everyone is so close!

I’m quite loving the facilities. For the most part, the chairs are comfortable. Where it looks like you just have a chair, there is usually a table hiding in the seat in front of you. There is instantly connecting and always on wifi no matter which building you’re in. There are computers in the hallways, and multiple plugs at all the very comfy public seating areas. They make it very easy to be a student here! Perhaps I need another degree?

Designing effective likely voter models in pre-election surveys

  • voter intention and turnout can be extremely different: 80% say they will vote but 10% to 50% is often the number that actually votes
  • democratic vote share is often over represented [social desirability?]
  • education has a lot of error – 5% error rate, worst demographic variable
  • what voter model reduces these inaccuracies
  • behavioural models (intent to vote, have you voted – dichotomous variables) and resource-based models
  • vote intention does predict turnout – 86% are accurate, also reduces demographic errors
  • there’s not a lot of room to improve except when the polls look really close
  • Gallup tested a two item measure of voting intention – how much have you thought about this election, how likely are you to vote
  • 2-item scale performed far better than the 7-item scale, with error rates of 4% vs 1.4%
  • [just shown a histogram with four bars. all four bars look essentially the same. zero attempt to create a non-existent difference. THAT’S how you use a chart 🙂 ]
  • gallup approach didn’t work well; the probability approach performed better
  • best measure of voting intention = thought about the election + likelihood of voting + education + voted before + strength of partisan identity
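The recipe in the last bullet can be combined into a single turnout probability with a logistic model. This is a sketch only: the weights below are invented for illustration, not the presenters’ fitted coefficients, and each predictor is assumed to be rescaled to the 0–1 range.

```python
import math

# Hypothetical coefficients; in practice these would be fit on a past
# election where actual turnout is known.
WEIGHTS = {
    "thought": 1.5,            # how much thought given to the election
    "likelihood": 2.5,         # stated likelihood of voting
    "education": 0.5,
    "voted_before": 1.0,
    "partisan_strength": 0.8,  # strength of partisan identity
}
INTERCEPT = -3.0

def turnout_probability(respondent):
    """Logistic combination of the five predictors, each scaled to 0-1."""
    z = INTERCEPT + sum(WEIGHTS[k] * respondent[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

engaged = {k: 1.0 for k in WEIGHTS}     # thought a lot, always votes, ...
disengaged = {k: 0.0 for k in WEIGHTS}  # none of the above
```

Respondents above some probability cutoff (or weighted by the probability itself, as in the probabilistic approach the speaker preferred) then form the likely-voter electorate.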

polls on national independence: the scottish case in a comparative perspective

  • [Claire Durand from the University of Montreal speaks now. Go Canada! 🙂 ]
  • what happened in Quebec in 1995? referendum on independence
  • Quebec and Scotland are nationalist in a British type system, proportion of non-nationals is similar
  • referendums are won with 50% + 1
  • but polls have many errors; is there an anti-incumbent effect?
  • “no” is always underestimated – whatever the no is
  • are referendums on national independence different – ethnic divide, feeling of exclusion, emotional debate, ideological divide
  • the No side has to bring together enemies and doesn’t have a unified strategy
  • how do you assign non-disclosure?
  • don’t know doesn’t always mean don’t know
  • don’t distribute non-disclosures proportionally, they aren’t random
  • asking how people would vote TODAY resulted in 5 points less nondisclosure
  • corrections need to be applied after the referendum as well
  • people may agree with the general demands of the national parties but not with the solution they propose. maintaining the threat allows them to maintain pressure for change.
  • the Quebec newspapers reported the raw data plus the proportional response so people could judge for themselves
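The warning about distributing non-disclosures proportionally can be made concrete with a toy comparison. The poll numbers and the 75/25 break are invented for illustration, not a published correction, but they show why the allocation rule matters: it can flip the projected winner.

```python
def allocate_proportionally(yes, no, undecided):
    """Split non-disclosers in the same ratio as the decided vote."""
    decided = yes + no
    return yes + undecided * yes / decided, no + undecided * no / decided

def allocate_fixed_no_share(yes, no, undecided, no_share):
    """Assign a fixed fraction of non-disclosers to the No side."""
    return yes + undecided * (1 - no_share), no + undecided * no_share

# Raw poll: 45% Yes, 40% No, 15% non-disclosure.
prop_yes, prop_no = allocate_proportionally(45, 40, 15)        # Yes ahead
skew_yes, skew_no = allocate_fixed_no_share(45, 40, 15, 0.75)  # No ahead
```

Since “no” is systematically underestimated and non-disclosers aren’t random, the second rule – however the share is chosen – is closer in spirit to the correction the talk describes, and publishing both (as the Quebec newspapers did) lets readers judge for themselves.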

how good are surveys at measuring past electoral behaviour? lessons from an experiment in a french online panel study

  • study bias in individual vote recall
  • sample size of 6000
  • over-reporting of popular party, under-reporting of less popular party
  • 30% of voter recall was inconsistent
  • inconsistent respondents changed their recall, changed parties, had memory problems or concealing problems, said they didn’t vote, or said they voted and then said they didn’t, or vice versa
  • could be any number of interviewer issues
  • older people found it more difficult to remember but perhaps they have more voter loyalty
  • when available, use vote recall from the pre-election survey
  • using vote recall from the post-election survey underestimates voter transfers
  • use caution in using vote recall to weight samples

methodological issues in measuring vote recall – an analysis of the individual consistency of vote recall in two election longitudinal surveys

  • popularity = weighted average % of electorate represented
  • universality = weighted frequency of representing a majority
  • used four versions of non/weighting including google hits
  • measured 38 questions related to political issues
  • voters are driven by political tradition even if outdated, or by personal images of politicians not based on party manifestos
  • voters are irrational; the political landscape has shifted even though people see the parties the same way they were decades ago
  • coalition formation aggravates the situation even more
  • discrepancy between the electorate and the government elected

If math is hard, you can always do qualitative research #MRX 

Yup, I heard that from a speaker on stage at the recent AAPOR conference. You see, because if you’re smart, then you’re probably doing quantitative research. Because quant is for smart people and qual is for dumb people. Because qual requires no skills, or at least the skills are basic enough for non-mathematical people to muddle through. Because qual isn’t really a valid type of research. Because nonprobability research is worthless (yup, I heard that too).

Perhaps I’ve taken the statement out of context or misrepresented the speaker. Perhaps. But that’s no excuse. Qual and quant both require smart people to be carefully trained in their respective methods. Each research method is appropriate and essential given the right research objective. 

The marketing research industry has largely moved past this pointless debate over whose method is better and right. (Neither.) Now it’s time for the polling industry to do the same.

Media Influence on Public Opinion #AAPOR #MRX 

Prezzie #1: Do polls drive the news or vice versa

  • used many popular news stories – ground zero mosque, occupy, gays in military, etc
  • there was no clear relationship… it depends [so maybe that just means it’s random and we’re fishing for nothing]

prezzie #2: perceptions of news coverage among blacks and hispanics

  • survey done in 2014, it was pre Ferguson, oversampled the minority groups
  • the digital divide was not as expected, i.e., that minorities would have less internet access
  • blacks use TV and cell phone more
  • hispanics more likely to use cell phone, and less to use paper newspapers or computer/tablet
  • 78% overall use smartphones and actually, more blacks use smart phones for news gathering
  • the major difference is smartphone usage not race
  • diversity of content on digital sources has not happened yet
  • people do find it easier to keep up with news now compared to five years ago, same across all groups
  • but the finding is not so good when trying to find news about their own community (racial community)
  • 3 to 5% believe their community is not covered in the news
  • a quarter believe their community is not reported accurately in the news
  • the two groups go to different media to learn about their community: blacks go to local news organizations, hispanics go to ethnically focused media

prezzie #3: political conspiracies

  • for instance, the obama birth certificate, JFK assassination
  • sharing information does not work, they are motivated by reasoning, people want to believe it, they aren’t motivated to get to THE answer, they are motivated to get to THEIR answer
  • controversy debates may create the perceptions of a controversy even when there isn’t one
  • counter arguments may lead people to hold their beliefs more strongly
  • global warming coverage on fox news is controversial in style, cnn is dominant covering vaccines, and birtherism was covered on NBC in a one-sided style
  • there is definite link between the types of news stations you watch and whether you believe in the hoaxes
  • increased attention to current events leads people to be less likely to endorse hoaxes, except when people are exposed to controversial coverage, which creates controversy
  • increased coverage by cnn saying that vaccines do not cause autism led to 4% increase in people believing vaccines cause autism

prezzie #4: how different types of voters respond to media reporting

  • is there a media agenda effect and does it influence individual agendas 
  • used survey data over 4 years since 2009, surveys every 3 months
  • the media agenda does affect individual agendas with regard to events but not for employment
  • but individual characteristics do moderate this
  • have to adapt model by event

prezzie #5: survey literacy and poll interpretation

  • how do people interpret polls in the media
  • credibility of results depends on source characteristics, poll characteristics, and person characteristics – media source ideology, poll quality, reporting transparency, political interest and knowledge
  • transparency initiatives – quality of the polls being reported, sampling details
  • used amazons mechanical turk for data preparation
  • 2 issues were gun control and abortion which weren’t in the policy agenda during the coding phase, and fairly close to 50/50 issues
  • [more pages of huge regression results, sigh, come on aapor presenters, we can do better than this]
  • credibility was affected by media source
  • it also matters whether the general public is evenly split or more extreme
  • education also had a significant effect
  • motivated reasoning plays an important role in credibility, transparency matters for consumers and for experts, transparency increases credibility

Liked this session 🙂

Evaluating polling accuracy #AAPOR #MRX 

moderated by Mary McDougall, CfMC Survox Solutions

prezzie 1: midterm election polling in Georgia

  • georgia has generally been getting more media attention because it is bluer than expected, change may be fast enough to outpace polls, population has changed a lot particularly around atlanta, georgia has become less white
  • telephone survey using voter registration info, tested three weights – voter data, party targets, education weights
  • registered voting weight was much better, education weighting was worse
  • voter weighting improved estimates in georgia, but you need voter information
  • [why do presenters keep talking about need more research for reliability purposes, isn’t that default?]

prezzie #2: error in the 2014 preelection polls

  • house effects – difference between one poll and every other poll, difference from industry average
  • house effects aren’t what they used to be; they used to be about interview method and weighting practices
  • regression model is better than difference of means tests
  • could it be whether the pollster is extremely active or if they only do it once in a while
  • results show the more you poll the more accurate you are, and if you poll more in risky areas you are less accurate – but overall these results were kind of null
  • a second model using just the number of pollsters was much better – arkansas had a lot more error, and it had the most pollsters
  • in the end, it can’t really be explained

prezzie #3: north carolina senate elections

  • to use RDD or registration-based sampling; will turnout be high or low; a small university has limited resources with highly talented competition
  • chose RBS and did three polls, worked saturday to thursday, used live interviewers, screen for certain or probably will vote
  • RBS worked well here, there were demographic gaps, big race gap, big party gaps

prezzie #4: opinion polls in referendums

  • [seriously presenters, what’s with these slides that are paragraphs of text?]
  • most polls are private and not often released, questions are all different, there is no incumbent being measured
  • data here are 15 tobacco control elections and 126 questions in total, courts forced the polls to be public, find them on legacy library website
  • five types of questions – uninformed heads up questions where you’re asked whether you agree or strongly agree [i.e., leading, biased, unethical questions. annie not happy!]
  • predictions are better closer to the election, spending is a good predictor, city size is a good predictor
  • using the word ‘strongly’ in the question doesn’t improve accuracy
  • asking the question exactly as the ballot doesn’t improve the accuracy
  • asking more questions from your side of the opinion doesn’t improve the accuracy 
  • polls often overestimate the winner’s percentage
  • [these polls are great examples of abusing survey best practices research]
  • post election surveys are accurate and useful for other purposes
  • [big slam against appor for not promoting revealing of survey sponsors]

prezzie #5: comparing measures of accuracy

  • big issue is opt-in surveys versus random sample [assuming random sampling of humans is possible!]
  • accuracy affected by probability sampling, days to election, sample sizes, number of fielding days
  • used elections in sweden, which has eight parties in parliament; many traditional methods are inappropriate for multi-candidate elections
  • sample size was a good predictor, fielding days was not predictive, and opt-in samples were worse, but the overall r-squared was very small
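With eight parliamentary parties, a single two-candidate spread can’t summarize accuracy. One common summary for multi-party races (an assumption on my part – the presenters may have used a different measure) is the mean absolute error across parties, in percentage points; the party labels and numbers below are invented.

```python
def mean_absolute_error(poll, result):
    """Average absolute gap, in points, between poll and actual result."""
    parties = result.keys()
    return sum(abs(poll[p] - result[p]) for p in parties) / len(parties)

# Toy eight-party example (vote shares in percent).
poll =   {"A": 30, "B": 25, "C": 12, "D": 9, "E": 8, "F": 7, "G": 5, "H": 4}
result = {"A": 28, "B": 26, "C": 15, "D": 7, "E": 8, "F": 8, "G": 4, "H": 4}
mae = mean_absolute_error(poll, result)  # (2+1+3+2+0+1+1+0)/8 = 1.25
```

A per-party average like this also makes polls with different party lists comparable, which a head-to-head spread cannot do.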

prezzie #6: polling third party candidates

  • why do we care about these? don’t want to waste space on candidates who only get 1% of the votes
  • 1500 data points, 121 organizations, 94 third party candidates – thank you to HuffPollster and DailyKos
  • aggregate accuracy was good, most were overstatement, but there was systematic bias
  • using the candidates’ names makes a difference, but if you name one candidate, you should name them all – i know i’m not voting for the top two candidates so i’m probably voting for this third party person you listed
  • accuracy gets better closer to the date, sometimes you don’t know who the third party candidate is till close to the date
  • live phone and IVR underestimate, internet overestimated
  • there were important house effects – CBS/YouGov underestimates; PPP overestimates; on average FOX news is fairly accurate with third party candidates

Panel Discussion on Political Polling & Media in Canada: “Election Polling in the West – Has it Changed The Research Industry For the Better?” #MRIA14 #MRX

Live blogging from the #MRIA national conference in Saskatoon, Saskatchewan. Any errors or bad jokes are my own.


Moderator: Steve Mossop, President, Insights West; Panelists: Éric Grenier, polling analyst and the author of ThreeHundredEight.com; Tim Olafson, Co-founder, Stone-Olafson; Scott MacKay, President – Probe Research Inc.; Lang McGilp, Senior Research Executive, Insightrix Research

  • [note – there is lots of debate and differences of opinions among the speakers, i have not indicated who said what]
  • many voters change their minds at the very last minute, political polling is not broken in canada
  • 22% of voters changed their mind in the last few days of the election
  •  we always ask “how would you vote today.” We need to ask the right questions
  •  there is still a role for public election polling, parties have the information so the public should have it as well
  • campaigns will be dominated by internal polls because they will put out poll results themselves
  • there is no more trust between the public and pollsters and we need to rebuild that trust; better polling costs money, money that we don’t have; we need more cooperation between pollsters and journalists so polling is reported the right way
  • pollster in-fighting looks terrible, media sees the fights and focus on the people who got it wrong
  • tired of giving away free polling to build up brand recognition
  • pollsters are doing a crappy job of setting the context; they focus on a single time frame or election and don’t look back at the last election, at society, at the bigger picture
  • we need to get rid of the ban on polling publication
  • industry needs to be less competitive and more open with best practices
  • we have civic reasons to do the polling, good for democracy
  • there are many people who want us to get polling wrong
  • there are too many free polls, angus reid in the west complains the most
  • some people think more pollsters is better – 280 pollsters were doing it in the US; comparatively, Canada has far fewer and probably only needs around 12
  • do engineers or lawyers offer free engineering and free lawyering? Free undervalues our work
  • some firms refuse to release any public poll that is not paid for
  • paid-for polls are more accurate because you ask more questions
  • we trivialize elections with so many polls based on insufficient survey questions; will the media cover the costs of a 60-question survey?
  • polls these days are just horse race measures
  • how can we prove that polling works? we’ve called elections accurately for the last 50 years, except when we’re wrong [margin of error, people]
  • polling used to be much more accurate; the record was unblemished. What happened? We started using online panels, and some panels aren’t good for this kind of research. The telephone method is not dead – it works well, and panels won’t work in smaller regions. Do not write off the phone at this point.
  • there are region specific panels that were built carefully, based on telephone recruit. These panels are extremely accurate.
  • method doesn’t matter; society has changed. It used to be the newspaper in the evening and news on TV at night; now news is instant, all day long
  • there is not a lot of telephone polling any more in Ontario, but any methodology can get it right
  • turnout determines the accuracy of polls; it’s luck
  • voter turnout is declining, especially among younger people, which means we will need to build likely-voter models; this is new for many people
  • some regions have publicly available voter lists that can be purchased to determine who has and hasn’t voted
  • is it intention or past behaviour that predicts best?
  • we don’t ask the right questions; we need to probe the undecided better, but we shouldn’t focus solely on undecided voters – they could be leaning heavily into one camp
  • maybe we don’t know what’s going on
  • how can we do a better job of predicting elections? Voter models, which we really haven’t been using [Seriously? You aren’t using models? I’m seriously shocked.]
  • a publication ban is not a polling ban; we should keep polling until the end so we get a better sense of what’s going on
  • perhaps publish your numbers as an exit poll
  • people dislike polling because ‘we’re wrecking democracy’ by telling people ahead of time what will happen
  • need more transparency, show the numbers, show the questions, show the weighting – this helps to avoid in-fighting
  • prediction markets – one happened in BC and followed the polls exactly, but it was wrong about the outcome, like all of the polls
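The weighting and likely-voter points above can be sketched concretely. This is a minimal, hypothetical post-stratification example – the age cells and all numbers below are invented for illustration, not taken from any real poll:

```python
# Hypothetical sketch of post-stratification weighting: give each
# demographic cell a weight so the weighted sample matches known
# population shares. All figures are invented for illustration.

def poststratify(sample_counts, population_shares):
    """Return a weight per cell: population share / sample share."""
    n = sum(sample_counts.values())
    return {cell: population_shares[cell] / (sample_counts[cell] / n)
            for cell in sample_counts}

# Invented example: young voters are under-represented in the raw sample.
sample_counts = {"18-34": 150, "35-54": 400, "55+": 450}          # n = 1000
population_shares = {"18-34": 0.28, "35-54": 0.35, "55+": 0.37}   # census

weights = poststratify(sample_counts, population_shares)
# 18-34 respondents get a weight above 1, over-represented cells below 1.
```

A likely-voter model would go one step further, scaling each cell’s weight by that cell’s expected turnout before tallying vote intention – the modelling step the panel says Canadian pollsters have barely started using.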


The Big Shift by Darrell Bricker #MRIA14 #MRX

Live blogging from the #MRIA national conference in Saskatoon, Saskatchewan. Any errors or bad jokes are my own.

Opening Keynote Presentation
Darrell Bricker – The Big Shift

  • What region is less likely to believe it is disrespectful to put the Canadian flag on their underwear – BC
  • What province is more religious – Atlantic
  • What region has most fire detectors, fire protection – Atlantic
  • What region is less likely to believe Canada would be better off if immigrants went back to where they came from – Alberta
  • What region cares most about environment – Atlantic
  • What region believes Anne Murray should be put on Canadian money – Manitoba, Saskatchewan
  • What region believes they belong to the country first, province second – Ontario
  • What is Canada’s favourite doughnut? – Boston Cream
  • You THOUGHT you knew Canada but it’s not what you thought
  • We used to be English and French and very white, more rural and focused on natural resources, driven by Montreal and Ottawa, trusting of authorities, fearful, paternalistic, conservative, judgmental, anti-American; we used to be equal male/female across the ages
  • The age pyramid we used to know is gone. Fewer kids being born, average Canadian lives a lot longer.
  • You need 2.1 kids per couple just to replace the population, but we are only producing 1.7. Japan is worse at only 1.4 (is this why Japan is obsessed with creating robots?). Germany’s population is shrinking.
  • Population of prairies is actually growing more now than in 1961, BC is growing much more than it did in 1961. This changes the balance of power.
  • Calgary is the fastest growing city in Canada.  Montreal has the oldest population, isn’t growing, high debt.
  • Why the surge? It’s immigration. Three times as many immigrants in 2010 as in 1986. We accept more immigrants than any other country in the world.
  • Far more immigrants are coming under the economic class; the family class is now far behind, and refugees even further behind. New software engineers, new pipeline workers, service jobs – these people are coming with skills.
  • They are coming from the Philippines, India, China, the UK, the US, France, Iran, the UAE, Morocco, and South Korea. In 1970 it was pretty much the UK, the US, the West Indies, Italy, and Portugal. Very, very different. They are settling around Toronto and the suburbs of western Canada; they aren’t going to Quebec or Atlantic Canada. We are becoming a Pacific population and a brown population. Families are all moving west as well because that’s where the opportunities are.
  • Toronto is 49% foreign born, Vancouver is 39% foreign born. Miami is 37%, LA is 35%, Montreal 20%, Halifax 5%.
  • Go Canada!


  • Why are immigrants coming here? We need them. By 2020, we’ll be one million skilled jobs short.

  • Canada is third in the percentage of people positively assessing the current economic situation in their country; western Europe is a disaster. Canada feels twice as good as the US: 60% positive vs 30% positive.
  • Immigrants feel we are polite, beautiful country, friendly, good health care, care about the environment, educated, different from Americans, well off, welcoming, tolerant of different people
  • Canada is third on the list of “country you’d move to that is at least five hours away and you’d live there for at least two years”
  • The NEW Canadian mindset
  • more urban, multi-cultural, world traders, older, more female, tolerant, opinionated, demanding, more ideologically divided, less engaged with traditional institutions, aggressively Canadian
  • We’ve had to learn how to have a French community and an English community live together; we are doing it without guns and fighting, and we don’t have massive civil disruption
  • Maybe you should market to single, older women – their housing, travel, personal needs
  • respect for every institution has declined, people are demanding more choice and more voice
  • new Canadian pride – less apologetic and more in your face
  • the emerging Canadian political landscape – consolidating right
  • old way to win elections was about Ontario and Quebec, keeping Quebec happy
  • in 2011, a new coalition emerged with the west, rural areas, and Toronto suburbs; Quebec’s population has declined and has less impact on elections; government is part of the problem now; new Canadians support the monarchy, not the ‘older’ Canadians
  • The NDP is now about to get over 25%, which means the Liberals can’t win.
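The fertility figures in the talk (2.1 needed, 1.7 actual, 1.4 in Japan) imply a simple geometric decline per generation. A rough back-of-envelope sketch – it ignores immigration and mortality, which is exactly the gap the talk says immigration fills:

```python
# Back-of-envelope: with fertility below replacement (~2.1 children per
# couple), each generation shrinks by the ratio fertility / 2.1.
# Illustration only - no immigration, no mortality changes.

def generations(start_pop, fertility, n, replacement=2.1):
    """Project population over n generations at a fixed fertility rate."""
    pops = [start_pop]
    for _ in range(n):
        pops.append(pops[-1] * fertility / replacement)
    return pops

# Canada at 1.7 vs Japan at 1.4, three generations on (start = 100)
canada = generations(100, 1.7, 3)  # each generation ~81% of the last
japan = generations(100, 1.4, 3)   # each generation ~67% of the last
```

At 1.7, each generation is roughly four fifths the size of the one before it; at 1.4, roughly two thirds – which is why, on the talk’s own numbers, the balance has to come from immigration.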
