Tag Archives: Jon Puleston

Prediction Science: How Well Can We Predict the Future? Jon Puleston, Lightspeed GMI #NetGain2015 #MRX

Live blogging from the Net Gain 2015 conference in Toronto, Canada. Any errors or bad jokes are my own.

Prediction Science: How Well Can We Predict the Future?

Jon Puleston, VP of Innovation at Lightspeed GMI

  • [Jon had the room take a quiz on a random series of questions]

  • What has he learned about predictions? We're better at predicting certain things, like behaviours. Not so good at predicting prices.
  • You can isolate people who are good at predicting, perhaps find them and use their super power 🙂
  • Prediction isn’t dependent on sample size. One person can be sufficient. It’s more about sample diversity and the intelligence of that sample.
  • 16 is a crowd if they are all well-informed. That’s all you need.
  • You could ask 1000 people or 5 people about the weather next week, but you really only need to ask the 1 person who saw a weather forecast.
  • How do you aggregate crowd wisdom? Mean, median, or mode?
  • Crowds mean that errors get distilled out
  • A crowd of people predicted the price of the iPad within 1%
  • But without any knowledge, the crowd is ignorant. People today can’t predict the weight of a cow.
  • Prediction is littered with cognitive biases – 68% of people will say a tossed coin will land heads because we always hear ‘heads’ first
  • People’s preferences for wine depend on whether you ask “Who prefers red wine?” vs “Who prefers white wine?”
  • People who check their emails before breakfast are more likely to say people check their email before breakfast
  • Emotions get in the way of making valid predictions. We are more positive about our own teams vs other teams when predicting score counts.
  • “Do you clean up after a meeting?” vs “Do people clean up after meetings?” People say they do but they don’t. [I do. Even when I wasn’t in the meeting.]
  • Read: Expert Political Judgment by Philip Tetlock – http://www.amazon.com/Expert-Political-Judgment-Good-Know/dp/0691128715
  • Only 48% of stock market gurus’ predictions were correct
  • People who ‘bet’ 1 unit aren’t as good as those who bet 2, and betting 3 is a little better still. People who bet against the crowd are really good.
  • After 15 people predict and others can see those predictions, newcomers start to predict the same way and the answers stop moving.
  • But in an independent voting method, larger surveys are better than 15 people
  • Best predictive market situations allow sharing of information, let people discuss and debate
  • e.g., in a “guess which mug is most popular” exercise, someone suggests mugs are good gifts, someone suggests lots of people garden, and the group decides the gardening mug will be most popular
  • Think of boardroom meetings where people vote on a decision without discussing it first. Stray comments are problematic.
  • Try dividing the herd into three groups and then recombining them into one. It helps improve accuracy, just like how we run 3 focus groups, not 1.
  • Let people change their opinions in surveys [We NEVER let people do that!]
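The aggregation ideas above (mean/median/mode, errors getting distilled out of a crowd) can be sketched in a few lines of Python. The guesses and the outlier below are made-up numbers for illustration, not data from the talk.

```python
import statistics

def aggregate_predictions(estimates, method="median"):
    """Combine independent numeric guesses from a crowd.

    The median is robust to a few wild outliers; the mean works well when
    errors scatter symmetrically around the truth; the mode suits
    categorical predictions.
    """
    if method == "mean":
        return statistics.mean(estimates)
    if method == "median":
        return statistics.median(estimates)
    if method == "mode":
        return statistics.mode(estimates)
    raise ValueError(f"unknown method: {method!r}")

# Hypothetical price guesses around a true value of 500,
# including one wild outlier (640):
guesses = [480, 510, 495, 520, 640, 470, 505]
print(aggregate_predictions(guesses, "median"))          # 505
print(round(aggregate_predictions(guesses, "mean"), 1))  # 517.1
```

Note how the median shrugs off the single wild guess while the mean gets pulled toward it – one reason independent, diverse guesses matter more than sheer sample size.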

Could We Be More Innovative About How We Innovate? by Jon Puleston #CASRO #MRX

Live blogging from the #CASRO tech conference in Chicago. Any errors or bad jokes are my own.

Could We Be More Innovative About How We Innovate? by Jon Puleston, VP of Innovation, Lightspeed

  • much advice about innovation is common sense
  • what IS innovation: creation of a new viable offering, doing something different that has impact
  • ten types of innovation, spanning how you price (profit model), network, structure, process, product performance, product system, service, channel, brand, and customer engagement
  • we focus mostly on product performance and product system innovation, such as what Dyson and Microsoft do
  • 80% of feedback in the GRIT report is on product innovation
  • we need to combine as many innovative approaches as we can
  • Gillette is innovative in its pricing model; UPS is innovative in its network
  • workshops are more than idea generation, they are establishing a culture that acts as arrowheads to drive through innovation; process of disparate groups with different skills helps establish a culture of collective problem solving
  • companies like Google, IBM, and GE allocate real time to innovation, not just production
  • top-down commitment to innovation – the board backs headline innovations, builds them into the financial model, and plans for them to be funded
  • don’t be distracted by low-hanging fruit – incremental innovations (Kaizen) represent 85% of innovation but rarely deliver competitive advantage. Big innovation leads to differentiation despite being risky.
  • think of something no one else has done and then do it
  • back a mix of ideas, not just the blockbuster big bets but also the promising mid-range ideas
  • use the right framework to evaluate innovation – new ideas can’t compete with established models on established metrics. Don’t assume it will sell itself or that you’ll have no competitors; review and revise assumptions along the way.
  • you CAN disinvest in an idea along the way
  • “The Innovator’s Dilemma” by Clayton Christensen is highly recommended
  • don’t let the core business compete against innovation because the core business will always win
  • most big innovative ideas come at moments of crisis, when people face deadlines and they have run out of solutions
  • reward process not success, allow people to fail

Other Posts

Best of #Esomar Canada: Jon Puleston Games a Better Survey #MRX

This is one of several (almost) live blogs from the ESOMAR Best of Canada event on May 9, 2012. Any errors, omissions, and ridiculous side comments are my own.

————————————————————————————–
Jon Puleston, VP GMI

Creative survey design and gamification

  • We’ve taken a journey from massive pages of boring, hard-to-read text to simple, clear phrasing in 140 characters
  • Now we need to do the same for surveys and that doesn’t mean just throwing a thumbs up picture on your survey. (Massive giggling in my head. I know that tactic extremely well!)
  • Survey writers are competing against Twitter and Farmville and that’s why we see a horrible decline in completion rates
  • We need to see surveys as a creative art form but we are still at the 1980’s whacky clip art and funky page transition stage
  • We need to let respondents give feedback in the way they want
  • A typical survey starts with a terribly boring long question of precisely what we want to know. We forget just how important foreplay is (ah yes, joke intended)
  • Think about survey design as a television interview with a celebrity. Would you ask George Clooney “On a scale from 1 to 10, how much did you enjoy making that last film?”
  • A first bad question means respondents will have no respect for the rest of the questions.
  • Imagery is very important and yet we just slap on a really dumb smiley face or thumbs up. Why haven’t you used a design firm to do this? It’s not about making it look nice. It’s about communicating effectively.
  • Must grids dominate every survey? They don’t capture the imagination of respondents.
  • The average person spends 4.3 seconds considering their answer to a question. For a grid, it drops down to 1.7 seconds.
  • We are researchers and yet we don’t use research to design better surveys. We rarely pilot surveys any more even though it’s extremely easy to pilot an online survey. You can get 15 completes and make tweaks all within an hour.
  • Now for a game. Jon took two volunteers who weren’t as familiar with gamification.
  • The first woman was asked a series of ordinary questions: Tell me about Toronto. What’s your favourite meal? What’s in your fridge? What would you do with a brick? Jon responded with “um”, “ok”, and stared at his notes most of the time. Then he complained that she was too creative and ruined his demonstration. (Yup, that’s how live demos ALWAYS work. 🙂 )
  • The second person was then brought into the room and asked about his family, asked to write a postcard home to his family, to imagine he’d been convicted and is on death row and must describe his last meal, to name his favourite foods within two minutes (anything not named can never be eaten again), and to see who can think of more uses for a brick.
  • Game conclusion: you can get the same kind of data, but more of it and in more detail if you are creative.
  • The approach you use to ask a question can enthuse and excite people to give answers.
  • Book on gamification: “Reality Is Broken” by Jane McGonigal
  • Check out website gamification.org
  • What defines a game? Anything that involves thinking and that we do for fun. Games have rules, skill, and effort.
  • Gardening is a popular game for older age groups. There are rules for planting seeds, skill and effort to get a wonderful garden. (heeeey, was I just called old?)
  • Twitter is a game that tries to get you to say your thought in a short space and then get the message out
  • Email is a game: do you get the feedback you wanted?
  • Surveys are games…… just rather dull ones
  • In a face to face interview, people can’t just turn their backs and walk away. We’re still working on that heritage.
  • Don’t ask what your favourite colours are – ask what colour you’d paint your room
  • Don’t ask what you want to wear – ask what you’d wear if you were going on television
  • Don’t ask what do you think of this product – ask what would you value this company at
  • “Describe yourself” vs “describe yourself in exactly 7 words”. It’s liberating and you’ll actually get more information.
  • Don’t ask how much you like these music artists – ask which of these artists you’d put on your playlist if you owned a radio station
  • Add in a competitive element: instead of “what brand comes to mind”, try “how many brands can you name in two minutes?”
  • Rewards. Give them points for answering questions correctly. Stake some ‘money’ on which brand matches this logo. These allow people to be more circumspect instead of overly positive about everything, and make it easier to give a negative answer.
  • The best games encourage us to think and we like thinking.
  • But remember, games can affect data. Greater thought and consideration changes answers. Point scoring can steer data badly as people try to cheat the system. (This is the same with any research. Once you improve the research, the historical norms, whether they are correct or incorrect, are no longer applicable. But… if the survey was horribly boring, just how valid were those results? Be honest with yourself!)
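The point-staking mechanic described above can be sketched as a tiny scoring rule. The win-the-stake/lose-the-stake rule and the numbers here are hypothetical, just to show why staking nudges respondents away from rating everything positively.

```python
def score_staked_answer(stake, correct):
    """Hypothetical scoring rule: win the staked points if the
    respondent matched the brand to its logo, lose them otherwise.
    Having something to lose encourages more careful answers."""
    return stake if correct else -stake

def respondent_total(answers):
    """answers: list of (stake, correct) pairs for one respondent."""
    return sum(score_staked_answer(stake, correct) for stake, correct in answers)

# One respondent stakes 10 points on a confident match and 2 on a risky one:
print(respondent_total([(10, True), (2, False)]))  # 8
```

Because a wrong answer costs the stake, a respondent only bets big when genuinely confident – which is exactly the circumspection the bullet above describes, and also exactly the incentive that can steer data if people learn to game it.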