Tag Archives: mobile

Questioning mobile methods and contradictory data #MRIA2017 

Live note-taking at the #MRIA2017 conference in Toronto. Any errors or bad jokes are my own.

Bridging the Survey vs. Sales Data Disconnect

Mariangelica Rodriguez, Consumer Insights Manager, PepsiCo Beverages Canada, William Pink, Chairman, Kantar Millward Brown – Marketing Science Council

  • The difference between what people say and do can be large, especially when you match opinions to home scan data
  • This type of data can get hidden because marketers don’t know how to deal with it
  • Is something in the system broken? 
  • Is there a better and different approach to monitoring brand performance? What signals should be measured in the short term? Is an ongoing tracker survey necessary? How do you evolve from a survey mindset to a connected data intelligence mindset?
  • Businesses want speed to insight, connected data, less asking and more observing, migration from descriptive to prescriptive, more accurate and granular understanding
  • Need faster decision making, quarterly used to be frequent but now we need days or minutes
  • Data should never be treated in a vacuum disregarding all other pieces of data
  • Consider setting up alerts for when data moves outside a defined margin [see the sketch after this list]
  • We may live in a survey-first world, but it’s surround first and survey second – use search and social data for signals of campaign impact and brand strength
  • Use intuitive associations and speed-of-response tests to understand how people feel about brands and services; it tells a completely different story than surveys [I LOVE implicit data because people can’t and don’t want to tell you the truth]
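[If you wanted to try the “alert when a metric moves outside a defined margin” idea, a minimal sketch could look like the following – the metric, window, and two-sigma threshold are my own illustrative assumptions, not anything from the talk.]

```python
# Minimal sketch: flag a new tracking observation that falls outside the recent
# mean +/- n_sigma. Metric names, values, and the threshold are hypothetical.
from statistics import mean, stdev

def check_alert(history, latest, n_sigma=2.0):
    """Return an alert string if `latest` is outside mean +/- n_sigma of `history`."""
    mu, sigma = mean(history), stdev(history)
    if abs(latest - mu) > n_sigma * sigma:
        return f"ALERT: {latest:.1f} vs expected {mu:.1f} +/- {n_sigma * sigma:.1f}"
    return "within expected range"

# Hypothetical weekly brand-consideration scores (%), oldest first
weekly_scores = [41, 43, 42, 44, 42, 43]
print(check_alert(weekly_scores, latest=36))  # triggers an alert
print(check_alert(weekly_scores, latest=42))  # within expected range
```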

Swipe Right > T2B: How Incorporating User Design from Tinder and Uber Can Improve Mobile Research

Kevin Hare, Vice President, Dig Insights Inc.

  • Mobile devices have matured – swipes, menu stacks, pinch to zoom. Consumers have a new set of behaviours to indicate preferences and make decisions.
  • 19% of surveys are mobile optimized, 55% have bad design that leads to a poor survey experience [it is SHOCKING that we choose to do this.]
  1. Tinder is a dating app with a simple interface – swipe left or right. You can swipe right or left on products too, or on features, brands, and services. Intuitive interest is a quick swipe right. Considered interest means you read the description first. Intuitive rejection means a quick swipe left. Considered rejection means you read it and then reject it. The process is intuitive. Survey questions often correlate, which means you’re asking too many questions; this method helps that problem. You can replicate box scores with this data, and also build network maps and correspondence maps [see the sketch after this list].
  2. Chatbots. A way to access information, make decisions, and communicate. The beginning of a new form of digital access. People spend most of their time on just five downloaded apps. Conversation is a natural user interface – not much to learn. AI tools aren’t perfect but they are exploding. 80% of people like the experience, which is 4 times more than survey numbers.
  3. Google Maps. Your phone defaults to tracking you, and Google can make much of this information available to you via APIs. Use it to track purchases: pick the date you went shopping, identify how you paid, then go to Google Maps and choose the location you went to. It helps with recall – you can check the map to see where you were that day. Engaging map deliverables for your clients.
  4. Ratings. Feedback loops from a simple five-star rating system return many metrics on how to improve service. Use a system like this at the end of a survey: give a star rating, then a few easy prompts for what you liked or disliked. This is how Uber does it, and also hotel ratings. Step 1, choose overall satisfaction. Step 2, choose the satisfied features. Step 3, choose the dissatisfied features.
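[For the curious, here is a rough sketch of how swipe direction plus time-before-swipe could be rolled up into box-score-style metrics. The field names and the two-second “considered vs. intuitive” cut-off are my own assumptions for illustration, not Dig’s actual method.]

```python
# Sketch: summarising Tinder-style swipe data into box-score-like metrics.
# The two-second cut-off separating "intuitive" from "considered" is an assumption.
swipes = [
    # (concept, direction, seconds_before_swipe)
    ("Concept A", "right", 0.8),   # intuitive interest
    ("Concept A", "right", 3.5),   # considered interest (read the description first)
    ("Concept A", "left",  0.6),   # intuitive rejection
    ("Concept B", "right", 0.7),
    ("Concept B", "left",  4.0),   # considered rejection
    ("Concept B", "left",  0.5),
]

CONSIDERED_CUTOFF = 2.0  # seconds

def summarise(concept):
    rows = [s for s in swipes if s[0] == concept]
    interest = sum(d == "right" for _, d, _ in rows) / len(rows)
    intuitive = sum(d == "right" and t < CONSIDERED_CUTOFF for _, d, t in rows) / len(rows)
    return {"interest (~ top box)": round(interest * 100),
            "intuitive interest": round(intuitive * 100)}

for concept in ("Concept A", "Concept B"):
    print(concept, summarise(concept))
```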

Bridging the Marketing & Research Chasm

Neil Rennert, Marketing Research and Consumer Insights Manager, Canada Dry Mott’s, Juliann Ng, Vice President Consulting, GfK

  • Ask a question three ways – from the client perspective (e.g., to get a bonus), from the business perspective, from the research perspective. 
  • “A more beautiful question” book to consider reading
  • We’re sort of trained to just answer the question, not challenge the question. The questions you ask are shaped by your experiences. 
  • Try asking ‘why’ a few more times, not just once or twice. 
  • Think about opening and closing. Close an open-ended question and you’ll get a brand new perspective. You could get contradictory answers. 

Mobile devices and modular survey design by Paul Johnson #PAPOR #MRX 

Live blogged at the #PAPOR conference in San Francisco. Any errors or bad jokes are my own.

  • now we can sample by individuals, phone numbers, location, transaction
  • can reach by an application, email, text, IVR but make sure you have permission for the method you use (TCPA)
  • 55+ prefer to dial an 800 number for a survey, younger people prefer an SMS contact method; important to provide as many methods as possible so people can choose the method they prefer
  • mobile devices give you lots of extra data – purchase history, health information, social network information, passive listening – make sure you have permission to collect the information you need; give something back in terms of sharing results or hiding commercials
  • Over 25% of your sample is already taking surveys on a mobile device; you should check what device people are using and skip questions that won’t render well on small screens
  • remove unnecessary graphics, background templates are not helpful
  • keep surveys under 20 minutes [i always advise 10 minutes]
  • use large buttons, minimal scrolling; never scroll left/right
  • avoid using radio buttons, aim for large buttons instead
  • for open ends, put a large box to encourage people to use a lot of words
  • mobile open ends have just as much content although there may be fewer words, more acronyms, more profanity
  • be sure to use a back button if you use auto-next
  • if you include flash or images be sure to ask whether people saw the image
  • consider modularizing your surveys, ensure one module has all the important variables, give everyone a random module, let people answer more modules if they wish [see the sketch after this list]
  • How to fill in missing data – data imputation or respondent matching [both are artificial data remember! you don’t have a sense of truth. you’re inferring answers to infer results. Why are we SOOOOO against missing data?]
  • most people will actually finish all the modules if you ask politely
  • you will find differences between modular and not but the end conclusions are the same [seriously, in what world do two sets of surveys ever give the same result? why should this be different?]
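[A bare-bones sketch of the modular idea from above – everyone gets the core module with the important variables plus one random module, and can volunteer for more. Module and question names are hypothetical.]

```python
# Sketch: random module assignment with a core module everyone answers.
import random

MODULES = {
    "core":      ["q1_brand_awareness", "q2_purchase_intent"],  # the important variables
    "media":     ["q3_tv_hours", "q4_social_use"],
    "shopper":   ["q5_store_visits", "q6_basket_size"],
    "attitudes": ["q7_price_sensitivity", "q8_brand_love"],
}

def assign_modules(wants_more=False):
    extra = [m for m in MODULES if m != "core"]
    random.shuffle(extra)
    chosen = ["core", extra[0]]        # core + one random module for everyone
    if wants_more:                     # "let people answer more modules if they wish"
        chosen += extra[1:]
    return chosen

print(assign_modules())                 # e.g. ['core', 'shopper']
print(assign_modules(wants_more=True))  # core + all modules in random order
```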

What lies beneath the multiple screens by Raymond Lo, #AMSRS

  

  • People still watch a lot of TV – 22 hours a week – and 79% of Australians will second screen while watching TV
  • what are the motivational links between the 1st and 2nd screen, and how can we monetize that?
  • difficult to capture accurately, ethnography approach may influence behaviours, activities differ by day of week
  • people downloaded an app for passive behaviour – website visits and apps in the foreground
  • app doesn’t drain the battery, no app updates, people forgot the app was there and they forgot to delete the app
  • also did a diary study in half hour blocks, prompted them twice per day
  • grazing is a habit, filler activity, 80% of multiscreen behaviour is unrelated, driven by FOMO (fear of missing out) and FONK (fear of not knowing)
  • higher the engagement with the genre the more likely the multiscreen behavior is related
  • dramas, movies, and sitcoms have the highest engagement; when people multiscreen, it’s highly related
  • news and kids shows play in the background but multiscreening is generally unrelated
  • genre dictates a stay with me or look at me disruption strategy
  • multiscreen behaviour is best captured passively, grazing is unconscious and prolific making it difficult to record

Data quality standards in mixed mode studies by John Bremer #CASRO #MRX

Live blogging from Nashville. Any errors or bad jokes are my own.

– the most boring thing you can do with mobile is take a survey on it [HA! very true]
– it makes boring surveys more convenient than ever before
– dramatic growth in people starting surveys on mobile
– not all survey modes produce the same experience. there are differential completion rates. higher drop-out rate on mobile, particularly when the surveys get longer. different demographic set on mobile so quotas may overpopulate quickly.
– completion rates differ on mobile by country
– many people take surveys on multiple modes and this happens in every country. In the US, 60% of people ONLY take surveys on mobile [did i hear that right?]
– how do we treat quality in mixed mode studies. how should quality techniques be applied?
– why don’t we put quotas on mobile, should we?
– whereas 8% of people suspended a survey on a tablet or computer, 20% of mobile phone starts were suspended
– tablets end up looking a lot like computers
– think about your speeding metric. survey lengths differ greatly by mode, so if you’re including mobile phone times in your calculation, that raises the median time and raises the speeding threshold so that you’re cutting out more computer people than you ought to. you might need to use device specific speeder rules. [tell me now, who does this! we ought to! Love this 🙂 see the sketch after this list]
– once you remove speeders, you can use a generic rule for random responding and straightlining
– they have a dataset of people who have taken an omnibus on a computer and on a mobile. it’s a matched dataset. [and John wonders, is it omnibii? 🙂 ]
– mobile responders always take longer and it gets worse the longer the survey gets. it’s not as far off for a shorter survey.
– we know there is a true mode effect
– must test quality at the mode level, must adjust speeding at mode level
– they recommend 5 to 10 minute survey though people still do 45 minutes on a mobile.
– if you cut surveys into modules, they will take all the modules in a row.
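[Here is what a device-specific speeder rule could look like in practice – the 40%-of-the-device-median cut-off and the toy completion times are my own assumptions, not John’s numbers.]

```python
# Sketch: flag speeders against the median length-of-interview for their own device,
# not a pooled median. The cut-off fraction and times are hypothetical.
from statistics import median
from collections import defaultdict

completes = [
    # (respondent, device, minutes)
    ("r1", "desktop", 10), ("r2", "desktop", 12), ("r3", "desktop", 3),
    ("r4", "mobile",  16), ("r5", "mobile",  18), ("r6", "mobile",  6),
]

SPEEDER_FRACTION = 0.4   # flag anyone under 40% of their device's median

times_by_device = defaultdict(list)
for _, device, minutes in completes:
    times_by_device[device].append(minutes)

device_median = {device: median(times) for device, times in times_by_device.items()}

for rid, device, minutes in completes:
    if minutes < SPEEDER_FRACTION * device_median[device]:
        print(f"{rid} flagged: {minutes} min vs {device} median of {device_median[device]} min")
```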

[thanks for presenting data and tables John. i like that you don’t dumb things down. we need more of this because researchers KNOW NUMBERS even if people think it’s funny to say they don’t]

What they can’t see can hurt you: improving grids for mobile devices by Randall Thomas #CASRO #MRX

Live blogged in Nashville. Any errors or bad jokes are my own.

Frances Barlas, Patricia Graham, and Thomas Subias

– we used to be constrained by an 800 by 600 screen. screen resolution has increased, can now have more detail, more height and width. but now mobile devices mean screen resolution matters again.
– more than 25% of surveys are being started with a mobile device; fewer are being completed with a mobile device
– single response questions don’t serve a lot of needs on a survey but they are the easiest on a mobile device. and you have to take the time to consider each one uniquely. then you have to wait to advance to the next question
– hence we love the efficiency of grids. you can get data almost twice as fast with grids.
– myth – increasing a scale of 3 to a scale of 11 will increase your variance. not true. a range-adjusted value shows this is not true; you’re just seeing bigger numbers [see the worked example after this list].
– myth – aggregate estimates are improved by having more items measure the same construct. it’s not the number of items, it’s the number of people. it improves it for a single person, not for the construct overall. think about whether you need to diagnose a person’s illness versus a gender’s purchase of a product [so glad to hear someone talking about this! its a huge misconception]
– grids cause speeding, straightlining, break-offs, lower response rates in subsequent surveys
– on a mobile device, you can’t see all the columns of a grid. and if you shrink it, you can’t read or click on anything
– we need to simplify grids and make them more mobile friendly
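[A quick worked illustration of the range-adjustment point above – rescale both scales to 0–1 before comparing variances and the “bigger numbers” effect disappears. The toy responses are my own.]

```python
# Worked example: raw variances differ only because the 1-11 scale uses bigger numbers;
# after range adjustment to 0-1, the variances match.
from statistics import pvariance

def range_adjusted(values, lo, hi):
    return [(v - lo) / (hi - lo) for v in values]

three_point  = [1, 2, 2, 3, 3, 2]      # answers on a 1-3 scale
eleven_point = [1, 6, 6, 11, 11, 6]    # the same answers stretched onto a 1-11 scale

print(pvariance(three_point), pvariance(eleven_point))        # ~0.47 vs ~11.8
print(pvariance(range_adjusted(three_point, 1, 3)),
      pvariance(range_adjusted(eleven_point, 1, 11)))          # ~0.12 vs ~0.12
```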

– in a study, they randomly assigned people to use a device that they already owned [assuming people did as they were told, which we know they won’t 🙂 ]
– only half of completes came in on the assigned device. a percentage answered on all three devices.
– tested items in a grid, items one by one, and an odd one which is items in a list with one scale on the side
– traditional method was the quickest
– no differences on means
[more to this presentation but i had to break off. ask for the paper 🙂 ]

What’s Hot and What’s Not Hot by Ray Poynter, Vision Critical University #NetGain2015 #MRX

Live blogging from the Net Gain 2015 conference in Toronto, Canada. Any errors or bad jokes are my own.

What’s Hot and What’s Not Hot: Ray Poynter, Director of Vision Critical University

  • Ray’s books are for sale at the back of the conference. Find him and he will sign your book! (Yes, please!)

  •  What is still hot?
  • Mobile is really big and that’s why Ray has written a book on it [buy it 🙂 ]
  • Why is CATI so big – in this room, most people do NOT answer the landline in their home. Mobile used to cost more. Not sure if the person will be driving when you call a mobile phone. Hard to geographically target mobile phones like you could with RDD.
  • Pew Research does top-notch CATI probability surveys. It is the majority of what they do and they have just recently bumped up the percentage of their calls that are mobile.
  • Online surveys – 30% are attempted by people on mobile. Some people KNOW they are doing mobile and others don’t. May be 50% in just a couple of years. But only 15% of surveys are suitable for mobile devices. Most surveys are not optimized for mobile. Not thought about wording or question types. Not even checking the data to see if mobile vs laptop data are different.
  • In 950 Tesco stores, they do surveys on tablets with geolocation, datestamp, etc.
  • Heineken did a beer audit in Africa. Recruited interviewers, gave them a smartphone. Phone made SURE every location was geotagged. Photos of every location. Quality of data was far superior.
  • Communities
  • Companies doing so are beginning to disappear because communities are more mainstream. Everyone has their own community.
  • DIY is enormous in society. DIY travel, DIY bank machines, Uber, AirBNB, ZappiStore.
  • DIY has spawned automation. If every idiot can write a survey, they will. So let’s make it safer.
  • SurveyMonkey is the biggest survey platform in the world.
  • To be hot, it must be scalable and it must work – NPS doesn’t do this. 🙂
  • DIY isn’t great with efficacy. There won’t be many neuroscience for dummies books in the near future.
  • What is HOT right now
  • In the moment – Ask the breakfast survey the very second you finish your breakfast. Survey about the hotel registration before they open their hotel room door.
  • Location Based Research – Put a geofence around a Starbucks so you know who walks in or out [see the sketch after this list]. This also attracts aggressive marketers, not just researchers. So the message on your phone could be a survey or a sales pitch – ShopKick. Do they turn on the microphone on your phone? Do they turn on your camera? Do they tell you they have done so?
  • Microsurveys – RIWI, google consumer surveys. Usually 1 to 3 questions. Google is up to 10 questions. Won’t tackle your problems that have a high dollar value associated with them.
  • Automation – Automate reports as well as research process. What do we add to this? What do we add to the trends? What canNOT be automated?
  • Always choose the simplest tool – don’t need to take a picture of every window and find software to count those pictures. [sounds stupid but really think about it]
  • What’s bubbling new and exciting
  • Text analytics – sentiment analysis is getting better for all except twitter. much better for emails and letters to companies, comments on youtube, inbound call centers, which letters are genuine sales leads or complaints or bomb threats, reaction marketing.
  • Web Messaging – Whatsapp, WeChat. People are doing less talking to everyone and more talking to individuals. In comparison, whatsapp grew WAY more quickly than facebook and twitter.  This is massively scalable. Panel companies will go this way. [They already are!]
  • ResearchBots – Processing and moderation take a lot of time. New things don’t work all the time and that’s why it’s bleeding edge. Not very scalable at this point
  • NOT so hot
  • Facial coding – good with an extremely experienced trained person sitting in the same room. Via webcam isn’t quite so good. Fully automated is very clever but delivers almost nothing. Software can identify specific pictures but a human must still go and interpret all those pictures. Great for assessing people’s reactions to packages. Not a general purpose tool. Doesn’t suit most research problems.
  • Webcam Qual – You don’t want to take video from home  because you still have to brush your hair and change out of your pajamas. Webcam on the bus means everyone behind you on the bus sees the images too.
  • Social media research – We thought it would destroy MR but it’s really a niche. Most research teams have scaled back on this. Maybe using tweets only. not used so much for insights but more for reactions to advertising campaigns. Social does answer questions not asked. Social usually doesn’t answer your specific research questions.  Vendors often say “I agree it has under-delivered but my company is doing it right!”
  • Social media 2.0 – integration with marketing, integration with survey research, integration with tracking, interrogative.
  • BT Case Study – Net Easy – how easy is it to work with BT was a better measure than NPS. They looked online for people talking about ease or difficulty and responded with solutions. Achieved a 3.5 million reduction in costs by doing this. 600,000 people who would have called were able to DIY from the website.
  • What about passive data, gamification, biometrics, wearables, quantified self, Internet of Things, single source, neuroscience. There is too much stuff to register the quality of everything. You can’t learn it all.
  • Gamification doesn’t solve a lot of problems but it HAS made us rethink what we’re doing.
  • Behavioural economics is really efficacious but it is incredibly specific.
  • Passive data from phone recording everything you press and everywhere you go. Won’t see big movements here. It will be mostly qualitative.
  • Big data is beginning to move but predictiveness is limited right now.
  • Wearables – sharable is great but these people are not yet representative. Mostly qualitative and very targeted.
  • Geotracking – very tiny right now, works well in qualitative. Can draw maps of where individual people went. Mapping ebola is a different story – limitations of cell phone towers in other countries makes it impossible to map journeys in small locations.
  • Internet of things – only exists in minds and publishers right now.
  • Single source – Means tying together many data sources, it’s a power battle, a methodology battle. WHO is the single source? The telecom? A research company? Privacy battles of combining data.
  • Top 2 Things to think about.
  • Mobile – traditional, in the moment, multimedia, passive
  • Integrative and participative – 360 panels, databases, communities, social, mobile, qual, collaborative all together
  • “We will always need faxes”  “We will always need horse and buggies” ….. We will NOT always need surveys. Ray thinks no more surveys in 20 years – classic 20 or 30 minute surveys. Suspects only 33% of spend will be on surveys by 2019.
  • We need to redesign our ethics – most of our ethics were established 60 years ago mostly by men, all of them white, and most of them dead
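[For anyone wondering what the geofencing idea boils down to technically, here is a minimal sketch of a circular geofence check – the coordinates and 50-metre radius are made up, and real deployments would use the phone platform’s geofencing APIs rather than hand-rolled distance checks.]

```python
# Sketch: is a location fix inside a circular geofence around a store?
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

STORE = (43.6532, -79.3832)   # hypothetical store location
RADIUS_M = 50

def inside_geofence(lat, lon):
    return haversine_m(lat, lon, *STORE) <= RADIUS_M

print(inside_geofence(43.6534, -79.3830))   # roughly 25 m away -> True
print(inside_geofence(43.6600, -79.3832))   # roughly 750 m away -> False
```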

How a Mobile-Enabled World is Changing Research Presented by Roddy Knowles, Director of Mobile Research #CRC2014 #MRX

Live blogging from the Corporate Researchers Conference in Chicago. Any errors or bad jokes are my own.

How a Mobile-Enabled World is Changing Research
Presented by Roddy Knowles, Director of Mobile Research

  • “This is the year of mobile” we say this every year
  • what steps do you take for mobile friendly design
  • 1 in 5 survey starts are on a mobile device; 2 in 5 panel enrollments are on a mobile device – this is a 100% increase over last year
  • over time, people realize that surveys aren’t always designed for mobile devices
  • meet responders when and how they prefer – at home, work, on the bus
  • mobile reminds us that real people take our surveys; we lose the human element sometimes
  • Mobile helps with feasibility, data quality, and representativeness
  • Data quality
    • bias towards visible answer choices
    • scale biases
    • count biases – few choices selected on a long list
    • straightlining – mitigated by good design
    • you need to test your specific situation to be aware of potential problems with your survey
  • Data comparability
    • data generally comparable
    • [remember – even if you give the same survey to the same people just one day apart, the data will be different]
    • excluding mobile people from a desktop survey means the data will be less representative, less tech savvy people, fewer early adopters, fewer shopping-centric people, certain tech occupations excluded
  • Best practices
    • avoid wide grids on a smartphone – people still do this!
    • responsive design is not a large font grid on a smartphone
    • keep it short, try for ten minutes
    • use fewer answer options where possible
    • aim for a 5 point scale
    • make sure all scale points are visible without scrolling
    • allow “fat finger” responses on a phone, tiny radio buttons mean you will hit the wrong button
    • avoid need to scroll, pinch, and zoom
    • open ends are shorter but ask the questions well – don’t ask for a novel, ask for a succinct response
    • you can use audio/visual but test it first. if people can’t see the video your data will be poor quality
    • don’t use flash
    • use responsive design – PROPERLY, make sure text size is good
  • they’ve created a scoring system to show four buckets – mobile optimized (you might get a hand-written thank you note if you score this high), mobile friendly, mobile possible, mobile incompatible [see the sketch after this list]
  • let’s not torture panelists
  • not every survey is designed to be a mobile survey so don’t do it if it’s not
  • response rates have doubled, dropouts have dropped, fewer reminders, more efficient [impressive!]
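[The four buckets got me thinking about what a simple scoring rubric could look like. The checks, weights, and cut-offs below are entirely my own illustration, not the presenter’s actual scoring system.]

```python
# Sketch: score a survey's mobile readiness and map it to the four buckets mentioned.
CHECKS = {
    "no_wide_grids":            25,
    "under_10_minutes":         25,
    "no_flash":                 20,
    "responsive_design":        20,
    "scale_visible_no_scroll":  10,
}

def mobile_bucket(survey_flags):
    score = sum(weight for check, weight in CHECKS.items() if survey_flags.get(check))
    if score >= 90: return "mobile optimized"    # hand-written thank-you note territory
    if score >= 70: return "mobile friendly"
    if score >= 40: return "mobile possible"
    return "mobile incompatible"

example = {"no_wide_grids": True, "under_10_minutes": True, "no_flash": True,
           "responsive_design": False, "scale_visible_no_scroll": True}
print(mobile_bucket(example))   # -> "mobile friendly" (score 80)
```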

Mission Possible: Innovative Solutions to Challenging Project Briefs Presented by Roddy Knowles, Director of Mobile Research #CRC2014 #MRX

Live blogging from the Corporate Researchers Conference in Chicago. Any errors or bad jokes are my own.

Mission Possible: Innovative Solutions to Challenging Project Briefs
Presented by Roddy Knowles, Director of Mobile Research

  • Mission: Invade personal space – What is the relationship between toilet paper and flushing issues – we need you to capture in the moment usage of toilet paper
  • Mobile survey was easy, quick – how many sheets, are you folding/wiping/crumpling, how many times do you wipe?
  • how do you make a survey mobile friendly?
    • break it up into chunks – served 10 questions at a time [see the sketch after this list]
    • did a side by side of chunks vs full survey to compare
    • initially difficult to set up, but faster in field, resulted in better member experience, provided better data, challenge made us stronger because we realized we could handle the process
  • Mission: Tag, track and talk
    • measure shopper at 3 points in shopper journey – pre shop planning, in-store, making/consuming product
    • difficult to do without mobile
    • in-store researcher or facility testing was not natural and not preferred
    • first profiled respondents and then recruited the right people, surveyed about planning for shopping and then sent them to the store. they didn’t know what product was going to be assigned to them
    • reimbursed people for the product
    • mobile IHUT validated with phones, added depth with open-ends and media
    • let them see how people use the product differently than expected
  • Mission: understand how the categories of websites visited while shopping impact purchase decisions
    • monitor shoppers full digital footprint
    • metered panelist on mobile phone and computer
    • looked at app usage on smart phone and social media listening
    • let them see what else they are doing at the same time – email, games, while they’re occupied with the product
  • Mission: Unicorn hunting
    • find people who are at the very very beginning of a purchase decision – needle in a haystack
    • follow the entire process and see when it does or doesn’t end up in a purchase
    • used metered panel – computer and mobile
    • specifically monitored auto websites, bluebook
    • opt in for GPS tracking, matched up to dealership locations
    • surveyed during and after, and really try to get right at the point of the experience instead of relying on memory
  • Mission: Track Santa Claus
    • study over the last two years, track data around black friday, cyber monday, christmas, new years
    • used metered panel again, which websites, which phone apps, which stores they visit
    • used a mobile diary, had people check in with them while they were shopping, a short survey about categories, basket spend, receipts from shoppers
    • mobile is starting to look more like PC, see bumps around the exact holidays
    • many stores opened extremely early on these holidays but the data showed that people did go earlier and there were spikes in traffic at the early time slots. the people going really early spend more
  • integration is key, this is how you meet objectives in creative way
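[The “10 questions at a time” chunking is simple enough to sketch – the question IDs and chunk size below are hypothetical.]

```python
# Sketch: split a long questionnaire into chunks that can be served one sitting at a time.
def chunk_survey(question_ids, chunk_size=10):
    """Yield consecutive chunks of question IDs."""
    for start in range(0, len(question_ids), chunk_size):
        yield question_ids[start:start + chunk_size]

questions = [f"q{i}" for i in range(1, 36)]           # a 35-question survey
chunks = list(chunk_survey(questions))
print(len(chunks), [len(c) for c in chunks])          # 4 chunks: [10, 10, 10, 5]
```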

What’s Beyond Mobile? New MR: General Mills Goes All In by Ryan Backer, General Mills #CRC2014 #MRX #GreatTalk

Live blogging from the Corporate Researchers Conference in Chicago. Any errors or bad jokes are my own.

What’s Beyond Mobile? New MR: General Mills Goes All In by Ryan Backer, General Mills

  • 2011 saw opportunity for mobile technology; 2013 felt confident they’d figured out mobile; 2014 formed emerging technologies team
  • 2011, they bet big. spent a year trying to get smart. convinced that by 2014 80% of research would be mobile.
  • Believers because:
    • 1) they felt they had to – mobile is surpassing desktop as the default access point for all things internet, even surveys designed for desktops, particularly for multi-culturals and millennials, and attention spans keep shrinking
    • 2) they pride themselves on bringing innovation to the marketplace and love the idea of getting richer insights using the bells and whistles of the phone – location, audio, visual, in-the-moment qualitative, feedback WHILE it’s happening
    • 3) it felt right – traditional research feels ancient; we can now ask people to take a picture of their snack drawer and then see products they would never imagine sitting right beside your product, or have people take pictures in the store of how your product is always out of stock; it felt more natural in terms of ethnographic research
  • How did it go? GEMO – good enough, move on. there are plenty of warts but let’s move on
  • They heavied up and sold hard – did a lot of hand holding for any research project that seemed to work best with mobile so that every person had a positive experience with it. Had major meetings with large teams to showcase the mobile research so that everyone could see they could do mobile research too
  • They redefined best practice.
    • Ethnography – Not limited to certain geography, more honest feedback, it’s not a stranger interviewing you, you’re in your own home
    • Shopper satisfaction – Do it while you’re still in the store, not the next day after you’ve forgotten everything
    • IHUT – save on shipping costs, send them to store to buy something, ask questions about finding the product, comparing the products
    • Instant A&U – increased speed because everyone has phone with them at all times, can push a survey at any time
  • Invented new capabilities – answered questions they’ve never been able to get answered before, ask questions they’ve never had before
    • Mobile Missions: send people on a task
    • Geo-intercepts: without physically being in a store
    • LaunchWatch: get results on brand new products immediately, real time basket analyses
    • OTC ad-tests – typically in a controlled environment but now they can do it “On The Couch,” as ads are experienced for the first time while you’re on your couch
  • Mistakes along the way
    • Lift and Shift ROR – You can’t just put a standard survey on a phone, know when to put audio video so that the research is done well not just somewhere else, know that results will be different because the method and timing and location is different
    • Multimedia paralysis – how do you handle hundreds and thousands of visuals, it’s very time consuming
    • New Method Flops – They never turn work away, but sometimes it just doesn’t work
    • DIY Scissors – There are many DIY solutions; anyone thinks they can do research and it results in “running with scissors” research
  • They have now disbanded the mobile team – there’s no longer a dedicated in-house team, it’s just regular research now
  • Next is still mobile but also other things
  • We’ve reached the tipping point. Stop talking about mobile and start talking about device agnostic.
  • Close to 80% now, but not by dollars
  • They have a solid rolodex of suppliers and now recommend those suppliers to their teams
  • Know your barriers
  • data stitching is not easy, gamification is on their list, still need true full service – know research AND technology AND storytelling
  • Exponential change
    • Internet of things – connect all devices and everything in between
    • quantified self – behavioral is now diary but we’re going way beyond that
    • augmented reality and virtual reality – could be THE next big thing
    • 3D printing – especially in food industry
    • artificial intelligence – natural language interpretation, unstructured data interpretation
    • robotics
  • looking for new applications of mobile, new data streams, and new technology/gadgets
    • Geofence consumers to within 4 inches when they are in front of a box of Cheerios, how much time in front of those Cheerios, build your own granola bar game
    • data streams previously unavailable – fitbit, wearables, consuming calories, how you burn calories
    • fridge with internet, a camera that clips on your shirt and takes a picture every 30 seconds [need one!], coupons and menu ideas direct to your fridge monitor

Getting to Deeper Insights Using Real Time Mobile Phone Video Chats for Qualitative Research by Rachel Geltman #CASRO #MRX

Live blogging from the #CASRO tech conference in Chicago. Any errors or bad jokes are my own.

“Getting to Deeper Insights Using Real Time Mobile Phone Video Chats for Qualitative Research” by Rachel Geltman, CEO, Video Chat Network

  • starbucks case study – saw more vivid language describing the drink, before and after what they were expecting, sensory cues as they were drinking it, stronger reactions to the product versus a scale on a survey, role of a cup of coffee in a day
  • in-home experience was less visceral, more generic, professional descriptions, not the laughter and love seen in the in-store experience
  • case study with SUVs – they ensured safety of the driver first. more descriptive brand imagery, more reflections on themselves as drivers, more under the radar descriptions of tiny little things that no one really pays attention to.
  • the in-home experience was more like listening to them recite a commercial they saw on TV; there was conscious recall of things, many of which could be forgotten. you can’t understand the experience of being in the vehicle
  • case study on mobile phone – it’s all about the touch, the sensory experience of holding the phone; they’ve already done all the research, they know the pricing. people talk about how it feels, how it looks. people say they like to get a feel for the phone but they can’t put their hand around it because of the weight of the lock in the store [YEAH!!!!]
  • in-home – they had to probe to see if people touched or picked up the phone. lots of talking about what people said or what the salesperson said. people would respond that they picked up the phone but they didn’t really talk about how the phone felt in their hands
  • in-environment testing is a powerful stimulator, more descriptive, more creative, more passionate, more insightful [GET THE WEIGHT OFF THE IN-STORE PHONE. sorry for yelling 🙂 ]
  • latent motivators like touching, sensing happens in-experience not in-home, very vivid and concrete language that a creative person would be desperate to get
  • really good for up-front developmental, exploratory research, good for ‘how should a store be designed’, what should the copy read, how should the packaging be designed
