
Linking Emotions, Needs and Behaviors – A New Qualitative MR Technique

August 24, 2014

A large number of the projects I do include an objective similar to: increase understanding of customers' perceptions and needs. My recent investigation into a self-awareness technique called Adaptive Inquiry has provided what I regard as breakthrough insights into the way that participants' interpretations of their emotions relate to their needs and drive their behavior. Check out the Adaptive Inquiry website if you want to learn more about the underlying theory. Here is how I applied my new awareness of this technique in a recent study.

Overview:

A client already has deep insight into the mindset of the patients they have served for many years. However, their plan to launch a new advertising campaign led them to ask me for a different way to gain greater depth of insight into the emotional state of a subset of those patients.

Methodology:

In consultation with the creator of Adaptive Inquiry, Charles Jones, I created a set of "emotion cards" covering the full spectrum of feelings that these patients might experience. At the appropriate time during a live, in-depth interview, I presented the full array of cards to the participant and asked him or her to:

  1. Select all the emotion cards that apply when considering the impact of their medical condition.
  2. Rank the selected emotion cards in order of the intensity and frequency with which they are experienced.
  3. Read aloud and complete the script on the back of the top few cards representing the predominant emotions at play.

The script on every emotion card has the same first line. However, the second line varies for each emotion depending on the unique mapping provided by Adaptive Inquiry. The one for sadness might be the easiest to relate to.

The script on the “Sad” card looks like this:

When I think of the impact I experience from my medical condition, I feel:
Sad… because I don’t know how to fulfill my yearning for ____.

The second line of the script, adapted from the mapping of perceived unfulfilled needs to emotions as delineated by Adaptive Inquiry, makes this process both unique and powerful. As patients read the script aloud, they go right to the heart of the matter… the unfulfilled need that is driving their feelings.
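For readers who like to see the structure laid out concretely, here is a minimal sketch (in Python, purely as an illustration) of how the card deck and its two-part script could be represented. Only the "Sad" line is taken from this post; every other emotion, wording, and name in the sketch is a hypothetical placeholder, not Adaptive Inquiry's actual mapping.

```python
# Hypothetical sketch of the emotion-card structure described above.
# Only the "Sad" entry reflects the wording in this post; the other entries
# are placeholders, NOT Adaptive Inquiry's actual emotion-to-need mapping.

FIRST_LINE = "When I think of the impact I experience from my medical condition, I feel:"

# Each emotion maps to its own second line, which names the perceived unfulfilled need.
EMOTION_SCRIPTS = {
    "Sad": "Sad... because I don't know how to fulfill my yearning for ____.",
    "Angry": "Angry... because ____ (placeholder wording).",    # hypothetical
    "Afraid": "Afraid... because ____ (placeholder wording).",  # hypothetical
}

def card_script(emotion: str) -> str:
    """Return the full two-line script printed on the back of a card."""
    return FIRST_LINE + "\n" + EMOTION_SCRIPTS[emotion]

print(card_script("Sad"))
```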

Outcome:

In many instances, as participants read the script aloud and intuitively fill in the blank, they look at me with both surprise and vulnerability. When I reassure them and ask about their experience, they indicate that they have just learned something new about their emotional experience, and the unspoken question in their eyes is "How did you know that about me?" The conversation that follows about their emotional response to their medical condition often covers new ground resulting from this new awareness and establishes a reference point for the rest of the interview.

Synthesis:

One of the premises of Adaptive Inquiry is that by clarifying the source of our emotions we can "Feel deeply and Think clearly". I believe that this exercise helps participants view their emotions in a new way and makes their responses throughout the rest of the interview clearer, more consistent, and truer to what really drives their behavior.

If you have a project that would benefit from accurate insight into the emotional experience of your customers, please contact me. I would love to brainstorm with you about ways that this technique or others might support your qualitative market research needs.

Click here to learn more about Dave Kreimer and Next Step Consulting

The Cutting Edge of Digital/Print Hybrids

July 9, 2013

As part of their BookSmash Challenge, Harper Collins has provided links to a variety of what they regard as excellent examples of the potential for digitally enhancing the print reading experience.

The NY Times article Snow Fall is the only browser-based example they provide. It is of particular interest to me because it reports on an avalanche accident near Stevens Pass (near Seattle), where I regularly ski, and because I am familiar with the challenges of backcountry skiing. This superbly written article caught my attention when it was released this past winter. The digital enhancements are top-notch: a collection of videos, photos, satellite footage, and even advertisements (which are surprisingly unobtrusive). I would characterize the overall hybrid reading/viewing experience as elegant. For me, the digital enhancements bring a whole new life to this detailed and precisely reported story.

I believe that anyone in the publishing industry should examine this digital/print hybrid experience and learn from it. I have not yet viewed the other examples offered by Harper Collins (click here and scroll to Current Out-of-the-Box Reading Apps). However, based on their descriptions, I would guess that they are worth checking out. It is interesting (and it confirms what I believe to be recent trends in the education market) that iPad apps so dramatically dominate the provided examples of "out-of-the-box" applications.

The work I do with Next Step Consulting helps many of my publishing clients bridge their products from print to technology. Great models for what is possible have been hard to find. In my opinion, Snow Fall begins to fill this gap.

A Vision of the Future of Marketing

April 30, 2013

McKinsey & Company just released this compelling report entitled "The coming era of 'on-demand' marketing". It is thought-provoking and basically rings true for me. This vision of "on-demand" marketing is driven by big data, and the article is a call to large companies to begin establishing the data collection systems that will enable them to effectively invest in and execute a full range of "touch points" as their customers interact with their brands. I particularly appreciate the infographic labeled "Scenes from the future of on-demand marketing" (just past the opening section), which provides an example of the many touch points resulting from a typical consumer purchase.

I plan on reading this report at least a few more times to develop an understanding of the role Next Step Consulting plays throughout this evolution. Overall, I am confident that companies will always need to understand the needs and desires of their customers while conceiving new products, as well as to test those products' "user-friendliness" while they are being developed. I believe that big data will help to identify general needs and directions for new products, but I expect that qualitative research will continue to be the best tool for fleshing out the details indicated by the statistical trends.

I estimate that currently at least 80 percent of my projects relate to new product development and initial marketing preparations. But once a product is established in the market, is there a role for qualitative market research throughout the process of "on-demand" marketing as envisioned by McKinsey & Company? From my perspective, I see many opportunities related to gauging the nature of each touch point. I know that I receive far more survey requests than I used to after I make a purchase or interact with customer support personnel. My guess is that this kind of immediate, ongoing, and relatively inexpensive testing will be the norm in the future. I would also guess that online communities and panels might become increasingly important as companies follow the mood, needs, and desires of their "tribes".

Of course, time will tell. The report also emphasizes the need for consumers to trust and cooperate with companies in terms of providing and allowing the use of personal data. No doubt there will be a direct link between the need for qualitative research and the quality and reliability of the information gleaned from big data.

 

Lie Witness News and the Validity of Market Research

April 24, 2013

I both love and hate the feature on the Jimmy Kimmel Live show called "Lie Witness News".

In this first clip, the interviewer asks questions related to the presidential inauguration, and in this second clip she asks questions related to the Oscars. People "on the street" answer the interviewer's questions about these events, decisively respond to her probes, and add elaborate detail. However, the questions were asked days before these two events actually happened. In other words, the people in these videos are completely making it up.

The ridiculous nature of the interviewer's probes makes for good entertainment but leaves me somewhat incredulous at the outrageous lies these folks are telling. I wonder how many times a participant in one of my studies was acting similarly. I feel horrified witnessing the power of a well-executed leading question, which would be totally inappropriate for real market research. In the extreme, I wonder about the validity of the whole model of interviewing respondents or, even worse, conducting groups with participants and relying on what they say.

Now… to be fair, there is a lot I don't know about how these interviews are conducted. One would suspect that folks are selected based on their likelihood to respond inappropriately and are encouraged to behave outrageously, since they are generating footage for entertainment, not data for market research. I also wonder about the incidence of folks who respond by lying and agreeing with everything the interviewer says. How many interviews do you think they have to conduct to get the 5-10 clips they include in a single segment of the show? If, like me, you basically trust human beings to be truthful, you would hope that the incidence of outrageous clips is lower, rather than higher. (I asked a question about this through the show's website, and if I get an answer I will report it.)

I chose the above two clips because they also show the flip side of this entertainment stunt. In the first clip, a woman admits that she is lying and just going along with everything the interviewer says. In the second, someone responds accurately that the Oscars have not yet happened and, despite being drunk, sticks to her story.

Ultimately, I am not really worried about the participants in my studies behaving like those in these two videos. However, watching Lie Witness News makes me even more aware of the importance of following basic market research principles such as properly setting up a group or interview, avoiding direct and leading questions, and examining data in a discriminating manner that draws inferences rather than relying only on what people say.

Prius vs. Downton Abbey and the Value of Market Research

February 25, 2013

Sometimes I look at a product and think that it obviously reflects the benefits of excellent market research. At other times I look at a product and think that it lacked adequate market research because it missed the mark in such an obvious way. Both the BBC and Toyota have clearly made their mark on the world with their high-quality "products". However, in my recent experience one did well (from a market research perspective) and the other definitely did not.

To start with the positive, the automobile manufacturing industry is pretty much a testament to the ability of car makers to fathom and respond effectively to the desires of drivers with ever-increasing levels of refinement. I just brought my older Prius to my local dealer for a recall repair. In the spirit of turning lemons into lemonade, everything was handled beautifully, making the chore of dropping off and being without my car for a day as easy as possible. Plus, I enjoyed learning about the new features of the newer Prius models during the short time I spent in the shop, drinking their excellent coffee and waiting for my shuttle ride back to my office. In a surprisingly low-pressure manner, a salesman briefed me on the differences, leaving me with the impression that there was a model for just about every type of car owner, as well as some really cool new features I would enjoy using.

In contrast, the ending of the last episode of season three of Downton Abbey is an example of an obvious lack of market research. (Spoiler alert… if you do not want to know what happens in the final moments of this season's last episode, do not read further.)

If you are not among the millions familiar with the story, you need to know that in the last few minutes of the show one of the key characters dies in an automobile crash. For many reasons this plot turn seems random, illogical, cruel, and redundant to the existing story line, not to mention counterproductive to developing the story in coming seasons. It also totally altered the mood of the story they had built to that point, from sweet to tragic.

A Google search shows a long list of articles about the outraged reaction from viewers, with headlines such as "Is it time Downton died and went to TV heaven?", which pretty much reflects my feelings immediately after the show ended. Apparently the negative reaction was strong enough that the same search also shows explanations from the producers and from the key actor involved. These explanations tell me that the character Matthew had to die because the actor, Dan Stevens, was not willing to sign a contract for the next season. If only they had put the same level of creativity into addressing this situation as they put into the many other aspects of the show.

The Huffington Post's one-question survey, "Is Matthew's death good for Downton Abbey?", is currently running at about 83 percent agreeing that it is not a good thing. Surely, a proper market research effort would have uncovered the likelihood of this outcome and identified better alternatives before alienating a large part of the audience.

The moral of this post: when business circumstances surprise you, test your response options so that you don't disappoint your "viewers" or customers or clients with your own unstudied and unpleasant "surprise".

Ideal, Average and Outliers… Selecting Market Research Study Participants

January 28, 2013

This post by Seth Godin speaks to the need to consider the experiences of the right types of people when making marketing and other decisions. He divides individuals into these three categories:

  • Ideal
  • Average
  • Outlier

These are useful categories to consider when designing a market research project.

Most qualitative market research projects strive to recruit average participants, assuming that their opinions and experiences represent the largest number of users of a product or service. I believe that most of the time this is an effective approach.

However, at certain times I have sought out more ideal participants. For example, interviewing thought leaders sometimes helps to predict future trends. This post describes some important things to consider when conducting this type of research. Also, recruiting a small segment of ideal users in usability testing allows me to compare their usability experiences to those of more average users.

I do not recall a time that I intentionally recruited outliers for a study. However, occasionally they show up accidentally, due to the vagaries of the recruitment process. Most of the time I dismiss these participants when it becomes clear that they are not representative of the average. However, I recall completing some interviews with outliers that provided interesting contrast to the practices of the other participants and helped my clients to understand the perspective of their fringe customers.

Bottom line: there are no rigid rules governing how to populate a study except to select the mix of participants that best fulfills the research objectives.

Common Mistakes That Make Market Research “Dangerous”

January 21, 2013

A respected leader in the MR community recently posted a link to an article with this provocative title: Focus Groups Are Dangerous. Know When To Use Them. To put this in perspective, I believe that the same can be said about almost anything (matches, gasoline, staplers), including items from the market research repertoire, from "archetypes" to "benchmarks" to "closed-end questions" to a "Z-Test" (thanks to GMI's Glossary of market research terms, especially for providing MR-related references beginning with the letter "Z"!).

Recent conversations with some of my more experienced and brilliant clients highlight several increasingly common mistakes that illustrate the dangers of conducting research that is poorly conceived and executed.

Each mistake is listed below with its impact:

  • Insufficient investment: small sample sizes or too few groups cause decisions to be made with limited, inconclusive, or inaccurate data.
  • Failure to define specific research objectives: generates data that does not answer the critical questions required to make "next step" business decisions and misses opportunities to explore important issues of interest.
  • Using flawed test materials: makes the study results suspect due to bias or poor comparisons.
  • Testing too late: project deadlines and the current state of development make the designated corrective actions unfeasible.

 

I believe that one of the most common causes of these mistakes is a lack of "bandwidth" on the part of my corporate clients. It is important to note that this is not their fault. Almost all of my active clients find themselves in a situation where downsizing has increased their workloads to the extent that it seems impossible to give any one aspect of their jobs the attention it deserves. Add budget cuts and tighter development timelines to the mix, and you greatly increase the likelihood of making one of the cited mistakes and generating "dangerous" market research findings.

Unfortunately, some respond to this dilemma by completely eliminating market research and developing new products without the guidance of ongoing customer feedback. I believe that this is a risky proposition on many levels.

My proposed solution is to hire a market research professional who can ensure that each study gets the attention it deserves. This previous post, Is your market research firm SMALL enough to "DO IT ALL"?, speaks to this issue.