Whether they take the form of a formal research session or an informal gathering of customers or stakeholders, focus groups are one of the most popular ways to learn customer preferences and get feedback on potential solutions. They’re popular because they’re easy. Gather a bunch of folks in a room, and voilà, insights galore, right?
Unfortunately, focus groups are particularly subject to biases. They’re like those funhouse mirrors, warping reality in subtle, sometimes unnoticed ways. Two biases have the biggest impact.
Two Biases
The first is what’s called selection bias, where the data sample is unrepresentative of the larger population in some way. We talk with the customers who are easiest for us to reach, but being easy to reach may make them unrepresentative of our larger target market in ways we don’t realize. The data we get from them, therefore, won’t represent the larger target market either.
Even if you’re careful to get a representative group together, focus groups still suffer from another big bias problem: anchoring. It’s human nature—we want to be liked, to chime in, to nod along with the group. If Joe starts waxing lyrical about a feature, soon everyone’s singing in the choir, and you might think, “Aha, that’s the chorus they all want.” But maybe it’s just Joe’s tune they’re humming.
Take Richard’s wife. She was in a car focus group, and guess what? They all ended up gabbing about this one car use case, making it seem like the be-all and end-all. In reality, it’s a blip on her driving radar, but the people running the focus group wouldn’t know that. They just saw a whole group emphasizing that one thing.
This same thing can happen in settings like sprint reviews. One participant emphasizes a particular point of feedback or a particular unsolved problem. Others chime in, trying to be helpful and to connect. But that item can seem more important and more universal than it actually is.
Get Better Feedback From Groups
We’ve been experimenting with how we can get better feedback from groups here at Humanizing Work. We just wrapped up a beta group for our new 80/20 Product Backlog Refinement online course (which we’re super excited to share with the world next week!), and we were determined to mitigate selection and anchoring biases with that group, while still getting some of the benefits of working with the group as a group.
First off, we deliberately recruited participants from different segments of our target market. We still included actively engaged members of our Humanizing Work community—they give us good feedback and tend to buy our new products and services. But we also recruited non-customers: Product Owners, ScrumMasters, and coaches who haven’t bought a product from us yet. We looked for participants from large companies and small ones. We found people from several different countries.
Before we convened the group and risked anchoring, we asked each participant open-ended customer problem interview questions to get their individual take on the problem they were hoping to solve with 80/20 Product Backlog Refinement. No groupthink, no chorus, just solo acts.
As they dove into the course, we interspersed questions to collect individual feedback, again before the group started sharing feedback together.
Now, we could have just collected this individual feedback and left it at that. But there are benefits to bringing a group of customers together. What outcomes were we hoping for by having live meetings of our beta group?
- People are more likely to complete an online course when they have a sense that they’re doing it along with a group of fellow learners
- Creative ideas can come from members of a group playing off of and building on each other’s ideas
- We just thought this great group of practitioners and coaches should know each other
We had three live Zoom sessions with the beta group—a kickoff, a halftime huddle, and a wrap-up. Each one was crafted to dodge that pesky anchoring bias.
For the first session, we focused on connection and momentum going into the beta period. We wanted the group to have a conversation about the problems they were hoping to solve with the course. But we wanted to avoid anchoring bias. So, we visualized the data from the individual problem interview questions on the Miro board and facilitated a group activity to respond to and interpret the data. This allowed the full set of data to frame the conversation instead of just what the first person said.
The mid-beta session was a mix of a sprint review and a creative solution interview. We shared what feedback we were hearing and what we’d done or planned to do in response to it. And then we facilitated an activity to build on some of the issues and suggestions participants had raised. We kept the activity fairly divergent, focused on generating a range of ideas and building on each other’s suggestions. This gave us better input and, again, avoided anchoring bias.
In the final session, the product was pretty much complete, so our focus changed to getting useful feedback from the beta group for marketing and launching the course. Again, we were vulnerable to anchoring bias, so we alternated between individual activities to generate shared data and group activities to process and build on the data. The group represented a diverse range of roles and contexts, so it was particularly useful to see those variations show up on the Miro board before we had a conversation that might have hidden them.
Your Turn
Next time you want to get useful info from a group of customers, be intentional about fighting selection and anchoring biases. Recruit your participants carefully to try to get a representative sample of your larger target market. And mix individual data collection with group activities to avoid anchoring.
Learn More With Us
As we mentioned, our new 80/20 Product Backlog Refinement course launches next week. We’ll share more info about the course soon. But if you know you’d benefit from growing your skills in finding good slices of value at every level of detail and organizing them into a backlog with just the right info at the right time, you can grab the course now at the special pre-release price of $129 through Nov 10, 2023.
Our beta-test customer advisory board shared the following feedback after completing the course:
There is so much here I can bring back to my teams and POs to make our jobs easier! (Cristy, ScrumMaster)
This course has practical content to apply right away. (Yu Ju, Agile Coach)