Product Validation | 8 min read

How We Find the Right People for Product Validation

Finding participants for product validation isn’t easy. Finding the right participants is even more difficult.

There are far too many variables in any given market and for any given product for me to recommend a universal step-by-step process for recruiting study participants. However, defining the parameters of the project and using the right channels to approach potential participants will help you find the right people. With feedback from the right people, you can validate your feature or product with greater accuracy. Here are the steps we follow at UserVoice to find the right people for product validation.

1. We Determine the Scope of the Project

Defining the scope of the project helps us find a jumping-off point for gathering data. As a product manager, you might be inclined to talk only to people who fit the buyer persona, but it's often not that simple. In most cases, it's worth looking beyond the buyer persona. You also have to consider factors like your deadline, whether you're validating a new feature or a new product, and your overall goal. For example, your project scope will be different if you're expanding into a new market than if you're helping your current market with a problem.

At UserVoice, we look at three factors when determining the scope of our project:

  • Deadline — We always start here because our timeframe will dictate how many people we can interview/survey.
  • New feature vs. new product — Launching an entirely new product requires a much bigger sample size than adding a feature because you’ll need to find product-market fit.
  • New market vs. current market — Trying to expand into a new market through a new feature means having to formulate a completely different set of questions. That’s because you’ll have to talk to people who don’t use your product.

Once we understand what we’re launching and why, we’re in a position to set the parameters for our data collection.

2. We Choose the Sample Size and Method of Data Collection

Sample size and method of data collection are often interdependent variables for us because we have to make sure we have enough time to collect data. A smaller sample size means we can do interviews, while a sample size of a thousand people or more limits us to surveys. When introducing a new feature to a current product, we usually go with a smaller sample size. That’s because, with a current product, we have an established userbase, and the new feature is usually a result of the userbase’s feedback. Larger sample sizes are usually good for new products because we want to be extra thorough in the first stage of product validation.

When validating a new feature, we prefer to start with a smaller sample size and conduct interviews through video calls. Video calls allow us to collect in-depth feedback, have conversations with customers, and develop a better understanding of their problems.

When doing interviews, we usually start with 5-10 participants. We then analyze the data we've gathered, schedule more interviews if we need more data, and keep interviewing people until we can identify trends and patterns in their answers. The smallest sample size I've ever worked with was 10 people; in a recent project, I interviewed 30. It all depends on the quality of the answers you get. You should go for a bigger sample size if:

  • There are no clearly identifiable patterns in the data collected
  • You don’t have clear answers to your questions after your first round of interviews

Let’s say you want to introduce a feature that allows users to post updates to social media sites directly from your app. You interview 5-10 people to see what they think: a couple are for it, a couple are against it, and the rest are indifferent. That’s not enough data to confidently validate or invalidate your idea, so you know you need more.
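One way to make the "no clear pattern" call less subjective is to tally the responses and keep recruiting until one answer clearly dominates. Here is a minimal sketch of that idea in Python; the labels, minimum interview count, and 60% threshold are illustrative assumptions, not part of our actual process:

```python
from collections import Counter

# Hypothetical labels assigned after each interview; not real data.
responses = ["for", "against", "indifferent", "for", "indifferent",
             "against", "indifferent", "for"]

def need_more_interviews(responses, min_interviews=10, min_share=0.6):
    """Return True if the sample is too small or no single answer
    accounts for at least `min_share` of the responses."""
    if len(responses) < min_interviews:
        return True
    top_count = Counter(responses).most_common(1)[0][1]
    return top_count / len(responses) < min_share

print(need_more_interviews(responses))  # -> True: no clear pattern yet
```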

With surveys, the rule of thumb is a sample size of roughly 400 for a potential userbase of 100,000. But if you work in six-week development cycles and roll out new features every quarter, you won't have the luxury of collecting data from hundreds or thousands of people every time, so we save surveys for much larger projects.
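That rule of thumb comes from a standard survey sample-size calculation: Cochran's formula at 95% confidence and a 5% margin of error, with a finite-population correction. A minimal Python sketch of the arithmetic:

```python
import math

def survey_sample_size(population, z=1.96, margin_of_error=0.05, p=0.5):
    """Minimum number of survey responses for a given population size,
    confidence level (z-score), and margin of error (Cochran's formula
    with a finite-population correction)."""
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# A userbase of 100,000 at 95% confidence and a 5% margin of error:
print(survey_sample_size(100_000))  # -> 383, close to the ~400 rule of thumb
```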

3. We Decide How to Reach Out to Potential Participants

Once we’ve established a small group of people to start talking to, we need to figure out the best way to reach them. How you reach out to potential participants will determine whether they reply or want to participate in your project. There are two types of participants and multiple ways to reach out to them:

Existing Customers

We usually contact UserVoice customers via email. We’ve seen good success rates with email because it’s never a cold email. We use our own product, UserVoice, to identify customers who have expressed the pain point we're trying to solve; these users may have left feedback about the problem in a feature request, a support ticket, a sales call, etc. The other way we choose existing customers to speak with is based on their in-app behavior. If we're exploring how to improve an area of our application, I use a tool like Heap to identify users who commonly visit that part of our app or perform certain actions, and in some cases, users who visit a page but don't perform a certain action. You want to be careful not to speak only with power users, but with users of varying activity levels.

Tools such as Amplitude, Heap, and FullStory also help us identify these existing customers based on their in-app behavior, job title, company size, product usage, etc. This way, we have a point of reference to begin the conversation. There’s usually monetary compensation involved for the users we recruit this way, so the participation rate is decent.
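As a rough illustration of the "visited a page but didn't perform the action" filter, here is a sketch that works off an exported event log rather than any specific Heap or Amplitude API; the file name, column names, and event names are assumptions for the example:

```python
import csv
from collections import defaultdict

# Hypothetical analytics export: one row per event, with "user_id"
# and "event" columns (names are assumptions, not a real Heap schema).
events_by_user = defaultdict(set)
with open("events_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        events_by_user[row["user_id"]].add(row["event"])

# Users who viewed the feedback page but never submitted feedback:
# likely candidates for an interview about that part of the app.
candidates = [
    user for user, events in events_by_user.items()
    if "viewed_feedback_page" in events and "submitted_feedback" not in events
]
print(candidates)
```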

Participants We Find Online

We’ve also achieved great success through InMail campaigns on LinkedIn. When reaching out to people on LinkedIn, you have to be careful not to sound like a sales rep. For example, “This is Jared from UserVoice, and I would love to talk to you about this new product we’re working on...” sounds like I’m trying to sell them the product. So instead, I’d go with something like, “Hey, this is Jared from UserVoice. I saw your profile and noticed that you manage xyz product. I was wondering if I could pick your brain about in-app surveys. We’re working on a new feature for product managers that will help them set these surveys up and collect data in two steps, but first, I wanted to talk to multiple product managers and get their opinions before we actually start development. Would you be open to participating in a survey/interview?”

We also use User Interviews to find participants online. You can set the criteria for the validation project based on job title, industry, and similar variables and the tool finds people willing to talk to you.

We also use our own validation tool to conduct surveys, gather user suggestions for new features, get users to vote on new feature ideas, etc., so we rarely have to run InMail campaigns or rely on User Interviews.

4. We Screen Our Participants

By screening potential participants, we know we’re talking to people who have the problems we’re trying to solve. If a user fits our buyer persona but doesn’t need the solution our feature offers, then they’re not a good fit for product validation.

For example, common screener questions may include:

  • What industry do you work in?
  • How would you describe your job function?
  • How many employees does your company have?
  • Which of the following tools do you use for work? (followed by a list of tools)
  • When was the last time you used [name of a tool]?
  • How often do you use [name of the tool]?
  • How would you rate [name of a tool] on a scale of one to five?

And so on.
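To keep screening consistent from one project to the next, the answers can also be checked against the target criteria programmatically. Here is a minimal sketch; the criteria, field names, and example respondent are illustrative, not our actual screener:

```python
# Hypothetical pass criteria for a screener (illustrative values).
TARGET_INDUSTRIES = {"software", "saas"}
REQUIRED_TOOL = "uservoice"
MIN_COMPANY_SIZE = 50

def passes_screener(answers: dict) -> bool:
    """Return True if a respondent matches the target participant profile."""
    return (
        answers.get("industry", "").lower() in TARGET_INDUSTRIES
        and answers.get("company_size", 0) >= MIN_COMPANY_SIZE
        and REQUIRED_TOOL in [t.lower() for t in answers.get("tools", [])]
    )

respondent = {
    "industry": "SaaS",
    "company_size": 120,
    "tools": ["UserVoice", "Heap"],
}
print(passes_screener(respondent))  # -> True
```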

We'll often straight up ask them if the problem we're trying to validate is a problem they experience. For example, if we want to find out how to make it easier for PMs to store/submit product feedback on behalf of their users, we might say something like:

  • Tell me about the last time you captured feedback on behalf of a customer, or
  • On a scale of 1-10, how happy are you with your current feedback process?

These screener questions help us make sure we’re interviewing people who face the problems we intend to solve with our products.

5. We Continue to Gather Data Until We Have What We Need

Gathering feedback in the early stages as well as during development helps us validate existing ideas, come up with new ones, and improve our products and features.

Our goal is to always be talking to people. By listening to our customers, we make sure we’re building products that meet their needs and solve problems that matter to them. We run six-week development periods followed by two-week cool-down periods. While most of the research happens in those two weeks, it isn’t limited to them; we talk to customers every day, even during development. In short, continuously searching for participants, reaching out to them via email, and screening them helps us find the right people for product validation.

Jared Shaffer

Product Manager