Closed-Ended Survey

In Brief

Closed-ended surveys help you converge, in detail, on what matters most to customers or prospects. They are designed to produce structured quantitative data, which lends itself to statistical techniques: each question comes with a preselected set of possible answers. They also help you explore categories of data, for example when investigating segmentation.

Helps Answer

  • What is the breakdown of client concerns/problems/preferences, as a percentage of all clients in a segment?
  • How does X vary with Y (e.g., how many part-time students find advanced calculus challenging)?
  • What patterns emerge over time (if repeated over time)?
  • Ranking questions like: What is the order of priority? Who/what is the best option?
  • What provides the most satisfaction?

Tags

  • Quantitative
  • Analytical
  • Convergent

Description

Closed-ended surveys consist of closed-ended questions only. These types of surveys are most useful for exploring "known unknowns." Typically, this means that:

  1. You've tried exploring what you don't know.
  2. You've already chosen your direction, and
  3. There are still holes in your knowledge.

For example, a startup founder who has achieved problem-solution fit and has performed some smoke testing around key "happy case" assumptions could use a closed-ended survey to prioritize or discover other issues.

As the survey giver, you are interested in finding patterns in the answers. By focusing in depth on a particular group of people, you hope to uncover relationships hidden in the data you gather.

Here are a few examples of closed-ended questions:

  • How do you feel on a scale of 1 to 10 (10 highest)?
  • Are you pregnant? yes/no
  • What's your blood type?
    • A
    • B
    • AB
    • O

These surveys can be delivered offline (expert with clipboard), online (popup form), or as a hybrid (iPad at a conference).

Time Commitment and Resources

Varies significantly with the survey length and methodology chosen. It can take an hour to configure an exit survey on a website, or it can take weeks to perform a large-scale, in-person survey and manually enter the data into a useful format.

How to

  1. Be very clear, up front, about what you want to learn when designing the questionnaire. Usually this goal will flow from your current overarching business goal. Ideally, you should be able to write out the goal of the survey in one sentence. For example, a survey would have very different questions for each of these goals:

    • Discovering user problems
    • Improving an existing product/UX
    • Keeping track of user satisfaction (as a proxy for referrals)
    • Improving customer service
  2. Formulate questions you'd like respondents to answer. As a rule of thumb, only include questions that you think will result in specific actions. For example, if you were surveying attendees after an event, how would you use the data you gather to inform your decisions on catering, lodging, transportation, registration, event activities/workshops, and speakers? Types of questions include (a sketch of how these might be encoded follows this list):
    • Dichotomous: Do you have private health insurance? yes/no
    • Likert-type scale: To what extent are you satisfied?
      • 100 percent
      • 75 percent
      • 50 percent
      • 25 percent
      • 0 percent
    • List of items: My favorite food group is:
      • Grains
      • Meat
      • Fruit and Vegetable
      • Milk
    • Ordinal: Sort the following according to the order of importance to you:
      • Price
      • Speed
      • Quality
    • Matrix question: Please rate the following company divisions with respect to knowledge of MS PowerPoint on a scale of 1 to 5:
      • Marketing
      • Product
      • Operations
      • Sales
      • IT
      • Finance
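
To make the question types above more concrete, here is a minimal sketch, in Python, of how such a questionnaire might be defined as structured data before it goes into a survey tool. The Question class and its fields are purely illustrative assumptions, not part of the method itself.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Question:
    """One closed-ended question with a fixed, preselected set of answer options."""
    text: str
    kind: str                      # "dichotomous", "likert", "list", "ordinal", or "matrix"
    options: List[str]
    rows: List[str] = field(default_factory=list)  # only used by matrix questions

# Illustrative questionnaire built from the example questions above.
questionnaire = [
    Question("Do you have private health insurance?", "dichotomous", ["yes", "no"]),
    Question("To what extent are you satisfied?", "likert",
             ["100 percent", "75 percent", "50 percent", "25 percent", "0 percent"]),
    Question("My favorite food group is:", "list",
             ["Grains", "Meat", "Fruit and Vegetable", "Milk"]),
    Question("Sort the following in order of importance to you:", "ordinal",
             ["Price", "Speed", "Quality"]),
    Question("Rate each division's knowledge of MS PowerPoint (1 to 5):", "matrix",
             ["1", "2", "3", "4", "5"],
             rows=["Marketing", "Product", "Operations", "Sales", "IT", "Finance"]),
]

for q in questionnaire:
    print(f"{q.kind:12s} {q.text}")
```

Keeping every answer option explicit, rather than allowing free text, is what makes the responses easy to tabulate and compare later.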

3. Collect the answers. This can be done face-to-face, by phone, online, or via postal mail.

4. Organize the answers into a useful format. Typically a spreadsheet is good enough. The data can then be loaded into a statistical tool, like R, or into a database for further inquiry.
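
As a rough sketch of this step, assuming a CSV export with one row per respondent and one column per question (the file name, column layout, and use of pandas/SQLite are assumptions for illustration):

```python
import sqlite3

import pandas as pd

# Hypothetical export: one row per respondent, one column per question.
responses = pd.read_csv("survey_responses.csv")

# Quick sanity checks before any analysis.
print(responses.shape)          # number of respondents and number of questions
print(responses.head())         # spot-check a few rows
print(responses.isna().sum())   # unanswered questions per column

# Optionally, push the table into a database for further inquiry.
with sqlite3.connect("survey.db") as conn:
    responses.to_sql("responses", conn, if_exists="replace", index=False)
```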

Interpreting Results

  • Use simple statistical techniques like correlation, ANOVA, or regression for further insight; a short sketch follows this list. You can also hire a data scientist to help interpret your results.
  • Be sure that you have enough respondents to be able to use rigorous tools. Check your sample size as a whole, but also for each subgroup. If you don't have enough within a segment, you can't make a statistically informed statement about that group of people.
  • Based on your responses, check to see if any of the questions were too obviously worded or confusing.
  • Look at overall scores. Compare average answers to a benchmark or predetermined expectations.
  • Also look at the distributions of responses. Are they normally distributed? Skewed? A power law distribution (e.g., 80/20)?
  • Create a visual summary of the results.
  • Take action!
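
Continuing the hypothetical table from the "How to" section, the sketch below illustrates a few of the checks above: per-segment sample sizes, a simple correlation, averages against expectations, and distribution shape. The column names ("segment", "satisfaction", "weekly_usage_hours") are invented for illustration.

```python
import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # hypothetical export from step 4

# Sample size per segment: subgroups that are too small can't support
# statistically informed statements.
print(responses["segment"].value_counts())

# Simple correlation between two numeric questions.
print(responses["satisfaction"].corr(responses["weekly_usage_hours"]))

# Compare average answers per segment against a benchmark or expectation.
print(responses.groupby("segment")["satisfaction"].mean())

# Distribution shape: normally distributed, skewed, or concentrated (80/20)?
print(responses["satisfaction"].describe())
print(responses["satisfaction"].skew())
```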

Potential Biases

  • Avoid overly sensitive questions — you are unlikely to get a "true" response.
  • Avoid leading questions, which subtly prompt the respondent to answer in a particular way.
    • "Are you for or against an increase in tobacco tax rates?" (Non-leading)
    • "Are you in favor of increasing tobacco tax rates to protect our children's health?" (Leading)
  • Take into account that your respondents will often not want to admit to unsavory or socially undesirable behavior or preferences, particularly if they don't feel safe or the results aren't confidential.
  • Using emotionally loaded content can predictably skew results toward "yes" or "no," or cause respondents to abandon the survey if they don't identify with it. For example, "Where do you enjoy drinking beer?" assumes the respondent enjoys drinking beer, and would yield unpredictable results at an AA meeting.
  • While conducting surveys, never ask people what they would pay for. Usually they lie or are simply unaware.

Field Tips

  • Keep questions simple on closed questionnaires. For example, avoid hypotheticals.
  • Ask (and learn) one thing at a time. You can check for correlation and causation later.
  • Allow the respondent the option of answering with "not appropriate"/"don't know"/"have no strong feelings"; a small sketch of handling such answers follows below.
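
As a minimal sketch of the last tip (again with an invented column name), a "don't know" option can be offered as a valid choice in the form but excluded from numeric summaries:

```python
import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # hypothetical export

# Satisfaction was asked on a 1-to-10 scale with an explicit "don't know" option.
# Coerce non-numeric answers (such as "don't know") to NaN so they are left out
# of numeric summaries instead of distorting them.
scores = pd.to_numeric(responses["satisfaction"], errors="coerce")

print("answered:", int(scores.notna().sum()), "of", len(scores))
print("average satisfaction (excluding 'don't know'):", scores.mean())
```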

Case Studies

Tools

References
