Survey Mode and Questionnaire Design

Joe Ripberger

Review

Sources of Total Survey Error (TSE)

  • Specification (validity) error: difference between the concept the researcher intends to measure and the construct actually captured by the survey question
  • Measurement error: difference between the value a survey question records and the true value for the respondent, due to question wording, interviewer effects, mode, recall, or response biases

Survey Mode

Mode Effects

  • Survey mode (mode of data collection): the way questions are delivered to respondents and how their answers are returned to the researcher
    • Example: face-to-face, telephone, mail, web, or mixed-mode
  • Mode effects: differences in how people answer survey questions that arise because of the mode of data collection rather than their true attitudes or behaviors; can stem from multiple sources

Sources of Mode Effects

  • Sampling frame / design: different modes rely on different frames (e.g., RDD for phone vs. address lists for mail)
  • Coverage: some groups are more or less reachable in a given mode (e.g., households without internet for web surveys)
  • Nonresponse: response rates differ across modes (e.g., mail surveys often have lower response rates than face-to-face surveys)
  • Measurement quality: mode may influence how accurately attitudes and behaviors are reported
    • Social desirability bias: stronger in interviewer-administered surveys
    • Data completeness: item nonresponse tends to be lower in interviewer-administered surveys, higher in self-administered surveys
  • Response effects: differences in how people use response options depending on the mode
  • Order effects: tendency to choose the first or last option presented (primacy, choosing the first option, is more common in self-administered visual modes; recency, choosing the last option, in interviewer-administered aural modes)
    • Acquiescence: tendency to agree with statements regardless of content (may vary by mode)
    • Extremeness: tendency to select extreme categories on rating scales (may vary by mode)
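Two of these measurement differences, item nonresponse and extremeness, leave footprints that can be quantified once data are collected. A minimal sketch of that bookkeeping, using entirely hypothetical mode labels, a 1–5 rating scale, and `None` coded as item nonresponse:

```python
# Hypothetical respondent records: collection mode plus answers
# on a 1-5 scale, with None marking item nonresponse.
responses = [
    {"mode": "phone", "answers": [5, 4, None, 5]},
    {"mode": "phone", "answers": [4, 4, 4, 5]},
    {"mode": "web",   "answers": [3, None, None, 2]},
    {"mode": "web",   "answers": [1, 5, 1, 5]},
]

def rates_by_mode(records, scale_min=1, scale_max=5):
    """Return per-mode item-nonresponse and extreme-response rates."""
    stats = {}
    for r in records:
        s = stats.setdefault(r["mode"], {"items": 0, "missing": 0, "extreme": 0})
        for a in r["answers"]:
            s["items"] += 1
            if a is None:
                s["missing"] += 1
            elif a in (scale_min, scale_max):
                s["extreme"] += 1
    return {
        mode: {
            # Share of all items left unanswered in this mode
            "item_nonresponse": s["missing"] / s["items"],
            # Share of answered items at either endpoint of the scale
            "extremeness": s["extreme"] / (s["items"] - s["missing"]),
        }
        for mode, s in stats.items()
    }

print(rates_by_mode(responses))
```

With these toy records, the web respondents show both more missing items and more endpoint answers than the phone respondents; in real comparisons the direction and size of such gaps is an empirical question, not a given.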

Mode Comparisons

Note

Most systematic comparisons (studies that directly compare answers across modes, holding everything else constant) find that the average differences are often small to modest relative to other sources of survey error.

But, mode effects do exist, especially for sensitive questions (e.g., reporting income, voting, or stigmatized behaviors), where interviewer presence can lead to social desirability bias.

Questionnaire Design

Theory

  • Questionnaire design begins with the identification of concepts—the abstract ideas or phenomena that researchers seek to understand and measure
  • These concepts are typically embedded within broader theories that specify how and why concepts are related

Questionnaire design is often an exercise in translating theory into measurement—moving from abstract concepts and hypothesized relationships to concrete questions that can be observed and analyzed

Example

  • RQ1: To what extent do members of the public support research and development on fusion energy?
  • RQ2: What factors explain public support or opposition to fusion energy research and development?

Survey Response Process

  1. Perception: respondents see or hear the question and attend to its words or visual layout
  2. Comprehension: they interpret what the question means and what kind of information is being requested
  3. Retrieval: they recall relevant information from memory needed to form an answer
  4. Judgment: they evaluate or summarize the recalled information to arrive at an answer that fits the question’s intent
  5. Response: they map that answer onto the available response options and record or report it

Errors can occur at any stage, leading to measurement error

Shortcuts in the Survey Response Process

Respondents do not always engage in the full cognitive process when answering survey questions; instead, they sometimes rely on shortcuts that simplify the task:

  • Satisficing: giving an answer that is satisfactory but not fully thought out
  • Acquiescence: agreeing with statements regardless of content (“yea-saying”)
  • Non-differentiation: choosing the same response for a series of items (e.g., straight-lining)
  • Primacy and recency: selecting the first or last response option seen or heard
  • Heuristic responding: relying on general impressions or social norms instead of memory or reasoning

A key objective in questionnaire design is to prevent or limit these mistakes and shortcuts

Exercise: Survey Questionnaire Cheat Sheet

Goal: distill key lessons from today’s readings into a one-page “cheat sheet” you can use when designing and evaluating survey questions.

Groups

  • Group 1: Ben, Vanessa, Alexis, Joy
  • Group 2: Nate, Lauren, Bulbul, Charlie
  • Group 3: Anna, Riley, Laken

Instructions

  1. Review the Readings
    • Focus on:
      • Conceptualization and operationalization
      • The survey response process
      • Common sources of measurement error
      • Strategies to reduce satisficing and improve data quality
  2. Create Your Cheat Sheet
    • Summarize key rules, tips, and red flags for writing good questions
    • Keep it concise and practical
  3. Add Examples
    • For each principle, include one good or bad example
    • Draw from readings or personal experience
  4. Prepare to Share
    • Highlight your three most important rules/tips (2–3 minutes per group)

Deliverable: one-page summary (handwritten or digital) capturing the key do’s and don’ts of questionnaire design.