
What is a Data Collection Survey?


Quality Glossary Definition: Survey

Variations: questionnaire, e-survey, telephone interview, face-to-face interview, focus group

A survey is defined as the act of examining a process or questioning a selected sample of individuals to obtain data about a service, product, or process. Data collection surveys collect information from a targeted group of people about their opinions, behavior, or knowledge. Common types of surveys include written questionnaires, face-to-face or telephone interviews, focus groups, and electronic (e-mail or Web site) surveys.

Surveys are a valuable data collection and analysis tool commonly used with key stakeholders, especially customers and employees, to discover needs or assess satisfaction.

When to Use Surveys to Collect Data

It is helpful to use surveys when:

  • Identifying customer requirements or preferences
  • Assessing customer or employee satisfaction, such as identifying or prioritizing problems to address
  • Evaluating proposed changes
  • Assessing whether a change was successful
  • Monitoring changes in customer or employee satisfaction over time

How to Administer a Survey

  1. Determine what you want to learn from the survey and how you will use the results.
  2. Determine who should be surveyed by identifying the population group. If it is too large to permit surveying everyone, decide how to obtain a sample. Decide what demographic information is needed to analyze and understand the results.
  3. Determine the most appropriate type of survey.
  4. Determine whether the survey’s answers will be numerical rating, numerical ranking, yes-no, multiple choice or open-ended, or a mixture.
  5. Brainstorm questions and, for multiple choice, the list of possible answers. Keep in mind what you want to learn, and how you will use the results. Narrow down the list of questions to the absolute minimum that you must have to learn what you need to learn.
  6. Print the questionnaire or interviewer's question list.
  7. Test the survey with a small group. Collect feedback.
    • Which questions were confusing?
    • Were any questions redundant?
    • Were answer choices clear? Were they interpreted as you intended?
    • Did respondents want to give feedback about topics that were not included? (Open-ended questions can be an indicator of this.)
    • On average, how long did it take respondents to complete the survey?
    • For a questionnaire, were there any typos or printing errors?
    Also test the process of tabulating and analyzing the results. Is it easy? Do you have all the data you need?
  8. Revise the survey based on test results.
  9. Administer the survey.
  10. Tabulate and analyze the data. Decide how you will follow through. Report results and plans to everyone involved. If a sample was involved, also report and explain the margin of error and confidence level.
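When reporting results from a sample, the margin of error for a proportion can be computed with the standard formula. The sketch below is one minimal, hypothetical illustration (the function name, example counts, and the 95% z-value of 1.96 are assumptions, not part of the original text); it includes the optional finite population correction used when the sample is a large fraction of the population.

```python
import math

def margin_of_error(p_hat, n, z=1.96, population=None):
    """Margin of error for a sample proportion.

    p_hat: observed proportion (e.g., fraction of respondents satisfied)
    n: sample size
    z: critical value (1.96 corresponds to ~95% confidence)
    population: if given, apply the finite population correction
    """
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    if population:
        # Finite population correction: shrinks the error when the
        # sample covers a substantial share of the population.
        se *= math.sqrt((population - n) / (population - 1))
    return z * se

# Hypothetical example: 120 of 200 respondents satisfied,
# drawn from a population of 1,000 customers.
moe = margin_of_error(0.6, 200, population=1000)
print(f"60% satisfied, +/-{moe * 100:.1f} points at 95% confidence")
```

Larger samples shrink the margin of error, which is one practical reason step 2 (deciding how to obtain the sample) matters before the survey is administered.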

Survey Considerations

  • Conducting a survey creates expectations for change in those asked to answer it. Do not administer a survey if action will not or cannot be taken as a result.
  • Satisfaction surveys should be compared to objective indicators of satisfaction, such as buying patterns for customers or attendance for employees, and to objective measures of performance, such as warranty data in manufacturing or re-admission rates in hospitals. If survey results do not correlate with the other measures, work to understand whether the survey is unreliable or whether perceptions are being modified by the organization’s actions.
  • Surveys of customer and employee satisfaction should be ongoing processes rather than one-time events.
  • Get help from a research organization in preparing, administering and analyzing major surveys, especially large ones or those whose results will determine significant decisions or expenditures.
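Checking whether satisfaction scores correlate with objective indicators, as suggested above, can be as simple as computing a correlation coefficient between the two series. The sketch below uses entirely hypothetical numbers (the monthly scores and repeat-purchase rates are invented for illustration):

```python
# Hypothetical data: monthly average satisfaction score (1-5 scale)
# paired with the repeat-purchase rate for the same month.
satisfaction = [3.8, 4.1, 3.5, 4.4, 4.0, 3.2]
repeat_rate = [0.61, 0.66, 0.55, 0.71, 0.64, 0.50]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson_r(satisfaction, repeat_rate)
# Values near +1 suggest the survey tracks real behavior; values near 0
# suggest the survey may be unreliable or measuring something else.
print(f"r = {r:.2f}")
```

A weak correlation does not by itself say which measure is at fault, which is why the guidance above recommends investigating whether the survey is unreliable or whether perceptions are lagging behind the organization's actions.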

Adapted from The Quality Toolbox, Second Edition, ASQ Quality Press.
