by Sarah Hellesen
Evaluation, like research, often revolves around asking questions. How can our program reach more people? What percentage of youth in our jurisdiction smoke or vape? How can our coalition improve?
One of the most common methods for answering questions like these is to conduct surveys, such as public opinion polls or coalition satisfaction surveys.
It seems simple at first—ask a question, get an answer. But there’s a lot more that goes into answering a survey question than you might realize.
Over the summer, I wanted to learn more about this process, so I took two online courses on writing and evaluating good survey questions. The first, “Designing Quality Survey Questions,” was a two-day e-study offered through the American Evaluation Association and taught by Sheila B. Robinson and Kimberly F. Leonard. The second, a four-hour online workshop taught by Paul D. Umbach of Percontor, LLC, was titled “Writing and Evaluating Good Survey Questions.” Both courses asked participants to put themselves in the shoes of someone responding to a survey.
For example, let’s say a researcher asks you this question:
“How many alcoholic drinks have you consumed in the last month?”
To answer the question, first you have to understand it. What information is the researcher trying to learn? When they say “last month,” do they mean the last thirty days? Or just the month prior to this one?
Then, you must retrieve the necessary information from your memory. You need to think back over the last month and try to recall the days on which you drank alcohol, and how many drinks you had each day.
After that, you must add all these instances up, and decide if the answer you’re going to give is accurate. But maybe one of the days in the month was a holiday or celebration, and you consumed many more drinks than you normally would. Do you include those in your response, or leave them out? Maybe you feel that the person asking the question will judge you negatively for your response, so you deliberately give a lower or higher number.
Finally, after all of this, you have to candidly report the information back to the person asking.
At each of these stages, there is the potential that the accuracy of the reported information will be affected. There is also the potential that the person will lose motivation to continue answering the survey. With that in mind, when designing a survey, you should aim to lighten the cognitive load on your respondents as much as possible.
One way to do this is to avoid double-barreled questions. These are questions that ask about more than one thing but allow for only one answer. “Was the training interesting and useful? Yes or No” is an example of a double-barreled question, because it asks about two separate variables (“interesting” and “useful”) while only allowing one answer. This question, then, would be better split into two separate questions.
Another thing to consider when designing a survey is cultural differences between the people writing the survey and those answering it. For example, if a survey asks about “traditional tobacco” to determine rates of conventional tobacco product use, the question may mean something different to a member of an indigenous tribe than what the survey designer intended. These considerations only scratch the surface of survey design.
But when it comes to evaluating your surveys, Umbach offers three standards that all survey questions should meet:
- Content standards: Are the questions asking about the right things?
- Cognitive standards: Do respondents understand the questions consistently? Do they have the information required to answer them? Are they willing and able to formulate answers to the questions?
- Usability standards: Can respondents complete the questionnaire easily and as intended?
He also gives these general guidelines for writing survey questions, whether open, closed, attitudinal, or factual:
- Use words that virtually all respondents will understand
- Make the questions as specific as possible
- Keep the question structure simple
- Be aware of contextual effects
- Write questions with respondent motivation and cognitive effort in mind
Remember, TCEC is here to help! Check out our data collection instruments database for surveys we’ve tested, and our resources on designing and conducting surveys. Feel free to contact us with any questions!