October 24, 2012 | Volume 6 | Number 9
By Amy M. Garczynski & Matthew J. Grawitch, PhD
“How successful are you?” This is a question that might seem relatively straightforward. Yet, how you rate your level of success can actually be affected by the way response options are presented to you. Believe it or not, 34 percent of people report high life success when they respond on a -5 to +5 scale, but that number drops to 13 percent when they respond on a 0 to 10 scale (Schwarz, Knauper, Hippler, Noelle-Neumann, & Clark, 1991).
Self-report surveys are an important way for employers to obtain data. They are commonly used to measure everything from well-being to attitudes toward work. People often think of these self-reports as “measurement instruments” that accurately assess information. However, even minor changes in the framing of items and response options can drastically alter responses (Schwarz, 1999). As such, it is important for survey designers and administrators to be aware of these framing issues and effective strategies for resolving them. This article addresses three pitfalls that can have a big impact on survey responses and suggests several solutions to these problems.
Biased responding. Biased responding occurs when an employee does not answer survey items truthfully. Response bias is the tendency to give the answer one believes the questioner wants to hear rather than the true answer. Response biases can occur for several reasons. First and foremost, they can arise when participants fear retribution for answering truthfully. For instance, employees may distort their responses to a question asking them to evaluate a boss out of fear of being passed over for a promotion.
Additionally, biased responding can occur out of a desire to respond in a socially desirable way. Social desirability is a tendency to over-report good behavior and under-report bad behavior (Crowne & Marlowe, 1960). Hence, social desirability can occur when respondents feel there is a specific answer the survey designers are looking for, and they are motivated to provide that correct answer. Furthermore, social desirability can occur when survey participants seek to provide responses that present them in the best light possible. Regardless of whether respondents experience a fear of retribution, a desire to provide the “right” answer, or a desire to present themselves in the best light possible, biased responding can have an adverse effect on the validity of research results.
Framing past events. If your survey asks about events that occurred in the past, such as reporting stress from the past month, it is important to acknowledge factors that may affect the accuracy of responses you receive. For instance, if the behaviors being reported are uncommon or if they occur at random, responses will be less accurate (Schwarz, 1999).
In addition, responses tend to be more extreme, in both the positive and negative directions, when items are framed in the past tense rather than the present tense (e.g., Garczynski & Brown, 2012; Parkinson, Briner, Reynolds, & Totterdell, 1995). Hence, past tense framing can produce a substantially stronger positivity or negativity bias than present tense framing. Though there are many legitimate reasons to frame items in the past tense, survey designers need to consider fully what that framing implies for the responses they will receive.
Response options. One thing many survey developers overlook is that the response options given to participants shape their interpretation of an item. As the example at the opening of this article illustrates, the numeric scaling of responses affects how a question is interpreted. When participants report success on a -5 to +5 scale, they assume a bipolar construct in which negative numbers indicate failures and positive numbers indicate successes. When there are no negative numbers, as with a 0 to 10 scale, participants assume a unipolar construct that assesses degrees of success without taking failures into account.
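It is worth noting that the two scales in the opening example are numerically interchangeable, which makes the difference in responses purely an effect of framing. A minimal sketch of that equivalence (the function name is ours, for illustration only):

```python
def bipolar_to_unipolar(rating: int) -> int:
    """Linearly map a -5..+5 rating onto a 0..10 scale (x' = x + 5).

    Although the two scales are arithmetically equivalent, Schwarz et al.
    (1991) found respondents treat them differently: negative anchors
    suggest a bipolar construct (failure vs. success), while a 0..10
    scale reads as degrees of success only.
    """
    if not -5 <= rating <= 5:
        raise ValueError("rating must be between -5 and +5")
    return rating + 5
```

For example, a response of -5 corresponds to 0 and a response of +5 to 10, yet the proportion of people reporting "high" success differs sharply between the two formats, so the scales cannot be treated as substitutes.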
In addition, it is important to consider whether to use open-ended, fill-in-the-blank responses or closed responses, such as lists of choices. With open-ended questions, respondents are more likely to forget or omit relevant information, whereas they may readily endorse the same information when it appears in a response list. The drawback of a list of options, however, is that participants are less likely to write in alternatives the list omits (Schwarz & Hippler, 1991).
Though the three survey design issues presented above can degrade the quality of the data a survey collects, awareness of them is an important first step toward safeguards that minimize poor-quality data. There are tactics you can use to protect your surveys from biased responding, from the positivity/negativity bias that past tense framing produces, and from inappropriate response framing.
In closing, the best advice is to take care when writing surveys. Because surveys are fairly easy to build, all too often they are written by employees who lack expertise in item wording, response framing, and survey methodology. Yet survey design is both a science and an art, and care must be taken to develop an appropriate tool and process for obtaining valid data.
The accuracy of survey results can be affected by minor factors in survey framing. If employers are aware of factors that can influence responding and take appropriate steps to ensure effective survey design and implementation, they may be able to address potential pitfalls before they have a chance to negatively affect the validity of survey results.
Crowne, D. P., & Marlowe, D. (1960). A new scale of social desirability independent of psychopathology. Journal of Consulting Psychology, 24, 349-354.
Garczynski, A. M., & Brown, C. M. (2012). Reframing rejection in the past tense impacts perceptions of self worth. Unpublished manuscript.
Parkinson, B., Briner, R. B., Reynolds, S., & Totterdell, P. (1995). Time frames for mood: Relations between momentary and generalized ratings of affect. Personality and Social Psychology Bulletin, 21, 331-339.
Schwarz, N. (1999). Self-reports: How the questions shape the answers. American Psychologist, 54, 93-105.
Schwarz, N., & Hippler, H. J. (1991). Response alternatives: The impact of their choice and ordering. In P. Biemer, R. Groves, N. Mathiowetz, & S. Sudman (Eds.), Measurement Error in Surveys (pp. 41-56). Chichester, England: Wiley.
Schwarz, N., Knauper, B., Hippler, H. J., Noelle-Neumann, E., & Clark, F. (1991). Rating scales: Numeric values may change the meaning of scale labels. Public Opinion Quarterly, 55, 570-582.