Ask for Raw Data Rather Than a Calculation

Let’s consider a common situation where you want to report on a numeric finding that has at least two components. For example, you might want to report a benchmark for the ratio of practicing lawyers in a law department or law firm to all other staff. On your survey questionnaire, should you ask for each element of the ratio separately and do the math yourself, or should you ask respondents to provide the ratio directly after you give them clear instructions for how to do the calculation?
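
To make the trade-off concrete, here is a minimal Python sketch, with hypothetical field names and made-up answers, of what you gain by collecting the two raw counts: you can validate each component and compute the ratio consistently yourself, rather than trusting every respondent’s arithmetic.

```python
# Sketch: compute the lawyer-to-staff ratio from raw survey fields rather than
# asking respondents to calculate it. Field names and numbers are hypothetical.

def lawyer_to_staff_ratio(practicing_lawyers, all_other_staff):
    """Return the ratio of practicing lawyers to all other staff, or None
    if the inputs cannot support a meaningful ratio."""
    if practicing_lawyers is None or all_other_staff is None:
        return None                      # respondent skipped a component
    if practicing_lawyers < 0 or all_other_staff <= 0:
        return None                      # flag for follow-up rather than divide by zero
    return practicing_lawyers / all_other_staff

responses = [
    {"practicing_lawyers": 40, "all_other_staff": 25},
    {"practicing_lawyers": 12, "all_other_staff": 0},   # needs follow-up
]
ratios = [lawyer_to_staff_ratio(r["practicing_lawyers"], r["all_other_staff"])
          for r in responses]
print(ratios)   # [1.6, None]
```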

Root Out Respondent Bias in Your Survey

Online surveys, like any other research method, need to recognize and deal with various kinds of bias. The term has a technical meaning here: artifacts that skew your data in some way, not the colloquial meaning of a person showing improper prejudice or prejudgment. Here are some potential sources of bias in online surveys, at the level of the survey overall rather than of specific questions and analytic steps:

Sampling bias: This occurs when the collection of respondents you obtain (the sample) is not representative of everyone you want to study (the population).
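
As a rough illustration, here is a Python sketch, with made-up numbers, of one simple check for sampling bias. It assumes you know the population’s breakdown from some outside source, such as a directory or membership roster.

```python
# Sketch: compare the sample's makeup to a known population breakdown.
# Group names and figures are illustrative only.

population_share = {"law department": 0.40, "law firm": 0.60}
sample_counts    = {"law department": 180,  "law firm": 120}

total = sum(sample_counts.values())
for group, share in population_share.items():
    observed = sample_counts[group] / total
    print(f"{group}: sample {observed:.0%} vs. population {share:.0%} "
          f"(difference {observed - share:+.0%})")

# Large gaps suggest the sample over- or under-represents a group; you may
# need to weight responses or recruit more respondents from the thin group.
```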

See Which Questions Have Correlated Responses

At times, several questions in an online survey aim to tap a common underlying concept that interests the sponsor. Respondents’ answers to these related questions should be internally consistent; that is, the responses should correlate strongly and positively. Surely there is a statistical tool to evaluate the degree of correlation! Yes: Cronbach’s alpha measures internal consistency, or reliability, and tells you how strongly the responses to a group of questions correlate.
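
For readers who want to compute it themselves, here is a short Python sketch of the standard Cronbach’s alpha formula, applied to made-up answers on a 1-to-5 scale.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items array of numeric answers."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                             # number of related questions
    item_vars = scores.var(axis=0, ddof=1)          # variance of each question
    total_var = scores.sum(axis=1).var(ddof=1)      # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Five respondents answering three related 1-to-5 questions (made-up data).
answers = [[4, 5, 4],
           [2, 2, 3],
           [5, 5, 5],
           [3, 3, 2],
           [4, 4, 5]]
print(round(cronbach_alpha(answers), 2))
# Values around 0.7 or higher are commonly read as acceptable internal consistency.
```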

Code Text Comments

It is good survey practice to ask for only a limited number of comments, such as, in an upward evaluation of managers, “Please write up to three suggestions for actions the manager should take more often.” Then create three text boxes for their suggestions. Text questions slow respondents and fatigue them; far better to concentrate on numeric answers. But when you do include a free-text opportunity, give thought to the several questions those comments raise.
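
One of those questions is how to code the comments into categories. The sketch below, in Python with a hypothetical keyword codebook, shows a crude first pass at assigning categories; it is no substitute for a human read of borderline comments.

```python
# Sketch: keyword-based coding of free-text suggestions. The codebook
# categories and keywords are hypothetical and should be tuned to your survey.

CODEBOOK = {
    "communication": ["communicate", "update", "listen", "feedback"],
    "delegation":    ["delegate", "trust", "hand off"],
    "recognition":   ["praise", "recognize", "credit", "thank"],
}

def code_comment(comment):
    """Return the categories whose keywords appear in the comment."""
    text = comment.lower()
    return [category for category, keywords in CODEBOOK.items()
            if any(word in text for word in keywords)] or ["uncoded"]

suggestions = [
    "Give more frequent feedback on drafts.",
    "Delegate the routine filings instead of doing them personally.",
    "Start meetings on time.",
]
for s in suggestions:
    print(code_comment(s), "-", s)
```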

Push Respondents to Discriminate among Selections

Here are three varieties of questions that encourage participants to take a stand on a set of selections.

Best-worst questions: Many surveys include questions that ask respondents to choose among several selections at once, picking out of a large number of selections a small number of the best and worst options, the most important and least important, or whatever opposing criteria apply. For example, one survey listed 22 characteristics on which participants rated their manager.
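
A common way to score best-worst answers is to count how often each item was picked as best and subtract how often it was picked as worst. Here is a minimal Python sketch with made-up responses and a short list of illustrative characteristics.

```python
# Sketch: best-minus-worst scoring of best-worst answers. Items and picks are made up.
from collections import Counter

responses = [
    {"best": ["responsive", "fair"], "worst": ["delegation"]},
    {"best": ["fair"],               "worst": ["communication"]},
    {"best": ["responsive"],         "worst": ["delegation", "communication"]},
]

best_counts  = Counter(item for r in responses for item in r["best"])
worst_counts = Counter(item for r in responses for item in r["worst"])

items = set(best_counts) | set(worst_counts)
scores = {item: best_counts[item] - worst_counts[item] for item in items}

# Higher scores mean an item was chosen as "best" more often than as "worst".
for item, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{item:15} {score:+d}")
```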