1. Simplicity of Language
Questions should be written in plain English that is easy to understand, and any terms used must be familiar to the people taking the survey. A question such as “Does your boss engage you in interactive dialogue?” is useless, since very few people know what is meant by “interactive dialogue.”
2. No Ambiguity
Questions must be clear and precise, not open to several interpretations. The survey should not use words such as “often,” “usually,” or “normally.” These terms mean different things to different people: what is “normal” for one person may not be normal at all for another. The following item is a poor question because it is ambiguous: “Our customers normally do not complain very much.”
3. High Level of Confidence
The results of a survey are worthless unless you can be assured that they are statistically accurate. This is expressed as the confidence level. Most surveys aim for a confidence level of .95 or higher, meaning that if the survey were repeated, the results would fall within the stated margin of error 95 times out of 100. The confidence you can claim depends on the number of people surveyed, and there are precise formulas for calculating it from your sample size. Surveying only 30 of your 700 customers will leave you with a wide margin of error and meaningless results.
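The arithmetic behind this point can be sketched with the standard sample-size formula for a proportion, including the finite-population correction. The 95% confidence target (z = 1.96), the ±5% margin, and the function names below are illustrative assumptions, not part of the original text:

```python
import math

def required_sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Sample size needed for a proportion at confidence z and the given
    margin of error, adjusted for a finite population."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

def margin_of_error(n, population, z=1.96, p=0.5):
    """Margin of error actually achieved by a sample of n drawn from a
    finite population (worst case, p = 0.5)."""
    fpc = math.sqrt((population - n) / (population - 1))
    return z * math.sqrt(p * (1 - p) / n) * fpc

# The article's example: a base of 700 customers, 95% confidence.
print(required_sample_size(700))           # 249 respondents for a ±5% margin
print(round(margin_of_error(30, 700), 3))  # a sample of 30 gives roughly ±0.175
```

In other words, under these assumptions you would need roughly 250 of the 700 customers for useful precision; 30 responses leave a margin of error of about ±17.5 percentage points.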
4. A Carefully Chosen Sample
In addition to having the right number of people in the sample, you must also choose the right mix of people. There are a number of sampling procedures that may be used, depending on what you are trying to accomplish: samples can be selected at random, by cross section, stratified according to specific criteria, and so on. You must know which one to use, when to use it, and how it will affect the results of the study.
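One of the procedures mentioned above, stratified sampling, can be sketched as follows. The customer base, the "retail"/"wholesale" split, and the function names are hypothetical illustrations; note too that proportional allocation by rounding can drift by a seat or two when there are many strata:

```python
import random
from collections import defaultdict

def stratified_sample(people, stratum_key, sample_size, seed=0):
    """Draw a sample whose strata proportions mirror the population's."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for person in people:
        strata[stratum_key(person)].append(person)
    sample = []
    for members in strata.values():
        # Allocate seats to each stratum in proportion to its size.
        share = round(sample_size * len(members) / len(people))
        sample.extend(rng.sample(members, min(share, len(members))))
    return sample

# Hypothetical base of 700 customers: 490 retail, 210 wholesale.
customers = ([{"id": i, "type": "retail"} for i in range(490)] +
             [{"id": i, "type": "wholesale"} for i in range(490, 700)])
picked = stratified_sample(customers, lambda c: c["type"], 100)
print(len(picked))  # 100 in total: 70 retail + 30 wholesale
```

A simple random sample of 100 would usually land near the same 70/30 split, but stratifying guarantees it, which matters when a small stratum must not be under-represented.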
5. Selecting the Most Appropriate Question Format
Questions may be asked in a variety of formats, each of which has certain pros and cons. A person may be asked to rate something according to a certain scale, to check off their response for the choices that are given to them, to answer simply yes or no, or to write in how they feel about a certain issue. Each of these formats generates different amounts of information. Some methods are more powerful than others and will give you more complete, more accurate and more useful information than others. Deciding what kind of questions to use is an essential part of the questionnaire design process.
6. Selecting the Right Response Scale
Just as there are many ways to ask a question, there are several ways to have a subject respond. A person may be asked simply to agree or disagree with a statement, or to respond on a scale of five or even ten points. A scale that is too small yields limited information, while a scale that is too large can also be a problem if the person taking the survey cannot distinguish between ten or more degrees of difference.
7. Survey Administration
Once the survey has been designed, you must decide how to administer it. Will you mail it out, hand it out, administer it in person, or interview people by telephone? How will you encourage people to complete the survey, and how are they to return it to you? What is an acceptable response rate? There are factors to consider for each of these issues, and the best survey designers will explain them to their clients.
8. Scoring and Turnaround Time
How the survey will be scored and how long it will take to get the results (the turnaround time) can be important factors in your choice of a survey design and provider. Today most, if not all, surveys are scored by computer; at the very least they should be analyzed by computer. This gives you the fastest turnaround time and the greatest degree of sophistication in data analysis. Large surveys probably ought to be scanned into computers. In general, the faster the turnaround time, the better: waiting several months for results may mean that conditions have changed in the interim.
9. Deciding on the Right Questions
A survey is only as good as the questions it asks, so one of the most important decisions is determining what questions to ask in the first place. Some issues are straightforward and the questions are self-evident. However, survey developers must avoid the mistake of asking only what someone inside the organization thinks are the important questions. For example, asking the sales department to develop the questions for a customer service survey is not a good idea; a much better approach is to ask the customers themselves what they think the important issues are. Focus groups are a great help in such cases.
10. What Kind of Report Will Be Produced?
Survey results are usually reported in the form of tables, graphs, and written comments. You must decide how much complexity and detail you want and can use. A 500-page report with hundreds of tables and graphs can look impressive but be virtually useless if no one reads it, understands it, or knows how to use it. A good rule of thumb is to strive for simplicity, clarity, and brevity. Decisions about the report should also take into account who is going to read it, who is going to use it, and how the information will be disseminated to others in the organization and to the people who took part in the survey.