I have been involved with many user surveys over the years. Some have gone well. Some have been a complete waste of time and effort. The main distinction between them: the surveys that were professionally developed and pretested succeeded, while the ones that had been knocked together by a well-meaning manager were often destined to fail.
It comes down to this: unless you have experience designing surveys, it’s best either to hire someone with the experience and training, or to find another way to collect the same information.
Proceeding with a flawed survey will just produce results tainted with bias and other data-warping horrors. Not something you want in your user research, eh?
Survey Design Tips
Now I know some of you are going to be forced into running surveys despite my advice. That’s okay; business is like that, and sometimes you have to compromise.
Better that you at least know the pitfalls and can correct them as need be. So here is a list of tips for online survey design that you may find invaluable:
Know What You Need to Achieve
This may sound really silly, but you have to know what the survey is for and what it needs to achieve before you start. Otherwise you will get sidetracked asking meaningless questions. For example, you may want to find out about user demographics, preferences, and the users’ needs and wants.
Keep it Short
There is nothing worse than an online survey that seems to go on forever, asking page after page of questions. When you are putting together questions, stop and think: can I get that information elsewhere? If you can, scrap the question. Do the same if the question is not directly related to the goal of the survey.
The point is to design a short, sharp survey someone can complete in under five minutes. I don’t know how many online surveys I have abandoned after about five minutes; it must be in the hundreds.
Tell Me How Many Pages
Your respondent’s time is valuable; they are doing you a huge favour by filling in your survey. Respect that. Just as you minimise the number of questions in the survey, it is also a good idea to let people know how many pages they have to complete. The best way to do this is to display an indicator of their progress. This will have a negative impact if your survey is too long, but a positive one if the survey is short.
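As a sketch of what such an indicator conveys (the function name and label format here are my own, not a prescribed implementation), progress is just the current page over the total page count:

```python
def progress_label(current_page: int, total_pages: int) -> str:
    """Return a 'Page X of Y' progress label for a survey page."""
    if not 1 <= current_page <= total_pages:
        raise ValueError("current_page must be between 1 and total_pages")
    # Percentage complete, rounded to a whole number for display.
    percent = round(100 * current_page / total_pages)
    return f"Page {current_page} of {total_pages} ({percent}%)"

print(progress_label(2, 5))  # Page 2 of 5 (40%)
```

Short surveys make this number reassuring; long ones make it a reason to abandon.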
Let’s go Back
Again this is a simple issue: let respondents navigate forward and backward within the survey (not via the browser’s back button) and review their answers, if they so desire. Remember, the respondent is doing you a favour; don’t make it hard for them.
It’s About the Design
At the end of the day, all the information you’re gathering about the users will be applied to the design of the web site, be that at a visual, informational or interactive level. It follows that when you include a question you should ask yourself: “Is the data collected going to influence the design of the site?” If the answer is no, remove the question. It’s not a hard and fast rule, but it is one you should consider.
Say No to Checkboxes
I know you might love them, but a checkbox is just a bipolar field: yes or no, on or off. You will get a higher response rate if you present a checkbox as a pair of radio buttons with explicit yes and no responses. It’s instantly clear to the user what response is required. You also have the advantage of taking up more visual space, which helps stop the question from being missed.
Say No to Select Lists
This is interesting: as I have discussed earlier, certain demographics have trouble realising that they can scroll down and pick the unseen items in a single-line select list. A good way to avoid this issue is to use radio buttons in a multiple-choice layout.
Add Some Other
When you put a list of multiple-choice items in a survey, how can you really be sure you have covered all the possible choices? I have always found it’s a good idea to allow for an “Other” option, with space for the respondent to fill in their alternative. You usually discover you have missed a few.
Default to No Response
It’s a simple thing, but it’s a good idea for the default state of any multiple-choice item to be “no response”, that is, with none of the fields selected. I would also consider adding a “not applicable” response or the like, mainly because there can be cases where the respondent has no experience with what you are asking.
Getting Likert Scales Right
Likert scales are those multiple-choice responses that go “Disagree 1 2 3 4 5 Agree”. These are very good at gathering information where there is going to be a distinct difference of opinion. However, the result of a Likert scale question is not a series of interval measurements; it is an ordinal scale running between two extremes (in this case, from disagreement to agreement). When using a Likert scale it’s a good idea to have a midpoint (an odd number of values) so respondents can express a genuinely neutral position.
Also, if you are measuring a very subjective issue, it’s a good idea to label every point on the scale with an equivalent label, to help remove bias and misinterpretation.
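To illustrate why the ordinal point matters, here is a minimal Python sketch with hypothetical response data: the mean treats the 1–5 codes as interval values and lands on a score nobody chose, while the median and mode stay on the scale respondents actually used.

```python
from statistics import mean, median, mode

# Hypothetical responses to a 5-point Likert item
# (1 = Disagree ... 5 = Agree). The sample is deliberately polarised.
responses = [1, 1, 5, 5, 5]

# The mean treats the codes as interval data and produces a value
# that no respondent selected, hiding the polarised pattern.
print(mean(responses))    # 3.4

# Ordinal summaries: the median and mode are actual scale points.
print(median(responses))  # 5
print(mode(responses))    # 5
```

With polarised data like this, reporting only the mean would suggest mild agreement when in fact opinion is split between the extremes.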
Pretest the Questions
Writing survey questions is something you have to do carefully. Respondents will attempt to interpret your questions and provide the information they think you are expecting to get. They will also try to determine how you are going to use their answers, and respond accordingly. This leads to bias in the results.
The way to avoid this is to pretest your questions. A pretest will tell you which questions are always going to be skipped, which give similar answers, and which are just confusing or misleading. Just as we user test designs, we also need to user test the survey. Ironic, really.
Stop Question Skipping
It’s simple: your respondents will skip a question if they don’t understand it, are confused, or are just plain bored with your overlong questionnaire. The solution is to keep the survey short and on topic, and ensure the questions are not confusing.
Multiple Choice Order
Ordering the multiple-choice responses is very important. Putting the responses in their natural order (high to low), or just listing them as you think of them, is dangerous: it presents bias to the respondent, who will select the response that looks like the one you want.
What you need to do is scramble the responses. Still, expect some respondents to lean towards the first or last items, as they see these as the important ones.
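A minimal sketch of per-respondent scrambling (the function, the respondent-id seeding, and the idea of pinning “Other” last are my own assumptions, not a prescribed method):

```python
import random

def scrambled_options(options, respondent_id, pinned=("Other",)):
    """Return options in a per-respondent random order, keeping any
    pinned options (e.g. "Other") at the end of the list."""
    to_shuffle = [o for o in options if o not in pinned]
    kept = [o for o in options if o in pinned]
    # Seeding with the respondent id keeps the order stable for that
    # respondent across page reloads, while differing between respondents.
    random.Random(respondent_id).shuffle(to_shuffle)
    return to_shuffle + kept

opts = ["Daily", "Weekly", "Monthly", "Rarely", "Other"]
print(scrambled_options(opts, respondent_id=42))
```

Pinning “Other” last matters because respondents expect the catch-all option at the bottom; scrambling it into the middle of the list would itself be confusing.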
Mix Up the Layout
The arrangement of your questions can have a great influence on the responses. If you put too many questions with a similar response format together (especially multiple choice), you will get a leaning toward the same response for all the questions. You have to vary the responses and keep the respondent on their toes, without confusing them. So mix it up a little.
No Leading Questions
This is something you would think wouldn’t be happening in surveys, but it still does: the use of leading questions is still an issue. Ensure the words you use don’t imply any unwanted response, and that they don’t point the respondent towards a sought-after response. I have always found that open-ended questions, like those in an interview, are the only way to go here.
Remove Ambiguity
Supplied responses to questions need to have no ambiguity in them at all. Remember, what you interpret as meaning one thing, someone else may see as something completely different. Ensure that all supplied responses are rock solid in what you want them to mean. Use the contemporary language and terms of your audience, and avoid verbs that have a double meaning.
Avoid Negative Questions
The use of negative terms in a question is not the best approach. In many cases people will misread the question as an implied positive, which will give you a completely skewed dataset. The simple solution is to phrase all the questions positively.
If you really have to use a negative, a way around this is to highlight it: for a simple negative like “not”, just bold and capitalise it as NOT.
Getting Ranges Right
When you ask about a range or the like, don’t imply a level of use in the question itself, as the respondent will just assume you are looking for an answer within that level of the range.
For example, “How many times do you visit our site a week?” is bad: it implies you must visit at least once a week. Whereas “How often do you visit our site?” is a better alternative, as it leaves the value ranges to the supplied responses.
Pretesting the supplied responses will also give you a realistic response range.
Now this list is by no means complete. What additional pointers would you, as UX professionals, include?