User Surveys – Do it Right or Not at All.

Feb 17, 2010


I have been involved with many user surveys over the years. Some have gone well; some have been a complete waste of time and effort. The main distinction between them: the surveys that were professionally developed and pretested tended to succeed, while the ones knocked together by a well-meaning manager were often destined to fail.

It comes down to this – unless you have experience designing surveys, it’s best either to hire someone who has the experience and training, or to find another way to collect the same information.

Proceeding with a flawed survey will just produce results tainted with bias and other data-warping horrors. Not something you want in your user research, eh.

Survey Design Tips

Now I know some of you are going to be forced into running surveys despite my advice. That’s okay; business is like that, and sometimes you have to compromise.

Better that you at least know the pitfalls and can correct them as needed. So here is a list of tips for online survey design that you may find invaluable:

  • Survey Objective

    This may sound really silly, but you have to know what the survey is for and what it needs to achieve before you start. Otherwise you will get sidetracked asking meaningless questions. For example, you may want to find out about the users’ demographics, their preferences, and their needs and wants.

  • Keep it Short

    There is nothing worse than an online survey that seems to go on forever, asking page after page of questions. When you are putting together questions, stop and think: can I get that information elsewhere? If you can, scrap the question. Similarly, scrap any question that is not directly related to the goal of the survey.

    The point is to design a short, sharp survey someone can complete in under five minutes. I don’t know how many online surveys I have abandoned after about five minutes – it must be in the hundreds.

  • Tell Me How Many Pages

    Your respondents’ time is valuable – they are doing you a huge favour filling in your survey. Respect that. Just as you minimise the number of questions in the survey, it is also a good idea to let people know how many pages they have to complete. The best way to do this is to display an indicator of their progress. This will have a negative impact if your survey is too long, but a positive one if the survey is short.

  • Let’s go Back

    Again this is a simple issue: let respondents navigate forward and backward within the survey (not via the browser’s back button) and review their answers, if they so desire. Remember, the respondent is doing you a favour – don’t make it hard for them.

  • It’s About the Design

    At the end of the day, all the information you’re gathering about the users will be applied to the design of the web site, be that on a visual, informational or interactive level. It follows that when you include a question you should ask yourself: “Is the data collected going to influence the design of the site?” If the answer is no, remove the question. It’s not a hard and fast rule, but still one you should consider.

  • Say No to Checkboxes

    I know you might love them, but a checkbox is just a binary field: yes or no, on or off. You will get a higher response rate if you present it as a pair of radio buttons with explicit yes and no options – it’s instantly clear to the user what response is required. You also have the advantage of taking up more visual space, which helps stop the question being missed.

  • Say No to Select Lists

    This is interesting: as I have discussed earlier, certain demographics have trouble realising that they can scroll down and pick the unseen items in a single-line select list. A good way to avoid this issue is to use radio buttons in a multiple choice layout.

  • Add Some Other

    When you put a list of alternative multiple choice items in a survey, how can you really be sure you have covered all the possible choices? I have always found it’s a good idea to include an “Other” option with space for the respondent to fill in their alternative. You usually discover you have missed a few.

  • No Response

    It’s a simple thing, but it’s a good idea to make the default setting for any multiple choice item “no response” – that is, none of the fields selected. I would also consider adding a “not applicable” or similar response as well, mainly because there can be cases where the respondent has no experience with what you are asking.

  • Getting Likert Scales Right

    Likert scales are those multiple choice responses that go “Disagree 1 2 3 4 5 Agree”. These are very good at gathering information where there is going to be a distinct difference of opinion. However, the result of a Likert scale question is not a series of interval measurements; it is an ordinal representation of the range between two extremes, from agreement to disagreement (in this case). When using a Likert scale it’s a good idea to have a midpoint (an odd number of values) to allow measurement of the common middle ground.

    Also, if you are measuring a very subjective issue, it’s a good idea to label every point on the scale with an equivalent label to help remove any bias or misinterpretation.

  • Pretest the Questions

    Writing survey questions is something you have to do carefully. Respondents will attempt to interpret your questions and provide the information they think you are expecting. They will also try to determine how you are going to use their answers and respond accordingly. This leads to bias in the results.

    The way to avoid this is to pretest your questions. A pretest will reveal the questions that are always going to be skipped, that give similar answers, or that are just confusing or misleading. Just as we user test the design, we also need to user test the survey. Ironic, really.

  • Stop Question Skipping

    It’s simple: your respondents will skip a question if they don’t understand it, are confused, or are just plain bored with your too-long questionnaire. The solution is to keep it short and on topic, and ensure the questions are not confusing.

  • Multiple Choice Order

    The ordering of multiple choice responses is very important. Putting the responses in their natural order (high to low), or just listing them as you think of them, is dangerous: it presents a bias to the respondent, who will select the response that looks like the one you want.

    What you need to do is scramble the responses. Still, expect some respondents to lean towards selecting the first or last items, as they see these as the important ones.

  • Duplicate Answers

    The arrangement of your questions can have a great influence on the responses. If you put too many questions with a similar response format or layout together (especially multiple choice), you will get a leaning toward the same response for all of them. You have to vary the responses and keep the respondent on their toes – without confusing them. So mix it up a little.

  • Leading Questions

    This is something you would think wouldn’t be happening in surveys, but it still does: the use of leading questions is still an issue. Ensure the words you use don’t imply any unwanted response, and that they don’t point the respondent to a sought-after answer. I have always found that open-ended questions, like those in an interview, are the only way to go here.

  • Ambiguous Questions

    Supplied responses to questions need to have no ambiguity in them at all. Remember, what you interpret as meaning one thing, someone else may see as something completely different. Ensure that all supplied responses are 100% rock solid in what you want them to mean. Use the contemporary language and terms of your audience, and avoid verbs that have a double meaning.

  • Negative Questions

    The use of negative terms in a question is not the best approach. In a lot of cases people will misread the question as an implied positive, which will give you a completely skewed dataset. The simple solution is to phrase all the questions positively.

    If you really have to use a negative, a way around this is to highlight it – for example, NOT. Just bold and capitalise it.

  • Getting Ranges Right

    When you ask about a range or the like, don’t imply a level of use in the question, as the respondent will just assume you are looking for an answer within that range.

    For example, “How many times do you visit our site a week?” is bad – it implies you must visit at least once a week. Whereas “How often do you visit our site?” is a better alternative, as it leaves the value ranges to the supplied responses.

    Pretesting the supplied responses will also give you a realistic response range.
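To illustrate the Likert point above: because the responses are ordinal rather than interval data, order-based summaries like the median and mode are safer than the arithmetic mean. A minimal sketch in Python, using made-up responses to a hypothetical 5-point item:

```python
from statistics import median, mode

# Made-up responses to a hypothetical 5-point Likert item
# (1 = strongly disagree ... 5 = strongly agree).
responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 1]

# Likert data is ordinal: the "distance" between 1 and 2 is not
# necessarily the same as between 4 and 5, so prefer order-based
# summaries over the arithmetic mean.
print("median:", median(responses))  # -> median: 4.0
print("mode:", mode(responses))      # -> mode: 4
```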
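The response-scrambling advice can be automated. A sketch in Python (the choice labels and respondent IDs are invented): seeding a private random generator with the respondent’s ID gives each person a stable, but individually scrambled, order:

```python
import random

# Invented answer options for a hypothetical multiple choice question.
CHOICES = ["Price", "Speed", "Reliability", "Support", "Design"]

def scrambled_choices(respondent_id: int) -> list[str]:
    # Seeding with the respondent ID keeps the order stable for that
    # person across page reloads, while different respondents see
    # different orders.
    rng = random.Random(respondent_id)
    return rng.sample(CHOICES, k=len(CHOICES))

print(scrambled_choices(101))
print(scrambled_choices(102))
```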
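For the range tip, it also pays to check that the supplied response buckets are mutually exclusive and leave no gaps, so every respondent has exactly one valid answer. A small Python sketch with invented visit-count buckets:

```python
# Invented visit-count buckets (visits per month) for a question like
# "How often do you visit our site?". Each entry is (label, low, high),
# with high inclusive and None meaning open-ended.
BUCKETS = [
    ("Never", 0, 0),
    ("1-3 times", 1, 3),
    ("4-10 times", 4, 10),
    ("More than 10 times", 11, None),
]

def buckets_are_contiguous(buckets) -> bool:
    """Check that consecutive buckets neither overlap nor leave gaps."""
    for (_, _, prev_high), (_, low, _) in zip(buckets, buckets[1:]):
        if prev_high is None or low != prev_high + 1:
            return False
    return True

print(buckets_are_contiguous(BUCKETS))  # -> True
```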

Now this list is by no means complete. What additional pointers would you include as UX professionals?


4 comments

  1. Don’t have too many different types of question formats.

    E.g. Q1 is multichoice, Q2 is Likert scale, Q3 is open text, Q4 Likert scale, Q5 “rank these” grid, Q6 yet-another-format, Q7 omgwtfbbq my brain just melted.

  2. @eric – but you don’t want 5 questions all Likert scale, all with the same number of responses in the matrix; you’ll just end up with the same results for the most part. At least it’s not a 25 question nightmare ;)

  3. I disagree with scrambling multiple choice response order, especially as advice for amateur survey designers. Scrambling assumes that the text of the response choices will absolutely be interpreted the way intended by the author.

    What I see too often are choices like:
    – it’s OK
    – it’s acceptable
    – it’s good
    – it’s bad
    – it’s great

    When poorly written choices are scrambled, there’s no way to know how the author differentiates “OK” and “acceptable”. Of course, the ideal solution is to not have poorly written choices, but leaving the choices in order helps protect against problems from poorly written choices.

    I would really, really like to see a stronger post that outright says “If you design your own survey and you don’t have years of task-specific training, you are introducing unacceptable bias and error and your survey data is mostly useless.” As this post stands, I can’t send this post to my boss because he’ll think that having done 5 surveys in the past counts as “survey experience”, which means he’s safe to keep doing what he’s doing.

  4. @dave This is where testing the survey, and especially these responses, is important.

    You have to get them just right. You are really looking for responses that will be weighted roughly equally across the audience overall – ones with a good statistical spread – but that still show a preferential bias within each audience group.

    Otherwise, having alternative responses that are just additional filler is a waste of time. Maybe this is even the wrong style of question for the information you want to capture.

    What is survey experience? If over 5 surveys your boss discovered that the data was biased and that mistakes were made, and on each subsequent survey they improved their skills and technique, then yes, they are on the way.

    But if they just repeated the same skill set with no real thought, then the reality is they aren’t gaining experience.

    It’s the old adage: 10 years at a job = 1 year’s experience repeated 10 times.

    Sadly I know that the business world is not black and white, and if I stand up here on a soapbox and say “DON’T do it”, people will anyway. So better to educate them a little in the ways of doing it correctly.
