Shipping surveys is easy. Getting great survey results? Not always so simple. There are a few key guidelines you’ve got to follow if you want your surveys to attract responses, reach the right users, and deliver on your goals. In this post, let’s break down ten of the most common survey mistakes people make and how to get yours back on track. Soon you’ll be a master of survey design.
The good news is, it’s easy to design and execute a solid survey if you’re thinking critically about what your goals are and how you are going to achieve them.
Sin #1. You’re playing it fast and loose with targeting—or not targeting at all.
Surveys are not one size fits all. The information you collect with surveys can help you understand your users’ motivations and predict future user behavior… but only if you are targeting segments that make sense for the survey you’ve written. Don’t just send a survey to all users and see how many responses you get. For example, it’d be pointless and annoying to be asked to fill out a survey about the shopping experience if you haven’t made a purchase yet.
Manage segmentation by regularly tagging and updating users with attributes that are relevant to their demographics and behavior. When you ship your surveys, target segments that make sense. Strive to ask questions that are relevant to your users’ segment and app experience.
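The tag-then-target flow above can be sketched in a few lines. This is an illustrative Python sketch with made-up user data and a hypothetical `segment` helper, not Instabug's actual API:

```python
# Illustrative sketch, not Instabug's API: tag users with attributes,
# then select only the segment a given survey makes sense for.

users = [
    {"id": 1, "country": "US", "has_purchased": True,  "sessions": 12},
    {"id": 2, "country": "DE", "has_purchased": False, "sessions": 3},
    {"id": 3, "country": "US", "has_purchased": True,  "sessions": 1},
]

def segment(users, **attributes):
    """Return the users whose tags match every given attribute."""
    return [u for u in users
            if all(u.get(k) == v for k, v in attributes.items())]

# Only users who have actually made a purchase see the shopping survey.
purchasers = segment(users, has_purchased=True)
print([u["id"] for u in purchasers])  # → [1, 3]
```

The point isn't the code itself but the habit it encodes: every survey ships with a filter, never to "all users."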
Sin #2. The survey is inconvenient or annoying for the user.
Response rates are low, completion rates are low, and you’re not getting enough data to really understand the full picture.
Execution matters. You can write a great survey, but if it requires too much effort from your users, very few of them will commit to it. For mobile apps, the best way to deploy a survey is within the app itself. Avoid interrupting the app experience with a popup that sends them to an online form, email client, or a poorly optimized webpage — it’s disruptive and time-consuming. Using an in-app survey tool like Instabug will give them a seamless, distraction-free experience that keeps them in-app and engaged. The cool thing about Instabug survey design is that you can target the exact moment the user will see your survey.
Sin #3. You have no idea where you’re going with this survey.
If you’ve got a lot of random data and no idea what to do with it, it’s probably because your survey plan wasn’t solid before you started writing your questions. Maybe you asked some questions that seemed interesting at the time, but now you’re realizing the data isn’t actionable. How do you write a survey that nets you information that’s actually useful?
Know your goals before you start surveying. What do you hope to accomplish with this questionnaire? The survey format and questions you choose will depend on your goals. Once you know precisely what you want to learn, it’s easier to write questions designed to obtain that information. An easy way to get started is to map out your questions in a table like below to organize your thoughts and intentions.
| “I want to understand…” | What question will get me the answers I need? | Which answer format makes the most sense for this question? |
| --- | --- | --- |
| …user goals. | What problem are you trying to solve or what goal are you trying to achieve with this app? | Open text (verbatim) |
| …value perception. | How would you rate the value for the price of this app/in-app purchases? | Multiple choice |
| …the user experience. | In your own words, what do you like most about this app? | Open text (verbatim) |
| …brand loyalty. | On a scale of 0 to 10, how likely are you to recommend this app to a friend or colleague? | NPS survey |
You can find more templates here. Instabug offers three survey types: NPS Rating, App Rating, and Custom, all of which allow you to design and ship your own survey without writing any code yourself.
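For context on the NPS row above: the Net Promoter Score is conventionally computed as the percentage of promoters (scores 9–10) minus the percentage of detractors (scores 0–6), with passives (7–8) counted in the total but neither bucket. A minimal sketch of that arithmetic:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).

    Passives (7-8) count toward the total but neither bucket,
    so the result ranges from -100 to +100.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 4 promoters, 2 detractors, 2 passives out of 8 responses:
print(nps([10, 9, 9, 8, 7, 6, 3, 10]))  # → 25
```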
Sin #4. The questions and answers are inconsistent or out of sequence.
Are you unintentionally confusing your users? Let me tell you a story—I once had a high school physics teacher who loved playing games with his exam questions. There’d be several multiple-choice questions in a row that offered answers on a scale from least to most, lowest to highest, you get the idea. Then, on the occasional question, he’d flip the answer order around just to see if we were paying attention. And I get it: confusing high schoolers by the classful was probably really fun (for him). But it’s not a good survey design strategy!
If you’re going to give your users a sequence of multiple-choice options, make sure they’re always in the same order. The same goes for the way you phrase your questions—maintain consistency whenever possible.
Sin #5. The survey questions are misleading, confusing, vague, or loaded.
Word choice makes a big difference. Asking “how much did you enjoy this level?” is different from asking a user whether they enjoyed the level at all—the first phrasing assumes they did. Check whether your questions use biased or emotional language that might sway the user’s answer.
Also on the list of questions to avoid are questions that confuse the user or force them to think too hard. For example, a question like “how important is app quality to you?” is not exactly a meaningful question on its own, as realistically, few to none of your users would answer that quality doesn’t really matter. Neither will they know how to answer “double-barreled” survey questions like “Please rate the efficiency and friendliness of our customer service” — you’re trying to get them to rate two different things at the same time and that’s not possible to do with one answer.
How do you get clear answers? Asking clear questions is a must for great survey design. Use neutral, unbiased language that doesn’t put any ideas in your users’ heads, so they can think and answer for themselves. Keep your ideas to one per question — skip the double-barreled survey confusion. And if you’re going to ask users to tell you how important something is to them, the best way to get a clear answer is to use forced ranking; otherwise, they’ll just say everything is important.
Sin #6. The survey is too loooooooooooooong.
Users are dropping out before they finish the survey. They just can’t seem to make it to the end without losing steam.
The average person’s attention span isn’t all that long, and if they’re on their phones, you can expect even less of it—3 minutes or less. You’ll want to make the most of what you’ve got. Increase your completion rates by reducing your survey length. Keep it concise, choose your questions carefully, and aim to collect only useful information. For example, don’t collect demographic information if you already have it. You don’t want to squander their attention span on pointless data.
Got too many questions? Release multiple smaller surveys (“micro-surveys”) instead. Split your segment into groups and send each group a portion of the complete question set.
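The micro-survey split described above can be sketched as chunking the question set and dealing users across the chunks round-robin, so every question still gets coverage. The helper name and data here are hypothetical, purely for illustration:

```python
import itertools

questions = ["Q1", "Q2", "Q3", "Q4", "Q5", "Q6"]
user_ids = [101, 102, 103, 104, 105, 106]

def assign_micro_surveys(user_ids, questions, per_survey=2):
    """Split the question set into micro-surveys of `per_survey`
    questions each, then deal users round-robin across them so
    every question is still answered by part of the segment."""
    chunks = [questions[i:i + per_survey]
              for i in range(0, len(questions), per_survey)]
    return {user: chunk
            for user, chunk in zip(user_ids, itertools.cycle(chunks))}

assignment = assign_micro_surveys(user_ids, questions)
print(assignment[101])  # → ['Q1', 'Q2']
print(assignment[102])  # → ['Q3', 'Q4']
print(assignment[104])  # → ['Q1', 'Q2']
```

Each user sees a 2-question survey instead of a 6-question one, and you still collect answers across the whole question set.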
Sin #7. You’re sending surveys at the wrong time.
You’re sending surveys to segments that make sense, but you’re just not getting enough responses. Are you choosing strategic moments to ship those surveys? If you aren’t getting answers, your timing might be off.
Timing and targeting are two of the most important — and most overlooked — aspects of survey design. In order to gather data that’s fresh, relevant, and useful, use event-based targeting to capture your audience at opportune moments. For example, the best time to ask a user about their purchase experience is right after they’ve successfully completed a purchase, while the experience is still new in their minds. If a user abandons their cart, that’s a great time to send a different survey about how you could make the purchasing experience easier for them.
If you’re specifically trying to boost your app store rating, the best time to ask users to rate your app is after they’ve had a “happy moment” inside your app — this could be beating a level, earning a reward, matching with another user, etc. Moments that make your user feel good are the perfect moments to ask them to rate your app because those are the times they’re most likely to leave you positive reviews.
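Event-based targeting boils down to mapping in-app events to the survey (or rating prompt) each one should trigger. A hypothetical sketch of that mapping—not Instabug's SDK, whose actual event hooks will look different:

```python
# Hypothetical event-to-survey mapping for event-based targeting.
SURVEY_TRIGGERS = {
    "purchase_completed": "post_purchase_survey",
    "cart_abandoned":     "checkout_friction_survey",
    "level_beaten":       "app_rating_prompt",  # a "happy moment"
}

def on_event(event_name):
    """Return the survey to show for this event, or None if the
    moment isn't one we've chosen to survey on."""
    return SURVEY_TRIGGERS.get(event_name)

print(on_event("purchase_completed"))  # → post_purchase_survey
print(on_event("app_opened"))          # → None
```

The design choice worth copying is that *no* survey fires unless the user just did something that makes the questions relevant.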
Sin #8. Expectations aren’t being communicated.
One of the main barriers to survey completion is the user not knowing what kind of commitment they’re in for. If you’re prompted to take a survey but you don’t know whether it’ll take 3 minutes, 5 minutes, or 15 minutes of your time, are you going to press Start? What if you’ve started taking a survey, but you have no idea how many more questions there are or whether it’s worth any more of your time? Are you likely to complete it? For many of your users, that answer will be “no.”
The easiest way to get around this problem is to communicate your expectations. How long will the survey be? How will you use their data? A simple welcome message at the beginning of the survey is often enough to handle that. “We’d love your input — would you be willing to take a 2-minute survey to help us out? We won’t share your data” is a quick, friendly message that lets them know what to expect, and communicates that you’d be grateful for their input. You can set up and customize welcome messages with Instabug. Your survey should also have a progress bar so your users know how many questions are left.
Sin #9. There aren’t any open-ended questions.
They say there are things in this world that you know you know, you know you don’t know, you don’t know that you know, and you don’t know that you don’t know. Confusing, right? Well, that fourth category is your blind spot, and if you don’t let your users speak up you won’t know what’s in it. Multiple-choice answers, number scales, or rankings are great for grabbing information that’s easily quantifiable, but it’s the qualitative data that will help you unwrap the enigmatic “whys” of your user behavior.
If they’ve got something to say that you didn’t anticipate, hopefully your survey design makes some room for unstructured feedback. Ask questions that can’t be answered with yes or no. Give them room to write what they want. Otherwise, you’re missing out on the chance to discover points your team has overlooked, or maybe didn’t even think to ask. A final “Is there anything else you’d like to add?” at the end of your survey just might unlock a few surprises.
Sin #10. The survey ends at “submit.”
Are you finished with the survey after you’re done collecting data? If your answer is yes, then you’re missing out on some serious opportunities to really get to the bottom of what drives your customers’ experiences and motivations.
User feedback collection is an ongoing process — and it doesn’t stop at the end of the survey. First, don’t forget to thank your users for their time and help. A quick thank you message at the end of the survey shows your users that you care.
Next, it’s time to reach out to your survey respondents. This is outrageously easy to do if you’re using an in-app feedback tool like Instabug. You don’t have to reply to them all, of course, but you can help your users and yourself if you follow up with some of them post-survey. Try to respond to your at-risk users as soon as possible. They may not be expecting a response, but if you manage to get back to them within 24 hours, their experience and survey answers will still be fresh in their minds. When you’re really trying to understand your users, you’ll not only gain valuable feedback to improve your app, but also an opportunity to convert a detractor into a new fan.
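Building that follow-up queue—detractors first, inside the 24-hour window—can be sketched as a simple filter and sort over survey responses. The response data and `followup_queue` helper are hypothetical, for illustration only:

```python
from datetime import datetime, timedelta

# Sample survey responses (made-up data).
responses = [
    {"user": "a@example.com", "nps": 3,  "submitted": datetime(2024, 5, 1, 9, 0)},
    {"user": "b@example.com", "nps": 10, "submitted": datetime(2024, 5, 1, 10, 0)},
    {"user": "c@example.com", "nps": 6,  "submitted": datetime(2024, 5, 1, 11, 0)},
]

def followup_queue(responses, now):
    """At-risk users (detractors, NPS 0-6) from the last 24 hours,
    oldest first, so they hear back while the survey is fresh."""
    cutoff = now - timedelta(hours=24)
    detractors = [r for r in responses
                  if r["nps"] <= 6 and r["submitted"] >= cutoff]
    return sorted(detractors, key=lambda r: r["submitted"])

queue = followup_queue(responses, now=datetime(2024, 5, 1, 12, 0))
print([r["user"] for r in queue])  # → ['a@example.com', 'c@example.com']
```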