In-App Survey Questions: Guidelines and Templates

When it comes to collecting feedback, in-app survey questions are one of the most effective tools you can use. They introduce very little friction for your users while letting you ask for specific feedback. Moreover, research has shown that more than half of mobile users expect companies to ask them directly for feedback.

However, the amount and quality of feedback you receive will depend on how well your surveys are designed. Knowing what, how, and when to ask is crucial. In this post, we will look at some general guidelines for designing survey questions, followed by templates for common questions.


Before Writing Your In-App Survey Questions


The goal of the survey

The first order of business is to determine what you are trying to learn from the survey. Clearly defining your goals will help you decide which types of questions to use.


Your target sample

When you have enough testers, it is good practice to segment them and send more targeted surveys. Keep in mind, however, that a small sample can’t give you meaningful quantitative data, while qualitative data from a large sample can be too much to analyze. Besides sample size, also consider who is in the sample: factors like age, gender, and technical proficiency will change how you should frame your questions.


Your sampling method

If you are going to send a survey to only a subset of your beta testers, it is important to avoid bias when selecting that sample. Selection bias creeps in when the sample is not representative of your larger target population, such as surveying only power users to stand in for all testers. A randomized selection is the best way to avoid this type of bias.
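
If you don’t want to hand-pick recipients, a simple way to get an unbiased subset is to sample testers uniformly at random. Below is a minimal TypeScript sketch of that idea; the tester IDs, sample size, and function names are hypothetical placeholders, and many survey tools can do this for you.

```typescript
// Minimal sketch: draw a random, representative subset of testers to survey,
// instead of hand-picking (e.g. only power users). All names are illustrative.
function sampleTesters(testerIds: string[], sampleSize: number): string[] {
  // Fisher-Yates shuffle on a copy, then take the first `sampleSize` entries.
  const pool = [...testerIds];
  for (let i = pool.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [pool[i], pool[j]] = [pool[j], pool[i]];
  }
  return pool.slice(0, Math.min(sampleSize, pool.length));
}

// Example: survey 50 randomly chosen testers out of the whole beta group.
const allTesterIds = ["tester-001", "tester-002", "tester-003" /* ... */];
const recipients = sampleTesters(allTesterIds, 50);
console.log(recipients);
```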


In-App Survey Question Writing Guidelines


Open-ended vs. close-ended questions

Open-ended questions are good for qualitative feedback and exploratory research to find out what your testers think. They are also a good way to dig deeper into a specific area when coupled with a specific question, and they can generate useful insight even from a small sample. However, they are unwieldy and time-consuming, both for your testers to answer and for you to go through, so don’t overuse them.

Close-ended questions are good for quantitative feedback and for answering specific questions you might have. This quantifiable data shows you the “big picture” and is a good way to reveal trends and patterns in how your app is used. They are also the least time-consuming and add the least friction for testers completing the survey.


Wording and language

Ask only one thing per question, and keep your questions short and precise. Avoid ambiguous wording, negatives, and double negatives, as they can confuse your testers. Additionally, limit each survey to around five questions to boost completion rates.

Use simple language and avoid technical jargon that might fly over your testers’ heads. Try to avoid acronyms and abbreviations, or spell them out when you do use them. To keep results unbiased, don’t use language that evokes emotion or leads your testers towards a specific answer.


Multiple choice

Prefer scales over yes-or-no questions and other dichotomies; they give you more nuanced feedback and reveal more insights. Generally, it is good practice to use an odd-numbered scale with a middle or “neutral” point, but you can use an even-numbered scale to force a positive or negative choice when needed.

The choices you provide must be exhaustive and mutually exclusive, i.e. they should cover all the possible answers with no overlap between them. If you can’t cover every possible answer, include an “other” option, with or without asking testers to specify. You can allow multiple selections for questions that can have more than one answer, but consider limiting how many choices testers can pick. This forces them to prioritize their answers and gives you a better idea of what matters most.
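
To make this concrete, here is a small sketch, in TypeScript with made-up field names and choices, of how such a multi-select question might be modeled and validated. Most survey tools provide these controls out of the box, so treat this purely as an illustration of the guidelines above.

```typescript
// Illustrative only: a multi-select question with exhaustive, mutually exclusive
// choices, an optional "Other" answer, and a cap that forces prioritization.
interface MultiSelectQuestion {
  prompt: string;
  choices: string[];     // should cover all expected answers, with no overlap
  allowOther: boolean;   // adds an "Other" choice for answers you didn't anticipate
  maxSelections: number; // limit picks so testers prioritize what matters most
}

const featureQuestion: MultiSelectQuestion = {
  prompt: "Which features of the app are most important to you?",
  choices: ["Offline mode", "Push notifications", "Dark theme", "Data export"],
  allowOther: true,
  maxSelections: 2,
};

function isValidResponse(q: MultiSelectQuestion, selected: string[]): boolean {
  const valid = new Set([...q.choices, ...(q.allowOther ? ["Other"] : [])]);
  return (
    selected.length > 0 &&
    selected.length <= q.maxSelections &&
    selected.every((choice) => valid.has(choice))
  );
}

console.log(isValidResponse(featureQuestion, ["Offline mode", "Dark theme"]));          // true
console.log(isValidResponse(featureQuestion, ["Offline mode", "Dark theme", "Other"])); // false: over the cap
```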


In-App Survey Question Templates


You can use surveys to collect feedback about many different aspects of your beta app. Below are some survey question templates, sorted by what they aim to address.


Product feedback

  • What problem/goal are you trying to solve/achieve with the app? (text field)
  • Did the app help solve your problem/achieve your goal? (scale of 1-5)
  • What triggers would prompt you to use the app? (text field)
  • Which features of the app are most/least important to you? (multiple choice)
  • How would you feel if you could no longer use the app? Why? (text field)
  • Which features didn’t work as expected? (text field)
  • Are there any features you expected to find but didn't? (text field)
  • What features would you like to add to the app? (text field for new ideas or multiple choice to prioritize roadmap)
  • How satisfied are you with the stability of the app? (scale of 1-5)
  • How satisfied are you with the security of the app? (scale of 1-5)
  • How satisfied are you with the ability to integrate with other services? (scale of 1-5)


User experience

  • How satisfied are you with the look and feel of the app? (scale of 1-5)
  • What was your first impression of the app? (text field)
  • How satisfied are you with the ease of use of the app? (scale of 1-5)
  • How satisfied are you with the installation and onboarding experience of the app? (scale of 1-5)
  • What confused/annoyed you about the app? (text field)


Market research

  • What price would you be willing to pay for this app? (multiple choice)
  • How clear do you find our pricing? (scale of 1-5)
  • How would you rate the app's value for money? (scale of 1-5)
  • What alternatives to the app are you considering? (text field)
  • How does the app compare with competitors? (scale of 1-5)
  • How did you find out about the app? (multiple choice)


Support

  • How would you rate your customer support experience? (scale of 1-5)
  • Did we solve/answer your issue/question? (yes or no)
  • How satisfied are you with how quickly we addressed your concern? (scale of 1-5)
  • Which support communication methods do you prefer? (multiple choice)


Beta testing

  • Overall, how would you rate the beta program? (scale of 1-5)
  • How clear were your responsibilities as a tester? (scale of 1-5)
  • How easy was it to report issues you encountered? (scale of 1-5)
  • Do you have any comments/suggestions for the beta program? (text field)


General feedback

  • How would you rate the overall quality of the app? (scale of 1-5)
  • How likely are you to recommend this app to a friend or colleague? (scale of 0-10)
  • Do you have any additional comments/suggestions? (text field)


Tips for Designing In-App Survey Questions


  • Avoid asking too many questions. You will learn nothing if users abandon the survey because it takes too much time.
  • At the beginning of your survey, set your users' expectations about its length and how much time it should take.
  • Use a mix of open-ended and close-ended questions, but limit the number of text fields as much as possible.
  • The sequence of your survey questions makes a difference. Keep questions that might influence answers to later questions towards the end of the survey.
  • Try to put the easy, short questions towards the beginning of the survey, and the longer open-ended ones towards the end to boost completion rates.
  • Make sure you ask only one thing per question, to avoid confusion and inaccurate results.
  • Avoid asking hypothetical questions; answers to hypothetical situations are often inaccurate and might not reflect how testers would actually behave.
  • Try to avoid framing answers on agree/disagree scales as they can be biased towards "agree".
  • Multiple choice questions can have a bias towards the first answer or choice, so try to randomize the order of the choices.
  • Before you send your survey, test it out internally with your team and externally with a few testers if possible.
  • Make sure to actually analyze the results and act on the feedback you received.
  • Remember that often what people say can be different from what they do. Put your survey results in context by analyzing what users actually do while using your app.
  • After you act on the feedback you receive, close the feedback loop and notify survey respondents of the changes you made. This will make them feel appreciated and much more likely to give feedback in the future.

