Collect Meaningful Feedback


When taking on a new project, seasoned professionals and those just starting out in their career likely all ask themselves the same question — Where do I begin?

This article will help answer that big question by introducing a framework called the Survey Design Cycle that you can rely on as you work through projects. The cycle takes you from ideation, through build and delivery, all the way to analysis and action.

Survey Design Cycle

The survey design cycle has six parts: Need, Design, Build, Collect, Report, and Act. Getting all clients and stakeholders on board with this cycle at the beginning of your project gives everyone involved a clear picture of what's to come and prevents projects from getting derailed and losing focus along the way. 

Within this process, you'll work together to:

  • Set goals for your survey.
  • Define an audience.
  • Intentionally choose question types and distribution methods that help you avoid common pitfalls and roadblocks.
  • Make a plan of action for what to do with the data once you've collected it. 

Each of these steps plays an important role not only in managing the project as a whole, but also in ensuring that all elements of your project work together to help you achieve your goals.

Need

The first step of the survey design cycle, Need, helps you define why this project is important and what the end goal of the project is. 

Need can be broken down into three parts:

  1. Goals and Objectives
  2. Brainstorm 
  3. Selection and Refinement

Goals and Objectives

While defining goals and objectives, you'll have a discussion with all key stakeholders so that everyone clearly understands what success will look like and how the project's goals will be achieved.

First, we recommend discussing project goals. A goal is what you are going to do with the data and why. 

An effective goal would be something like "I am going to stock my grocery store with healthy products people want to buy to increase revenue," instead of, "I am going to make more money for my business."

Once the goal is established, you'll come up with objectives. These objectives frame the questions you'll ask to help you achieve your goal. Ideally, you'll have somewhere between three and five objectives. Too many objectives can blur the scope of the project, so keeping these refined will ensure everyone stays on track. 

With the goal of, "I am going to stock my store with healthy products people want to buy to increase revenue," your objectives may be:

  • Determine what types of healthy foods people in my city are currently buying and eating
  • Evaluate the level of interest in healthy food choices
  • Identify barriers in purchasing and eating healthy foods

Brainstorm

In this same discussion, we also recommend brainstorming the answers to the following questions. These will help inform how your team will tackle the rest of the Survey Design Cycle.

  • What will you do with the data? Will it be analyzed alone or with other data? Will it be used only internally or publicly? Is it for your organization’s use or for a client?
  • What do you want to ask to support those goals? What are the questions you need to ask in order to reach your project goals?
  • Who is your audience of respondents? Are they customers or potential customers? What is their geographic location, language, age, financial status?
  • How will you distribute the survey? Online via a website or email, in person or phone interviews, on paper?
  • What kinds of reports do you need? Are you reporting only for yourself or for others?  Do you need to provide reports in a particular format (Word, PDF, PowerPoint, Excel, SPSS)?

When brainstorming with your team, consider some of the following tips:

  • Give everyone the chance to write down their answers to these questions on their own for around 5 minutes.
  • Make sure everyone gets the chance to share their thoughts from this independent process.
  • Encourage all participants to speak and let ideas flow freely; no ideas are bad ideas!
  • Have a scribe to make sure all ideas are captured for selection and refinement later.

Selection and Refinement

Now it's time to take all of the ideas generated during your team brainstorm and refine them.

When determining the questions, audience, distribution methods, report format types, data analysis plans, and post-analysis actions you'll ultimately select, keep your goal in the forefront. Everything should tie directly back to the survey goal.

One way to do this is to write your survey goal and learning objectives on a whiteboard and see which of your ideas fit under each. If an idea does not fit, it may be best to remove it from the scope of this project and revisit it in a future project.

This is also a good time to revisit and refine your goal and objectives to see if there is anything that needs to be changed before moving on to the design of your project.

Now that you've refined the project and have a clear picture of what completing it will entail, it's also important to consider Return on Investment (ROI). If running the research will cost more time and money than achieving the goal would generate, it may be best to come up with a different solution.

Design

The design phase is where you'll start to think about the survey taking experience. 

You'll put yourself into the shoes of your survey taker and consider different elements that make for a clear and direct survey taking experience, which will in turn improve your data quality.

Lack of focus, bias, fatigue, and miscommunication can create a negative or confusing survey taking experience for your respondents, leaving you with unusable data. Making intentional choices during this phase will help you avoid these common pitfalls.

Design breaks down into three parts:

  1. Organize
  2. Eliminate Bias
  3. Re-establish Focus

Organize

Use clear, simple language in your survey questions, and design the survey so it offers a direct, clear path to completion. This makes it easy for respondents to give good answers and finish the survey, and it helps prevent survey fatigue.

Shorter surveys might have one question per page so that respondents can focus. If your survey is longer, you might show more questions on a page (but no more than 10). When grouping questions together on pages, be sure the content of the questions aligns, as unrelated questions on the same page could lead to confusion.

Display your respondents’ progress as they move through the survey. This can be particularly useful in longer surveys, where people will be less likely to abandon their task if they can see that they are nearing the end of the questions.

Later in this article, we'll talk about how you can use Logic to fight survey fatigue as well.

Eliminate Bias

Evaluating your survey and questions to make sure that you are not introducing any bias or leading your respondents to particular answers will help you get much more effective survey data.

Getting objective answers is much more useful than getting the answers you might wish for. When you are designing survey questions, be sure that the question wording is not leading respondents to particular answers.

For example, if you ask “How satisfied are you with our great selection of healthy products?” where possible answers are “Very Satisfied,” “Satisfied,” and “Dissatisfied,” you’ve told the respondent that the products are great within the question, and given them limited options to respond to the question.

To remove the bias, maintain your neutrality by rephrasing the question to ask, “What is your level of satisfaction with the products available in our store?” and offer them a wider range of answer options, including, “Very Satisfied,” “Satisfied,” “Neutral,” “Dissatisfied,” and “Very Dissatisfied.”

Re-establish Focus

Asking good questions is key to establishing focus within a survey. To ensure your survey has clear focus: 

  • Ask brief and direct questions.
  • Consider the reporting options available for the question types you choose.
  • Carefully read and edit question text and answer options for potential areas of overlap, confusion, or inconsistency.
  • Ask the fewest number of questions possible to get the data you need.
  • Tie all questions directly back to the survey's goal and objectives.
  • Cover only one topic per survey.
  • Ask questions that derive actionable results.

Let's think back to the survey goal "I am going to stock my grocery store with healthy products people want to buy to increase revenue" and objectives to determine what types of healthy foods people in my city are currently buying and eating, evaluate the level of interest in healthy food choices, and identify barriers in purchasing and eating healthy foods.


Asking "How many hours a week do you spend shopping for groceries?" would return answers that do not connect to any objectives. Similarly, if you ask "Which of the following products would you be interested in?" and include items you cannot stock, the data collected has no actionable result. 

Build

During the build phase, you'll take the ideas and principles from design and put them into action.


Construct

Alchemer allows you to choose from a number of different question types. Choosing the right type of question will make a big impact on getting the most effective survey data for your project.

Quantitative question types allow you to set up a discrete set of answers that a respondent needs to choose from. Radio button, checkbox and dropdown menu questions are good examples of this type of question along with rating scales and ranking questions.

These kinds of questions produce clean, easy-to-analyze charts that will allow you to draw conclusions about trends and patterns among your respondents. They are also quick for respondents to answer, reducing survey fatigue.

When using scale questions, there are many possibilities for how you set up the question. As a rule, answers are read from left to right. The most common scale is 1-5; a 1-10 scale allows for more variability in option choice. When adding a "neutral" option to your scales, be aware that the data you collect might not be actionable. However, if the option is not included, responses may skew more positive, so it is important to note this when analyzing the data.

Qualitative questions allow respondents to enter their own answer, such as textbox and essay type questions. While harder to efficiently analyze, qualitative questions are important because they can give you direct insight into a respondent’s point of view.

You will need to determine how you’ll handle the data these questions produce before you distribute your survey. You can create a word cloud with the most commonly used words, use open text analysis to identify patterns, simply read each one and make notes, or a combination of all of these.
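
To make the word-cloud option concrete, here is a minimal Python sketch of the word tally that sits behind a word cloud. The responses and stop-word list are purely illustrative, not real survey data:

```python
from collections import Counter
import re

# Illustrative open-text answers (made up for this example).
responses = [
    "Healthy food is too expensive near me",
    "I would buy more fresh produce if it were cheaper",
    "More local produce please",
]

# A tiny, assumed stop-word list; real analyses use a much larger one.
STOP_WORDS = {"i", "is", "it", "if", "me", "the", "too", "would", "were"}

words = []
for text in responses:
    words += [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOP_WORDS]

# The most common words are the ones a word cloud would draw largest.
print(Counter(words).most_common(5))
```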

Validate

Validation helps ensure you collect the cleanest data possible. Alchemer offers validation tools to confirm that the data you're collecting is in the intended format. For example, if you're collecting email addresses, you can ensure they're entered in the format emailaddress@provider.com.
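
As a rough illustration of what an email-format rule checks (this is a sketch, not Alchemer's actual validation engine), a minimal Python version might look like this:

```python
import re

# A simple pattern of the kind an email-format validation rule applies:
# some text, an "@", a provider name, a dot, and a top-level domain.
EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def is_valid_email(answer: str) -> bool:
    """Return True if the answer looks like emailaddress@provider.com."""
    return bool(EMAIL_PATTERN.match(answer.strip()))

print(is_valid_email("jane.doe@example.com"))  # True
print(is_valid_email("not-an-email"))          # False
```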


Test Reports

Once the survey is finished, it's time to test it! It's a good idea to test the survey on your own and also to share it with people who are not involved in the project; a fresh set of eyes might catch mistakes that those who have been involved from the beginning could miss. You can also generate test responses, which give you a quick set of data to confirm it is gathered in the correct format.

Once you've collected all of your test responses, it's time to make sure you are gathering the data you need for reports. Are your questions reporting the way you expect? Are you able to create the reports you need using the data you collected? Is the data in the format you need? Is the data format beneficial and relevant to the survey goals and objectives?


Apply and Test Logic

Adding Logic to your survey build can also help to ensure that you're only showing relevant content to your respondents, fighting survey fatigue. It can also help fight bias in your surveys. We've compiled some helpful logic types for each of these categories below.

Fatigue-fighting logic

  • Skip Logic interrupts the default survey flow and directs the respondent to another location, ensuring they're only seeing what's relevant to them.
  • Question Logic allows you to choose parameters for when to show survey questions based on answers to previous questions.
  • Percent branching allows you to assign a percentage of respondents to see a specific set of questions that differ from another branch's set of questions, so that not all respondents have to answer every survey question.
  • Piping allows you to display a single answer from a previous question on a later page, so that questions are tailored to respondents' answers.

Bias-fighting logic 

  • Randomize questions, pages, and answer options.
  • Disqualify survey respondents that do not meet certain criteria that may be necessary to complete your survey.
  • Set Page Timers to ensure that respondents don't have the time to find external resources to help them answer your quiz questions, ensure that video or audio clips included in your survey are watched/listened to in their entirety, and prevent people from speeding through your survey if you are using panel respondents or are offering an incentive.

Collect

At this point, you've built a thoughtfully designed survey and tested the logic and reporting to make sure that the survey will truly help you accomplish your goals and objectives. Now it's time to collect data! 

In the collection phase of the design cycle, you'll define who you are sending the survey to, and how to best reach them.

Choose mode of survey

As you'll be using an online survey as your mode, it's important to consider how respondents can and will access this survey. 

For example, if a portion of your respondents lives in rural America and is aged 70-75, their computer access may be more limited than that of 20-25 year old respondents in an urban setting. This could introduce bias, so considering mode and distribution methods is important!

You might also have to distribute your survey in a single location with poor internet access, so displaying your survey on a kiosk using offline mode could help improve respondent experience and quality of data.

Choose sample

Before we dive into choosing your sample, there are a few definitions to note. A population represents the entire group of individuals for which you are trying to draw conclusions. A sample is a sub-group of the population, chosen using statistically valid means, in order to represent the population as a whole. 

When determining who to send your survey to, you'll want to consider whether you want to survey everyone (the population) or a percentage of the people (the sample). Often, the sample is the better choice: it is less expensive, it decreases the likelihood of introducing survey fatigue if you need to survey your population more than once, and a statistically valid sample is just as effective as surveying the entire population.

If you do choose to go with a sample rather than the entire population, it's important to make sure that the sample you are using for comparison accurately represents the entire population. 

For example, if your population is high school and college students in the city of Boulder, you would need to make sure that your sample has the same ratio of high school to college students as the actual population.
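
A quick sketch of that proportional allocation, using made-up population counts for the Boulder example:

```python
# Illustrative population counts for the Boulder example (assumed numbers).
population = {"high_school": 6_000, "college": 30_000}
sample_size = 400

total = sum(population.values())

# Allocate the sample in the same ratio as the population (proportional allocation).
allocation = {group: round(sample_size * count / total) for group, count in population.items()}
print(allocation)  # e.g. {'high_school': 67, 'college': 333}
```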

If you're unsure what sample size is needed for statistical accuracy, you can safely say that 400 responses is enough of a sample group. However, you might be thinking, "I don't even have 400 people in my population to choose from." If that's the case, you can use a sample size calculator.

A sample calculator takes into consideration the margin of error (or confidence interval), as well as the confidence level, and how large your population is.

The confidence interval is used to indicate the reliability of an estimate. If you use a confidence interval of 5 and 60% of your sample picks an answer, you can be confident that if you asked that question of the entire population, between 55% and 65% would have picked that same answer.

The confidence level is the probability that the value of a parameter falls within a specified range of values. It is expressed as a percentage and represents how often the percentage of the population who would pick an answer lies within the confidence interval. The 95% confidence level means you can be 95% certain.
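
Under the hood, a sample size calculator applies the standard sample size formula plus a finite population correction. The Python sketch below shows that arithmetic, assuming a 95% confidence level (z = 1.96) and the most conservative answer proportion of 0.5; treat it as an approximation of what an online calculator does:

```python
import math

def sample_size(population: int, margin_of_error: float = 0.05,
                z: float = 1.96, p: float = 0.5) -> int:
    """Estimate the sample size needed for a given population.

    margin_of_error: confidence interval as a proportion (0.05 = +/- 5 points).
    z: z-score for the confidence level (1.96 for 95%).
    p: expected answer proportion; 0.5 is the most conservative choice.
    """
    # Sample size for an effectively infinite population.
    n_infinite = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    # Finite population correction for smaller populations.
    n = n_infinite / (1 + (n_infinite - 1) / population)
    return math.ceil(n)

print(sample_size(100_000))  # roughly 383
print(sample_size(500))      # roughly 218
```

With a 5-point margin of error at 95% confidence and a large population, the result lands near the 400-response rule of thumb mentioned above.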

You can pull a sample population from:

  • Customer/Internal Lists
  • Websites/Online Forms
  • Social Media
  • Panel Companies

Engage panel (when applicable)

A panel company is an organization that exists to sell anonymous survey responses to marketers and market researchers. We can help connect you with the right panel company for your needs.

Report

Now that you've collected all of the data you need, it's time to report on it. In this step of the Survey Design Cycle you'll:

  1. Clean Data
  2. Analyze Data
  3. Run Initial Reports
  4. Create a Final Report

Clean data

Cleaning your data helps you to make sure that all of the data you have is valid and usable. 

First, you'll want to remove any responses of people who may not have given you their honest feedback. Possible things to look out for here include people who answered all multiple choice questions in a straight line or pattern, or, if you implemented a page timer, people who responded to your survey unusually quickly. If you used open-ended answers, read the responses to make sure they make sense. Nonsense answers may indicate a bot, or someone speeding through a survey to receive an incentive.
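
As an illustration of that kind of screening, here is a minimal Python sketch that flags straight-liners and speeders. The response records and the 60-second threshold are assumptions for the example, not rules:

```python
# Illustrative response records (assumed structure): answers to scale questions
# plus time spent on the survey, in seconds.
responses = [
    {"id": 1, "answers": [3, 3, 3, 3, 3, 3], "seconds": 45},
    {"id": 2, "answers": [4, 2, 5, 3, 4, 1], "seconds": 310},
    {"id": 3, "answers": [5, 4, 4, 3, 5, 2], "seconds": 12},
]

MIN_SECONDS = 60  # assumed threshold for "unusually quick"

def is_suspect(record) -> bool:
    straight_lined = len(set(record["answers"])) == 1  # same answer to every question
    too_fast = record["seconds"] < MIN_SECONDS         # sped through the survey
    return straight_lined or too_fast

flagged = [r["id"] for r in responses if is_suspect(r)]
print(flagged)  # [1, 3]
```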

Next, prepare your data for analysis. If validation was not used in the build phase, or there were any breaks in the validation, it's important to make sure all answers are collected in a consistent format.

Finally, you'll prepare any open-text data for reporting. When analyzing open-text responses, it's important not to introduce any new bias. Be sure not to change question or answer text when preparing your data.

Open Text Analysis is a very useful tool for quantifying open text responses and transforming them into actionable data. Using Open Text Analysis, you can read through the responses to each open text question in your survey and bucket them into categories. This allows you to report on textboxes, essays, and other open-text questions as a pie chart or bar chart! Learn how to navigate Open Text Analysis and bucket responses to prepare your data for reporting.
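
If you want a feel for what bucketing does, a very rough approximation outside of Alchemer is simple keyword matching. The buckets, keywords, and responses below are made up for the grocery example:

```python
from collections import Counter

# Illustrative buckets and keywords (assumed categories for the grocery example).
buckets = {
    "price": ["expensive", "cost", "price", "cheap"],
    "availability": ["stock", "find", "available", "selection"],
    "time": ["time", "busy", "cook", "prepare"],
}

responses = [
    "Healthy options are too expensive",
    "I can never find fresh produce in stock",
    "No time to cook after work",
]

counts = Counter()
for text in responses:
    lowered = text.lower()
    for bucket, keywords in buckets.items():
        if any(keyword in lowered for keyword in keywords):
            counts[bucket] += 1

# These bucket counts are what you'd chart as a pie or bar chart.
print(counts)
```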

Run individual reports segmented per learning objective

Segmented reports can be used to determine the "highlights" of data collected as they relate to the ultimate actions so that you can truly understand the most significant findings of your research. 

Here are the 3 learning objectives we identified at the beginning of our project. 

  • Determine what types of healthy foods people in my city are currently buying and eating.
  • Evaluate the level of interest in healthy food choices.
  • Identify barriers in purchasing and eating healthy foods.


In this stage, we'd create segments that include the data for questions related to types of healthy foods the sample currently buys and eats, interest in healthy food choices, and barriers in purchasing and eating healthy foods.

Analyze Data

When looking at these reports, the data that you're looking at should connect directly back to your objectives. Ask yourself: Did you get your questions answered? Is the data in the format you expected? Are you seeing the trends you anticipated? If you realize that the answer to any of these questions is no, you can always redistribute the survey to ensure you collect actionable and usable data.

There are many reporting and analysis tools you can use to help make sure your data analysis results in accurate and usable conclusions.

  • Standard Report. The Standard Report sets the standard for reporting in Alchemer. There are a multitude of customization options you can choose from within this reporting option to make sure your data is displayed to suit your needs.
  • Statistical Package for the Social Sciences (SPSS). SPSS is an analytics software tool used by marketers and researchers around the world. Most top research agencies use SPSS to analyze survey data and mine text data to get the most out of their research projects. If your company has a copy of SPSS and you're looking for a more advanced reporting tool, this export is an excellent option.
  • Excel Export. This allows you to export your raw data into an Excel sheet for use in any application of your choosing.
  • TURF Report. TURF stands for Total Unduplicated Reach and Frequency. It is a statistical model that estimates market potential with limited resources. It can help you choose the configuration that reaches the maximum number of people in your target audience; a sketch of the idea follows this list.
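
Here is the TURF sketch referenced above: a minimal Python example, under made-up assumptions (hypothetical respondent picks and a two-product shelf limit), that finds the product combination reaching the largest number of unique respondents:

```python
from itertools import combinations

# Illustrative data: which healthy products each respondent said they'd buy.
respondent_picks = [
    {"oat milk", "kale"},
    {"kale", "quinoa"},
    {"oat milk"},
    {"quinoa", "lentils"},
    {"lentils"},
]

products = {"oat milk", "kale", "quinoa", "lentils"}
shelf_space = 2  # assume we can only stock two of these products

# TURF's core idea: pick the combination that reaches the most unique respondents.
best = max(
    combinations(sorted(products), shelf_space),
    key=lambda combo: sum(1 for picks in respondent_picks if picks & set(combo)),
)
reach = sum(1 for picks in respondent_picks if picks & set(best))
print(best, reach)  # e.g. ('kale', 'lentils') reaches 4 of 5 respondents
```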

Consider the following suggestions for effective reports

  • Write a summary at the beginning of your report. In this summary, include information about the ultimate goal of the survey, who was surveyed, who was the population, who responded, and basic highlights of the survey audience and your data to introduce the findings.
  • Write a mini-summary for each individual objective. These can be included in the report segments, and can include the recommended actions to take based on the results of the survey.
  • Include any interesting and unexpected trends. This optional step is where you can mention any good-to-know, but not need-to-know, information you found. For example, you might have discovered a new, unintended segment of your population that could help you make good business decisions moving forward.
  • Write a conclusion. This should recap what actions are going to be taken (if any) based on your findings. This conclusion should be agreed upon by all stakeholders. It's important to determine what metrics will be used to measure the success of the actions that will be taken based on this survey data. You might even consider creating a survey to send to stakeholders in order to gain feedback on the project and put actions into motion.

Act

Now it's time to do something with your data! 

This step is where you begin to see the fruits of your labor, but it can often be put off or forgotten. It's important to empower action-taking among your team by reminding them of the survey's goals and that those goals won't be achieved if the data sits with no action taken.

Once action plans have been rolled out and are live, it's important to monitor their impact. This is where the metrics established when you reported on your findings really come into play. If you found that your population wanted more local farmers' produce in your grocery store, did adding those items actually increase revenue in a meaningful way?

You can also get feedback on the study to determine what worked, what didn't work, and where you want to improve for next time. This can be in the form of a short survey sent to all that were involved in the process. Ask for any suggestions that they may have so that you can work better together in the next study and improve the process. The best way to improve is to ask for feedback!

Finally, share the feedback. This should be a very short summary of all of the data in the study. It can include highlights, final financial information (ROI), and actions that were taken and their success. You can also include any changes that will be made moving forward based on the feedback you received from stakeholders about this study.


