In this tutorial, we'll take a closer look at the process of developing a survey. I'll give you a hypothetical situation, and we'll go through each step of designing the survey, considering the questions we might ask and the issues that might arise at each step, so that you can leave this tutorial with a really thorough understanding of what the survey development process might look like.
Recall that the steps involved in designing a survey include choosing a survey topic, selecting the sample or audience for your survey, selecting a survey delivery approach, developing your survey questions, and finally, analyzing your data. So let's examine each of these steps in turn using a hypothetical situation. Let's say that a school district is considering implementing a new learning management system.
Step one is to identify the survey topic. Here you identify the topic of the survey, and also identify which specific pieces of information you want to find out about that topic. So in the case of the school district that is considering implementing a new learning management system, obviously the overarching topic of the survey is learning management systems. But more specifically, what the school district wants to find out is, what are the desired features of a learning management system?
The next step in the process of developing a survey is to select the sample or the audience for the survey. Who is going to be taking the survey? Not only do you want to identify who is going to be surveyed, but you also want to be clear on why you are selecting those people or those groups of people. It's important to survey all of the individuals who will potentially be impacted. So the hypothetical school district is going to want to be sure to survey the teachers, the students, the administration, the parents, and potentially even some other stakeholder groups in the school district.
The next step is to determine the delivery approach for the survey. This is when you make a decision about how the survey will be created and how it will be implemented. In most situations, you are going to find that it will be most effective to use an online tool. Many of these online survey tools will allow you to both create and deliver the survey. And in fact, most of them will also provide at least some level of analysis or reporting of your results as well.
Let's say that our hypothetical district has decided to use a Google Form to create and deliver this survey. The next step in the process is developing your questions. Here you want to think about what types of questions you're going to use and how you are going to structure those questions. Let's take a look at the survey that has been created for our hypothetical situation of a school district shopping for a new learning management system.
The survey begins with a structured question that asks the person taking the survey to identify their role in the district. Now while this list may have been generated by the district in an attempt to create a representative, yet finite list of the different categories of stakeholders that might be involved in the selection of this learning management system, a potential problem here is that not all groups in the district might have been represented.
For example, what if the attendance secretary in the main office attempts to take the survey and finds that there really isn't a category that accurately represents the role of the attendance secretary? The district may wish to include an additional option here, perhaps one simply called "other," so that people who don't fit into any of the predefined categories can still let their opinions be heard and feel that they are accurately represented among the stakeholders.
The next question is unstructured and asks the survey taker to explain why they feel it is important for the district to implement a new learning management system. Well, this question is clearly biased. What if the survey taker doesn't feel that it's important at this time for the district to implement a new learning management system? This question assumes a given point of view. And so the district will want to go back and revise this question to remove the bias.
The next question is another unstructured one asking the survey taker to list features that are important to them in an LMS. Now actually both of these questions have a problem with validity. Both questions use the acronym LMS, but neither question explains what LMS stands for. So another change that the district might choose to make in this survey is to explain the acronym each time that it's used. While this may seem redundant, it may actually help to create more consistent results in the survey as people who are unfamiliar with the term won't have to go back and keep looking up what that acronym stands for.
Next, we have a large grid that actually represents several questions in one. Rather than having the stakeholders rate each individual feature of the LMS on a scale, the possible features have been listed down the left column, and the survey participants are asked to rank these features from least important to most important, according to each individual survey taker and the role that they play in the district.
Next is an unstructured question prompting the survey taker to share any concerns that they might have regarding the selection and implementation of the new learning management system. This is followed by a question that asks the survey taker whether they would be interested in serving on a committee that will research the available learning management system options. There's a problem with this question, though. It's definitely biased. Look at those answer choices. Yes, I would love to be on the committee, and, no, I don't care which option we choose.
There may be any number of reasons why an individual may not want or even be able to serve on this committee. So implying that anyone who selects no is doing so because they don't care is actually quite insulting. This biased set of answer choices needs to be changed so that people don't feel pressured to answer one way or the other.
The last question in the survey is an unstructured question that gives the survey taker one more chance to list any LMS features that are important to them but were not part of the grid above. We have a bit of a validity problem here when we combine this question with the initial question that asked the survey taker to list the features that are important to them in a learning management system. And especially when we consider that the survey takers are already provided with the big grid in which they rank the relative importance of the features, the district is probably going to want to go back and simply remove that first question asking about features.
So let's take a look at the new and improved survey. Survey takers can now select the other option for their role in the school district, and they can identify their specific role in the text box. The two initial unstructured questions have actually been completely removed. The survey takers still have the open-ended or unstructured question that will allow them to share their concerns.
The answer choices for the committee question have been changed to just a simple yes and no. And there is one more open-ended or unstructured question which does provide that opportunity for the survey takers to list any features that they feel may have been missing from the above list.
Now that we've developed the questions for the survey, the final step in the survey development process is deciding how you are going to analyze your data. Think about what kinds of graphs or charts you may want to use in order to display the results of your survey. And do consider the specifics: exactly how are you going to analyze the data that you gather?
So in our hypothetical situation, is the district going to compare the views of teachers to those of students and to those of parents? In fact, if they're not going to compare those different interest groups, they probably don't even need to have that first question on the survey. So clearly, they do intend to compare the needs and wants of these various groups of stakeholders.
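To make that comparison concrete, here is a minimal sketch of what grouping survey results by stakeholder role might look like. The response data and the "Gradebook" feature name are hypothetical, invented purely for illustration; they are not taken from the district's actual survey.

```python
from collections import defaultdict

# Hypothetical responses: each record holds the respondent's role and
# their ranking of a "Gradebook" feature (1 = most important).
responses = [
    {"role": "Teacher", "gradebook_rank": 1},
    {"role": "Teacher", "gradebook_rank": 2},
    {"role": "Student", "gradebook_rank": 4},
    {"role": "Student", "gradebook_rank": 3},
    {"role": "Parent",  "gradebook_rank": 2},
]

# Group the rankings by role so stakeholder groups can be compared.
ranks_by_role = defaultdict(list)
for r in responses:
    ranks_by_role[r["role"]].append(r["gradebook_rank"])

# Average ranking per stakeholder group.
for role, ranks in sorted(ranks_by_role.items()):
    print(f"{role}: average rank {sum(ranks) / len(ranks):.1f}")
```

Notice that the grouping is only possible because the survey's first question captured each respondent's role, which is exactly why that question earns its place on the survey.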
Now that I've walked you through the process of designing a survey from start to finish, it's your turn to stop and reflect. Choose a topic that is relevant to your school community right now, and think through the process of developing a survey that would accurately gauge opinions on the topic.
To dive a little deeper and learn how to apply this information, be sure to check out the additional resources section associated with this video. This is where you'll find links targeted toward helping you discover more ways to apply this course material. Thanks for watching today. Have a great day.
(00:00 - 00:54) Introduction
(00:55 - 01:27) Topic
(01:28 - 02:05) Sample
(02:06 - 02:46) Delivery Approach
(02:47 - 08:21) Questions
(08:22 - 09:08) Analyze Data
(09:09 - 09:48) Stop and Reflect