Writing Your Own Survey Questions

About

The following information is intended as a general guide to writing survey questionnaires in any discipline. It will be helpful both to those constructing their own surveys and to those adding items to questionnaires within the Survey Builder, where you can enter your own items and response categories using the “Create New Items” icon.

There are many issues not dealt with in this introduction, including sampling issues (who and how many), design issues (when and how often), and analysis and interpretation issues. More information on these issues can be found in the Project Builder within the SoTL website and in the readings listed at the end of the last section.

For further professional assistance in designing, writing, and administering surveys, please contact the staff at the UCF Office of Operational Excellence and Assessment Support (OEAS).

Sections
1. General Helpful Points
2. Writing the Items
3. Writing the Response Choices
4. Pre-Testing or Checking your Survey
5. Further Help

1. General Helpful Points

Don’t reinvent the wheel. Check whether a pre-existing, pre-tested set of questions already addresses the topic you are interested in; many published items and questionnaires have established validity.

Don’t start this process without a complete study plan. Before you write your survey, make sure you have thought through your entire study, including identifying all the factors you want to consider, how you want to measure them, who you will be studying, and how the survey will be administered. The Project Builder tool on the FCTL website can help with this process.

Remember your survey’s purpose. Every question you ask should serve the specific purpose you are investigating and the information you want to obtain from this survey. Before you write your questions, identify the issues you want information on, then write your questions based only on that list. This also helps guide your survey length.

Remember that different question formats exist. Depending on the kind of information you want to obtain and the kind of summaries or analyses you require, different response formats may be more appropriate. These formats are reviewed below.

Work methodically. Writing clear items and a questionnaire that gives you the best possible information is a function of the thought and effort you put into the survey’s development. Remember that question wording can be misinterpreted; if possible, take the time to pre-test your survey or show it to others who can provide feedback about clarity and content.

Be grateful and gracious to your respondents. If possible, thank your respondents at the beginning and end of your survey for their time and format your survey to be as readable and easy to follow as possible. Try to avoid sensitive or threatening questions. Sometimes direct questions that may be disturbing to some respondents can be rephrased.

Unless this survey is being administered only for course assessment, you will need to obtain IRB approval. See the UCF IRB website for more information about the approval process.

2. Writing the Items

Do not start by writing items. Start by listing the topics that you want information on or the knowledge/skills/attitudes that you are interested in examining. If possible, obtain the input of a representative sample of experts (this may be other instructors or other students whose opinions you are assessing) to determine the main content areas that questions should address. This list should then be used as the template for the content of the questions.

Adjust your survey length. Carefully examine each item that you have written and eliminate items that do not contribute unique, necessary information. Also split questions that are compound or that ask two different things at once. Overall survey length should be based on content needs, but keep your audience in mind; you want them to continue paying attention and not skip questions.

Know your audience/participants. Keep the general reading level of your survey takers in mind when reviewing questions for clarity. Word can check documents for reading-grade level (under TOOLS, then OPTIONS, find the SPELLING & GRAMMAR tab and turn on “show readability statistics”; when you spell-check your document you will then also see readability scores). These scores also provide other helpful information, such as the percentage of your document written in the passive voice.
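
If you prefer to check reading level outside of Word, the widely used Flesch-Kincaid grade-level formula can be computed directly. The sketch below is a rough Python illustration, not a replacement for Word's readability statistics: the syllable counter is a simple vowel-group heuristic (an assumption made for brevity), so treat the scores as approximate.

    import re

    def count_syllables(word: str) -> int:
        # Rough heuristic: count runs of vowels; every word gets at least one syllable.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_kincaid_grade(text: str) -> float:
        # Standard Flesch-Kincaid grade-level formula:
        # 0.39 * (words per sentence) + 11.8 * (syllables per word) - 15.59
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        n_words = max(1, len(words))
        syllables = sum(count_syllables(w) for w in words)
        return 0.39 * (n_words / sentences) + 11.8 * (syllables / n_words) - 15.59

    # Example (invented) survey question.
    question = "How satisfied were you with the pacing of the weekly online modules?"
    print(round(flesch_kincaid_grade(question), 1))

A grade level well above your intended audience is a signal to simplify the wording.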

Obtain the kind of information you need. Make sure that your questions ask for the kind of response you need for your study goals and objectives. If you need numerical data for subsequent analysis, your questions should be answerable in discrete categories rather than free-answer (unless you have a system for coding these responses). Also examine each item for precision: are the questions too broad or too general to obtain the information you require?
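
To illustrate what a coding system for free answers might look like, here is a minimal Python sketch. The categories, keywords, and responses are invented for illustration; a real coding scheme should be developed from and checked against your own data.

    # Hypothetical coding scheme mapping free-text answers to discrete numeric codes.
    responses = [
        "I mostly studied with the posted lecture slides",
        "Group study sessions with classmates",
        "I used the textbook and did practice problems",
    ]

    coding_scheme = {
        1: ("slides", "lecture"),     # code 1: course materials
        2: ("group", "classmate"),    # code 2: peer study
        3: ("textbook", "practice"),  # code 3: independent study
    }

    def code_response(text: str) -> int:
        text = text.lower()
        for code, keywords in coding_scheme.items():
            if any(k in text for k in keywords):
                return code
        return 0  # code 0: "other" / uncodable

    counts = {}
    for r in responses:
        c = code_response(r)
        counts[c] = counts.get(c, 0) + 1
    print(counts)  # e.g. {1: 1, 2: 1, 3: 1}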

Match your question with your responses. There should be a match between the phrasing of your items and the response choices that are given, both in grammar and in meaning.

Other general recommendations:

  • Keep your questions simple but precise; ask for exactly what you want.
  • Avoid questions that could be interpreted more than one way.
  • Avoid leading questions or those with obvious socially desirable answers.
  • Phrase sensitive questions carefully and place them at the end of the survey.
  • Put general questions before specific ones.
  • Avoid compound questions; have only one topic per question.
  • Place items with similar phrasing/response choices together.
  • Use negative phrases and questions with “not” carefully and sparingly.
  • Avoid overly technical language unless you are testing this knowledge.

3. Writing the Response Choices

Choose the most appropriate response format. Your choice of format should be based on the purpose of your survey; in general, there is no single correct format. Free or open response items are appropriate for obtaining qualitative or exploratory information, while forced-choice formats, like yes/no items or 5-point rating scales ranging from “strongly disagree” to “strongly agree,” are appropriate for obtaining quantitative information and when all appropriate responses are known.

Types of response format:

Free or open response items. These items leave a blank space or line after the question.

  • Use the wording of your question to guide the level of specificity you are looking for in the answer. For example, if you don’t want a brief answer, don’t ask a very specific question that can be answered briefly.
  • Don’t ask questions that require several levels of thought, like self-evaluation of a recalled event, as answers are harder to compare/interpret.
  • If you have a longer survey with many of these items you can induce respondent fatigue, and people may not finish or may give only short answers.
  • Unless you have a pre-determined coding scheme, these answers can be difficult to compare and score, but they are good for initial or exploratory research.

Yes/No items. These items provide limited information, although they are easy to score. Consider adding a “don’t know” or “no opinion” option if you only want answers from a more knowledgeable or interested sample.

Labeled response or Likert-type scales. These are multiple-choice answer formats with a range of labels that can vary from numbers (1 to 10), to single words (never, sometimes, always), to short phrases (strongly agree to strongly disagree), to full sentences (such as a range of attitudes on a topic). A brief scoring sketch follows the list below.

  • The choice of how many response options to offer (e.g., 5 vs. 7) is debated, but more than 7 is often unnecessary, and fewer than 5 may restrict the range of answers you observe, which can affect statistical calculations if that is a concern. An odd number of responses is preferable because it gives a middle point of choice.
  • Choose clear labels for your response choices and use them consistently throughout the survey; respondents may not notice minor changes in wording and may answer as if the scale were unchanged.
  • Make sure these response choices are exhaustive (e.g., provide an “other” response if necessary) and mutually exclusive to prevent confusion.
  • Ensure your labels are meaningful and will mean the same thing to every person. For example, what do you mean by “frequent” versus “seldom” usage of the internet? Define your terms where possible.
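
Once labels are mapped to numbers, labeled-response items can be summarized quantitatively. The following is a minimal Python sketch, assuming a 5-point agreement scale; the labels and responses are invented for illustration.

    # Score one 5-point Likert item by mapping labels to numbers.
    from statistics import mean, stdev

    scale = {
        "strongly disagree": 1,
        "disagree": 2,
        "neither agree nor disagree": 3,
        "agree": 4,
        "strongly agree": 5,
    }

    item_responses = ["agree", "strongly agree", "neither agree nor disagree",
                      "agree", "disagree", "strongly agree"]

    scores = [scale[r] for r in item_responses]
    print(f"n = {len(scores)}, mean = {mean(scores):.2f}, sd = {stdev(scores):.2f}")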

Match your responses with your question. As mentioned above, the grammar of each response should be appropriate to the question. Each response should either form a full sentence with the question stem or be a complete sentence on its own.

4. Pre-Testing or Checking your Survey

Get feedback: There are various ways that you can get feedback on your survey before the final administration of your study.

  • You can have the survey read by peers or survey-construction experts to evaluate its content or face validity (whether the questions actually relate to the concept you are trying to measure) and whether it is appropriate in terms of length and clarity.
  • You can administer your survey, one person at a time, to a sub-sample of the people you will be surveying in your study and ask them what they think you are looking for with each question, to check your intended versus perceived meaning. Ask respondents to rephrase items as they understood them or to think aloud as they derive answers.
  • You can administer your survey to a sub-sample of the population you want to measure for your study and examine their responses.
  • If you have enough people fill out your survey, you can examine the individual items to see which ones are the best items using IRT (Item Response Theory) analysis. This will give you statistics on individual items: the extent to which they are good indicators of the trait/ability/attitude your survey is attempting to measure. A simplified item-analysis sketch follows this list.
    • For more information, see Dr. Brown's online textbook, which has further links to websites, tutorials, and texts.
  • If you intend to publish your findings in empirical journals you may want to validate all or part of your survey. A full validity study would involve repeated administrations of your survey and other theoretically similar and dissimilar surveys and comparisons of responses on these surveys, as well as performing other analyses like item analyses (use the references below to find statistics/psychometrics texts for more information on this topic).
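
As a lighter-weight alternative to a full IRT analysis, a classical item analysis can flag weak items using corrected item-total correlations (each item correlated with the total of the remaining items). The Python sketch below is illustrative only: the response matrix is invented, and low or negative correlations simply mark items worth reviewing, not a substitute for proper IRT or validity work.

    # Classical item analysis: corrected item-total correlations.
    import statistics

    def pearson(x, y):
        # Pearson correlation between two equal-length numeric lists.
        mx, my = statistics.mean(x), statistics.mean(y)
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
        return num / den if den else 0.0

    # Invented data: rows are respondents, columns are items scored 1-5.
    responses = [
        [4, 5, 2, 4],
        [3, 4, 1, 3],
        [5, 5, 3, 4],
        [2, 3, 2, 2],
        [4, 4, 1, 5],
    ]

    for i in range(len(responses[0])):
        item = [row[i] for row in responses]
        rest = [sum(row) - row[i] for row in responses]  # total excluding this item
        print(f"item {i + 1}: corrected item-total r = {pearson(item, rest):.2f}")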

5. Further Help

The UCF Office of Operational Excellence and Assessment Support (OEAS) offers further professional assistance in designing, writing, and administering surveys.

Dr. Trochim's Research Method website contains further helpful information on survey construction issues.

Dr. Brown's online IRT textbook contains further links to websites, tutorials, and texts.

Further reading:

Fink, A. (2006). How to conduct surveys: A step-by-step guide (3rd Ed.). Thousand Oaks, CA: Sage Publications.

Introductory text on the main issues involved in writing surveys, analyzing information from them, and presenting your findings.

Fowler, F. J. (1993). Survey research methods (2nd Ed.). Newbury Park, CA: Sage Publications.

A text on testing methods and theory for those already familiar with basic statistical concepts and techniques.

Nunnally, J. C. (1978). Psychometric Theory (2nd Ed.). New York: McGraw-Hill.

An advanced text on testing methods and theory with in-depth explanations of reliability, validity, and statistical testing.

Salant, P., & Dillman, D. A. (1994). How to conduct your own survey. New York: Wiley.

Introductory text on writing and administering surveys written in easy-to-understand language.

Trochim, W. M. The Research Methods Knowledge Base (2nd Ed.).

Well-designed website with further information on survey construction, validity and reliability issues, scaling, and other research topics.

 
