Writing multiple choice (and other) questions

"Executive" Summary:

  • Items (question + options) should ...
    • Test concepts defined in the course's learning goals.
    • Test the same learning level as the corresponding learning goal.
    • Reflect a balanced range of "Bloom's levels".
    • Use item analysis or validation to establish validity.
  • Stems (the question) should ...
    • Use complete statements.
    • Keep total length as short as possible, without being confusing.
    • Ask for the correct answer, not the wrong answer.
    • Avoid absolute or imprecise terms.
    • Avoid logical and grammatical clues.
  • Options should ...
    • Follow grammatically from the stem.
    • Relate to other options.
    • Use "none of the above" and "all of the above" only sparingly.
    • Have numerical or chronological options ordered logically.
    • Have varied positions for correct answers.

Additional resources:

This page is derived from the 8-page summary in the CTLT hard-copy resources.

General remarks:

Select topics for your questions by reviewing the readings, activities, and objectives, and identify the important concepts to be tested. Well-written learning goals do this for you automatically. Guidelines for selecting topics are:

  1. Ask only about important concepts.
  2. Constantly ask yourself "does this really matter?"
  3. Follow learning goals closely.
  4. Learning Goals are easier to use if they encompass a range of Bloom's Taxonomy.
  5. Ask questions covering a range of Bloom's Taxonomy. More on this below, including corresponding question stems.
  6. Use precedent: morph old questions, start from old question sets, use online question sets, etc.
  7. Have colleagues "vet" the questions you write. This is very important.
  8. Examine student response patterns to improve questions for next time.

How to ask it?

When beginning to construct a multiple-choice question, you should pose the question (the "stem") first. A well-constructed stem is a stand-alone question that can be answered without examining the options. Some guidelines are:

  1. Stems should be stand-alone questions.
  2. Stems should be grammatically complete.
  3. Negative stems should be used with caution.
  4. Do not incorporate words in distractors that could be incorporated in the stem.
  5. State stems so that one option is indisputably correct.
  6. Construct using simple sentences.

The wording of the stem, and the verbs it contains, determines the overall cognitive level of the question. Bloom's Taxonomy can help you prepare stems that test concepts at the appropriate level, but don't ignore "low-level" learning goals: treat the taxonomy as a pyramid, a sequence from foundation to expert behaviour. Writing multiple-choice questions at the higher Bloom's levels is possible, but can be very difficult and time-consuming.

What options to include?

Create the options (both the correct and incorrect answers) after writing the stem. Options should test understanding of important concepts and probe common misconceptions. Creating plausible distractors is the most difficult aspect of writing MCQs. Guidelines for constructing options:

  1. Good distractors are:
    1. Accurate statements that do not meet the full requirements of the problem.
    2. Incorrect statements that might seem right to the student.
    3. Plausible, but still clearly incorrect on careful reading.
  2. The intended answer is correct and unarguably the best.
  3. "All of the above" should be avoided; "none of the above" should be used with caution.
  4. Distractors should be of similar length.
  5. Distractors should be grammatically consistent with the stem.
  6. Use parallelism in constructing the distractors.
  7. Numerical answers should be placed in numerical order if possible.
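
Guideline 7 above, together with the summary's advice to vary the position of the correct answer, is easy to automate when assembling a test programmatically. Below is a minimal sketch in Python; the function name and data layout are illustrative assumptions, not part of any quiz tool's API:

```python
import random

def arrange_options(options, correct, numerical=False, rng=random):
    """Return options in presentation order, plus the index of the correct one.

    options   -- list of answer choices, including the correct one
    numerical -- if True, sort into numerical order (guideline 7);
                 otherwise shuffle so the key's position varies between items
    """
    ordered = sorted(options) if numerical else rng.sample(options, len(options))
    return ordered, ordered.index(correct)

# Numerical options appear in ascending order; the key can land anywhere.
opts, key = arrange_options([12, 3, 47, 8], correct=8, numerical=True)
# opts == [3, 8, 12, 47], key == 1
```

Shuffling per item (rather than always placing the key at, say, position C) is what prevents students from exploiting positional patterns.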

When developing options it is useful to map them on a continuum from correct to incorrect in order to visualize the “correctness” of a given option.

If all distractors cluster at the incorrect end of the spectrum, the question will be unambiguous.
If options cluster at the correct end of the continuum, the stem should include words like: which is MOST significant? What is MOST important? What would be the BEST solution? These questions require finer discrimination by students. They can also cause problems, so use them with caution and validate if possible.

How hard?

Multiple-choice questions have a reputation for only testing lower-level skills like knowledge and recall. However, it is possible to write questions targeting a higher Bloom's level. Here is one example:

In your argument, you are citing a number of cases from different courts. This is the first time you cite any of these cases. What is the most accurate citation sentence (use your citation manual)?
  1. Wyman v. Newhouse, 93 F.2d 313, 315 (2d Cir. 1937); Henkel Co. v. Degremont, 136 F.R.D. 88, 94 (E.D. Pa. 1991), Willametz v. Susi, 54. F.R.D. 363, 465 (D. Mass. 1972).
  2. Henkel Co. v. Degremont, 136 F.R.D. 88, 94 (E.D. Pa. 1991); Willametz v. Susi, 54. F.R.D. 363, 465 (D. Mass. 1972); Wyman v. Newhouse, 93 F.2d 313, 315 (2d Cir. 1937).
  3. Willametz v. Susi, 54. F.R.D. 363, 465 (D. Mass. 1972); Henkel Co. v. Degremont, 136 F.R.D. 88, 94 (E.D. Pa. 1991); Wyman v. Newhouse, 93 F.2d 313, 315 (2d Cir. 1937).
  4. Wyman v. Newhouse, 93 F.2d 313, 315 (2d Cir. 1937), Willametz v. Susi, 54. F.R.D. 363, 465 (D. Mass. 1972), Henkel Co. v. Degremont, 136 F.R.D. 88, 94 (E.D. Pa. 1991).

Here students are asked to select the citation that is most accurate. All of the citations contain errors, and students are really being asked to "hypothesize" which errors will have the greatest impact on a citation's effectiveness. This question tests at a very high Bloom's level. Example due to Sophie Sparrow and Margaret McCabe of the Pierce Law Center in Concord, New Hampshire.


Ideally, difficult questions should be "validated": tested with students to ensure they read and think through the question the way you expected when it was designed. For example, have a past student or graduate student "think aloud" while you watch them answer the question. Item analysis can also help identify questions that are not performing as expected.

A good example of how validating with students reveals important information about how students perceive questions being posed is in Ding, Reay, Lee, and Bao, Are we asking the right questions? Validating clicker question sequences by student interviews, Am. J. Phys. Vol. 77 No.7, July 2009. Here is a useful paragraph from their "Summary and Discussion" section:

"Many validity issues missed by physics {geoscience, chemistry, whatever ...} experts were revealed by student interviews. Why do experts miss these issues? For them, correctly bringing in relevant information is an automatic task, much as driving a vehicle is for an experienced driver. When answering these questions, the experts optimize their attention allocation, ignoring irrelevant information and filling in missing information. But students don’t possess domain knowledge with the same breadth and depth and their knowledge often is not hierarchically structured. Consequently, students will sometimes perceive the questions differently than experts".

Item Analysis

Item Analysis is a highly recommended process that examines each item in terms of student response patterns. It helps you assess the test's validity, check for possible biases, and evaluate student strengths and weaknesses. The Vista course management system can produce tables for item analysis using "reporting of assessments"; see the Vista documentation or ask a colleague or Vista support person. For an introduction to item analysis, see the last two pages of the 8-page Introduction to Multiple Choice Question Writing (MS Word *.doc format) mentioned above.
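
As a concrete illustration of what an item analysis computes, the sketch below derives two standard statistics from a 0/1 response matrix: the difficulty index (proportion of students answering correctly) and a simple discrimination index (top-group minus bottom-group proportion correct). This is plain Python for illustration; the data layout is an assumption, not Vista's report format:

```python
def item_analysis(responses, group_frac=0.27):
    """Per-item difficulty and discrimination from a 0/1 response matrix.

    responses  -- one list per student; 1 = answered correctly, 0 = not
    group_frac -- fraction of students in the top/bottom comparison groups
                  (27% is a common convention in item-analysis practice)
    """
    n_students = len(responses)
    n_items = len(responses[0])
    ranked = sorted(responses, key=sum)        # weakest to strongest overall
    k = max(1, round(group_frac * n_students))
    bottom, top = ranked[:k], ranked[-k:]

    stats = []
    for i in range(n_items):
        # Difficulty: proportion of all students answering item i correctly.
        difficulty = sum(r[i] for r in responses) / n_students
        # Discrimination: how much better the strong group did on this item.
        discrimination = (sum(r[i] for r in top) - sum(r[i] for r in bottom)) / k
        stats.append((difficulty, discrimination))
    return stats
```

Items with difficulty near 1.0 tell you little, and items with near-zero or negative discrimination (weak students outperforming strong ones) are the first candidates for rewriting or validation.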