
Rationale

Critics declare that multiple choice questions (MCQs, for brevity) assess only the lowest levels of student learning and give students more room to guess at answers than other question types do. They reward memorization while failing to exercise communication, critical thinking, and analysis skills.

These complaints are often justified, particularly with regard to MCQs taken from textbook supplements. But MCQs as assessment items can be defended on several points. In most fields, multiple forms of assessment are necessary because none is comprehensive or without drawbacks. The obvious advantage of MCQs is that they can be graded quickly, all the more so with computer technology. It may be true that, broadly, education in the information age should de-emphasize memorization in favor of interpretation and creation skills. But many instructors argue that in order to build or exercise those skills, even college students need to learn some basic facts or concepts that previous education, popular culture, or internet searches do not typically provide. Carefully written MCQs can even test somewhat higher-level learning focused on particular concepts, such as comparison, application, and possibly basic analysis.

Unless they employ a proctor or costly technologies and web services, instructors cannot monitor student behavior when an exam is taken outside the classroom, whether students compose answers on paper or in a web-based exam system. A common assumption among faculty is that if students have access to source material, such as the course textbook, their class notes, or the internet, then multiple choice questions are useless.

But with careful question-writing and assessment design, MCQs can still be useful tools online. Quick quizzes, or sets of MCQs, can be useful formative assessments, serving as auto-graded, low-stakes homework that lets students know how they are doing. Using features included in many learning management systems, instructors can even build exams whose multiple choice questions challenge students to demonstrate what they have learned.
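To make "auto-graded" concrete, here is a minimal sketch, in Python and independent of any particular learning management system, of how a low-stakes quiz might be scored so that a student immediately sees which items to review. The answer key and submission are hypothetical.

# Hypothetical answer key and one student's submission for a five-question quick quiz.
answer_key = {"q1": "B", "q2": "D", "q3": "A", "q4": "C", "q5": "B"}
submission = {"q1": "B", "q2": "A", "q3": "A", "q4": "C", "q5": "D"}

def grade_quiz(key, responses):
    """Return the fraction answered correctly and the list of missed questions."""
    missed = [q for q, answer in key.items() if responses.get(q) != answer]
    score = 1 - len(missed) / len(key)
    return score, missed

score, missed = grade_quiz(answer_key, submission)
print(f"Score: {score:.0%}. Questions to review: {', '.join(missed)}")

In practice the learning management system does this scoring for you; the sketch only illustrates why MCQ quizzes are cheap to use as frequent, low-stakes feedback.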

The advice below aims to help instructors get the best out of MCQs and to show that MCQs still have value even in online quizzes and exams. Within larger exams, however, instructors should use MCQs together with other question types or activities that let students demonstrate what and how they have learned in different ways.

Traditional Guidelines for Good Multiple Choice Questions

Treat exam writing as a process, with drafting and revising stages. Don’t try to write more than a few MCQs in a single day. One tactic is drafting one or two questions after you craft a lesson or teach a class, when your sense of what you are teaching is strongest. Later on, revise these questions together. Check to make sure that each has a clear correct answer and that other options are clearly wrong. Decide whether any question yields the answer to another too easily.

Reusing MCQs is practical, but draft a few new MCQs each time you teach a course so that you can swap questions in and out of the active question bank and occasionally retire a few.

The general instructions for the multiple choice questions should be explicit, both within the exam itself and in your discussion of the exam in class: read the questions carefully, and choose the best answer, the one that leaves no room for quibbling or reasoned debate.

If you find yourself repeating a phrase across answer options, include it in the question stem, for efficiency.

For example, if you want students to choose the correct definition of historiography, the stem should be "Historiography is the study of," rather than repeating "is the study of" in each answer option.

Answer options should not quote sources verbatim. Rephrase them away from textbook language or the favorite ways you say things in class. If you have specific reasons to use key terms, phrases, or particular wording, include such clues in both correct and incorrect answers throughout the exam so they do not give away the right choice.

Include at least three wrong answer options, or “distractors,” that are clearly incorrect to prepared students but plausible nonetheless. Silly answers don’t help assess student comprehension. If you are really struggling to find a fourth distractor for a question, consider jargon (even invented) that sounds plausible only to the unprepared.

Negative question stems should be used sparingly. If you must use one, highlight the negative modifier for clarity: “Which of the following was NOT true concerning 1964 Presidential candidate Barry Goldwater?” or “Margaret Sanger advocated all of the following except:” Avoid double negatives.

Be consistent in grammar, spelling, capitalization, syntax, and formatting. Carefully edit your questions to eliminate typos. If you copy and paste parts of questions between computer programs, be sure that font style and size, spacing, and other attributes remain consistent. Any inconsistency can otherwise genuinely or misleadingly hint at a right or wrong answer.

Considerations for Online Exams

MCQs should be considered within the context of good online exam design. MCQs are best employed when delivered to students at random from a larger question bank, perhaps with the answer choices also randomized within each question.
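As a rough illustration of that delivery model, the following sketch, written in Python and independent of any particular learning management system, draws a random subset of questions from a bank and shuffles the answer options for each student. The question bank format and its sample entry are hypothetical.

import random

# Hypothetical question bank: each entry has a stem, one correct answer, and several distractors.
question_bank = [
    {"stem": "Historiography is the study of",
     "correct": "how history has been researched and written",
     "distractors": ["archival preservation methods",
                     "ancient and medieval manuscripts",
                     "the geography of historical events"]},
    # ... more questions ...
]

def build_exam(bank, num_questions, seed=None):
    """Draw a random subset of questions and shuffle each question's answer options."""
    rng = random.Random(seed)  # a per-student seed makes each student's draw repeatable
    selected = rng.sample(bank, k=min(num_questions, len(bank)))
    exam = []
    for q in selected:
        options = [q["correct"]] + list(q["distractors"])
        rng.shuffle(options)  # randomize answer order within each question
        exam.append({"stem": q["stem"], "options": options, "answer": q["correct"]})
    return exam

Most learning management systems can perform both of these steps for you; the sketch only spells out what delivering questions at random from a bank, with shuffled options, actually involves.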


Anatomy of a Multiple Choice Question

Typical of education, even multiple choice questions have jargon! In a multiple choice question, the question or partial statement that prompts students to choose the correct answer is called the stem. Wrong answers are called distractors.

D2L offers many options for crafting and delivering multiple choice questions. See our tutorial video for details.

Online Multiple Choice Questions

To offset some of the advantage students gain from having sources available, multiple choice questions can be made more challenging, without simply making students sweat over details or arcana. Good MCQs that better reflect information-age practice might bend or break traditional rules for writing MCQs. Reasonably prepared students, however, should still be able to interpret and answer the question confidently. Give students the opportunity to show you what they have learned, but do not present them with brain-teasing puzzles or trick questions.

Traditionally, instructors are warned against adding more to an MCQ stem than is strictly necessary to define the question or problem. On the other hand, “wordy problems” closely reflect the real world, where knowledge workers must first identify what the question or problem really is in order to answer it. But make certain that prepared, diligent students can reasonably sift the question free of the surrounding detail and get a fair shot at answering it correctly.

Other instructors frown on stems that consist of a short phrase, perhaps lacking a verb, and that alone do not define a question or problem. But with carefully written alternatives, short stems that merely contextualize options can create effective and reasonable questions:

 

The third option is a basic, major development in U.S. history that is commonly covered in college-level freshman courses. But among the general public fascinated with presidents and battles, it is not widely known. A student even moderately prepared for the exam should have no trouble; a student searching for the answer on the web will waste valuable time. The distractors are reasonable, sounding plausible to guessers but no match for the correct answer for those who paid attention in class or read the textbook.

Various sources counsel avoiding MCQs that present dissimilar alternatives, but how dissimilar the choices should or should not be is largely a subject-matter judgment. The following question has choices that are quite dissimilar yet plausible, with one clear choice for students who read and discussed Clausewitz’s concepts of warfare:

Based on class discussion of Clausewitz’s friction, a student should identify the last answer as correct. The other three choices, dissimilar in several ways, should entice only students who failed to read or to participate in class. (The second option is full of jargon that merely sounds militarily impressive.) At the same time, students coming to the subject cold by typing “Clausewitz” and “friction” into Google will need to spend at least a few minutes reading on the subject. Unless they have quickly answered all the other questions, they probably do not have the time to research this beyond their own notes.

For an added challenge, especially for students inclined to collaborate, you can reuse the correct answers to some questions as distractors in other questions, as sketched below.
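If you keep your question bank as data, as in the hypothetical format sketched earlier, the swap can even be automated. The snippet below is only a minimal illustration of the idea, not a feature of any particular exam system.

import random

def add_cross_question_distractors(bank, extra_per_question=1, seed=None):
    """Borrow correct answers from other questions and add them as extra distractors."""
    rng = random.Random(seed)
    for q in bank:
        # Candidate pool: other questions' correct answers that this question does not already use.
        pool = [other["correct"] for other in bank
                if other is not q
                and other["correct"] != q["correct"]
                and other["correct"] not in q["distractors"]]
        q["distractors"] += rng.sample(pool, k=min(extra_per_question, len(pool)))
    return bank

As with any distractor, a borrowed answer still has to be plausible for the question it lands in, so review the result by hand rather than trusting the shuffle.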

Multiple-Choice Questions: Testing Facts or Comprehension?

Conventionally, MCQs test facts: names, events, dates, definitions, and so forth. But they can go beyond that, up a few steps on Bloom’s Taxonomy. For example, an MCQ can require students to apply theories, general principles, or models to identify or interpret a situation, draw conclusions, or forecast. The following, in addition to the Clausewitz example above, is another example:

Load factor, the first answer option, is the correct choice: it is a standard indicator of utility business efficiency, and it is historically important in transportation history. Perhaps in class, a historical example included data from Buffalo or Cleveland, but not Pittsburgh, and was presented alongside other considerations of public utility businesses. For students who did not pay attention to course content, the wrong answers sound plausible, and the question stem does not provide nearly enough detail for an internet search to help.

More Resources for Multiple Choice Questions

Weimer, Maryellen. "Examining Your Multiple-Choice Questions." Faculty Focus. 26 February 2014. http://www.facultyfocus.com/articles/teaching-professor-blog/examining-multiple-choice-questions/

Center for Teaching and Learning at the University of Texas at Austin. "Multiple Choice Questions." http://ctl.utexas.edu/teaching/assess-learning/question-types/multiple-choice

Teaching Effectiveness Program at the University of Oregon. "Writing Multiple-Choice Questions that Demand Critical Thinking." http://tep.uoregon.edu/resources/assessment/multiplechoicequestions/mc4critthink.html

 
