The quest for distractors: Getting the wrong answers right

I need distractors. Not because I am bored and in need of a distraction, but because I am looking for information on distractors in multiple choice questions.

The word distractor refers to the alternatives to the correct answer. So we are talking about the baddies, the wrong ones. But how do I get those right?

The literature is certain about one thing in relation to distractors, and anyone who has ever developed a test with multiple choice questions will agree:

It is difficult to create good distractors.

Naturally, writing a good question (or item) is an art, but we get that right most of the time. Creating good distractors – plausible alternatives – is the real challenge.

Here’s some help.

An important condition for the answer alternatives is that all distractors must be plausible. That is, each distractor should be a credible potential answer to the question.

Put differently: a candidate who is not an expert in the subject matter should see all of the response options as equally logical and probable.

The purpose of good distractors is to distinguish the good test takers from the bad ones. You want a well-prepared candidate to answer all of the questions correctly, while the candidate who has not studied hard enough is thrown off balance by the answer alternatives. That candidate starts guessing which answer fits the question and, hopefully, ends up failing the test.
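To put a rough number on why guessing should not pay off, here is a minimal sketch in Python. The 40-question test length and the 55% pass mark are hypothetical figures chosen for illustration, not values from this article.

```python
from math import ceil, comb

def pass_probability(n_questions, n_alternatives, pass_fraction):
    """Chance that a candidate who guesses every question still passes.

    Each guess is correct with probability 1 / n_alternatives, so the
    number of correct answers follows a binomial distribution.
    """
    p = 1 / n_alternatives
    needed = ceil(pass_fraction * n_questions)  # questions required to pass
    return sum(
        comb(n_questions, k) * p**k * (1 - p) ** (n_questions - k)
        for k in range(needed, n_questions + 1)
    )

# Hypothetical example: 40 questions, 4 alternatives each, 55% needed to pass.
print(pass_probability(40, 4, 0.55))  # prints a probability well under 0.1%
```

With plausible distractors on every question, a pure guesser is extremely unlikely to pass; the danger comes from implausible distractors that let the guesser eliminate options.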

Recommendations for creating great distractors:

  1. All distractors must be equally plausible, and grammatically and factually correct.
  2. All distractors must be of similar length; keep them brief and provide relevant information in the question rather than in the answer alternatives.
  3. Make use of counterexamples when creating distractors, but do not use (double) negatives.
  4. All distractors must be written in the same style. If possible, avoid jargon and watch out for vague descriptions.
  5. Use a limited number of distractors. Three answer alternatives are as good as four; in practice, one out of four answer alternatives is rarely selected.

And on that last point: it is best practice to analyse the performance of your items and verify whether all of the answer alternatives are actually used by candidates. Have a look at the pie chart below.

[Pie chart: how often each of the alternatives A to D was selected by the 201 candidates]

You can see that alternative D is never selected, and alternative C is selected by only two of the 201 candidates.

Conclusion: there is work to be done.

In any case, alternative D can be deleted.
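If your testing platform lets you export raw responses, you can run this kind of distractor analysis yourself. Below is a minimal sketch in Python; the response data is invented to mirror the example above (201 candidates, alternative C chosen twice, alternative D never), and the exact split between A and B is my own assumption.

```python
from collections import Counter

def distractor_analysis(responses, alternatives="ABCD", min_share=0.05):
    """Count how often each alternative was chosen for a single item.

    `responses` holds one chosen alternative per candidate, e.g. ["A", "C", ...].
    Alternatives picked by fewer than `min_share` of candidates are flagged.
    """
    counts = Counter(responses)
    total = len(responses)
    for alt in alternatives:
        n = counts.get(alt, 0)
        share = n / total if total else 0.0
        flag = "  <- review this distractor" if share < min_share else ""
        print(f"{alt}: {n:4d} ({share:6.1%}){flag}")

# Hypothetical responses for one item answered by 201 candidates.
responses = ["A"] * 150 + ["B"] * 49 + ["C"] * 2
distractor_analysis(responses)
```

Run on data like this, alternatives C and D are flagged immediately, which is exactly the signal that there is work to be done on the item.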

Last but not least: Four-eyes principle

Always remember that the best way to produce good questions and answers is the four-eyes principle.

When you have created a question and its answer alternatives, have it checked by a colleague. Two pairs of eyes see more than one.

Good luck with making good distractors.

Want to read more?

Writing good multiple choice test questions

Multiple choice is king

Computer-based testing tools all offer the well-known and much-maligned multiple choice question, also known as the MCQ item type.

But did you know that testing (or examination) tools offer many other item types? And that most of these are based on closed questions?

Candidates’ responses to closed questions can be automatically marked.

In my view this is a great example of the benefits that testing software offers over the classic paper-and-pencil test.

Providing the candidate with the result of his or her test does not require manual intervention. The result can be automatically sent to the candidate or the institution that sponsors the test.

Types of closed questions

Examples of closed questions include:

  • Multiple Choice Single Response: One of the answers is correct
  • Multiple Choice Multiple Response: More than one answer is correct
  • Drag and drop (matching): Drag an object onto the correct place in an image or piece of text
  • Ranking: Put lines of text or images in the correct order
  • Fill in the blank: Enter the correct word or combination of words into a text box
  • Hotspot: Place a marker on the correct spot in a picture, video or image
  • Numerical: Enter the answer to a numerical or mathematical question

All of these questions can be marked automatically.
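To illustrate why closed questions lend themselves to automatic marking, here is a minimal sketch in Python. The item structure and field names are my own assumptions for the sake of the example, not the data model of any particular testing tool.

```python
def mark_item(item, response):
    """Automatically mark one closed item; returns True if the response is correct.

    `item` is a dict with a "type" and a "key" (the correct answer); this
    simplified structure is an assumption made for illustration.
    """
    kind = item["type"]
    key = item["key"]
    if kind == "single_response":        # exactly one correct alternative
        return response == key
    if kind == "multiple_response":      # a set of correct alternatives
        return set(response) == set(key)
    if kind == "ranking":                # the exact order is required
        return list(response) == list(key)
    if kind == "fill_in_the_blank":      # ignore case and surrounding whitespace
        return response.strip().lower() == key.strip().lower()
    if kind == "numerical":              # allow a small tolerance
        return abs(float(response) - float(key)) <= item.get("tolerance", 0)
    raise ValueError(f"unknown item type: {kind}")

# Hypothetical items and candidate responses.
print(mark_item({"type": "single_response", "key": "B"}, "B"))                    # True
print(mark_item({"type": "multiple_response", "key": ["A", "C"]}, ["C", "A"]))    # True
print(mark_item({"type": "numerical", "key": "9.81", "tolerance": 0.01}, "9.8"))  # True
```

Because every rule is mechanical, the result can be calculated the moment the candidate submits the test, which is what makes instant feedback possible.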

But are all these item types used?

Well no, not really.

Have a look at this data that we pulled from Sisto:

Item type                             Volume        Share
Hotspot (single marker)                    3,988      0.0%
Hotspot (multiple markers)                 4,248      0.0%
Fill in the blank (single)                37,043      0.1%
Fill in the blanks (multiple)             56,546      0.1%
Multiple Choice Single Response       38,692,327     90.2%
Multiple Choice Multiple Response      1,273,913      3.0%
Numerical                                898,283      2.1%
Essay (computer based)                 1,428,539      3.3%
Ranking                                   81,307      0.2%
Essay (on paper)                         185,965      0.4%
Matching                                  10,930      0.0%
Speaking                                 231,100      0.5%
Upload                                       166      0.0%
Total                                 42,904,355    100.0%

Clearly the multiple choice item type is the most popular. By a long way.

So why do the technical specifications of tenders or requests for proposal often put such a strong focus on the types of items that a vendor is able to support?

It really is not as important as we are led to believe.

Take it from me: The multiple choice question is king!