Make Me a Match!


One of my favorite (though not often used) test item types is the “matching exercise.”  One class I teach has quite a bit of vocabulary that my students just flat-out need to memorize.  Matching seems like a good, concise way of testing them with a minimum of pain on their part (writing the answers) and on mine (creating the test).

The sources all agree on the definition:

A matching exercise consists of a list of questions or problems to be answered along with a list of responses.  The examinee is required to make an association between each question and a response.

(Source: http://www.iub.edu/~best/pdf_docs/better_tests.pdf)

I was pleased to see this same source describing the types of material that can be used:

The most common is to use verbal statements…  The problems might be locations on a map, geographic features on a contour map, parts of a diagram of the body, biological specimens, or math problems.

Similarly, the responses don’t have to be terms or labels, they might be functions of various parts of the body, or methods, principles, or solutions.

Another source, http://teaching.uncc.edu/learning-resources/articles-books/best-practice/assessment-grading/designing-test-questions, lists these pairings:

  • terms with definitions
  • phrases with other phrases
  • causes with effects
  • parts with larger units
  • problems with solutions

As you can see, this test item format is well suited for testing the Knowledge Level of Bloom’s Taxonomy; however, several sources hint that it can also apply to the Comprehension Level “if appropriately constructed.”

Only one source discusses in detail how to “aim for higher order thinking skills” by describing variations that address, for example, Analysis and Synthesis.  (http://www.k-state.edu/ksde/alp/resources/Handout-Module6.pdf)

One variation is the Keylist or Masterlist: the student is given information about several objects and asked to interpret the meaning of that information, make comparisons (least/greatest, highest/lowest, etc.), and translate symbols.  The example gives three elements from the periodic table with their properties listed below them but with no labels on the properties.  The questions ask “Which of the above elements has the largest atomic weight?”, “Which has the lowest melting point?”, and other similar inquiries.

Another variation is a ranking example:

Directions:  Number (1 – 8) the following events in the history of ancient Egypt in the order in which they occurred, using 1 for the earliest event.

These directions are followed by a list of events.

While I see these variations as more of a “fill-in-the-blank” type, they match properties to objects or events to a timeline, so it is reasonable to treat them as matching types.

What are the advantages and disadvantages of matching exercises?

(Source: http://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/)

These questions help students see the relationships among a set of items and integrate knowledge.

They are less suited than multiple-choice items for measuring higher levels of performance.

 

(Source: http://www.iub.edu/~best/pdf_docs/better_tests.pdf)

Because matching items permit one to cover a lot of content in one exercise, they are an efficient way to measure.

It is difficult, however, to write matching items that require more than simple recall of factual knowledge.

 

(Source:  http://teaching.uncc.edu/learning-resources/articles-books/best-practice/assessment-grading/designing-test-questions)

Maximum coverage at knowledge level in a minimum amount of space/prep time.

Valuable in content areas that have a lot of facts.

But

Time consuming for students.

There are design strategies that can reduce the amount of time it takes for students to work through the exercise, and others that don’t put so much emphasis on reading skills.  We’ll look at those in the next post.

Multiple Choice and Bloom’s Taxonomy

[Bloom’s Taxonomy pyramid graphic, cake style; used with permission]

Graphic from http://tips.uark.edu/using-blooms-taxonomy/

It is often thought that multiple choice questions can test only the first two levels of Bloom’s Taxonomy: remembering and understanding.

However, the resources point out that multiple choice questions can be written for the higher levels:  applying, analyzing, evaluating, and creating.

First, we can recognize the different types of multiple choice questions.  While I have used all of these myself, it never occurred to me to classify them.

Source: http://www.k-state.edu/ksde/alp/resources/Handout-Module6.pdf

Types:

  • Question/Right answer
  • Incomplete statement
  • Best answer

In fact, this source states:

…almost any well-defined cognitive objective can be tested fairly in a multiple choice format.

Advantages:

  • Very effective
  • Versatile at all levels
  • Minimum of writing for student
  • Guessing reduced
  • Can cover broad range of content

Can provide an excellent basis for post-test discussion, especially if the discussion addresses why the incorrect responses were wrong as well as why the correct responses were right.

Disadvantages:

  • Difficult to construct good test items
  • Difficult to come up with plausible distractors/alternative responses

They may appear too discriminating to students, especially when the alternatives are well constructed, and they are open to misinterpretation by students who read more into questions than is there.

So what can we do to make multiple choice questions work for higher levels of Bloom’s?

Source: http://www.uleth.ca/edu/runte/tests/

To Access Higher Levels in Bloom’s Taxonomy

Don’t confuse “higher thinking skills” with “difficult” or “complicated.”

      • use data or pictures to go beyond recall
      • use multiple choice to get at skill questions

Ideas:

  • Read and interpret a chart
  • Create a chart
  • “Cause and effect”  (e.g., read a map and draw a conclusion)

Another part of this source brings up the idea of using the “inquiry process” to present a family of problems that ask the student to analyze a quote or situation.

    • No more than 5 or 6 questions to a family
    • Simulates going through the inquiry process, step by step
      • Identify the issue
      • Address the advanced skill of organizing a good research question
      • Ask an opinion question (but not the student’s opinion)
      • Analyze implicit assumptions
      • Provide a condition contrary to the facts (“hypothesize”)

This source gives some good ideas, too.

Source: http://www.k-state.edu/ksde/alp/resources/Handout-Module6.pdf

Develop questions that resemble miniature “cases” or situations.  Provide a small collection of data, such as a description of a situation, a series of graphs, quotes, a paragraph, or any cluster of the kinds of raw information that might be appropriate material.

Then develop a series of questions based on that material.  These questions might require students to apply learned concepts to the case, to combine data, to make a prediction on the outcome of a process, to analyze a relationship between pieces of the information, or to synthesize pieces of information into a new concept.

In short, multiple choice questions, when designed with good structure and strategies, can provide an in-depth evaluation of a student’s knowledge and understanding.  It can be challenging to write those good questions, but the benefits are worth the effort.

I thought about writing a summary of what we have learned about multiple choice questions but found this funny little quiz to be better than anything I could come up with:

Can you answer these 6 questions about multiple-choice questions?

Using Bloom’s in Test Writing

[Bloom’s verbs graphic]

When I first started considering Bloom’s Taxonomy, I thought it was a good way to expand my ideas on how to test, but I struggled to apply it directly.  I appreciated the increasing cognitive levels but needed help writing test questions that utilized them.

What I found were lists of verbs associated with each level.  A good one to start with is:

Source: http://www.lshtm.ac.uk/edu/taughtcourses/writinggoodexamquestions.pdf

A table of suggested verbs mapped against the Anderson and Krathwohl adapted levels of Bloom’s Taxonomy of Cognition:

  1. Remember: define, repeat, record, list, recall, name, relate, underline.
  2. Understand: translate, restate, discuss, describe, recognise, explain, express, identify, locate, report, review, tell.
  3. Apply: interpret, apply, employ, use, demonstrate, dramatise, practice, illustrate, operate, schedule, sketch.
  4. Analyse: distinguish, analyse, differentiate, appraise, calculate, experiment, test, compare, contrast, criticise, diagram, inspect, debate, question, relate, solve, examine, categorise.
  5. Evaluate: judge, appraise, evaluate, rate, compare, revise, assess, estimate.
  6. Create: compose, plan, propose, design, formulate, arrange, assemble, collect, construct, create, set-up, organise, manage, prepare.

Here is an extensive list that is printable on one page, useful for reference while you are designing your test:

Bloom’s Verbs, one page.

Other useful lists:

Bloom’s Verbs for Math

Bloom’s Question Frames (looks very good for English, literature, history, etc.)  This gives you nearly complete questions which you can manipulate into test items appropriate to your discipline.

More Bloom’s Question Frames (2 pages).

Bloom’s Verbs for Science

What comes across to me again and again throughout the sources is that considering the hierarchy when designing exams creates a culture of learning that involves thinking deeply about the course material, taking it beyond simple rote memorization and recitation.

This culture would also benefit from considering Bloom’s while you are teaching.  Modeling higher-level thought processes, showing joy at cognitive challenges, and exploring topics in depth (if time permits) or mentioning that the depth exists (if time is short) can send a strong signal that thinking is valued and important to learning.

Another view on Bloom’s as applied to test writing is to consider the knowledge domains inherent in your course material.  They are:

Source: http://www.lshtm.ac.uk/edu/taughtcourses/writinggoodexamquestions.pdf

The kinds of knowledge that can be tested:

  • Factual Knowledge: terminology, facts, figures
  • Conceptual Knowledge: classifications, principles, theories, structures, frameworks
  • Procedural Knowledge: algorithms, techniques and methods, and knowing when and how to use them
  • Metacognitive Knowledge: strategy, overview, self-knowledge, knowing how you know

When I put this list together with the verb lists, I get more ideas for test questions and more directions for exploring how students are acquiring the course knowledge.

Defining Bloom’s Taxonomy

[Revised Bloom’s Taxonomy triangle graphic]

One recurring recommendation in the resources is that we should consider Bloom’s Taxonomy when designing tests. To do so, we should know what it is.

The triangle above is a version of the revised Bloom’s: it renames the levels with active verbs, replaces Synthesis with a new top level (Creating), and slightly reorders the top of the hierarchy.

According to http://www.learnnc.org/lp/pages/4719,

Bloom’s Taxonomy was created in 1948 by psychologist Benjamin Bloom and several colleagues. Originally developed as a method of classifying educational goals for student performance evaluation, Bloom’s Taxonomy has been revised over the years and is still utilized in education today.

The original intent in creating the taxonomy was to focus on three major domains of learning: cognitive, affective, and psychomotor. The cognitive domain covered “the recall or recognition of knowledge and the development of intellectual abilities and skills”; the affective domain covered “changes in interest, attitudes, and values, and the development of appreciations and adequate adjustment”; and the psychomotor domain encompassed “the manipulative or motor-skill area.” Despite the creators’ intent to address all three domains, Bloom’s Taxonomy applies only to acquiring knowledge in the cognitive domain, which involves intellectual skill development.

The site goes on to say:

Bloom’s Taxonomy can be used across grade levels and content areas. By using Bloom’s Taxonomy in the classroom, teachers can assess students on multiple learning outcomes that are aligned to local, state, and national standards and objectives. Within each level of the taxonomy, there are various tasks that move students through the thought process. This interactive activity demonstrates how all levels of Bloom’s Taxonomy can be achieved with one image.

Further, http://www.edpsycinteractive.org/topics/cognition/bloom.html tells us,

The major idea of the taxonomy is that what educators want students to know (encompassed in statements of educational objectives) can be arranged in a hierarchy from less to more complex.  The levels are understood to be successive, so that one level must be mastered before the next level can be reached.

And also,

In any case it is clear that students can “know” about a topic or subject in different ways and at different levels.  While most teacher-made tests still test at the lower levels of the taxonomy, research has shown that students remember more when they have learned to handle the topic at the higher levels of the taxonomy (Garavalia, Hummel, Wiley, & Huitt, 1999).

Let’s see what each level represents.  The following list is based on the original Bloom’s categories, but it is still enlightening.

Source: http://www.calm.hw.ac.uk/GeneralAuthoring/031112-goodpracticeguide-hw.pdf

Table 2.2 Bloom’s taxonomy and question categories (competence and the skills demonstrated)

  • Knowledge: recall of information; knowledge of facts, dates, events, places
  • Comprehension: interpretation of information in one’s own words; grasping meaning
  • Application: application of methods, theories, concepts to new situations
  • Analysis: identification of patterns; recognition of components and their relationships
  • Synthesis: generalize from given knowledge; use old ideas to create new ones; organize and relate knowledge from several areas; draw conclusions, predict
  • Evaluation: make judgments; assess value of ideas, theories; compare and discriminate between ideas; evaluate data

Based on the work of Benjamin S. Bloom et al., Evaluation to Improve Learning (New York: McGraw-Hill, 1981).

We will look at these in more detail in the next post.