In many fields, “common sense” can lead students astray. Before ever stepping into a classroom, students have formed hypotheses and theories based on observation and experience, but what seems to make sense from casual observation may, in fact, be false. These misconceptions can be worse than complete ignorance, because they must be corrected before new information can be learned. In fact, most of the time students simply modify their existing understanding to accommodate the new concepts rather than internalizing the correct knowledge, producing a mash-up of correct vocabulary and partially correct theories (Hestenes, 2006).
There are several important questions related to student misconceptions. First, what misconceptions do students have when they begin a course? Also, is the course effective at replacing misconceptions with a deep understanding of the concepts essential to the course, or are students learning the material by rote? Finally, are some teaching methods more effective than others at imparting this deep learning? Unfortunately, misconceptions like these can be challenging to assess using conventional methods.
One way to address these misconceptions is by administering a Concept Inventory assessment. A concept inventory is a multiple choice test that forces students to choose between the correct concepts and common sense alternatives (Hestenes, Wells, & Swackhamer, 1992). The inventory is administered at the beginning of a course to get a baseline level of student understanding, and again at the end of a course. The difference between the scores represents the students’ change from misconception to accurate and deep understanding of the concepts.
Because concept inventories are designed to assess understanding of concepts, the questions focus on reasoning, logic, and general problem solving, rather than facts, definitions, or computations. An initial question may be followed by a second multiple choice question that asks for the reason why an answer was given. For example, the following two questions are part of the Chemistry Concept Inventory (Mulford, 1996). Answers follow at the end of the article.
Unlike traditional multiple-choice exams, concept inventory questions are criterion-referenced, meaning the questions should be directly linked to the concepts and misconceptions the inventory is designed to assess. The distracters (incorrect responses) for each question should be matched to common misconceptions.
To create a concept inventory, begin by selecting the theories or concepts that are most critical to success in the subject area. Then, identify common misconceptions that students have about those concepts. For experienced faculty members, this could be based on observation and experience, at least initially. For greater accuracy, misconceptions can also be identified through open-ended exams that require students to explain their reasoning. Interviews with students are very informative about the common sense theories they have constructed. It also may be possible to review literature on common student misconceptions about the concepts.
Use the common misconceptions to develop multiple-choice questions that are problem-oriented and concept-based rather than computational or factual. To many faculty, the questions on a concept inventory seem too easy or trivial, but that is to be expected (Hestenes, Wells, & Swackhamer, 1992). Because the questions are based on essential concepts rather than complexities, errors indicate a lack of understanding, while correct responses may not indicate mastery as traditionally understood.
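Because each distracter is deliberately matched to a specific misconception, pre-test responses can be tallied to estimate how prevalent each misconception is in a class. The sketch below illustrates this idea; the question-to-misconception mapping and the student responses are invented for illustration, not taken from any published inventory.

```python
from collections import Counter

# Hypothetical answer key mapping (question, choice) to the misconception
# that distracter was written to capture; "correct" marks the right answer.
key = {
    (1, "a"): "correct",
    (1, "b"): "heavier objects fall faster",
    (1, "c"): "motion implies force",
    (2, "a"): "motion implies force",
    (2, "b"): "correct",
    (2, "c"): "force is used up",
}

def misconception_counts(responses):
    """Tally how often each misconception's distracter was chosen.

    responses: one dict per student, mapping question number to the
    letter that student chose.
    """
    tally = Counter()
    for student in responses:
        for question, choice in student.items():
            label = key[(question, choice)]
            if label != "correct":
                tally[label] += 1
    return tally

# Three illustrative students' pre-test answers
pretest = [{1: "b", 2: "a"}, {1: "a", 2: "a"}, {1: "c", 2: "b"}]
print(misconception_counts(pretest))
```

Tallies like these suggest which misconceptions are most common before instruction, which in turn suggests where to focus teaching.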
After administering the concept inventory as both a pre- and post-test, compare the scores. Ideally, the scores should improve substantially. If there is little change overall, or little change for a particular concept, reconsider the questions, and examine the teaching strategies used. If possible, it is particularly helpful for multiple faculty members to administer the inventory to multiple sections. Over time, continue to revise teaching strategies to improve students’ mastery of the concepts they struggle with.
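One common way to summarize the pre/post comparison described above is the class-average normalized gain: the actual improvement divided by the improvement that was possible. The function below is a minimal sketch of that calculation; the scores shown are illustrative, not data from any real course.

```python
def normalized_gain(pre_pct, post_pct):
    """Class-average normalized gain: actual gain over possible gain.

    pre_pct and post_pct are class mean scores as percentages (0-100).
    """
    if pre_pct >= 100:
        raise ValueError("pre-test mean must be below 100%")
    return (post_pct - pre_pct) / (100 - pre_pct)

# Illustrative numbers: a class that averages 40% on the pre-test and
# 70% on the post-test has closed half the gap to a perfect score.
g = normalized_gain(40, 70)
print(round(g, 2))  # 0.5
```

Because the gain is scaled by how much room for improvement existed, it allows fairer comparison across sections that start at different pre-test levels.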
Naturally, there are many factors that affect the results of a concept inventory. The ultimate goal is to identify student misconceptions and to determine whether those misconceptions are corrected. Hestenes and Halloun (1995) argue that a well-written concept inventory, like their Force Concept Inventory (FCI), is best analyzed as a whole rather than as individual questions. The result is an indication of how well students understand the concepts overall, as opposed to how they respond to specific questions.
Developing an accurate and valid concept inventory is a matter of research, time, and revision. Fortunately, many individuals who have already developed concept inventories welcome other faculty to use their exams and to add their data to the ongoing study of the instrument. Several of those examples follow.
Examples of Concept Inventories:
Concept inventories are most common in mathematics, the sciences, and engineering, but can be applied to any field. The first widely disseminated concept inventory was the Force Concept Inventory (Hestenes, Wells, & Swackhamer, 1992), which assesses basic understanding of Newtonian physics. There are also concept inventories to assess introductory knowledge in chemistry, digital logic (a branch of computer science), and statistics, among many others. Use the links below to view several examples (some require a password that can easily be acquired by emailing the contact listed on the website). Many of the teams welcome other faculty to use the inventories and contribute additional data to ongoing evaluation projects.
Additional examples are available at https://engineering.purdue.edu/SCI/workshop/tools.html (Allen, 2007).
The Faculty Development and Instructional Design Center will be offering a workshop on this topic titled "Concept Inventories: Measuring Learning and Quantifying Misconceptions" on March 8, 2011 from 11:30 a.m. to 1:00 p.m. Registration details will be available soon.
Allen, K. (2007). Concept Inventory Central: Tools. Retrieved September 28, 2010, from https://engineering.purdue.edu/SCI/workshop/tools.html.
Hestenes, D. (2006). Notes for a Modeling Theory of Science, Cognition and Instruction. Retrieved October 1, 2010, from http://modeling.asu.edu/R&E/Notes_on_Modeling_Theory.pdf.
Hestenes, D., & Halloun, I. (1995). Interpreting the FCI. The Physics Teacher, 33, 502-506. Retrieved October 1, 2010, from http://modeling.asu.edu/R&E/InterFCI.pdf.
Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force concept inventory. The Physics Teacher, 30 (3), 141-151. Retrieved October 1, 2010, from http://modeling.asu.edu/R&E/FCI.PDF.
Mulford, D. (1996). Chemistry Concepts Inventory. Retrieved October 1, 2010 from http://jchemed.chem.wisc.edu/JCEDLib/QBank/collection/CQandChP/CQs/ConceptsInventory/CCIIntro.html.
Answers to sample questions
Spectrum is a newsletter for faculty published every fall and spring semester by Faculty Development and Instructional Design Center, Adams Hall 319, Northern Illinois University, DeKalb, Illinois 60115. Phone: (815) 753-0595, Email: firstname.lastname@example.org, Fax: (815) 753-2595, Web site: www.niu.edu/facdev. For more information about featured articles or upcoming faculty development programs, please contact the Center at (815) 753-0595 or email@example.com