Multiple-choice (MC) tests are commonly used in educational settings to assess learning outcomes. This test format has many advantages as an instrument for measuring knowledge and skills, owing to its high standardization and economical administration. Despite their popularity, however, MC-questions also pose difficulties. For example, guessing can bias test results: persons who are not sure of the answer may guess correctly and thus achieve a better score than their knowledge warrants. Although MC-tests have been explored extensively from a test-theoretic perspective, little reliable empirical evidence exists about the strategies people actually use when solving MC-items. Understanding these processes could help to improve the construction of MC-questions and thereby the validity of tests. This project uses a stationary eye-tracker to record eye-movements and thus gain direct insight into a person's cognitive processing while solving an MC-test on a computer. The first research approach investigates the impact of knowledge (experts vs. novices) on solution strategies in MC-tests. Potential influencing factors, such as testing-time or knowledge about how to answer MC-questions (testwiseness), will also be considered in future research.
Our eyes are "the key to the visual world" and provide continuous information about our environment. Because only a small area of the eye's retina (the fovea) delivers a sharp image of the outside world, the eyes move constantly, more or less unconsciously: a visual stimulus must fall on the fovea, the place of sharpest vision, to be fully processed. The current gaze direction therefore indicates where a person places attention and which information is being received or processed. In this way, not only the content but also the timing of information acquisition can be traced. Since modern eye-tracking technology allows precise recording of eye-movements, various parameters can be used in the analysis to assess solution strategies for MC-items. Data collection is supplemented by questionnaires and interviews to enrich the information about a subject's processing.
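To illustrate the kind of analysis such gaze data affords, the sketch below shows how raw gaze samples might be mapped to areas of interest (AOIs), such as the question stem and the answer options, and summed into per-AOI dwell times. This is a simplified, hypothetical example, not the project's actual analysis pipeline; the AOI names, screen coordinates, and sampling rate are assumptions for illustration only.

```python
# Hypothetical AOIs for an on-screen MC-item: name -> (x, y, width, height).
# Coordinates are invented for illustration.
AOIS = {
    "question": (0, 0, 800, 200),
    "option_A": (0, 200, 800, 100),
    "option_B": (0, 300, 800, 100),
}

def aoi_for(x, y):
    """Return the name of the AOI containing point (x, y), or None."""
    for name, (ax, ay, w, h) in AOIS.items():
        if ax <= x < ax + w and ay <= y < ay + h:
            return name
    return None

def dwell_times(samples, dt_ms=4):
    """Sum per-AOI dwell time (in ms) from gaze samples recorded at a
    fixed rate; dt_ms is the sampling interval (4 ms ~ a 250 Hz tracker)."""
    totals = {}
    for x, y in samples:
        name = aoi_for(x, y)
        if name is not None:
            totals[name] = totals.get(name, 0) + dt_ms
    return totals

# Three samples: one on the question, two on option A.
print(dwell_times([(100, 50), (100, 250), (100, 260)]))
# → {'question': 4, 'option_A': 8}
```

Comparing such dwell times between experts and novices (e.g., less time on distractors for experts) is one straightforward way the recorded parameters could feed into the strategy analyses described below.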
In a first study, an expert-novice comparison is used to investigate whether there are in fact strategic differences in solving MC-items depending on the level of knowledge. A second study evaluates how a fixed testing-time influences the expected strategies of experts and novices. Other important parameters, such as knowledge about how to solve MC-tasks in general (testwiseness) and MC-item format, will be analyzed in subsequent research.
Specifically for this research project, an MC-test with 22 questions on a particular science topic was developed, validated, and normed on different samples. The topic was chosen from a specialized field outside general knowledge, so that answering the questions correctly requires a certain expertise. On the basis of student samples, the experts and novices recruited for the studies can be distinguished.
It is expected that experts and novices differ considerably in their approach to solving MC-questions, so that different strategies can be identified as a function of prior knowledge. For example, experts will most likely pay less attention to wrong options (distractors), solve questions faster, and on average give more correct answers than novices.
Dr. Gun-Brit Thoma
Dr. Marlit Annalena Lindner
Prof. Dr. Olaf Köller
Dr. Inger Marie Dalehefte, Universitetet i Agder, Norway