
COGEA

Cognition in Educational Assessment

Background

The increasing impact of standardized measures of students' competencies, especially of politically influential large-scale assessments such as PISA, PIRLS, and TIMSS, demonstrates the importance of high-quality assessment tools and test items for educational research.

To optimize such tests and reduce error variance in the measurement of domain-specific competencies, more systematic knowledge about the effects of item characteristics is needed. Ideally, assessment and item content address only the focal construct. In practice, however, many factors unrelated to the focal construct may affect item processing and bias the assessment outcome.

For example, high reading demands in tests of science competence may interfere with the intended measurement, because reading ability most likely influences successful test-taking. Integrating a picture into an item may particularly help students with poorer reading skills, as it reduces the reading demands. The representation format (text plus an additional picture) could thus enhance comprehension of test items by representing the item content more clearly and thereby minimizing construct-irrelevant variance. However, it remains unclear which underlying mechanisms drive item-processing outcomes. In particular, systematic research on item characteristics and their interaction with student characteristics is lacking. This kind of evidence is needed to improve item design and thereby reduce threats to test validity or compensate for potential bias.

Methodologies

Drawing on well-established theories and models from cognitive psychology as our theoretical background, we focus on the following three methodological strands:

  1. Using data from large-scale and longitudinal assessments such as TIMSS, PISA, and NEPS, we conduct detailed analyses based on Item Response Theory (IRT) and Structural Equation Modelling (SEM) to investigate which item characteristics predict item difficulty (and for whom) and to identify measurement bias (see the illustrative sketch after this list).
  2. We experimentally manipulate substantive item characteristics (e.g., item format, presentation of information, use of pictures) in paper-and-pencil tests and investigate their influence on psychometric test outcomes, in order to isolate the effects of individual item characteristics for certain groups of students. Furthermore, we measure potentially relevant student characteristics (e.g., working memory, prior knowledge, reading ability) and relate them to the psychometric outcomes in interaction with the item characteristics.
  3. In addition to the experimental paper-and-pencil studies, we use process-oriented approaches, namely eye tracking and think-aloud protocols, to investigate the mechanisms of item processing at a fine-grained level. These process data are also analyzed with regard to student characteristics.
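
As a concrete illustration of the first strand, the following minimal Python sketch simulates dichotomous item responses and then regresses rough item-difficulty estimates on two hypothetical item characteristics (text length and presence of a picture). The simulated data, the variable names, and the simplified two-step estimation are assumptions made purely for illustration; the project's actual analyses rely on full IRT and SEM modelling of real assessment data.

    # Illustrative sketch only (not the project's analysis pipeline):
    # relate item characteristics to item difficulty using simulated data.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n_persons, n_items = 300, 20

    # Hypothetical item features: standardised text length and presence of a picture.
    word_count = rng.normal(0.0, 1.0, n_items)
    has_picture = rng.integers(0, 2, n_items)

    # Generating model for the simulation: longer text makes items harder,
    # an added picture makes them easier.
    true_difficulty = 0.6 * word_count - 0.5 * has_picture + rng.normal(0.0, 0.3, n_items)
    ability = rng.normal(0.0, 1.0, n_persons)

    # Dichotomous responses under a Rasch-type model.
    prob_correct = 1.0 / (1.0 + np.exp(-(ability[:, None] - true_difficulty[None, :])))
    responses = rng.binomial(1, prob_correct)

    # Step 1: quick item-difficulty estimates from proportions correct
    # (a crude stand-in for a proper IRT calibration).
    p = responses.mean(axis=0)
    est_difficulty = -np.log(p / (1.0 - p))

    # Step 2: regress estimated difficulties on the item characteristics to see
    # which characteristics predict difficulty, and for which kinds of items.
    items = pd.DataFrame({"difficulty": est_difficulty,
                          "word_count": word_count,
                          "has_picture": has_picture})
    print(smf.ols("difficulty ~ word_count + has_picture", data=items).fit().summary())

In a full analysis, the same question would typically be addressed with explanatory IRT models (e.g., the linear logistic test model) estimated jointly, and with multi-group or differential item functioning analyses to detect measurement bias for specific groups of students.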


Aims and Scope

The COGEA project aims to systematically investigate test and student characteristics in order to provide differentiated evidence on the effects of item characteristics for specific groups of students, and thereby to allow test construction to be optimized in practice. We are also highly interested in students' solution processes for different item types and in developing cognitively oriented models of test processing.

Staff

Dr. Steffani Saß
Dr. Marlit Annalena Lindner
Benjamin Strobel
Prof. Dr. Olaf Köller