pragmatic constraints of a large-scale assessment. In any large-scale assessment, the range of feasible item formats is limited. However, with computer-based assessment, the types of response format can include interactions with text, such as highlighting and drag-and-drop, as well as multiple choice and short constructed response items (to which students write their own answers).
100. Response formats can be differentially sensitive to individual differences. For example, cloze and, to some extent, multiple choice formats typically depend more on decoding skills than open constructed response items do, because readers have to decode the distractors or the items themselves (Cain & Oakhill, 2006). Several studies based on PISA data suggest that response format has a significant effect on the performance of different groups: for example, students at different levels of proficiency (Routitsky & Turner, 2003); students in different countries (Grisay & Monseur, 2007); students with different levels of intrinsic reading motivation (Schwabe, McElvany & Trendtel, 2015); and boys and girls (Lafontaine & Monseur, 2006a, 2006b; Schwabe et al., 2015). Given this variation, it is important when measuring trends over time to maintain a similar proportion of multiple choice and constructed response tasks from one administration to the next. A further significant consideration in the context of reading literacy is that open constructed response items are particularly important for the reflection and evaluation aspect, where the intent is often to assess the quality of thinking rather than the conclusion itself. Nevertheless, because the focus of the assessment is on reading and not on writing, constructed response items should not be designed to place great emphasis on assessing writing skills, such as spelling and grammar. Finally, students in different countries are more or less familiar with various response formats. Including items in a variety of formats is likely to provide some balance between more and less familiar formats for all students, regardless of nationality.
101. In summary, to ensure proper coverage of the ability ranges in different countries, to ensure fairness given the observed inter-country and gender differences, and to ensure a valid assessment of the reflect and evaluate aspect, both multiple choice and open constructed response items continue to be used in PISA reading literacy assessments, regardless of the change in delivery mode. Any major change in the distribution of item types in print reading might also impact the measurement of trends.