Unpacking why student writing does not match their science inquiry experimentation in Inq-ITS

Haiying Li, Janice Gobert, Rachel Dickler

Research output: Contribution to journal › Conference article › peer-review

1 Scopus citation

Abstract

Science assessments should evaluate the full complement of inquiry practices (NGSS, 2013). Our previous work showed that a large proportion of students’ open responses did not match their scientific investigations (Li et al., 2017a). The present study unpacks the sub-components underlying students’ performance on experimentation and compares them to students’ written open responses. These findings have implications for the assessment of inquiry practices, the design of real-time scaffolding, and teachers’ instruction of science.

Original language: English (US)
Pages (from-to): 1465-1466
Number of pages: 2
Journal: Proceedings of International Conference of the Learning Sciences, ICLS
Volume: 3
Issue number: 2018-June
State: Published - 2018
Event: 13th International Conference of the Learning Sciences, ICLS 2018: Rethinking Learning in the Digital Age: Making the Learning Sciences Count - London, United Kingdom
Duration: Jun 23 2018 - Jun 27 2018

All Science Journal Classification (ASJC) codes

  • Computer Science (miscellaneous)
  • Education

Keywords

  • Educational data mining
  • Natural language processing
  • Science inquiry assessment
