Abstract
Science assessments should evaluate the full complement of inquiry practices (NGSS, 2013). Our previous work showed that a large proportion of students' open responses did not match their scientific investigations (Li et al., 2017a). The present study unpacks the sub-components underlying students' experimentation performance and compares them to their written open responses. These findings have implications for the assessment of inquiry practices, the design of real-time scaffolding, and teachers' instruction of science.
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1465-1466 |
| Number of pages | 2 |
| Journal | Proceedings of International Conference of the Learning Sciences, ICLS |
| Volume | 3 |
| Issue number | 2018-June |
| State | Published - 2018 |
| Event | 13th International Conference of the Learning Sciences, ICLS 2018: Rethinking Learning in the Digital Age: Making the Learning Sciences Count - London, United Kingdom. Duration: Jun 23 2018 → Jun 27 2018 |
All Science Journal Classification (ASJC) codes
- Computer Science (miscellaneous)
- Education
Keywords
- Educational data mining
- Natural language processing
- Science inquiry assessment