CMU-HCII-20-106
Human-Computer Interaction Institute
School of Computer Science, Carnegie Mellon University




Harnessing Student Solutions to Support Learning at Scale

Xu Wang

August 2020

Ph.D. Thesis

CMU-HCII-20-106.pdf


Keywords: Deliberate practice, immediate feedback, human-machine collaboration, artificial intelligence in education, learning at scale, higher education, open-ended assignment, multiple-choice question, automatic question generation, learnersourcing, crowdsourcing


A key challenge in meeting the demand for higher education and professional development is scaling these educational opportunities while maintaining their quality. My dissertation tackles this challenge by harnessing examples from existing resources to enable the creation of scalable, high-quality educational experiences. Deliberate practice that targets specific skills, with appropriate scaffolding and timely feedback, helps novices become experts. However, deliberate practice with timely feedback is often missing from college instruction. On the one hand, instructors believe they should assign open-ended, flexible work, but the very multifacetedness that makes such work authentic impedes learning, because students rarely receive timely, attribute-specific feedback. On the other hand, instructors find designing materials that offer focused practice and immediate feedback time-consuming and challenging.

This dissertation contributes insights into developing effective learning-at-scale systems by leveraging the complementary strengths of peers, experts, and machine intelligence, differentiating it from existing systems that rely solely on machines or on crowds of peers. It introduces UpGrade, a technique that uses student solution examples to semi-automatically generate multiple-choice questions for deliberate practice of higher-order thinking in varying contexts. Through experiments in authentic college classrooms, I show that UpGrade helps students gain conceptual understanding more efficiently and improves their performance on authentic tasks. Through an iterative design process with instructors, I demonstrate the generalizability of this approach and offer suggestions for improving the quality and efficiency of college instruction.
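The abstract does not detail UpGrade's pipeline. As a rough illustration only, the hypothetical Python sketch below shows one way student solution examples could be assembled into a multiple-choice item, pairing a verified-correct solution (the key) with flawed peer solutions as distractors so that options reflect authentic student work. All names here (Solution, build_mcq, the example data) are illustrative assumptions, not part of UpGrade.

```python
import random
from dataclasses import dataclass

@dataclass
class Solution:
    text: str          # a student's written solution to an open-ended task
    is_correct: bool   # instructor-verified label

def build_mcq(prompt, solutions, n_options=4, seed=0):
    """Assemble one multiple-choice item from student solution examples.

    The key is a verified-correct solution; distractors are drawn from
    peer solutions that contain errors.
    """
    rng = random.Random(seed)
    correct = [s for s in solutions if s.is_correct]
    flawed = [s for s in solutions if not s.is_correct]
    key = rng.choice(correct)
    distractors = rng.sample(flawed, k=min(n_options - 1, len(flawed)))
    options = [key] + distractors
    rng.shuffle(options)
    return {
        "stem": prompt,
        "options": [s.text for s in options],
        "answer_index": options.index(key),
    }

# Hypothetical example: practicing evaluation of peer-designed survey questions
pool = [
    Solution("Ask: 'How many hours per week do you study?'", True),
    Solution("Ask: 'Don't you agree studying more is better?'", False),   # leading
    Solution("Ask: 'Do you study a lot and sleep enough?'", False),       # double-barreled
    Solution("Ask: 'Rate your study habits from amazing to perfect.'", False),
]
item = build_mcq("Which survey question below is best designed?", pool)
print(item["stem"])
for i, opt in enumerate(item["options"]):
    print(f"  {chr(65 + i)}. {opt}")
```

In such a sketch, instructor review of the generated items would supply the "semi-automatic" step the abstract describes; the code alone only assembles candidate items from labeled examples.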

This dissertation suggests a further way to distinguish knowledge components: by the generation and evaluation effort they require during problem solving. The practical implication of this more nuanced understanding of knowledge components is to help instructors make more nuanced and accurate instructional decisions, e.g., using "evaluation-type" exercises for evaluation-heavy skills. This dissertation also provides further evidence that instructors have so-called "expert blind spots," revealed through cases where their beliefs do not match student performance. More generally, this work suggests that the reasoning behind educational decisions can be probed through well-designed, low-effort experimental comparisons, leading to more nuanced and accurate reasoning and decision making, and ultimately to better design.

164 pages

Thesis Committee:
Kenneth R. Koedinger (Co-Chair)
Carolyn Penstein Rosé (Co-Chair) (LTI/HCII)
Jeffrey P. Bigham (HCII/LTI)
Chinmay Kulkarni
Scott Klemmer (University of California San Diego)

Jodi Forlizzi, Head, Human-Computer Interaction Institute
Martial Hebert, Dean, School of Computer Science


