Purpose: The authors "present studies focused on how students learn computer programming, based on data drawn from 154,000 code snapshots of computer programs under development by approximately 370 students enrolled in an introductory undergraduate programming course" (p. 561).
Findings: "The results show that students’ change in programming patterns is only weakly predictive of course performance. [The authors] subsequently hone in on 1 single assignment, trying to map students’ learning process and trajectories and automatically identify productive and unproductive (sink) states within these trajectories. Results show that [the authors'] process-based metric has better predictive power for final exams than the midterm grades" (pp. 561-562).
Recommendations: The authors "argue that developing new automated data collection and analysis techniques, rather than automating and scaling up the outdated, behaviorist-inspired teaching and assessment approaches that have dominated educational institutions, could offer new, scalable opportunities to advance student-centered, project-based learning" (p. 562).
Sample Size: 346
Participant Type: Students whose code snapshots were analyzed. "All students were undergraduates or graduate students enrolled in a programming methodology course at a research university" (p. 571).
Notes: 346 students in total: 74 in the spring 2012 data set (14,000 code snapshots) and 272 in the fall 2012 data set (140,000 code snapshots).