Understanding how embedded peer comments affect student quiz scores, academic writing and lecture note-taking accuracy
Design and methods
This study used direct empirical investigation of graduate students in an academic writing class who worked collaboratively online. Data on peer feedback given during collaborative note-taking activities were gathered to test its effects on performance and understanding.
The study aimed to better understand how peer comments, made in the context of online collaborative note-taking, affected student performance and understanding.
The use of peer comments in online note-taking was found to positively affect student quiz scores and academic writing skills. However, no statistically significant correlation was found between comments and the completeness of the notes taken, suggesting limits to commenting's capacity to promote deeper understanding.
The level of detail in the comments and the average number of comments made weekly by each group were both low. In addition, no pre- and post-tests were conducted to validate the results.
Designers and teachers using online collaborative activities could benefit from understanding how peer comments can enhance student learning, bearing in mind that students need explicit guidance on how to comment and on what level of knowledge their comments should target.
Online collaboration, peer editing and commenting are widely used by educators and the public. Understanding how these elements operate might improve the quality of knowledge artefacts such as academic writing and research notes.
Existing literature focuses mainly on peer feedback on writing; this research seeks to clarify the impact of online peer comments on collaborative note-taking in particular.