The CeLS environment automatically gathers and analyzes the information submitted by the different users and displays it in various forms (customizable by instructors). The picture attached to this feature shows a statistical analysis, a histogram, and a collection of students' justifications for their grades (presented anonymously).
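The kind of synthesis described above can be sketched as follows. This is a hypothetical illustration, not the CeLS implementation: it aggregates one group's anonymous peer grades into a mean and a histogram, the two statistical views mentioned.

```python
from collections import Counter

def summarize_peer_grades(grades):
    """Aggregate peer-evaluation grades the way a synthesis view might:
    a mean grade plus a histogram of how often each grade was given.
    Grades are kept anonymous because no evaluator identity is stored."""
    mean = sum(grades) / len(grades)
    histogram = Counter(grades)  # grade -> number of evaluators who gave it
    return mean, histogram

# Example: one group's presentation graded by ten peers on a 1-5 scale
# (invented figures, for illustration only).
grades = [4, 5, 3, 4, 4, 5, 2, 4, 3, 5]
mean, histogram = summarize_peer_grades(grades)
```

The instructor-facing view would render the histogram as a bar chart and list the (anonymized) textual justifications alongside it.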
The Rationale Behind the Feature (Specific Design Principle):
Make the synthesis of the peer-evaluation results visible to learners
Context of Use:
The study took place in an undergraduate educational-philosophy course at the Technion, taught by the authors of this feature. The main goal of the course was to help students develop their own perceptions of fundamental issues in education and schooling (e.g., What is the goal of schooling? What content should be taught in school? What should be the role of the teacher?). A central theme of the course was the “ideal school” project, in which groups of 3-4 students constructed a conceptual model of a school that met their evolving educational perceptions.
Toward the end of the semester, each group gave a short presentation of one day in their ideal school. Most students used PowerPoint for this purpose, but less conventional means, such as drama performances, were also used. The presentations took place in three class meetings, with three or four presentations in each session. One challenge we faced was how to ensure that students made the most of these meetings: prior teaching experience in similar contexts revealed that students tend to focus on fulfilling the course requirements (in this case, their own presentations) and to be less interested in their peers’ projects.
This challenge was addressed by designing a peer-evaluation activity in which students were involved in assessing their peers’ “ideal school” presentations. The rationale for engaging students in this activity was: a) to ensure their involvement in their peers’ projects, b) to create a framework for them to learn from each other’s projects, c) to help them develop evaluation skills that they would need as future educators, and d) to reinforce criteria for designing their own projects. Analyzing this peer-evaluation activity required the instructor to integrate hundreds of individual assessments (35 students, times 10 groups, times about four criteria).
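The scale of the integration task follows directly from the figures above. A quick calculation (using the stated figures; the true count is somewhat lower, since students presumably did not grade their own group, and "about four" criteria is approximate):

```python
# Figures from the activity description: 35 students, 10 groups,
# about four criteria per presentation.
students, groups, criteria = 35, 10, 4

# Upper bound on the number of individual assessments the instructor
# must integrate if every student grades every group on every criterion.
total_assessments = students * groups * criteria
```

Even as a rough upper bound, this is on the order of a thousand grading acts, which is why manual integration was impractical.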
To facilitate the analysis, we decided to use a computerized system that enabled us to gather, present, and analyze these assessments productively. The activity was therefore conducted online with the CeLS environment (Collaborative e-Learning Structures), a novel system that allows instructors to create and conduct a variety of online structured collaborative activities (http://www.mycels.net).
This feature was crucial in enacting the peer-evaluation activity, since the activity involved hundreds of assessments that needed to be presented in some integrated form; it was not at all practical to do this manually. Gathering the information also enabled us to discuss the results with students after the first round of evaluations, and thus to improve the activity in the second and third runs.
Kali, Y., & Ronen, M. (2005). Design principles for online peer-evaluation: Fostering objectivity. In Koschmann, T., Suthers, D. D., & Chan, T. W. (Eds.), Computer support for collaborative learning: The next 10 years! Proceedings of CSCL 2005 (Taipei, Taiwan). Lawrence Erlbaum Associates.