  Feature Name: Neutral space for stating non-objective viewpoints in peer-evaluation
 
Author: Yael Kali

Category: Evaluation Tools, Collaboration tools

Subject: Social sciences

Kind: Element/Applet

Audience:
 Elementary School
 Middle School
 High School
 Higher Education
 Teachers & Principals
 Other


Projects:

Software URL: CeLS (Collaborative E-Learning Structures)

Created by: Miky Ronen, Holon Academic Institute of Technology, Israel

Reference URL:

This Feature is connected to (3) Principles
  • Scaffold the development of classroom norms
  • Encourage learners to learn from others
  • Involve students in evaluation processes
     
     
    Description:
    The initial design of the peer-evaluation activity included criteria derived from students’ suggestions in a classroom discussion held prior to the presentations: a) Is the uniqueness of the school apparent? b) Is the rationale clear? c) Are the activities that take place in the school demonstrated clearly? The activity included an online form in which students were required to grade each of the group presentations from 1 (poor) to 7 (excellent). The form also included text fields in which students justified their grading according to the three criteria (see figure).

    Following implementation of this version, it became evident that in this specific context, where the evaluated content concerns beliefs and morals and therefore invites biased scoring (see field-based evidence below), an additional field was needed. We added a text field called “My personal opinion about this school.” This field was not considered a criterion that should affect the scoring; rather, it was intended to give presenters general feedback on the degree of acceptance of their ideas among the other students.
    The Rationale Behind the Feature (Specific Design Principle):
    Enable students to state their personal, non-objective viewpoints about their peers’ work
    Context of Use:
    The study took place in an undergraduate educational-philosophy course at the Technion, taught by the author of this paper. The main goal of the course was to help students develop their own perceptions about fundamental issues in education and schooling (e.g., What is the goal of schooling? What contents should be taught in school? What should the role of the teacher be?). A main theme in the course was the “ideal school” project, in which groups of 3-4 students constructed a conceptual model of a school that met their evolving educational perceptions. Toward the end of the semester, each group gave a short presentation of one day in their ideal school. Most students used PowerPoint for this purpose, but less conventional means, such as drama performances, were also used. The presentations took place in three class meetings, with three or four presentations in each session.

    One challenge we faced was how to ensure that students make the most of these meetings. Prior teaching experience in similar contexts revealed that students tend to focus on accomplishing the course’s requirements (their own presentations, in this case) and are less interested in their peers’ projects. This challenge was addressed by designing a peer-evaluation activity in which students were involved in assessing their peers’ “ideal school” presentations. The rationale for engaging students in this activity was: a) to ensure their involvement in their peers’ projects, b) to create a framework for them to learn from each other’s projects, c) to help them develop evaluation skills that they would need as future educators, and d) to reinforce criteria for designing their own projects. The analysis of this peer-evaluation activity by the instructor involved integrating hundreds of assessments (35 students times 10 groups times about four criteria).
To facilitate the analysis, we decided to use a computerized system that enabled us to gather, present, and analyze these assessments productively. The activity was therefore performed online with the CeLS environment (Collaborative E-Learning Structures), a novel system that allows the instructor to create and conduct a variety of online structured collaborative activities (http://www.mycels.net).
    Field-based Evidence:
    Outcomes indicated that the refined design, which enabled students to express their personal viewpoints, helped students better differentiate between objective criteria and personal stands. This was evident from a higher correlation between the set of scores the instructor gave each group and those provided by students (r=0.62, p=0.03), compared to the first iteration. Furthermore, the learning gains from the peer-evaluation activity, as indicated by the attitude questionnaire, appeared to be higher in the second iteration. However, because the evaluated contents involved cultural and political values, tensions between students arose in class discussions and infiltrated the peer-evaluation activity in the form of biased scoring and inappropriate, even offensive, justifications (Kali & Ronen, 2005).
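    The instructor-student score comparison reported above can be illustrated with a short, self-contained sketch. This is not the study’s analysis code: the per-group scores below are invented for illustration, and only the standard Pearson correlation formula is assumed.

```python
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists of group scores."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical scores for 10 group presentations on the 1-7 scale
# (NOT the study's data): instructor grades vs. mean student grades.
instructor = [6, 5, 7, 4, 6, 5, 3, 6, 4, 5]
student_mean = [5.8, 5.1, 6.5, 4.4, 5.9, 4.8, 3.6, 5.7, 4.2, 5.3]

print(round(pearson_r(instructor, student_mean), 2))
```

    A high positive r would indicate, as in the second iteration of the study, that students’ collective scoring tracks the instructor’s objective-criteria scoring.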
    References:
    Kali, Y., & Ronen, M. (2005). Design principles for online peer-evaluation: Fostering objectivity. Proceedings of the 10th CSCL Conference, Taipei, Taiwan.

    Image: