Group Assessment with Competition


In the third semester of their Mechanical Engineering bachelor's program, nearly 500 students take the course "Innovationsprojekt" to develop a robot from concept to functioning prototype. Teams of five to six students follow a structured approach built around bi-weekly sprint cycles. Throughout the 10-week development phase, each team meets weekly with its coach to refine its approach and overcome challenges. Every two weeks, the coach assesses the team's methodological work against a predefined metric and provides feedback.

At the project's culmination, the robots are brought to life in two competitive events. First, there are graded qualification rounds where each robot is assessed based on predefined criteria. Then, the ten best-performing teams from the qualification advance to a public final, where they compete against each other. This final event provides an opportunity to showcase the robots to a broader audience beyond the course participants.
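
To make the selection step concrete, here is a minimal sketch of how the ten finalists could be determined from the graded qualification rounds. The team names, scores, and the `select_finalists` helper are purely illustrative assumptions and are not taken from the course materials.

```python
from dataclasses import dataclass


@dataclass
class QualificationResult:
    team: str
    score: float  # aggregated score from the predefined qualification criteria


def select_finalists(results: list[QualificationResult], spots: int = 10) -> list[QualificationResult]:
    """Return the best-performing teams, ordered by descending qualification score."""
    return sorted(results, key=lambda r: r.score, reverse=True)[:spots]


# Hypothetical example with three of the roughly 90 teams (500 students in teams of 5-6)
results = [
    QualificationResult("Team 03", 91.0),
    QualificationResult("Team 12", 87.5),
    QualificationResult("Team 47", 78.0),
]
print([r.team for r in select_finalists(results, spots=2)])  # ['Team 03', 'Team 12']
```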


Overview of the Course

What is the subject context of the course?

The core of the course is the practical application of development methods in a challenging engineering project, from design through production and testing. Students learn how to systematically approach and structure a complex development project as a team.

What should students learn and be able to do at the end of the course?

After successfully completing the project, students have a thorough theoretical and, above all, practical understanding of systematic product development in an innovative context.

They are able to systematically generate technical solutions to complex problems and practically implement them in the form of prototypes.

In addition, students practice the necessary skills in self-study: design, programming, control, 3D printing, team management, team dynamics, and many more.

Why was the specific assessment format chosen?

We chose the format of pure group performance and group assessment, and this has proven successful.

There are two types of assessment:

  • Bi-weekly assessment of the applied development methodology based on a predefined metric:
    The development methodology is relevant because it is the core competence that students should learn and be able to apply in other projects after the course (a minimal scoring sketch follows this list).
  • Assessment of the performance of the final prototype in the form of a competition:
    How well the finished product (in this case, the final prototype) performs is relevant in its own right, and the competition is an extremely motivating experience for students.
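
As a rough illustration of the first assessment type, the sketch below shows how a weighted rubric could aggregate per-criterion ratings into a bi-weekly methodology score. The criteria, weights, and rating scale in `RUBRIC_WEIGHTS` and `methodology_score` are assumptions made for the example; the course's actual predefined metric is not reproduced here.

```python
# Hypothetical rubric: criteria and weights are illustrative only.
RUBRIC_WEIGHTS = {
    "requirements_analysis": 0.3,
    "concept_evaluation": 0.3,
    "sprint_planning": 0.2,
    "documentation": 0.2,
}


def methodology_score(ratings: dict[str, int], scale_max: int = 4) -> float:
    """Aggregate per-criterion ratings (0..scale_max) into a weighted score in [0, 1]."""
    if set(ratings) != set(RUBRIC_WEIGHTS):
        raise ValueError("every rubric criterion must be rated exactly once")
    return sum(RUBRIC_WEIGHTS[c] * ratings[c] / scale_max for c in RUBRIC_WEIGHTS)


# Example: one team's ratings after a sprint review
print(methodology_score({
    "requirements_analysis": 3,
    "concept_evaluation": 4,
    "sprint_planning": 2,
    "documentation": 3,
}))  # 0.775
```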

How are students prepared for the assessment?

Learning materials, tools, working materials and work areas are available to students.

Accompanying the practical experience of developing a product, students are assigned tasks that are pertinent to their current phase of development.

Instead of a classical classroom lecture, weekly podcasts are published. These are used to make the methodology more tangible and to address current problems that students struggle with.

In preparation for the competition, a test track is provided, which students can use to test their robots regularly under competition conditions.


Staff Workload (472 Candidates)

Task | Time Investment | Staff
Expert Input | 5 h / week | Professor
Project Coordination | 100% | Doctoral Student (or Research Associate)
Project Coordination and Supervision | 20% | Doctoral Student (previous year's Coordinator)
Training Student Coaches | 1/2 working day + 1 h / week each | 30 Bachelor and Master Students (Coaches)
Technical Support | 10-20% | Technical & administrative Staff
Podcast Production | 5-8 h / week | 1 Bachelor or Master Student
Supervision & Support at Work Areas | 3 h / week each | 12 Bachelor and Master Students
Coaching Sessions | 5 h / week each | 30 Bachelor and Master Students
Event Organization (Competitions) | 3 h / week | 1 Bachelor or Master Student
Qualification for the Final | 1 working day | 5 Doctoral Students, 20 Bachelor and Master Students
Final | 1 working day | 22 People

Extra Information

  • Expert Input
    Expert input consists of regular meetings with the coordinators, creating podcasts and consultations with students (once per week).
  • Training Student Coaches
    There is a mandatory half-day course for all coaches. The course is conducted by PBLabs and the project coordinators.
    Additional weekly technical training (1 hour) is provided by the project coordinator.
  • Coaching Sessions
    1 student coach supervises 3 teams.

Shared Experience

How many times has the assessment been conducted in this format?

The course has been offered yearly at ETH in this format since 2013. In 2023, the course was moved from the 2nd to the 3rd semester. At the same time, the course was overhauled, but the basic principles of ‘assessment of the development methodology’ and ‘assessment of the performance of the final prototype’ were retained.

What contributed to the success?

  • Course
    • Perhaps the most important factor is a good, challenging project that allows students to experience a sufficient amount of success (at an appropriate level of difficulty) and provides the best possible learning experience.
    • At the course level, success means that everything runs smoothly from an organizational point of view and students can fully concentrate on their task.
    • Each year the assessment is revised with the intention of improving it for examiners and students.
  • Assessment of the development methodology
    • This assessment requires a precise evaluation metric.
    • With dozens of teams, many examiners are needed, and they all have to evaluate consistently and without bias.
    • Consistent training of all examiners is also required to ensure standardized feedback. In our case, this is very important because teams are alternately assessed by different coaches. This procedure (alternating the evaluator) has proven successful.
  • Assessment of the performance of the final prototype in the form of a competition
    • For such a large course, the organization and planning of the competition day must be very detailed. It is still an exam situation, which can cause stress for the students.
    • The examiners and examinees must be precisely informed about the procedure. The rules must be unambiguous and consistently enforced.
    • The location is important and influences the course of the competition and the mood on the day.
  • Group assessment
    • It is our task to design fascinating projects so that students would be sad if they could not participate. Thus, with 500 students, we have a negligible number of free-riders.
    • Each team starts with goal setting and expectation management; as a result, most group conflicts that might arise can be minimized.

What were the challenges and how were they overcome?

We are very happy with the assessment in the form of a competition. The assessment of the development methodology is the much more difficult part:

  • You can only observe and thus evaluate a fraction of the students’ work.
  • In a dynamic project, it is difficult to define meaningful evaluation criteria for every point in time, and it is difficult to design evaluation metrics that adapt dynamically while remaining specific enough to allow a uniform, examiner-independent assessment.
  • The assessment is under continual development.

From experience, it is essential to success that:

  • group registration via Moodle is completed by a set deadline, after which students are assigned to teams.
  • roles within the team are assigned and announced via Moodle before access to the laser cutter or the factory is granted.

Are there any further developments planned?

We will continue to revise the metrics for the assessment of the development methodology and align them even better with the learning objectives. It would be helpful to get access to the students’ daily work in order to be able to assess this. This would then enable direct feedback, which is extremely important for the learning process.

What tips would you give lecturers who are planning a similar assessment?

Group examinations are sensible and feasible. If the project is exciting enough, the students will want to participate and engage, and there will be fewer problems with free-riding.

Always develop the assessment with a focus on the learning objectives, even if that can be difficult. Don’t just assess something for the sake of it.

Don’t be put off by the effort involved. The students’ reactions and feedback are worth it. And don’t expect it to be perfect the first time!

I consider a course successful when students have achieved their learning objectives and enjoyed the experience.
Mark Zander, PhD Candidate and Supervisor of the Innovationsprojekt

ETH Competence Framework

Subject-specific Competencies

  • Concepts and Theories
  • Techniques and Technologies

Method-specific Competencies

  • Analytical Competencies
  • Decision-making
  • Problem-solving
  • Project Management

Social Competencies

  • Communication
  • Cooperation and Teamwork
  • Leadership and Responsibility

Personal Competencies

  • Creative Thinking
  • Critical Thinking

Overview of the ETH Competence Framework
