We spoke with the project team that converted the lecture “Dynamische Erde I” to a flipped learning context. Dr. Oliver Bachmann and Léon Frey shared their experiences.
The aim of this project was to produce a series of videos to replace a portion of the lecture “Dynamische Erde I” at the Department of Earth Sciences. Part of the lecture will still be held on campus; the students watch the videos individually as preparation for the lectures held in class. The videos cover a considerable part of the content in an easy-to-understand way. This is a “flipped learning” teaching approach, in which the necessary knowledge acquisition is planned as an individual activity for students to complete in their own time (in this case, watching videos). The face-to-face time is then used for deeper engagement and discussion.
What triggered this experiment?
During the corona pandemic, when lectures were held online, it became evident that high-quality online material, in particular podcasts, would greatly enrich a lecture. Online lectures via Zoom do work but they should be augmented by other teaching methods. For this reason, this project was launched.
Which specific actions were taken?
We first thought about the content for the videos. Which elements should be part of the videos, and what should be kept in class at ETH? We made this decision by identifying the pure information we wanted to convey, in contrast to sections of the course that required interaction, activity and discussion. After that, we wrote scripts for the videos and created the necessary illustrations. We filmed both at ETH and at different locations in the field, using both a camera and a drone. The last step was editing the videos and making them available to the students.
What were the results or outcomes of the project?
The result of this project is a flipped learning scenario which includes a series of videos on mineralogy, magmatic processes, metamorphic processes and the rock cycle. The videos are available here (videos are in German): https://www.youtube.com/channel/UC-Zw-otyiP39U2zvzgCq0cQ/
Can you describe the impact on students?
Students will be able to watch the videos starting in the fall semester 2022, which at the time of writing lies in the near future, so we don’t know the impact yet. However, we did produce some test videos last autumn, and in a survey students rated these videos very positively. We look forward to seeing the student feedback after the autumn semester 2022!
What lessons did you learn? What would you do differently next time?
Video production takes a lot of time. More than you might think at first. And there is always the temptation to do more and to do it better – to do another take, trying to do better than in the last take. Therefore, it is important to know when it’s enough – or when time does not allow for more attempts. Next time we would proceed the same way – in the end everything worked out well and as planned.
What first steps do you advise for others who are interested in doing the same?
Don’t underestimate the time video production needs. Our team came into this project with existing experience and skill in creating such videos and still invested a fair amount of time in learning how to do it well. And don’t underestimate the skill required to stand in front of the camera and speak confidently. If you have no experience in either of those, plan enough time to practise and get used to it.
This project was funded by Innovedum, the Rector’s fund for advancing innovative education at ETH Zürich. You can keep up with development of this project in the public Innovedum database. If you are interested in applying for a project yourself, you can find information and the login to the application process here: www.innovedum.ethz.ch.
In the framework of the computational competencies initiative at ETH, a JupyterHub has been established at LET. This brand-new JupyterHub serves JupyterNotebooks to everyone involved in teaching and learning at ETH.
JupyterNotebooks are interactive documents that combine code, text and animations. Different programming languages, such as Python, R, Julia, Octave or Open Modelica, are supported. Sign in through a plug-in from your course in Moodle and enjoy using JupyterNotebooks without additional authentication or the need to install anything on your computer. This holds for everyone involved in a course: no matter whether your role is student or teacher, you can reach your personal JupyterLab environment on the JupyterHub with one click, and it runs in your browser.
This is what the plug-in in Moodle looks like, which takes you straight to your JupyterHub hosted by LET
In your course you can use JupyterNotebooks as
interactive textbooks which support lectures or exercises
assignment sheets, where students answer questions and write code in a pre-defined (coding) environment
or just as an environment to combine text, code, and visualizations, either for students to work on assignments or for teachers for demo purposes
learning journal for documenting learning progress
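As a minimal illustration of what a single notebook cell might look like in such a scenario, the following hypothetical Python cell mixes explanatory comments, a short computation and printed output; the data and variable names are invented for this sketch and do not come from any course material:

```python
# One notebook cell: narrative comments, code and output live together.
# Hypothetical example data: measurements (in seconds) collected
# during an in-class exercise.
import statistics

measurements = [2.1, 2.4, 1.9, 2.6, 2.2, 2.0]

mean = statistics.mean(measurements)
stdev = statistics.stdev(measurements)

print(f"mean = {mean:.2f} s, stdev = {stdev:.2f} s")
```

In a real notebook, students could extend such a cell step by step, re-running it as they go, which is what makes the format attractive for exercises and learning journals.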
Choose JupyterNotebook as type of assignment in Moodle
First, start your JupyterHub through the plug-in in Moodle. Either create a new JupyterNotebook right in your JupyterHub or upload your work. Also place any additional files you might want to distribute together with your Notebook, such as data files, in the same folder on your JupyterHub. Once your assignment in the form of a JupyterNotebook and optional accompanying files are ready on your space on the Hub, you can include it directly in the assignment activity in Moodle: when you choose Jupyter notebooks as the submission type, it shows you the folder tree in your Jupyter workspace on the Hub. Select a folder to distribute all files inside this folder to your students in the form of an assignment.
The students will be able to select a folder on their Jupyter workspace once they download the assignment. And when the assignment is finished and ready to be submitted, again the students will be able to select a folder from their JupyterHub workspace to submit, which might of course contain more files than just the JupyterNotebook itself.
Additionally, students can not only use the JupyterHub through distributed assignments, but they also get the same plug-in in Moodle to reach their space on the JupyterHub to do their own coursework.
First users
During the fall term 2021, the first pilot users used the JupyterHub for in-class exercises, for the documentation and evaluation of lab experiments, for entire homework assignments, and as a tool to complete part of an assignment. There are of course many more use cases, and we can even offer the use of JupyterNotebooks on the Hub in your exams.
Interested? Just contact us at jupyterhub@let.ethz.ch for more information and to activate the JupyterHub for your course in Moodle. As of now, the Hub won’t be available by default for your course.
We look forward to welcoming new users across all departments!
In a project-based course, students learned to apply materials knowledge and skills to the construction of a boat that could navigate an unknown terrain using artificial intelligence. We talked with the two lecturers Rafael Libanori and Henning Galinski, and the department’s Educational Developer Lorenzo De Pietro to find out more about this innovative course in the Department of Material Sciences.
Lukas Wooley and Sebastian Gassenmeier get their boat ready.
What triggered this experiment?
Originally, we were inspired by AP50, a project and team-based introductory physics course taught at Harvard. We wanted to do more with problem-based learning at ETH Zurich and achieve a different kind of learning environment. Students tend to expect that lectures just “give the knowledge”, but there is so much more to teaching. We realised it’s important to teach students how to learn more efficiently and take more responsibility for their own learning. In this course, we give students scientific questions to answer themselves. We wanted them to start taking risks and to have the freedom to fail, which is what science is all about. It’s not just theoretical input. Interpersonal and technical skills are just as important as academic skills.
What exactly did you do? We applied for Innovedum funding, and once we were successful recipients we created a course that gives the students a project connected to materials science as well as other areas such as control and artificial intelligence. We receive support from Antonio Loquercio for the control and computer vision part. He is currently a postdoc at the University of California, Berkeley. Without him, it would have been very difficult to achieve the computational goals of the project. Students attended 4 weeks of theoretical classes and then started working in teams. The goal is to construct a model boat which can intelligently navigate a course, using the Materials Design Lab at D-MATL. We also employed PhD students as coaches to support the students.
What were the results? We had 16 students who completed the course in the spring of 2021. The challenges were big and so we were thrilled by the final outcomes. The students took it seriously and at the end of the course there were four final boats. The students displayed great creativity, such as building small experimental set-ups along the way. They were able to solve problems on their own, in groups and learn from each other.
What is the student perspective? Students were frustrated initially because we took a passive approach to communicating knowledge, but they saw the benefit of this approach at the end. We believe that learning should strain their abilities and that it is iterative. But it is wonderful that it ended on a celebratory note with the functional boats that successfully navigated the terrain.
What lessons did you learn? We realised that in the future we need to spend more time explaining our approach to teaching and clarifying expectations right from the start. We also plan to pay close attention to the gender-balance among our students as we want to maintain a good mix as the course grows.
What are your plans for the future regarding this project? Due to the current curriculum revision projects in our department, there will likely be an increase in hands-on courses like this one. So, this course represents a new way of teaching, like a prototype for the new curriculum. The results will be looked at closely and are quite important for future decision-making in the department. Teaching this way is also a development opportunity for the lecturers.
What first steps do you advise for others who are interested in doing the same? We think it is important that teaching is viewed as a design science, in other words that it benefits from careful planning and time. We recommend visiting other courses that already use this kind of approach and speaking with the course leaders to gain inspiration and practical ideas for implementing project and problem-based learning in your own course. We would be happy to share our experiences with others.
The lecture series «Pharmazeutische Fallbeispiele» [Pharmaceutical Case Studies] is a compilation of seven 2-hour sessions for around 75 students of the BSc Pharmaceutical Sciences. Surrounded by lectures and lab work on basic science and pharmacotherapy in the third year of studies, our autumn series aims to showcase the complexity of and our fascination for later pharmacy practice issues, giving the students a new perspective on all the other courses’ material as well.
One of our primary learning objectives states that students should be able to analyse simple case studies from pharmacy practice and present, explain, and discuss them in plenary, based on their current pharmaceutical knowledge. To help students achieve this objective, we had already included group work and presentations in which they discuss their thoughts on a given case study, drawing on our literature resources and their own (e.g., which drug class is most appropriate for which kind of nausea).
In 2019, we realized that student participation dropped towards the end of the semester as the big exams of the other courses approached. The sessions were mainly attended by the presenting student groups, whilst their peers focused on learning for ECTS.
2020 challenged us to go digital. This simultaneously provided the opportunity to have Moodle support us in our different teaching elements. The Moodle group selection allowed our students to choose their own peers. Folders, surrounded by explanatory text, helped us embed the asynchronous learning material (i.e., preparatory reading).
Most importantly, however, we created a simple quiz with four questions for each of the seven 2-hour sessions, focusing on the day’s learning objectives. We specifically aimed to include questions on the preparatory reading, our frontal inputs, the presentations by their peers, and one additional pharmacy practice issue. Moodle badges allowed for a simple gamification of the quiz. We shortened our lessons and offered time during the two hours to complete the quizzes, so as not to increase the overall student workload.
Student feedback was overwhelmingly positive: They appreciated our efforts concerning the Moodle course, liked the variation with peer presentations, stated having fun completing our quizzes, and were happy about the interactive segments. There were still fewer students present live towards the end of the semester, but the completed quizzes suggest a shift towards asynchronous learning by watching the recordings when taking a break from learning for ECTS.
We will most certainly keep our Moodle course even when going back to physically present teaching. The students seemed engaged in the course material, asking us interested follow-up questions concerning the preparatory reading and even our quizzes. The administrative work for setting up the course was hefty, but well worth it. One piece of advice to my former self: cramped shoulders won’t help you troubleshoot issues in the Moodle group selector any faster.
The benefits of classroom visits (and how to make them effective)
By Tommaso Magrini, PhD student, Department of Materials
It requires effort, time and theoretical preparation to be truly able to deliver a good lecture, to properly plan a class, or to assess the performance of your students. As I am writing this document, I am still in this process of learning. Nevertheless, the more I am involved in teaching, the more I can experiment with new techniques or approaches, keeping the structures that worked, adjusting those that didn’t. One technique I plan to keep is classroom visits with peers.
Over the course of my doctoral studies I enrolled in the program “Learning to Teach”. One of the most interesting things I have learnt is the concept of “learning by doing”. Not only is “learning by doing” more fun and engaging for the students, it has also proven to be the best ally for lecturers in reaching their learning objectives. Step by step, I introduced different activities in my classes, ranging from simple short discussions to more advanced posters and presentations. Those activities not only proved to be a good way to keep the students focused and active, but were also extremely useful for us as teachers to assess whether the concepts our class is built on are solid and stable in the students’ minds, or whether they need repetition or consolidation. To understand whether the activities I have planned are meaningful and well aligned with the learning objectives, I asked peers to visit my classes, sit in, and provide me with feedback.
Over the last semester I started implementing a new activity at the end of each lecture. Expecting my students to build on the concepts seen in my class and restructure them into a broader and more applied context, I proposed the following activity: divided into groups, the students would need to come up with a shared idea, describe it schematically on a poster, and then pitch it in front of the class. This would then foster a discussion among ‘critical friends’, who would openly challenge each group’s idea with the goal of improving it and helping its realization. The classroom visit proved to be crucial in this phase.
As a lecturer, during such vivid and intense scientific discussions, I have to occupy several roles at the same time. Not only do I have to moderate the discussion, I also have to evaluate how deep and relevant the discussion is, while assessing whether or not the students have reached the milestones and the key learning objectives. For this reason, the presence of my colleague, who sits ‘outside the discussion’ and evaluates the classroom’s response to the activity, is extremely important. While at the beginning he would only observe, at later stages we were also able to switch roles and evaluate the class activity in turns. Being able not only to take part in the discussion but also to observe it from outside and take notes gave me a more complete view of the activity.
At the end of each class, I would always have a debriefing with my colleague. Its goal was to sum up the positive and negative aspects of the lecture in a constructive and unbiased way. These debriefings helped us correct the weak parts of the lecture and expand on the positive ones. It was clear from the very first session that the students were responding to the activity we had planned with a positive attitude and enthusiasm. Furthermore, through the classroom visits, we realized that we could assess the classroom’s knowledge more efficiently by using ‘exam-like questions’ during the discussion.
As a matter of fact, the ‘simple’ classroom visits have, in our experience, evolved into an open exchange of ideas, built on honest and constructive feedback, that has helped me improve my teaching style, the structure of my classes, and the design of more targeted and better structured learning activities.
Among other things, ETH Zurich’s EduApp allows instructors to pose clicker questions during lectures. Instructors can interrupt lectures to ask the students questions and both get and give feedback on learning progress. Lecturers can also trigger phases of peer instruction, where students discuss their initial answers to a question with one another and then re-answer the question – in effect, the students are teaching each other during those phases, hence “peer instruction”. By asking students to answer a question twice, lecturers gather data on student understanding. But how meaningful is this feedback data, in particular when answering is voluntary and ungraded?
A group of mathematics instructors at ETH’s D-MATH worked with LET to analyze EduApp data using Item Response Theory (IRT), Classical Test Theory (CTT) and clustering methods. Over the course of the semester, 44 clicker problems were posed – 12 of them twice, as the instructor decided to insert a phase of peer-instruction. The following figure shows an example of the kind of problem being analyzed:
The problem shown was used in conjunction with peer-instruction; the gray bars indicate the initial student responses, the black bars those after the discussion. A simple, unsurprising observation is that after peer-instruction, more students arrived at the correct answer. What can we learn from these responses? CTT and IRT can provide psychometrics that help understand this instructional scenario.
When it comes to being “meaningful,” the “discrimination” parameter of a problem is of particular interest: how well does correctly or incorrectly answering a problem distinguish (“discriminate”) between students who have or have not understood the underlying concepts?
CTT simply uses the total score as a measure of “ability”, but also has a measure of discrimination (“biserial coefficient”). IRT estimates the probability of a student arriving at the correct answer for a particular problem (“item”) based on a hidden (“latent”) trait of the student called “ability” – typically, higher-ability students would have a higher chance of getting a problem correct. How exactly this probability increases depends on problem characteristics (“item parameters”).
In IRT, the ability-trait is determined in a multistep, multidimensional optimization process, where the difficulty and discrimination parameters of particular problems (“items”) feed back on how much correctly answering that problem says about the “ability” of the student; “high-ability” students are likely to get correct answers even on high-difficulty, high-discrimination problems.
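As a sketch of the idea, the widely used two-parameter logistic (2PL) item response function can be written and evaluated as follows; the parameter values below are made up for illustration and are not taken from the study:

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: probability that a student with
    ability theta answers an item correctly, given the item's
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# For a high-discrimination item (a = 2.0, b = 0.0), the probability
# of a correct answer rises steeply around theta = b:
for theta in (-1.0, 0.0, 1.0):
    print(f"theta={theta:+.1f}: P(correct) = {p_correct(theta, 2.0, 0.0):.2f}")
```

The steeper this curve around the difficulty b, the better the item separates students below and above that ability level, which is exactly what the discrimination parameter captures.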
The results of their study were extremely encouraging: using both CTT and IRT, almost all 44 problems under investigation exhibited strong positive discrimination in the initial vote. This means that the better a student understood the underlying concepts, the more likely they were to give the right answers – and vice versa. A low discrimination, on the other hand, means a problem provides less meaningful feedback. For the handful of problems with lower (yet still meaningful!) discrimination, this could be explained by other problem characteristics – for example, that at the time they were posed, they were still too hard or already too easy – but even that feedback is meaningful to the instructor for future semesters.
The truly surprising result of the study was that in all cases of peer-instruction, the problem had even stronger discrimination afterwards! Yes, unsurprisingly more students answer correctly after discussion with their neighbors (the problem becomes “easier”), but: peer-instruction does not simply allow weaker students to enter the correct answer, it apparently helps them to perform at their true potential.
For the purposes of the study, the clicker data had to be exported manually, but the next version of EduApp, slated to be released in December 2020, will allow export of data for learning analytics purposes directly from the interface – the following figure shows a sneak preview of that new functionality.
The exported data format is compatible with input for the statistics software R, and there is a variety of guides available for how to analyze such data (https://aapt.scitation.org/doi/abs/10.1119/1.5135788, accessible through the ETH Library, provides a “quick-and-dirty” guide).
The full study, including results from Classical Test Theory and clustering methods, as well as an outlook on new EduApp functionality, is available open access in Issue 13 of e-learning and education (eleed) under https://eleed.campussource.de/archive/13/5122.
Poor course evaluations for innovative teaching?
“With great commitment, I redesigned my course according to the latest findings in didactics. Whether in a flipped classroom or with increased use of clicker questions, the students were actively challenged during class time and participated willingly. But the course evaluation brought a great disappointment: the students rated me and my teaching considerably worse than before. They even said they prefer traditional lecturing, because they learn better that way. Did I do something wrong? Should I go back to my tried-and-tested lecture?”
What sounds like an isolated case here is in fact quite common. Numerous studies document that teaching formats with learner-centred methods frequently lead to poor evaluation results (e.g., Seidel, 2013). In a recently published study, Louis Deslauriers and colleagues at Harvard University investigated precisely this problem (Deslauriers, 2019). They examined whether, for actively involved students, there is a discrepancy between actual and perceived learning gain. If students have the impression that they learned less than in a lecture, this inevitably leads to poorer evaluation results. The experimental study confirmed the negative correlation between self-assessment and effective learning gain. The following three reasons are responsible:
Interactive teaching methods demand increased cognitive effort, which students do not necessarily associate with learning.
First-year students in particular do not yet have the ability to correctly assess their own knowledge in a new subject area.
The clear and eloquent presentation in traditional lectures tempts students to considerably overestimate their own understanding in lectures.
Based on the results of a follow-up study, Deslauriers and colleagues propose some quite simple measures to prevent this mismatch between perceived and actual learning gain. The central point is to take students’ concerns and anxieties seriously and to address them openly. A short explanation of the learning benefits of active teaching (e.g., Freeman, 2014) during the first lesson can already allay initial concerns. Throughout the rest of the course, the learning progress achieved should be pointed out again and again; in this way, students gain a better assessment of their own learning gain. It is also helpful to point out the danger of an illusion of learning created by an eloquent lecturer (e.g., Toftness, 2018).
At ETH, too, we were able to confirm the influence of these measures. In a study at the Department of Physics, we compared the learning gain between interactive teaching and a lecture. Already in the first teaching unit, we explained the positive effects of interactive teaching to the interactive group in detail. In addition, uncertainties about students’ own learning gain were continuously addressed throughout the semester. In the course evaluation, we then found no mismatch between effective and perceived learning performance. Students in the interactive teaching format achieved a higher learning gain and reported significantly better values regarding their own learning than those in the parallel lecture (Schiltz, 2018).
Tip: What applies here specifically to interactive teaching formats can certainly be transferred to any other change of learning format. Especially when the new format is not yet familiar, students’ initial reservations should be taken seriously, and the benefits they can expect from the change should be communicated clearly. It is also important to critically question the results of course evaluations (whether good or bad): the causal link between student satisfaction and actual learning success is not always given (e.g., Carpenter, 2020).
Does anything ever happen after those teaching evaluation surveys?
Maybe you know the problem. You want feedback on your teaching from your students. You want to know what they think went well, and what didn’t. Maybe you need their evaluations for future job applications. In any case, a more or less representative amount of feedback and number of ratings would come in handy. But your students are sick of evaluations! They wonder why they have to fill something out which will be of no use to them, and nothing ever comes of evaluations anyway…
So goes the muttering of students. However, something actually does happen with student evaluations – even if most students aren’t aware of it. It is rare that someone attends a course twice, and there is little opportunity to find out whether lecturers have implemented their students’ wishes. Therefore, to let students and others know what happens after a questionnaire is submitted and why evaluations are important to teaching quality, we have created a 3-minute video with the help of Youknow (specialists in explainer videos). Please show this video to your students and motivate them to take part in the survey! This is especially useful if you have the opportunity to conduct the evaluation in class.[1]
The challenge for us was to explain the entire comprehensive, stringent evaluation process, from the survey via publication of the findings to the derivation of appropriate measures, briefly and appealingly. A bigger challenge was to be responsive to students and take their criticisms seriously, while also dignifying the engagement of most lecturers. Whether and how well we have achieved this in three minutes of moving images is for you to decide!
[1] By in-class evaluation we mean that you program the time your evaluation survey will be sent out to students, and you ask them during your lecture time to fill in the survey via their laptop or smartphone.
Case Study – Peer Review Mastering Digital Business Models
As part of a series of case studies, staff at LET sat down for a conversation with Prof. Elgar Fleisch, Johannes Hübner and Dominik Bilgeri from the Department of Management, Technology, and Economics (D-MTEC) to discuss their Mastering Digital Business Model (MDBM) course.
What is the project about?
In this Mastering Digital Business Model (MDBM) course, Prof. Elgar Fleisch, Dominik Bilgeri, George Boateng and Johannes Huebner teach Master’s-level students a theory- and practice-based understanding of how today’s information technologies enable new digital business models and transform existing ones. The course features a novel examination mode: a video group project is introduced as a core element contributing to the overall course grade. In addition, students are asked to participate in a peer-to-peer review of the videos produced by other student groups, which is independent of the grading and is geared towards giving students insights into how other groups solved the challenge. The best-rated videos are then shared with the entire class at the end of the semester.
As part of this newly created examination element, course participants (in teams of two to three students) explain one of the major lecture topics (theoretical lenses) in the first half of their video. Then they apply the same lens by analysing a company, aiming to better understand its underlying business model. Companies are pre-selected and allocated to students for fairness reasons. Every year, we choose a pool of interesting companies in the context of digital transformation, the Internet of Things, Blockchain, e-health, etc.
What motivated you to initiate the project?
The core idea was to improve students’ learning success by using an examination format that not only requires learners to reiterate theoretical content, but also to apply the theory in a practical context. The students have different backgrounds and do not necessarily have a strong business focus, which means that many of the concepts taught in class may be rather abstract. We used the video format and specific companies as case studies because we think this is a good way to trigger curiosity, show concrete examples of modern companies in a compact form, and, compared to other examination formats, push students to reflect deeply upon the theoretical frameworks.
How did you do it?
Aside from the weekly input lectures, we ask students to form groups at the beginning of the semester. We then provide a list of theoretical core topics from which each group can choose one. In addition, we randomly assign each group to a case company. The theoretical topic first needs to be explained in the first half of the video and then applied to the case company in the second half. We thus used a prosumer approach, where students become part of the course because they create a small section of its content. The best videos are shared with the class and can be reused as additional learning materials for future cohorts. This set-up generally resulted in high-quality videos, perhaps also because students knew their videos would be used again.
Students also had to review the video projects of five other groups. They had to clearly describe whether and how their peers applied the course's analytical perspectives (called "lenses"), both in the video itself and in their feedback. In this way they analysed once more how the newly learned concepts were visible in other companies – a positive side effect being that they also honed their reflection and feedback skills.
Did you have the support you needed for the project? Is there additional support you wish you had had to help you to achieve your goals?
We asked two students from previous cohorts
to join us as tutors, and support this year’s groups primarily with technical
questions about video-making (e.g. tools, quality considerations etc.). In
addition, we designed one of the lecture slots as a coaching session during
which we would further support student groups with their questions. In total,
this approach allowed us to provide the students with high-quality supervision
with reasonable effort.
Please describe some of the key outcomes of the project
To most students, the task of creating a
video was new. We received feedback that while the initial effort for learning
how to make a video was higher compared to other examination formats, it was also
fun and very helpful to really understand and apply the new concepts. They said
that they learned things more deeply and more sustainably because they had to
consider all details and aspects – compared to the practical exercises they are
familiar with in other courses. By carefully phrasing their arguments when giving feedback on peer videos, students became more aware of their own thinking and
argumentation.
We observed that the questions students asked once they started creating videos were different and went deeper, i.e. their reflections were based on many concrete examples of companies, and the concepts were put into perspective. The same sub-concepts take on a different meaning in another context, and students now see the overarching principles better and can argue more precisely about theoretical aspects. Without these concrete examples, it would have been harder to grasp the theoretical aspects.
How did the project impact learners or the way in which you teach?
We were surprised by the high quality of the
best student videos. The teaching team is now really motivated to continue
innovating on our approaches in other courses. We saw clearly that when
students are very active we get better results, deeper learning and better
reflection.
What lessons learned do you want to share with your colleagues?
It can really pay off to try things and to experiment. We think that nowadays the classic format of passive lectures and final exams may not always be the best choice. We believe the deeper engagement and improved outcomes the video assignment produced justified the investment in developing new approaches and tools.
When considering videos as an examination
format, you should define the entire course/project very clearly. When
describing what production options students have for videos, you should be very
precise. Offering too many options can be counterproductive. It is better to
present 3-4 crystal-clear examples and stick to them.
Also, we would recommend managing students' expectations clearly at the beginning of the semester, and highlighting both
the benefits and challenges of this examination format. Of course, this becomes
easier after the first year, when you can draw from the experience of the first
cohort, and also provide examples of prior videos to illustrate what is
expected of the groups. Because the students are co-creators you get new and
relevant content which enriches the course and can serve to motivate both
students and teachers.
What are the future plans for this work? How do you plan to sustain what you have created through the project?
We plan to optimize some details of this course, and to move further in the direction of a flipped classroom so that we can use this teaching approach in other courses. We will create a library of the student videos to provide as additional learning material in future editions of the course.
Student feedback
By MDBM Student Cristina Mercandetti (mercandc@student.ethz.ch)
Your opinion about this course and the peer review & video production process – how has it influenced your learning process?
Cristina Mercandetti: I really enjoyed both the
course and the video production process. I think they complemented each other
very well and we were able to directly apply the theoretical knowledge learned
in the course to work on our project. It helped me to think more critically
about the course content, and really dive into some of the lenses and models
presented. I don’t think this would have been possible without the video
production, so it definitely improved my learning process.
Do you think this approach could be used in other courses?
Cristina Mercandetti: Yes, I think this approach could easily be used in other
classes. However, I think part of the fun in this class was that the video
production was something very new and refreshing (a side effect was that I
learned how to cut a short movie). I imagine that if several classes introduced
this it would lose some of its novelty and could be stressful, as it took a lot
of time.
Final remarks about the course
Cristina Mercandetti: I really enjoyed the whole
class, and heard a lot of good things from other students too.
As part of a series of case studies, staff
at LET sat down to have a conversation with Prof. Volker Hoffmann (SusTec, the Group for
Sustainability and Technology) and Erik Jentges (Educational
Developer) from the Department of Management, Technology and Economics (D-MTEC)
to discuss their corporate sustainability project.
What is the project about?
The course “Corporate Sustainability” aims to enable students to become advocates of sustainable business practices in their later careers. Each year it attracts 150-200 students with diverse disciplinary backgrounds and different educational levels (BSc, MSc, and MAS). We adapted the Six Sentence Argument (6SA) method for this course. The method focuses on enhancing critical thinking skills through structured writing and guided, double-blind peer-review.
What motivated you to initiate the project?
We wanted students to get a clearer picture
of what sustainability really is. In the course, they develop not only a deeper
understanding of corporate sustainability but also the skills to give and
receive feedback.
How did you do it?
At the core are four topics that relate to
the sustainability of corporations. These are assessment, strategy, technology,
and finance. We developed digital learning modules (videos, some with
interactive elements) that explain key concepts to support the most relevant and difficult
parts of the lecture. Also, we want to develop students’
critical thinking skills. In e-modules, students learn to formulate concise arguments with the 6SA method. The core idea builds on the
assumption that writing is thinking.
In the e-modules, students face a decision
(a micro case based on the lecture content) and argue for their preferred
course of action using a logical structure of exactly six sentences. Each
sentence fulfils a specific function in the overall argument and has a 20-word
limit. A clear grading rubric enables students to assess 6SAs in double-blind
peer reviews. These have been continuously adapted and improved since 2015. The
specialized online tool “peergrade” also helped us to conduct a smooth process
– for both students and teachers.
Through the peer assessment, students engage
critically with their peers’ arguments and receive constructive feedback on
their own arguments. With the 6SA exercise, students learn to argue with
clarity, and it helps them to reflect on the way they and others think.
During the second half of the semester,
students work in diverse teams to prepare mock debates, consulting strategies,
economic models and campaign videos. In this phase, they are coached by several
postdoctoral and doctoral researchers from SusTec, the Group for Sustainability and Technology. The students then present their projects and display their skills in
a group puzzle session and are debriefed in the following final lecture
session. Students receive grades for both individual and group performance and
can earn a bonus on their exam grade when completing the critical thinking
exercises.
Did you have the support you needed for the project? Is there additional support you wish you had had to help you to achieve your goals?
The project received funding from different
sources. This helped us to hire academic staff to assist in the development of new
teaching approaches and the production of high-quality videos. In addition, we
received specialist guidance in the instructional design and production of
videos.
Please describe some of the key outcomes of the project
With regard to our feedback modules, we think
that the quality of the argumentation and peer reviews has increased over the
years. For example, we learned that the effective design of such peer
assessment exercises for students requires training on how to give constructive
feedback and that it should involve several feedback loops to support the
development and refinement of critical thinking skills. Overall, the course now
integrates many innovative teaching elements and was a finalist in the 2018 ETH
KITE award.
How did the project impact learners or the way in which you teach?
When students are able to write clear and concise arguments that convince critical readers, and if they can give constructive
feedback to arguments that are being made to justify strategic decisions, then
they are able to actively shape good decisions in a company setting – they can
be change-makers for corporate sustainability. The students were motivated by
the new teaching approaches such as the supporting videos, interactive
questions inside the videos, and the critical thinking exercises. Peer
assessment is “homework” for the students, but they know that they can earn a
bonus on their exam grade – and they are already rehearsing for some parts of
the final exam.
With regard to students’ learning, the peer
review process itself is convincing. What is unique to our teaching situation is
the incredible diversity in the classroom. A 19-year-old Swiss environmental
science student may be sitting next to a 25-year-old Chinese student who is pursuing
a master’s degree in management, who in turn sits next to a 35-year-old
American part-time student with a PhD in chemistry and a management position
with responsibilities for 20 employees in a multinational company. Peer
feedback is a powerful solution to bridge these gaps of different levels of
experience and cultural backgrounds. It allows younger students to write a
creative and brilliant argument without being intimidated by more senior
students. It allows a shy and quiet student to gain confidence by formulating a
convincing argument whose strengths are recognized in their peers’ feedback. It
creates a space for older students to learn how to coach younger classmates with
constructive feedback to improve their reasoning.
That is why at D-MTEC, we use peer feedback
in other courses as well. Students learn more when actually giving feedback
compared to when only submitting an assignment.
What lessons learned do you want to share with your colleagues?
At the beginning, it was a lot of work and many
people were involved, but it was worth it. Today, with regard to the critical
thinking exercises, we have continuously refined our processes. Every student writes
three reviews, thereby ensuring that everyone also receives much more feedback
than a single lecturer could provide. The main work for lecturers is providing
an overview of the themes in the arguments and summarizing the activity for all
students. This lets them know that their individual contribution becomes part
of a collective intelligence. There are always truly smart and innovative
solutions that need to be shared with the whole class. Also, there is little
effort involved in re-grading/moderating student questions about feedback,
because we train students to write helpful and considerate feedback and make them aware that they also have to learn how to receive feedback, especially feedback that they don't want to hear but need to.
For the production of videos, we recommend planning enough time and engaging with video experts and instructional designers early on. Especially writing a concise script for a short video requires a surprising amount of time until it effectively conveys your key points.
If you are interested in applying these concepts in your own courses, please contact LET.
Note: The project received funding from different sources (Innovedum, Emil Halter Foundation, ETH Critical Thinking Initiative).
Additional resources and comments
Article: Kölbel, J., & Jentges, E. (2018). The Six-Sentence Argument: Training Critical Thinking Skills Using Peer Review. Management Teaching Review, 3(2), 118–128. https://doi.org/10.1177/2379298117739856