Getting students to mark each other's work has two huge advantages – it is brilliantly formative and it saves you a ton of time. But how accurate is it? And how do you validate that it really is helping students learn?
This guest post is by Adam Williams, Teacher of IT and Computer Science, City of Norwich School, an Ormiston Academy.
I have trialled Structured Peer Assessment exercises with my classes over the last few weeks. When I first saw it pop up as a new way of getting structured written work out of students, I jumped at the chance. Students are focused while writing their own responses, and even more so when they are giving feedback on others'.
I had planned to block out a whole one-hour session to trial this initially, but due to time constraints of the lesson I ended up with about 40 minutes. To start with, I showed them the video provided on the website and set the question "People often want to buy the latest smartphone or other computing device, even though the devices they own still work. Discuss the impact of people wanting to upgrade to the latest smartphone. In your answer you might consider the impact on: * stakeholders * technology * ethical issues * environmental issues."
I gave them 30 minutes, leaving a little time for feedback at the end of the session.
They took it very seriously (they are a GCSE class taking my subject as an extra option) and could see the benefits themselves. Responses to other students were purposeful and exceptionally useful for drawing out misconceptions, and I love that I can pick up their answers afterwards, display them on the board and dissect where and why they would be picking up or losing marks, and how they compare with other answers. They also quite liked being high up on the leaderboards as ranked by their peers.
In their feedback, they said they would have preferred a little less time on the feedback stage, as reading through the same content worded slightly differently several times felt too long.
Following the lesson, the structure of their writing for long-mark questions has dramatically improved and the amount of waffle has reduced. They are now more succinct and, after a few of these exercises, have learned that in an exam answer quality sometimes beats quantity.
Imagine asking one of your classes a deep, but deceptively simple, question. Have them judge each other's answers anonymously, give their reasons for the judgements, then assess the reasons as well. At the end of the process you get a mark. Automatically.
No exercise books, no late-night marking sessions. Just high-quality formative assessment. Empowered, gamified, peer-supported learning that just works.
Now imagine running this lesson with an Ofsted inspector in your classroom. Think they’d be impressed? So do I.
So you can try SPA for yourself, I have created this set of demonstration exercises you can assign to your students. It costs nothing, so give it a go.
Here are a few examples of SPA questions. Add your own!
- You are a serf in a Norman village. Describe your day.
- Where would you rather live – Singapore or Dubai? Why?
- How could you use a barometer to determine the height of a tower?
- Why can a cheetah run faster than a gazelle?
- Explain why metals are sometimes defined as plasmas.
- How would you measure the volume of a dog?
SPA is a technology we patented several years ago and have been quietly working on ever since. This is the first time we’ve had it available in the main Yacapaca interface.
And here it is. Patents are written in a language that is absolutely mind-numbing to the rest of us, so I'll forgive you for not reading it. Basically, what it covers is our Thurstone Ranking approach to peer assessment, which is unique in the world.
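Yacapaca's patented implementation isn't public, but the general idea behind Thurstone-style ranking is well known: instead of asking judges to score work on an absolute scale, you ask them to compare pairs of answers, then estimate a scale value for each answer from its pairwise win rates. A minimal illustrative sketch (this is the classic Thurstone Case V approach, not Yacapaca's actual algorithm; the function name and data shapes here are my own invention):

```python
from statistics import NormalDist

def thurstone_scale(items, wins):
    """Estimate Thurstone Case V scale values from pairwise comparisons.

    wins[(a, b)] = number of comparisons in which a was judged better than b.
    Returns a dict of item -> scale value; higher means ranked better.
    """
    ndist = NormalDist()
    scores = {}
    for a in items:
        zs = []
        for b in items:
            if b == a:
                continue
            w = wins.get((a, b), 0)
            l = wins.get((b, a), 0)
            if w + l == 0:
                continue  # this pair was never compared
            # clamp the win proportion away from 0 and 1 so the probit is finite
            p = min(max(w / (w + l), 0.01), 0.99)
            # probit (inverse normal CDF) of the proportion of wins
            zs.append(ndist.inv_cdf(p))
        scores[a] = sum(zs) / len(zs) if zs else 0.0
    return scores
```

For example, if answer A beats B and C in most comparisons and B mostly beats C, `thurstone_scale` assigns A the highest scale value and C the lowest, producing a leaderboard without any judge ever giving a numeric mark.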
We applied for this patent four years ago so it has been a long slog to get it granted. You can imagine that corks are popping at Yacapaca HQ tonight.
Starting from Monday, we are introducing a new level of challenge into our quizzes. We shall be asking students to write formative feedback statements for each other. This is strictly experimental, and I have my finger on a button to switch it off instantly if it is not working.
Here is what your students will see during their next quiz:
Last term, we ran a series of limited tests which convinced me that a significant minority of students can write formative statements that actually …