Getting students to mark each other’s work has two huge advantages – it is brilliantly formative and it saves you a ton of time. But how accurate is it? And how do you validate that it really is helping students learn?
This guest post is by Adam Williams, Teacher of IT and Computer Science, City of Norwich School, an Ormiston Academy.
I have trialled Structured Peer Assessment exercises over the last few weeks with my classes. When I first saw it pop up as a new way of getting structured written work out of students, I jumped at the chance. Students are focused while writing their own responses, and even more so when giving feedback on others’.
I had planned to block out a whole one-hour session to trial this initially, but due to time constraints I ended up with about 40 minutes left. To start with I showed them the video provided on the website and set the question:

“People often want to buy the latest smartphone or other computing device, even though the devices they own still work. Discuss the impact of people wanting to upgrade to the latest smartphone. In your answer you might consider the impact on:

- stakeholders
- technology
- ethical issues
- environmental issues.”

I set them off with 30 minutes, leaving a little bit of time for feedback at the end of the session.
They took it very seriously. (They are an optional GCSE class, taking my subject as an extra option.) They could see the benefits themselves. Their responses to other students were purposeful and exceptionally useful for drawing out misconceptions, and I love that I can pick up their answers afterwards, display them on the board, and dissect where and why they would be picking up or losing marks and how they compare with other answers. They also quite liked being high up on the leaderboards as ranked by their peers.
From their feedback, they would have preferred a little less time on the feedback stage, as they felt it was too long to spend reading the same content, worded slightly differently, a number of times.
Following the lesson, the structure of their writing for long-mark questions has dramatically improved and the amount of waffle has reduced. They are now more succinct, and over a few of these exercises they have learned that in an exam answer, quality can beat quantity.
Imagine asking one of your classes a deep, but deceptively simple, question. Have them judge each other’s answers anonymously, give their reasons for the judgements, then assess the reasons as well. At the end of the process you get a mark. Automatically.
No exercise books, no late-night marking sessions. Just high-quality formative assessment. Empowered, gamified, peer-supported learning that just works.
Now imagine running this lesson with an Ofsted inspector in your classroom. Think they’d be impressed? So do I.
So that you can try SPA for yourself, I have created this set of demonstration exercises you can assign to your students. It costs nothing, so give it a go.
Here are a few examples of SPA questions. Add your own!
- You are a serf in a Norman village. Describe your day.
- Where would you rather live – Singapore or Dubai? Why?
- How could you use a barometer to determine the height of a tower?
- Why can a cheetah run faster than a gazelle?
- Explain why metals are sometimes defined as plasmas.
- How would you measure the volume of a dog?
SPA is a technology we patented several years ago and have been quietly working on ever since. This is the first time we’ve had it available in the main Yacapaca interface.
Beth wrote: “I did screenshot one question that came up whilst I was testing a quiz I had written, and used it as a plenary to the previous* lesson as part of the criteria setting for the next task.”
* I think this should have been “next”.
This is a fantastic idea. It should be easy to train your students to screenshot particularly challenging choices and …