Integrating Structured Peer Assessments into KS3 English units of work

Guest post by Ruth Greener, Teaching, Learning and Assessment Coordinator and Teacher of English, St. Andrews School, Green Valley Campus, Thailand

 

I recently wrote two new Structured Peer Assessments to use in KS3 English.

  • Honeydukes: “Write 100 words to advertise a brand new product that will go on sale in Honeydukes in Hogsmeade.”
  • School Uniform: “Write 200 words to argue in favour of or against school uniform.”

Both SPA activities went well, but the shorter time worked better.

I used Honeydukes with Year 8 on a 25-minute timer. We've been reading Harry Potter and the Prisoner of Azkaban. They designed their products on paper in class and then did the SPA activity as a first draft, which went on to become a pitch for Dragons' Den, an oral assessment.

With Year 9, School Uniform was a bridge from rhetorical speeches into a class debate on school uniform vs. dress code. I gave the students a bit longer (40 minutes) to construct a speech using some rhetorical techniques.

The time pressure is popular with students, and so is the recognition provided by the Judgement Rank.

I enjoyed being able to compliment students on their useful feedback for each other as well as embarrass the ones who were being lazy on that front. They hadn’t realised beforehand that I could look at everything once the activity was finished.

New design for the Whiteboards

The Yacapaca whiteboards are designed to be projected to the whole class during or immediately after a quiz. My colleague Sasha Sirota has just completed a refresh of the design, following a brief to reduce the cognitive load.

Access either of the two whiteboards from the Results dropdown.


Teams Whiteboard


Team results update dynamically as answers come in. This is a great motivator for students if you run it during the quiz. For extra oomph:

  • allow multiple attempts at a short quiz rather than a single attempt at a long one. This produces more frequent updates.
  • get students to name their teams, using the Manage Teams button at the bottom of the page. This button also allows you to override the default team memberships. Yacapaca auto-assigns balanced teams to promote competitiveness and give even your weakest student an equal chance to be on a winning team.

If you have not created teams, the Teams Whiteboard opens in individual-student mode.
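The team balancing itself happens automatically, but if you are curious what "balanced" means in practice, here is a minimal illustrative sketch in Python. It is not Yacapaca's actual algorithm, and the function name and inputs are my own placeholders: a snake draft over a list of students sorted by prior attainment, so every team ends up with a similar spread of ability.

```python
def assign_balanced_teams(students_by_ability, n_teams):
    """Illustrative snake draft (not Yacapaca's own code): deal students
    1..n, then n..1, so each team gets a similar spread of ability.
    `students_by_ability` is a list of student names, strongest first."""
    teams = [[] for _ in range(n_teams)]
    for i, student in enumerate(students_by_ability):
        round_number, position = divmod(i, n_teams)
        # reverse the dealing order on every other round
        team_index = position if round_number % 2 == 0 else n_teams - 1 - position
        teams[team_index].append(student)
    return teams

# Example: six students, strongest first, dealt into two teams
print(assign_balanced_teams(["Ava", "Ben", "Cal", "Dee", "Eli", "Fay"], 2))
# [['Ava', 'Dee', 'Eli'], ['Ben', 'Cal', 'Fay']]
```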

Analysis Whiteboard


Use this after the quiz has finished. By default it shows a list of questions sorted by each question's average score. Click on any question to open its detail view. Note the three steps to viewing it:

  1. Initial view shows the question and options only.
  2. Show Graph shows how students answered.
  3. Show correct reveals the correct answer(s).

Use them in sequence for dramatic effect and to build student engagement.

I hope you like the new Whiteboard designs as much as I do!

What is Structured Peer Assessment?

Getting students to mark each other's work has two huge advantages: it is brilliantly formative, and it saves you a ton of time. But how accurate is it? And how do you validate that it really is helping students learn?

Structured Peer Assessment (SPA) is our patented system to deliver accurate, demonstrably formative peer assessment. Here is what it looks like from the student perspective.

The theory behind SPA comes from Alastair Pollitt's seminal 2004 paper Let's stop marking exams (pdf), which brought the concept of Comparative Judgement (CJ) into exam marking and led to a number of initiatives.

The core idea behind CJ is this: instead of trying to mark one piece of work against a strict set of criteria – a scoring rubric – CJ presents the marker with pairs of answers and simply asks “which is better?” Multiple markers work in a team, so that each answer is assessed by several different people, against multiple other answers. By processing all these comparisons through an appropriate algorithm, those data can be converted into a rank order for all answers, and thus into grades.
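Pollitt's paper does not tie CJ to one particular algorithm, and I won't pretend the sketch below is Yacapaca's implementation, but a Bradley-Terry-style fit is a common way to turn pairwise judgements into a rank order. Here is a minimal version in Python, assuming each judgement has been recorded as a (winner, loser) pair of answer IDs; the function name and the smoothing constant are my own illustration.

```python
from collections import defaultdict

def rank_answers(judgements, iterations=200):
    """Turn pairwise 'which is better?' judgements into a rank order using a
    simple Bradley-Terry fit. `judgements` is a list of (winner, loser) pairs."""
    wins = defaultdict(float)        # judgements won by each answer
    pair_counts = defaultdict(int)   # how often each unordered pair was compared
    answers = set()

    for winner, loser in judgements:
        wins[winner] += 1
        pair_counts[frozenset((winner, loser))] += 1
        answers.update((winner, loser))

    strength = {a: 1.0 for a in answers}  # latent 'quality' of each answer

    for _ in range(iterations):
        updated = {}
        for a in answers:
            # comparisons involving answer a, weighted by current strengths
            denominator = sum(
                count / (strength[a] + strength[b])
                for pair, count in pair_counts.items() if a in pair
                for b in pair if b != a
            )
            # +0.5 is a small smoothing prior so answers that never won stay rankable
            updated[a] = (wins[a] + 0.5) / denominator
        total = sum(updated.values())
        strength = {a: s * len(answers) / total for a, s in updated.items()}

    return sorted(answers, key=strength.get, reverse=True)  # best answer first
```

Feed in every judgement from the class and the returned list is the collective rank order; grades then come from deciding where the boundaries fall along it.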

The big benefit is that you escape from the straitjacket of highly prescriptive rubrics, which tend to reward rote learning of certain key words and phrases, and instead you can choose to reward such attributes as clear logic or narrative flair.

And the downside? Well, accuracy requires that each student’s answer be seen 30 times. Some of those decisions can be made almost instantly but some require considerable thought. Whilst it has been claimed that CJ is quicker than conventional marking, I and others have yet to be convinced.

But… if the students are doing the marking, that doesn’t matter! In fact it is a benefit. More time on task = more learning. We actually ask students to spend as long on the marking task as they did on the writing task.

This was the insight that led to SPA, but we didn't stop there. We added an extra element: students must also state their reasons for preferring one answer over another. This does two things: it forces the student to think through their rationale, and it provides a trove of formative feedback for the original writer of the answer.

And to encourage students to write thoughtful, positive feedback, we gamify it by allowing students to reward each other with ‘badge points’ for particularly good explanations.

From all this, you actually get three useful sets of marking data:

  • All the answers are ranked according to the collective decision of the students.
  • Students' accuracy as markers is also ranked; technically, we take a statistical measure of conformance to consensus (a simple proxy is sketched below).
  • We also report a combined average of the two.
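Yacapaca does not publish the exact statistic behind the second bullet, but one simple proxy for conformance to consensus is the share of a marker's judgements that agree with the final rank order. A sketch, reusing the illustrative rank_answers() output from the earlier example:

```python
def marker_conformance(judgements_by_marker, final_rank):
    """One possible proxy (not necessarily Yacapaca's statistic): the fraction
    of each marker's judgements that agree with the consensus rank order.
    judgements_by_marker: {marker_id: [(preferred, other), ...]}
    final_rank: answer IDs ordered best first, e.g. from rank_answers() above."""
    position = {answer: i for i, answer in enumerate(final_rank)}  # 0 = best
    scores = {}
    for marker, judgements in judgements_by_marker.items():
        agreed = sum(1 for preferred, other in judgements
                     if position[preferred] < position[other])
        scores[marker] = agreed / len(judgements) if judgements else 0.0
    return scores
```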

Depending on the summative function of the assessment, any one of these three can be converted to grades simply by deciding where the grade boundaries should go.

More importantly, SPA gives you a strongly formative assessment tool that really puts students in charge of their own learning.

 

Snow day? S’no problem!


Next snow day, you don’t have to lose continuity with your students. Here are 4 ways you can keep them focused on their learning objectives, using Yacapaca.


Quick Assignments

I have put this first because it's one many teachers never experiment with. It is simply a way to set any assignment for your students and get it back, without all the palaver of emails getting lost, blocked or mistakenly thrown into the spam folder. Video (requires Flash) | More information.

Homework

Select a topic from your syllabus, and Yacapaca will automatically choose questions at the right level for each individual student. Teacher review on the misschambersICT blog | How the monitoring works.

Revision

One step on from Homework, Revision mixes and matches from all the topics the student has so far covered and enables them to really take charge of their own learning. Video | How to manage Yacapaca Revision.

Mastery quizzes

When you assign a quiz or group of quizzes, Mastery is one of the modes you can choose. Set a success threshold and students will be presented with each quiz once per day (no more!) until they have reached that threshold. It works especially well with short quizzes that draw randomly from large question banks. Even works for Snow Weeks or (Siberia only) Snow Months! Introduction | 5 tips for success.
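To make the once-per-day rule concrete, here is a rough sketch of the scheduling logic as described above. The helper functions and the 0.8 threshold are placeholders of my own, not Yacapaca's API.

```python
from datetime import date

def quizzes_due_today(quizzes, best_score, last_attempt_date, threshold=0.8, today=None):
    """Rough sketch of the Mastery rule (placeholder helpers, not Yacapaca's API):
    keep offering each quiz at most once per day until the student's best score
    reaches the success threshold."""
    today = today or date.today()
    due = []
    for quiz in quizzes:
        if best_score(quiz) >= threshold:
            continue                 # mastered: stop presenting this quiz
        if last_attempt_date(quiz) == today:
            continue                 # already attempted today: wait until tomorrow
        due.append(quiz)
    return due
```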

Structured Peer Assessment reviewed by Adam Williams

This guest post is by Adam Williams, Teacher of IT and Computer Science, City of Norwich School, an Ormiston Academy.

I have trialled Structured Peer Assessment exercises over the last few weeks with my classes. When I first saw it pop up as a new way of getting structured written work out of students, I jumped at the chance. Students are focused while writing their own responses, and even more so when giving feedback on others' work.


I had planned to block out a whole one-hour session to trial this initially, but due to the time constraints of the lesson I ended up with about 40 minutes left. To start with I showed them the video provided on the website and set the question: "People often want to buy the latest smartphone or other computing device, even though the devices they own still work. Discuss the impact of people wanting to upgrade to the latest smartphone. In your answer you might consider the impact on: stakeholders, technology, ethical issues and environmental issues." I then set them off with 30 minutes, leaving a little bit of time for feedback at the end of the session.

They took it very seriously (they are an optional GCSE class taking my subject as an extra option) and could see the benefits themselves. Their responses to other students were purposeful and exceptionally useful for drawing out misconceptions, and I love that I can pick up their answers afterwards, display them on the board and dissect where and why they would be picking up or losing marks, and how they compare with other answers. They also quite liked being high up on the leaderboards as ranked by their peers.

From their feedback, they would have preferred a little less time on the peer-marking stage, as they felt it was too long to spend reading through the same content, worded slightly differently, a number of times.

Following the lesson, the structure of their writing for long-mark questions has dramatically improved and the amount of waffle has reduced. They are now more succinct, and over a few of these exercises they have learned that quality over quantity is sometimes what an exam question calls for.