Jez Thompson reviews the Computer Science Practice Exams

This guest post is by Jez Thompson, Strategic Lead IT at The Open Academy in Norwich

Jez blagged a free trial of our J276 OCR Computer Science Practice Exams in return for a promise to write up his experience. 

 

After reading How we cracked the auto-marking of GCSE short-text responses I asked Yacapaca for a trial.

Our Year 11 learners were privileged to trial Yacapaca’s AI-marked ‘short-text’ practice exam questions; feedback from our learners was incredibly Continue reading

Funny exam question answers

I thought you might enjoy some of the answers your students have been giving to the short-text questions in our J276 GCSE Practice Exam module.

Actually, I don’t know whose students these are. The students’ responses are completely anonymised when we see them. As the first results to each question come in, we analyse the performance of the auto-marking algorithm and tweak the scoring rubrics to improve performance. In the process, we discover that student humour is alive and well…

    • Describe the features of RAM. (2)
      It can breed with sheep, they are fluffy and have Continue reading

How we cracked the auto-marking of GCSE short-text responses


Courtesy of https://www.flickr.com/photos/niexecutive/9570621018

The mainstay of most GCSE exams is the short-text question. For our new GCSE Practice Exams, we have developed a free-text marking engine that is robust enough to deal with real GCSE questions and, more importantly, real students. That system is now Continue reading
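To give a flavour of what rubric-based marking of short-text answers involves, here is a deliberately naive sketch. The rubric, the synonym lists and the scoring scheme are all invented for illustration; Yacapaca’s actual engine is far more robust than simple substring matching.

```python
# Illustrative sketch only: a naive rubric-based short-text scorer.
# NOT Yacapaca's engine; rubric content and matching strategy are invented.

RUBRIC = {
    # each marking point maps to a set of acceptable phrasings
    "volatile": {"volatile", "loses data when power off", "temporary"},
    "holds running programs": {
        "holds running programs",
        "stores programs in use",
        "stores currently running data",
    },
}

def mark_short_text(answer: str, rubric: dict, max_marks: int) -> int:
    """Award one mark per rubric point whose phrasing appears in the answer."""
    text = answer.lower()
    marks = sum(
        1
        for phrasings in rubric.values()
        if any(p in text for p in phrasings)
    )
    return min(marks, max_marks)

print(mark_short_text("RAM is volatile and holds running programs", RUBRIC, 2))  # 2
```

The hard part, of course, is everything this sketch ignores: spelling variants, paraphrase, and answers about fluffy ruminants.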

GCSE Exam Practice modules coming in April


This April, we are launching a new series of professionally written and edited online practice GCSE exams, entirely auto-marked.

The aim is to raise GCSE results by at least one grade level for students who practise regularly.

The first fruit of our labours is J276 OCR Computer Science Practice Exams. Follow the link to see the current state as we complete development.

Features:

  • All GCSE question types, including multiple choice, short-text and long-text
  • Exam-timed questions
  • Instant, automatic marking
  • Formative feedback
  • Student motivation features
  • Select questions by keyword, topic and difficulty

There is a huge pile of work to do between now and then: new software modules to be written, some of them using advanced AI, and a whole authoring and editing process to be created and staffed with talented individuals. But there is nothing we’ve not done before. What this project does, for the first time, is pull together the skills we have developed over the last 25 years into a single, amazing product.

I’m excited!

Is Yacapaca’s GCSE 9-1 grading accurate?


My thanks to Bobby Grewal at Four Dwellings Academy for the awkward question.

Let’s start by asking what it is we are trying to measure. At Yacapaca, we have opted for “current” grading, as that’s what the majority of our users want. This means that we are attempting to answer the question:

  • If the student were to take the GCSE exam today,
  • and they had studied all the content to the same level as the content of this quiz
  • then what grade might they be expected to achieve?

How can we test whether our answers are accurate without actually entering students as young as 11 for a real GCSE? That is impractical, and we can’t even (yet) back-calculate from real GCSE results. So we have to calibrate against an existing scheme for which we have data that can be validated.

Fortunately, there are two such sources: NC Levels and GCSE A*-G. There are published conversion tables between each of these and the GCSE 9-1 grades. My preferred version is the QEGS Points Conversion Wheel.


The next question is: how do we know that our original NC and GCSE gradings were accurate? The answer is that we crowdsourced the calibration from thousands of teachers. As well as quizzes, Yacapaca allows you to add other grades from teacher-graded assignments. We correlate these to difficulty-adjusted quiz scores using linear regression. Thus, the accuracy of our quiz scores is based on the consensus of the teaching profession, rather than one arbitrary authority.
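The calibration idea can be sketched in a few lines. This is a toy version with invented sample data, not our production pipeline, which draws on grades from thousands of teachers: fit teacher-awarded grades against difficulty-adjusted quiz scores with ordinary least squares, then read grades off the fitted line.

```python
# Toy sketch of the calibration: least-squares fit of teacher-awarded 9-1
# grades against difficulty-adjusted quiz scores. Sample data is invented.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

# x: difficulty-adjusted quiz score (%); y: teacher-awarded 9-1 grade
scores = [35, 45, 55, 65, 75, 85]
grades = [2, 3, 4, 6, 7, 8]
a, b = fit_line(scores, grades)

def predict_grade(score: float) -> int:
    """Map a quiz score onto the 9-1 scale via the fitted line."""
    return round(a * score + b)
```

With more teachers contributing grade pairs, the fitted line converges on the profession’s consensus rather than any one marker’s opinion.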

Now, we have to ask ourselves how accurate the result of any single quiz can be. There are fundamental limits: consider the hypothetical case of a quiz that contains only one question, which a student gets either right or wrong. Does a correct answer equate to a Grade 9? An incorrect one to a Grade 1? Of course not. A single answer tells you almost nothing. In practice, the absolute minimum you should rely on is 50 questions spread across at least three separate quizzes.
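A back-of-envelope calculation makes the point (the numbers are mine, not a Yacapaca specification): if a student’s true ability means they answer each question correctly with probability p, the standard error of their observed proportion-correct shrinks as the number of questions n grows.

```python
# Back-of-envelope sketch: standard error of an observed proportion-correct
# for a student with true success probability p over n questions.
import math

def score_standard_error(p: float, n: int) -> float:
    """SE of the observed score = sqrt(p * (1 - p) / n)."""
    return math.sqrt(p * (1 - p) / n)

for n in (1, 10, 50):
    print(n, round(score_standard_error(0.6, n), 3))
# 1  -> 0.49   (one question: the "score" is almost pure noise)
# 10 -> 0.155
# 50 -> 0.069  (about a seventh of the single-question uncertainty)
```

At n = 1 the uncertainty is nearly half the whole score range, which is exactly why a single answer tells you almost nothing, while 50 questions shrink it to a usable margin.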

Finally, what accuracy can you expect from a single session with any assessment tool? Everyone has good and bad days; teenagers take it to extremes. If you really want Yacapaca to give you grades that you can rely on, my recommendation is that you use it little and often, building up a picture over time. Use the Progress Chart and the Parents’ Report to discover averages and trends in the data.

So is Yacapaca accurate? Yes, provided you use it intelligently.