• Milestone reached

    100,000 questions in the question bank. We passed this milestone during September. As well as representing a truly humbling commitment from our authors, it also represents a stunning teaching resource. Consider:

    • 100,000 questions is 10,000 for each of 10 main subjects.
    • Across 5 years of secondary school to GCSE, that’s 2,000 questions per subject per year…
    • …which is 30 weeks x 67 questions.
    • Assuming you need to differentiate across three ability bands, that comes out as one 22-question quiz per week, per subject, throughout secondary school.
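    For the curious, the arithmetic above checks out; here is the back-of-the-envelope version (the figures are taken straight from the bullets, with rounding):

    ```python
    # Back-of-the-envelope check of the question-bank arithmetic above.
    TOTAL_QUESTIONS = 100_000
    SUBJECTS = 10
    YEARS = 5            # secondary school up to GCSE
    WEEKS_PER_YEAR = 30  # teaching weeks
    ABILITY_BANDS = 3

    per_subject = TOTAL_QUESTIONS // SUBJECTS          # 10,000
    per_subject_per_year = per_subject // YEARS        # 2,000
    per_week = per_subject_per_year / WEEKS_PER_YEAR   # ~67
    per_quiz = per_week / ABILITY_BANDS                # ~22 questions per quiz

    print(per_subject, per_subject_per_year, round(per_week), round(per_quiz))
    # 10000 2000 67 22
    ```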

    So you now have no excuses left for not knowing the current attainment level of each of your students!

    Milestones missed

    75,000 registered teachers. As I write (New Year’s Eve) we have exactly 74,863, so we missed it by just 137 teachers. Bah.

    2 million registered members in total. We’ve got 1,935,622 members in total (though that is the November figure: I’ll update it when I get the December stats in the morning). Actually, I don’t feel so bad about this, because 2010 has really been about developing new things to do with the students you have already registered.

    Some of the new features we have launched

    Much of the work we have done this year has been ‘under the hood’ to cope with the increasing server load, but we’ve still made time for some great new features. Here is a partial list:

    • Reporting by grades: tell us the grade scheme of your student set, and we will report progress in those grades. Sounds simple, but it required a massive statistical exercise to achieve.
    • Progress charts: you only get to see these once there is enough data about the student set, but teachers who use them report that they are a fantastic aid for discussing progress with students.
    • Offline assignments: not officially announced yet, but they are in the interface. If you have ever wanted a markbook that gave instant, online feedback to students, here it is.
    • Structured Peer Assessment: I have really let this experimental system languish this year. There has been some progress, but not as much as I would have liked.
    • Student module redesign: The new interface has been a hit with both teachers and students. Not rolled out to all schools yet.
    • School groups: Who in your school also uses Yacapaca? And who is more experienced and able to help you? The school group will tell you.
    • Simpler assigning: There is a lot of choice in the assigning process now, so to make it easier for newbies we separated all the detail decisions into separate ‘expert’ screens, and set the most popular values as the defaults.

    Not a bad year, all in all.

  • If you fancy extending your authoring skills over the holiday, check out Yacapaca’s list of permitted embeds, which I have just updated.

    Author Kevin O’Driscoll has produced some great exemplar material combining video with quizzes. Try his course AIMS practice: Strand 1: Number Sense.

  • Next time you log into Yacapaca, I’m going to ask you to do a little extra work. It’s worth it, I promise. I’m going to ask you to enter the grade scheme and average grade of each of your student sets.

    Why?

    Lots of teachers have told us that getting marks just as a percentage is not all that useful. You need marks to be reported in the grade scheme used by that particular class.

    So from now on, you will be able to see the markbook report according to the relevant grade-scheme – once you have enabled the feature.

    What you need to do

    Next time you either set an assignment or visit the Markbook, you will be asked to enter two pieces of data about that particular student set.

    • Grade scheme: in England that’s usually NC levels or GCSE grades, but we support a good list of other grade schemes.
    • Average grade: the grade the student set is currently achieving. Take your best guess on this; there is actually quite a wide margin for error.

    It gets better over time

    The system compiles all the data it gets (particularly the ‘average grade’ data) and performs nightly statistical comparisons to improve results. Incidentally, this is the benefit to you of using a web-based system like Yacapaca. We correlate all the data from our 1.8 million members, and this lets us get a very high level of statistical accuracy.
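    To be clear about what “reporting in your grade scheme” means, here is a deliberately simplified sketch. The grade boundaries below are invented for illustration, and this is nothing like the actual nightly statistical model; it only shows the idea of translating a percentage mark into a class’s own scheme:

    ```python
    # Illustrative sketch only: translating a percentage mark into a
    # grade scheme. The boundaries below are made up for the example;
    # Yacapaca's real reporting is driven by its own statistics.
    from bisect import bisect_right

    # Hypothetical scheme: lower percentage boundary -> grade.
    GCSE_BOUNDARIES = [(0, "G"), (20, "F"), (30, "E"), (40, "D"),
                       (50, "C"), (60, "B"), (70, "A"), (80, "A*")]

    def to_grade(percent, boundaries=GCSE_BOUNDARIES):
        """Return the grade whose band contains this percentage mark."""
        cutoffs = [cutoff for cutoff, _ in boundaries]
        i = bisect_right(cutoffs, percent) - 1
        return boundaries[i][1]

    print(to_grade(55))  # C
    print(to_grade(82))  # A*
    ```

    The point of the teacher-entered average grade is calibration: it tells the system roughly where boundaries like these should sit for that particular class.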

    If you hit problems

    Write to us at support@yacapaca.com – to make the system better, we are going to need your feedback.

    Lots of new features to come

    This update will enable us to do lots of things that we couldn’t in the past. For example, look out for easy-to-use progress charts for each student.

  • Put this on the whiteboard, snap it to full screen and sit back. The discussion afterwards will tell you what a great lesson you just taught.

    My favourite bit comes at exactly 14:00 minutes: “…about 50% of the fall in child mortality can be attributed to female education.” Doesn’t that make you proud to be a teacher?

  • In a traditional test, one does not give feedback at all. This is a result of the limitations of the medium; give feedback on paper and you also give the game away. Working online, you can give extended feedback as soon as the question is answered, but how do you do it for maximum effect?

    The problem we are trying to overcome: “in one ear, and out the other”.

    If you just tell a student something, they will generally be able to repeat it and show you that they have ‘learned’ it. An hour later, it will be gone. It won’t automatically move from short-term to long-term memory. Classic teaching technique seeks to address this through repetition; language teachers, in particular, generally try to follow Ebbinghaus’ ‘forgetting curve’ when structuring reviews. However, repetition is not the only solution, nor is it always practical to apply.
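    For readers who like to see the shape of such a review schedule, here is a toy sketch with expanding intervals. The numbers are a common rule of thumb for spaced review, not Ebbinghaus’s own data:

    ```python
    # Toy spaced-review schedule inspired by the forgetting curve.
    # Each review comes after a longer gap than the last; the starting
    # interval and growth factor are illustrative, not canonical.
    def review_days(first_interval=1, factor=2.5, reviews=5):
        """Return the days (counted from first exposure) to schedule reviews."""
        days, interval = [], first_interval
        for _ in range(reviews):
            days.append(round(days[-1] + interval if days else interval))
            interval *= factor
        return days

    print(review_days())  # [1, 4, 10, 26, 65]
    ```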

    Without oodles of repetition, simply giving students ‘the right answer’ doesn’t just fail them, it cons them. Immediately after the quiz, they think they’ve ‘got it’, but when they need it in real life, it’s gone. Many students will start blaming themselves and believing it is they, not we, who are stupid.

    My preferred solution is what Milton Erickson called “building response potential”. I try to pique the students’ curiosity so they are motivated to go off and learn the answer for themselves. How they learn it will vary; they may look it up in a textbook, discuss it with friends or sit down and work it out on paper. They will come back again and again until they are satisfied – thus doing the repetition for themselves.

    Another way of describing the principle is “give them the itch”. In the end, they will just have to scratch.

    How do you do this in practice? Here are some suggestions:

    • Explain why they got it wrong.
    • Give a hint, but not the whole answer.
    • Tease them in some other way.
    • Remind them of the mnemonic, without actually giving the answer.
    • Remind them of a parallel.
    • Use a rhyme or a riddle.
    • Remind them of the rule or derivation method.
    • Hurl a joking insult (thanks to Andrew Field for this idea).

    If you hold in the front of your mind that your main aim is to heighten students’ curiosity, you’ll soon develop your own preferred way of doing it.