• By putting the world’s library at our fingertips, search engines have revolutionised our lives, and both changed and challenged the nature and purpose of education.

    Why should I clutter my brain with information that is readily available from reference sources? – Albert Einstein.

    But the search companies have scarcely scratched the surface of what search technology can do. Google, in particular, has a programme called Google Labs which turns out a constant stream of mind-boggling innovations.

    One consequence of this is that our teaching of how to use search engines is permanently out of date. This isn’t just a matter of being ignorant of a few new features; the whole paradigm of how we gather and manipulate information is changing as we watch.

    The closest parallel I can think of is the introduction of calculators. At first, schools banned them. When they relented, they threw out paper-and-pencil arithmetic altogether and were left disempowered by the new devices. Finally, thirty years on, we have gained an appreciation of the calculator's strengths and weaknesses and can teach it in its proper context.

    My contribution to avoiding that same lost-generation syndrome hitting web search is this little project:

    Not only does it introduce students to geographical search, it also provides the stimulus for a discussion about the strengths and limitations of search technology generally.

    Suppose you were starting at a new school. How would you work out how to get there? Look up your own school on both Google Local and Multimap, get directions from home to school using each, and screenshot the map each one generates.

    • Are they the same or different?
    • Is either the journey you actually take to school?
    • If not, why not?

    Class discussion should not focus on which service is better, but rather

    • How can a machine work this out at all?
    • When it goes wrong, how does it go wrong?
    • What does it seem to find easy/difficult?

    For reference, here’s my journey from the Chalkface base in central Cambridge to Netherhall School, using each service.

    Both routes take me through bollarded roads accessible only to buses and taxis. More profoundly, neither engine appears to know that no one in their right mind uses a car in Cambridge; we all cycle, and we go by the most direct route.

  • If you’ve ever opened a brand new textbook to find an erratum slip, you will already be familiar with one of the core problems of book publishing. Once printed, your content is set in stone. Get on the web, problem goes away. Find an error tonight, it’s changed by morning.

    But what about things that aren’t so much errors as subtle weaknesses? The challenge isn’t so much to change them as to spot them in the first place. Educational publishers have long relied on our customers to challenge gross infelicities that have crept through our (usually excellent) quality control procedures. But what about subtly misleading wording? Generally, it’s been down to the teacher to cope with the confusion thus generated.

    Now, this too is starting to change. Tools are emerging which enable us to see exactly where we have inadvertently induced confusion instead of clarity. Yesterday, Miranda and I spent a productive couple of hours applying them to our bestselling Online ICT Assessment for KS3. The results were interesting.

    If you’re already a Yacapaca user, you’ll be familiar with the Analyse screen that allows you to see aggregated results of a particular assessment across a specific set or class. It’s very useful for spotting areas that need extra reinforcement, or where a class is carrying a specific misconception that needs addressing.

    Assessment authors have access to a similar screen, but one that shows results across the entire national cohort (for data protection reasons they can’t select individual schools or teachers, btw). A good question would have results looking something like this:

    • Key: 66%
    • Distractor 1: 17%
    • Distractor 2: 8%
    • Distractor 3: 9%
    • Timeout: 0%

    Two thirds of the cohort are getting it right, so we’ve pitched it at the right level for the students. Distractor 1, at 17%, represents a common misconception, while Distractors 2 and 3, at 8% and 9%, are plausible enough that students can’t simply guess the correct answer by elimination.

    Now take a look at this question:
    What did Tim do to make the first text example below look like the second example?
    (there’s an image of two pieces of text; the upper one is large and bold)

    • Key: changed the size and made it plain text: 36%
    • Distractor 1: made it italic and changed the size: 4%
    • Distractor 2: changed the font and made it plain text: 9%
    • Distractor 3: changed the size and made it bold: 50%
    • Timeout: 0%

    Clearly, with more students choosing Distractor 3 than the Key, there’s something seriously wrong here. But what? The key is in the pattern of responses. All but a few students know that Distractors 1 and 2 are incorrect; clearly they do know the subject area. The Key and Distractor 3 are pretty much mirror images of each other, and this is the clue we’re looking for. The text of the question is logically correct, but as students’ brains try to correlate ‘first’, ‘second’ and ‘below’ with a picture of two pieces of text, it’s a wonder they don’t go into trance.

    Now we’ve found the problem, we can solve it. Simply numbering the examples and removing the confusing reference to ‘below’ would probably be enough. In fact, we’ve decided to expunge it, along with a couple of other offenders, from the bank altogether when we introduce a new set in a couple of weeks.
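    For the curious, the rule of thumb we applied by eye can be sketched in a few lines of code. This is purely an illustration (the function and field names are my own invention, not anything in Yacapaca): flag any question where a distractor attracts at least as many responses as the key.

```python
# Hypothetical sketch: flag multiple-choice questions whose response
# pattern suggests flawed wording, as in the example above.
# Field names and the threshold are assumptions for illustration only.

def flag_suspect_questions(questions):
    """Each question is a dict with 'id', 'key' (% of students choosing
    the correct answer) and 'distractors' (% choosing each wrong answer).
    Returns (id, key %, worst distractor %) for each suspect question."""
    suspects = []
    for q in questions:
        worst = max(q["distractors"])
        # A distractor matching or beating the key signals trouble.
        if worst >= q["key"]:
            suspects.append((q["id"], q["key"], worst))
    return suspects

# The two questions discussed above:
bank = [
    {"id": "good_question", "key": 66, "distractors": [17, 8, 9]},
    {"id": "tim_text", "key": 36, "distractors": [4, 9, 50]},
]

print(flag_suspect_questions(bank))  # only 'tim_text' is flagged
```

    A stricter version might also flag questions where the key scores far above or below the two-thirds sweet spot, but even this crude test would have caught Tim’s text question automatically.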

    So far, we’ve only applied this philosophy to multiple choice, but you can imagine it being progressively applied across all electronic resources as we develop the means to do so. Rather than watching textbooks falling progressively out of date, you should expect your teaching resources to get steadily better with age.

  • Recruiting sergeants down the ages have known that young men come genetically preprogrammed to pledge their loyalty to the first vaguely plausible cause they are presented with, then go out to kill and maim in the name of it. Football clubs know that a stripey scarf is enough to trigger the same behaviour.

    Now we find even a website is enough. Yesterday, WikiTextbook was attacked and vandalised by a WikiBooks zealot. He or she hacked the site so that all homepage hits were redirected to wikibooks. The damage was reversed easily enough; wikis are famously robust when it comes to dealing with vandalism – there’s a brilliant screencast by Jon Udell if you’re interested.

    What I find cheering is that this is the first instance I’ve ever come across of textbooks (which is what both sites offer) arousing the same passions as Everton, or Queen and Country. So what’s changed? These textbooks aren’t better written or better illustrated than their paper competitors, after all. What they are, for the first time in history, is democratically controlled by their readers.

    That’s worth fighting for. Now all we have to do is persuade them not to attack members of their own side.

  • Two students reported having problems downloading their work this week. Our programmers are working to fix the problem, but they need more examples. If you see this problem, please use the Report Bug feature to let us know!

  • Chalkface alumnus Steve Eddy has gone live with his new venture Q&A Resources, an educational publishing cooperative. So far they have resources for A and AS Business Studies and Politics, and a range of English material. Congratulations Steve!