The Open-Source Curriculum
Michael Gove's full speech on what technology and digital resources can do for learning
Education secretary Michael Gove addressed the BETT 2012 educational technology show today to announce the government would be scrapping ICT from UK schools in its current form and opting to bring computer science and programming into the National Curriculum - the number one recommendation from last year's Livingstone-Hope report.
In his speech, presented here in full and written in consultation with Eidos life president Ian Livingstone, he addresses the need to reform current ICT, the inspirational initiatives that have been set up to further education regardless of government policy, praise for the UK video game industry and technology initiatives like Raspberry Pi, and the importance of giving schools the freedom to use open-source educational resources in class.
All around us, the world has changed in previously unimaginable and impossible ways. Most of us carry more advanced technology in the smartphone in our pocket than Neil Armstrong and Buzz Aldrin used to reach the Moon.
Every day we work in environments which are completely different to those of twenty-five or a hundred years ago.
Where once clerks scribbled on card indexes and lived by the Dewey Decimal system, now thousands of office workers roam the world from their desktop.
Where once car manufacturing plants housed lines of workers hammering and soldering and drilling, now a technician controls the delicate operations of a whole series of robots.
When I started out as a journalist in the 1980s, it was a case of typewriters and telexes in smoky newsrooms, surrounded by the distant clatter of hot metal.
Now newsrooms - and journalists - are almost unrecognisable, as are the daily tools of the trade. The telex machine became a fax, then a pager, then email. A desktop computer became a laptop computer. My pockets were filled with huge mobile phones, then smaller mobile phones, a BlackBerry, and now an e-reader and iPad.
And with each new gadget, each huge leap forward, technology has expanded into new intellectual and commercial fields.
Twenty years ago, medicine was not an information technology. Now, genomes have been decoded and the technologies of biological engineering and synthetic biology are transforming medicine. The boundary between biology and IT is already blurring into whole new fields, like bio-informatics.
Twenty years ago, science journals were full of articles about the 'AI Winter' - the fear that post-war hopes for Artificial Intelligence had stalled. Now, detailed computer models show us more than we ever imagined about the geography of our minds. Amazing brain-computer-interfaces allow us to control our physical environment by the power of thought - truly an example of Arthur C. Clarke's comment that any sufficiently advanced technology can seem like magic.
Twenty years ago, only a tiny number of specialists knew what the internet was and what it might shortly become. Now, billions of people and trillions of cheap sensors are connecting to each other, all over the world - and more come online every minute of every day.
Almost every field of employment now depends on technology. From radio, to television, computers and the internet, each new technological advance has changed our world and changed us too.
But there is one notable exception.
Education has barely changed
The fundamental model of school education is still a teacher talking to a group of pupils. It has barely changed over the centuries, even since Plato established the earliest "akademia" in a shady olive grove in ancient Athens.
A Victorian schoolteacher could enter a 21st century classroom and feel completely at home. Whiteboards may have eliminated chalk dust, chairs may have migrated from rows to groups, but a teacher still stands in front of the class, talking, testing and questioning.
But that model won't be the same in twenty years' time. It may well be extinct in ten.
Technology is already bringing about a profound transformation in education, in ways that we can see before our very eyes and in others that we haven't even dreamt of yet.
Now, as we all know, confident predictions of the technological future have a habit of embarrassing the predictor.
As early as 1899, the director of the U.S. Patent Office, Charles H. Duell, blithely asserted that "everything that can be invented has already been invented."
In 1943, the chairman of IBM guessed that "there is a world market for maybe five computers". The editor of the Radio Times said in 1936, "television won't matter in your lifetime or mine".
Most impressively of all, Lord Kelvin, President of the Royal Society, scored a hat-trick of embarrassing predictions between 1897 and 1899, declaring, "radio has no future", "X-rays are clearly a hoax" and "the aeroplane is scientifically impossible".
A new approach to technology policy
I don't aspire to join that illustrious company by stating on record that this technology or that gadget is going to change the world. Nothing has a shorter shelf-life than the cutting edge.
But we in Britain should never forget that one of our great heroes, Alan Turing, laid the foundation stones on which all modern computing rests. His pioneering work on theoretical computation in the 1930s laid the way for Turing himself, von Neumann and others to create the computer industry as we know it.
Another generation's pioneer, Bill Gates, warned that the need for children to understand computer programming is much more acute now than when he was growing up. Yet as the chairman of Google, Eric Schmidt, recently lamented, we in England have allowed our education system to ignore our great heritage and we are paying the price for it.
Our school system has not prepared children for this new world. Millions have left school over the past decade without even the basics they need for a decent job. And the current curriculum cannot prepare British students to work at the very forefront of technological change.
Last year's superb Livingstone-Hope Review - for which I would like to thank both authors - said that the slump in the UK's video games development sector is partly the result of a lack of suitably-qualified graduates. The review, commissioned by Ed Vaizey, who has championed the Computer Science cause in the Department for Culture, Media and Sport, found that the UK had been let down by an ICT curriculum that neglects the rigorous computer science and programming skills which high-tech industries need.
It's clear that technology is going to bring profound changes to how and what we teach. But it's equally clear that we have not yet managed to make the most of it.
Governments are notoriously flat-footed when it comes to anticipating and facilitating technical change. Too often, in the past, administrations have been seduced into spending huge sums on hardware which is obsolete before the ink is dry on the contract. Or invested vast amounts of time and money in drawing up new curricula, painstakingly detailing specific skills and techniques which are superseded almost immediately.
I believe that we need to take a step back.
Already, technology is helping us to understand the process of learning. Brain scans and scientific studies are now showing us how we understand the structure of language, how we remember and forget, the benefits of properly designed and delivered testing and the importance of working memory.
As science advances, our understanding of the brain will grow - and as it grows, it will teach us more about the process of education.