Wednesday, 7 January 2015

Gadgets have their place in education, but they’re no substitute for knowledge

The immense computing power we possess will only make learning easier if we acknowledge it will never make it effortless.


The children returning to school this week with their new Christmas gadgets don’t remember a world without smartphones, tablets, e-readers and laptops. Some believe this generation of digital natives is using technology in collaborative and social ways that will revolutionise learning. Others worry about the damage these devices are doing to children’s concentration spans and their ability to think deeply.

So what is the truth about technology and education? Is it better to read War and Peace on a Kindle or on paper? Or should we forgo 19th-century novels completely in favour of co-creating our own stories on Facebook? As a recent New Scientist article acknowledged, the rapid pace of technological change means large-scale studies of many of these issues are lacking. However, there is some reliable research.

For example, there’s good evidence that one of the most popular claims made for technology is false. Many – from headteachers to union reps to Today presenters – have claimed that the internet reduces the importance of knowing facts. Research from cognitive science shows otherwise: remembering facts is vital. When we think, we draw on both working memory and long-term memory. Long-term memory is vast, but working memory is limited to roughly four to seven items and is easily overloaded. By committing facts to long-term memory, we free up precious space in working memory to manipulate those facts and combine them with new ones.

That’s why it’s so important for pupils to learn their times tables: memorising them doesn’t stifle conceptual understanding but rather enables it. We also need a framework of facts in long-term memory to make sense of what we find on the internet; studies show that pupils frequently make errors when asked to look up unfamiliar knowledge. Long-term memory is not a bolted-on part of the mind that we can outsource to the cloud. It is integral to all our thinking processes; researchers even suggest it may be “the seat of human intellectual skill”.

While technology won’t remove the need for us to remember facts, it may make it easier for us to learn them. Another big insight from cognitive psychology is that we remember what we think about. In the words of Prof Dan Willingham of the University of Virginia, memory is the residue of thought.

At first sight, technology may not seem very helpful here. Pop-up message alerts, ever-changing websites and enticing hyperlinks make it very hard for us to think about what we are supposed to be thinking about. North American studies show that university students frequently multitask on their laptops during lectures, and that those who do so understand less than students who concentrate solely on the lecture. Even with the internet switched off, computers can still distract. Willingham gives an example of pupils working on a presentation about the causes of the Spanish civil war who spent most of their time experimenting with the different animations available in PowerPoint.

However, while distraction may hinder learning, technology doesn’t always have to lead to distraction. The striking thing about many computer games is that although they often involve quite monotonous tasks, they prove incredibly addictive: people playing Tetris don’t seem to struggle to ignore distractions. So one potential solution to the distraction problem is to design educational games that grab and hold attention in the same way that computer games do. Duolingo and Khan Academy, two popular learning apps, offer points and badges when pupils complete challenges. Bruno Reddy, a former head of maths at the highly successful King Solomon Academy, has developed Times Tables Rock Stars, a game in which pupils become “rock gods” if they answer maths questions in under a second.

Even in the case of such promising apps, we still have to put in time, effort and thought if we want to learn. Herbert Simon, a pioneer of cognitive psychology, wrote that “although we have a reasonable basis for hope that we may find ways to make learning processes more efficient, we should not expect to produce the miracle of effortless learning”. The immense computing power we possess definitely has the potential to make learning easier – but only if we acknowledge it will never make it effortless.