Please join us on Thursday 3 Jun, 16:30 – 18:00 (BST) where Paul and I will be giving a webinar on the first section of our book. Find out more here.
How Learning Happens: How Does Our Brain Work?
Thu 3 Jun 2021 16:30 – 18:00 (BST)
Professor Paul A. Kirschner
Dr. Carl Hendrick
Online webinar – £20
This 90-minute session aims to give attendees a basic understanding of how the brain learns, drawing on important research from cognitive psychology. As John Sweller noted, “without an understanding of human cognitive architecture, instruction is blind”.
In this webinar we will discuss the first section of our book ‘How Learning Happens’, looking at how our brains work and what that means for learning and teaching. We explain why students learn some things almost effortlessly without instruction, while other things are learnt with great difficulty through instruction, how our memory works and how we can make it work better, how we (learn to) solve problems, how and why images and words together can help us learn better, and why children should not be taught as if they were small adults. Among the topics covered are:
- What is cognitive load theory and how can I use it to inform my teaching?
- What do I need to know about short- vs long-term memory?
- How do novices and experts learn differently?
- What is the difference between biologically primary and secondary knowledge?
Paul and Carl will give a 60-minute presentation about how our brain works which will be followed by a 30-minute session where they’ll answer any and all questions that you may have.
Dr. Carl Hendrick is a teacher with 15 years of classroom experience, is co-author of ‘How Learning Happens’ and holds a PhD in education.
Professor Paul Kirschner is emeritus professor of educational psychology, instructional designer and ex-teacher of maths and sciences, and is co-author of ‘How Learning Happens’.
How Learning Happens introduces 28 giants of educational research and their findings on how we learn and what we need to learn effectively, efficiently, and enjoyably. Many of these works have inspired researchers and teachers all around the world and have left a mark on how we teach today.
I’ve written a new piece with Dr Jim Heal (Deans for Impact) for the Chartered College of Teaching journal, Impact.
One of the difficulties with determining what is effective in a classroom is that very often, what looks like it should work does not and vice versa. Take, for example, the notion of engagement. On the surface, this would seem like a necessary condition for learning. However, there is some evidence that it may not be a sufficient one. Graham Nuthall explores this dichotomy in his seminal book The Hidden Lives of Learners (Nuthall, 2007, p. 24):
“Our research shows that students can be busiest and most involved with material they already know. In most of the classrooms we have studied, each student already knows about 40–50% of what the teacher is teaching.”
Nuthall’s work highlights the fact that students are keen to busy themselves doing tasks that give the appearance of learning but which actually might just be disposable activities that do not engender long-lasting and durable learning. In addition, there is the fact that students’ prior knowledge about a particular domain is really the key element in terms of how their learning will progress. As Tricot and Sweller (2014, p. 9) point out, ‘when performing a cognitive task requiring domain specific knowledge, the presence or absence of this knowledge is the best predictor of performance’. Without knowing what students know or don’t know, teachers might not be putting students in a place where they are transforming their understanding of a particular concept, and what might look like learning might in fact just be keeping busy.
Delighted to announce that the American edition of ‘What Does This Look Like in the Classroom?’ has been released this week with a new foreword by Dylan Wiliam which you can read here.
The book is the third volume in @Learn_Sci’s Dylan Wiliam Center Collection.
In 1999, Paul Black and I were working with a group of math and science teachers. We had just completed a major review of the research on the impact of assessment on learning, and we had published our findings in a rather dense 65-page academic journal article.1 However, since we thought our findings would be of interest to practitioners and policy-makers, we also wrote a more accessible summary of our research, and its implications, which was published in Phi Delta Kappan magazine.2
One of the most surprising findings of our review, which we were sharing with the teachers, related to research on feedback. A particularly comprehensive review of feedback in schools, colleges and workplaces by two American psychologists—Avraham Kluger and Angelo DeNisi—had found that while feedback was often helpful in improving achievement, in 38% of the well-designed studies they had found, feedback actually lowered performance.3 In other words, in almost two out of every five cases, the participants in the research studies would have performed better if the feedback had not been given at all.
In trying to make sense of their findings, Kluger and DeNisi suggested that feedback was less effective when it was focused on the individual (what psychologists call “ego-involving”) and more effective when it was focused on the task in which the students were engaged (“task-involving”). We therefore suggested to the teachers that to make their feedback to students more effective, they should give task-involving rather than ego-involving feedback.
Most of the teachers seemed to find this advice useful, but one teacher, after some thought, asked, “So does this mean I should not say ‘well done’ to a student?” Paul and I looked at each other, and realized that we didn’t know the answer to the question. We knew, from the work of a number of researchers, that in the longer term, praise for effort would be more likely to be successful than praise for ability. However, without knowing more about the relationship between the teacher and the student, about the context of the work, and a whole host of other factors, we could not be sure whether “Well done” would be task-involving or ego-involving feedback.
What is ironic in all this is that we had failed to take the advice we had given teachers a year earlier in the Phi Delta Kappan article, where we said,
if the substantial rewards promised by the research evidence are to be secured, each teacher must find his or her own ways of incorporating the lessons and ideas set out above into his or her own patterns of classroom work and into the cultural norms and expectations of a particular school community. (p. 146)
The important point here is that the standard model of research dissemination, where researchers discover things, and then tell teachers about their findings, so that teachers can then implement them in their own classrooms, simply cannot work. As Carl Hendrick and Robin MacPherson point out in this book, classrooms are too complex for the results of
research studies to be implemented as a series of instructions to be followed. Rather, the work that teachers do in finding out how to apply insights from research in their own classrooms is a process of creating new knowledge, albeit of a distinct, and local kind. This is why this book is so unusual and important. It is not an instruction manual on how to do “evidence-based teaching” (whatever that might mean). It is, instead, an invitation to every educator to reflect on some of the most important issues in education in general—and teaching in particular—and to think about how educational research might be used as a guide for careful thinking about, and exploration of, practice.
A second unusual—and welcome—feature of this book is the way it was put together. Carl and Robin started by identifying a number of issues that are relevant to every teacher—student discipline and behavior, motivation, grading, classroom practice, reading, inclusion, memory, technology and so on. At this point, most authors would have written advice for teachers on these issues, but of course, the danger with such an approach is that it reflects the concerns of the authors, rather than of the potential reader—what philosopher Karl Popper described as “unwanted answers to unasked questions.”4
Instead, in a novel twist, Carl and Robin decided to ask practicing teachers what were for them the most important questions in each of the areas. Then, again counter to what most writers would have done, they posed the questions to both academic experts and those with expertise in classroom practice. The result is a marvelous combination of insights into teaching that are both authoritative and immediately relevant to classroom practice.
If you have read the previous volumes in the “Dylan Wiliam Center Collection” —Craig Barton’s How I wish I’d taught maths and Tom Sherrington’s The learning rainforest—you will know that our aim has been to bring to North American readers the very best of authoritative writing on education from around the world. While the questions that were posed by the teachers that Carl and Robin worked with may appear to be focused on issues that are of particular concern to teachers in England, the extraordinary range of expertise of those responding to these questions means that the answers are relevant to every American educator. The very best writing on educational research, in an accessible form—solutions you can trust.
1 Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy and Practice, 5(1), 7-74.
2 Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80(2), 139-148.
3 Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: a historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119(2), 254-284.
4 Popper, K. (1992). Unended quest: An intellectual biography (p. 41). London, UK: Routledge.
In our new book ‘What Does This Look Like in the Classroom?’ we interviewed Dylan Wiliam on how to implement research on assessment in the classroom.
A central problem in the area of assessment in the classroom has been in the way we often confuse marking and feedback. As Dylan Wiliam points out in our discussion, there is an extraordinary amount of energy expended by teachers on marking and often very little to show for it in the way of student benefit. Although feedback is one of the most effective drivers of learning, one of the more surprising findings is that a lot of it actually has a negative effect on student achievement.
A set of marked books is traditionally seen as an effective proxy for good teaching but there is a lot of evidence to say that this might not always be the case. This problem is on a scale that might surprise a lot of people:
Dylan: I once estimated that, if you price teachers’ time appropriately, in England we spend about two and a half billion pounds a year on feedback and it has almost no effect on student achievement.
Certainly students need to know where they have misconceptions or make spelling errors, and correcting those is important. Doing so also provides a useful diagnostic for teachers to inform what they will teach next, but the written comments at the end of a piece of work are often both the most time-consuming and the most ineffective. For example, take the following typical comments on a GCSE English essay:
- Try to phrase your analysis of language using more sophisticated vocabulary and phrasing.
- Try to expand on your points with more complex analysis of Macbeth’s character.
This is a good example of an assessment practice where the feedback mainly focuses on what was deficient about the work, which, as Douglas Reeves notes, is “more like a post-mortem than a medical.” The other problem is that it doesn’t really tell the student what they need to do to improve. What is more useful to the student here: receiving vague comments like these, or actually seeing sophisticated vocabulary, phrasing and analysis in action? It’s very difficult to be excellent if you don’t know what excellent looks like.
Often, teachers give both a grade and comments like those above to students, hoping that they somehow improve by the time their next piece of writing comes around a week later and then berate the student when, lo and behold, they make the same mistakes again. Perhaps part of the problem here is that we have very low expectations of what students are willing to do in response to a piece of work and do not afford them the opportunity to engage in the kind of tasks that might really improve their learning.
To address this problem, Dylan advocates a much more streamlined model of marking that is not only more manageable for teachers, but also allows students to have more ownership over the process:
Dylan: I recommend what I call ‘four quarters marking.’ I think that teachers should mark in detail, 25% of what students do, should skim another 25%, students should then self-assess about 25% with teachers monitoring the quality of that and finally, peer assessment should be the other 25%. It’s a sort of balanced diet of different kinds of marking and assessment.
Dylan Wiliam’s Four Quarters Marking (Oliver Caviglioli)
After producing a piece of work, instead of using abstract skills-based success criteria, it is probably more powerful for students to have access to a bank of exemplar essays or worked solutions so they can see concrete examples of success against which to self-assess their own work. Marking everything in sight and leaving detailed comments is now an established cultural norm, but this practice doesn’t appear to be based on good evidence. We know, for example, that many students will look at a grade and not engage with the feedback – but is that feedback always useful anyway?
As we discuss in the book, a common issue we see again and again in using research in the classroom is the ‘Chinese whispers effect’, where, by the time evidence works its way down to the level of the classroom, it’s a pale imitation of its original form. This is especially prevalent in the area of marking, where convoluted policies such as triple marking are enacted as a means of raising pupil achievement when often all they do is increase teacher workload. As Dylan Wiliam reminds us, “feedback should be more work for the recipient than the donor,” but how do you change a culture that has traditionally been the opposite?
Dylan: In terms of what we do about this, I would say first of all, headteachers should lay down clear expectations to parents and say things like, “We are not going to give detailed feedback on more than 25% of what your child does. The reason for that is not because we’re lazy. It’s because there are better uses we could make of that time. We could mark everything your child does, but that would lead to lower quality teaching and then your child will learn less.” Heads have to establish those cultural norms. If a teacher is marking everything your child does, it’s bad teaching. It is using time in a way that does not have the greatest benefit for students.
As a profession, we are, to some extent, our own worst enemy. Using marking policies that have little impact on student achievement and a negative impact on teacher workload and morale makes little sense. By adopting an approach like four quarters marking, we might go some way towards addressing this issue and, at the same time, give students more ownership over their own learning.
‘What Does This Look Like in the Classroom?’ is out later this month.
On an indecently hot day in Texas, Professor Jerry B. Harvey was visiting his wife’s family when his father-in-law suggested they visit a new restaurant in the town of Abilene, to which his wife exclaimed, “sounds like a great idea.” Harvey had reservations, however, as a 53-mile trip in a car with no air-conditioning sounded terrible to him, but, not wanting to rock the boat, he also proclaimed this a good idea and asked his mother-in-law if she wanted to go. As she was now the only one in the group who had not yet expressed agreement with this “great idea,” she also said they should go, and so they began their journey to Abilene. However, as Harvey explains, the trip was not a success:
My predictions were fulfilled. The heat was brutal. Perspiration had cemented a fine layer of dust to our skin by the time we had arrived. The cafeteria’s food could serve as a first-rate prop in an antacid commercial.
Some four hours and 106 miles later, we returned to Coleman, hot and exhausted. We silently sat in front of the fan for a long time. Then to be sociable, I dishonestly said, “It was a great trip wasn’t it?”
No one spoke.
After a while, his mother-in-law admitted that she never really wanted to go but only did so because she thought everyone else wanted to and didn’t want to cause a fuss, to which his wife protested that she never really wanted to go either, which then led to a volley of argument. Eventually his father-in-law broke the silence and exclaimed in a long Texas drawl: “Shee-it. Listen, I never wanted to go to Abilene. I just thought you might be bored. You visit so seldom I wanted to be sure you enjoyed it. I would have preferred to play another game of dominoes and eat the leftovers in the icebox.” This experience led to Harvey coining the term ‘the Abilene paradox’ to explain a curious aspect of group dynamics in which the opposite of what everyone wants is tacitly created by a group who think they are agreeing with what everyone else wants.
After the outburst of recrimination we all sat back in silence. Here we were, four reasonably sensible people who, of our own volition, had just taken a 106-mile trip across a godforsaken desert in a furnace-like temperature through a cloud-like dust storm to eat unpalatable food at a hole-in-the-wall cafeteria in Abilene, when none of us had really wanted to go. In fact, to be more accurate, we’d done just the opposite of what we wanted to do. The whole situation simply didn’t make sense.
The Abilene paradox lies in the fact that we have problems not with disagreement, but rather with agreement. It is characterised by groups of people in an organisation privately agreeing that one course of action makes sense but failing to properly communicate those views, then collectively stumbling towards what they think is the right course of action or what everyone else wants. Eventually an inaccurate picture of what to do emerges and, based on that, the organisation takes steps towards actions that nobody really wants and which are ultimately counterproductive to the aims of the organisation itself.
You can witness the Abilene paradox at work in many schools. Often this takes the form of ill-considered marking policies which increase teacher workload to the point of exhaustion, endless tracking and monitoring of students, behaviour policies which punish the teacher more than the student who misbehaves, and graded lesson observations where teachers abandon what they normally do to put on a one-off, all-singing, all-dancing lesson for the observer, because that’s what everyone thinks inspectors want.
A lot of this can be accounted for by innate cognitive biases such as groupthink, but it can also be exacerbated by either poor evidence, as in the case of learning styles, or a poor understanding and misappropriation of good evidence, as in the case of formative assessment. With the emergence of a solid evidence base, we might just be able to defend ourselves from these kinds of cognitive biases, if that evidence is communicated clearly and appropriated effectively as part of a broader discussion about the values of a school. At its best, good evidence can act as a bulwark against the tsunami of nonsense that has so often washed over our schools. If we fail to have these important discussions and simply go with what we think might work, then we are at risk of loading the entire staff onto the school minibus and heading off to Abilene.
This is an excerpt taken from the forthcoming book ‘What Does This Look Like in the Classroom? Bridging the Gap Between Research and Practice’.
I was asked to write a piece for the Guardian Teacher network on what books teachers should read. You can read it here.
1. Motivation doesn’t always lead to achievement, but achievement often leads to motivation.
While there is a strong correlation between self-perception and achievement, and we tend to think of it in that order, the actual effect of achievement on self-perception is stronger than the other way round (Guay, Marsh and Boivin, 2003). It may well be the case that using time and resources to improve student academic achievement directly is a better agent of psychological change than psychological interventions themselves. Daniel Muijs and David Reynolds (2011) note that:
At the end of the day, the research reviewed shows that the effect of achievement on self-concept is stronger than the effect of self-concept on achievement.
Despite this, a lot of interventions in education seem to have the causal arrow pointed the wrong way round. Motivational posters and talks are often a waste of time and may well give students a deluded notion of what success actually means. In my experience, teaching students how to write an effective introduction to an essay through close instruction, careful scaffolding and then praising their effort in getting there is a far more effective way of improving confidence than showing them a TED talk about how unique they are.
2. Just because they’re engaged doesn’t mean they’re learning anything.
One of the slides from a talk that has stuck with me the most in recent years was this one from Professor Rob Coe, in which he criticised graded lesson observations and highlighted several performance indicators for learning that are actually very misleading:
This again is quite a counterintuitive claim. Why is engagement such a poor proxy indicator – surely the busier they are, the more they are learning? This paradox is explored by Graham Nuthall in his book ‘The Hidden Lives of Learners’ (2007), in which he writes:
“Our research shows that students can be busiest and most involved with material they already know. In most of the classrooms we have studied, each student already knows about 40-50% of what the teacher is teaching.” p.24
Nuthall’s work shows that students are far more likely to get stuck into tasks they’re comfortable with and already know how to do as opposed to the more uncomfortable enterprise of grappling with uncertainty and indeterminate tasks.
3. Marking and feedback are not the same thing.
This subtle difference may seem semantic but there is an important distinction to be made. The value in marking a piece of work may counterintuitively be of more benefit to the teacher than the student as David Didau explains:
While there’s no doubt that marking and feedback are connected, they are not the same. In some parts of the world – Japan for instance – teachers do very little marking but that’s not to say students are not getting feedback. From my own experience, I’m pretty sure it’s possible to make marks in students’ books without providing anything in the way of useful feedback and of course lots of thinking (some of it disastrous) has been done to try to prevent this from happening. Ask any group of teachers if their marking load has increased dramatically in the past five years and they’ll fall over themselves to let you know just how much impact marking has on their lives, but what impact does it have on students’ outcomes? The answer is, we just don’t know.
4. Feedback should be more work for the recipient than the donor.
Possibly the most damaging misappropriation of research in my career has been the mangling of Assessment for Learning – a quagmire from which we are only now beginning to emerge. Not long after Black and Wiliam’s seminal 1998 ‘Inside the Black Box’ was adopted at a national level, school leaders and policy makers managed to twist it into a pale imitation of its original form, as AFL became about students memorising what level they were working at and teachers marking books at a level that defied sense in order to show ‘evidence’ of learning. But for feedback to be truly meaningful to students, they need to take ownership of it, which may well mean not giving levels to a piece of work at all and instead just leaving comments for the student to reflect and act upon. As Dylan Wiliam writes:
Robyn Renee Jackson suggests that one of the most important principles for teachers is “Never work harder than your students” (Jackson, 2009). I regularly ask teachers whether they believe their students spend as long processing feedback as it takes for the teacher to provide it. Few teachers say yes. We spend far too much time giving feedback that’s either completely ignored or given scant attention.
5. (a) The steps needed to achieve a skill may look very different to the final skill itself.
If you want to get good at a certain skill then surely the best way is to practise that particular skill, right? Well, not according to the tenets of deliberate practice, which assert a more indirect approach: break a global skill down into its constituent local parts and focus on specific feedback and incremental improvement, rather than on a set of assessment criteria/performance descriptors that are “aimed at some vague overall improvement” (Ericsson). In her book ‘Making Good Progress’, Daisy Christodoulou writes:
Whilst skills such as literacy, numeracy, problem solving and critical thinking are still the end point of education, this does not mean that pupils always need to be practising such skills in their final format. Instead, the role of the teacher and indeed, the various parts of the education system, should be to break down such skills into their component parts, and to teach those instead. This means that lessons may look very different from the final skill they are hoping to instil. For example, a lesson which aims to teach pupils reading may involve pupils learning letter-sound correspondences. A lesson with the ultimate aim of teaching pupils to solve maths problems may involve them memorising their times tables. The idea here is that the best way to develop skills does not always look like the skill itself.
5. (b) There is no such thing as developing a ‘general’ skill.
Of course, critical thinking is an essential part of any student’s mental equipment. However, it cannot be detached from context. Teaching students generic ‘thinking skills’ separate from the rest of the curriculum is often meaningless and ineffective. As Daniel Willingham puts it:
[I]f you remind a student to ‘look at an issue from multiple perspectives’ often enough, he will learn that he ought to do so, but if he doesn’t know much about an issue, he can’t think about it from multiple perspectives … critical thinking (as well as scientific thinking and other domain-based thinking) is not a skill. There is not a set of critical thinking skills that can be acquired and deployed regardless of context.
This detachment of cognitive ideals from contextual knowledge is not confined to the learning of critical thinking. Some schools laud themselves for placing ‘21st-century learning skills’ at the heart of their mission, but without anchoring them in domain-specific contexts, they are often a waste of time. Anders Ericsson develops this point:
This explains a crucial fact about expert performance in general: there is no such thing as developing a general skill. You don’t train your memory; you train your memory for strings of digits or for collections of words or people’s faces. You don’t train to become an athlete; you train to become a gymnast or sprinter or a marathoner or a swimmer or a basketball player. You don’t train to become a doctor; you train to become a diagnostician or a pathologist or a neurosurgeon.
It’s a well-observed truth that, because everyone has had an education, everyone feels well placed to comment on all aspects of education. Often that takes the form of “My experience of education was like this, so all education should be more/less like that.” This often finds its purest expression in mainstream journalists giving answers to questions nobody asked them and delivering a state-of-the-union address on education anyway.
In a recent article in the Times, Caitlin Moran decided to offer herself up as education secretary and outlined her vision:
My plan is very straightforward, and rests on two facts: (1) the 21st-century job market requires basically nothing of what is taught in 21st-century schools, and (2) everyone has a smartphone.
First, as anyone with a teenage/young adult child will know, the notion of them going into a full-time, long-term job with a pension at the end of it looks like something we left behind in the 20th century. The old pathway – learn a skill, use it for 40 years, then retire – is over. The jobs of the future require flexibility and self-motivation. Indeed, the jobs of the future increasingly require you to invent your own job. The majority of jobs our children will have – in just a few years’ time – have almost certainly not been invented yet.
Those of you playing TED-talk education cliché bingo will probably already be screaming “full house!”, but hold your horses folks…
If you work better sleeping until noon, then working until 2am – as most teenagers do – congratulations! You no longer have to deny your own biology! And if, working at 2am, you have no teacher to help you, discovering how to research on your own is, frankly, going to be far more useful than the thing you’re supposed to be learning. Because, (2) my education policy would be to stop bothering kids with anything you can access on a smartphone.
This kind of pseudo-futurist, Waitrose Organic aisle philosophy is often found in self-made individuals who eschewed school to admirably forge a successful career in the communications industry. Fine for those outliers but unconscionable to advocate that for the vast majority of children from less privileged backgrounds for whom good A level results can be life changing.
Apart from the fact there are well documented problems with the assertion that Google can replace human knowledge, claiming that we should just depend on our smartphones for learning is a dangerous idea as Paul Kirschner and Mirjam Neelen note:
All this leads us to conclude that if we don’t have knowledge and only depend on the information in the cloud, what will happen is what William Poundstone articulates in the Guardian: The cloud will make us mega-ignorant: unaware of what we don’t know.
There is also the potentially damaging impact of celebrities claiming that school doesn’t matter. One can’t help but stifle a chuckle/grind one’s teeth when we hear auto-satirist Jeremy Clarkson tell A level students not to worry about their results because it all worked out fine for him. Again, fine for those with great fortune or rare abilities, but very dangerous to mandate as a broad system of education.
Behind these risible views, however, lies a dangerous conceit; namely, that the purpose of education is merely to get you a job, and not just any job, but a job that doesn’t exist yet. These phantasmic jobs often focus on alliterative groupings such as “collaboration, creativity and connectivity” (something we are evolved to do anyway) and are positioned in opposition to the cruelty of 20th-century education, which apparently was some kind of mass conspiracy designed to create a global village of the damned.
However, far from being the so-called “factory model” bastille limiting children’s potential, 20th-century schooling is perhaps the high-water mark for education. Class sizes were reduced, more students with special needs were included in mainstream education, minimum teacher standards were introduced and more children had access to quality education than at any time in human history. Astonishingly, from 1900 to 2015, rates of global literacy increased from 21 percent to 86 percent.
Behind many of these claims is also a barely concealed contempt for the teaching profession. If Moran’s education policy were to be enacted and we were to “stop bothering kids with anything you can access on a smartphone”, then the role of teacher would be obsolete. Teachers are the most expensive commodity in the 21st-century learning paradigm, so it would make sense to replace them with technology, something these evangelists rarely say explicitly. At a time when school funding is being cut to almost unmanageable levels, the emancipatory claims of 21st century learning should be viewed with forensic scepticism.
Of course we should prepare students for an uncertain future, but if we adopt the techno-evangelist disruptive model and view education as merely a utilitarian enterprise for the 21st-century workplace, then we truly will enact a “factory model” of schooling and, furthermore, diminish the gift of knowledge for its own sake. Students should study Shakespeare not because of what job it might get them but because it is an anthropological guidebook that tells them how to live.
“Any comfortable American who is cynical of progress – or the competent decency of modern civilisation – hasn’t pondered how life was for our ancestors. Any day that Cossacks haven’t burned your home should start out a happy one, overflowing with optimism.”
– M. N. Plano
It’s a feature of every generation to believe that the present is never as good as the past. Indeed, the last few years have seen many rise to power on the back of this dangerous fallacy. Doom-mongers and eschatologists would have us believe that we are hurtling towards a post-apocalyptic future, driven by an ever more violent, divided and psychopathic society.
Now, I’m not naturally given to optimism, but after reading Johan Norberg’s book ‘Progress’ I’m persuaded by some inescapable truths suggesting that, far from living in end times, we’ve actually never had it so good.
For example, if you were a 10-year-old living 150 years ago, you would probably already have experienced the death of more than one immediate family member. You would have been sent to work in horrific conditions soon after that, and average life expectancy would suggest you had only another 20 miserable years left on the planet.
In 1900, average world life expectancy was 31 years; by 2016 it was 71. Remarkably, this increase has occurred in only the last four of roughly 8,000 generations of Homo sapiens. In anthropological terms, if you’re reading this you’ve not just won the lottery, you’re winning it every day.
For the majority of human existence, people lived truly wretched lives as hunter-gatherers, but in the 19th century the Industrial Revolution gave rise not just to huge population growth but to a concomitant economic prosperity, as Norberg points out: “Between 1820 and 1850, when the population grew by a third, workers’ real earnings rose by almost 100 per cent.” And in the 20th century, the advent of open markets and free trade did not confine this wealth to the West: “Since 1950, India’s GDP per capita has grown five-fold, Japan’s eleven-fold and China’s almost twenty-fold.”
Indeed, as a recent article points out, the number of people living in extreme poverty is plummeting by an average of more than 100,000 people per day:
There were nearly 1.1 billion people on earth in 1820, with about 1 billion of those individuals living in what is considered extreme poverty, according to World Bank economists François Bourguignon and Christian Morrisson. By the 1990s, the world’s population had exploded to over 5 billion, with around 2 billion living in extreme poverty.
From 1990 to 2015, the number of people worldwide in dire poverty shrank to 750 million. Put differently, that figure has, on average, fallen by 137,000 people every day over the last 25 years.
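The “137,000 people every day” figure is a straightforward back-of-the-envelope calculation from the numbers quoted above (2 billion in 1990, 750 million in 2015); a quick sketch:

```python
# Rough check of the "137,000 people per day" decline in extreme poverty.
# Figures from the passage: ~2 billion in extreme poverty in 1990,
# falling to ~750 million by 2015.
start_poor = 2_000_000_000   # 1990 estimate
end_poor = 750_000_000       # 2015 estimate
days = (2015 - 1990) * 365   # ignoring leap days

daily_decline = (start_poor - end_poor) / days
print(f"{daily_decline:,.0f} people per day")  # ≈ 136,986
```

The result is approximate by construction, but it lands squarely on the article’s figure.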
In terms of violence, it is a commonly held belief that the 20th century was the bloodiest on record, but once huge population increases are taken into account, this simply isn’t true. For example, the Turco-Mongol conqueror Timur Lenk killed, proportionally, almost as many in central Asia in the 14th century as Hitler, Stalin and Mao combined. The latest US figures, for 2013, show that murder rates are lower now than in the early 1960s. In the same year, homicides in Japan hit a post-war low. In England and Wales the level of violence has dropped by 66 per cent since the latest peak in 1995.
In fact, as Yuval Noah Harari points out, our mortality would seem to be very much in our own hands. In many tragic cases, quite literally:
“In 2012 about 56 million people died throughout the world; 620,000 of them died due to human violence (war killed 120,000 people, and crime killed another 500,000). In contrast, 800,000 committed suicide, and 1.5 million died of diabetes. Sugar is now more dangerous than gunpowder.”
In education, increases in provision and access have driven a simply astonishing rate of improvement: from 1900 to 2015, global literacy rose from 21 percent to 86 percent of the world’s population. This rise has mainly occurred in countries that have rejected authoritarian regimes and superstitious religious dogma in favour of the liberal democratic values of equality, humanism and personal liberty. Further good news is that current rates of growth could see us turn our attention to solving seemingly intractable problems, provided certain political movements are challenged sufficiently, as Norberg illustrates:
If annual global economic growth remains around two percent per head, the average person in 100 years’ time will be around eight times richer than today’s average person. With those resources, the level of scientific knowledge, and the technological solutions that may then be at our disposal, many of the problems that intimidate us today will be much easier to handle—from adapting to warming to taking CO2 out of the atmosphere.
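Norberg’s “around eight times richer” follows from simple compounding; a quick sketch of the arithmetic, using the two per cent rate stated in the quote:

```python
# Compound growth: the income multiplier after `years` of growth at `rate`.
def multiplier(rate: float, years: int) -> float:
    return (1 + rate) ** years

# Two per cent per head for 100 years, the quote's own assumption:
print(round(multiplier(0.02, 100), 2))  # 7.24, i.e. roughly eight-fold
```

The exact multiplier is closer to seven; at a slightly higher rate of around 2.1 per cent it reaches eight, so the quote’s “around eight times” is a fair rounding.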
Perhaps our current apocalyptic malaise is a result of the fact that when extreme poverty is off the table, humans have more time to address social injustice and inequality, and although there is still quite a way to go, the gains over the last 100 years have been nothing short of remarkable. The fact that a black man can now marry a white man in many Western societies is testament to how far we’ve come. The other major factor is the rise of social media and a 24-hour news culture predisposed to report what’s going wrong rather than what’s going right. What Norberg’s book shows is that progress is not dramatic and therefore often not newsworthy.
Although there are many challenges that we are right to be worried about, for example ongoing geopolitical tensions, a worrying trend in mental health and the disappearance of jobs due to automation, the recent past suggests that we always find solutions. We are less aware of progress than we are of disasters because progress is a glacial process that often happens under the radar and is so complex that it resists simplistic narratives. As Bill Hicks reminds us, modern media is often based on sensationalism and scare tactics but when you take a wider historical viewpoint, there’s never been a better time to be alive.
Writing in 1985, Neil Postman made the interesting observation that many of the first fifteen U.S. presidents could have walked down the street without being physically recognised, yet they would have been instantly identifiable by things they had written or speeches they had delivered. Today the opposite is true.
Postman saw 1980s America as a world that celebrated the transient and the superficial, where the power of the written word as a space in which to formulate and expand complex arguments gave way to impressionistic understanding and surface engagement. He claims the 19th century was the apogee of human thought, in which the act of undistracted reading in particular represented “both their connection to and their model of the world.” He saw the advent of mass-entertainment TV in the 1980s as a tipping point, characterised by an over-saturation of information to the point of total distraction:
“Information is now a commodity that can be bought and sold, or used as a form of entertainment, or worn like a garment to enhance one’s status. It comes indiscriminately, directed at no one in particular, disconnected from usefulness; we are glutted with information, drowning in information, have no control over it, don’t know what to do with it.”
What’s remarkable about Postman’s dystopian vision is that it was written before the advent of the Internet. The dizzying amount of information now produced daily would have been inconceivable to him yet many of its stupefying effects would have been instantly recognisable. If TV was having this effect on adults 30 years ago, what impact is the Internet having on young people today?
An important new study on student use of technology has shown that students who have access to the Internet in the classroom are being distracted out of learning in any meaningful way. Students who used laptops in lessons voluntarily logged into a proxy server that monitored their in-class behaviour, and researchers found that “the average time spent browsing the web for non-class-related purposes was 37 minutes. Students spent the most time on social media, reading email, shopping for items such as clothes and watching videos.”
An essential point to make here is that an adult using the Internet is not the same as a 15-year-old using it. Most adults have developed schemas of knowledge that allow them to navigate the great highways of the Internet, identify subtle exits, negotiate fruitful side-roads and avoid potential dead ends. Asking kids to ‘research a topic on the Internet’ is like dropping a five-year-old on a motorway and expecting them to find their way home on their tricycle.
An older friend of mine who is a history teacher remarked to me recently that it was about 15 years ago that students who were asked to write assignments on Martin Luther King started handing in essays on the American civil rights leader who nailed 95 theses on a church door in Wittenberg in the 16th century and subsequently led the Protestant Church movement until his tragic assassination in 1968. Simply unleashing kids on the Internet with the vague justification of ’21st century skills’ is not just largely ineffective but a dereliction of duty.
Many techno-evangelists cite the vague concept of ‘creativity’ as a justification for the Internet in our classrooms but as Andrew Keen presciently pointed out ten years ago, as ‘truth’ becomes a relative term, not only is real creativity threatened but the wider implications for society are perhaps far more concerning:
This undermining of truth is threatening the quality of civil public discourse, encouraging plagiarism and intellectual property theft, and stifling creativity. When advertising and public relations are disguised as news, the line between fact and fiction becomes blurred. Instead of more community, knowledge, or culture, all that Web 2.0 really delivers is more dubious content from anonymous sources, hijacking our time and playing to our gullibility.
In a recent New York Times opinion piece, Darren Rosenblum noted that when teaching a unit on what he thought would be the engaging topic of sexuality and the law, his attempts to provoke discussion with his students were met with a “slew of laptops staring back” at him. He subsequently asked students not to bring laptops and created what he felt was a more human connection, which in turn led to a better environment for learning: “Energized by the connection, we moved faster, further and deeper into the material.”
Laptops at best reduce education to the clackety-clack of transcribing lectures on shiny screens and, at worst, provide students with a constant escape from whatever is hard, challenging or uncomfortable about learning.
And that’s the thing about learning: it should be hard. It should be initially challenging and uncomfortable in the short term in order to be effective in the long term, but the ubiquitous and pervasive nature of the Web 2.0 Internet has produced some disturbing effects on young people. As Simon Sinek has recently pointed out, there are worrying trends among the millennial generation, who are often characterised by chronically low self-esteem (facilitated, he claims, by failed parenting strategies, among other things), who now look to Facebook and Instagram approval rather than human interaction to fix their lack of self-worth, and who are exhibiting behaviours that are profoundly addictive in nature.
It’s a bleak view of the future, often described as dystopian, but for Neil Postman there is an interesting distinction between the dystopian visions of Orwell’s ‘1984’ and Huxley’s ‘Brave New World.’ The former portrayed a bleak vision of oppressive state control in the form of Big Brother, which sought to actively ban expression and limit human agency; in ‘Brave New World’, however, there is a far more horrifying phenomenon at work:
In Huxley’s vision, no Big Brother is required to deprive people of their autonomy, maturity and history. As he saw it, people will come to love their oppression, to adore the technologies that undo their capacities to think. What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one.
Technology has afforded us some incredible opportunities for education, such as comparative judgement or the JSTOR Shakespeare digital library, and there are some good examples of purposeful, well-structured environments, but increasingly we are suffering from what Sartre called ‘the agony of choice’ as we become more and more connected to the Internet of Things. Allowing kids to browse the Internet in a lesson and then expecting that they will work productively is like bringing them to McDonald’s and hoping they’ll order the salad.
Techno-evangelists have sold us the Internet as a form of emancipation, freeing us from the ‘factory model’ of education (despite the fact that the evidence for this is thin), but often technology in education seems to represent a solution in search of a problem, one that actually gets in the way of learning. Perhaps the most liberating and empowering thing we can do for young people today is to create a space for them where they can read the great works of human thought undisturbed and without distraction, for at least a short while.