
The Abilene Paradox: Why Schools Do Things Nobody Actually Wants

On an indecently hot day in Texas, professor Jerry B. Harvey was visiting his wife’s family when his father-in-law suggested they visit a new restaurant in the town of Abilene, to which his wife exclaimed, “sounds like a great idea.” Harvey had reservations: a 53-mile trip in a car with no air conditioning sounded terrible to him. But not wanting to rock the boat, he also proclaimed this a good idea and asked his mother-in-law if she wanted to go. As she was now the only one in the group who had not yet expressed agreement with this “great idea,” she too said they should go, and so they began their journey to Abilene. However, as Harvey explains, the trip was not a success:

My predictions were fulfilled. The heat was brutal. Perspiration had cemented a fine layer of dust to our skin by the time we had arrived. The cafeteria’s food could serve as a first-rate prop in an antacid commercial.

Some four hours and 106 miles later, we returned to Coleman, hot and exhausted. We silently sat in front of the fan for a long time. Then to be sociable, I dishonestly said, “It was a great trip wasn’t it?”

No one spoke.

After a while, his mother-in-law admitted that she had never really wanted to go but had only agreed because she thought everyone else wanted to and didn’t want to cause a fuss, at which point his wife protested that she had never really wanted to go either, which then led to a volley of argument. Eventually his father-in-law broke the silence and exclaimed in a long Texas drawl: “Shee-it. Listen, I never wanted to go to Abilene. I just thought you might be bored. You visit so seldom I wanted to be sure you enjoyed it. I would have preferred to play another game of dominoes and eat the leftovers in the icebox.” This experience led Harvey to coin the term ‘the Abilene paradox’ to describe a curious aspect of group dynamics in which a group, each member believing they are going along with what everyone else wants, tacitly creates the opposite of what anyone wants.

After the outburst of recrimination we all sat back in silence. Here we were, four reasonably sensible people who, of our own volition, had just taken a 106-mile trip across a godforsaken desert in a furnace-like temperature through a cloud-like dust storm to eat unpalatable food at a hole-in-the-wall cafeteria in Abilene, when none of us had really wanted to go. In fact, to be more accurate, we’d done just the opposite of what we wanted to do. The whole situation simply didn’t make sense.

The Abilene paradox lies in the fact that we have problems not with disagreement, but rather with agreement. It is characterised by groups of people in an organisation privately agreeing that one course of action makes sense but failing to communicate that view properly, and then collectively stumbling towards what they assume everyone else wants. Eventually an inaccurate picture of what to do emerges and, based on that, the organisation takes steps towards actions that nobody really wants and which are ultimately counterproductive to the aims of the organisation itself.


You can witness the Abilene paradox at work in many schools. Often it takes the form of ill-considered marking policies which increase teacher workload to the point of exhaustion, endless tracking and monitoring of students, behaviour policies which punish the teacher more than the student who misbehaves, and graded lesson observations where teachers abandon what they normally do to put on a one-off, all-singing, all-dancing lesson for the observer, because that’s what everyone thinks inspectors want.

A lot of this can be accounted for by innate cognitive biases such as groupthink, but it can also be exacerbated either by poor evidence, as in the case of learning styles, or by a poor understanding and misappropriation of good evidence, as in the case of formative assessment. With the emergence of a solid evidence base, we might just be able to defend ourselves against these kinds of cognitive biases, provided the evidence is communicated clearly and appropriated effectively as part of a broader discussion about the values of a school. At its best, good evidence can act as a bulwark against the tsunami of nonsense that has so often washed over our schools. If we fail to have these important discussions and simply go with what we think might work, then we are at risk of loading the entire staff onto the school minibus and heading off to Abilene.

 

This is an excerpt from the forthcoming book ‘What Does This Look Like in the Classroom? Bridging the Gap Between Research and Practice.’

Five Things I Wish I Knew When I Started Teaching

Carl Hendrick

1. Motivation doesn’t always lead to achievement, but achievement often leads to motivation.

While there is a strong correlation between self-perception and achievement, and we tend to think of it in that order, the actual effect of achievement on self-perception is stronger than the other way round (Guay, Marsh and Boivin, 2003). It may well be the case that using time and resources to improve student academic achievement directly is a better agent of psychological change than psychological interventions themselves. Daniel Muijs and David Reynolds (2011) note that:

At the end of the day, the research reviewed shows that the effect of achievement on self-concept is stronger than the effect of self-concept on achievement.

Despite this, a lot of interventions in education seem to have the causal arrow pointed the wrong way round. Motivational posters and talks are often a waste of time and may well give students a deluded notion of what success actually means. In my experience, teaching students how to write an effective introduction to an essay through close instruction, careful scaffolding and then praising their effort in getting there is a far more effective way of improving confidence than showing them a TED talk about how unique they are.

2. Just because they’re engaged doesn’t mean they’re learning anything.

One of the slides from a talk that has stuck with me the most in recent years was this one from Professor Rob Coe, in which he criticised graded lesson observations and highlighted several performance indicators for learning that are actually very misleading:

[Slide from Rob Coe’s talk listing poor proxies for learning]

This again is quite a counterintuitive claim. Why is engagement such a poor proxy indicator – surely the busier students are, the more they are learning? This paradox is explored by Graham Nuthall in his book ‘The Hidden Lives of Learners’ (2007), in which he writes:

“Our research shows that students can be busiest and most involved with material they already know. In most of the classrooms we have studied, each student already knows about 40-50% of what the teacher is teaching.” p.24

Nuthall’s work shows that students are far more likely to get stuck into tasks they’re comfortable with and already know how to do than to grapple with the more uncomfortable enterprise of uncertain, indeterminate tasks.

3. Marking and feedback are not the same thing.

This subtle difference may seem semantic but there is an important distinction to be made. Marking a piece of work may, counterintuitively, be of more benefit to the teacher than to the student, as David Didau explains:

While there’s no doubt that marking and feedback are connected, they are not the same. In some parts of the world – Japan for instance – teachers do very little marking but that’s not to say students are not getting feedback. From my own experience, I’m pretty sure it’s possible to make marks in students’ books without providing anything in the way of useful feedback and of course lots of thinking (some of it disastrous) has been done to try to prevent this from happening. Ask any group of teachers if their marking load has increased dramatically in the past five years and they’ll fall over themselves to let you know just how much impact marking has on their lives, but what impact does it have on students’ outcomes? The answer is, we just don’t know.

4. Feedback should be more work for the recipient than the donor.

Possibly the most damaging misappropriation of research in my career has been the mangling of Assessment for Learning – a quagmire from which we are now only beginning to emerge. Not long after Black and Wiliam’s seminal 1998 ‘Inside the Black Box’ was adopted at a national level, school leaders and policy makers managed to twist it into a pale imitation of its original form, as AfL became about students memorising what level they were working at and teachers marking books at a level that defied sense in order to show ‘evidence’ of learning. But for feedback to be truly meaningful to students, they need to take ownership of it, which may well mean not giving levels to a piece of work at all and instead just leaving comments for the student to reflect and act upon. As Dylan Wiliam writes:

Robyn Renee Jackson suggests that one of the most important principles for teachers is “Never work harder than your students” (Jackson, 2009). I regularly ask teachers whether they believe their students spend as long processing feedback as it takes for the teacher to provide it. Few teachers say yes. We spend far too much time giving feedback that’s either completely ignored or given scant attention.

5. (a) The steps needed to achieve a skill may look very different to the final skill itself.

If you want to get good at a certain skill then surely the best way is to practice that particular skill, right? Well, not according to the tenets of deliberate practice, which assert a more indirect approach: breaking a global skill down into its constituent local parts and focusing on specific feedback and incremental improvement rather than a set of assessment criteria or performance descriptors that are “aimed at some vague overall improvement” (Ericsson). In her book ‘Making Good Progress’, Daisy Christodoulou writes:

Whilst skills such as literacy, numeracy, problem solving and critical thinking are still the end point of education, this does not mean that pupils always need to be practising such skills in their final format. Instead, the role of the teacher and indeed, the various parts of the education system, should be to break down such skills into their component parts, and to teach those instead. This means that lessons may look very different from the final skill they are hoping to instil. For example, a lesson which aims to teach pupils reading may involve pupils learning letter-sound correspondences. A lesson with the ultimate aim of teaching pupils to solve maths problems may involve them memorising their times tables. The idea here is that the best way to develop skills does not always look like the skill itself.

5. (b) There is no such thing as developing a ‘general’ skill.

Of course, critical thinking is an essential part of any student’s mental equipment. However, it cannot be detached from context. Teaching students generic ‘thinking skills’ separate from the rest of the curriculum is often meaningless and ineffective. As Daniel Willingham puts it:

[I]f you remind a student to ‘look at an issue from multiple perspectives’ often enough, he will learn that he ought to do so, but if he doesn’t know much about an issue, he can’t think about it from multiple perspectives … critical thinking (as well as scientific thinking and other domain-based thinking) is not a skill. There is not a set of critical thinking skills that can be acquired and deployed regardless of context.

This detachment of cognitive ideals from contextual knowledge is not confined to the learning of critical thinking. Some schools laud themselves for placing ‘21st-century learning skills’ at the heart of their mission, but without anchoring them in domain-specific contexts, they are often a waste of time. Anders Ericsson develops this point:

This explains a crucial fact about expert performance in general: there is no such thing as developing a general skill. You don’t train your memory; you train your memory for strings of digits or for collections of words or people’s faces. You don’t train to become an athlete; you train to become a gymnast or a sprinter or a marathoner or a swimmer or a basketball player. You don’t train to become a doctor; you train to become a diagnostician or a pathologist or a neurosurgeon.

I’ve written more about this here and this by Dan Willingham is probably the definitive piece on critical thinking.


Education is an end in itself, not a preparation for the workplace

It’s a well-observed truth that because everyone has had an education, everyone feels well placed to comment on all aspects of it. Often this takes the form of “My experience of education was like this, so all education should be more/less like that.” It finds its purest expression in mainstream journalists answering questions nobody asked them and delivering a state of the union address on education anyway.

In a recent article in The Times, Caitlin Moran decided to offer herself up as education secretary and outlined her vision:

My plan is very straightforward, and rests on two facts: (1) the 21st-century job market requires basically nothing of what is taught in 21st-century schools, and (2) everyone has a smartphone.

First, as anyone with a teenage/young adult child will know, the notion of them going into a full-time, long-term job with a pension at the end of it looks like something we left behind in the 20th century. The old pathway – learn a skill, use it for 40 years, then retire – is over. The jobs of the future require flexibility and self-motivation. Indeed, the jobs of the future increasingly require you to invent your own job. The majority of jobs our children will have – in just a few years’ time – have almost certainly not been invented yet.

Those of you playing TED-talk education cliché bingo will probably already be screaming “full house!”, but hold your horses, folks…

If you work better sleeping until noon, then working until 2am – as most teenagers do – congratulations! You no longer have to deny your own biology! And if, working at 2am, you have no teacher to help you, discovering how to research on your own is, frankly, going to be far more useful than the thing you’re supposed to be learning. Because, (2) my education policy would be to stop bothering kids with anything you can access on a smartphone.

This kind of pseudo-futurist, Waitrose Organic aisle philosophy is often found in self-made individuals who eschewed school to admirably forge a successful career in the communications industry. Fine for those outliers, but unconscionable to advocate it for the vast majority of children from less privileged backgrounds, for whom good A-level results can be life-changing.

Apart from the fact that there are well-documented problems with the assertion that Google can replace human knowledge, claiming that we should just depend on our smartphones for learning is a dangerous idea, as Paul Kirschner and Mirjam Neelen note:

All this leads us to conclude that if we don’t have knowledge and only depend on the information in the cloud, what will happen is what William Poundstone articulates in the Guardian: The cloud will make us mega-ignorant: unaware of what we don’t know.

There is also the potentially damaging impact of celebrities claiming that school doesn’t matter. One can’t help but stifle a chuckle/grind one’s teeth when we hear auto-satirist Jeremy Clarkson tell A-level students not to worry about their results because it all worked out fine for him. Again, fine for those who had great fortune/rare abilities, but very dangerous to mandate as a broad system of education.

Behind these risible views, however, lies a dangerous conceit: namely that the purpose of education is merely to get you a job, and not just any job, but a job that doesn’t exist yet. These phantasmic jobs often focus on alliterative groupings such as “collaboration, creativity and connectivity” (things we have evolved to do anyway) and are positioned in opposition to the cruelty of 20th-century education, which apparently was some kind of mass conspiracy designed to create a global village of the damned.

However, far from being the so-called “factory model” bastille limiting children’s potential, 20th-century schooling is perhaps the high-water mark for education. Class sizes were reduced, more students with special needs were included in mainstream education, minimum teacher standards were introduced and more children had access to quality education than at any time in human history. Astonishingly, from 1900 to 2015, rates of global literacy increased from 21 percent to 86 percent.

Behind many of these claims is also a barely concealed contempt for the teaching profession. If Moran’s education policy were to be enacted and we were to “stop bothering kids with anything you can access on a smartphone” then the role of teacher would be defunct. Teachers are the most expensive commodity in the 21st century learning paradigm so it would make sense to replace them with technology, something these evangelists rarely say explicitly. At a time when school funding is being cut to almost unmanageable levels, the emancipatory claims of 21st century learning should be viewed with forensic scepticism.

Of course we should prepare students for an uncertain future, but if we adopt the techno-evangelist disruptive model and view education as merely a utilitarian enterprise for the 21st-century workplace then we truly will enact a “factory model” of schooling and, furthermore, we will diminish the gift of knowledge for its own sake. Students should study Shakespeare not because of what job it might get them but because it’s an anthropological guidebook that tells them how to live.

Things Aren’t Getting Worse, They’re Getting Better.

 

“Any comfortable American who is cynical of progress – or the competent decency of modern civilisation – hasn’t pondered how life was for our ancestors. Any day that cossacks haven’t burned your home should start out a happy one, overflowing with optimism.”

– M. N. Plano 

It’s a feature of any generation to believe that the present is never as good as the past. Indeed, the last few years have seen many rise to power on the back of this dangerous fallacy. Doom mongers and eschatologists would have us believe that we are hurtling towards a post-apocalyptic future fuelled by a more violent, divided and psychopathic society.

Now, I’m not naturally given to optimism, but after reading Johan Norberg’s book ‘Progress’ there are some inescapable truths suggesting that, far from living in end times, we’ve actually never had it so good.

For example, if you were a 10-year-old living 150 years ago, you would probably already have experienced the death of more than one immediate family member. You would have been sent to work in horrific conditions soon after that, and average life expectancy would suggest you’d only have another 20 miserable years left on the planet.

In 1900, average world life expectancy was 31 years; by 2016 it was 71. Remarkably, this increase has occurred in only the last four of roughly 8,000 generations of Homo sapiens. In anthropological terms, if you’re reading this you’ve not just won the lottery, you’re winning it every day.


For the majority of human existence, people lived truly wretched lives as hunter-gatherers, but in the 19th century the Industrial Revolution gave rise not just to huge population growth but to a concomitant economic prosperity, as Norberg points out: “Between 1820 and 1850, when the population grew by a third, workers’ real earnings rose by almost 100 per cent.” And in the 20th century, the advent of open markets and free trade did not confine this wealth to the West: “Since 1950, India’s GDP per capita has grown five-fold, Japan’s eleven-fold and China’s almost twenty-fold.”

Indeed, as a recent article points out, the number of people living in extreme poverty is plummeting by an average of more than 100,000 people per day:

There were nearly 1.1 billion people on earth in 1820, with about 1 billion of those individuals living in what is considered extreme poverty, according to World Bank economists Francois Bourguignon and Christian Morrisson. By the 1990s, the world’s population exploded to over 7 billion, with 2 billion living in extreme poverty.

From 1990 to 2015, the number of people worldwide in dire poverty shrank to 750 million. Put differently, that figure has, on average, fallen by 137,000 people every day over the last 25 years.
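That daily figure is simple arithmetic. Here is a quick back-of-the-envelope check (a sketch using only the numbers quoted above):

```python
# Back-of-the-envelope check of the "137,000 people per day" figure,
# using the numbers quoted above: 2 billion in 1990, 750 million in 2015.
in_poverty_1990 = 2_000_000_000
in_poverty_2015 = 750_000_000
days = 25 * 365  # 1990-2015, ignoring leap days

print((in_poverty_1990 - in_poverty_2015) / days)  # ~136,986 per day
```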

In terms of violence, it is a commonly held belief that the 20th century was the bloodiest on record, but when taking into account huge population increases this simply isn’t true. For example, the Turco-Mongol conqueror Timur Lenk killed proportionally almost as many people in central Asia in the 14th century as Hitler, Stalin and Mao combined. The latest US figures, for 2013, show that murder rates are lower now than in the early 1960s. In the same year, homicides in Japan hit a post-war low. In England and Wales the level of violence has dropped by 66 per cent since the latest peak in 1995.

In fact, as Yuval Noah Harari points out, our mortality would seem to be very much in our own hands. In many tragic cases, quite literally:

“In 2012 about 56 million people died throughout the world; 620,000 of them died due to human violence (war killed 120,000 people, and crime killed another 500,000). In contrast, 800,000 committed suicide, and 1.5 million died of diabetes. Sugar is now more dangerous than gunpowder.”

In education, the increases in provision and access have facilitated a simply astonishing rate of improvement: from 1900 to 2015, global literacy increased from 21 percent to 86 percent of the world’s population. This rise has mainly occurred in countries that have rejected authoritarian regimes and the superstitious dogma of religion for the liberal democratic values of equality, humanism and personal liberty. Further good news is that current rates of growth could see us turn our attention to solving seemingly intractable problems, provided certain political movements are sufficiently challenged, as Norberg illustrates:

If annual global economic growth remains around two percent per head, the average person in 100 years’ time will be around eight times richer than today’s average person. With those resources, the level of scientific knowledge, and the technological solutions that may then be at our disposal, many of the problems that intimidate us today will be much easier to handle—from adapting to warming to taking CO2 out of the atmosphere.
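Norberg’s multiplier is just compound growth. A one-line check (an illustration, assuming a constant two per cent per year over a century):

```python
# Compound growth at 2% per year over 100 years.
print(1.02 ** 100)  # ~7.24, i.e. roughly eight times richer
```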

Perhaps our current apocalyptic malaise is a result of the fact that when extreme poverty is off the table, humans have more time to address social injustice and inequality, and although there is still quite a way to go, the gains over the last 100 years have been nothing short of remarkable. The fact that a black man can now marry a white man in many Western societies is testament to how far we’ve come. The other major factor is the rise of social media and a 24-hour news culture that is predisposed to report what’s going wrong rather than what’s going right. What Norberg’s book shows is that progress is not dramatic and is therefore often not newsworthy.

Although there are many challenges that we are right to be worried about, for example ongoing geopolitical tensions, a worrying trend in mental health and the disappearance of jobs due to automation, the recent past suggests that we always find solutions. We are less aware of progress than we are of disasters because progress is a glacial process that often happens under the radar and is so complex that it resists simplistic narratives. As Bill Hicks reminds us, modern media is often based on sensationalism and scare tactics but when you take a wider historical viewpoint, there’s never been a better time to be alive.

 

 


Johan Norberg – Progress: Ten Reasons to Look Forward to the Future

Yuval Noah Harari – Homo Deus: A Brief History of Tomorrow

 

Amused to Death: Why the Internet Should Be Kept Out of the Classroom

Writing in 1985, Neil Postman made the interesting observation that many of the first fifteen U.S. presidents could have walked down the street without being physically recognised, yet they would have been instantly identifiable by things they had written or speeches they had delivered. Today the opposite is true.

Postman saw 1980s America as a world that celebrated the transient and the superficial, where the power of the written word as a space to formulate and expand on complex arguments gave way to impressionistic understanding and surface engagement. He claims the 19th century was the apogee of human thought where the act of undistracted reading in particular represented “both their connection to and their model of the world.” He saw the advent of mass entertainment TV in the 1980s as a tipping point characterised by an over-saturation of information to the point of total distraction:

“Information is now a commodity that can be bought and sold, or used as a form of entertainment, or worn like a garment to enhance one’s status. It comes indiscriminately, directed at no one in particular, disconnected from usefulness; we are glutted with information, drowning in information, have no control over it, don’t know what to do with it.”

What’s remarkable about Postman’s dystopian vision is that it was written before the advent of the Internet. The dizzying amount of information now produced daily would have been inconceivable to him yet many of its stupefying effects would have been instantly recognisable. If TV was having this effect on adults 30 years ago, what impact is the Internet having on young people today?


An important new study on student use of technology has shown that students with access to the Internet in the classroom are being distracted out of any kind of meaningful learning. Students who used laptops in lessons voluntarily logged into a proxy server which monitored their in-class behaviour, and researchers found that “the average time spent browsing the web for non-class-related purposes was 37 minutes. Students spent the most time on social media, reading email, shopping for items such as clothes and watching videos.”

An essential point to make here is that an adult using the Internet is not the same as a 15-year-old using it. Most adults have developed schemas of knowledge that allow them to navigate the great highways of the Internet, identify subtle exits, negotiate fruitful side-roads and avoid potential dead ends. Asking kids to ‘research a topic on the Internet’ is like dropping a five-year-old on a motorway and expecting them to find their way home on their tricycle.

An older friend of mine who is a history teacher remarked to me recently that it was about 15 years ago that students who were asked to write assignments on Martin Luther King started handing in essays on the American civil rights leader who nailed 95 theses on a church door in Wittenberg in the 16th century and subsequently led the Protestant Church movement until his tragic assassination in 1968. Simply unleashing kids on the Internet with the vague justification of ’21st century skills’ is not just largely ineffective but a dereliction of duty.

Many techno-evangelists cite the vague concept of ‘creativity’ as a justification for the Internet in our classrooms but as Andrew Keen presciently pointed out ten years ago, as ‘truth’ becomes a relative term, not only is real creativity threatened but the wider implications for society are perhaps far more concerning:

This undermining of truth is threatening the quality of civil public discourse, encouraging plagiarism and intellectual property theft, and stifling creativity. When advertising and public relations are disguised as news, the line between fact and fiction becomes blurred. Instead of more community, knowledge, or culture, all that Web 2.0 really delivers is more dubious content from anonymous sources, hijacking our time and playing to our gullibility.

In a recent New York Times opinion piece, Darren Rosenblum noted that when teaching a unit on what he thought would be the engaging topic of sexuality and the law, his attempts to provoke discussion with his students were met with a “slew of laptops staring back” at him. He subsequently asked students not to bring laptops and created what he felt was a more human connection, which in turn led to a better environment for learning: “Energized by the connection, we moved faster, further and deeper into the material.”

Laptops at best reduce education to the clackety-clack of transcribing lectures on shiny screens and, at worst, provide students with a constant escape from whatever is hard, challenging or uncomfortable about learning.

And that’s the thing about learning: it should be hard. It should be initially challenging and uncomfortable in the short term in order to be effective in the long term. But the ubiquitous, pervasive nature of the Web 2.0 Internet has produced some disturbing effects on young people. As Simon Sinek has recently pointed out, there are some worrying trends in the millennial generation, who are often characterised by chronically low self-esteem (facilitated, he claims, by failed parenting strategies among other things), who now look to Facebook and Instagram approval rather than human interaction to fix their lack of self-worth, and who are exhibiting behaviours that are profoundly addictive in nature.

It’s a bleak view of the future, often described as dystopian, but for Neil Postman there is an interesting distinction between the dystopian visions of Orwell’s ‘1984’ and Huxley’s ‘Brave New World.’ The former portrayed a bleak vision of oppressive state control in the form of Big Brother, which sought to actively ban expression and limit human agency; in ‘Brave New World’, however, there is a far more horrifying phenomenon at work:

In Huxley’s vision, no Big Brother is required to deprive people of their autonomy, maturity and history. As he saw it, people will come to love their oppression, to adore the technologies that undo their capacities to think. What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one.

Technology has afforded us some incredible opportunities for education, such as comparative judgement or the JSTOR Shakespeare digital library, and there are some good examples of purposeful, well-structured environments, but increasingly we are suffering from what Sartre called ‘the agony of choice’ as we become more and more connected to the Internet of Things. Allowing kids to browse the Internet in a lesson and then expecting them to work productively is like bringing them to McDonald’s and hoping they’ll order the salad.

Techno-evangelists have sold us the Internet as a form of emancipation, freeing us from the ‘factory model’ of education (despite the fact the evidence for this is thin) but often technology in education seems to represent a solution in search of a problem that actually gets in the way of learning. Perhaps the most liberating and empowering thing we can do for young people today is to create a space for them where they can read the great works of human thought undisturbed and without distraction, for at least a short while.

 

 

References:

Postman, Neil (1985). Amusing Ourselves to Death: Public Discourse in the Age of Show Business. 

Internet Use in Class Tied to Lower Test Scores

Logged In and Zoned Out: How Laptop Internet Use Relates to Classroom Learning

How Today’s Internet is Killing Our Culture and Assaulting Our Economy

Leave Your Laptops at the Door to My Classroom

A review of 1 on 1 laptops in education shows… that much more and better research is needed

The Air Traffic Controller Paradox: Why Teaching Generic Skills Doesn’t Work

Being an air traffic controller is hard. Really hard. The job entails remembering vast amounts of fluid information, often under enormous pressure. Essentially the job is about ‘situational awareness’, which involves “the continuous extraction of environmental information, the integration of this information with prior knowledge to form a coherent understanding of the present situation.”[1] Controllers sometimes work under extreme duress, making life-or-death decisions on little sleep, leading in some cases to long-term fatigue and burnout. So stressful is the job that they are eligible for retirement at age 50 or after 25 years of service.

In the 1960s, a series of interesting experiments was carried out on air traffic controllers. Researchers wanted to explore whether they had a generally enhanced ability to “keep track of a number of things at once”[2] and whether that skill could be applied to other situations. After observing their sophisticated abilities in air traffic control, the researchers gave them a set of generic memory-based tasks with shapes and colours. The extraordinary thing was that when tested on these skills outside their own area of expertise, the air traffic controllers did no better than anyone else.


These findings challenged contemporary thinking on generic skills. Surely the controllers had developed a set of general cognitive capacities that could be used in other areas or ‘domains’? The evidence suggested the opposite. In order to be good in a specific domain you need to know a lot about that specific domain, and moreover, “the more complex the domain, the more important is domain-specific knowledge.”[3] This phenomenon is now well established and has been replicated many times. Other research, for example, has shown that the ability to remember long strings of digits does not transfer to the ability to remember long strings of letters.[4] Indeed, we all know people who are very ‘clever’ in their professional lives yet seem to make very stupid decisions in their personal lives:

“A person who is able to reason logically in science may show no such ability in his or her personal life or in any areas outside of his or her areas of science. Knowing that we should only test one variable at a time when conducting a scientific experiment is critical. Outside of hypothesis testing, it may be irrelevant, with other knowledge being pre-eminent.” [5]

Take another example: sport. Within a football team you have many different types of positions or ‘domains’, such as goalkeepers, defenders and attackers. Within those domains you have further categories such as centre backs, full backs, attacking midfielders and holding midfielders. Now, the ‘general skill’ all these players have is the ability to play football; however, if you put a left back in a striker’s position or a central midfielder in goal, they would be lost.

A footballer’s ability to be effective in a particular position or domain is based on years of experience where they have built up thousands of mental models from playing the game in that particular position so that when they have to perform at a high level they can do so with faster reaction times and their full concentration can go on anticipating the complexities of the game faster than their opponent. Of course there are elements that are consistent with each position such as touch and technical ability but they look very different in each position and are heavily context specific. For example, a central defender heading a ball away to safety is very different to a striker heading a goal and the types of positioning and runs an attacking player needs to make are radically different to those of a defender. In other words, elite footballers are not “good at football” as such, they’re good at being a left back, defensive midfielder or attacker.


Despite the growing body of evidence in recent years questioning the efficacy of teaching general skills, there is still a near-constant refrain for them to be prioritised in schools. This usually takes the form of generic “critical thinking skills”, often taught in some form for an hour or two a week and decontextualised from any specific subject. This is a problem, as Dan Willingham reminds us:

“Critical thinking (as well as scientific thinking and other domain-based thinking) is not a skill. There is not a set of critical thinking skills that can be acquired and deployed regardless of context.

Thus, if you remind a student to ‘look at an issue from multiple perspectives’ often enough, he will learn that he ought to do so, but if he doesn’t know much about an issue, he can’t think about it from multiple perspectives.” [6]

Another problematic area is the diaphanous world of “21st century learning skills”, which some schools have made a central part of their mission. It’s even been suggested that some of these nebulous skills are now as important as literacy and should be afforded the same status. An example is brain training games, whose proponents claim they can help kids become smarter, more alert and able to learn faster. However, recent research has shown that brain training games are really only good for one thing – getting good at brain training games. The claim that they offer students a general set of critical or problem-solving skills was recently debunked by a new study reviewing over 130 papers:

We know of no evidence for broad-based improvement in cognition, academic achievement, professional performance, and/or social competencies that derives from decontextualized practice of cognitive skills devoid of domain-specific content.

Instead of teaching generic critical thinking skills, an alternative strategy would be to focus on subject-specific critical thinking skills that seek to broaden students’ individual subject knowledge and unlock the unique, intricate mysteries of each subject. The same goes for other dispositions and faculties taught generically, such as Growth Mindset and Grit: students may well have a Growth Mindset in English but not in Maths, and yet the concept is often portrayed to students as a general capacity that can supposedly function in a transversal way across all subjects (despite the fact that the jury is still out on whether these can be taught at all).

In the same way that teaching knowledge devoid of any platform for students to discuss, explore and develop that knowledge makes no sense, the teaching of standalone, decontextualised general skills is a questionable practice at best. Its enduring appeal probably lies in the fact that the concept seems so intuitively right, yet when the evidence is appraised we find the justification weak. To advocates of ubiquitous critical thinking skills we might risk the question: “but what are they going to think with?” As Dan Willingham reminds us, “thought processes are intertwined with what is being thought about.”

More:

The Role of Memory in Air Traffic Control (Gronlund, Dougherty)

Domain-Specific Knowledge and Why Teaching Generic Skills Does not Work (Tricot, Sweller 2014)

Critical Thinking: Why Is It So Hard to Teach? (Willingham)

Do “Brain-Training” Programs Work? (Simons et al 2016)

1 (Dominguez, 1994)

2 (Yntema & Mueser, 1960)

3 (Ericsson & Charness, 1994)

4 (Tricot, Sweller 2014)

5 (Tricot, Sweller 2014)

6 (Willingham 2007)

Why Fads and Gimmicks Should be Resisted in the Classroom

One of the perpetual cycles in education is the harnessing of whatever is popular in youth culture at the time in order to ‘engage’ students. The current gimmick du jour is Pokemon Go, an augmented reality mobile phone game that has taken the world by storm. Several ‘hints and tips’ websites offer ways of using this technology in the classroom. For example, in order to engage students in History you could “Create a timeline that shows the history of Pokemon and the other Pokemon games.” Last year English teachers were treated to a series of books on how to use emoticons to teach Shakespeare. Titles included ‘Srsly Hamlet,’ ‘Yolo Juliet’ and ‘Macbeth #Killingit.’


One of the main justifications for these kinds of approaches is the notion that kids will be engaged in subjects they would otherwise not be, and it’s a way to “get them involved.” Apart from the fact that engagement is a very poor proxy for learning, using fads and gimmicks to interest children reveals a more troubling belief that you somehow need to ‘trick’ kids into being interested in things, that they couldn’t possibly be captivated by Shakespeare, Henry VIII or Newtonian Physics without first having it go through the filter of their own immediate interests.

Clearly teachers have a job on their hands competing against the immediacy of mobile phones, the Internet and dwindling attention spans, but the strategy of ‘fighting fire with fire’ might not be the best approach here. While well intentioned and indeed ‘engaging’, does this kind of approach lead to effective learning? This cautionary tale from David Didau suggests not:

I once observed a history lesson in which the teacher had as her stated aim that her class should learn what life was like for Irish peasants during the Potato Famine. She decided to do this by hiding potatoes around the classroom. The kids absolutely loved it! They were highly engaged from the word go and had enormous fun working out the likely hiding places for potatoes. They learned an awful lot about where it was possible to hide a potato in a classroom. They then wrote about the experience of life as an Irish peasant. But because the activity had taught them nothing about the life of an Irish peasant, their responses were poor. The other teacher that I observed the lesson with had covered their pro forma with enthusiastic scrawl and was convinced they’d seen something outstanding. But, what did they learn? I asked. But they absolutely loved it! They replied.

The obsession with novelty in education appears at all levels of school life, with many school leaders adopting gimmicks and fads as whole-school policies with little or no evidence that they are effective. Whole-school policies on marking, for example, come and go but can leave a trail of destruction behind them. As Alex Quigley argues, “Anything that distracts teachers and school leaders from improving teaching and learning are cumbersome tools that serve only to weigh us down.”

The other thing many teachers fail to realise is that as soon as adults begin appropriating youth culture, it ceases to be theirs and loses its radical appeal. Is there anything more tragic than the ageing teacher who adopts youth slang in order to ‘relate’ to kids?

As an English teacher I feel an instant resistance to the adoption of gimmicks in the classroom. Reading is a sacramental act, a form of meditation that can transport children to chimerical worlds and offer them new ways of understanding the human condition beyond their own immediate interests. By reading about the struggles of characters in a novel or a play they are able to view their own struggles in a way that was previously unavailable to them.

Reading is hard to do in 2016 and requires commitment to something beyond immediate pleasure in order to gain richer reward. Getting kids to wander around the playground playing Pokemon Go is simply keeping them busy.

Using fads and gimmicks not only depreciates the process of learning but also reveals a contempt for the experience of being absorbed in something for its intrinsic worth. It also sends out a message that, whether we are aware of it or not, is surely negative. By using text message emoticons to teach Hamlet we are tacitly saying “you are not really able to handle this.”

Using fads and gimmicks also presumes that all kids are interested in the same thing. One thing we might want to consider is that by using them we could possibly be disengaging students rather than engaging them. As Martin Robinson writes:

I’ve enjoyed seeing my daughter play the game, we have had fun exploring and noticing things but none of this is in the detail or depth I would call educational, nor is it edutainment, it is play, and that is fine as far as it goes; I love play. But I pity my little ‘un if she has to go back to school and comes across an enthusiastic teacher who has come up with a term’s work based on Pokémon Go in order to engage her interest, it will more likely enrage her to disinterest.

Surely a central part of the mission of being a teacher is to introduce kids to things beyond their own immediate borders? To initiate them into new ways of seeing, new ways of thinking and to endow them with a wider understanding of the world in order to be able to navigate the troubling waters they sometimes find themselves in.

As Martin says, playing Pokemon Go has its own intrinsic worth for kids that is just as valid as anything else but whatever it is, it’s not learning. By insisting that the only way kids can learn is by being distracted into learning, we are offering them a debased view of the process itself.

Teachers should model the types of behaviours we would like to see in children. By modelling an effusive love of subject and showing how it has transformed our own lives as adults we can begin to show how it can transform their own lives as children.

 

 

“Limited men with limitless energy”: Why we should be wary of ‘passion’


In Philip Roth’s novel American Pastoral, there’s a phrase that has always stayed with me, one I’ve since associated with a particularly unpleasant character trait. Seymour ‘Swede’ Levov is the blonde-haired, blue-eyed protagonist of the novel, a star athlete in high school who marries a beauty queen and expands his father’s business empire but can never escape his father’s shadow. In many respects his father is the unknowing architect of his demise, yet the father’s central flaws are some of the most desirable qualities of the American dream:

a father for whom everything is an unshakable duty, for whom there is a right way and a wrong way and nothing in between, a father whose compound of ambitions, biases, and beliefs is so unruffled by careful thinking that he isn’t as easy to escape from as he seems. Limited men with limitless energy; men quick to be friendly and quick to be fed up; men for whom the most serious thing in life is to keep going despite everything.

Passion is seen as a universally positive trait and is rightly used in any discussion of personal enfranchisement, but this week the word has cropped up in two separate contexts that have given me pause to consider whether it is being used in more nefarious ways, particularly in the context of leadership.

As we have seen over the last month, the last person standing in a leadership contest is often the least desirable candidate. The qualities which yield a ‘winner’ are often some of the most insidious traits observable in a human being, and yet are often the most celebrated. In Roth’s novel, the Swede’s father, a “limited man with limitless energy”, is emblematic of a particularly pernicious kind of American rugged individualism that Donald Trump has harnessed so well this year: self-promoting, shallow, inflexible, ruthless, decisive, brash and, of course, full of passion. We’ve also been treated to Andrea Leadsom’s “passion for strengthening families”, which includes a rejection of same-sex marriage and the claim that those with the ability to reproduce have more of a stake in the future than those unable to do so.

On occasion, however, a candidate with a different set of qualities can make it to the Iron Throne. This week the Commons Education Select Committee rejected Nicky Morgan’s choice of Ofsted Chief Inspector, Amanda Spielman, noting that they “were concerned by the lack of passion she demonstrated for the job and the important contribution it makes to the lives of children.” Of course, the committee’s job is not to choose the candidate but rather to hold to account the government’s decision to do so, and in that sense they are something of a paper tiger, but their decision was met with a volley of argument from almost all sections of education, most notably from those who have worked with or met Spielman, perhaps because many felt her predecessor’s notion of passion was so alienating. The whole episode does, however, invite us to consider what exactly we mean by the term ‘passion.’


If the Select Committee’s understanding of passion is so wildly at odds with that of so many of us in the field, then we might venture the question: is the term even a viable one to use in judging such an important appointment? It seems so wide-ranging and all-encompassing as to render itself meaningless, chopping off the branch on which it stands.

We can perhaps think of two differing interpretations of passion on a broad spectrum: benign passion and malignant passion. The passion of an English teacher for Shakespeare is a very different proposition from the jingoistic passion of the extremist. There is also a tendency to confuse passion with a sort of self-promoting extroversion. What of the passion of the introvert, with its “quiet determination”? Might we realign the notion of passion with the qualities of measured reflection, patient facilitation, the rare ability to ask the right kind of questions and then listen attentively to the answers?

Of course, used pejoratively as I have outlined here, passion can be viewed, perhaps wrongly, in a solely negative light. After all, having no conviction at all leads nowhere, and being passionate about personal goals and pursuits extends many benefits to us all, both personally and professionally. Good teachers are devotees of their subject and have an innate ability to communicate that in ways that make learning infectious.

However, ‘passion’ can often mask a more unhelpful set of dispositions, from the harmless passion associated with the banality of self-help culture to the more sinister passion of a certain leadership culture that uses the term to excuse prejudice and exercise power over others in negative ways. As we have seen this week, passion often confers certainty where it is not warranted, and at its worst can cover a multitude of sins, as Yeats knew all too well.

The blood-dimmed tide is loosed, and everywhere
The ceremony of innocence is drowned;
The best lack all conviction, while the worst
Are full of passionate intensity.

Not All Stress is Bad. The Benefits of Eustress or ‘Good Stress’ For Learning

Carl Hendrick

In the 1930s the endocrinologist Hans Selye differentiated between two types of stress: distress and eustress. We are all familiar with the first term but perhaps less so with the second, which refers to a positive response to external stressors leading to a state of optimism, confidence and agency – in other words, ‘good stress.’ This model has its roots in 1908, when the psychologists Robert M. Yerkes and John Dillingham Dodson posited that performance rises with stress up to an optimal point and then declines. Too little of it and you get nothing done; too much of it and you get nothing done either.

[Fig. A: The Yerkes-Dodson curve – performance plotted against level of stress/arousal]
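As a rough illustration of the curve (my own toy model, not Yerkes and Dodson’s formulation), performance can be sketched as a bell-shaped function that peaks at an optimal level of arousal and falls away on either side:

```python
import math

# A toy model of the Yerkes-Dodson inverted U (an illustrative sketch,
# not the original authors' equation): performance peaks at an optimal
# level of arousal and drops off symmetrically on either side.
def performance(arousal, optimum=0.5, width=0.2):
    return math.exp(-((arousal - optimum) ** 2) / (2 * width ** 2))

for a in (0.1, 0.5, 0.9):  # too little stress, the sweet spot, too much
    print(f"arousal {a:.1f} -> performance {performance(a):.2f}")
```

The exact shape is arbitrary; the point is simply that both ends of the curve leave you getting nothing done.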

 

A key concern of anyone working in education is monitoring the stress levels of staff and students. Of course we don’t want anyone to be in a state of distress, but we now live in an age that often views all stress as distress, without acknowledging the benefits of eustress. Is it possible to imagine a more ‘stress-tolerant’ culture where students embrace a ‘sweet spot’ or optimal level of stress, one where we could engender an atmosphere of positive challenge and agency? As Ben Martynoga points out:

This is where good teachers and managers should push their charges: to the sweet spot that separates predictable tedium from chaotic overload. Where stress gets more persistent, unmanageable and damaging, Selye calls it “distress”. Eustress and distress have identical biological bases; they are simply found at different points on the same curve.

The key point here is that both of these states are responses to external stressors, as opposed to being caused by events themselves; in other words, perception is everything. A key question, then, is in what way do educators shape the perception that all stress is distress?

Broadly, there are two responses to stress: initial avoidance and then subsequent coping strategies. For a group of Yale researchers, both of these approaches deny the benefits of eustress because they perpetuate the idea that all stress is bad:

These approaches advocate and perpetuate the mindset that stress-is-debilitating, a mindset that not only is partly inaccurate but may also be counter-effective. Even hardiness and resilience approaches to stress, while acknowledging the enhancing outcomes, still ultimately affirm the mindset that the debilitating effects of stress must be managed or avoided.

In contrast to the “stress-is-debilitating” mindset, these researchers found that students could be primed to adopt a “stress-is-enhancing” mindset in which they embraced a certain level of stress, were more open to seeking help and to feedback, experienced lower levels of distress overall, and saw “positive consequences relating to improved health and work performance.” This “stress-is-enhancing” mindset has many resonances with Robert Bjork’s notion of desirable difficulties.

We are all familiar with the “stress-is-debilitating” mindset. When we have large, open-ended tasks we are often on the left of the Yerkes-Dodson curve, with little or no stress and thus no stimulation to act; but when the deadline is looming we often find ourselves on the right of that curve, in a state of paralysis, unable to act and making poor decisions in an effort to alleviate the distress. Clearly, then, the ‘sweet spot’ is to be in a state of eustress, characterised by hope, excitement, active engagement (O’Sullivan, 2010) and the feeling that you are in control of the task you are faced with.

While there are some serious external stressors that are debilitating no matter what your response to them, two questions worth asking are:

  1. Are the kinds of tasks we are asking students to do genuinely placing them in a state of distress or could they be seen more positively as a potential state of eustress?
  2. Are we focusing on teaching methods that actually increase distress such as a focus on the storing of information as opposed to the retrieval of it?

In education research there is often very little consensus, but one area of almost unanimous agreement is the testing effect. We now know that the worst thing we can advise students to do when revising is to re-read material and highlight key points, and that the most effective thing is to practice retrieving information through testing, preferably self-testing, low-stakes quizzing and flash cards; a minimal sketch of what that can look like follows the quote below. This distinction between storage and retrieval processes is well researched, as Roediger and Butler explain:

“The testing effect is a robust phenomenon: The basic finding has been replicated over a hundred times and its generalizability is well established.”
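Here is that sketch (my own illustration, not a tool from the research cited): low-stakes self-quizzing can be as simple as a shuffled question-and-answer loop where the only stakes are a running score.

```python
import random

# A minimal sketch of low-stakes retrieval practice: shuffle the cards,
# force an attempt at retrieval, then reveal the answer.
cards = {
    "What did Selye call 'good stress'?": "eustress",
    "Who posited the optimal-stress curve in 1908?": "Yerkes and Dodson",
}

def self_quiz(cards):
    items = list(cards.items())
    random.shuffle(items)
    correct = 0
    for question, answer in items:
        attempt = input(f"{question} ")
        correct += attempt.strip().lower() == answer.lower()
        print(f"Answer: {answer}")
    print(f"Retrieved {correct}/{len(items)} - low stakes, so just go again.")

self_quiz(cards)
```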

So we know that testing is beneficial for learning, and yet the general perception of testing seems to be altogether negative. Is the problem not just the high-stakes nature of exams but also how students are prepared for them? If students are using poor study techniques like re-reading and highlighting material for most of the school year, within a curriculum that is not interleaved but focused on massed practice, is it any wonder that they enter a state of distress when exam season arrives?

Stress experienced early in life can be debilitating and potentially devastating if compounded throughout life. Where children experience prolonged periods of distress they need proper help and support to enable them to cope, and we clearly have some way to go in this area. But are the kinds of tasks that we are asking them to do in schools genuinely creating a state of distress? If stress is often a question of perception, as Selye claimed, then to what extent is it helpful to portray testing and exams as a key contributor to a “mental health crisis spiralling out of control”?

Stress is a very difficult area because it is highly subjective and often provokes emotional and sometimes irrational reactions. We all want to create a healthy, productive atmosphere for staff and students in which they feel they have agency over their future and don’t feel overwhelmed by external stressors, but by viewing all stress as distress, without harnessing the hidden benefits of eustress, we might just be missing a trick.

Fig. A: Diamond DM, et al. (2007). “The Temporal Dynamics Model of Emotional Memory Processing: A Synthesis on the Neurobiological Basis of Stress-Induced Amnesia, Flashbulb and Traumatic Memories, and the Yerkes-Dodson Law”. Neural Plasticity: 33. doi:10.1155/2007/60803. PMID 17641736.

O’Sullivan, Geraldine (18 July 2010). “The Relationship Between Hope, Eustress, Self-Efficacy, and Life Satisfaction Among Undergraduates”. Social Indicators Research 101 (1): 155–172. doi:10.1007/s11205-010-9662-z.

Roediger & Butler (2013). Encyclopedia of the Mind.

Rethinking Stress: The Role of Mindsets in Determining the Stress Response