Tuesday, 5 May 2015

researchED New York part 2: Daniel Willingham persuades

I do not like Dan Willingham.
I do not like him, Sam Freedman
I do not like him now he's cool
I don't like 'Why don't kids like school?'

I do not like him in a book
I do not like the way he looks
I do not like his views on Styles
I do not like him when he smiles

I do not like him on a dog
I do not like him in a blog
I do not like him in the street
I do not like him when he tweets

I do not like him at a ball
No NO I don't like him at all
I do not like Dan Willingham
I do not like him, Sam Freedman.


From 'The Cat in the Thinking Hat'

Daniel Willingham's presentation to researchED New York, May 2nd, Riverdale Country School


Willingham's session was a grand finale. When conference organisers like me all meet up at our private members' club in Pall Mall to smoke cheroots, we call people like him 'anchors' - names that will hold people at the venue when otherwise they might drift off to other attractions. It's why central London gigs look like graveyards by 4pm, because shopping and pubs. He's well known for his book 'Why don't students like school?', which deals with memory and learning, and the features of these for which we have strong evidence bases. As far as I know, no one seriously contests very much of what he says in his own field - for example, the claim that there is no strong evidence base to suggest using learning modalities helps children learn any better. Something as uncontroversial as this still finds resistance when you speak to people who have, for example, used them for years. Me, for example; I didn't like to think I'd been mistaken for years, and it took me a while to shake it off, shake it off.


Which brings us to his topic at the conference: persuasion. Why do we believe what we believe, and how can we use this to have useful discussions about topics we feel strongly about? He certainly wasn't trying to show people how to hustle. I've been on courses like that - I was trained to use NLP, Bandler and Grinder's 'science of success', which promised near-magical Jedi powers to persuade. It was mostly rubbish, and many rightly queried the ethics of what amounted to a formalised attempt to trick people into agreeing with you. It's also why you already agree with everything I say. Wait a minute.
No, Willingham's session was about how we can attempt to get past all of our emotional and instinctive biases and have meaningful discussions that aim at truth rather than politics. It went a little something like this:


1. It's a bad assumption to think that people's beliefs are motivated solely by a desire to represent the world accurately.

2. Beliefs maintain our self-identity.

3. Beliefs protect values we deem important or even sacred.

4. Beliefs regulate emotion.

5. Beliefs maintain social ties.

The takeaway was that we often cling to our beliefs for many more reasons than simply 'they accurately represent the world.' This is such an important point. I spent years in nightclubs trying to convince addled men and women not to glass each other, and how you make them feel is often more important than the facts you tell them. It works both ways; if you alienate someone with needless antagonism, they're far less likely to nod along to your beat.


Willingham was doing something he's very good at, and many academics aren't: summarising and highlighting useful lessons which cognitive psychology can teach to teachers - all teachers, be ye progressive bringer of light or neo-trad axeman. He took pains to describe how these biases and filters affect everyone. In point (1) above, he emphasised: 'this is all of us, we all do this.'


And he's right. Everyone has to struggle to see beyond the frame we draw around the world, and peer outside. Me. You. Daniel Willingham. Speaking for myself, I have to ask, what articles do I usually read? And the harsh truth is that usually if you agree with what someone says you're far more likely to read them, or click on the link. If not, why would you invest the energy and time to read something that potentially could disprove what you already believe? It's painful to actually expend effort to prove your strongly held beliefs wrong.


Cognitive dissonance.


He also discussed the well-documented phenomenon whereby we reconcile the discomfort of two apparently conflicting views by making them 'fit' so they are less painful. This discomfort is called cognitive dissonance.


For example:


  • I believe I am a kind person
  • I am having an affair
  • But having an affair is unkind
  • This makes me feel uncomfortable. How can I be kind and have an affair?
  • It must be the case that my partner doesn't understand me. I am not unkind. I am reacting reasonably to a condition of neglect.

We see this in education a lot. We see it a lot everywhere. I used to believe that children had learning styles. But using them in lessons never seemed to help very much. My experience was at odds with my belief. Solution? I must be doing it wrong. I redoubled my doomed efforts. Repeat, rinse, wring.


(Soapbox moment: it's why, just over a decade after the Iraq war - which left over 100,000 people dead, for no obvious reason - few people seem that bothered about it in the UK. It's simply too painful to realise that we were party to mass murder and a good old land grab. It's easier to ignore it, just as it has been easier for people throughout the millennia to forget or ignore what our governments do and we allow, or condone by our silence. Soapbox ends.)


Confirmation bias


This is the tendency we have to pay attention to new evidence that confirms and reinforces our existing beliefs, and to discard or ignore that which does not.


It's the reason why people on Twitter retweet what they agree with and ignore what they don't; why they follow those whose values echo theirs, and don't follow those who don't; why they forgive their friends far more quickly than they forgive their perceived enemies. We all do it. It's a human trait. The trick is to admit we do it, like alcoholics standing up at a meeting, and try, every day, to be better than yesterday. If you watched the Leaders' debates on TV, did you think Miliband's stumble was more or less laughable than Cameron's career gaffe? Which poll do you latch onto and repeat - the one where your party is 1% ahead, or that of your enemy?


Did the London riots prove that there is a poor, marginalised underclass so excluded from mainstream society that violence was an understandable reaction? Congratulations, you are Owen Jones. Or did they prove that people are naturally selfish and, given a chance, will rob and plunder? Congratulations, you are Rod Liddle. Did the Olympic opening ceremony celebrate the NHS or deride its dismantling? It was another magic mirror that proved exactly what you wanted to believe.


This doesn't mean we're all trapped inside ourselves, slaves to our gut and our intuition; it does mean we have a responsibility to the truth - the idea that, somehow, there exists a state of affairs external to our minds, independent of our mere opinions, and not subject to our feelings alone.


Nothing he said was groundbreaking or new; this is standard cognitive psychology, well evidenced for decades. But his skill lay, once again, in presenting it in a useful and accessible way. The only other person of his standing I can think of who does it as well is Dylan Wiliam.


How to win friends and influence tweeple

OK, so how do we minimise our own biases? By comparing them, honestly and sensitively, with the experiences of others in a structured and systematic way. We look at what evidence exists. We analyse the provenance of that evidence. We weigh up its strengths and weaknesses, and make a judgement. In teaching, many people 'just know' their way works. But sometimes it doesn't. I once had a mentor who swore blind that children should learn almost exclusively in groups because 'they just should - it's obvious.' Even when her results came in, up and down, she would cling to her methods, regardless. They were so important to her. I know people who lecture at their classes all day long and feel the same. This is something that troubles every point of the pedagogical spectrum.


Some people just know Learning Styles exist. Some people just know the Mantle of the Expert works. Some people just know phonics is best. Some people just know repetition and rote are the keys that embed learning. Sharing and comparing our experiences in a structured way is how we escape this gravity.


ResearchED isn't about arguing for the supremacy or primacy of RCTs in classroom practice. It argues for the use of honest and careful evidence to support teaching methods and interventions where it is available, and in context. It also stands for healthy and hearty scepticism towards research that masquerades as evidence, that gets above itself, that seeks to promote or hustle methods for its own gain. I started it because so much evidence I'd seen was so poor, not the other way around. I wrote a whole book about bad science in education, which sounded a warning bell against the mindless adoption of research just because it was there.


Above all, it stands for the restoration of the professional teacher, who has access to good evidence and interprets that evidence in the context of his or her own circumstances, classrooms and children. It inoculates us against dogma and bias, because it challenges us to interpret our experiences in light of the evidence, and the evidence in light of our experiences. It's a support, not a leash.


Group hug not group think

He also advocated empathy: 'recognise that the person you're trying to persuade has another view which they believe is well reasoned.' Also very true. It's sad when you hear people accusing those with whom they disagree of being stupid, or evil, or lazy, or blind to the truth. Funnily enough, no one ever believes that about themselves, and rarely about people with whom they agree. It's always the other guy. The UK is in the grip of election fever right now, and practically every beat and chord struck in public falls victim to the sin of ignoring this.


Some closing wisdom from Bhagwan Dan: 'Be selective with your battles; sometimes peace is better than being right.' Amen, Brother Willingham. I think this should be hung above the door as you enter Castle Twitter.

Willingham's points, many of them hardly new or innovative, are nonetheless worth repeating, and I think vital for educators throughout the research ecosystem to remember, whether they be producers, consumers or anything else. Your experience is vital, but should be sharpened by the challenge research offers. Research can be powerful, but must accommodate the collective and accumulated body of wisdom that actual practitioners possess and researchers frequently don't.


But then, I would say that. I'm biased.

from Tom Bennett - Blog http://ift.tt/1EUC0Ut
