“Working out what works” ran the strapline on the wonderful ResearchED 2013 conference held yesterday. I have the t-shirt to prove it.
Here are a few things I learned during the course of the day that didn’t work.
Tom Bennett’s plan to have name badge inserts ready before Friday evening didn’t work. Or rather it did, but they got left at school and he had to type them all out again. No matter.
Laura McInerney’s idea to put the newly typed name tags in the holders upside down didn’t work. Neither did Alex Weatherall’s idea to put them in facing inwards rather than outwards. Or rather it did, because the laughter helped the helpers to bond in a potentially stressful hour preceding the day. No matter.
My mobile phone’s reception didn’t work in the lower common room for Frank Furedi’s talk. Or rather it did, because in making notes rather than tweeting I focused more on the content which you will see has allowed me to write this post. No matter.
My plans to meet such twitter luminaries as Debra Kidd, Ruth Kennedy and Barry Smith didn’t work, even though I was in the same room as each of them on a number of occasions. Or rather it did, because we were all too engaged in listening and learning to turn those plans into reality and we all met others we never expected to meet. No matter.
Ben Goldacre’s brilliant (he assured us) PowerPoint slides didn’t work because of a technical glitch that might have been sorted had he been there on time. Or rather it did, because he got some of his best laughs and strongest connections with his audience because of, not in spite of, this fact. No matter.
And so on. And so on. And so on.
At times things don’t work when you least expect it. At other times things work when you least expect them to. In all of the above cases confidence, professionalism, the ability to make judgment calls and, sometimes, sheer good luck or serendipity, saved the day when plans and expectations failed.
And so it is in our classrooms. How many of you have had two classes from the same year group, taught them the same thing and had one lesson turn out beautifully and the other turn out disastrously? How many of you have taught the same things to two equally attentive students and had one perform disastrously in the examination whilst the other aced it? How many of you have had a fabulous lesson ready to teach, been waylaid by an interesting question and then proceeded to have one of the most memorable tangential learning experiences with your students?
So why are we looking so hard at working out what works? Especially when we know that “what works” is always going to be a site of disagreement. If you don’t believe me put a categorical statement out on twitter about any aspect of behaviour, pedagogy, curriculum or leadership and see what comes back to you.
Let me give you some brief explanations of what I mean based upon the sessions I attended.
Ben Goldacre
I’ll be honest up front. Not a big fan. I was, until I read the paper on RCTs in education commissioned by the DfE. In my opinion it rested on an arrogant set of assumptions about the current quality of teacher self-reflection and educational research: a poor diagnosis followed by the metaphorical prescription of horse tranquillisers for a human patient. Safe to say that I wasn’t expecting much and (maybe this is confirmation bias on my part – an inevitable element of all research perhaps?) he very much lived down to my expectations.
Granted, he added to his prescription of RCTs with a further recommended dose of Research Reading Clubs, but the talk felt like a triumph of style over substance with many bloggers already noting the lack of depth about how RCTs might actually be conducted. In many ways it felt like a Govean haranguing, with the neat transplantation of ‘dinosaurs’ for ‘enemies of promise’. I don’t really have much more to add, which in some ways says it all.
Joe Kirby & Rebecca Allen
One of the highlights of my day was meeting Joe. Such a lovely man and I hope to catch up with him again. In some ways a real antidote to Goldacre for me: humble, inquisitive, receptive to alternative viewpoints. He began the session with some statistics about whether Teach First is working or not, with very much a focus on the experiences and opinions of those involved in the programme. This was neatly counterpointed by Rebecca’s second half looking at comparable outcomes between TF and non-TF schools and TF and non-TF departments.
Together they were eloquent advocates for the programme and some of their evidence was convincing that TF is working. Rebecca’s responses to online comments about her research were honest, reflective and self-challenging. There were, however, two big buts emerging from this session for me (and if you’re giggling at that, you’re my friend forever). The first is that both pieces of research were carried out by people who I suspect want to show that TF is indeed working: Joe as an ambassador for it and Rebecca’s co-researcher as someone who (she admitted) had moved into the administration of it. The second ‘but’ was because of the admittedly small impacts of the programme and the recognised need to conduct further research against other ITT programmes and into whether TF is working well enough given the relatively high costs of it.
As an aside, am I the only teacher or school leader who worries about, or is frustrated by, the seemingly ubiquitous conclusion to research work that “more research is needed”?
The session finished and I went to thank Joe for his contribution. He disarmingly asked me for my opinion about TF and I gave him it with both barrels: I don’t care how people come into this profession, just that they do and are bloody good at it. That’s what works for me.
I was very torn about this session, wanting to see so many people delivering talks at the same time (another thing about ResearchED 2013 that didn’t work, but then only the invention of a self-duplicating machine could have solved that problem). In the end the lack of a breakfast meant that I moved nowhere and then, I must confess, that I left the session early to seek sustenance. My apologies to Brian for that, but the research proves that a man my size must be fed or collapse.
Juliet Brookes & Katie Nicholson
Despite the competition these guys faced this was one of my must-see sessions as it was about the role of R&D in Teaching Schools, a key part of my current role. I wasn’t disappointed, leaving the session with some sharpened ideas and some new ones, as well as some useful contacts.
It was good to see someone from the NCTL (Juliet) leading in this area, although I have concerns about the focus of the main RCT already underway, the rather disturbingly titled “Closing the Gap: Test and Learn”. I don’t know whether it’s the connection of testing and learning like this, or the fact that the word ‘test’ comes first, but I hope that the reality of the project is far more enriching than the moniker suggests.
The work of Katie’s Harrogate-based Teaching School Alliance looks like it is, or could be, excellent and I was really impressed by the number of links with universities (and their geographical breadth: as far apart as Exeter and Hull). From the presentation and the responses to questions afterwards, it was clear that these links will need to be further embedded and I’m sure that this is a key focus. But, as someone pointed out later in the day, how sure are we that these teacher/researcher partnerships will be able to flourish whilst the Westminster architecture still has them in different departments and whilst government pits an education system against a university system for diminishing funds? These are more than “leaves on the line” questions: they are ruddy great boulders that could derail the purpose of all who attended ResearchED 2013.
I don’t think I was alone in seriously looking forward to Sam’s presentation. He is a highly intelligent man with whom I disagree on much, but he has one of the most decent presences on twitter I have come across. He takes criticism beautifully, puts his point across non-threateningly and almost always responds engagingly with anyone who cares to challenge him.
His analysis of the relationship between policy and research was to the point and his insights into the inner workings of Westminster were eye-opening in places, eye-watering in others and eye-popping in yet others.
Alas, it was his slide about the limitations of politicians and civil servants when using research in creating policy that was just too damning to leave me with anything but concern for a genuinely evidence-based approach to policy formation. He tried gamely to suggest remedies to these limitations (electoral cycles, party politics, antediluvian structures and practices, tenure for the least able alongside endless departmental switches for the most able), but greater data transparency et al sounded a bit like placing a few Cnuts along the shoreline. Even his response to my question about a Royal/Chartered College of Teaching, that it would require masses of funding and political will, left me despondent in light of his earlier slide.
A great shame. I was kind of hoping that this window into Westminster might have offered more hope for the future.
I was so glad to finish my day with David’s session, which had more laughter in it and more vocal agreement from the teachers in the room than any of the previous sessions I had been to. He has the presence of a teacher I’d have loved to have been taught by or work with. He has the ability to convey technical information from the research sphere with simplicity and absolutely no patronisation. Most importantly, though, he has a healthy scepticism about some of the claims made about research by other more ardent people.
He began with a very welcome deconstruction (but not destruction) of the overly revered ‘Visible Learning’ work of Hattie, notably the methodology underpinning ‘effect sizes’. The bathwater thus disposed of, he invited us to look closely at the baby left behind; and that baby is feedback-shaped.
He went on to discuss the limitations of research in schools, a not dissimilar approach to the one taken by Sam with politicians, and yet this evaluation didn’t feel like a deal-breaker (maybe, I’ll admit, because I work in schools and know the context better). But it was in his suggestions for ways forward that I found most hope and most resonance with my own thinking. Rather than quote at length, let me show his slides on these issues.
In short he advocates a classroom-focused approach to research in education that walks the tightrope between what has already worked elsewhere and what has already worked in our schools. Perhaps more importantly, he suggests a new system architecture (not just the information architecture advocated by others during the day) that brings teachers and researchers together in a more symbiotic relationship: sign me up for that now.
David’s talk was the empirical bookend at the close of a day whose theoretical bookend had been provided at the start by Frank Furedi. It is his words that provide the title for this post and I’ll strain the courtesy of the logically-minded by finishing with his words from my first breakout session, unaccompanied by any evaluation of mine as I think they speak for themselves. Some are paraphrased, simply because so much he had to say rang true that I couldn’t keep up. I hope I’ve captured the gist.
Evidence has very little to say to us. We are turning to a dark science.
Science inappropriately exported into other domains of the social experience.
We’ve turned science into scientism: a quasi-religious belief system.
The problem with RCTs is twofold. They don’t work and they distract teachers from the real challenges they face.
Dangerous to take a methodology from one domain to another. Pedagogical research has to be organic and integral to pedagogy.
RCTs based upon the premise that children are a bit like patients. Children have deficits that need to be put right.
The pathologisation of the classroom.
Something insidious about the language of interventions. I’m not intervening. I’m teaching. I’m educating.
A lot of implicit pathologising of children at a very early age through an interventionist approach.
“If we adopt an instrumentalised approach to learning then ‘what works’ becomes its own imperative.”
Instead, what is it that children need to know? That’s where research should come in.
“Pedagogy always involves an element of rationalisation…but when you rationalise education you run the danger of decontextualising learning.”
Evidence-based approach leads to hyper-rationalisation of education.
Education is not something you should reduce to homogenised, bite size chunks. It is a more fluid enterprise.
I’m a Pedagogic Pluralist. Some teachers can teach brilliantly using phonics. Others might do it a different way.
What works for some teachers doesn’t work for others. It is dependent upon subjective relationships.
Instead of focusing on evidence the profession needs to cultivate confidence in its own professional judgement.
Teachers shouldn’t be worried about being super researchers but about getting better at behaviour and subject knowledge.
The capacity to judge – making judgment calls – leads to the forging of really important research questions.
Summary of ResearchED 2013
Furedi called himself the ‘antichrist’ of the programme (perhaps more than a little of the showman in that!) and wondered at a number of points whether the whole room might be against him. I suspect, judging from my own thoughts and Debra Kidd’s blogpost on the day, that he was far from alone. RCTs have become the ‘GoldenBalls’ of the research in education debate and, with Goldacre as the keynote, I wondered whether the conference would be pedagogically pluralist enough to suit my tastes, or whether pathologisation, homogenisation and hyper-rationalisation might be the only research nutrition on the menu.
But Furedi was wrong. He wasn’t the antichrist at all. He was just another voice at an incredibly pluralist event.
And I was wrong. There was something for everyone in this banquet of professional learning cooked up by Tom Bennett and beautifully sliced and diced by his sous chef Hélène Galdin-O’Shea.
And to be honest, isn’t that the way with both classroom and academic research? Aren’t we always striving for ‘what works’ whatever its recipe and whatever its flavour? Of course, we all have our own tastes and all like to season it in our own fashion. So long as we all keep coming back to the research table for more, like unrepentant and uncowed Oliver Twists, who cares what works for Old Andrew or Rachel Jones or Fearghal Kelly so long as it works for them and for the students that they teach.