The Optimus blog

The blog that inspires leaders in the UK education sector

Owen Carter

What does research-based classroom practice look like anyway?

A group of very intelligent people joined us at Optimus HQ to discuss one question: is research-based classroom practice realistic, and is it desirable? Owen Carter reports on the conversation.

Research-based practice is a term that has interested me for quite a long time. So it was a delight to bring together a group of teachers to discuss whether research can shape teaching, whether it should, and how it might.

On the face of it the idea is pretty uncontentious. Surely teachers should base their decisions on the best evidence available to them, and surely scientific inquiry is more reliable than personal intuition? There's something alluring about projects like the EEF toolkit – finally, classroom techniques in ranked order. Some clarity to the messy business of the classroom! But if this is all that research means to most teachers, it's not enough.

Any decent definition of research stresses the systematic nature of inquiry and the need to evaluate claims against the strength of the evidence behind them. Learning styles and Brain Gym, after all, were marketed as research-based innovations.

So for teachers to tell whether something truly is evidence-based, they need a critical frame of mind that doesn't simply accept things as received truths.

The politics of it all

Another issue is more deeply rooted. There's no doubt that the current of policy is moving in favour of research. Witness Nick Gibb saying things like 'our government believes in basing teaching, as far as is possible, on evidence'. Pretty hard to disagree. But the tendency of government policy is to quantify, measure and judge – and, almost inevitably, to create resentment among those being judged.

David Leat, Anna Reid and Rachel Lofthouse put this powerfully: 'If research engagement simply becomes another policy imperative, driven by an expectation of rapid, measurable school improvement and relentless expectation to raise standards, it is unlikely to achieve its potential.'

A relentless focus on 'what works' is likely to create compliance-driven checklists which deskill teachers and leave them less able to evaluate evidence independently. It's not as simple as 'what works, works'. Different things work better in different contexts, and engagement from the people at the chalkface is probably a significant factor in whether or not something works.

On the day, pretty much everyone agreed on the need for teachers to have 'decisional capital' – any intervention foisted on teachers is likely to work badly. So research and accountability need to stay separate.

Setting priorities

In 2010 Bob Lenz published a piece on Edutopia's blog titled 'Evidence that PBL works'. The study he discusses does indeed show a positive effect for project-based learning (PBL), with a large sample of 7,000 students taught by 76 teachers across 66 schools. His summary of the results is also accurate:

  • PBL students outscored their peers in the control group.
  • PBL students scored higher on measures of problem-solving skills and their application to real-world economic challenges.
  • PBL teachers scored higher in satisfaction with teaching material and methods than the control group.

But if we go to the study itself we find, on its 14th page, this warning: 'Since this study recruited a purposively targeted sample, these findings should only be generalized to teachers and schools where the economics program and the associated professional development are a priority.'

So the research is not evidence that PBL works. It is evidence that PBL works in some specific schools where the economics programme was central to the curriculum and a large amount of time – 40 hours, in fact – was given to economics-related professional development.

The problem here, as elsewhere, is that thorny and complex research projects get boiled down into a headline or a few bullet points, and people remember the headline, not the substance. Research gives us an indication of where to go; it doesn't tell us exactly what to do when we get there. That is where the expertise and judgement of the teacher come in. As Tim Taylor put it, 'educational research is complicated. Really complicated. So consider with extreme caution and apply using a critical mindset.'

Can research inform classroom practice? Should it? Leave a comment below or drop me a line.

With thanks to @rmlofthouse, @pedagog_machine, @hgaldinoshea, @powley_r, @Maths_Master, @Smichael920, @imagineinquiry, @nor_edu. Honourable mention to @kitandrew1.

