IOE Blog

Expert opinion from IOE, UCL's Faculty of Education and Society

In Defence of OFSTED

By Blog Editor, IOE Digital, on 3 February 2014

Chris Husbands 
No-one likes inspectors. The Daily Telegraph reports that TV licence inspectors are three times more likely to be attacked by angry householders than by angry dogs. In Montana, a furious meat processing company owner launched a physical assault on the food safety inspectors who had described his plant as “putrid”. RSPCA inspectors were assaulted almost 250 times in one calendar year. So Michael Wilshaw perhaps has some way to go, as criticism appears to come not only from schoolteachers but also – though strenuously denied – from unnamed briefers in the Department for Education, criticism which, he said, left him “spitting blood”.
School inspection in England has a long history. Her Majesty’s Inspectors were established in 1839, and the nineteenth-century reports of inspectors remain as invaluable a source on nineteenth-century education as the reports of factory inspectors are on working conditions. HMI developed world-renowned expertise in inspection, though their principal role was to provide information and advice to ministers: it was calculated in the 1980s that, at the then-current rate of progress, each school could expect to be inspected once every 250 years.
HMI was transformed in 1991, when OFSTED was established. Every school was to be inspected on a four-yearly cycle and – a critical development – the inspection handbook, which had hitherto been a closely guarded secret, was published as the framework for school inspection. Inspection arrangements, managed by OFSTED and overseen by HMI, were contracted out. The framework has been revised regularly since 1991, and the inspection cycle has been varied, but the principle remains the same: regular inspection based on published criteria. Other countries have also developed inspectorates, and Melanie Ehren from the IOE is leading a cross-national study.
There is little doubt that the twin measures of regular inspection and published criteria have exercised enormous influence on the system, and mostly for good. One of OFSTED’s early straplines was “improvement through inspection”, and the key idea of examining the performance of all schools on the same basis is one, albeit only one, of the measures which have helped to raise expectations of what is possible, of what schools can achieve.
The problems for OFSTED have often lain not in the inspection framework, nor in the principle of judgements: all the research on educational assessment and evaluation is clear that evaluative judgements based on public criteria matter. Instead, there have been concerns about variability in the quality of inspection teams, about the reliability of their judgements, about the interaction between a public inspection regime and an ever-tighter accountability framework, and about the very serious challenges of sustaining improvement in the most challenging of schools: “improvement through inspection” is a good mantra, but it has proved far more difficult to demonstrate in practice. Rob Coe from Durham University has identified the problems for OFSTED: inadequate training in classroom observation produces unreliable judgements about quality, and poor ability to interpret complex data makes it difficult for many teams to contextualise what they see. In a high-stakes environment, these weaknesses have profound consequences.
In all this, the issue is, perhaps, less inspection than the weight which is hung on it: as Melanie Ehren’s project is telling us, inspectorates can work in very different ways. As the reported disagreements between the Department for Education and first the Chief Inspector of Schools and now the Chair of OFSTED suggest, inspection is extremely important. It shapes the way governments, practitioners and the public think about the school system. There are some tough lessons from the history of inspection: there are always tensions between inspectors and policy makers; inspection judgements need to be nuanced as well as incisive; there are always limits to what inspection can do; inspectors stand in the perpetual militarised zone between those who would centralise education and those who would decentralise it. In practice, OFSTED really owns only one asset: its evidence base, still the most comprehensive and thorough evidence base on what happens in classrooms anywhere in the world. It is what makes OFSTED important and relevant, however uncomfortable its findings may sometimes be to read. The independence and integrity of the evidence base are of critical importance. It has been, and remains, a precious commodity in English education.

Print Friendly, PDF & Email

8 Responses to “In Defence of OFSTED”

  • 1
    3arn0wl wrote on 3 February 2014:

    :O Shocked, Dr. Husbands!
    Finland?
    Pasi Sahlberg?
    http://www.theguardian.com/education/2013/jul/01/education-michael-gove-finland-gcse
    I think there’s a need to track student progress – probably statistically – but I think that information should be shared only amongst “service providers”.
    What’s desperately required is a spirit of sharing and the non-intimidatory Advisory Service we had a generation ago. We’re a long way from that, and there’s no sign we’re even going in the right direction! :/

  • 2
    Chris Husbands wrote on 3 February 2014:

    I do like to be provocative. Remember that the core of OFSTED remains HMI.

  • 3
    teachingbattleground wrote on 3 February 2014:

    Reblogged this on The Echo Chamber.

  • 4
    Alasdair Smith wrote on 3 February 2014:

    If OFSTED is so important, why do they not have one in Finland? I understand their inspectorate was shut down in part because teachers were trusted with raising standards. And evidence suggests Finland has been more successful in raising standards.
    OFSTED’s evidence base may well be useful, but as every farmer knows, weighing a pig does not make it grow! I wonder if we spend more on OFSTED than on CPD?
    And even if we don’t, the process of separating inspection from professional accountability and turning it into a lucrative business does little to secure its independence or credibility.

  • 5
    Steve Shaw wrote on 3 February 2014:

    If the collection of data for the evidence base is ‘unreliable and of variable quality’, what value can you place on the one asset that you claim OFSTED has?

  • 6
    Professor Rita Jordan wrote on 3 February 2014:

    I agree in a general way but would want to make a plea for the education of those who require very particular understanding, i.e. those with autism spectrum disorders. For 10 years I chaired an accreditation programme for services for children with autism spectrum disorders and we found the quality and usefulness of that accreditation depended on the quality of the accreditation teams – they needed to understand autism and understand the service (schools or care services). In setting up the accreditation programme we first tried the tick-box approach that OFSTED uses but found that it just did not work. It proved almost impossible to quantify ‘quality’ in this way, but quality could be recognised as long as those who were looking were skilled and knowledgeable. But such a service proved very expensive to run and eventually was changed to a system much more like an ‘investors in people’ scheme, which measures how people conform rather than how they are creative in providing the best service for individuals. There are increasing numbers of professionals in all our schools (mainstream and special) who do have knowledge of autism and are committed to providing a quality service for those on the spectrum. However, they are always contacting me in what is often close to despair as OFSTED inspectors fail to understand what they are doing and why, or (even worse) senior management curb and narrow what they do to fit what they imagine is required by OFSTED. No-one can understand autism without training, so how can we expect OFSTED inspectors to make effective and fair judgements without that knowledge? Autism education has made great strides in the UK with lots of courses and guidance, but narrow, ignorant ‘quality’ assessments are in danger of setting all this back and destroying the morale of dedicated professionals.

  • 7
    primaryblogger1 wrote on 5 February 2014:

    Reblogged this on Primary Blogging.

  • 8
    Graham Holley wrote on 6 February 2014:

    In my view, Ofsted has been the single most important factor in raising standards in the last 25 years.
    That is notwithstanding all the angst caused to schools – which I understand – and the (marginal?) inconsistency of approach and judgements to which attention is rightly drawn here.
    Greater freedoms and autonomy at school level must be balanced by a robust accountability structure, and Ofsted is an important part of that.
    That said, part of me does still remember fondly the Royally-appointed inspectors of HM Inspectorate prior to 1991. They were based within the DfE of the day, while still having an independent voice. They inspected more to offer development advice to schools, rather like experienced consultants. They also helped the DfE to form policies, advising on practicability.
    One concern that I do have now is that the great wealth of data that is collected by Ofsted is not analysed and used as systematically as it might be to raise standards across the system as a whole.
