Mandarin makes sense for children and schools
By Blog Editor, IOE Digital, on 6 June 2014
Since the Prime Minister’s visit to China in December 2013, there has been more talk in the press not just about the rise of China, but also about the teaching of Mandarin Chinese. Does teaching Mandarin make sense for schools – for students, MFL departments or headteachers? From our perspective here at the Institute of Education (IOE), it certainly does.
Let’s take students first. Learning Mandarin Chinese has the same excitement for young people as any language, but more so. When you are 11, learning to conjugate sein or avoir can seem to take forever, but Chinese verbs don’t conjugate and there are no tenses. So the verb ‘to be’ is 是 and it is the same whether it is linked with I, you or they; to make 是 into a past timeframe, you just precede it with 昨天 (yesterday). Progression is easy.
The standard mantra is that characters are hard. This is not true: they often build up logically. Let’s look at a few. Note that they all have the same first character, which means electric.
电脑 (electric brain), 电话 (electric speech), 电视 (electric vision), 电影 (electric shadows).
I leave the reader to guess the English meanings, but answers are at the end of this blogpost. More of this approach to decoding characters can be found at www.chineasy.org. Take a look.
For students, learning Chinese is enriching culturally. Study of the language and culture enables an entirely different way of looking at things. Students move quickly beyond the Paul Merton in China comprehension of the country (i.e. that everything is weird) to a more complex understanding of aspects of an ancient culture and its artefacts, the ever-popular martial arts and a thriving contemporary cultural scene.
Does teaching Chinese make sense for the MFL department? It makes no sense not to. The students love it, and there is plenty of evidence, albeit anecdotal (for we have yet to push forward on a substantive research agenda), that students who have not hitherto been motivated by language learning enjoy Chinese too. Nearly everyone is starting together at the beginning with characters. Characters are constructed from components; students who enjoy shapes and looking for similarities are intrigued by the puzzle. There are no endings, so characters can be manipulated to make sentences easily.
At GCSE, where Mandarin Chinese is embedded in a school, results are often amongst the best in the MFL department. There are now good teaching materials available, written by teachers teaching Chinese in this country. There is a strong Chinese teaching community of native and non-native speaker teachers, which is mutually supportive. There is an Annual Chinese Conference for school teachers of Chinese – this year’s (the 11th) is on 6 June at the IOE with over 270 registered to attend.
Why should headteachers make room for Mandarin? Gone are the days when it was impossible to find a teacher. Mandarin Chinese teachers are doing PGCEs and coming into the system.
There is a chance to give your school an edge. Schools which offer Mandarin Chinese are finding that this is a ‘key draw’ for applicants. Some schools, for instance Kingsford Community School in Newham, have found that a focus on Mandarin and Chinese culture for all has also proved a cohesive force for the school community.
Headteachers need to prepare their students for the future. As Leszek Borysiewicz, vice-chancellor of Cambridge University, said in the Guardian this week: “In an international world of tomorrow, I’d love to see more children in Britain having more than one language to be able to fall back on.”
There is no reason not to add Mandarin Chinese to the range of languages on offer in schools, and there are plenty of compelling reasons why it is a very good idea.
Katharine Carruthers is director of the Confucius Institute at the IOE. This post was first published by The TES as part of Languages Week.
Answers to the character quiz: 电脑 (computer), 电话 (telephone), 电视 (TV), 电影 (films).
Why do grammar school systems increase inequality?
By Blog Editor, IOE Digital, on 5 June 2014
By Lindsey Macmillan (IOE), Matt Dickson (University of Bath), Simon Burgess (CMPO)
The role of grammar schools is still a hotly contested topic in education policy in England. We contribute to this debate by showing that earnings inequality is higher under a selective system in which pupils are allocated to secondary schools based on their performance in tests at age 11. While selective systems have declined since their heyday in the mid-1960s, a number of areas retain a selective system and some believe that this system should again be expanded.
In our recent paper, we moved away from typical questions around grammar schools such as whether access to them is fair (it isn’t) and what the impact of grammar schools is for the marginal student (debatable), to ask about the longer term impacts of these types of systems on earnings inequality.
Using a nationally representative panel data source, Understanding Society, we considered the adult earnings distributions of over 2500 individuals born between 1961 and 1983, comparing those who grew up in an area operating a selective schooling system to those who grew up in very similar areas operating a comprehensive schooling system.
We ensure that the areas we are comparing are very similar by matching areas that are comprehensive to selective areas based on the average hourly wage, unemployment rate and proportion of private schools in both areas. The rich data source also allows us to control for things that may be driving the choice of area and the later earnings distributions, such as parental education and occupation when the individual was 14, gender, age, ethnicity and current area of residence.
We therefore compare the adult earnings of people who have very similar characteristics, live as adults in very similar areas and grew up in very similar areas: the main difference being that one area operated a selective system and the other a comprehensive system.
When we consider these two groups, then, we see that earnings inequality is greater for those who grew up in areas operating a selective system compared to those who grew up in comprehensive areas.
Comparing individuals of similar characteristics, the variance of earnings (2009-2012) for those who grew up in selective areas is £29.22 compared to £23.10 in non-selective areas. Put another way, the difference in pay between those at the 90th percentile of the wage distribution and those at the 10th percentile for those who grew up in a selective system is £13.14 an hour compared to £10.93 an hour in comprehensive systems.
On a personal level, if you grow up in a selective system and end up with earnings at the 90th percentile, you earn £1.31 more an hour (statistically significant) than a similar individual who grew up in a comprehensive system. At the other end of the scale, if you grow up in a selective system and don’t do so well – earning at the 10th percentile – you earn 90p less an hour (statistically significant) than a similar individual who grew up in a comprehensive system.
We can also compare the 90-10 wage gap between selective and non-selective areas to the overall 90-10 wage gap in the sample. As noted, in selective areas the 90-10 wage gap is £2.21 an hour higher than in comprehensive areas. This accounts for 18% of the overall 90-10 wage gap in our sample. So selective systems account for a large proportion of inequality in earnings. The message is clear. Grammar systems create winners and losers.
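To make the arithmetic above concrete, here is a minimal Python sketch of how a 90-10 wage-gap comparison of this kind can be computed. The wage arrays and the way the ‘share of the overall gap’ is calculated are invented for illustration only; the study itself works from matched Understanding Society data and a fuller statistical model.

```python
import numpy as np

# Hypothetical hourly wages (in £) for two matched groups of adults.
# Illustrative numbers only, not the Understanding Society data.
selective = np.array([5.8, 7.1, 8.4, 9.9, 11.2, 13.0, 15.4, 17.9, 19.0, 21.1])
comprehensive = np.array([6.7, 7.6, 8.8, 9.7, 10.9, 12.1, 13.6, 15.2, 16.4, 17.8])

def gap_90_10(wages):
    """Difference in hourly pay between the 90th and 10th percentiles."""
    p90, p10 = np.percentile(wages, [90, 10])
    return p90 - p10

gap_selective = gap_90_10(selective)
gap_comprehensive = gap_90_10(comprehensive)
extra_gap = gap_selective - gap_comprehensive

# The blog's headline figures appear to follow the same logic:
# £13.14 - £10.93 = £2.21, and £2.21 is roughly 18% of the overall 90-10 gap.
overall_gap = gap_90_10(np.concatenate([selective, comprehensive]))
share_of_overall = extra_gap / overall_gap

print(f"Selective 90-10 gap:     £{gap_selective:.2f}")
print(f"Comprehensive 90-10 gap: £{gap_comprehensive:.2f}")
print(f"Extra gap in selective:  £{extra_gap:.2f}")
print(f"Share of overall gap:    {share_of_overall:.0%}")
```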
There are also interesting differences by gender. If we look separately at males and females, we see that males in selective systems at the top of the earnings distribution do significantly better than their non-selective counterparts (£2.25 an hour) while there is no difference for those at the bottom of the earnings distribution.
For females, the picture is the opposite. Females growing up in selective systems who do well look very similar to successful females from non-selective systems, but those who do badly earn significantly less (87p an hour) than their comprehensive system counterparts. We think this could be because boys were outperforming girls at school for the cohorts we consider, and so more males attended grammars and more females attended secondary moderns within selective systems, although we cannot observe this directly.
What lies behind these differences? Inequality in earnings comes from inequality in qualifications and these in turn might derive from differences in peer effects and teacher effectiveness between the systems. We speculate that in the 1970s and 1980s more able teachers might have been more effectively sorted in a selective system into schools with high attaining pupils. The evidence on peer effects in the UK is mixed but the evidence on teacher effectiveness points to this as a possible key mechanism.
Whatever might be driving this phenomenon, our research shows that inequality is increased by selective schooling systems. If this is combined with evidence that sorting within selective systems is actually more about where you are from rather than your ability, then selective systems may not be the drivers of social mobility that some claim. The pros and cons of a system which creates greater inequality will doubtless continue to be passionately debated. What we cannot ignore is that there are losers as well as winners in this story.
‘Knowledge exchange’ between researchers and practitioners must be a two-way street
By Blog Editor, IOE Digital, on 27 May 2014
By Louise Stoll and Chris Brown
Both of us are fascinated by how research finds its way into policy and practice. Most researchers hope their findings will be used, but engaging people isn’t always easy or straightforward.
It’s good to see an increase in initiatives focusing on this challenge – for example, the Education Endowment Foundation’s recent call for bids in relation to encouraging the uptake of research in schools. Many terms are used to describe the process – dissemination, knowledge transfer, knowledge mobilisation, research utilisation, to name a few. Whatever their intention, the message they can convey to practitioners is that researchers have the knowledge that practitioners need to receive.
Our attention has been caught, though, by the term ‘knowledge exchange’. This suggests a two-way flow in a more equal relationship, which makes a lot of sense. Everyone has their own knowledge and experience to share and research can enrich this, as well as pushing researchers to think again about what their findings mean in different contexts.
An R&D project, funded through the Economic and Social Research Council’s (ESRC’s) Knowledge Exchange Opportunities Scheme, has been giving us the opportunity to explore researcher/practitioner relationships in more depth. Over the last six months, along with our colleagues Karen Spence-Thomas and Carol Taylor, we have been working with Challenge Partners, a group of more than 230 state-funded schools across England that work collaboratively to enhance the quality of their teaching and leadership, with an ultimate aim of improving outcomes for children and young people. Challenge Partners (CP) aim to provide a vehicle for their schools to learn from the best of their peers.
Our project has been adding research into this mix. It’s exploring and learning about establishing an expanding and sustainable network of middle leaders (such as department heads, subject leaders and key stage leaders) across CP schools that can: exchange evidence-informed knowledge about effective middle leadership that changes teacher practice; track its impact; and find powerful ways to exchange the outcomes of their applied project work more widely within and beyond the partnership to benefit a broader range of educators. For a summary of our project questions and the project, see: ESRC Middle Leaders Project.
In workshops, we share both research findings and effective practice. These are then blended together to create new knowledge that middle leaders use to design and refine processes and tools to help them lead more effectively and track their impact. In between sessions, the middle leaders test new ideas and trial tools with colleagues and teams both in their own and in other schools. They do this both via face-to-face engagement and through social networking. With us they will also be developing processes to embed the notion of sharing high-quality research-informed practice between schools in their own networks and for practitioners in other networks. We have a parallel evaluation strand in which our researchers, researchers from two CP schools and the CP office are collecting baseline and follow-up information, and following project activities.
Partnership is absolutely critical. We co-designed the project with CP, are now involving middle leaders in planning and facilitating sessions, and are co-evaluating the project. Through this, we are trying to model knowledge exchange and collaboration by drawing on the expertise and practices of researchers, knowledge exchange professionals (a term used by the ESRC to describe people who help translate research findings) and practitioners. We hope this will increase the project’s potential to benefit the middle leaders and their colleagues and pupils.
Ours is a two-way relationship: we are learning from our partners as well as they from us, and we have combined our research knowledge, Challenge Partners’ prior experience and published knowledge, and the middle leaders’ knowledge. At times this challenges our thinking – we are tracking this as well – but we know that powerful professional learning does just that.
We will be back with an update in a few months.
Let’s not play fast and loose with language, especially when talking about illiteracy
By Blog Editor, IOE Digital, on 20 May 2014
By Brian Creese, NRDC (National Research and Development Centre for Adult Literacy and Numeracy)
Oxford English Dictionary: Illiteracy: of persons ignorant of letters or literature, spec. (in reference to census returns, voting by ballot etc.)
Wikipedia: Functional illiteracy is reading and writing skills that are inadequate “to manage daily living and employment tasks that require reading skills beyond a basic level.” Functional illiteracy is contrasted with illiteracy in the strict sense, meaning the inability to read or write simple sentences in any language. Foreigners who cannot read and write in the native language where they live may also be considered functionally illiterate.
The old adage suggests that there are ‘lies, damned lies and statistics’, clearly blaming the numerate for obfuscating the truth. I think this is a calumny; it is plain to me that it is the literate, and the way they subvert the meaning of words, who cause all the trouble.
Once upon a time we knew what words meant, and if in doubt, we could look them up in the dictionary. But these days words change so rapidly we can have little recourse to books for help. Here at the IOE I am surrounded by potentially explosive words which have deep and subtle nuances: is that deadline challenging rather than impossible? Are we teaching, supporting or delivering? Dare I have a brainstorm in this company? Do I mean English and maths (GCSE) or literacy and numeracy (functional skills)?
Michael Gove, in a speech to the British Chambers of Commerce, declared his intention to “eliminate illiteracy and innumeracy in Britain”. In a welcome burst of ambition he went on, “…in the same way as developing nations know they need to secure clean drinking water and eliminate malaria if their children are to flourish.”
So there is no doubting Mr Gove’s ambition. But is this a hard challenge? The thrust from the likes of NRDC has been to work with those with poor or very poor literacy skills. But we have never made illiteracy a prime target, because there really aren’t many illiterate people in the country. There are undoubtedly a few who exist largely off the radar – remote Traveller families perhaps, some immigrants who may be illiterate in their own language. But usual estimates are that well under 1% of the population is illiterate.
So eliminating illiteracy doesn’t look that tricky!
But perhaps Mr Gove is using a different definition of illiteracy? The Leitch Report, one of the fundamental drivers of basic skills policy for the last Labour government, defined something called ‘functional literacy’ as the skill level possessed by those with literacy skills of Level 1 and above (equivalent to GCSE D-G scores). Confusingly, the report defined ‘functional numeracy’ as the skill level of those with Entry level 3 skills and above (the average 9- to 11-year-old). This is, I think, the definition used by the National Literacy Trust, which suggests there are 5.2 million ‘functionally illiterate’ adults in England alone. The trust defines the ‘functionally illiterate’ as those whose skills are “at or below those expected of an 11 year old”. I assume this means those with literacy skills below Level 1.
Meanwhile, Shadow Employment Minister Stephen Timms recently pointed out that “one in 10 jobseekers lack basic skills.” Mr Timms, however, defines ‘basic skills’ as those of people on Entry level 1 or below (the level of a 5- to 7-year-old), a rather lower benchmark than used by Leitch and the charities.
Does any of this matter? I think it does. The term ‘illiterate’ is not neutral and certainly carries with it a series of expectations which could easily stigmatise an individual. The professionals may be happy to use ‘functional illiteracy’ as a label and have a clear understanding of what it means, but label someone illiterate in the real world and the expectation will be that they cannot read or write. Indeed, the formal definition (above) suggests that illiterates are ignorant of letters.
NRDC and other organisations working with those who do indeed have poor literacy skills know well the strategies used by people to ‘get by’ in the real world. Even those at the lowest formal levels can recognise words, and deduce some meanings. To suggest that the average 11-year-old is illiterate is similarly misleading, and rather dismissive of the achievements of that age group. They may struggle to read the Financial Times but this is hardly a meaningful definition of literacy (functional or otherwise). I would suggest that most 11-year-olds function quite adequately and do have reading and writing skills that are adequate to manage their daily living. I’m sure their teachers would never define the average Year 6 child as ‘illiterate’.
So while I clearly welcome Michael Gove’s support and determination to improve the literacy and numeracy skills of the adult population, I really think his use of language was unfortunate and misleading. Please, Mr Gove, pledge to help those adults with poor literacy and numeracy skills as much as you can, but don’t label them illiterate. That is something else altogether.
What is the problem for which MOOCs are the solution?
By Blog Editor, IOE Digital, on 14 May 2014
By Diana Laurillard, London Knowledge Lab
MOOCs – Massive Open Online Courses – have been grabbing headlines and conference time for a year or two now. It’s the very large numbers that attract attention. But are MOOCs solving any real, global education problems? They are certainly not solving the problem of providing the 100,000,000 university places now needed by young people in emerging economies desperate for HE – a need that will double by 2025. They are not the people taking MOOCs.
They are not solving the problem that in the US student loan debt is now higher than credit card debt; nor the problem that in the UK 40% of student loans will not be repaid. University fees remain high while graduate pay is still low.
Massive sums have been invested in these courses by universities and venture capitalists, but right now the main beneficiaries are those who need it least. The most popular MOOCs are in computer science, finance and psychology. They do attract large numbers – sometimes hundreds of thousands to one course. But the people most likely to stay the course and gain a free qualification are well-educated men in their 30s working in professional jobs. Research by MOOC provider Coursera shows that 85% of MOOC participants already have university degrees.
So the problem MOOCs succeed in solving is: to provide free university teaching for highly qualified professionals.
Consider another problem: achieving the Millennium Development Goal of universal primary education by 2015. UNESCO data show (PDF) that by 2015 there will still be 53m children out of school.
When attempting to address our most ambitious educational goals – especially those that are large-scale – it should be a professional habit always to ask “how can technology help?”
How do we reach these children? The answer is that we don’t, not directly. We focus first on developing the teachers. UNESCO estimates that we need 1,600,000 teachers to achieve universal primary education by 2015 (PDF page 223). Suppose we could use MOOC-style courses to provide teacher development for 10,000 teacher educators in the cities of developing countries? And each of those could use the same MOOC materials to train 10 teachers in the local towns? And each of those could train 16 local teachers in their villages? And they in turn could reach the children who would not otherwise have had any primary schooling…?
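A quick check of the cascade arithmetic behind those (deliberately hypothetical) tiers: 10,000 teacher educators × 10 teachers each × 16 local teachers each = 1,600,000 teachers – the number UNESCO estimates is needed to achieve universal primary education by 2015.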
Here at the IOE, we are making a start. Supported by the UNESCO Institute for IT in Education we are pioneering a MOOC on ICT in Primary Education. It’s due to begin on 27 May, and we have already enrolled over 4000 teachers, school leaders, policy-makers and other educationists from more than 50 countries. It will run for 6 weeks, and is built around case studies of good practice from around the world.
This is a professional development course for which the teaching methods currently used in MOOCs – videos, forums and quizzes* – are appropriate, because teachers are professionals who know how to learn, and can learn a lot from each other. These methods are not sophisticated enough for teaching children or even undergraduates in the developing world, which is why the beneficiaries are still the rich. But they may help to train the professionals who can begin to make the difference.
The demand for education will continue to rise; we cannot afford to scale up at the current per student cost, in any sector, in any country. And even at the modest cost of $49, our CPD MOOC is a stretch for teachers from developing countries**.
If we are to have any hope of reaching our most ambitious educational goal of universal primary education, we have to find innovative ways of teaching. MOOCs could be part of the solution, but only if we start focusing on the problems we have.
Free university education for highly qualified professionals is not one of them.
* However, the UK’s FutureLearn does have more ambitious plans for the pedagogy it will support.
**Recently we asked Coursera for differential pricing by country, and I was delighted to see in their latest roadmap that they are responding to pressure on this, and will introduce it soon.
Understanding impact: what does it actually mean?
By Blog Editor, IOE Digital, on 9 May 2014
Research changes lives. Somewhere along the way, every research project involves a question which is about making a difference to the way people think, behave or interact. In applied social science – the field in which the IOE excels – research sets out to understand, shape and change the world around us.
The idea of research ‘impact’ has become important in research management and funding. Universities are now held to account for the impact of their research, not least through the Research Excellence Framework. But ideas about ‘impact’ and how to secure it vary.
In popular accounts of science, research is sometimes described as ‘earth shattering’, as if it creates something like a meteorite crater reshaping the landscape for ever. An example might be Frederick Sanger’s development of rapid DNA sequencing in the 1970s, which has transformed practices across any number of fields.
But there are other images to describe impact. Not all research has what the LSE research team looking at impact call an ‘auditable’ consequence. They comment that research applied in practice is “always routinized and simplified in use” so that over time, the impact fades like ripples in a pond.
The University of the Arts talks of its research as having creative outcomes that enhance cultural life and provides examples of artworks which create possibilities and shift perceptions: ideas which float into the air like dandelion seeds.
The impact of some research is apparent quickly – though almost never as rapidly as the tabloid newspapers which almost weekly trumpet miracle breakthroughs would have us believe – whereas in other cases it can take decades before the value of research becomes apparent.
Not only does the IOE itself undertake research which seeks to have an impact, it’s also interested in understanding what impact looks like, what it means and how it happens. At a recent conference we explored the linked questions of research impact and public engagement: the relationships between research, policy, practice and improvement are things some of my colleagues try to understand.
The ESRC defines research impact as “the demonstrable contribution that excellent research makes to society and the economy“. This suggests three components: the demonstrable nature of the contribution, the quality of the research, and the focus on both society and the economy. Successful impact means holding all three in creative relationship: without any of them, the other two are diminished. Research which is not excellent will not make a demonstrable contribution; research which sits unpublished and unread will not, whatever its methodological sophistication, make a demonstrable contribution and so on.
Understandings of impact – or of knowledge mobilisation and its dynamics – have been transformed over the last fifteen years, as the barriers to, and strategies for, making creative use of research to impact more directly on people’s lives have become clearer and ways of engaging the public in the dynamics of research have developed. No research – however excellent – ever simply has an ‘impact’.
Richard Doll discovered that smoking caused lung cancer in the 1950s, but it took several years and active public health campaigns to change behaviour. In education, the gap between, say, research on assessment for learning (AfL) and AfL practice suggests that – like the idea of the droplet making ripples on a pond – the impact of research can quickly dissipate unless something active is done.
Research always needs mediating – or, to put it differently, research impact needs a plan. Academics used to talk about ‘dissemination’, but thinking has moved far beyond such models – “I research, you listen” – to more creative and nuanced understanding of the ways knowledge moves – and does not move – around organisations and society. We have learnt that while these relationships are complex, they can be managed effectively.
In the early days of work on research impact, thinking focused on ‘what works’, on the assumption that research could tell us what techniques have maximum effectiveness, and that this could in some way be taken to scale by more or less sophisticated dissemination techniques. We have become – individually, institutionally, collectively – more sophisticated than that, and we have done so quickly. We know that ‘how it works’ and ‘why it works’ are just as important and that the effort to link the worlds of research, policy and practice involves commitment and engagement from all parties. In Ontario, the Knowledge Network for Applied Education Research tries to draw key groups together to enhance knowledge mobilisation. Bringing about change in practices is never easy, as anyone who has ever tried to lose weight, get fitter or learn a new language knows.
There’s a nice social enterprise quotation: “success does not matter. Impact does”. The IOE is a socially engaged university. We care about the quality of the research we undertake, and we make sure that it is of the highest quality. But we care equally about the way our research shapes the society it is seeking to understand. We understand that research evidence will always be only one of the factors that influences society, and that other factors always intervene. But we also know that progress has been made in the past in this field and more can be made in future with persistent effort.
For us, ‘impact’ is not an artefact of the 2014 REF, nor an obligatory hoop through which to jump. There is a wonderful line from the 2008 strategic plan for Carnegie Mellon University – and very, very few university strategic plans contain quotable lines. But in 2008 Carnegie Mellon got it right: “we measure excellence by the impact we have on making the world better”.
How England’s emetic testing regime is causing new academic diseases
By Blog Editor, IOE Digital, on 6 May 2014
By Frank Coffield
Students are contracting a new disease – bulimia academica – defined as repeated bouts of bingeing on information and regurgitating it in exams. The pressures on students to obtain the best possible grades have become so intense that they feel forced to resort to ingesting large amounts of information and then, in government-induced bouts of vomiting, otherwise known as national tests, they spew it out.
The term – bulimia academica – is not being used lightly as that would insult those suffering from bulimia nervosa. Instead it is considered to be every bit as serious as its medical counterpart. Far from feeling better afterwards, students end up feeling empty and educationally malnourished. The students I’ve interviewed in FE and Sixth Form Colleges come to associate learning not with growing self-confidence and a sense of achievement, but with stress and self-disgust. Learning for them is reduced to the skill of passing exams rather than the means of understanding and coming to love the subjects they’re studying.
The cause of this new disease is no mystery. The increasingly punitive testing regime in England is responsible. Politicians from all the main parties will have you believe that it is robust and rigorous. It’s neither. It’s purgative and emetic and as such is both ineffective and inefficient.
This learning disorder is compounded by its equally distressing twin – anorexia academica – which affects individuals and the system. Some students become anxious about being seen by their classmates to be clever; they restrict their intake to bite-size chunks of information, which are easier to swallow – the educational equivalent of chicken nuggets. In their teenage years, they lose their previously keen appetite for learning, give lame excuses for repeated failures to learn and pretend to have studied when they have not; and pretend not to have studied when they have. They spend their time reading self-help books about study skills without ever acquiring any. This response may be the self-harming outcome of having been tested every year since they were five years of age, a regime which has turned their stomachs against learning.
The education system also shows symptoms of the same malaise, with some curricula driven by qualifications that have had the educational nourishment stripped out of them. Groups of students can be found in colleges discussing topics about which they don’t have sufficient knowledge to form opinions and so their learning remains shallow. We offer young people so-called ‘transferable’ skills and then discover they need to be in command of a body of knowledge before they can be either critical or creative.
What’s to be done? In a new book, published this month by IOE Press, called Beyond Bulimic Learning: Improving teaching in Further Education, I’ve scoured the research literature for (and tried out in practice) the most effective interventions; and I discuss the results. Even within the tightening parameters set by government, we can still work at transforming our colleges into learning communities and our tutors into experts in teaching, learning and assessment (TLA). Now that Ofsted has decreed that no college can be judged ‘outstanding’ without being ‘outstanding’ at TLA, the best response is for colleges to access the growing body of knowledge on TLA. And this book is devoted to showing how that can be done and is being done within some colleges.
Frank Coffield is an emeritus professor at the IOE
This post has been re-blogged from IOE Press blog
A book launch for Beyond Bulimic Learning: Improving teaching in further education will take place at Blackwell’s Bookshop at the IOE on Wednesday 7 May 2014 at 6 p.m.
Local heroes? Labour’s plan for a ‘middle tier’ should be seen as work in progress
By Blog Editor, IOE Digital, on 2 May 2014
By Ann Hodgson and Ken Spours
The latest output from Labour’s policy review tries to tackle one of the most difficult legacies of the Coalition Government: a highly fractured and privatised English education landscape. Accordingly, David Blunkett’s Middle Tier Review decided to take aim at the Achilles’ heel of the Gove education revolution – the centralisation of contracting with thousands of schools in the hands of the Secretary of State and the fracturing of the local landscape that, it argues, undermines standards and opportunity for all.
The political dilemma for Labour, however, was to avoid being seen as embracing the old world of the local authority or “creating wholesale upheaval and deconstructing the existing landscape”. Its answer has come in the form of a complex set of proposals aimed at creating coherence, consistency and collaboration in a reconfigured local landscape. Drawing on what it sees as the successes of London Challenge, as well as other examples of local good practice, the Review makes a total of 40 recommendations. However, Labour’s new policy framework on education governance arguably revolves around five key areas.
- The appointment by clusters of local authorities of Directors of School Standards (DSS) who will oversee local performance and institutional collaboration and will work with the National Office of School Commissioners.
- Local authorities to be responsible for a range of functions including fostering collaboration, representing parents, planning school places and championing the needs of vulnerable groups such as NEETs (young people not in employment, education or training).
- Academy chains to be regulated and inspected and schools will be free to leave them and to join other types of partnerships or trusts.
- Education Panels of local stakeholders to provide additional local oversight and accountability.
- The re-establishment of the National College of School Leadership linked to an alliance of teaching schools.
In addition, and as almost throw-away points, the report also suggests the need for a light-touch curriculum framework with room for local innovation and the establishment of a Curriculum Advisory Group. This would be drawn from across the political spectrum and report to the Secretary of State to overcome politicisation of the curriculum and to ensure that all students have an entitlement to personal development, citizenship and a sense of identity and belonging.
The report, Review of Education Structures, Functions and the Raising of Standards for All: Putting Students and Parents First, is not the easiest of reads. It is simultaneously both complex and technical, and vague and open-ended. Nevertheless, its strengths lie in its recognition of the need for the devolution of powers to the local level, for greater institutional collaboration, for a consistent approach to teacher professionalism and qualification, and for strategies to gradually knit together a local learning system; and in the promise of a more open approach to curriculum and innovative learning.
Interestingly, David Blunkett’s Foreword to the document focuses on learning, creativity and inspirational teaching rather than the substance of the report on educational governance. This could be seen to reflect where his heart really lies.
There are, of course, weaknesses. To some the document will still look very New Labour, with its reluctance to invest fully and politically in local government and local democracy – the central role of local commissioners to hold the show together; the rather vague and constrained roles of local authorities; the lack of substance behind the proposals for collaboration; and the possibility that Education Panels might turn out to be just talking shops. The document also broadly ignores post-16 education and colleges, even though it talks briefly about progression at 16.
However, there may be cause for a more benign interpretation. This was never going to be easy for Labour, given the ambition of the Gove organisational revolution, and David Blunkett has had to balance the advice from the different think tanks – IPPR and Compass – as well as the political preferences of Shadow Education Secretary Tristram Hunt. As such it should, we think, be seen as work in progress, with some promising proposals that will need elaborating and much discussion. There’s a lot to play for as Labour continues with its Policy Review and tries to be in a position in 2015 to put any of this into practice.
A fairer deal for top International Baccalaureate students
By Blog Editor, IOE Digital, on 30 April 2014
It seems that Leeds University and King’s College London have decided to become more generous to their applicants with the International Baccalaureate, lowering the IB grades they treat as equivalent to A-levels. Where once they asked for 39 points (out of a maximum of 45, which is rarely achieved), now they will ask for around 35. A good thing too. For too long now, the upper tier of universities has been far too snooty about the IB. Our research at the IOE showed that top universities did not appreciate how well their IB students were doing.
We compared IB and A-level students who made the same choice of university and subject. The principle we used is that the equivalence map between IB grades and A-levels should be such that IB and A-level students on average are shown to do equally well. We deployed the power of large numbers. Looking at all those students who graduated in 2010, we compared our equivalence map with the map that universities were in practice using (a minimal illustrative sketch of this calibration idea follows the findings below). We found that:
- In the middle ranks the universities’ guesses were broadly right: IB students performed similarly to A-level students in middle-ranking universities that recruited IB applicants with grades in the low 30s. At the bottom end, students who barely passed the IB (pass mark 24) performed a little worse than their fellow students who had been admitted with low A-levels.
- At the top end of the scale, however, universities’ IB students performed significantly better than those they mistakenly thought were similar A-level students. In other words, the ‘high-ranking’ universities were asking their IB applicants for grades that were too high. Any IB students rejected because they did not quite achieve the very high grades asked for (around 38, 39) could feel aggrieved, as they would have done at least as well on average at their chosen subject as accepted A-level students.
- We excluded Oxford and Cambridge from our analysis because of their intensive selection processes, which outweigh exam grades as a selection filter. Nevertheless, their admissions tutors frequently appear to ask for unreasonably high IB points (often as high as 42 at Cambridge) compared to the grades asked of their A-level students, most of whom have very little trouble in achieving the top A-level grades asked for, once given their conditional offers.
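To make the ‘equivalence map’ principle concrete, here is a minimal, purely illustrative Python sketch of the calibration idea: for each A-level entry score, find the IB points total whose holders achieve the closest average degree outcome. All records, scores and outcomes below are invented, and the real analysis of the 2010 graduating cohort is, of course, statistically more careful than this.

```python
import pandas as pd

# Hypothetical graduate records: entry route, entry score and a numeric
# degree outcome (e.g. 1 = third ... 4 = first). Invented for illustration.
students = pd.DataFrame({
    "route":   ["A-level"] * 6 + ["IB"] * 6,
    "score":   [280, 320, 360, 400, 440, 480, 28, 30, 33, 35, 38, 41],
    "outcome": [2.4, 2.6, 2.9, 3.1, 3.3, 3.6, 2.5, 2.8, 3.0, 3.2, 3.4, 3.6],
})

a_level = students[students.route == "A-level"]
ib = students[students.route == "IB"]

def equivalent_ib_points(a_level_score):
    """Return the IB points total whose holders have the closest average
    degree outcome to A-level entrants with the given score."""
    target = a_level.loc[a_level.score == a_level_score, "outcome"].mean()
    mean_outcome_by_points = ib.groupby("score")["outcome"].mean()
    return (mean_outcome_by_points - target).abs().idxmin()

# Which IB total 'matches' the top A-level band under this principle?
print(equivalent_ib_points(480))  # -> 41 with these made-up numbers
```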
Now don’t get me wrong: it is quite easy to misread research of this nature, thinking that it proves the IB is a better preparation for university than A-levels: it does not. As anyone with any contact with the IB knows, on the whole the IB has attracted more academically capable students over the years. Of course, more academically capable students do better academically! The argument that the IB is best might be valid, but it would be difficult to set up a statistical test, not to mention a quasi-experiment, that would seriously evaluate that claim. It is very difficult to find adequate ‘controls’ that mimic the counterfactual case for a school student who opts for A-levels instead of the IB.
There’s no such thing as ‘best practice’
By Blog Editor, IOE Digital, on 24 April 2014
For over 30 years a central plank in the reform programme for education of all governments has been the strategy of identifying and disseminating ‘best practice’. There’s only one thing wrong with this approach: there’s no such thing, but the FE and Skills sector is saturated with the term.
I first began to doubt the strategy when watching with student teachers a video of an ‘outstanding’ teacher working with a small group of well-motivated and impeccably behaved pupils in a sun-lit classroom. Were the students inspired by the ‘best practice’ of Miss Newly Qualified Teacher of the Year? On the contrary: they either pointed out that they were teaching not 12 middle-class pupils but 32 working-class students from a sink estate, some of whom were refugees with next to no English, or they worried that they would never be able to match the smooth, practised performance of the more experienced teacher.
In other words, the two contexts were so different that little learning was transferred or the expertise of the “outstanding” teacher was so far above their current level of performance that they felt intimidated. My attempt to spread ‘best practice’ was more like a con-trick played by the unimaginative on the unsuspecting, particularly because the students were left to work out for themselves how to transfer the ‘best practice’ of the video to their own classrooms.
Further reflection led me to the central weakness of the strategy: it builds up psychological resistance in those at the receiving end, because they are being told implicitly that their practice is poor or inadequate. If their practice was thought good or outstanding, why would they be expected to adopt someone else’s ‘best practice’? Almost certainly they think their practice is pretty effective; that’s why they are using it.
Besides, there are questions that need to be asked of all those pushing ‘best practice’. Who says it is? On what grounds? Based on what criteria? Would another observer looking at the same teaching episode agree that it was the best? Is this ‘best practice’ equally effective with all age groups and all subject areas? What are the distinctions between ‘good’, ‘best’ and ‘excellent’ practice, terms which are used interchangeably? These questions are not answered; we’re expected to take ‘best practice’ on authority, without evidence. There are no sure-fire, student-proof recipes for the complex, ambiguous and varied problems in teaching.
Luckily, there is a well-tested alternative – JPD – where tutors jointly (J) share their practice (P) in order to develop (D) it. In an atmosphere of mutual trust and joint exploration, they explain to each other their successes and struggles in teaching their subject. They then move on to observing and evaluating each other’s classroom practices in a supportive atmosphere which encourages the creativity of both partners.
JPD restores trust in the professional judgement of teachers because it does not undercut their current practice, as happens with the strategy of ‘best practice’, but rather it seeks to enhance it by opening it up to discussion with supportive colleagues. Both partners in the exchange play the roles of observer and observed, of being the originator and receiver of practical advice; and both roles are accorded equal status. This equality in the relationships between tutors in JPD goes a long way to explain why it is proving to be far more effective than ‘best practice’.
This is one of the main themes that I explore in my new book – Beyond Bulimic Learning: Improving teaching in FE – which is published this month by the Institute of Education Press. The rest of the book is devoted to showing how some FE and sixth form colleges are responding to Ofsted making teaching and learning the number one priority by introducing what the research claims are the most effective interventions, while dropping the least effective.
I shall explore here in a little detail two examples. First, I show how to harness the potential power of feedback; I say ‘potential’ because too often feedback has negative effects, and some types of feedback are more powerful than others. Many students are dissatisfied with the quality of the feedback they receive – e.g. what is meant by “Be more analytic”? Tutors too are frustrated by students who prefer to receive praise rather than being challenged to think more deeply. The research emphatically suggests that tutors use the strong definition of feedback, namely: if it doesn’t change students’ behaviour or thinking, it isn’t feedback.
Another chapter shows how Socratic questioning can change the culture of learning in classrooms and workshops. It’s a means of challenging students’ thinking in a non-threatening way; and it treats challenges from students as constructive contributions to dialogue.
Other chapters show how social media can motivate students; combine psychological and economic factors to explain students’ motivation; and they assess the impact of ‘flipped’ learning, peer teaching and peer assessment.
The final chapter addresses the question: “can we transform classrooms and colleges without first transforming the role of the state?” My answer is that we can improve the quality of teaching and learning and make our colleges more like learning communities even within the current constraints of government policy and declining resources.