Government decisions about student fee levels, research funding and all the other aspects of higher education finance, and how individual institutions respond to these policy choices, now form a central aspect of the study of higher education, in the UK and elsewhere.
But it was not always so, and one of the scholars largely responsible for bringing these matters centre-stage academically in the UK was Professor Emeritus Gareth Williams, who was honoured at a seminar held at the IOE this week. The event was structured around the book of essays written to celebrate his contribution to the study of the economics of higher education. The international significance of Gareth’s work is demonstrated by the fact that the authors came from the United States, Germany and Canada, as well as from the UK.
As he remarks in his own commentary, Gareth began his work as an economist just at the time – the mid-1950s – when the economics of education was emerging as a distinct topic for study. He joined the IOE in 1984, after working for the OECD and then at the LSE. The Institute had by then become a leading centre for research in the economics of education under the leadership of Mark Blaug (1927-2011), a pioneering (more…)
Jamie Martin, a Leave campaigner and former special adviser to Michael Gove, has written a piece in Times Higher Education on how British universities could achieve “education leadership in a post-Brexit world”. Martin begins his article by giving the impression that he sees the Battle of Waterloo in terms of plucky little England standing up to a gang of foreigners. In fact, the Duke of Wellington led a proto-EU multinational army group that would have certain sections of the Press frothing with rage if it were even suggested today. In 1815, it was Napoleon who stood alone against a combined Europe.
The rest of Martin’s piece is a good example of what I suppose we’ll have to get used to from the Brexiters. From being told, pre-referendum, that there were lots of golden opportunities for the taking once we were freed from the EU’s iron grip (though specific examples were hard to come by), we’re now told that, fingers crossed, there may be ways round the (accurately predicted) difficulties that Brexit presents. It’s as if the UK had just drifted in from mid-Atlantic to find all these interesting things going (more…)
Ronald Barnett, Peter Scott and I have just finished editing a volume of essays in honour of our colleague Professor Emeritus Gareth Williams, one of the foremost contemporary economists of higher education. Much of Gareth’s work has involved the study of markets and market-type mechanisms in higher education.
While he has no objections in principle to market-based methods in higher education as ways of improving efficiency and equity, in a recent essay he sets out their limitations:
- markets will not provide long-term funding for programmes with uncertain returns;
- the inevitable knowledge asymmetry between higher education providers and users reduces their effectiveness (in other words, the providers know far more about the system than the students do);
- higher education has benefits that go beyond those gained by the people who take part in it;
- stratification (the creation of elites), rather than the differentiation that might be expected in a normal market, is a typical feature – among other difficulties.
Martin Trow (1926-2007) was a leading American scholar of higher education, probably best known for his work on the development of mass higher education in western countries in the second half of the twentieth century. In one of his many influential studies, he drew a distinction between British universities where what he called “hard managerialism” was to be found, and those where “soft managerialism” applied.
Trow’s hard version “elevates…management to a dominant position in higher education…business models are central to the hard conception.” By contrast, the soft managerialists “still see higher education as an autonomous activity, governed by its own norms and traditions, with a…management still serving functions defined by the (more…)
The Harry Potter library at the Karolinska Institutet in Stockholm – Sweden’s leading biomedical teaching and research institution – is not, as you might perhaps have imagined, a facility in the paediatrics department to distract young patients with the complete works of J K Rowling. It is, in fact, an informal student learning space, kitted out with faux bookcase wallpaper concealing a secret door, a cuckoo clock, and other Hogwarts paraphernalia.
Downstairs, another learning space resembles a billiard room, with a large rectangular table and billiards-style lighting over it. Next to it is the “New York nightclub”, dimly lit except for pools of light over small round tables. Leave that and you’re in a bright park, with wall and floor coverings in shades of green and park benches to sit on. Around the corner you find a street scene (“Should we have some (more…)
The University Grants Committee (UGC) was created in the immediate aftermath of the First World War, when the penny began to drop in Britain that universities were fundamental to what wasn’t then called the knowledge economy. For most of the twentieth century, the UGC and its successor bodies sought to guide the development of a high-achieving university system by funding institutional development and, at various times, using its funding role to orchestrate system planning. As UK higher education is now generally regarded as a world leader, you could argue that the UGC and its successors did a pretty good job.
Their role began to change from the 1980s onwards, when market mechanisms came to be seen as answers to public sector resource allocation questions. A quasi-market methodology was developed by the (more…)
Is the management of universities much different from the management of other sorts of big, complex organisations? In my new book, The Hallmark University (IOE Press), I argue that it is (or should be) recognisably different – although the best-run commercial organisations have many things in common with the best-run universities. Accounts of what it’s like to work at Google sound a lot like accounts of working in a university. (more…)
Interesting, isn’t it, how often bold, thrusting entrepreneurs end up asking for money from the taxpayer? Banks, obviously, but others too. The private (though non-profit) University of Buckingham was created in the 1970s to try to counterbalance what its founders saw as political interference in higher education, which in their view was an outcome of the public funding of universities. “Liberty”, you can read on the University’s website, “is constantly under threat from governments…who seek to over-tax and over-regulate”.
A pamphlet by Nick Hillman (2014), now Director of HEPI, the Higher Education Policy Institute, but until recently special adviser to the present Minister for Universities and Science, David Willetts, makes some pertinent observations about Buckingham, and UK private higher education more generally. For one thing, Buckingham’s UK and EU students can now borrow up to £6,000 a year as taxpayer-subsidised loans to meet their tuition fees – so, to that extent, Buckingham is now benefitting from the tax-and-spend profligacy of the current government in relation to student support. Even more remarkably, Hillman quotes Buckingham’s Vice-Chancellor telling a House of Commons Committee in 2011 that “our lives would be so much easier at Buckingham if…[we had] access to QR money [the funds that, in England, HEFCE allocates to universities to support research infrastructure] without having to subject [ourselves] to all the regulatory framework of HEFCE.” Vice-Chancellor, we feel your pain!
The present Government’s 2011 higher education White Paper (PDF) takes a lot of space discussing how to encourage the development of private higher education. What it doesn’t do, I have argued, is to describe with any clarity the problem to which private higher education is the answer. Hillman’s pamphlet helps here: he reveals that what he calls “alternative providers” successfully carried out “lobbying…focused on ensuring their students have access to student support” – that is, so students at what are mainly for-profit private colleges could take out subsidised loans to pay their fees. The lobbying was, in other words, to achieve the transfer of public money into private pockets. (I’m not saying that this is sharp practice: it is simply what “for-profit” implies.) This has been hugely beneficial to the for-profit sector: as Hillman notes, the number of their students claiming public support has rocketed five-fold in two years; and it is fair to assume that profits have risen on a similar trajectory.
As I’ve remarked before, the comparison with the waste of public money associated with the US for-profit sector (never mind the human costs) is depressingly plain. This isn’t, mainly, whatever Hillman thinks, about healthy competition between the public and private sectors (they serve largely distinct markets); or about introducing more diversity (for-profit colleges, pretty well everywhere, teach a limited range of popular subjects in traditional ways*): it’s about an ideologically driven programme that will waste public money – lots of it.
*see Kwiek, M. (2009), ‘Entrepreneurialism and private higher education in Europe’. In M. Shattock (ed.), Entrepreneurialism in universities and the knowledge economy: diversification and organizational change in European higher education. Maidenhead: McGraw-Hill/Open University Press.
This month marks the fiftieth anniversary of the Robbins report, or the 1963 Report of the Committee on Higher Education as it’s never called, and we’ll be marking it with a one-day conference at the Institute on 24 October. It’s an anniversary worth celebrating.
From today’s vantage point, we can see Robbins as an early indicator that higher education was moving from the periphery towards the centre of British national life – changing from picturesque adjunct to essential component. UK university student numbers had grown from 20,000 at the start of the last century to 118,000 when Robbins reported – roughly the combined student populations of Manchester and Leeds today.
The 1963 report marked a wider modernisation, one of a number of developments signalling a break with the immediate post-war years. Philip Larkin, Hull University’s Librarian at the time, later summed it up: “So life was never better than / In nineteen sixty-three / (Though just too late for me) / Between the end of the Chatterley ban / And the Beatles’ first LP” – though perhaps he wasn’t thinking primarily of the Robbins report.
Although the decisions about founding the 1960s new universities had been made well before Robbins was published (the first, Sussex, opened for business in 1961), it is significant that they are so often associated with the report. Like the report itself, they marked new ways of thinking about higher education: in curricular terms (Asa Briggs’s “new map of learning” at Sussex), in physical terms (leading architects designing bold new campuses), and in terms of the modestly enlarged entries to higher education that followed their establishment, with the creation of the polytechnics coming shortly afterwards. Suddenly, higher education looked and felt different.
Parallel modernising changes were taking place in the schools system, overseen by two outstanding ministers: Edward Boyle, the Conservative Minister of Education from 1962-64, and Anthony Crosland, the Labour Secretary of State from 1965-67. As Maurice Kogan notes in his book about the two men, The Politics of Education (1971), they directed the transition from “the assumptions of pre-war education psychology [to those of] the post-war radical sociologists about the extent to which ability could be reliably predicted”. Crosland’s Circular 10/65, which “requested” (not a word now much favoured by Secretaries of State) local education authorities to prepare plans to abolish selection in secondary education was a result. The Robbins-validated expansion of higher education was another response to this dawning realisation (to quote Kogan again) “that access to the more favoured forms of education was differentiated according to social class” – not to ability.
Robbins cleared the way intellectually for the expansion of higher education by driving a stake through the heart of the “more means worse” argument – though, like one of the undead in a cheap horror movie, it nevertheless emerges regularly from its grave. What Robbins’s research showed about what he called “the so-called pool of ability” was that entry to university largely depended not on innate ability, but on your father’s occupation: 45% of children whose father was in a “higher professional” occupation went into full-time higher education, compared with about 2% from families where the father was a manual worker. It was your dad’s job, not how bright you were, that determined whether or not you’d go to university.
The higher education system that we see in Britain today would amaze Robbins in terms of its scale and scope. But its foundations were laid by the work of his committee, and others pressing for social and educational change in the early 1960s. We should use this anniversary to honour their work.
This post is based on an article previously published in Times Higher Education.
A recent biography of J Robert Oppenheimer (there have been six since 2004 alone), Inside the Centre: The Life of J Robert Oppenheimer (2012), by Ray Monk, Professor of Philosophy at Southampton University, makes you think about both twentieth-century scientific history and current national policies.
Oppenheimer’s story has much to interest students of higher education. Apart from his time in charge of building the first atomic bomb at the Los Alamos laboratory from 1943-45 – when he showed himself to be an outstanding leader of a disparate group of brilliant egoists – Oppenheimer spent his working life as a university teacher, researcher and administrator, latterly as the Director of Princeton’s Institute for Advanced Study, where his staff at one point included Einstein, Bohr and Dirac.
A point of particular interest is that Oppenheimer’s academic career spanned the period during which Europe, as a result of self-inflicted wounds, ceded world scientific leadership to the United States. When Oppenheimer graduated from Harvard in 1925 (in chemistry, not physics – a reminder that the disciplinary boundaries we now take for granted have only recently become so rigid), bright young American scientists wanting to work with the world’s best researchers crossed the Atlantic as a matter of course. In theoretical physics, Oppenheimer’s chosen research topic, the choice was between Germany, particularly Göttingen and Leipzig, and England, particularly Cambridge.
Oppenheimer began with Cambridge, where he was unhappy (though it was not, he wrote, “quite so bad as Oxford”), and then in 1926 went to work with Max Born, one of the leading figures in quantum mechanics, at Göttingen, receiving his doctorate there in 1927 – and no doubt improving the University’s doctoral completion statistics in the process. The international language of science was then significantly German, in which Oppenheimer, from a German-Jewish family, was fluent: presumably other ambitious American scientists learned German as a matter of course, in the way that German academics now learn English. (Oppenheimer was in any case a gifted linguist: he learned Sanskrit in the 1930s in order to read Hindu literature.)
Several factors came together to allow America to build an atomic bomb in a stunningly short period. It took about three-and-a-half years from approval of what became the Manhattan Project to the first use of an atomic bomb, at Hiroshima on 6 August 1945. But the crucial period, from when the first scientists arrived at Los Alamos to the “Trinity” test in the New Mexico desert on 16 July 1945, lasted 28 months: perhaps what you’d allow for the implementation of a modest IT project in a university today. But the Manhattan Project was able to build on the world’s best physics and engineering research, created in American universities in the 1930s – Berkeley and Chicago in particular – largely with public funding for the purest of pure research. Through the 1930s, for example, Berkeley seemed to have no particular difficulty in obtaining funding to build ever more powerful cyclotrons, with no practical aim in view: nobody seems to have asked them for an impact statement.
American universities, and later the Manhattan Project, also took full advantage of talent sucked in from Europe, particularly Jewish refugees from fascism in Germany, Hungary and Italy. Even Britain took in foreigners: Rudolf Peierls and Otto Frisch, both German-Jewish refugees, worked at Birmingham University in the 1940s and made a vital contribution to building the bomb by showing that the amount of uranium-235 needed to sustain a chain reaction was a matter of kilograms, not tons as had been thought. This insight made the bomb a practical proposition – though even America at first struggled to produce a few grams of the stuff. Around the same time, in one of history’s happiest mathematical errors, Werner Heisenberg (of “uncertainty principle” fame), running the Nazi atomic bomb project, made the same calculations but got an answer in tons, and so abandoned the uranium-235 method. Lucky that nobody thought to check his sums.
A number of things allowed the Manhattan Project to succeed, but large-scale, long-term public funding for blue-skies research, together with a policy of grabbing talent from wherever it could be found, and a strong manufacturing economy, were all crucial. Might there, just possibly, be any lessons from this for policymakers today?
Oppenheimer’s loss of his security clearance in 1954, at the height of the McCarthy witch-hunts, was devastating for a man with a strong sense of national duty. There are several ironies here. One is that, while Oppenheimer’s politics were certainly left-wing, he was notably clear-eyed about the Soviet Union, concluding as early as 1947 that negotiations with Stalin over the control of nuclear weapons would be a waste of time. And, just as past outstanding service to the Soviet state was no guarantee of one’s future safety, so the fact that Oppenheimer had given America the bomb (“What more do you want? Mermaids?” a friend asked at his Security Board hearing) did not help him in countering the FBI’s obsession about his political unreliability. There is a depressing contrast between this cold war paranoia and the open, international scientific culture which Oppenheimer had known before the war. Princeton’s refusal to bow to pressure from Washington to sack him must have been a consolation of sorts. The traditions of institutional autonomy and academic freedom, which had served America so well in the Manhattan Project, came to Oppenheimer’s rescue at the lowest point in his career.
The picture on the dust-jacket of Monk’s book shows Oppenheimer writing equations on a blackboard, cigarette in hand. He died of throat cancer in 1967. He was 62.