On 15 October 1971, the first edition of The Times Higher Education Supplement was published. In the five decades since, the publication now known as Times Higher Education has charted the expansion and marketisation of the UK sector while taking an ever more global perspective. Three editors reflect on their time at the helm
Expansion: More means better
I was editor of The Times Higher Education Supplement, as it was then known, from 1976 until 1992. But I was there from the very start, having been the late Brian MacArthur’s deputy when The Times launched a sister title to its schools-level educational supplement in October 1971. So my association coincided with the two decades when the UK acquired, with some reluctance and a few backwards glances, a mass system of higher education.
My interlude between being deputy editor and becoming editor was also partly spent in California, where mass higher education began – and, more specifically, at the Carnegie Commission in Berkeley. Its chair was Clark Kerr, former president of the University of California, and a frequent visitor was the sociologist Martin Trow: two people at the heart of the university expansion project.
Throughout my two decades, and the three since, the engine of higher education was expansion. The UK’s landmark Robbins report, which stated the case for expansion, had been published only eight years before the first issue of the THES, and it was the growth of student numbers, and of the number of universities, that created the need for the paper. We are now so used to expansion, rather like economic growth, that it is hard to appreciate that this idea was once novel – and unsettling. Kingsley Amis’s cry that “more means worse”, bizarrely seconded by The Times, had only just died away when the first THES issue appeared.
During the 1970s and 1980s, the pattern of growth shifted. Initially, the pace was set by the new universities: the pre-Robbins “plate glass” universities, rather than the polytechnics.
However, the latter were slowly but surely establishing their presence as legitimate partners alongside the established universities, even if they never quite became rival “people’s universities”, as some radicals had hoped. And then, in the 1980s, the growth of the established universities faltered, giving the polys their chance to break into the mainstream.
That faltering was deliberate. Faced with the infamous 15 per cent cut to the higher education budget imposed by Margaret Thatcher’s government in 1981, the University Grants Committee (UGC), with the general support of the university establishment, decided to attempt to protect funding per student – at the expense of expansion. The attempt failed, but in the meantime the polytechnics took up the slack by meeting displaced student demand. By the end of the 1980s, they had already become universities in all but name. In ending the binary system in 1992, allowing polytechnics officially to become universities, the education secretary Kenneth Clarke was just bowing to this reality.
Thirty years later, when the cap on student numbers was removed, what were now known as the pre-1992 universities did not repeat the mistake. Despite moans that they were running undergraduate courses at a loss, they recruited extra students at the expense of many of the post-1992s that, as polytechnics, had flourished in the 1980s. It is interesting to speculate what higher education might look like now if the pre-92s’ vain attempt to maintain the “unit of resource” had never been made.
Of course, it would have been difficult, in 1981, for the universities to have been made to continue expanding in the face of reduced funding because they still maintained the substance – threadbare perhaps – of autonomy. They were still seen, and saw themselves, almost as a separate estate of the realm, protected from political interference by the buffer of the UGC. They still had the power – just about – to refuse to expand if the price was the erosion of what they saw as a “proper” university education.
As THES editor I observed that autonomy at first hand. The 1981 cuts were not applied equally by the UGC and I was given a preview of the winners and losers by the committee chair Edward Parkes at least two days before the secretary of state for education, Mark Carlisle, knew anything. Funding decisions were still seen as an essentially private matter between the universities and their friends.
A quarter of a century later, as vice-chancellor of Kingston University, I was reminded how things had changed. After a civil servant had spent some weeks at Kingston on an “immersion programme”, I received a thank-you letter from her boss saying how much she had enjoyed spending time in a “delivery organisation”. Another country indeed.
This erosion of autonomy, after expansion, was the big change during my years as editor – and, sadly, it has accelerated since. It was the encroachment of Whitehall officialdom, followed later by Westminster politicking, that was at the root of the polytechnics’ removal from the control of local government. The first attempt in the early 1980s was foiled by local authorities, still able to wield bipartisan clout, only for the second attempt later in the decade to succeed. At the same time, however, the UGC was abolished, to be succeeded by a series of more intrusive agencies under the increasing direction of central government.
There were exceptions. Universities in Scotland and Wales were “repatriated” at the end of my time as editor. But nothing could reverse the creeping power of the state – London, Edinburgh or Cardiff – over higher education. An inevitable outcome of expansion, many would argue; higher education had become just too important to be allowed to be free.
Sir Peter Scott was editor of The Times Higher Education Supplement from 1976 to 1992. He is emeritus professor of higher education studies at the UCL Institute of Education and commissioner for fair access in Scotland. He is a former vice-chancellor of Kingston University.
Widening participation: Too narrow an approach
I was not the first in my family to go to university. That moment came a generation later with my daughter.
I ticked all the boxes universities now seek to tick – poor, immigrant family, English as a second language, council estate. But there wasn’t the same focus – or incentive – back then to recruit students from underprivileged backgrounds.
I was bright, too. I passed my 11-plus, and, with a handful of others from the estate, went off to a grammar school. Job done. Or so you might think.
But most of my council estate peers dropped out, one by one, or sank in a system designed for middle-class children. Only one child from my estate went to university, as far as I know. I wanted to be a journalist, and the fastest (and cheapest) route was college. It worked for me.
What people often forget about young people from poor families is that it’s not just the lack of money that prevents them from succeeding: it’s the multitude of factors associated with poverty – poor physical and mental health, terrible living conditions, few books, no place to study, little support or encouragement (for many different reasons and usually not the fault of the parents) and, today, lack of access to wifi and devices.
I am one of the lucky ones. I just caught the tail end of the post-war golden age of social mobility. You need a lot of luck as well as intelligence to overcome your background. You also need a system with the will to help disadvantaged children succeed.
No one ever asked me in my career whether I had a degree. I guess it just wasn’t relevant if they knew I could do the job. When I became editor of Times Higher Education – the name contraction reflected our switch from a newspaper to a magazine format – I made no secret of the fact I had not attended university. Most vice-chancellors liked it because I had no allegiance to any single institution – although one did say it meant he wouldn’t employ me, even if I applied to be his PA.
I will always treasure Cambridge philosopher Simon Blackburn’s reaction. When I told him, he just threw back his head and laughed.
There is no denying that higher education has tried to recruit more poor children. However, it has had very limited success, despite the hundreds of millions of pounds spent over the years.
Widening participation, and its close cousin fair access, was a running thread throughout my 18 years at THE, which I joined in 1994 as a sub-editor. It became headline news in 2000 when the University of Oxford refused to give high-flying state school student Laura Spence a place to study medicine, leading to a row fuelled by the then chancellor of the Exchequer and future prime minister Gordon Brown.
At its core, widening participation was a simple example of carrot and stick. Universities got a large amount of money (the rumour was always that they put in for it never expecting to get it) and the government – and everyone else – got a lovely big blunt instrument to beat them with.
I always believed it was a battle they could not win. Not least because they are trying to close the gap when it is at its widest.
According to the Education Policy Institute, by the time children in England take their GCSEs, disadvantaged pupils (those who have been eligible for free school meals at any point in the past six years) are more than 18 months of learning behind their peers. This gap is the same as it was five years ago and has probably increased through the pandemic. One thing we can be sure of is that it’s not likely to have improved by the time university applications are made 18 months later.
Why on earth, then, are universities trying to solve a problem when it is at its hardest? Arrogance? Hubris? Surely most interventions should take place at the very start of education, when the gap is at its narrowest and when attempts to address the deficits and provide a firm footing for future learning have the best chance of success?
When the UK’s poorest children start school, they are already 11 months behind their peers, according to the Sutton Trust. Surely that’s where we should concentrate our educational efforts and funding? If a child’s reading lags behind that of their better-off peers during the early part of their education, they are not going to catch up as they move through the system without significant intervention. They won’t be able to access the curriculum, they won’t enjoy learning and they certainly won’t want to stay in education.
As editor, I often said that universities should be more honest and admit that the money could be better spent elsewhere in the education system. That view became further entrenched and refined when I later became editor of Tes (previously known as the Times Educational Supplement). I firmly believe universities should relinquish most of their widening participation cash and hand it over to schools, but specifically to the early years, where it can provide the most value.
It is a shift that could close the gap when it is at its narrowest and give all children a solid foundation on which to build their education. And it would liberate universities to concentrate on doing what they do best – advancing society’s knowledge through their top-rated research and scholarship.
Ann Mroz was editor of Times Higher Education from 2008 to 2012.
Internationalisation: The market leader
On a cold winter’s night in 2010, hundreds of demonstrators and one freezing Times Higher Education reporter found themselves confined to Westminster Bridge in London by police in riot gear.
The protest against the introduction of £9,000 tuition fees ushered in both a new year and a new era for universities in England: a shift to marketisation that shaped much of the following decade.
Fast forward to 2020 and thousands of students found themselves confined to their purpose-built, en-suite rooms by a virus, pasting signs to windows decrying the value they were getting from their expensive university experience. Many thousands more were obliged to study from their homes all over the world.
Between these two bookends, the twin engines of marketisation and internationalisation drove much of the decade’s narrative in higher education. This was also reflected in THE’s coverage. Internationalisation, in particular, drove something of a sea change, as a once almost exclusively UK-focused publication continued to expand its purview across the world.
Back in England, for those who oversaw the fee hike, the transfer of costs from the state to the individual was inevitable in an era of both austerity and huge increases in participation in higher education. It was neither sustainable nor fair, so the argument went, that those who personally benefited from higher graduate salaries paid so little of the direct costs.
In truth, the boom in international study had already foreshadowed this change: market ideology was unquestioned in that sphere, and what is good for the goose is eventually good for the gander.
The endless rise in demand for the “product” that universities in the West were “selling” had created both the environment and the motive for growth, and the fees paid by overseas students subsidised far more than their tuition. This served as a flashing neon sign for policymakers casting around for a way to reduce the cost burden of ever-expanding domestic student numbers (albeit that the model was different: the public loan system quite deliberately underwrote those private investments).
It is worth acknowledging that at a time when deep cuts were being made almost across the board, this was probably the only way that the unit of resource and student numbers could be protected. But, as many warned, such shifts – even if they are based in constructive pragmatism – have implications far beyond public accounting.
Most fundamentally, the terms of the debate changed: universities were no longer seen as benign education providers but as businesses, and higher education became a consumer story. From there, it is possible to trace a direct line to many of the challenges they face today.
Academic precarity undermining the fabric of the profession? Well, if you turn universities into businesses, they will behave like them.
Timidity in the face of social media cancel culture and attacks on academic freedom? If saying the wrong thing risks financial or reputational Armageddon, it is safer to say nothing at all.
Loss of trust and respect for expertise? If universities are cast endlessly as self-serving, money-grubbing elites, too craven even to stand up for freedom of speech, how could it be otherwise?
Which is not to say that these claims and perceptions are fair, but you can see how they take root.
Even when the public utility of universities is painted in primary colours (and utility is far too modest a word), purveyors of disinformation online and charlatans in office find it easy to distract, deny and undermine. The way in which universities responded to Covid-19, helping to lead the world through a crisis that was almost unimaginable just two short years ago, contrasts starkly with the failure of politicians to respond effectively on a global basis and the dispiriting influence of anti-vaxxers online.
For some, the Covid crunch, which grounded planes and closed borders, was a demonstration of the unsafe foundations on which many Western universities have built over the past 10 years – in particular, the overreliance on international fees, seen in extremis in some Australian institutions. But to leap from that vulnerability to disavowing internationalisation altogether would be a mistake.
Note again how research networks mobilised without regard for national rivalries or differences of opinion when disaster struck. Acknowledge with pride how international study has enriched teaching and learning, building bridges and understanding at a time when both are in short supply.
Globalisation has become a dirty word in recent years, and not without reason: distributed supply chains may have delivered cheap goods, but they have also hollowed out parts of our societies. Meanwhile, the deterioration in geopolitics has turned international relations into a pressure cooker. But we have never been more in need of the international framework that universities provide to share ideas, to allow talent to flow and flourish, and to offer cooperation as an alternative to confrontation. And never has universities’ commitment to evidence and truth, rather than ideology or point scoring, been more important.
One of the perennial questions over the decade I have spent as THE editor is whether this thousand-year-old institution (the university, that is – THE is old, but not that old) is still fit for purpose. Will it survive?
It will, but those warning about new models of delivery that will challenge the university should not be dismissed as purveyors of snake oil. At least, not all of them. Change is a given, and universities have been slow to respond in the past. The likely impact of AI on the future of work, in particular, should not be underestimated.
But if there is a lesson from the past decade, it is that we should be careful what we wish for.
Like those outsourced supply chains, or the idea that social media was a leap towards democratic enlightenment, disrupting and disaggregating the various elements of the university would come with all sorts of unintended consequences. Their significance goes far beyond turning out tomorrow’s workforce, as important as that is.
Confidence in their future rests on the foundations that underpin what universities are: communities of scholars with a common purpose and shared values, operating within a framework that protects what makes them precious and unique. The idea of a university, to coin a phrase.
So here’s to the next thousand years. And to THE’s next 50, as your partner and critical friend.