The Right To Sex: Srinivasan’s cry for help

If you were a greengrocer in Soviet Czechoslovakia, it would be prudent to display, in your window, a poster proclaiming: “Workers of the world, unite.” This is the famous example Václav Havel used, in The Power of the Powerless (1978), to illustrate mass conformity to Communist dogma. Havel’s greengrocer probably never thinks about that slogan, let alone believes it; he puts it obediently in his window to signal compliance with the regime. As Havel puts it: “If he were to refuse, there could be trouble.”

I was reminded of Havel’s greengrocer when reading The Right To Sex, a much-lauded new book on women and feminism by Amia Srinivasan — the holder of Oxford University’s prestigious Chichele professorship of social and political theory, a position previously held by luminaries such as Isaiah Berlin.

Despite — or perhaps because of — her standing, she opens the book with a statement typically found in the preface of any contemporary woke writing about women; I’ve come to think of it as a direct equivalent to the greengrocer’s poster:

“At birth, bodies are sorted as ‘male’ or ‘female’, though many bodies must be mutilated to fit one category or the other, and many bodies will later protest against the decision that was made. This originary division determines what social purpose a body will be assigned.”

Yes, commissar, the statement says, the definition of “woman” in my book about women is “anyone who identifies as a woman”. No, commissar, biology is not a thing.


What the modern mermaid leaves out

Scuba diving is both magical and terrifying. Put on your gear, slip under the surface, and find yourself freed from gravity. In the glory days Before Coronavirus, I remember diving through the clear waters of coastal Turkey, drifting on warm currents and rolling to stare at the sunshine playing on the surface, from underneath.

But even as I rippled through the deep, marvelling at flashing schools of fish, there was a trade-off: constant self-control. Don’t breathe out through your nose. Don’t sneeze. Never, ever panic. For a short while it’s possible to pretend that you have the freedom of such an alien world, but in truth you’re only ever a tourist, granted safe passage thanks to technology, training and self-discipline.

Something about this sense of crossing an uncrossable threshold surely also powers our obsession with mermaids. And it is an obsession: mermaids are everywhere. Monique Roffey’s novel The Mermaid of Black Conch: A Love Story recently won the Costa Book Prize, while “mermaiding” — swimming in the sea wearing a “mermaid tail” — has gained a cult following in Australia. And you only need to browse the girls’ clothing selection in a high-street shop to find countless cartoon girls with fish-tails, sequinned and sparkly, smiling at you from t-shirts, dresses, wellies, duvet sets, pencil cases and the like.

As a parent of a four-year-old, I’m more familiar than I’d like with mermaid content, and Disney is a rich source. Sofia the First: A Mermaid Tale is a favourite with my daughter, who is entranced by the moment when Sofia is magically transformed into a mermaid and dives underwater. There, she swims in circles exclaiming: “This is incredible!” And it is. The rest of the story is almost an afterthought, with the whole narrative punch condensed into that moment of metamorphosis, and the dive into a new and mysterious realm.

If mermaids offer an enchanting dream of transformation, perhaps it’s no surprise that the transgender movement enthuses about the special place mermaids have in their iconography. Activist Janet Mock links this to Ariel, heroine of the 1989 Disney film The Little Mermaid, who chafes at her underwater life and longs to visit the world beyond.


The world according to LARP

Who would have guessed that a weekend hobby for outdoorsy nerds could spawn an era-defining political metaphor?

LARP, or live action role-playing, is an offshoot of the fantasy roleplaying subculture. It involves dressing up in costume and acting out a fantasy-fiction game narrative in real time and space, sometimes over several days. A witty friend once characterised the experience as something akin to ‘cross-country pantomime’.

Thanks to lockdown, no one’s LARPing this year — at least not in the cross-country pantomime sense. But the word ‘LARP’ has escaped into the wild: far from being the preserve of fantasy fans, I’ve noticed it appearing with increasing frequency in political discourse.

When riot police finally pushed activists out of the Capitol Hill Autonomous Zone following the murder of one joyriding teenager and serious wounding of another by CHAZ ‘security’, resistance to the advancing riot shields was so paltry it prompted contemptuous accusations of ‘revolutionary larping’. Weird Christian Twitter (it’s a thing) hosts arguments where people are accused of larping more traditionalist theologies than they truly espouse. Still further out on the fringes, the QAnon conspiracy rabbit hole (don’t go there) is fiercely defended by its True Believers against accusations that it is, in fact, a bunch of people LARPing.

Around the time my friends were discovering LARP, I got into LARP’s Very Online cousin, Alternate Reality Gaming (ARGs). An artefact of the age before Facebook and Twitter colonised most of the internet, ARGs are a hybrid of online treasure hunt, mystery story, and live-action immersive theatre. The first mass-participation ARG was a promotional stunt for the 2001 film AI, and featured a series of fiendish clues for participants to crack and follow, which unlocked further elements of story including live-action segments.

For a moment in the mid-Noughties, ARGs looked like the future of storytelling. The idea of internet communities over-writing stable systems of meaning with playful new narratives that danced back and forth between the web and real world felt refreshing and subversive. With hindsight, though, the phenomenon was just a more-than-usually-creative moment in a widespread unmooring of reality that’s been under way for decades.

It’s not all the fault of the internet. In 1955, the philosopher J L Austin developed a theory of ‘performative’ language: that is, language that does something to reality in the act of being spoken. ‘I pronounce you man and wife’ is a classic example — the words effect the marriage in the very act of being uttered.

Then, in 1993, the queer theorist Judith Butler borrowed the concept of ‘performative’ language wholesale and applied it to sex and gender, arguing that the identities ‘man’ and ‘woman’ — along with the bodies and biologies associated with those identities — are performative. In taking these roles on, Butler claimed, we make them real.

While these ideas pre-date mass adoption of the internet, the notion that we participate in creating our own realities has been wildly accelerated by social media. Online, it’s easy to get the impression that we can reinvent ourselves entirely, independent of our bodies or other dull ‘meatspace’ constraints. Unsurprisingly, Butler’s conception of sex and gender as performance has long since escaped the petri dish of academia and, like the concept of LARPing, is evolving rapidly in the wild.

Strikingly, the word ‘performative’ has also mutated. Today, it isn’t used as Butler did, to mean “a performance that shapes reality”, but the opposite: an insincere performance for social kudos. So, for example, celebrity endorsements of social justice orthodoxies are often accused of being ‘performative’. It means much the same as ‘larping’, but with an added payload of cynicism. So where ‘LARPing’ means “playacting at something you wish you were”, ‘performative’ means “playacting at something you don’t really believe”.

Meanwhile, the LARP is no longer confined to cheery camping trips accessorised with pretend armour. Back in the noughties, online communities refactoring reality to fit a fantasy storyline felt like a fun game, but as I stare into the sucking void of the QAnon conspiracy, that perspective now seems hopelessly naïve. It’s not a game today: it’s how we do politics.

Liberal commentators spend a great deal of energy trying to explain why this is bad. Countless writers ‘fact-check’ Trump’s bloviations, seemingly unaware that from the perspective of reality-as-ARG, the fact that Trump is lying doesn’t matter. Nor does it really matter whether QAnon is real or not. Reality is, to a great extent, beside the point.

Laurie Penny got closer to the truth in this 2018 piece, where she characterises the very notion of a ‘marketplace of ideas’ as being a kind of LARP: “a Classical fever-dream of a society where pedigreed intellectuals freely exchange ideas in front of a respectful audience”. The reality, she argues, is that this ‘marketplace of ideas’ is less free, rational exchange than dick-swinging theatre.

Those who like to imagine this pessimistic perspective is new, wholly the fault of the Orange Man (or perhaps Facebook), should recall the words of an unnamed aide to George W Bush, quoted in 2004 on the relationship between facts, reality and the military invasion of Iraq:

The aide said that guys like me were “in what we call the reality-based community,” which he defined as people who “believe that solutions emerge from your judicious study of discernible reality.” I nodded and murmured something about enlightenment principles and empiricism. He cut me off. “That’s not the way the world really works any more,” he continued. “We’re an empire now, and when we act, we create our own reality. And while you’re studying that reality — judiciously, as you will — we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out. We’re history’s actors . . . and you, all of you, will be left to just study what we do.”

Ron Suskind, New York Times

Though his approach was less overtly hubristic, Tony Blair’s embrace of spin reflected a similar belief in his own ability to manipulate public narratives. ‘Political communications’ has only grown in significance since those days, and taken a self-referential turn. Today it’s as common for commentators to criticise a politician for performing badly at a presser — for poor-quality larping, or bad theatre in Penny’s formulation — as for saying things that are immoral or factually wrong.

Donald Trump is distinct from George W Bush not so much in disdaining facts as in lacking the religious conviction Bush deployed to fill in the gaps left behind by that disregard. But both, in different ways, embody the idea that what you believe is what is. If you LARP hard enough, this view says, your larp will come true.

Boris Johnson’s administration has something of the same cavalier attitude to the relationship between facts and rhetoric. To date, the handling of coronavirus has routinely over-promised and under-delivered, while seeming indifferent to the disorienting effect on public life of a string of announcements only very loosely tethered to everyday experience.

It’s not a coincidence that this larpification of politics has evolved in tandem with a public fixation on ‘evidence-based policy’. The political polarity of absolute LARP — blatant political lying — and absolute insistence on evidence are two sides of the same loss of faith in a common understanding of reality.

If you’re not convinced, consider SAGE, the government’s scientific advisory committee. Then consider ‘Independent SAGE’, a kind of counter-SAGE comprising scientists every bit as eminent as those on SAGE. This august body produces its own carefully evidence-based reports, which are then used as a foundation from which to disagree with whatever positions the Tories choose to adopt from official SAGE.

Who do we believe? That’s politics. If the Brexit debate hadn’t already killed your faith in ‘the evidence’, the competing claims of SAGE and counter-SAGE should be the death-blow. There is no dispassionate foundation of facts we can deploy to take the politics out of political decisions. The original LARPers might have a fun weekend, then pack up and go back to their normal lives; but in its political sense, there’s no outside to the game. It’s larping all the way down.

Some parts of our culture are coping better with this shift than others. Among the worst performers, to my eye, are mainstream liberals of both left and right. Many have responded to the larpification of everything by concluding that in losing objectivity we’ve lost reality. Some then imagine they can make reality whatever they want it to be by sheer force of will (the Trump/Bush approach). Others suggest we can fix things by supplying enough facts to prove whatever we already believed (the SAGE/counter-SAGE approach). Others, such as Laurie Penny, try to refuse to play.

But we haven’t lost reality, just the fixed vantage point we pretended we had from which to evaluate it. What we have instead is a kind of reality-shaping free-for-all, and there’s no opting out.

As most of us flounder, disoriented, we’re starting to see subcultures adapting. The old story about the Inuit having 50 words for snow is (appropriately) itself probably fake news. But much as a snow-dwelling people might be expected to develop specialist terminology for different types of frozen precipitation, we should understand the emergence of words like ‘larp’ and ‘performative’ as analogous. We’re developing a specialist vocabulary for types of unreality.

We’re also having to find new ways to talk about the reality that, inconveniently, refuses to go away completely. The grim story of the Iraq invasion and its destructive and bloody aftermath gave the lie to Bush’s messianic faith in his capacity to create a new reality to order. Humans still can’t really change sex. And no amount of fiddling the statistics can bring back those people who’ve already died of coronavirus.

The political future turns on our being able to get used to parsing our new Babel for what’s actually happening, and what actually matters. We have to get used to doing this without trying to eliminate the element of LARP (spoiler: can’t be done) or pretending we can abolish reality (ditto).

But there’s no putting the genie back in the bottle. If the ground is moving under all our feet now, the way forward is learning how to dance.

This article was originally published at Unherd

The Irreligious Right

Today’s hottest property: young fogeys. Blue Labour hailed Boris Johnson’s landslide election victory as a rebellion by the country’s ‘culturally conservative’ silent majority. A new conservative magazine seems to appear every week. We have even seen a youth movement for the revival of socially conservative values popping up in that bastion of modern double liberalism, the Conservative Party.

What do they all want? At the more wonkish end of the debate, the argument is broadly that the political push throughout the twentieth century for ever greater social and economic freedom has brought many benefits, but that these have been unevenly distributed and are now reaching the point of diminishing returns.

The pursuit of ever greater freedom and individualism, this strand of thought argues, has delivered rising wealth while hollowing out working-class communities; liberated some women while forcing others to work a double shift and abandon the young and old in substandard care; and provided an infinitude of consumer choice, but at the cost of mounting ecological damage. Under the sign of radical individualism, the new communitarians argue, we are all becoming more solitary and self-absorbed. Even charitable giving seems to be in unstoppable decline.

But what, in practice, are the new social conservatives seeking to conserve? Calls for a revival of cultural conservatism, many in the name of Christian values, seem often on closer examination oddly insubstantial. In 2017, UKIP’s leader-for-that-week Stephen Crowther said that the UK is a Christian country, “and we intend to stay that way.” But for Crowther, being a Christian country does not seem to impose any obligation to actually be Christian: 

including Christian in our list [of principles] does not imply any requirement for individual faith, but it reflects the Judeo-Christian classical and enlightenment origins on which our laws, our social systems and our cultural norms have been built over two millennia.

Elsewhere in Europe, Hungarian Prime Minister Viktor Orbán describes his brand of authoritarian, identity-oriented politics as ‘Christian democracy’. Only a minority of Hungarians go to church every week – 56% of the country identifies as Catholic, though only 12% attend church regularly – but the identifier ‘Christian’ has nonetheless become central to Orbán’s politics.

Much as Crowther did, the Orban-supporting Bishop of Szeged, László Kiss-Rigó, bridges this gap with a vague, cultural definition of what actually constitutes a ‘Christian’: “In Europe, even an atheist is a Christian”, he said. It turns out that being ‘Christian’ is less about prayer or doctrine than ‘values’: “We are very happy that there are a few politicians like Orbán and Trump who really represent those values which we Christians believe to be important.”

What exactly are these values, then? Attendees at anti-Islam Pegida rallies in Germany carry crosses and sing carols. Italian right-winger Matteo Salvini punctuates anti-immigration rhetoric by brandishing a rosary, drawing criticism from the very Catholic faith whose symbols he invokes. Try to pin down any actual values this form of Christianity might require of its adherents, and matters are much less clear.

Even those whose stated desire is to defend the place of faith in public and political life seem keen that the faith itself stop short of imposing actual obligations. To take a more moderate example of the new cultural conservatism, the Social Democratic Party took a broadly post-liberal, culturally conservative stance in its 2018 relaunch. The New Declaration made an energetic defence of our right to hold even illiberal religious views openly in public life:

Citizens holding a traditional, patriotic or religious outlook are often bullied and marginalised, stifling the open debate upon which a free and democratic society depends. 

Then, about a year later, the SDP lost its only donor over a bitter intra-party dispute about whether or not it should be party policy to ban halal slaughter – a position markedly at odds with the party’s previous defence of religious pluralism. And when the Church of England recently reiterated its long-held position on sex and marriage, prominent SDP member Patrick O’Flynn took to the pages of the Daily Express to mock ‘the otherworldliness of these Men of God’. Instead of insisting on ‘out of touch’ doctrine, O’Flynn suggested, the Church should adjust its doctrines on sex and marriage to reflect young people’s values, the better to attract them to weekly worship.

In this view of faith, theological positions do not reflect any kind of truth-claim but should be emergent properties of the aggregate ethical positions held by the members of that church. Less ‘Christian democracy’ than ‘democratic Christianity’: whatever the congregants believe becomes the doctrine of the church.

From a religious perspective this makes no sense. To the believer, doctrine is handed down from God Himself. The thought of God’s word being subject to plebiscite is absurd, if not outright blasphemous.

This debate reveals the missing piece in today’s would-be conservative revival. Where do our values come from? What is the proper source of political authority? Progressives gesture at natural rights or an imagined future utopia, but for anyone who remains unconvinced that we are all on a journey somewhere wonderful, some other authority is required.

Edmund Burke suggested the answer lay in a blend of deference to tradition and God’s grand design, tempered by carefully constrained democratic institutions; his Savoyard contemporary, Joseph de Maistre, argued that the only proper form of authority lay in God’s will, delivered via the Pope and an absolute monarch.

The history of modernity has unfolded in the tensions between these competing understandings of political authority. ‘The will of God’, the will of ‘the People’, and the grand designs of various utopias have variously been used to justify all manner of enterprises, with outcomes from the magnificent to the horrific. But our present political difficulties may be in part down to a growing popular discomfort with accepting the legitimacy of any of the above.

Since the election of Donald Trump and the vote to leave the EU, there has been a low but persistent rumble from our moral betters that democracy should maybe have its wings clipped a little, to stop stupid proles making bad decisions. A degree of wing-clipping has in fact long since taken place: John Gray has discussed recently in these pages the way the language and legal mechanism of ‘rights’ is used to shift entire areas of public life from democratic debate to the dry realm of unelected lawyers and judges. But if authority does not reside in the will of the people, nor does it reside with God: it is difficult to imagine a mainstream British politician claiming moral authority on the basis of divine will without being roundly pilloried.

Progress and human rights, then? Every young person who passes through a modern university is taught in no uncertain terms that totalising metanarratives are suspect. At best, they are power moves. Whenever you find one you should ask cui bono? In the case of universal human rights, the answer is probably: lawyers.

This leaves would-be conservatives in a bind. If (with a few honourable exceptions still holding out for direct Vatican rule) political authority rests not in tradition (too restrictive on personal liberty) or democracy (probably rigged) or God (don’t tell ME what to do!) or even the lawyers, then what is left? Politics professor Matt McManus argues that the result is a postmodernism of the right as well as of the left: a series of nested calls for a return to authority, tradition and culture that all, on closer inspection, turn out to be largely delivery mechanisms for adversarial but hollow identity politics.

Having come unmoored from its roots either in the past, the divine, or the popular will, McManus suggests that this postmodern conservatism has warped a Burkean belief in tradition into a kind of moral cosplay whose main purpose is less seeking the good life than making a noisy defence of whichever identities its sworn enemies attack. As the postmodern liberal-left demonises heterosexual white males, so postmodern conservatism sets out to defend them; and so on.

Seen in this light, the problem with Orbán and other borrowers of Christian clothing is not that they do not believe their own words. Inasmuch as they can mean anything, they genuinely identify as Christians. It is more that when all sources of authority are suspect, the only legitimate recourse is to the self: to identity, and identification.

And the problem with identification is that it remains separate from whatever it identifies as. Just like the modern dating marketplace, where commitment is radically undermined by the ease of swiping right, modern cultural conservatism is radically undermined by the fear that without a reliable foundation of authority, and with more identity-choice options only a click away, we are never fully the thing we claim as our identity.

Without a sense of confidence in the roots of its political legitimacy, conservative values dissolve from concrete obligations to consumer accessories. This in turn is why Orbánist ‘Christian democracy’ and many of its populist cousins find their most compelling realisation not in religious doctrine or observance, but in defining themselves against their outgroup. If “even an atheist is a Christian” then either no one is a Christian, or everyone is. The only way of defining what a Christian is, is in terms of what it is not: foreigners.

But if this is so, then in a postmodern environment, shorn of recourse to authority, cultural conservatism is a waste of energy. It cannot define what it wants. All is insubstantial; there is no exit from the Matrix, nothing left to conserve.

Does it follow from this that those who long for place, limits, love, family, faith and meaning should just sit in the rubble and watch it all burn? I do not think so. But when there is nothing solid to go back to, anyone attracted to what is left of the ideology that used to be called ‘conservative’ needs to find a new name for their yearning. ‘Constructionists’, perhaps. There is a lot of building to do.

This article first appeared at Unherd

Growth is destroying our prosperity

We started the 2010s reeling from the Great Crash of 2008, and ended the decade with angry populism widespread in the Western world. Today, the global economy limps on more or less as usual, while resentment grows among the “little people” at an economic consensus many feel is rigged without really knowing who is to blame. Over the same period, climate change activism has gone from being a minority pursuit to mainstream debate, occasioning worldwide “school strikes” and, since the beginning of the year, the high-profile and colourful Extinction Rebellion movement.

What these dissatisfactions share is a sense of being trapped in an economic logic whose priorities no longer benefit society as a whole, but that — we are told — cannot be challenged without making things even worse. The foundational premise of that system is continual growth, as measured by GDP (gross domestic product) per capita. Economies must grow; if they do not, then recession sinks jobs, lives, entire industries. Tax receipts fall, welfare systems fail, everything staggers.

But what happens when growth harms societies? And what happens when growth comes at the cost of irreparable harm to the environment?

As Sir David Attenborough put it in 2013, “We have a finite environment – the planet. Anyone who thinks that you can have infinite growth in a finite environment is either a madman or an economist.”

This is the argument of Tim Jackson’s Prosperity Without Growth. The challenge the book sets out is at once simple and staggering. In a finite world, with limited natural resources, how do we deliver prosperity into the future for a human population that keeps on growing?

Jackson, who today is Professor of Sustainable Development at the University of Surrey, and Director of the Centre for the Understanding of Sustainable Prosperity (CUSP), argues that we need to start by scrapping GDP as the core metric of prosperity. As the book puts it: “Rising prosperity isn’t self-evidently the same thing as economic growth.”

The pursuit of growth is also undermining social bonds and overall wellbeing. In 2018, a commission reported on the ‘loneliness epidemic’ that is blighting lives and worsening health across the UK, driven in part by greater mobility of individuals away from family connections. (Mobility of labour, of course, is essential to drive economic growth.)

This year, even The Economist acknowledged that rising growth does not guarantee rising happiness.

If that were not enough, the resources available to supply continued growth are dwindling: “If the whole world consumed resources at only half the rate the US does […] copper, tin, silver, chromium, zinc and a number of other ‘strategic minerals’ would be depleted in less than four decades.” (Prosperity Without Growth)

Rare earth minerals, essential for technologies from circuit boards to missile guidance systems, are projected to be exhausted in less than two decades. 

Inasmuch as the public debate considers these nested dilemmas, the vague sentiment is that technology will save us. The jargon term for this is ‘decoupling’ — that is, the ability of the economy to grow without using more resources, by becoming more efficient. But will decoupling happen?

The theoretical core of Jackson’s book is a detailed unpacking of models that suggest it will not, or that if absolute decoupling is possible it will happen so far into the future we will already have wrecked the climate and run out of everything. Rather than rely on this fantasy, Jackson argues, we must challenge the dependence on growth.

But how? The global economic system depends on growth, and in times of recession it is the poorest who suffer first. It is a policy double bind: on the one hand, we must think of the environment, so governments encourage us to buy less, consume less, recycle more and so on. But on the other, they must deliver a growing economy, which means encouraging us to buy more, consume more, keep the economy going. Electorates are, understandably, cynical about the sincerity of this flatly self-contradictory position.

What, then, is the alternative? Jackson is an economist, not a revolutionary firebrand, and his book does not call on us to bring down capitalism. In the second part of Prosperity Without Growth, he instead suggests some quietly radical approaches to bringing the global economy back into the service of human flourishing.

He advocates government intervention to drive much of the change he proposes, including encouraging economies to pivot away from manufacturing, finance and the pursuit of novelty at all costs toward less obviously productive but more human services such as slow food cooperatives, repair and maintenance or leisure services.

He also advocates heavy state support for ecologically-oriented investment. When I contacted him to ask about his book ten years on he spoke positively of the contribution that a “Green New Deal” could make on this front: “It shows a commitment to social and environmental investment that is absolutely essential to achieve a net zero carbon world”, he told me. “Simply put, we just can’t achieve that without this scale of investment, and that form of commitment from Government.”

He also told me he is often criticised for being “too interventionist in relation to the state”, as he puts it. But perhaps (though Jackson does not use the term himself) he might be more fairly described as post-liberal. Prosperity Without Growth is a quiet but excoriating critique of the growing human and ecological costs of liberal economics.

Intriguingly, within Jackson’s proposals lurks another challenge to liberalism, that to date has not been greatly associated with the left: the critique of radical liberal individualism as a social doctrine. Along with state intervention to tweak the economy and drive ecological investment, Jackson argues that governments should promote “commitment devices”: that is, “social norms and structures that downplay instant gratification and individual desire and promote long-term thinking”.

Examples of ‘commitment devices’ include savings accounts and the institution of marriage. Governments should implement policies that clearly incentivise commitment devices, for doing so will promote social flourishing and resilience even as such institutions offer alternative forms of meaning-making to the pursuit of shopping-as-identity-formation.

Thus, we cannot save the earth without reviving some social values and structures today thought of as small ‘c’ conservative: stable marriage, savings, settled and cohesive communities with lower levels of labour mobility.

I asked Jackson whether some of the more vociferous socially liberal proponents of environmental change had cottoned on to these potentially quite conservative implications of his theories. He told me “This is an interesting question, for sure, and one that I don’t think has really been picked up – even by me!” (Except at UnHerd – see here and here for example.) But, he says, it is incumbent on us to set aside political tribalism in the search for solutions to our current dilemmas.

“I believe there are elements of a Burkean conservatism which are profoundly relevant to a politics of the environment, even as I am convinced that the progressive instincts of the left are essential in our response to social and environmental inequality. I see it as incumbent on those working for change both to understand the underlying motivations of different political positions and also to adopt a pragmatic politic in which solutions are suited to the challenges of today rather than the dogma of yesterday.”

Indeed.

This essay was originally published at Unherd

Who gains from the great university scam?

Higher education is big business. Over half of UK young people now attend university, meaning the target first set by Tony Blair 20 years ago has finally been reached. And according to a 2017 report for Universities UK, once you count the (mostly borrowed) money students spend on subsistence, tertiary education generates some £95 billion for the British economy, more than the entire legal sector, the advertising and marketing sector and air and spacecraft manufacturing combined. 

This is true across the country, but its impact is especially noticeable in post-industrial regions. According to a 2017 report, the University of Liverpool alone contributed £652m in gross value added to the Liverpool city region in 2015/16, and supported one in 57 jobs in the region.

Some 11,000 jobs are either directly funded or supported by spending associated with the university — and the University of Liverpool is only one of five or more institutions (depending on how much of the area you count) offering graduate and post-graduate courses in the Liverpool area, meaning the total sum is even greater.

As well as generating jobs and supporting whole industries catering to student life — from nightclubs and cafes to housing rentals — higher education is shaping the very landscape of the cities in which it thrives. As this 2015 report from UCL’s Urban Laboratory shows, universities are increasingly actors in urban development:

“Driven by competition (for reputation, staff and students) in an international marketplace, and released from financial constraints by the lifting of the cap on student fees, [universities] produce locally embedded variants of global higher education models. These assume physical and spatial form within the parameters of distinct, but increasingly similar, city planning and urban regeneration contexts defined by an ‘assemblage of expertise and resources from elsewhere’.”

Some of the money that flows into and through universities and out into local economies of course comes from overseas students, endowments and the like. But to a great extent, these dependent industries, revamped urban landscapes, former factories converted to student accommodation, ancillary services and so on are funded either directly — via government subsidies to higher education — or indirectly, via government-backed student loans.

Though academic research is still heavily subsidised by government via the UK Research and Innovation body, the proportion of direct funding to students has shrunk even as that taken on by students as loans has grown. A January 2019 research briefing from the House of Commons Library stated that the cash value of higher education loans is estimated to be around £20bn by 2023-24. The report also acknowledged that only about half of the money borrowed will ever be repaid, estimating that “The ultimate cost to the public sector is currently thought to be around 47% of the face value of these loans”.

The total cost to the public sector, the report continues, is roughly the same as it was before the funding model changed to scrap maintenance grants and increase tuition fees. That is to say, as the proportion of direct government funding to higher education has been reduced, there has been a corresponding rise in the amount of student debt that will never be repaid and which the government will eventually have to cover.

This money is in all but name a form of government subsidy, funded by government borrowing. But it is counted differently. The briefing notes in passing that “This subsidy element of loans is not currently included in the Government’s main measure of public spending on services and hence does not count towards the fiscal deficit.”

That is to say, billions of pounds are being borrowed by government for disbursement in the higher education sector, and the government already knows much of this will never be paid back. But the money is no longer counted toward the fiscal deficit, as it has been nominally privatised in the form of loans to individual young people.

One might argue that this is unimportant provided the higher education sector is delivering value to those who are nominally its customers — the students. But in 2018, the ONS reported that only 57% of young graduates were in high-skilled employment, a decline of 4.3 percentage points over the decade since the 2008 crash. The ONS speculates that this could reflect “the limited number of high-skilled employment opportunities available to younger individuals and the potential difficulties they face matching into relevant jobs early in their careers”.

Nay-sayers pointed out, when Blair first introduced tuition fees and the 50% graduate target, that the law of supply and demand suggests employers’ willingness to pay a “graduate premium” in wages will fall as graduates become more plentiful. In 1950 only 17,500 young people graduated from university; but when 1.4 million of them do so, as reported by the House of Commons this year, can they really hope for the same graduate premium?

Results so far suggest that many of them cannot. In a hard-hitting article last August in the New Statesman, Harry Lambert spelled out further the way in which the marketisation of higher education under the Blair rubric has also incentivised grade inflation.

Cui bono, then? Arguably less the students, graduating in ever greater numbers with ever less valuable degrees, than the cities in which they live for three or four years to study, and which have in many cases experienced a renaissance due in large part to the post-Blair expansion of higher education.

In 1981, after the Toxteth riots, Lord Howe advised Margaret Thatcher to abandon the entire city to “managed decline”. In a letter only made available to the National Archives in 2011, following the 30-year rule, Howe wrote:

“We do not want to find ourselves concentrating all the limited cash that may have to be made available into Liverpool and having nothing left for possibly more promising areas such as the West Midlands or, even, the North East. […] I cannot help feeling that the option of managed decline is one which we should not forget altogether.”

Today, Howe’s words remain only as a bitter memory: the regenerated Liverpool city centre hums with tourists, students and shoppers. The Albert Dock area, reimagined from shipping and warehouses to office buildings, shops and leisure, is beautiful, vibrant and popular.

Much of this regeneration has come via the higher education boom. In Liverpool and elsewhere, successive governments have used the higher education sector more or less explicitly as an instrument of regeneration. In effect, government-backed student loans have become part of this: an off-the-books subsidy for depressed post-industrial areas, which have thus been partially rescued from the threat of Thatcherite “managed decline” and reinvented as hubs of the “knowledge economy”, all funded by government debt.

But a conflict of interest lurks beneath this picture. If we work on the assumption that the main beneficiaries of the higher education industry are supposed to be students, then it follows that institutions delivering shoddy teaching and useless degrees should be allowed to fail, as word spreads and students go elsewhere. But what if the main beneficiaries of this industry are in fact the cities regenerated with the borrowed money those students spend there?

In that case, from a policy perspective, the quality of the courses delivered will matter less than that students continue to arrive in their thousands, bringing their borrowed money to the region and spending it on accommodation, lattes, printer paper, fancy dress hire and all the other essentials of student life.

If the aim were indeed less the introduction of market forces than the use of students as a covert form of subsidy, we would surely see market distortions. In order to head off the threat of young people abandoning poor quality higher education, and entice them into shouldering their allotted portion of off-the-books government borrowing, the “graduate premium” would have to be maintained.

And indeed, since Blair’s student attendance target was first introduced, we can see that instead of using market forces to drive up quality the government has conspired with employers to cartelise the world of work. A growing number of roles that were once accessible via on-the-job training have — by government fiat if necessary — been rendered degree-only. Nursing is the classic example, but in 2016 this was even expanded to include the police, a move so unwelcome in the force that this year Lincolnshire Police launched a judicial review against the policy.

The victims in this situation are the students, who have come of age at a time when to have any hope of snagging a job they are more or less forced to leave their families and shoulder an enormous debt burden — over £50,000 each on average, according to the IFS. They must do so to acquire a degree whose value for money is declining, but which they cannot do without in a cartelised employment climate in which higher education is obligatory even as the grades it confers count for ever less.

Not only is the government paying for today’s elderly care (and banker bailouts) with borrowing that will fall on tomorrow’s taxpayers, but young people are also being forced to take on huge personal loans to fund degrees; degrees that are less useful as preparations for adult life than as a conduit for indirect subsidies for regional regeneration.

To make matters worse, the government knows that much of this borrowing will never be repaid, which will leave tomorrow’s taxpayer on the hook for yet more billions. It is an accounting fiddle on a gigantic scale, which penalises young people by first saddling them with loans, then devaluing their education, and finally by hiding government borrowing that future taxpayers will somehow have to meet.

Young people already live with the suspicion that overall public sector borrowing is running up a tab today that will be their burden tomorrow. The situation is far worse than they think.

This article was originally published at Unherd

Weekend long read: GPS ‘crop circles’

In 2003 science fiction author William Gibson said ‘The future is already here; it’s just unevenly distributed’. I was reminded of that phrase reading ‘Ghost ships, crop circles, and soft gold: A GPS mystery in Shanghai’, my pick for this weekend’s long read from MIT’s Technology Review.

It recounts the discovery of a phenomenon around the port of Shanghai in which the GPS transponders of oceangoing ships have been spoofed, meaning that vessels appear to be in locations where they are not or ‘ghost ships’ appear to be present where in fact no ship exists:

Although the American ship’s GPS signals initially seemed to have just been jammed, both it and its neighbor had also been spoofed—their true position and speed replaced by false coordinates broadcast from the ground. This is serious, as 50% of all casualties at sea are linked to navigational mistakes that cause collisions or groundings.

Mark Harris, MIT Tech Review

AIS transponders on vessels were introduced in order to increase safety and transparency in shipping. But as tracking technology advances, the means of hacking the system have kept pace, for example to obscure the activities of illegal sand dredgers in the Yangtze estuary:

Under the cover of darkness, AIS can be a useful tool for a sand thief. Ships that are not equipped or licensed for sea travel, for example, have been known to clone the AIS systems of seafaring boats to avoid detection.

Nor are sand thieves the only users of hacked AIS technology. In June this year, an oil tanker with a cloned AIS system rammed an MSA patrol boat in Shanghai while trying to evade capture.

Mark Harris, MIT Tech Review

The spoofed GPS signals appeared in a pattern analysts began to call ‘crop circles’: centred on the Huangpu river near Shanghai, and affecting not just ships but all GPS receivers in the area. Analysts are still unsure what is causing it. High-tech hacking to conceal illegal resource extraction or oil shipping? New forms of experimental weaponry?

The article leaves the conclusion open but one thing is clear: the age of big data will bring with it not just the potential for advances in medical research and social innovation (or surveillance) but also for unforeseen new kinds of crime and even warfare.

This article was originally published at Unherd

Remainers are the ones longing for empire

In his valedictory speech as outgoing European Council President, Donald Tusk described Brexit as a delusion driven by the foolish nostalgia of those Brits still “longing for the empire”. His words prompted the usual harrumphing, but the truth is he has it precisely backwards. It is not Brexiters who are chasing an imperialist high, but those devoted to the European Union.

Since its founding, the EU has self-mythologised as a project of peace, whose principal aim is to prevent a repeat of the two World Wars of 1914 and 1939. The basis for this argument tends to be a notion that the World Wars were caused by an excess of “nationalism”, with the aggressive and expansive German identity promoted by the Nazis held up as the primary exhibit, and that by diluting the power of Europe’s nation states nationalism will also be attenuated.

Lately, despite its convoluted and multi-causal origins, the First World War has also been recruited by European leaders as a cautionary tale against nationalism. But the origins of the Second World War can just as reasonably be described as a multi-sided jockeying for power between imperial powers.

And as Yoram Hazony has argued in The Virtue of Nationalism, Hitler was less a nationalist than an imperialist, who sought to expand German-controlled territory and as such was resisted by the rival empires of Britain, the United States and other allies. That is to say, the two World Wars were arguably more driven by the competing interests of imperial players than an excess of national identification as such.

Over the course of the horrific bloodshed between 1914 and 1945, these imperial powers lost their empires, or began the irreversible process of losing them. The British Empire was at its greatest extent, not to mention most crisis-ridden, after the end of the First World War, and by the end of the Second was exhausted to the point where it no longer had either the will or the resources to sustain its imperial reach.

The international world order that replaced the Old World empires from 1945 until relatively recently was, in effect, an empire of American-influenced rules underpinned by American military and economic dominance. And in this new age of Pax Americana, international conventions established the right of nations to self-determination. It was no longer the done thing to invade countries halfway round the world for the purpose of grabbing resources, extending geopolitical influence and/or “civilising” the natives.

With no one overseas to colonise, what happened to the old ruling bureaucracies of the formerly imperial nations of Europe? What now for those educated with imperial dreams and a global vision, trained from a young age to run international business and political institutions, dreaming of rule across vast territories and hundreds of millions of benighted souls in need of guidance?

The solution they came up with was to colonise one another. To console themselves for the loss of the riches and ready supply of servants in their overseas colonies, the washed-up post-imperial nations of Europe agreed to pool their reach, influence and unwashed natives into a kind of ersatz empire.

It did not greatly matter whether the natives in question liked the idea or not, as the pooling was undertaken largely without public discussion and in practice (to begin with at least) made little difference to their everyday lives. Rather, the extension of ‘reach’ and ‘influence’ was largely a bureaucratic one, harmonising rules on the kind of trade and manufacturing standards which most ordinary people care very little about.

The result provided an imperial buzz for a cadre of civil servants, who got to dictate standards on the minutiae of countless areas of commerce for hundreds of millions of people rather than mere tens (and enjoy the perks of a colossal corporate lobbying industry in the process).

Even better, they could do all this without any of the demonstrable dangers of the kind of overheated jingoism that came with the style of imperialism that ended in bloodshed with the two world wars. A kind of diet imperialism, if you like: all the fun of civilising the heathens, with none of the guilt.

Their diet empire now constituted, the post-imperial civil servants of each EU member state could enjoy something of the lavish transnational lifestyle, money-no-object pageantry and grand entertaining they missed out on by the unfortunate fact of having been born too late for a career enjoying absolute power in the colonies while feathering their own nests. Indeed, the strange disappearance of a 2014 report on corruption within EU institutions suggests the diet imperialism of Europe offers ample opportunities of the nest-feathering variety.

Those in the administrative class who missed out on the opportunities for self-enrichment in the prewar empires can enjoy instead the huge and relatively unaccountable sums of money that flow around the European Union’s various budgets.

Indeed, even when misbehaviour tips over into outright criminal activity it can sometimes go unpunished, as was the case with IMF head Christine Lagarde, who received a criminal conviction in 2016 for negligence over inappropriate payouts while in the French Government but was nonetheless installed this year as head of the European Central Bank.

The administrative empire also delivers a servant class, at a scale appropriate to the post-imperial nostalgia it serves to alleviate. The debate around the Brexit referendum was full of dire warnings about the looming loss of staff to (among other things) wipe bottoms, look after children, pick fruit  and make lattes.

These laments strongly hint at the preoccupations of a colonial class reluctant in the extreme to let go of a rich supply of subaltern masses whose services were rendered affordable by the expansion of the labour market through freedom of movement.

It is not just the servants. The prospect of losing the European extension to their shrunken, empire-less British geopolitical self-image cuts to the heart of our modern governing class. As one would expect, then, those lamenting Britain’s post-Brexit loss of “standing” or evolution into a “laughing stock” (who cares?) are not the supposedly imperialist and thin-skinned Brexiters but those who wish to remain. Because in their view the only available modern source of the suitably elevated pomp, influence and imperial “standing” to which they feel entitled is our membership of the EU.

Paradoxically, in the act of accusing Brexiters of the imperial nostalgia of which they themselves are guilty, the Remain Europhiles have hit on a term which is more accurate than they realise for their Brexiter foes: Little Englanders. As has been pointed out elsewhere, the original Little Englanders were anti-imperialist, and wanted the borders of the United Kingdom to stop at the edges of the British Isles.

The epithet tends to be used against Brexiters to imply jingoistic and probably racist imperial aspirations, but this is the opposite of what it meant when first used. And taken in its original sense, calling Brexiters Little Englanders is entirely accurate: they would like the borders of the nation to which they belong to be at the edge of the British Isles, not along the edge of Turkey or Russia.

Should they get their way, this will present the United Kingdom with the prospect of life as an independent nation of modest size. We can then look forward to a future going about our business much reduced from the giddy, extractive and racist highs of the early twentieth century but hopefully more stable, more content with ourselves and, importantly, perhaps even finally at ease with the loss of British imperial reach.

For the imperialist nostalgists of Remain, though, unable to reconcile themselves to the notion of the United Kingdom as anything but a world power, this possibility is anathema. The argument tends to be that unless we join a large power bloc we will be ground to dust between them. Gideon Rachman argued recently in the FT that “the EU needs to become a power project”: future geopolitics will be a contest between four or five large blocs, including China and the US, and the individual nations of Europe cannot hold a candle to these behemoths.

But must this necessarily be so? Rachman’s future is just a projection, and many projections – such as Fukuyama’s famous one about the “end of history” – have been proved wrong by subsequent events. Admittedly, a multipolar future seems likely. But every age of competing superpowers has also contained smaller nations that managed to avoid absorption into a larger empire by one means or another. Why should Little England not be one of them?

The only thing holding us back from a post-Brexit and doubly post-imperial future, at ease with our reduction in stature and ready for a new chapter in our national history, is the imperial nostalgia of the Europhiles.

This post was originally published at Unherd