I joined Palladium’s Wolf Tivy to talk about my theory that what gets called ‘postmodern’ today is really the last stand of high modernism, and the interpersonal implications of genuinely decentering meaning and subjectivity without collapsing into nihilism. Have a listen here.
While we live, we all present different facets of ourselves to different people. Whether in our friendships, work, family or at different times in our lives, we encounter others. All remember us slightly differently, according to their perspective.
While we live, our physical presence holds that multiplicity together. After we die, though, the memories begin to come apart. When my step-grandfather married my grandmother, he already had two children with his first wife. But she had already left him and moved to a different country; he was stepfather to my mother and aunts instead.
He was a big character: an aristocrat of the Greatest Generation, the subject of several films about his war exploits, well-loved farmer, and patriarch to two families. At his funeral, the many facets of his life were already coming apart. Each version of his memory was fiercely defended by the mourner to whom it belonged. Long-standing quarrels, no longer held in check by his living presence, began trickling back into the open. It was not an easy day.
Today, we are all mourners at the funeral of a character on a scale that dwarfs even my roaring, hectoring, pedantic, affectionate, and irascible step-grandfather. We are gathered to mourn teleology itself—the belief that life has objective meaning and direction. What we call the culture war is the aggregate of those quarrels now breaking out between the gathered mourners over their divergent memories of the deceased.
Were we progressing toward universal peace, justice, and equality? Was it the resurrection and the life of the world to come? Perhaps it was the end of history in universal liberal democracy? We cannot agree.
The death of teleology represents a collective cultural trauma that accounts for, among other things, the increasingly unhinged debates around social justice within elite universities, and the reactive phenomenon of the aggressively transgressive online far-right.
But it doesn’t have to be like this. Post-structuralism killed teleology, but it did so in error, by taking a wrong turn; it is this wrong turn that has left us so traumatized.
What is commonly referred to as postmodernism is not in fact post-modern but rather represents a last-ditch attempt by modernism to resist the implications of the post-structuralist mindset whose inevitability is now indicated by fields as diverse as physics, ecology, and psychotherapy.
Deconstruction is not the end: reconstruction is possible, indeed essential.
To situate myself a little in this story: I belong to a generation that is marginal, facing two directions, in several ways that are relevant to my argument. Born in 1979, I sit at the tail end of Generation X. I am old enough to remember the days before the internet, but young enough to be more or less a digital native. I got my first cell phone and email address as an undergraduate at Oxford. I researched my undergrad essays sitting in actual libraries reading physical books, but wrote them on a word processor. I can remember life before social media.
I also received, prior to undergraduate life, a recognizably classical education. This was, in the old-fashioned way, designed to deliver a whistle-stop tour of the march of civilizations from Ancient Egypt via the classical era to Western Christendom, with at least a vague grasp of the cultural and historical highlights of each.
The overall impression delivered was of an evolution of societies, consciousnesses, and cultures over a vast sweep of time and different human epochs that nonetheless seemed to have at least some narrative continuity and directionality. Everything else we learned seemed at least to an extent framed by that sense of situatedness within a larger narrative of human cultural evolution, whose direction was a mystery but did at least seem to be headed somewhere.
Then, in my first year as an English Literature undergraduate, I encountered critical theory—and the entire organizing principle for my understanding of reality fell apart.
To summarize: Saussure proposed that instead of treating language as transparent, its meaning rising off the page without any need for elucidation, we should split the linguistic sign into ‘signifier’ and ‘signified.’ That is, what a word means is separable from the word that means it. We can thus, he argued, institute a new discipline of ‘semiotics’: the study of signs—a study that reaches far beyond language and was immediately influential in the social sciences.
This insight was developed by Jacques Derrida, whose simple but devastating observation was that if this is the case, we cannot define any given ‘signified’ except with reference to further signs, which then in turn themselves require definition with reference to further signs. It’s turtles all the way down. We have no means, through language, of arriving at any kind of truth that we are able to experience directly. Furthermore, the concerted effort by millennia of culture to obscure the fact that it’s turtles all the way down is in fact a cunning project to shore up entrenched interests, and to conceal the operations of power.
In this view, recourses to authority are null and void. There is no solid foundation, no God, no truth, no authority. Only power, and a form of consensus reality arrived at through the glacial accretion of a billion tiny operations of power that have, in sum, decreed that the world should be thus and not thus.
For me, the shift from a sense of the world as having some stable narrative trajectory to this perspective, in which meanings were not only networked but fundamentally without foundation, was deeply disturbing. It landed like a psychotic experience. Overnight, the hallowed architecture of Oxford University went from seeming like a benign repository of traditions within which I could find my place, to a hostile incursion into my consciousness of something phallic, domineering, and authoritarian. I remember describing to a friend how, as a woman and sexual minority, I suddenly experienced the ‘dreaming spires’ as ‘barbed penises straining to penetrate the sky.’
I wish I could say it passed, but it did not. What did happen, though, after I left, was that I found an accommodation with the loss of teleology and objectivity from my frame of reference. I did this by theorizing that if to posit anything at all is an act of power, then it was one I was also entitled to attempt. All cognition, meaning-making, interpretation, and perception is conceptually laden and socially mediated action. It is impossible to ground even perception in anything but action and thus power. But so be it. We live in a society and participate in the flow of power all the time. I developed the idea of ‘temporary certainties’: the idea that even if meanings are not stable, many of them are stable enough for me to act as if they were solid in the pre-Derridean sense. I did not have to deconstruct every minuscule interaction for the operations of power it encoded.
In an effort to evade the monstrous pervasiveness of systems of domination and submission, I experimented with radically non-hierarchical forms of living, power exchange sexualities, non-binary gender presentation. I tried my own operations of power: I changed my name to Sebastian, to see what it felt like, then settled for a while on Sebastian Mary. I co-founded a startup with friends, in which we tried to avoid having a management hierarchy.
My accommodation kind of worked, for a while. But it did not last. It is all very well to theorize about non-hierarchical forms of organization, but in order to get stuff done you need a chain of accountability. And the worst sort of hierarchies have a habit of emerging, too, especially in social situations where they are intentionally obscured or deprecated. Communes, collaborative projects, and the like all find their leaders and followers, or their tyrants and victims. My increasing bitterness as I learned this, in the course of trying to get somewhere with the startup, made me so obnoxious as a co-worker that eventually I was expelled from the project, which was, by then, failing anyway.
With that rupture, I lost my social circle, my best friend, and my entire carefully reassembled working theory for how to navigate the rubble of broken teleologies that was my adult life in the ‘00s. Concurrently, the Great Crash of 2008 destroyed the equally teleological fantasy of global liberal-democratic hegemony under international capitalism that had powered the Iraq invasion along with the triumphalism of the Blair years.
In the wreckage, though, something wonderful happened. Two wonderful things, actually. First, I met the man who I would eventually marry, and by degrees let go of the belief that in order to sustain my integrity as a person I had to reject any form of stable loving relationship to an Other in favor of multiple, overlapping, unstable platonic, sexual, or ambiguous friendships. Second, I decided I needed to learn how to do something more useful than floating around London curating experimental art events and screwing up entrepreneurship, and went back to school to train as a psychotherapist.
In the course of that study, I learned where postmodernism took its wrong turn. Implicit in the post-structuralist theories taught to every young humanities student at university is the idea that because meanings have no singular objectively correct grounding, they are therefore of no value. Also implicit is the idea that because of this, no satisfying, authentic or truthful encounter with the Other is ever possible—only an endless recursive hall of mirrors composed either of our own anguished reflections or the invasive pressure against our psyches of another’s desire.
In studying psychotherapy, though, I came to realize that while the same post-structuralist decentering of the self took place in psychoanalytic theory between Freud and his contemporary descendants, therapists had—because they have to—rejected the idea that we can never encounter the other. While much contemporary analytic theory acknowledges the need to excavate and make space for the operations of overdetermined systems such as race, class, or sex, it does not automatically follow from the presence of those things that intersubjective contact and meaningful connection cannot take place.
Just as post-structuralism decentered the observer, intersubjective psychoanalysis radically decenters the analyst. But an intersubjective understanding of the relational space as co-created by client and therapist does not preclude the possibility of therapeutic work taking place. And this in turn speaks powerfully to a claim that however muddled, muddied and overdetermined our encounters with the other may be, they still contain the potential to be not just benign but real, true, and transformative.
I suppose I could deconstruct that claim in turn. But I have experienced its truth both as client and also, in the course of my work, as therapist. Through intersubjective encounters in the consulting room, I have been transformed, and have transformed in turn. From this vantage point, the claim of post-structuralism to render meaningless all semiotic systems, and reveal as brute operations of power all encounters with the other, seems not just mistaken but (in the Kleinian sense) paranoid-schizoid. It is the tantrum of a child who, on realizing they cannot have exactly what they want, refuses to have even the next best thing and dismisses everything and everyone as evil.
The alternative to this paranoid-schizoid repudiation of meaning is not to reject meaning as dead or hopelessly suborned by power, but to accept that we are enmeshed in, shaped by, and in turn helping to shape networks of meaning as part of a dynamic dialogue. We are nodes in the social and semiotic system. As such, even the act of contemplating those systems of meaning will have some tiny effect on them. When Derrida said ‘Il n’y a pas de hors-texte’—”there is no outside-text,” though commonly mistranslated as “there is nothing outside the text”—I took it to mean meaning itself was hopelessly corrupted, and objectivity a bust. Today, I see it more as a radical decentering of my selfhood that opens up new, vibrant possibilities of connectedness.
If we read ‘text’ in the biosemiotic sense as fractal, multi-dimensional, and interconnected systems of signification, both of human culture and the natural world (inasmuch as those things can even be separated), then indeed there is nothing outside the text. But that does not mean the text is wholly illegible, or that it does not exist—simply that in reading, we affect what it says, and in return it changes us. We are unavoidably caught up in perspectival context, without truly objective ground to stand on. But objectivity was always an implicit abdication and obscuration of power and the necessity of choice. It was the idea that we could calculate what to do from objective factors that we didn’t have to take responsibility for. We do have to take responsibility, but that can mean a proactive positive acceptance. We can step up to the challenge of power and perspective, rather than reject it out of guilt and trauma.
Seen thus, a living post-structuralism is a philosophy not of radical alienation but radical interconnection. It is not the death of stable meaning, but the moment a system we thought rigid, immovable, and observable from the outside stirred and opened its eyes to return our gaze. It is also increasingly supported by contemporary studies in—for example—ecology and theoretical physics. If even the hardest of hard sciences now advances a theory of reality that embraces radical uncertainty and the implication of the observer in what is observed, then surely the humanities can do so as well without giving up on meaning altogether?
The great insight of postmodernism is that meaning is unstable, and mediated in infinite complexity by systems of power in which we are decentered but implicated. But the response to this insight from the humanities has been a furious rearguard action by the ideology of fixed meanings that postmodernism itself displaced. Enlightenment rationalism is to postmodernism as Newtonian physics is to general relativity, and it is in the ‘social justice’ ideologies now increasingly hegemonic in elite institutions that Enlightenment rationalism is seeking to make its last stand against the new philosophy of radical interconnection.
If postmodernism claimed that all meanings are unstable, socially constructed, and held in place by operations of power, the defining characteristic of the anti-postmodernism that masquerades as contemporary postmodern thought is its determination to apply that analysis to everything except its own categories and hierarchies. In effect, this system of thought seeks to recoup semiotic stability by replacing the old ‘bad’ hierarchies of Western, patriarchal, heterosexual, etc. dominance with new ‘good’ ones.
All activities, goes the claim, are tainted by the toxic operations of overdetermined systems of oppressive social meaning which speak through us and over us regardless of what little agency we might imagine ourselves to have. So in the political framework of anti-postmodernism, fixed immutable characteristics such as race assign their bearers a position on a rigid hierarchy of ‘marginalization’ which in turn influences their status within the system. The legitimacy of the new, fixed hierarchies of marginalization-as-status rests, we are told, in how they correct for, deconstruct, and overcome previously imposed power inequalities. The chief form of political action is a wholesale effort to dismantle these former inequalities, wherever they may be found.
But in practice, the demand that all historically imposed power relations be deconstructed unwinds the legitimacy of any possible social relationship or institution. All meanings necessitate the exclusion of what-is-not-meant. Making absolute inclusion a central political demand is thus in effect a call for the abolition of meaning. We are never told what form the good life might take, should this project of semiocide ever be completed. But one thing is clear: it can have no social or intersubjective dimension, for that would imply shared meanings, and with shared meanings the operations of power—exclusion, definition, the imposition of significations not wholly self-chosen—inescapably return, as do hierarchies. In this sense, the push for semiocide in the name of social justice is a project whose ultimate aim is an individuation so total it precludes any form of encounter with the Other, except in a multidirectional contest for domination that none can be permitted to win.
From other vantage points within the culture war, the reaction to this doctrine is often mockery, for the doctrine’s self-absorption, incoherence or preoccupation with language and ‘luxury beliefs.’ This is mistaken. Its adherents are motivated by compassionate idealism, but have been misled by a destructive falsehood and are in many cases deeply unhappy. The decentering of the Enlightenment subject brings with it an invitation to a more fluid experience of selfhood as radically inseparable from and in a process of co-creation with all of reality, and yes, with the power structures of the society in which we live. But the contemporary critical theory I am calling anti-postmodernism shows young people this vision of beauty, only to dismiss it as a pack of tendentious, self-interested lies.
It is no wonder today’s young people fling themselves miserably against the bars of whatever structures of meaning are still standing in an effort to knock them down—or perhaps to prop themselves up. Whether it is the SJWs, the frog memers, or the ‘failson’ ironists, they can smell the fresh breeze of meaning, less linear than the rationalists would like but nonetheless real, and yet they have been told they cannot have it, because it is not there, or else comprises only violence and hostility. So, they fight over the broken rubble of the Enlightenment, or with each other, or their ancestors, and starve in the midst of a banquet.
To recap, then: what gets called ‘postmodernism’ today is not postmodernism but the last spasm of the worldview displaced by postmodernism, that saw meanings as fixed, knowable and amenable to human mastery. This anti-postmodernism diverts young people from the astonishing richness of a systems-based, decentered engagement with the world’s semiotic complexity by seeking the only remaining form of mastery it can imagine: a defensive assault on meaning itself.
Instead of embracing the fluidity of systems of meaning, and each subject’s situatedness within that system, young people are taught that the only legitimate foundation for political action—or indeed any kind of social participation—is atomized selfhood, constructed from within and defended with narcissistic brittleness. They are taught to see themselves as solely responsible for discovering, curating, optimizing and presenting this supposedly ‘authentic’ self as their central marketable asset. But they also learn that it is continually under assault by hostile forces of oppressive social meaning whose aim is to keep them—or others like them, or someone anyway—marginalized, abject and on the back foot.
Within this system, it follows that the central locus of political activism must be to disrupt these oppressive forces that marginalize unfavored groups, so as to advance the project of collective liberation to ‘be our authentic selves.’ This is not just a political project but an existential one, for along with marginalizing unfavored groups these forces impose unlooked-for and oppressively overdetermined social meanings on each of us, undermining each young person’s quest for authentic selfhood. Individuals caught up in this worldview genuinely believe they are agitating not just for the liberation of the oppressed but for their very existence.
The fixation of today’s elite graduates on ‘validation’ of ‘identities’ may seem frivolous to older generations. But within a worldview that frames all forms of social meaning as oppressive by definition, the very gaze of the Other is an unacceptable attack on the pristine territory of the self. If we reject the genuinely postmodern ethic of radical semiotic interconnection, and our interwovenness with structures of meaning in society and the natural world, then the movement of these structures in, on and within our individual identities comes to be experienced as violence.
This perspective exists in tormented symbiosis with an Other it can neither tolerate, nor yet wholly dispense with. For the paradox is that the invasive gaze of the Other, laden with unwanted and oppressive shared meanings, is simultaneously the source of suffering and salvation. The gaze of the Other is experienced as a hostile and violent invasion, forever imposing unlooked-for social meanings that constrain the liberty of each sacred self. But it is also the only source of the ‘validation’ that will reassure each individual that their self-creation project is real, true and accepted.
The solution, within this worldview, is an ever more desperate effort—again paranoid-schizoid, in the Kleinian sense—to control the thoughts of the Other. We see this in politicized campaigns to control speech in the service of identities. But as any psychotherapist (or parent) will tell you, trying to control the inner life of another is a project that in normal circumstances seems achievable (or indeed desirable) only to young children or the mentally disturbed. That it should become a central political desideratum for a generation of elite young people does not bode well for the future health of public life.
When I started my undergraduate degree 20 years ago, critical theory was one epistemology among several, which we learned about as it were ‘from the outside’ rather than as a framework for understanding other philosophies. Though it affected me severely, in ways I have already described, most of my contemporaries simply learned about the ideas and were largely unaffected. Today, though, this epistemology has eaten and digested the humanities and begun to nibble on science and even mathematics. As a result, for today’s young people, it is increasingly difficult to find a vantage point outside its political ontology from which to evaluate its operations.
We should not be surprised, then, that mental health issues have skyrocketed in elite college-age populations. They are being taught to believe, as a foundational framework for understanding the world, that acceptance in the gaze of the Other is key to validating a selfhood they alone are responsible for creating, curating and optimizing. But they are also being taught that all shared meanings—in other words, anything conferred by the gaze of the Other—represent a hostile act of violence. How is any young adult meant to navigate this catch-22?
It is a mistake to dismiss this as narcissistic—or, at least, to ignore the suffering of those trapped in this bind. To be ‘defined’ by something other than our own desire is in this system to be injured, parts of our authentic self mauled or amputated, whether by social meanings we did not choose or the givens of our embodied existence. This is a phenomenally cruel thing to teach young people, as it leaves them feeling perpetually oppressed by the givens of existence itself.
This analysis also sheds light on the crisis of elite purpose and leadership Natalia Dashan described in her Palladium piece last year. If shared meanings are not only unavailable but actively hostile, how is any young person meant to formulate a legitimate rationale for stepping up? No wonder so many elite graduates dismiss even the possibility of public service in favor either of pecuniary self-interest in finance or tech, or else joining the ranks of activist-bureaucrats seeking to advance the destruction of shared meanings in the name of total inclusion.
But as societies around the globe struggle to get to grips with coronavirus, we no longer have the luxury of sitting about like Shakespeare’s Richard II, mourning a broken model of meaning as the world disintegrates around us. Facing the deaths perhaps of loved ones, and certainly of everything we thought of as ‘normal’ until a few weeks ago, destroying what is left of our structures of social meaning in the name of liberation politics or frog-meme irony is an indulgence we cannot afford. The project of reconstruction is urgent. This project is both an inner and an outer one: reconstruction of an inner life capable of navigating social meanings without experiencing them as violence, and also of our willingness to participate in the external, political analogue of those social meanings, namely institutions, political structures and—yes—hierarchies.
This is not to say that we should shrug at unjust systems of domination. The ‘social justice’ excavation of ‘implicit bias’ is not wholly without merit. It is on all of us to make sincere efforts to meet the Other to the best of our abilities as we find it, and not simply reduce the world out there to our preconceptions. But this effort cannot be so all-encompassing as to destroy what systems of shared meaning we have left. Nor can we afford to see it grind common endeavor to a standstill.
No one knows yet what the world will look like as we emerge from the political and economic convulsions engendered by this global pandemic. One thing is clear, though: the ethic of radically individualist atomization implicit in ‘social justice’ campaigns for the destruction of all shared meaning is woefully inadequate to the challenges we now face. Through its lethal spread and infectiousness, coronavirus has demonstrated vividly how our fates remain bound to one another in infinitely complex ways, however loudly we may assert our right to self-authorship. Faced with the persistence of our social, biological, semiotic, economic, and ecological interconnectedness, we would do well to embrace and make a virtue of it, to salvage those shared meanings that remain to us, and begin the process of building new ones that will sustain us into the future.
This article was originally published at Palladium magazine.
Today’s hottest property: young fogeys. Blue Labour hailed Boris Johnson’s landslide election victory as a rebellion by the country’s ‘culturally conservative’ silent majority. A new conservative magazine seems to appear every week. We have even seen a youth movement for the revival of socially conservative values popping up in that bastion of modern double liberalism, the Conservative Party.
What do they all want? At the more wonkish end of the debate, the argument is broadly that the political push throughout the twentieth century for ever greater social and economic freedom has brought many benefits, but that these have been unevenly distributed and are now reaching the point of diminishing returns.
The pursuit of ever greater freedom and individualism, this strand of thought argues, has delivered rising wealth while hollowing out working-class communities; liberated some women while forcing others to work a double shift and abandon the young and old in substandard care; and provided an infinitude of consumer choice, but at the cost of mounting ecological damage. Under the sign of radical individualism, the new communitarians argue, we are all becoming more solitary and self-absorbed. Even charitable giving seems to be in unstoppable decline.
But what, in practice, are the new social conservatives seeking to conserve? Calls for a revival of cultural conservatism, many in the name of Christian values, seem often on closer examination oddly insubstantial. In 2017, UKIP’s leader-for-that-week Stephen Crowther said that the UK is a Christian country, “and we intend to stay that way.” But for Crowther, being a Christian country does not seem to impose any obligation to actually be Christian:
including Christian in our list [of principles] does not imply any requirement for individual faith, but it reflects the Judeo-Christian classical and enlightenment origins on which our laws, our social systems and our cultural norms have been built over two millennia.
Elsewhere in Europe, Hungarian Prime Minister Viktor Orbán describes his brand of authoritarian, identity-oriented politics as ‘Christian democracy’. Only a minority of Hungarians go to church every week – 56% of the country identifies as Catholic, though only 12% attend church regularly – but the identifier ‘Christian’ has nonetheless become central to Orbán’s politics.
Much as Crowther did, the Orbán-supporting Bishop of Szeged, László Kiss-Rigó, bridges this gap with a vague, cultural definition of what actually constitutes a ‘Christian’: “In Europe, even an atheist is a Christian”, he said. It turns out that being ‘Christian’ is less about prayer or doctrine than ‘values’: “We are very happy that there are a few politicians like Orbán and Trump who really represent those values which we Christians believe to be important.”
What exactly are these values, then? Attendees at anti-Islam Pegida rallies in Germany carry crosses and sing carols. Italian right-winger Matteo Salvini punctuates anti-immigration rhetoric by brandishing a rosary, drawing criticism from the very Catholic faith whose symbols he invokes. Try to pin down any actual values this form of Christianity might require of its adherents, and matters are much less clear.
Even those whose stated desire is to defend the place of faith in public and political life seem keen that the faith itself stop short of imposing actual obligations. To take a more moderate example of the new cultural conservatism, the Social Democratic Party took a broadly post-liberal, culturally conservative stance in its 2018 relaunch. The New Declaration made an energetic defence of our right to hold even illiberal religious views openly in public life:
Citizens holding a traditional, patriotic or religious outlook are often bullied and marginalised, stifling the open debate upon which a free and democratic society depends.
Then, about a year later, the SDP lost its only donor over a bitter intra-party dispute about whether or not it should be party policy to ban halal slaughter – a position markedly at odds with the party’s previous defence of religious pluralism. And when the Church of England recently reiterated its long-held position on sex and marriage, prominent SDP member Patrick O’Flynn took to the pages of the Daily Express to mock ‘the otherworldliness of these Men of God’. Instead of insisting on ‘out of touch’ doctrine, O’Flynn suggested, in order to attract more young people to weekly worship the Church should adjust its doctrines on sex and marriage to reflect their values.
In this view of faith, theological positions do not reflect any kind of truth-claim but should be emergent properties of the aggregate ethical positions held by the members of that church. Less ‘Christian democracy’ than ‘democratic Christianity’: whatever the congregants believe becomes the doctrine of the church.
From a religious perspective this makes no sense. To the believer, doctrine is handed down from God Himself. The thought of God’s word being subject to plebiscite is absurd, if not outright blasphemous.
This debate reveals the missing piece in today’s would-be conservative revival. Where do our values come from? What is the proper source of political authority? Progressives gesture at natural rights or an imagined future utopia, but for anyone who remains unconvinced that we are all on a journey somewhere wonderful, some other authority is required.
Edmund Burke suggested the answer lay in a blend of deference to tradition and God’s grand design, tempered by carefully constrained democratic institutions; his Savoyard contemporary, Joseph de Maistre, argued that the only proper form of authority lay in God’s will, delivered via the Pope and an absolute monarch.
The history of modernity has unfolded in the tensions between these competing understandings of political authority. ‘The will of God’, the will of ‘the People’, and the grand designs of various utopias have variously been used to justify all manner of enterprises, with outcomes from the magnificent to the horrific. But our present political difficulties may be in part down to a growing popular discomfort with accepting the legitimacy of any of the above.
Since the election of Donald Trump and the vote to leave the EU, there has been a low but persistent rumble from our moral betters that democracy should maybe have its wings clipped a little, to stop stupid proles making bad decisions. A degree of wing-clipping has in fact long since taken place: John Gray has discussed recently in these pages the way the language and legal mechanism of ‘rights’ is used to shift entire areas of public life from democratic debate to the dry realm of unelected lawyers and judges. But if authority does not reside in the will of the people, nor does it reside with God: it is difficult to imagine a mainstream British politician claiming moral authority on the basis of divine will without being roundly pilloried.
Progress and human rights, then? Every young person who passes through a modern university is taught in no uncertain terms that totalising metanarratives are suspect. At best, they are power moves. Whenever you find one you should ask cui bono? In the case of universal human rights, the answer is probably: lawyers.
This leaves would-be conservatives in a bind. If (with a few honourable exceptions still holding out for direct Vatican rule) political authority rests not in tradition (too restrictive on personal liberty) or democracy (probably rigged), nor in God (don’t tell ME what to do!), nor even in the lawyers, then what is left? Politics professor Matt McManus argues that the result is a postmodernism of the right as well as of the left: a series of nested calls for a return to authority, tradition and culture that all, on closer inspection, turn out to be largely delivery mechanisms for adversarial but hollow identity politics.
Having come unmoored from its roots either in the past, the divine, or the popular will, McManus suggests that this postmodern conservatism has warped a Burkean belief in tradition into a kind of moral cosplay whose main purpose is less seeking the good life than making a noisy defence of whichever identities its sworn enemies attack. As the postmodern liberal-left demonises heterosexual white males, so postmodern conservatism sets out to defend them; and so on.
Seen in this light, the problem with Orbán and other borrowers of Christian clothing is not that they do not believe their own words. Inasmuch as they can mean anything, they genuinely identify as Christians. It is more that when all sources of authority are suspect, the only legitimate recourse is to the self: to identity, and identification.
And the problem with identification is that it remains separate from whatever it identifies as. Just like the modern dating marketplace, where commitment is radically undermined by the ease of swiping right, modern cultural conservatism is radically undermined by the fear that without a reliable foundation of authority, and with more identity-choice options only a click away, we are never fully the thing we claim as our identity.
Without a sense of confidence in the roots of its political legitimacy, conservative values dissolve from concrete obligations to consumer accessories. This in turn is why Orbánist ‘Christian democracy’ and many of its populist cousins find their most compelling realisation not in religious doctrine or observance, but in defining themselves against their outgroup. If “even an atheist is a Christian” then either no one is a Christian, or everyone is. The only way of defining what a Christian is, is in terms of what it is not: foreigners.
But if this is so, then in a postmodern environment, shorn of recourse to authority, cultural conservatism is a waste of energy. It cannot define what it wants. All is insubstantial; there is no exit from the Matrix, nothing left to conserve.
Does it follow from this that those who long for place, limits, love, family, faith and meaning should just sit in the rubble and watch it all burn? I do not think so. But when there is nothing solid to go back to, anyone attracted to what is left of the ideology that used to be called ‘conservative’ needs to find a new name for their yearning. ‘Constructionists’, perhaps. There is a lot of building to do.
What if we could create a marketplace for relationships, so that – just as we can rent our homes on Airbnb – we had an app that allowed us to sell at the market rate dinner with our husbands or bedtime with the kids?
Marriage is a legally recognised agreement after all, one that has been shown to confer many benefits for health and wellbeing. Why should I not be able to rent my place as wife and mother in my particular family to others who wish to enjoy some of those benefits?
Ryan Bourne of the Cato Institute recently argued that the technology exists to enable us to trade citizenship rights. Calling the right of British nationals to work in the UK’s high-wage economy “an effective property right we own but can’t currently trade”, he suggests we could ease immigration pressures by implementing an Airbnb-style secondary market in working rights.
If we frame citizenship, or marriage, as something owned by an individual, it is simply a set of bureaucratic permissions. Like the right to live in a house, surely this could be traded in a marketplace? And if the technology exists to create a citizenship market, surely we could do the same for marriage? I could sublet my wifedom and nip off for a weekend on the tiles with the proceeds. Why not?
The problem is obvious — my husband and daughter would, not unreasonably, object. She would no more want her bedtime story read by a stranger than my husband would want to share a bed with that stranger.
My marriage is not a good I own but a relationship, created by mutual consent. In a marriage, I give up some of my autonomy, privacy and private property rights by declaring my commitment to the relationship. What I gain is of immeasurable value: a sphere of belonging, the foundation of my existence as a social creature.
Likewise, citizenship implies relations of belonging: of me to a community, and of that community to me. It also implies commitments on behalf of the community of which I am a citizen. And in exchange it requires commitments of me, as a citizen: to uphold the law, to behave according to its customs and so on. As the late Roger Scruton put it in a 2017 speech:
The citizen participates in government and does not just submit to it. Although citizens recognise natural law as a moral limit, they accept that they make laws for themselves. They are not just subjects: they appoint the sovereign power and are in a sense parts of that sovereign power, bound to it by a quasi-contract which is also an existential tie. The arrangement is not necessarily democratic, but is rather founded on a relation of mutual accountability.
Just as my husband and daughter have a stake in who is entitled to be called “wife” or “Mummy” in our particular context, so other citizens of a nation have a stake in who is entitled to the rights conferred by citizenship.
In this light we can better understand the revulsion that greeted the actions of the Duke and Duchess of Sussex in trademarking “Sussex Royal” for personal commercial gain. Royalty, after all, does not exist in a vacuum. It is not an intrinsic property of a person, like blue eyes or long legs, but something conferred both by the monarchy and also by the subjects of that monarchy.
As Charles I discovered in 1649, ultimately no king can govern save by the consent of his subjects. Royalty is not a private property, but a relationship. The popular disgust and anger engendered by the Sussexes’ move to transfer their stock of royalty from the relational public sphere to that of private property is in truth anger at their privatising something which does not belong to them but to the whole nation.
In The Question Concerning Technology, writes Josh Pauling, Heidegger argues that technology uncouples humans from what is real, paving the way for a mindset that treats everything as “standing-reserve”, or in other words “resources to be consumed”. For Heidegger, seeing the world thus is dangerous because it flattens all other perspectives:
Commodifying nature and humanity leads us to discard other understandings of being-in-the-world and the practices, beliefs and ideas that accompany them: all aspects of reality are incorporated into the ordering of standing-reserve.
My husband’s goodwill would rapidly wear thin were I to Airbnb my role in our family. Similarly, Bourne’s citizenship marketplace fails to consider how the general population would react to seeing fellow citizens renting their right to work to non-citizens and swanning about spending the unearned proceeds. And the goodwill enjoyed by the Duke and Duchess of Sussex while discharging their royal duties has already evaporated, now it transpires they wish to enjoy the privileges of their elevated station without embracing its obligations.
Treated as objects to be exploited, relational meanings wither and die. Treated as dynamic relationships, they are infinitely renewable. In this sense, they are more akin to ecologies in the natural world. In Expecting the Earth, Wendy Wheeler argues that in fact ecologies are systems of meaning: whether at the level of DNA or megafauna, she says, living things deal not in information but in meanings that change dynamically depending on context.
Why does any of this matter? “Modernity is a surprisingly simple deal,” writes Yuval Noah Harari in Homo Deus. “The entire contract can be summarised in a single phrase: humans agree to give up meaning in exchange for power.” The impressive achievements of modernity might make the loss of meaning seem, to some, a fair exchange.
But if Wheeler is right, meaning is more than an optional seasoning on the mechanistic business of living. In Man’s Search for Meaning, Viktor Frankl observes of his time in Nazi concentration camps that those who felt they had a goal or purpose were also those most likely to survive.
Indeed, the growing phenomenon of “deaths of despair” is driven, some argue, by deterioration in community bonds, good-quality jobs, dignity and social connection — in a word, the relational goods that confer meaning and purpose on life. As Frankl observed, humans need meaning as much as we need air, food and water: “Woe to him who saw no more sense in his life, no aim, no purpose, and therefore no point in carrying on. He was soon lost.”
An order of commerce that treats relational ecologies as objects that can be exploited will exhaust those objects. That is, in the course of its commercial activities it actively destroys one of the basic preconditions for human flourishing: meaning.
The Estonian thinker Ivar Puura has called the destruction of meaning “semiocide”. As concern mounts about the effects of pollution and emissions on the earth, campaigners have called for new laws to criminalise the destruction of ecologies, which they call “ecocide”. Perhaps we should take semiocide more seriously as well.
We started the 2010s reeling from the Great Crash of 2008, and ended the decade with angry populism widespread in the Western world. Today, the global economy limps on more or less as usual, while resentment grows among the “little people” at an economic consensus many feel is rigged without really knowing who is to blame. Over the same period, climate change activism has gone from being a minority pursuit to mainstream debate, occasioning worldwide “school strikes” and, since the beginning of the year, the high-profile and colourful Extinction Rebellion movement.
What these dissatisfactions share is a sense of being trapped in an economic logic whose priorities no longer benefit society as a whole, but that — we are told — cannot be challenged without making things even worse. The foundational premise of that system is continual growth, as measured by GDP (gross domestic product) per capita. Economies must grow; if they do not, then recession sinks jobs, lives, entire industries. Tax receipts fall, welfare systems fail, everything staggers.
But what happens when growth harms societies? And what happens when growth comes at the cost of irreparable harm to the environment?
As Sir David Attenborough put it in 2013, “We have a finite environment – the planet. Anyone who thinks that you can have infinite growth in a finite environment is either a madman or an economist.”
This is the argument of Tim Jackson’s Prosperity Without Growth. The challenge the book sets out is at once simple and staggering. In a finite world, with limited natural resources, how do we deliver prosperity into the future for a human population that keeps on growing?
Jackson, who today is Professor of Sustainable Development at the University of Surrey, and Director of the Centre for the Understanding of Sustainable Prosperity (CUSP), argues that we need to start by scrapping GDP as the core metric of prosperity. As the book puts it: “Rising prosperity isn’t self-evidently the same thing as economic growth.”
The pursuit of growth is also undermining social bonds and overall wellbeing. In 2018, a commission reported on the ‘loneliness epidemic’ that is blighting lives and worsening health across the UK, driven in part by greater mobility of individuals away from family connections. (Mobility of labour, of course, is essential to drive economic growth.)
This year, even The Economist acknowledged that rising growth does not guarantee rising happiness.
If that were not enough, the resources available to supply continued growth are dwindling: “If the whole world consumed resources at only half the rate the US does […] copper, tin, silver, chromium, zinc and a number of other ‘strategic minerals’ would be depleted in less than four decades.” (Prosperity Without Growth)
Rare earth minerals, essential for technologies from circuit boards to missile guidance systems, are projected to be exhausted in less than two decades.
Inasmuch as the public debate considers these nested dilemmas, the vague sentiment is that technology will save us. The jargon term for this is ‘decoupling’ — that is, the ability of the economy to grow without using more resources, by becoming more efficient. But will decoupling happen?
The theoretical core of Jackson’s book is a detailed unpacking of models that suggest it will not, or that if absolute decoupling is possible it will happen so far into the future we will already have wrecked the climate and run out of everything. Rather than rely on this fantasy, Jackson argues, we must challenge the dependence on growth.
But how? The global economic system depends on growth, and in times of recession it is the poorest who suffer first. It is a policy double bind: on the one hand, we must think of the environment, so governments encourage us to buy less, consume less, recycle more and so on. But on the other, they must deliver a growing economy, which means encouraging us to buy more, consume more, keep the economy going. Electorates are, understandably, cynical about the sincerity of this flatly self-contradictory position.
What, then, is the alternative? Jackson is an economist, not a revolutionary firebrand, and his book does not call on us to bring down capitalism. In the second part of Prosperity Without Growth, he instead suggests some quietly radical approaches to bringing the global economy back into the service of human flourishing.
He advocates government intervention to drive much of the change he proposes, including encouraging economies to pivot away from manufacturing, finance and the pursuit of novelty at all costs toward less obviously productive but more human services such as slow food cooperatives, repair and maintenance or leisure services.
He also advocates heavy state support for ecologically-oriented investment. When I contacted him to ask about his book ten years on he spoke positively of the contribution that a “Green New Deal” could make on this front: “It shows a commitment to social and environmental investment that is absolutely essential to achieve a net zero carbon world”, he told me. “Simply put, we just can’t achieve that without this scale of investment, and that form of commitment from Government.”
He also told me he is often criticised for being “too interventionist in relation to the state”, as he puts it. But perhaps (though Jackson does not use the term himself) he might be more fairly described as post-liberal. Prosperity Without Growth is a quiet but excoriating critique of the growing human and ecological costs of liberal economics.
Intriguingly, within Jackson’s proposals lurks another challenge to liberalism, one that has not to date been greatly associated with the left: the critique of radical liberal individualism as a social doctrine. Along with state intervention to tweak the economy and drive ecological investment, Jackson argues that governments should promote “commitment devices”: that is, “social norms and structures that downplay instant gratification and individual desire and promote long-term thinking”.
Examples of ‘commitment devices’ include savings accounts and the institution of marriage. Governments should implement policies that clearly incentivise commitment devices, for doing so will promote social flourishing and resilience even as such institutions offer alternative forms of meaning-making to the pursuit of shopping-as-identity-formation.
Thus, we cannot save the earth without reviving some social values and structures today thought of as small ‘c’ conservative: stable marriage, savings, settled and cohesive communities with lower levels of labour mobility.
I asked Jackson whether some of the more vociferous socially liberal proponents of environmental change had cottoned on to these potentially quite conservative implications of his theories. He told me “This is an interesting question, for sure, and one that I don’t think has really been picked up – even by me!” (Except at UnHerd – see here and here for example.) But, he says, it is incumbent on us to set aside political tribalism in the search for solutions to our current dilemmas.
“I believe there are elements of a Burkean conservatism which are profoundly relevant to a politics of the environment, even as I am convinced that the progressive instincts of the left are essential in our response to social and environmental inequality. I see it as incumbent on those working for change both to understand the underlying motivations of different political positions and also to adopt a pragmatic politic in which solutions are suited to the challenges of today rather than the dogma of yesterday.”
This essay was originally published at UnHerd.
Our local church runs a monthly service aimed at children, with crafts and without Holy Communion. The team that organises the Friends and Family services is lovely and works very hard to come up with activities and an appealing programme for younger worshippers, and the service is popular with families, many of whom I don’t see at regular services. My daughter (3) loves it.
It’s on the first Sunday of every month, so the first Sunday of Advent coincided with the Friends and Family service. My daughter enjoyed decorating the Christmas tree, making little Christmas crafts and other activities. But one thing puzzled and still puzzles me.
This is one of the songs we were invited to sing. ‘Hee haw, hee haw, doesn’t anybody care? There’s a baby in my dinner and it’s just not fair.’ It’s supposed to be a funny song, from the donkey’s point of view, about the Holy Family in the stable and Jesus in the crib. What I don’t understand is why this should be considered more suitable for children than (say) Away In A Manger.
The former depends, for any kind of impact, on a level of familiarity with the Christmas story that allows you to see it’s a funny retelling and to get the joke. That already makes it more suitable for adults. The latter paints the Christmas scene in simple language and follows it with a prayer that connects the picture with the greater story of the faith it celebrates. The tune is easy to learn and join in with. Why choose the former, with its ironic posture and ugly, difficult tune, over the latter, with its plain language and unforced attitude of devotion?
I’ve wondered for some time what it is about our culture that makes us reluctant to allow children to be serious. Children are naturally reverent: if the adults around them treat something as sacred, even very young children will follow suit without much prompting. This should come as no surprise – the whole world is full of mystery and wonder to a 3-year-old. It is we adults who so often fail to see this, not the children.
So why do we feel uncomfortable allowing children to experience seriousness? Sacredness? Reverence? How and why have we convinced ourselves that children will become bored or fractious unless even profoundly serious central pillars of our culture, such as the Christmas story, are rendered funny and frivolous?
The only explanation I can come up with is that it reflects an embarrassment among adults, even those who are still observant Christians, about standing quietly in the presence of the sacred. What we teach our children, consciously or unconsciously, is the most unforgiving measure of what we ourselves hold important. But it seems we shift uncomfortably at the thought of a preschool child experiencing the full force of the Christmas story in all its solemnity. Instead we find ourselves couching it in awkward irony, wholly unnecessary for the children but a salve to our own withered sense of the divine.
If it has become generally uncomfortable for us to see reverence in a young child, during Advent, then the Christian faith really is in trouble.
A society that venerates health, youth and individual autonomy will not much enjoy thinking about birth or death. We are born helpless and need years of care until we reach the happy state of health and autonomy. At the other end of life, the same often applies: the Alzheimer’s Society tells us there are some 850,000 dementia patients in the UK and that this will rise to over a million by 2025 as life expectancy continues to rise.
If we are reluctant to dwell on the reality of human vulnerability at either end of life, we are unwilling to give much thought to its corollary: that (somewhere safely hidden from the more exciting business of being healthy, youthful and autonomous) there must be people caring for those who are unable to do it themselves. Someone is wiping those bottoms.
Traditionally, this job of caring for the very old and the very young has been “women’s work”. To a great extent, it still is: the OECD reports that, worldwide, women do between two and ten times as much caring work as men.
In the UK, this tends in statistics to be framed as “unpaid work”, a sort of poor relation of the economically productive type that happens in workplaces and contributes to GDP.
Carers UK suggests there are around 9 million people caring for others in the UK part or full-time, of whom up to 2.4 million are caring for both adults and their own children. Women carry out the lion’s share of this work: 60% according to the ONS. Full-time students do the least and, unsurprisingly, mothers with babies do the most. Among those in employment, older working women carry the heaviest load, with women in the 50-60 bracket twice as likely as their male counterparts to be carers, whether of a vulnerable adult, a partner, a child or a grandchild.
Second-wave feminism pushed hard against the pressure women experience to take on this work of caring. Within this variant of liberalism, caring work is routinely framed as a burden that imposes an economic “penalty” while harming the economy by keeping skilled women away from the workplace. The OECD report cited above states: “The gender gap in unpaid care work has significant implications for women’s ability to actively take part in the labour market and the type/quality of employment opportunities available to them.”
The implication is that, once freed of this obligation, women can then pursue more fulfilling activities in the workplace.
So what does this liberation look like in practice? According to a 2017 report by the Social Market Foundation, women in managerial and professional occupations are the least likely to provide care, as are people with degree qualifications. The number of those in routine occupations who also provide more than 20 hours a week of care in their own homes is far higher than the number in intermediate or professional occupations.
In other words, higher-earning women are to a far greater extent able to outsource the wiping of bottoms to less well-off people, who are themselves typically women: 90% of nurses and care workers are female.
These women are then too busy to wipe the bottoms of their own old and young, who are sent into institutional care. Such institutions are typically staffed by women, often on zero hours contracts, paid minimum wage to care for others all day before going home to do so for their own babies and elderly. The liberation of women from caring is in effect a kind of Ponzi scheme.
This is a problem for our liberal society, for two interlocking reasons. Firstly, the replacement of informal family-based care with a paid, institutional variety renders caring impersonal, in a way that invites cruelty. Indeed, cases of care home abuse are well documented – see here, here or here – and the number is rising: the CQC received more than 67,500 complaints in 2018, an increase of 82 per cent over the already too high 2014 figure of 37,060.
It is difficult to see how this could be otherwise. Caring for those who are physically or mentally incapacitated is emotionally testing even when we love those we care for. An exhausted worker on a zero-hours contract, paid the minimum wage to perform more home visits than she can manage in the allotted day, is unlikely to have a great store of patience to begin with, let alone when faced with a refractory “client”. The entire system militates against kindness.
Secondly, and relatedly, it turns out that the informal, traditionally female networks in which caring for the young and old once took place were actually quite important. Those networks also ran church groups, village fetes, children’s play mornings – all the voluntary institutions that form the foundation of civil society.
When caring is treated as “unpaid work” and we are encouraged to outsource it in favour of employment, no one of adult working age has time for voluntary civil society activities any more. If the number of people caring informally for relatives is waning, replaced by institutional care, so is voluntarism: between 2005 and 2015 alone there was a 15% drop in the number of hours donated (ONS).
The result is loneliness. Almost 2.5m people aged between 45 and 64 now live alone in the UK, almost a million more than two decades ago. Around 2.2 million people over 75 live alone, some 430,000 more than in 1996. In 2017, the Cox Commission on loneliness described it as “a giant evil of our time”, stating that a profound weakening of social connections across society has triggered an “epidemic” of loneliness that is having a direct impact on our health.
Several generations into our great experiment in reframing caring as a burden, we are beginning to count the cost of replacing mutual societal obligations with individual self-fulfilment: an epidemic of loneliness, abuse of the elderly and disabled in care homes, substandard childcare. A society liberated from caring obligations is, with hindsight, a society liberated from much that was critically under-valued.
What is the alternative? Some would prefer a more communitarian approach to caring for the old and the young. Giles Fraser recently wrote on this site that caring for the elderly should be the responsibility of their offspring:
“Children have a responsibility to look after their parents. Even better, care should be embedded within the context of the wider family and community. […] Ideally, then, people should live close to their parents and also have some time availability to care for them. But instead, many have cast off their care to the state or to carers who may have themselves left their own families in another country to come and care for those that we won’t.”
These are strong words and there is much to agree with, but the barest glance at the statistics shows that in practice what that means is “women have a responsibility to look after their parents”.
If we are to count the costs of liberating society from mutual caring obligations, we must also count the benefits, as well as who enjoyed them. Society once encouraged men to seek worldly success, underpinned by the imposition of an often-suffocating domestic servitude on women.
Liberalism blew this out of the water by declaring that in fact both sexes were entitled to seek some form of worldly activity and fulfilment. It is not enough to point to negative side effects of this change and say: “Someone needs to be resuming these mutual caring obligations or society will disintegrate.”
To women well-accustomed to the widespread tacit assumption that it is they who will pick up those underpants, wash up that saucepan, pack that schoolbag and so on, this sounds a lot like a stalking-horse for reversal of societal changes that, on balance, most of us greatly appreciate. In truth no one, whether liberal or post-liberal, wants to confront the enormous elephant that liberal feminism left in society’s sitting room: the question of who cares. Who, now that we are all self-actualising, is going to wipe those bottoms? There are no easy answers.
The Times reports that a chain of nurseries has invested in ‘frustration toys’ for children prone to biting. The Tops Day Nurseries operations director said: ‘The children learn that if they get the sudden desire to bite they can select a teething toy or similar to bite on to release the urge.’
With more than three-quarters of UK mothers of dependent children in work, non-maternal childcare is overwhelmingly the norm for young children in this country, and it makes its impact, at a mass scale, on their development.
Decades of research show that maternal attachment – meaning the strength and security of the bond between mother and young child – is of crucial importance in laying the foundations for psychological wellbeing in later life. We have known since 1997 that children spending more than ten hours a week in poor quality childcare are at increased risk of unhappiness and insecurity.
That children left in ‘industrial’ childcare settings (however committed individual staff may be) are likely to be less secure than those cared for by a loving mother at home will not come as a shock to readers of The Conservative Woman. What is disturbing is the rising prevalence of nursery behaviour indicating infants’ frustration and unhappiness. It suggests an epidemic of infant misery across the country, as barely-verbal preschoolers shuttle between screen time at home, while their overworked parents scrabble to complete domestic chores around full-time jobs, and sometimes chaotic nursery settings that function less as caring environments for development than as holding facilities for children whose parents cannot afford to look after them themselves.
Stones would weep for these poor babies. For their mothers as well: I know too many women who spent weeks in a state of bereavement, sobbing in the office loos on returning to work after maternity leave. Eventually, those mothers became accustomed to suppressing the visceral desire to be physically close to their baby (for a 12-month-old is still a baby). Presumably their babies adjust – at whatever cost – as well. But for the most part, these sobbing mothers are returning not to fulfilling careers but to mundane jobs. They have little choice: the alternative is not staying at home with their baby but having their home repossessed.
The conservative stance on these matters has for some time been to see the problem in terms of women’s needs (not babies’ needs), and women’s assumed desire to prioritise fulfilment via the workplace.
Now too much screen time, and the pressures on working parents that leave them little time to talk to their children, are blamed for the rise in children’s communication problems.
Seeing the situation through this lens alone ignores the way public policy, from left and right, has been falling over itself for years to put the entire population – male, female, young and old – under this pressure by driving them out of the home and its purported ‘economic inactivity’ and into GDP-boosting employment instead.
To glance past this and place the blame solely on mothers, as individuals, for the misery of their babies in industrial childcare is at best wilful blindness and at worst a kind of sadism. Where are the voices in our political discourse who are unafraid to stand up for mothers and mothering and say that some things matter more than GDP? That top of the list is family life and especially the needs of young children?
Yesterday I attended the SDP’s party conference. The rump left behind when most of the party merged with the Liberals to become the Liberal Democrats has enjoyed something of a revival in the last year under William Clouston, who has led the charge to reinvent its social-democratic platform along distinctly post-liberal lines. The party is a minnow compared to the big hitters of conference season, but the conference was important. Here’s why.
With very few exceptions, the party’s leadership do not live in London. Its strongest support base is in Yorkshire, notably around Leeds where the conference was held. Clouston himself lives in a village in the North-East. In his closing remarks, he apologised to delegates for the fact that the next meeting will be in London. Where most of the big parties now talk about the need to take note of the perspective of people outside the capital, within the SDP the reverse is the case.
The party leans centre-left on economic issues and centre-right on cultural ones. Broadly speaking, it stands for family, community, nation and a robust welfare state, and bears some similarities to ‘Blue Labour’, Maurice Glasman’s project to bring issues such as family and patriotism back into Labour politics. But whereas Glasman’s project was to a significant degree driven by metropolitan intellectuals, the SDP is not driven by London voices or perspectives. This is also perhaps why the SDP has to date had little cut-through in media terms, despite numerous polls suggesting widespread support for a combination of redistributive economic policy and small-c socially conservative values.
Movements that articulate concerns or perspectives widespread in the UK population outside major cities have in recent years often been traduced in the media as ‘populist’ or even ‘far right’. But while several speakers at the conference inveighed against identity politics and ‘political correctness’, the SDP is not reactionary. The first motion to carry was one to amend the party policy banning non-stun slaughter to one regulating it, both in the interests of religious tolerance and to avoid far-right dogwhistles. Clouston himself referred in his speech to a ‘decent populism’ that seeks to return the common concerns of those outside major cities and the liberal consensus to mainstream political discourse.
The watchwords were ‘community’ and ‘solidarity’. A key theme emerging from the speakers was: what are the proper limits to individual freedom? Where is it more important to consider the needs of a group? Who pays the price for ‘double liberalism’, and how can we mitigate those costs?
For some considerable time, politics has been something done by the Anywheres (in David Goodhart’s terms) and done to the Somewheres. Efforts to rebalance this have tended to be treated as monstrous aberrations that must be contained, whether with disparaging media coverage or more government funding for some client-state scheme or other.
But looking around on Saturday, my sense is this may change. The Somewheres are beginning to organise.
On Being An Arsehole: A Defence is this weekend’s long read pick, by Jonny Thakkar in The Point. It is a funny and thoughtful discussion of the tension between the author’s wish to fit in socially, and the desire he also feels as a philosopher to ask difficult questions that may push debates – and the social relations within which they take place – into uncomfortable places.
Most people, Thakkar argues, agree with the vast majority of what others say to them, largely in the interests of harmony. But this is unappealing to philosophers, who take active pleasure in argument of a sharpness and persistence most people would find stressful if not downright obnoxious. This, in turn, can have social repercussions for those who approach discussion in this spirit:
For philosophy trains you to presume that genuine listening, and so genuine conversation, involves helping people to clarify their thoughts, and while this might be true in some contexts, it can also have the effect of turning a heart-to-heart into an Oxbridge tutorial. “I know you’re upset, but you’ve said three different things that are in tension with one another” isn’t always the most helpful way to respond to a loved one’s distress, as I have repeatedly discovered.– JONNY THAKKAR, THE POINT
The challenge for those who would debate is to assess when it is appropriate to ask difficult questions – and when, especially in the modern world of ‘cancel culture’, the frank expression of views is likely to take significant courage:
It seems natural to conclude that the social role of philosophers is to help people think things through by confronting them with counterarguments to their current views. But since there’s no way to do that in a non-philosophical context without coming off as an arsehole, there’s no way for a philosopher to be a good citizen without having the courage to look like a bad one.– JONNY THAKKAR, THE POINT
In a week in which the Prime Minister was accused both of being a leader to right-wing extremists and of dismissing the murder of Jo Cox as ‘humbug’, a reflection on debate, trolling and when to keep one’s own counsel feels timely, to say the least.