Friday, 22 December 2017

Faring badly: EU ruling another blow for Uber

Geoffrey Dudley, David Banister and Tim Schwanen

A favourite saying of Uber chief executive and co-founder Travis Kalanick was that ‘it is easier to ask for forgiveness than permission,’ but recent events suggest that this strategy can have its limits.

After years of successfully disrupting established taxi businesses all over the world, in 2017 the expansionary ride-hailing app Uber is itself experiencing the traumas of disruption. The latest blow for Uber occurred this month, when the European Court of Justice ruled that Uber is a taxi operator, and not, as the company claimed, a technology platform that acts as an intermediary between driver and customer.

This means that, throughout the EU, Uber will be subject to regulatory control as a transport operator, and not be entitled to exemptions allowed for e-commerce companies. The ruling is also a blow to Uber’s Silicon Valley image as an innovative technology company, an image that apparently placed it in a different category to established taxi operations.

Innovation at the expense of customer safety?

Another notable setback occurred in a former stronghold, London, where in September the regulator Transport for London (TfL) revoked Uber’s licence to operate, claiming that the company was not a fit and proper service provider, as it had failed to report serious criminal offences appropriately, and to conduct adequate background checks on its drivers.

London Mayor Sadiq Khan commented that providing an innovative service must not be at the expense of customer safety.

Uber has been able to continue its operations pending an appeal, and launched a petition enlisting public support that has been signed by 800,000 people, but has notably adopted a more conciliatory tone, with new chief executive Dara Khosrowshahi stating after a meeting with TfL commissioner Mike Brown that he is “determined to make things right in this great city.”

Founded in San Francisco in 2009, Uber epitomises the Silicon Valley image of a rapid technological success story. It now operates in more than seventy countries, with around $16bn invested in global operations.

Its market value is about $70bn, making it the world’s most valuable privately held technology company. The successful expansion of Uber has been based on a deceptively simple use of modern technology combined with the principles of the ‘sharing economy,’ whereby drivers use their own vehicles and are put in touch with customers via the Uber smartphone app.

Prior to the ECJ ruling, Uber had sought to differentiate itself from established taxi companies by describing itself as a technology platform, rather than a taxi business, while it classifies drivers not as employees, but as ‘registered partners.’

However, this classification has become highly sensitive politically, with many drivers in the UK seeking social benefits such as sickness and holiday pay, and claiming that Uber is exploiting their self-employed status.

In 2016, two Uber drivers won a notable test case at the Central London Employment Tribunal that they should be treated as employees and given associated benefits, and were also successful with a subsequent appeal. The case has wide implications for workers in the so-called ‘gig economy.’ At the same time, some Uber drivers have defended being self-employed because of the flexibility and tax benefits it allows.

Prior to its London experience, Uber’s expansion had been met with fierce resistance in a wide range of countries. In Europe the company has fought battles with governments, regulators, and established taxi operators in the UK, France, Germany, Belgium and Italy, while Uber has been excluded altogether from Hungary and Denmark.

Uber has also encountered criticism for its failure to actually make a profit, caused partly by heavy investment in its services in order to fuel expansion, and also through a highly ambitious autonomous vehicle development programme. For Uber, however, the overriding objective has been expansion and an ever-increasing number of users.

Ironically, despite the loss of its licence, this expansion is illustrated particularly well by the example of London, where Uber commenced operations in 2012. Unlike the situation in many other European cities, its standard Uber X service was officially registered by the regulator, Transport for London (TfL), from the outset. As Uber expanded, it provoked fierce opposition from the drivers of the established and iconic black cabs, including major protests in 2014 and 2015.

Significantly, the black cab drivers principally blamed TfL for failing to regulate Uber more strictly. As TfL became more sensitive to the case being put forward by the black cab drivers, it sought to impose more restrictive conditions on Uber, but with only limited success.

Firstly, in 2015 TfL brought a high court case against Uber and similar operators claiming that the app constituted an illegal taximeter. Uber vehicles are classed as private hire, and the law governing them in London stipulates that only plying for hire vehicles, such as the black cabs, are legally entitled to carry a taximeter, which gives a running price for the ride based on time and distance. The high court ruled that the app was legal, and could not be classed as a taximeter. In this case, the London taxi regulations rested on nineteenth century foundations, and TfL discovered that it was difficult to fit Uber technology to these rules.

After this failure, TfL made a further attempt to impose greater restrictions on Uber, but in response the company launched a public campaign that included a petition signed by 200,000 people, and TfL was once again compelled to back down. In 2017, TfL did achieve some success when the high court this time supported its case to introduce written English language tests for all drivers. But prior to revoking its licence, nothing TfL could do was able to restrict the expansion of Uber, and by 2017 it had around 40,000 drivers in London, compared with around 22,500 black cabs.

A series of scandals

Despite its success as a disruptive innovator, and in addition to its London experience, Uber was ill-prepared for the disruption it has suffered in 2017 resulting from a series of scandals.

Firstly, the company was subject to sexual harassment claims by a former employee, and this was swiftly followed by a recorded heated altercation between Travis Kalanick and an Uber driver with regard to employment conditions. Uber’s image was further damaged by revelations concerning its secret Greyball programme, whereby the company would identify users who might be rivals or enforcement officials, and show them a fake version of its app whenever they tried to order a car, thereby frustrating any official action. Uber was also subject to legal action in the case of its autonomous vehicles programme, when it was accused of trade secret theft by its competitor Alphabet, the parent of Google.

A number of high profile resignations culminated in the departure of Kalanick himself, amidst bouts of in-fighting on the part of the company’s board, and a period of several months elapsed before the appointment of a new chief executive. Nevertheless, Kalanick remains a member of the board, with much debate concerning his future role with the company. Khosrowshahi will need to steady the ship, but in doing so can Uber become a more publicly acceptable company without losing its innovative edge?

In this context, how events develop in London is likely to be of particular significance for Uber. The company appears anxious to achieve a compromise with TfL, even before the appeal case comes to court. Uber cannot afford to see its position undermined in a city that is particularly important for its expansionary strategy, while it is also diversifying into other fields such as a food delivery arm UberEats. The situation is also likely to be watched carefully in other cities, with regulators possibly taking a lead from TfL in seeking new concessions from Uber.

Geoff Dudley is Visiting Research Associate in the Transport Studies Unit at the University of Oxford.

David Banister is Professor Emeritus of Transport Studies at the University of Oxford.

Tim Schwanen is Director of the Transport Studies Unit at the University of Oxford.

This article is adapted from a longer piece in The Political Quarterly journal.

Thursday, 21 December 2017

"Democracy is really struggling to cope" – Interview with David Runciman

Anya Pearson

I caught up with David Runciman, Professor of Politics at Cambridge University, after he delivered The Political Quarterly's annual lecture 'Nobody knows anything: Why is democracy so surprising?' to discuss the volatility of electoral politics, fake news and the state of democracy.

In your talk, you mention the “five P’s” who have been wrong about recent elections: politicians, pundits, pollsters, political scientists and prediction markets. You argue that pundits, pollsters, political scientists have been getting their predictions wrong for a long time, and don’t have strong incentives to change this necessarily. But it seems to me that politicians and prediction markets do have strong incentives to get predictions right. What do you think has derailed them in recent years?

My argument was that for [pundits, pollsters, and political scientists], being interesting is often better than the business of being right, whereas for politicians and prediction markets, being right is their livelihood. And it’s one of the big puzzles, because they should have sufficient diversity that they pick up on the signals that other people miss. My feeling is that those other two groups are also suffering from groupthink. They’re listening to each other rather than listening to the signals that are coming from the wider public.

But then the question is: why is that? Because the whole point of them is that they’re not closed off, they’re meant to be open towards lots of different opinions.

I suggested a couple of reasons. One of which is the feedback loop. The people in politics listen to what the City thinks, and the people in the City are listening to see what the betting markets think.

And then the other thing that I genuinely think is a big problem for contemporary politics is the divide between the people in all of these groups, who tend to be professionals in one form or another, and so have gone to university, and the people who left education earlier.

I don’t think it’s surprising that people who comment on politics in newspapers get it wrong, because you might expect that, and they’ve done that for two hundred years. But with these other groups there seems to be a bigger disconnect in the past two years.

I think your idea about groupthink and the feedback loops is a really useful one. In your talk, you also mention how dramatic UK elections in recent years have thrown us all “off kilter”. And your basic claim there is that the digital revolution has increased the supply of information, and this makes politics more unpredictable. Could you comment on some of the evidence you find is the strongest in linking increased information supply to increased voter volatility?

I think there is clearly evidence for increased volatility. One of the things that political parties have got wrong in recent years is that election campaigns are not meant to matter. There is meant to be a sort of settled set of opinions out there, and then election campaigns churn it up a bit, but when it comes to the vote, the settled opinion reasserts itself. And people revert back to the things that they knew before the campaign started.

And now, not just in UK elections but around the world you’re seeing that the four to six weeks of a campaign actually changes people’s minds. And that does seem to be evidence that short-term cascades of information actually alter their [voting] behaviour. That does seem to be relatively new, over the last three to four years.

We can see these things happening, [although] it’s much harder to provide the evidence that shows this volume of information over here creates this volatility over there. But these two things do seem to be connected.

People inhabit different information universes which don’t overlap. And then, within those universes, there does seem to be a lot more movement as people change their political behaviour.

But it’s so early. People who have been studying politics and elections for ten, twenty, thirty years are having to adjust to a reality which is two, three, four years old. It can’t be a coincidence that this is the period of time in which sharing news on Facebook has become, for many, the dominant source of information. It’s almost certainly part of the explanation.

That’s interesting, and made me think: what about the issue of digital literacy? 12.6 million adults in the UK lack basic skills in this area and five or so million adults have never used the internet. I know these are relatively small numbers compared to the country as a whole but how do you see this fitting in to your theory about voting behaviour?

That’s a good question. My understanding is that the vast majority of those people are older people – though not all.

Again, one of the things that has been overturned recently is this idea that ‘old people vote and young people, particularly students, don’t vote’. But in the 2017 election, the fact that young people and students turned out to vote took everyone by surprise. Some of that churn and volatility seems to be disproportionately affecting certain parts of the population. And, of course, there are still people voting in traditional ways according to traditional patterns.

I heard a presentation by YouGov where they said one of the things that they missed was that there was this view that you can segment populations by age group. There’s an age group that are 65+ and you treat that as a homogenous group. They’ve decided that that was a mistake. There are now so many people who are 75–80+ and those people are really hard to get to. They’re often living in care homes, they’re not online, but they vote. And they tend to vote Tory.

It’s a complicated picture. There’s churn among young people, there’s stability among older people, and there’s a group of older people who maybe do behave in traditional voting ways but aren’t being picked up.

We heard a bit about your views on the functioning of democracy and how surprise was potentially an important part of it. How should we measure whether or not democracy is functioning better than it was, say, twenty years ago? I mean, what kind of criteria could we use – if that’s a possible thing to do?

That’s a big question. One of the really useful functions of these surprising elections is just to make people aware of the country that they live in. A lot of people said after Brexit, after Trump, especially after the two [UK] general elections, that they found something out about their democracy that wasn’t visible before. And that’s got to be, I think, a positive thing.

But there is that surprising-ness that reveals itself within democratic elections. The thing that I’m warier of is you get randomness and surprise in the functioning of the institutions themselves. They actually become destabilising for the framing of democratic politics.

It’s kind of amazing the extent to which Americans accept that Trump is their president. Despite the resistance. It’s still the case that his legitimacy, though it’s questioned, is not really in doubt. So surprise within that context is probably a good thing. But then when that context itself starts to get a bit unpredictable so people aren’t really sure what the shared values and norms are, they aren’t really sure about which institutions are legitimate…

I think one of the wider consequences of Trump and Brexit, not so much the [2015 and 2017] general elections is that it would make the functioning of the institutions themselves unpredictable. Maybe not elections, elections are probably the one thing that we’d probably still cling onto, but some of the other basic functioning like the value of parliamentary government, role of referendums versus representative democracy – some of the basic frameworks of politics – if you can’t be sure from one year to the next which foundational rules will apply, I think that’s more hazardous.

The two are related. You get more of the second because of the first. Because if you’re going to elect Trump, if you’re going to vote for Brexit, you don’t just get information about the country you live in, you get consequences, which are, particularly in Trump’s case, potentially the destabilising of the institutions.

So there are surprises within democracy and there are surprises about democracy. And I think on the balance sheet, I’m inclined to the pessimist side. I think democracy is really struggling to cope at the institutional level. You do need predictability in the way institutions function.

In terms of Trump’s legitimacy – and I’m not questioning the fact that most Americans are not pushing back too much on the fact that he should be in the White House at all, but the issue of ‘fake news’ is in some ways unsettling his legitimacy. I felt like in your talk you implied that fake news is a misleading term: ‘My news is different to your news’. Can fake news be so readily dismissed?

There definitely is quite a lot of fake news out there and there’s unquestionably some manipulation going on. The main thing I was trying to say was that I’m uncomfortable with the idea that the answer to the question ‘Why is politics so surprising?’ is: ‘Fake news’.

It’s just too simple.

If you take that as the answer, you’re missing the bigger picture, yes. And there’s something uncomfortable about saying that it’s because “Bad people deceived gullible people” because it takes away the extent to which voters are actually finding out things for themselves. There’s a danger that we end up calling one person’s preference for a news source over another news source ‘fake’ or ‘illegitimate’.

It’s not clear to me that the people who are calling some things ‘fake news’ are themselves on the side of the angels when it comes to objective reporting. It’s a variant of this thing where people say: ‘The people who voted for Brexit were lied to, the £350m [for the NHS on the bus]… they didn’t understand the issue’. I’m not sure the people who voted to remain understood the issue.

I don’t think anyone did. And it smacks of class bias as well; people being very judgemental about Brexiters.

Yes, and particularly, I have to say, highly educated people. [Remain voters] were often just as tribal as the other side.

Absolutely. Fake news or not, there is of course the problem of the echo chamber, of filter bubbles, and people not interacting and engaging with dialogue. So, my last question is how do you view the role of mainstream media in the future. Does it still have a place where people with differing views can interact?

I’m not sure whether people ever really interacted through mainstream media. There’s a nostalgia in this country for a time when everyone watched the BBC and ITN. I’m not sure that there was a huge amount of interaction going on when the only way you could get your news at nine o’clock was the nine o’clock news, but there was clearly a shared water cooler thing, there was a shared grounding for what people were then thinking about.

There was a synchronicity in time at least, and then potentially conversations happening in spaces that weren’t filtered.

That has gone. I don’t think there’s going to be a time where the BBC will manage to capture all the different things that people think. The same with newspapers.

And there is a dystopian version of this which is that people are self-curating their news sources but it’s all from one giant corporate entity. I don’t think Facebook is a malevolent organisation, but just that kind of monopoly power. You don’t predetermine what happens on these networks but you are the provider of the network.

Facebook would say, well unlike the old days of the BBC and ITN, on our network people really are interacting, they really are exchanging ideas, they really are communicating. Which is sort of true. But there is something chilling about the thought that all of this is happening on a framework that the people who are exchangers of information don’t control and don’t understand and don’t see.

I mean, the thing about old-fashioned media is that there was at least some transparency in what was going on.

And they change the algorithms and the way that they operate their platforms without revealing what those changes are.

There’s that famous thing you can do where you go on the Guardian website and you can use an ad revealer to show you at any given moment how many advertisers are tracking you. Machines are watching you. And so much of it is hidden.

And, of course, we know that people act differently when they know they are being observed, as well.

It’s so new that we don’t really deeply understand what it means and it’s changing all the time. I mean people’s behaviour is unquestionably changing all the time as they react autonomously and independently to their understanding of what’s happening to them. I’m a bit nostalgic for the times when people at least understood what reading the news meant. It meant you read it and they supplied it. Now, reading the news means they’re watching you, that’s different.

I think there’s a level of trust which is lost as well, of people who are the gatekeepers of knowledge.

Yes, and to repeat I don’t think it’s because there are bad people who are exploiting the system to engineer election results. In a way, it’s more serious than that. Good people as well as bad people – I think that most people who work at Facebook are decent people, they’re not proto-fascists. They’re probably well-meaning liberals. And the fact that those people engineer the system is in some ways scarier.

Anya Pearson is Social Media and Events Editor at The Political Quarterly.

A recording of David Runciman's lecture is available below. 

Wednesday, 20 December 2017

Class still matters, but it is more complicated than that

Rachel Reeves

We asked a selection of authors to respond to ‘The New Politics of Class’ by Geoffrey Evans and James Tilley

Writing in mid-2017, it is very hard for anyone to pretend that class is not a major factor in British politics. After a major financial crisis and the deepest recession since the 1930s, with household incomes stagnating and inequality growing between class, generation and region, we seem a world away from a time when anyone felt able to say ‘we are all middle class now’.

In many ways, this reinforces the arguments made by Geoffrey Evans and James Tilley in The New Politics of Class, which is a welcome, thorough and provocative examination of the enduring impact of class on British politics.

Labour's working class problem

Despite a remarkable increase in voter share at the 2017 general election, Labour's growing problems with working class voters remain. Labour made its biggest gains in seats with heavy concentrations of middle class professionals and the wealthiest voters, while losing ground to the Conservatives in the poorest seats in England and Wales.

However, we should be careful not to overstate or oversimplify the importance of class in political preferences. Other factors matter too: a working class twenty-five year old today is likely to have much more in common with a middle class twenty-five year old than they are with a working class person forty years older.

The political fault lines in contemporary Britain are complex, as the academic David Runciman recently noted in the London Review of Books. New and old gulfs between sections of our society are widening, and two-party politics now has to accommodate many divisions of generation, educational background, geographic location and much more.

In analysing the problems facing the Labour party, we therefore need to understand the multifaceted aspects of identity: we struggle with older, generally white voters (especially men), outside cities, with relatively little formal education.

The authors are correct to identify the overlap between cultural and economic concerns, when it comes to globalisation. Labour became overly relaxed about the dislocating effects of globalised markets on ordinary people and communities, an attitude that was most evident in Tony Blair’s 2005 speech to the party conference, in which he spoke of a changing world “indifferent to tradition. Unforgiving of frailty… replete with opportunities, but they only go to those swift to adapt, slow to complain, open, willing and able to change.”

This vision certainly did not conform to the feelings of working class communities, but nor do I think it conforms to how most people in Britain today experience or feel about the world. We all want a sense of belonging, community and stability in our lives, and we all hold on to some parts of tradition. In this election, younger, middle class voters in cities were often reacting to the vanishing possibility that they will be able to enjoy exactly those things and instead are looking ahead to less economic security than their parents enjoyed.

Labour's role

The fundamental task of the Labour party remains unchanged: to offer an appeal able to unite working people across social groups to reform capitalism in their interest. There is still space for consensus. Many of the major challenges we face cut across classes, such as the housing crisis and the need to support our embattled public services.

So a party that has put opposition to austerity front and centre of its electoral pitch should be deeply concerned that it is those who have experienced some of the worst austerity (certainly more than the middle classes) who are least moved by its appeal.

A left-wing government that ignored the poorest and those less able to adapt to globalisation and automation, offering them only a subsidy in the form of a universal basic income, sounds less to me like a bright new future, and more like a dystopian nightmare.

Labour's future must depend on persuading working class people that we can represent them and their interests, both for electoral purposes and because that is the point of Labour.

Rachel Reeves is the Labour Member of Parliament for Leeds West.

This article is adapted from a piece in The Political Quarterly journal. You can read the full article here.

Monday, 18 December 2017

Occupation class is on the decline, but cultural class is on the rise

Eric Kaufmann

We asked a selection of authors to respond to ‘The New Politics of Class’ by Geoffrey Evans and James Tilley

The New Politics of Class by Geoffrey Evans and James Tilley offers a comprehensive new sociology and politics of class. For its wealth of useful empirical research alone, this book should be required reading for students of British politics.

An interesting dilemma presented by the authors is the tension between concluding class is in decline for structural reasons even as its cultural stability suggests it is as important as ever.

The country is clearly shifting from manual to non-manual occupations. The new knowledge economy has created more opportunities for professional and managerial occupations. As a result, the share of the labour force in working class occupations has fallen from 60 per cent in 1950 to about a quarter today.

Yet there is a paradox: more of the country identifies as working class than middle class. When asked to choose in a forced-choice format, the ratio of working to middle class is around 60:40. This has changed very little since 1960, suggesting that class identity has a strong inherited component that is only partly rooted in material realities such as educational attainment or profession.

To illustrate, opposition to immigration and the European Union remains persistent amongst working class people. In both cases, there is an independent conservative effect of being working class even when education and other predictors are held constant. Education matters about 3 to 5 times more than class for immigration opinion, but is only twice as important as class for views on the EU. This portends a widening class aspect to the emerging globalist-nationalist ‘culture war’.

In the Citizenship Surveys and Understanding Society, working class people who strongly identify with their class also identify more strongly with their nation. These differing class versions of national identity are nicely captured in qualitative work which finds that many in the white working class see Englishness as an insurgent identity which is being denied by the liberal middle class, while sections of the middle class view Englishness as unrespectable.

Evans and Tilley could have mentioned that political discourse has also become more sensitive to the term ‘white working class’ even as discussions of class conflict have faded.

The white working class/English nationalism nexus has been at the centre of a new politics of anti-liberal elite resentment, from the rise of the British National Party (BNP) in Barking and Dagenham in 2006 to the UK Independence Party (UKIP) from 2009 and Brexit.

Consider the blowback for Labour from Gordon Brown's comments about Rochdale pensioner Gillian Duffy in 2010; and from Emily Thornberry's 2014 tweet depicting a George Cross-covered working class home in Rochester. These were about cultural more than economic tension.

In view of the fact that 60 per cent of the country identifies as working class while working class identity predicts hostility to immigration, it would be wrong to write class's obituary. Indeed, quite the reverse is true.

Perhaps what we face is a culture war between ethno-nationalists and globalists in which class identity—increasingly emptied of material connotations—is an upstream engine of cultural politics.

Alternatively, as Corbyn's performance shows, the white working class may become a swing vote pulled culturally toward the Tories and economically to Labour. This configuration means that if the Tories move in a liberal direction, opting for a soft Brexit where immigration remains at its current level, UKIP could recover its 13 per cent vote share.

But I digress. The point of The New Politics of Class is not to engage in punditry and prediction, and that is to be commended. Taken at once, the book is a remarkable achievement and certain to be talked about by students of British society and politics for years to come.

Eric Kaufmann is Professor of Politics at Birkbeck, University of London and author of Whiteshift (Penguin, 2018).

This article is adapted from a piece in The Political Quarterly journal. You can read the full article here.

Tuesday, 28 November 2017

Exploring the relationship between class and voting

Harold D. Clarke

We asked a selection of authors to respond to ‘The New Politics of Class’ by Geoffrey Evans and James Tilley

The New Politics of Class by Geoffrey Evans and James Tilley is a noteworthy contribution to the huge literature on the impact of social class on voting and elections in Great Britain, with relevance for the Brexit decision and other important developments in contemporary British electoral politics.

In their book, Evans and Tilley boldly revive the class politics debate and argue that their findings have relevance for understanding public support for parties such as the SNP and UKIP and the historic Brexit decision voters made in June 2016.

The basic observation of the book is that the strength of the relationship between social class and voting in Britain has declined greatly since the 1960s when the first British Election Study (BES) was conducted.

The erosion of historic ties between the working class and the Labour party has been accompanied by a significant decrease in turnout among working class voters. Evans and Tilley contend that the decrease in class voting occurred rapidly in the 1990s, coincident with Labour's decision to rebrand itself as ‘New Labour’ and enhance its ideological/policy appeal to the growing middle class of post-industrial Britain.

It bears emphasis that for Evans and Tilley the decline in class voting and the decrease in working class turnout largely are consequences of changes in the ‘supply-side’ of party politics.

As the authors put it: “Class divisions in social attitudes and political preferences remain robust. It is the political parties that have chosen not to represent these class differences. This has led to a decline in class voting, but also an accompanying accentuation of class divisions in non-participation.” The merits of the supply-side argument are discussed in more detail in my article for The Political Quarterly.

New Labour's failure to offer working class voters attractive policy choices is the key example here, but the 2017 election can also be viewed as an interesting test case for Evans and Tilley's supply-side argument for the decline in class voting. If they are right, then one might expect that the old-line socialist policy appeals articulated by Labour leader Jeremy Corbyn would have generated a surge in working class support for Labour.

In the event, the party did enjoy a marked increase in support (from 30.4 per cent to 40.0 per cent of the UK popular vote), but available survey evidence suggests that this was largely a result of a huge increase in turnout by young people of all classes who were very enthusiastic about Corbyn and his message.

In contrast, survey data displayed in Figure 1 show that increases in Labour voting between 2015 and 2017 occurred in all classes as measured by the standard Market Research Society social grade classification, with the largest increases occurring in the lower middle class (C1) group and the smallest in the unskilled working class and unemployed (D/E) group. There clearly was a class gradient to Labour voting in both 2015 and 2017, but it was actually larger in the former election than in the latter.

Evans and Tilley might respond by pointing to their hypothesis that Corbyn's appeal to working class voters was likely limited by his social liberalism, which does not resonate well with many working class people.

In this way and many others besides, the book points in multiple interesting directions for future research. Equally, it is an engrossing read for anyone interested in political choice in Britain.

Harold D. Clarke is Ashbel Smith Professor, School of Economic, Political and Policy Sciences, University of Texas at Dallas.

This article is adapted from a piece in The Political Quarterly journal. You can read the full article here.

Friday, 24 November 2017

With long-term care, stop giving expert answers to the wrong questions

Deborah Mabbett

In the UK’s uncertain economic landscape, one bright spot continues to shine. The country is wonderfully rich when it comes to property wealth.

The ONS put the total net property wealth of private households at nearly £4 trillion in 2014. Of those households in the game, median net property wealth was over £150,000, rising to £260,000 in London. Much of this is a windfall gain, achieved by buying at a lucky time, and the wealth is very unequally distributed. The IFS has shown that inheritance is also highly unequal: those with the highest incomes can expect to inherit the most wealth.

One might expect that the median voter would look at this and advocate that more of the windfall should find its way into the public exchequer. But no: those with large sums to bequeath – and those who expect to inherit – are politically influential and astute in defending their hoard. They frame any raid on wealth as an attack on the striving middle classes, and they largely succeed in convincing many of those at the bottom of the property pile that their interests lie with those higher up.

The failures of the Dilnot report

Nowhere is this more evident than in the debate over how long-term care costs should be met. The Dilnot report on ‘Fairer care funding’ advanced a distinctive conception of ‘fairness’. Having to sell your home to pay for care, the report argued, was widely regarded by the public as unfair. The report gave no hint that there are vast inequalities in housing wealth; instead it claimed that ‘everyone’ faces a significant risk from care costs. This is patently untrue. Only those who have assets above the means-test threshold face a financial risk, and the scale of the risk increases with wealth.

The policy instrument that Dilnot advocated to protect housing wealth was a lifetime cap on the amount that a person should have to spend on long-term care. The cap would protect housing wealth, because people could plan to ensure that they could pay the capped amount without raising a charge on their homes. They could ensure that they had enough liquid savings, or they could buy an insurance policy.

My informal surveys suggest that few people understand the cap, but that did not save Theresa May’s policy advisers when they left it out of the 2017 Conservative manifesto. They proposed to raise the means-test threshold to £100,000 from its current level of £23,250 in England, but, until assets fell to this level, people would pay for their own care. Housing wealth, released if necessary by Deferred Payment Agreements, would count in what could be afforded. There was an immediate outcry about the failure to include a cap on lifetime costs. This might be £75,000–£100,000 (Dilnot recommended a lower level, but that was some time ago now).

What do these policies mean in practice?

Faced with thinking about a cap and a means-test, both set at similar levels, most people begin to glaze over, so here is a handy table to illustrate their operation.

How means-tested and capped care costs are affected by household wealth 

The table shows three households. The first, in the lowest part of the income distribution, with modest savings of £30,000, would currently be expected to contribute £6,750 to their own care costs, but a higher means-test threshold would preserve their nest egg and all their costs of care would be met from the public purse. The cap on the lifetime amount they should pay is irrelevant to them.

The next case, with somewhere around the median of net wealth, possibly in the form of a house, would have to find £60,000 before qualifying for public support due to the raised means-test threshold. Since £60,000 is below the lifetime care cost cap in this example, the cap is irrelevant to them too, although a lower cap, nearer to the £35,000 that Dilnot originally proposed, would help them somewhat (by £25,000, to be precise).

Finally, the household with relatively high wealth is unlikely to qualify for public support in a means-tested system. It would have to run down its assets substantially, spending £220,000 before qualifying. But the cap is wonderful news: now only £100,000 has to be found before public support comes in, saving a possible £120,000. If this wealthy household took the precaution of setting aside £100,000 in liquid assets or buying a limited long term care insurance policy, then its housing wealth would be preserved to pass on to its heirs, with the aid of support from the taxpayer.
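The arithmetic behind these three cases can be sketched in a few lines of Python. This is an illustrative model only: it assumes a household simply spends down its assets to the means-test threshold unless a lifetime cap stops payments first, and the wealth figures for the second and third households (£160,000 and £320,000) are assumptions chosen to be consistent with the contributions quoted above.

```python
def care_contribution(wealth, threshold, cap=None):
    """Amount a household pays toward care before public support begins.

    The household spends down its assets to the means-test threshold,
    unless a lifetime cap on care payments is reached first.
    """
    payable = max(wealth - threshold, 0)
    return min(payable, cap) if cap is not None else payable

# Current English rules: threshold of £23,250, no cap.
assert care_contribution(30_000, 23_250) == 6_750

# 2017 manifesto proposal: threshold raised to £100,000, no cap.
assert care_contribution(30_000, 100_000) == 0        # nest egg preserved
assert care_contribution(160_000, 100_000) == 60_000  # below any £75-100k cap
assert care_contribution(320_000, 100_000) == 220_000

# Add a £100,000 lifetime cap: only the wealthiest household gains,
# paying £100,000 instead of £220,000 - a saving of £120,000.
assert care_contribution(320_000, 100_000, cap=100_000) == 100_000
```

The assertions make the regressive pattern explicit: the cap changes nothing for the first two households and is worth £120,000 to the third.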

In short, the cap is blatantly regressive. Tinkering with its level does not solve this problem. Set the cap low, and the state will have to meet the bulk of long-term care costs, which it is already failing to do, so the policy problem will not be solved. Set it high, and more and more people will find their wealth is not protected. Not only will the number of prospective beneficiaries of the cap fall, but the beneficiary group has an undesirable feature: it consists of wealthy people. The higher the cap, the more narrowly its benefits are confined to the wealthiest.

Housing equity insurance

There are much more equitable ways of enabling people to protect their homes than the cap. The starting point must be that people have different amounts of housing wealth to protect, and, the wealthier they are, the more they should pay for their protection. An insurance scheme for those who want to protect their equity in their homes would achieve this. The premium would rise with the value of the house. Those who don’t believe in inherited wealth could choose not to take out insurance.

Economists from John Stuart Mill to Thomas Piketty have pointed out the iniquitous and damaging effects of inheritance on the distribution of wealth. Their arguments have often come to nought, as those with wealth play their political cards well.

A politician with equitable ideals will know that she can easily be derailed by clever rhetorical strategies, but she might at least hope that the economics profession would come to her aid. Instead, Dilnot has bestowed the imprimatur of economic expertise on an opaque instrument that serves the ends of the wealthy and does nothing to solve the problem of funding long-term care.

Deborah Mabbett is Professor of Public Policy at Birkbeck, University of London. 

This blog is adapted from Deborah Mabbett’s editorial commentary in The Political Quarterly journal, available here.

Thursday, 23 November 2017

The 'new' power of class

Mike Savage

We asked a selection of authors to respond to ‘The New Politics of Class’ by Geoffrey Evans and James Tilley

Class still fundamentally affects political engagement, but today it happens indirectly rather than through overt class struggle at the ballot box.

From the time of the 1992 general election, and certainly from the electoral success of ‘New Labour’ in the 1997 election, it became clear that the relationship between class and vote was breaking down. The Labour party moved to the middle ground, downplayed its historical association with the Labour movement, and won more middle class votes. On the face of it, the salience of class in shaping electoral outcomes in the UK has declined.

Not so fast! Geoffrey Evans and James Tilley’s new book attempts an important rebalancing act. They argue that class is still at the centre of political alignments, but that the ‘new’ power of class lies in differential levels of political engagement rather than in differences in party preferences.

The new class divide is that the working classes increasingly abstain from formal electoral politics because they feel they are not being effectively represented. To illustrate, the class gap in non-voting has risen precipitately since 1987. Only around 20 per cent of well educated professionals did not vote in 2015, compared to around 50 per cent of workers with low education levels (controlling for trade union membership, gender, race, region and religion).

Interestingly, Evans and Tilley’s argument chimes with that of Pierre Bourdieu, whose brilliant observation in Distinction – that the prime political divide is not between left and right, but between those who are engaged in politics and those who are excluded from it – is underlined by their findings (although his work is not mentioned in the book).

This book does a signal service in insisting that class is not dead, but it seems both narrow and detached from the real world of political debate we are currently experiencing. We learn little about the politics of populism, elites, experts, racism, immigration, nationalism and meritocracy, which currently dominate the agenda, and which indicate the visceral ways that ‘a new politics of class’ operates today.

We need to be careful in assuming that there is strong class consciousness today. There is abundant evidence that ethnicity, sexuality, gender, nationality, age and location are often more salient than social class, but this is not discussed here. For example, the relative success of Corbyn's Labour party in the 2017 general election, with its left wing manifesto, appears much more linked to the support of younger voters and ethnic minorities than to any recovery of its working class base. Class is often powerful precisely because it is not talked about, or fully recognised by electors.

Make no mistake, this is an important book reasserting the importance of class, though I feel that further work needs to be done to draw out how class itself is being remade and how intersectionalities with race, gender and age are crucial in generating contemporary political divisions. My Social Class in the 21st Century is one attempt to do this.

Mike Savage is Professor of Sociology at the London School of Economics.

This article is adapted from a piece in The Political Quarterly journal. You can read the full article here.

Wednesday, 22 November 2017

How austerity has impacted our school system

Fran├žoise Granoulhac

It is time to take stock of what austerity policies have meant for the school system in England.
Seven years after the election of a Conservative-led coalition government engaged in a lasting deficit reduction programme, the extent of spending cuts between 2010 and 2016 is such as to raise serious questions about their short-term and long-term impact on the school system.

In addition, the distribution of resources is indicative of the priorities underlying policy decisions, and can be contrasted with the pledges initially made, and renewed by David Cameron five years later.

The reform agenda

The Coalition government came into office with a reform agenda based on two main commitments: reducing educational inequalities and completing the ‘schools revolution’ started under the Thatcher administration.

The objective was to establish a diverse school system in which independent state schools (academies, parent-promoted free schools, university technical colleges) would co-exist with and eventually replace locally administered schools. Autonomy – the end of local authority control – and diversity – the end of the ‘bog-standard comprehensive’ – were seen as drivers of excellence, helping to raise standards and close the gap between poorer and richer children.

The impact of austerity

However, right from the beginning it had been made clear that the reduction of public deficits would take precedence over any other policy objective. In a context of financial restraint, how did the government fulfil its pledges?

Examined closely, the spending choices during the Coalition and Conservative terms in office followed a political and ideological agenda. Cuts were not spread evenly, affecting local authority budgets and further education much more severely than schools budgets, which were relatively protected. This reflected George Osborne’s and the government’s wish to avoid a crisis on the frontline but also, strategically, to “make local authorities running schools a thing of the past”.

Above all, the allocation of resources, while being aligned with the reform agenda, showed how priority was given to ‘choice and diversity policies’. The expansion of the academies programme was prioritised over ‘corrective policies’ aiming to provide additional resources to help disadvantaged children (through a Pupil Premium) and families. While schools were modest ‘winners’ in this uneven distribution of resources, they did experience severe cost pressures linked to inflation, rising pupil numbers and, after 2015, rising labour costs.

From cutting spending to cutting costs

But what the school system went through was more than another cycle of spending cuts. Meeting the objectives set in the 2010 and 2015 Spending Reviews, which linked long-term deficit reduction and the return to growth to reform of ‘unproductive’ public services, required a shift from cutting spending to cutting costs.

The transformation of local comprehensive and most faith schools into academies and the creation of free schools was a key feature of the new cost-effective school system. It ultimately made funding of local authority education services redundant, as the costs of those services, especially school improvement, would now be borne by schools themselves and their sponsors.

This first decisive step has opened the way to other forms of cost-effectiveness, with increasing reliance on voluntary organisations and civil society groups and widespread marketisation of remaining local authority services, such as careers advice or provision of supply teachers.

Likewise, the rapid growth of Multi Academy Trusts, which are clusters of schools forming academy chains run by different types of public and not-for-profit private organisations, responds to the need to ‘deliver better outcomes from resources available’ through economies of scale. The fact that the largest academy chains are run by edu-business firms operating under charity status, or by private international groups such as Edison or Cognita has entrenched the presence of private sector providers and opened up opportunities for indirect profit-making.

There are two ways of thinking about the function of privatisation: as an instrument of austerity, and as its main beneficiary.

The end of public service state education?

Does that mean that we are witnessing the end of public service state education? The rise of academies as the standard model of state schooling has accelerated the move from ‘a national system, locally administered’, to what has been termed by David Bell a ‘system of many small systems’, with different types of schools – now including grammar schools – led by different types of providers (universities, private groups, or even successful academies) serving specific individual or community interests.

However, the threats to the public service may be overstated. Full-fledged privatisation, with for-profit companies given freedom to run schools, has lost some of its appeal for Theresa May, as it did for David Cameron: over the last few years the financial and educational risks involved in allowing indiscriminate growth of academy chains have come into the public eye, following reports of repeated cases of financial mismanagement and conflicts of interest.

While the mingling of private and public actors seems here to stay, the state has not been hollowed out yet. In fact, it has retained a strong hand in the direction of education policy, setting targets, determining the desired outcomes and overseeing the performance of the school system.

Public opinion is also a force to be reckoned with, as former Education Secretary Nicky Morgan realised in 2016 when forced academisation of schools, especially primary schools, was resisted. The fact that the education debate is now largely in the public realm suggests that the idea of public service state education is not obsolete. What matters now is to define what degree of public/private provision should be looked for in an education service.

It remains to be seen how this delicate balance, as well as the balance of power between state and private actors in the education system will be played out under Theresa May’s leadership.

Fran├žoise Granoulhac is Senior Lecturer in British Studies at University of Grenoble Alpes

This article is adapted from a piece in The Political Quarterly journal. You can read the full article here.

Tuesday, 21 November 2017

Why our governing economic model is at a tipping point

Alfie Stirling, Laurie Laybourn-Langton

It is widely accepted that macroeconomic policy in the UK and the USA has experienced two major periods of breakdown and significant transition since the start of the twentieth century. But has a third period of comparable change in the UK already begun?

To answer this, it is important to place the UK's present economic ‘moment’ in historical context.

The first period took place between the financial shock and global depression at the end of the 1920s, which led to the forty-year period of economic and policy approaches generally described in the UK as the ‘post-war consensus’.

The second period came between the currency and oil shocks during the 1960s and 1970s, leading to the development of ‘Thatcherite’, or free market economic policies from the 1980s onwards.

Over the course of the decade since the 2007 financial crisis, it has increasingly been acknowledged that a cyclical crisis has become a structural crisis. Many western economies are exhibiting significant structural weaknesses, particularly the stalling of productivity growth and stagnation of average earnings.

We argue that we are currently experiencing a third period of faster than normal transition, with significant change in economic ideas and policy. Notable features of this include the 2007 financial crisis itself, as well as the failure of economic theory to predict it; the unprecedented deviation from historical productivity and earnings growth; and the challenge of setting monetary policy in a world where interest rates are already at their effective lower bound.

It remains to be seen whether the current acceleration in economic change proves to be the beginning of something akin to previous eras. There is not currently a credible candidate for an alternative programme, partly because heterodox theories, such as in complexity economics, necessarily present an existential threat to the very institutions—for example, the Office for Budget Responsibility, the Bank of England or private sector forecasters—that they would need to influence in order to enter mainstream policy.

And yet, powerful, normative demands for an alternative to mainstream economic practice already exist and have been expressed through democratic process in many countries. The conditions are apparent for considerable change in economics and on an historically significant scale, but as yet the missing ingredient is an alternative with the power to displace the existing ‘world view’.

Alfie Stirling is Senior Economic Analyst at the IPPR. Laurie Laybourn-Langton is a Senior Research Fellow at the IPPR.

This article is adapted from a piece in The Political Quarterly journal. You can read the full article here.

Wednesday, 15 November 2017

Mixed communities or missed opportunities?

Bert Provan

Large social housing estates – often high-rise post war blocks – are common in Europe. Frequently seen as problem neighbourhoods, these estates have become unpopular and hard to let, fallen into disrepair, gained reputations for crime and poverty, and from time to time been the scene of riots.
In both England and France these problems have led to major programmes run by both local and national government agencies. In recent years there has been a move towards the policy of ‘mixed communities’ – the idea that in order to improve these neighbourhoods, an influx of middle class working families is needed, to improve not only the income base but also the social and moral capital of the area.

But does this work? And who benefits, even if it does work? The evidence suggests that this is not a magic bullet, despite the political popularity of the approach.

In the period after 1945, there was a crisis of housing due to the very high levels of need created by bomb damage and the pre-war legacy of inner city slum housing. Driven by this housing crisis, and informed by architectural and welfare state notions of modernity and state provision, high-rise, system-built estates were rapidly constructed. Apart from anything else, their pre-fabricated structure reduced the need for skilled building workers, who were in very short supply.

In many cases the initial reaction of residents was very positive – they had new, light and airy flats with central heating, indoor plumbing, and separate rooms for their children, which was a great improvement from what they had before.

The problems emerged after only a few years – buildings were poorly constructed and damp, estates were isolated and poorly served by transport, and design problems led to high crime rates.
With these problems came difficulties in letting the flats, not least as other housing options were becoming available with continuing high rates of construction. The estates could only be let to households with few other options, including homeless people and people with very low incomes, and a cycle of decline led to the entrenchment of these problems.

Government intervention and mixed communities

Government intervention started in the early 1970s in both England and France. Programmes were invented, implemented, inspected, interrupted, and re-invented every few years, dealing with a variety of factors from the quality of the buildings to attempts to build community identity and social capital. There was very mixed progress in significantly improving the outcomes for residents. Although decline was halted and the neighbourhoods mainly stabilised, they remained pockets of poverty within their wider city framework.

The arrival of ‘mixed communities’ as a new putative solution came in the early 2000s. The proposal was to transform these estates into areas where there was a mix of all classes, ethnicities, and capacities to engender higher social capital and community cohesion.

Part of the implementation of this approach can be seen in planning frameworks in England and France, where there are strict (if not entirely watertight) regulations around the need to achieve a specified level of ‘mix’ of what are now termed ‘affordable’ homes within any new housing and community developments or area renewal projects. In France this laudable aim is covered by the “law of city solidarity and renewal” indicating the egalitarian impulse behind it.

When social ‘mixing’ becomes problematic

However, this aim of ‘mixing’ becomes much more problematic when applied retrospectively to existing neighbourhoods with concentrated poverty and dense social housing. The first problem is the underlying objectives of the programme.

One way to look at it is as a means to bring new families, businesses, finance for better housing, social capital, and a better image to the neighbourhood.

Another way is to see it as a morally driven imposition of ‘middle class morals of hard work and good living’ on an estate mainly inhabited by some kind of ‘criminal classes’ or benefit cheats, and expect that the overall standard of behaviour can be raised.

This rather exaggerated-sounding second option reflects many historic characterisations of the residents of poor neighbourhoods (for example, in England, Booth’s poverty maps). But in fact even today these attitudes often form part of political rhetoric about poor areas – for evidence of this we can recall ex-President Sarkozy’s 2005 characterisation of the residents of these areas as “racailles et voyous” (scum and hooligans).

Experiences of the residents

For residents, things may look rather different. One key element emerging from some of the many varied programmes mentioned above is the existence of strong local communities, with high social capital and levels of self-help and cooperation on these estates.

Many programmes have worked to develop the existing capacities of residents, driving improvement from the inside and building on existing strengths. Not only do large scale demolitions and rebuilding with middle class housing undermine this, they also very often result in the forced or inevitable eviction of existing residents from the neighbourhood.

Where ex-residents go is also seldom studied, but the limited evidence suggests that they often end up in neighbourhoods or housing conditions no better than those on the estate they left. Evidence from the slightly different and more systematically studied US experience of moving families out of similar ‘project’ housing neighbourhoods has clearly shown that even if they move to better (more ‘mixed’) neighbourhoods, the outcomes in terms of levels of achievement of the families are little changed, or poorer, compared to families remaining in their original homes.
There is also a version of this poor outcome in existing attempts to create ‘mixed communities’ which often quickly morph into areas of gated middle-class households who go to different schools, shops, and clubs, and seldom if ever mix with the original residents.

Beyond the ‘mixed communities’ approach

But if the ‘mixed communities’ approach does not work for existing problem neighbourhoods, what should we do?

This is not a problem that can be ignored, and good social policy and city governance considerations suggest that better solutions to tackle these pockets of poverty and disadvantage should continue to be developed, despite the patchy results of past programmes.

Perhaps we should start by accepting that mass displacement of poor residents undermines existing capacities and social capital. We can also recognise that many of these programmes are driven by the unfounded fears of more affluent parts of a city rather than a real desire to address the concerns of residents in these areas.

One key lesson from the range of original improvement programmes is that stability and progress are not achieved by exporting residents to unknown and unsupported places. Instead, these poor neighbourhoods need to receive not only their fair share of economic and social investment in jobs, schools, hospitals, transport and training, but also high quality housing improvements, local management and control, and specific attention to the needs of vulnerable residents.

Bert Provan is an Occasional Senior Research Fellow at the Centre for Analysis of Social Exclusion at the London School of Economics and Political Science.

This article is adapted from a piece in The Political Quarterly journal. You can read the full article here.

Friday, 10 November 2017

Understanding the post-liberal centre ground

Adrian Pabst

Much of the post-2017 general election analysis has focused on Theresa May’s spectacular fall from grace and the surge in support for Jeremy Corbyn. What is lacking is a reflection on the fundamentals of British politics.

A decade of financial disruption, austerity and stagnant wages has produced a popular rejection of market fundamentalism. Weaker civic ties have left many people feeling dispossessed and ignored. In an age of economic and cultural insecurity, the task of politics is more than ever to rebuild accountability to people and democratic participation in the polity.

The liberal centre is in retreat

In my article for The Political Quarterly, I conceptualised the double demand for greater economic justice and more social cohesion in terms of ‘post-liberalism’ – moving beyond the free market dogma of the liberal right since Margaret Thatcher and the identity politics of the liberal left since Harold Wilson. These two liberalisms converged in Tony Blair’s ‘third way’ and David Cameron’s ‘compassionate conservatism’, and the liberal centre that has dominated British politics for nearly four decades is now in retreat.

After the Brexit referendum and the 2017 election, we are seeing a series of paradoxes that cannot be mapped according to the old binaries of either left vs. right or liberalism against the rest.

The first paradox is the return to a two-party contest where neither commands a majority. The Tories ran one of their worst campaigns in living memory, but still managed to get 318 seats on a vote share of 42.4 per cent, up 5.5 per cent from 2015. Corbyn’s lively campaign and popular policies galvanised Labour, which increased its share of the vote by 9.6 per cent to 40 per cent, but the party’s achievement of securing 262 seats remains over sixty seats short of a working majority of 323. Despite the Corbyn surge, Labour lost for a third consecutive time against the backdrop of the slowest economic recovery in 70 years and a government that is anything but ‘strong and stable’.

The second paradox is that the Conservatives lost their majority even as they broadened their electoral coalition, while Labour has built a platform for victory next time based on a narrower electoral coalition. Although the Tories lost support among middle-class Remainers and especially young voters, they are at about 40 per cent among manual workers (same as Labour) and at nearly 50 per cent among people with no educational qualifications (compared with Labour’s 35 per cent).

Labour won 21 extra seats in England, but it lacks support among voters aged over 55 and in large swathes of the country – especially suburban places, coastal regions and rural communities. The traditional working class are switching to the Tories, while Labour is now the party of the affluent and the university-educated. For now, neither party is building a cross-class and cross-cultural coalition that can win a stable majority.

The third paradox is that politics is moving both left and right at the same time, but not in a liberal direction. Since Thatcher’s victory in 1979, parties had to move right on the economy and left on social issues in order to win. Now parties are moving left on the economy and right on some social issues (like controlling immigration). 2017 defied the conventional law that British elections are not won on a left-wing economic manifesto. Both parties promised an active industrial strategy and central state intervention in energy and other markets. And both committed themselves to ending the free movement of people after Britain leaves the EU in 2019.

The fourth paradox is that after the election both parties are retreating into their ideological comfort zones just when the country needs a national popular politics. At first, Theresa May seemed to articulate a more economically egalitarian and socially communitarian politics. She denounced both the libertarian right and the socialist left while promising greater economic fairness and more social stability. The narrative of the much-maligned 2017 Tory manifesto was in fact a fusion of Burke, Beveridge and Blue Labour, as I argued in a blog piece for the New Statesman.

Missing: A politics of the common good

But all the talk about breaking with Thatcherism – “we do not believe in untrammelled free markets. We reject the cult of selfish individualism” – came to nothing. Already in her first year as Prime Minister, May’s initially ambitious proposals for corporate governance reform were watered down and the government’s industrial policy provided nothing of substance. Now the Tory arch-Brexiteers want ‘more neo-liberalism in one country’ as they make plans for a low-regulation, low-tax economy boosted by free trade deals with the other countries of Anglo-Saxondom.

Corbyn’s opposition to austerity continues to be very popular and he has been vindicated for his consistent critique of capitalism. But his utopia of ‘socialism in one country’ is fusing twentieth century-style statism with twenty-first century digital platforms. It offers a future for the new, networked generation of globally mobile cosmopolitans. The rest will subsist on a universal basic income funded by taxing tech companies. Automation and artificial intelligence promise to create a post-capitalist economy without work or workers.

Neither party is currently offering a national popular politics of the common good that can build new alliances across the deepening divides of rich vs. poor, young vs. old, north vs. south, urban vs. rural, university-educated vs. no qualification, and so on.

Post-liberalism may not be the right word, not least because it accords too much significance to the economic liberalism that has lost support. But it does name the ‘new times’ we inhabit – the search for political purpose in an age of upheaval.

Adrian Pabst is Reader in Politics at the University of Kent and co-author of The Politics of Virtue: Post-liberalism and the Human Future (Rowman & Littlefield International, 2016)

This article is adapted from a piece in the Political Quarterly journal. You can read the full article here.

Tuesday, 26 September 2017

Where next for long-term care?

Deborah Mabbett

The Conservatives expected to win in 2017, and the manifesto was written accordingly. For the small team in Number 10, it was an opportunity to fix policies where there could be internal dissent and backsliding. The House of Lords observes a convention of not opposing policies that are based on clear manifesto commitments by the winning party, and May’s team evidently hoped that dissidents in the Commons could be subjected to a similar discipline. The manifesto was taken as an opportunity to reorient the party towards a new social and egalitarian vision of conservatism in which the instincts of middle England were identified and distinguished from the interests of the cosmopolitan elite.

Apart from the obvious problem of managing a party packed with representatives of the cosmopolitan enemy, the image of middle England was always vulnerable to deconstruction once policy details emerged. General and rhetorical appeals gave way to the calculus of interests, and therein lay a problem for long-term care policy in particular. Middle England may believe itself to be only ‘just about managing’, but it still enjoys or aspires to home ownership. But a goodly share of future long-term care finance will have to come out of housing equity, and the Prime Minister’s advisers took the plunge and said so. In a policy area riddled with complexity, their answer was crystalline in its simplicity: people should pay for their own care unless they could not afford to do so. Housing wealth, released if necessary by Deferred Payment Agreements, would count in what could be afforded, while the means-test threshold that delineates those who can afford care from those who cannot would be raised to £100,000.

Critics were quick to point out that this scheme offered no insurance against the lottery of long-term care costs for anyone but the worst off, for whom the £100,000 threshold provided protection. And long-term care costs really are a lottery: while many people will spend something on care in their old age, a crippling burden of high costs falls on about 10% of the elderly. It seems an ideal case for insurance, especially as the burden is genuinely difficult to predict. Wealth and education do not guard the privileged against dementia: the disease where the costs can be highest and most prolonged.

The argument that there should be insurance against very high costs, and that the state should provide it, was accepted in the Dilnot report. It proposed that there should be a cap on the amount anyone should have to pay towards their own care. The amount up to the cap can be thought of as an insurance ‘excess’. Economic theory (specifically, ‘Arrow’s theorem of the deductible’) suggests that it is efficient to have a certain amount of excess or self-insurance, although, somewhat confusingly, Dilnot thought that the private sector might provide insurance for the capped amount. Beyond that, the public sector would step in, effectively providing the ‘stop loss’ insurance. Thus the idea of a cap on the amount that anyone should have to spend on care became established, having received the imprimatur of economic expertise.

In 2015, the cap on care costs seemed to be a done deal: the Conservative and Liberal Democrat manifestos referred to it as if it were already in force, and Labour indicated that it supported the measure. Thus there was an outcry when the Conservatives did not include a cap in their 2017 proposals. But, as often happens when economic theorems are applied to public policy, the underlying arguments are far from straightforward. Dilnot proposed a cap but gave little guidance on the vexed question of how it should be set. A moment’s reflection tells us that setting the cap will always bring political torment. Set it low, and the state will have to meet the bulk of long-term care costs, which it is already failing to do, so the policy problem will not be solved. Set it high, and more and more people will find their wealth is not protected. Not only will the number of prospective beneficiaries of the cap fall, but the beneficiary group also has an undesirable feature: it consists of wealthy people. The higher the cap, the more its benefits are confined to the wealthiest.

In short, the cap is the policy instrument from hell. How did public policy on long-term care get into this predicament? We return to the Dilnot report on ‘Fairer care funding’ and its conception of ‘fairness’. Fairness, for Dilnot, meant finding a way to protect housing capital against care costs. Having to sell your home to pay for care, the report argued, was widely regarded by the public as unfair. The report gave no hint that there are vast inequalities in housing wealth; instead it claimed that ‘everyone’ faces a significant risk from care costs. This is patently untrue. Only those who have assets above the means-test threshold face a financial risk, and the scale of the risk increases with wealth. Estimates contained in the report showed that, if its proposals were adopted, the largest increases in public expenditure on care would accrue to those with the highest incomes, but this failed to ring alarm bells.

Dilnot also advanced a more prosaic defence of the cap, evaluated against the alternative of universal social insurance. Requiring a private contribution would restrain the cost to the public sector. Public schemes, Dilnot noted, tend to be, or become, underfunded. But the cap was really a limited gesture towards solving the problem of public underfunding. While dismissing a general insurance scheme, the Dilnot proposal envisaged the maintenance of public insurance beyond the cap. There was no discussion of the structure of this insurance: implicitly it was assumed that it would be the familiar British kind, whereby the premiums are collected through general taxation. This approach to insurance has many admirable features. Without the capped excess, it is the basis of the NHS: we pay in according to our income and use the services according to our needs. It is in a way a ‘double’ insurance, providing protection against health care costs and against having a low income. The Dilnot report is not to be faulted for planning on the continuation and augmentation of this insurance, but the report was silent on how that money might be found. That, apparently, was a problem for politicians to solve.

There is plenty of discussion in the report on how households could ensure that they could pay the capped amount: suitable private insurance products might be developed, or provision might be linked to pensions or savings plans. A private model was not seen as viable for the whole amount of care costs: private long-term care insurance has failed to get established anywhere. But insurance for a capped amount would be possible, as the cap would remove uncertainty about the potential cost of care. Thus Dilnot proposed a sort of ‘public-private partnership’ but, as so often with these wizard schemes, the government would have to find more money, and spend it on relatively wealthy recipients, in order to fulfil its role as partner.

In the face of Dilnot’s deafening silence about how to raise more public money, Labour and the Liberal Democrats ventured forth with their proposals. The Lib Dems proposed a 1p rise in income tax and the eventual introduction of a hypothecated health and care tax. In 2015, Labour proposed instead to create a new source of revenue, based on wealth rather than income, aiming particularly at those who have enjoyed large windfall capital gains from their home ownership. But this proposal came under intense criticism and was evidently deemed a vote loser, because in 2017 the party hedged its bets, keeping a wealth tax on the agenda but saying it would seek a cross-party consensus on how to raise the necessary revenue.

In proposing that a significant contribution to long-term care costs would have to come out of (a tax on) housing wealth, Labour tacitly challenged Dilnot’s peculiar definition of fairness, which is that housing wealth should be protected. Labour’s challenge is different to that posed by the Conservatives’ 2017 manifesto proposal, which failed to address the problem that some unlucky people would lose the capital in their homes, and have nothing to pass on to their children. A wealth tax would mean that the risk would be pooled: everyone with housing wealth would pay something, and they would all have a little less to pass on to their children.

Given that Labour’s proposal for a wealth tax in 2015 fell heavily on its face, it seems to be time for a bit of lateral thinking. The party is right to insist that new sources of revenue need to be opened up: in particular, that some sort of charge on wealth is needed if public services are to be sustained without an undue burden on the working age generation. The problem with the care proposal in the Conservatives’ 2017 manifesto was that it was a charge on an unlucky few, rather than a general levy on all those facing the risk of losing housing wealth. The solution to this problem is not to design the whole system of long-term care finance around the protection of housing wealth, as Dilnot did. The missing item in the Conservative manifesto was not the cap, but an insurance scheme for those who want to protect their equity in their homes.

The logic of housing equity insurance is quite simple. The main beneficiaries of the expansion of public long-term care provision are those with something to lose: the equity in their homes. Who do we want to tax to fund long-term care? Wealthy people: meaning, by and large, people with substantial equity in their homes. These dots can be joined up. If the problem with the Conservatives’ proposal was that unlucky people would lose their equity, the solution is to offer protection against that risk with housing equity insurance. Such a scheme is not difficult to devise, and it could have the useful feature that, since the insurance is of housing equity rather than care itself, those with more valuable houses to protect should pay a higher premium. The scheme can be voluntary: those who don’t believe in inherited wealth can reap a reward for their enlightened views.

No doubt this idea has snags that I have not thought of. But it has one great merit, which is to tackle the underlying politics of the long-term care debate. There are good arguments for turning to housing wealth to provide funding for care, but as the debate is currently structured, housing wealth seems untouchable. The cap has become firmly lodged in the policy debate even though it is fundamentally iniquitous. If the argument could be reframed, a solution is possible.

Some years ago, David Runciman dissected the politics of inheritance tax reduction in the US [1]. He pointed out that the wealthiest people, who would benefit the most, had proved skilful in finding frames and slogans which deceived the middle classes into thinking that their interests lay with the rich. Progressive politics has to learn to play this game too. ‘Tax breaks for rich murderers’ was Runciman’s suggestion for pushing back against the ‘death tax’. Unfortunately, some well-meaning people were suckered into talking about the ‘dementia tax’ at the last election, and these things are difficult to row back on. ‘Caps for rich home-owners’ does not have the same ring to it, but that is the policy we seem to be locked into now.

[1] Runciman, D (2005) 'Tax Breaks for Rich Murderers', London Review of Books Vol. 27 No. 11, 2 June.

Wednesday, 13 September 2017

Proscribing National Action: considering the impact of banning the British far-right group

Chris Allen

Following the news that West Midlands Police have arrested five serving members of the British army on suspicion of being members of the proscribed neo-Nazi group National Action, we should consider the extent to which the British Government’s approach to banning extremist groups has been successful.

Over the past two decades, the British Government has adopted a range of different legislative and policy measures in trying to address extremism and radicalisation, one of which is proscription. While the majority of those banned have typically adhered to extreme Islamist ideologies, those adhering to extreme far-right ideologies have increasingly concerned politicians and others alike. In this respect, the arrests will be far from surprising for some.

Prior to proscription under the Terrorism Act 2000 in December 2016, little was known about National Action. While groups such as Britain First and the English Defence League (EDL) had courted media attention, and thereby gained public and political reach, National Action was growing in confidence and numbers. Most concerning, however, was that, as Hope Not Hate noted, its supporters were becoming increasingly provocative, ever more erratic and wholly unpredictable, to the extent that its greatest threat was physical rather than political. There were also very real concerns about the group’s link with Thomas Mair, the convicted murderer of the former Labour Member of Parliament for Batley and Spen, Jo Cox. At his trial, he spoke only to say “Death to traitors, freedom for Britain”, a slogan that featured prominently on the group’s now defunct official website.

On the decision to proscribe, Amber Rudd, the Home Secretary, said that National Action “is a racist, Antisemitic and homophobic organisation which stirs up hatred, glorifies violence and promotes a vile ideology. It has absolutely no place in a Britain that works for everyone” before adding that it was “concerned in terrorism”. It was the first time in British history that membership of a far-right group had been outlawed. Consequently, it became a criminal offence to be a member of the group, to invite support for it or to help organise any of its meetings, and likewise to wear clothing or carry insignia or symbols linked to the group. It would appear to be the first of these offences upon which the recent arrests have been made.

As noted, little is known about National Action. Prior to proscription, it self-identified as Britain’s premier National Socialist street movement. Its founders – Alex Davies and Ben Raymond – were originally members of the youth wing of the now largely defunct British National Party (BNP). Having become acquainted via social media, they agreed that a revitalised nationalist youth movement was necessary in Britain. Recognising the need to be different and distinct from existing groups and movements, the two began to demarcate between ‘good’ forms of nationalism – including the British Democratic Party and Blood & Honour – and ‘bad’ – for instance, the BNP and EDL. From this process emerged the impetus for National Action. Publishing a manifesto in 2012, the group was formed the following year and quickly began to orchestrate direct action campaigns that largely targeted university campuses, and to mobilise supporters to demonstrate in city centres.

Its ideology was unequivocally traditionalist and sought to offer Britain’s youth an authentic interpretation of Nazism. In doing so it broke with a number of recent trends evident within the British far-right milieu. This would appear to have been a deliberate ploy, decrying other far-right and neo-Nazi groups as cowards for being populist in preference to traditionalist. As such, National Action’s ideology was one that included overt expressions of ultra-nationalism, racism, Antisemitism, disablism, homophobia, anti-liberalism and anti-capitalism, among others. It routinely expressed admiration and glorification of Hitler and what it believed were the great achievements of the Third Reich, arguing that this was needed to ‘save’ Britain, ‘our’ race and ‘our’ generation. As part of this, it aspired to establish a ‘white homeland’ in Britain.

It was clear that National Action knew that it was likely to be banned. As such, it used its now defunct website to deny that it was in any way extremist. Citing the legislation, it argued that an extremist organisation was required to use or encourage illegal violence or terrorism to achieve its goals. Countering this, the group asserted that it was instead radical rather than extreme, adding that it could only achieve its goal of establishing a white Britain through state power, and so it was – and always would be – fully compliant with the state’s institutions, including the police, army and intelligence services. Even so, there was always an underpinning threat of violence in much of what National Action said and did. This was evident in how it regularly spoke about the need for its members to prepare for ‘self-defence’ and to undergo combat training. But most overt was its bold statement that, as a group, it was “not afraid to swing the bat at the enemy”.

While banning was as unprecedented as it was unsurprising, a lack of clarity remains about its specific and long-term impact; something given further emphasis in the wake of the recent arrests. Given these were intelligence-led, it would seem that the mere act of banning has not countered the group’s extremist ideology, stopped the group from functioning or prevented its members from being active. On one level this too is unsurprising, in that many of the groups banned under the same legislation have continued to operate by instantly regrouping, albeit with the mere adoption of a different name. That is how easy it is to circumvent the legislation. A good illustration of this is the Islamist group Al-Muhajiroun, which was banned in 2005. Between the initial ban in 2005 and 2014, when the last ban was imposed, the group, its members and its Islamist-inspired extreme ideology continued to function via groups named the Saved Sect, Call to Submission, Islam4UK, Islamic Path, London School of Sharia and Need4Khilafah. Given that it was recently claimed that affiliates of Al-Muhajiroun were linked with over half of all Islamist-inspired terror in the UK, the frailties and weaknesses of banning become apparent.

What would appear to be different here is that National Action and its members have apparently continued to use its name. At the moment, it is unclear why this might be so: could it be that the group were unconcerned with the ban or was it that they did not believe that they would be arrested? At this stage it would be wrong to speculate. Nonetheless, irrespective of whether the group and its members had chosen to continue to use the name its ideology would not have been destroyed on the basis of the ban alone. Also of grave concern is the prospect of National Action – and possibly other far-right and neo-Nazi groups – actively seeking to recruit servicemen to their rank and file. Again, while it would be wrong to speculate about this now it was alleged earlier this year that something similar was happening in the United States via the 4Chan website. It will be interesting to see how the situation develops.

You can read the full article here.