Sunday, 18 November 2018

The economic incoherence of the Premier League's £5m golden goodbye to its boss

Christmas has come early for Richard Scudamore. And Santa has been exceedingly generous.
The Premier League announced last week that it will award its outgoing executive chairman £5m in "recognition of his outstanding work". Not everyone is filled with seasonal goodwill at the news, to put it gently. The Football Supporters' Federation, in particular, is playing Grinch. "Fans strongly oppose the 'golden handshake' and we urge clubs not to make a decision which is hugely unpopular with supporters," it complained.
But actually the supporters' group arguably has its terminology slightly wrong. For this isn't so much a "golden handshake" as a "golden goodbye" - a perk for past services. Golden goodbyes present a bit of a problem for the usual justifiers of high corporate remuneration. The conventional argument in favour of showering bosses with humongous bonuses is that it incentivises them to do a good job.
One could fill a book explaining why this is analytically flawed and, generally, self-serving tosh. But it's impossible to make this incentive argument, at least with a straight face, when the boss getting showered in cash is departing and, moreover, did not know about it in advance. Scudamore was reported to be "surprised and embarrassed" about the idea, but seems to have got over it.
The basic point is that the £5m, as a matter of logic, can't incentivise Scudamore to work any harder because he won't be there. Scudamore will, apparently, continue to "advise" the league. But this raises the question: why not continue to pay him as an adviser? And the clubs are really not trying very hard to conceal the truth, which is that this is a reward for past performance, not one conditioned on future hard work.
"He deserves everything he gets," glowed the West Ham co-chair David Gold, fastening on his metaphorical Santa cap. The Premier League has, predictably, been highlighting all those blockbuster broadcast deals Scudamore oversaw while he was in charge as context for their generosity.
Three-year television rights when Scudamore took over in 1999 were worth £670m. Now they're well north of £5bn. Explosive growth, unquestionably. But how much of that was personally due to Scudamore, rather than the environment in which he worked? Should we credit Scudamore for the fact that BT decided to enter a bidding war with Sky for football rights in 2013 because it saw live football as a means of luring customers over to its broadband services? And, in any case, is that really the main criterion we should use to judge his tenure? It's an irony that, when it comes to sports, the Americans are much less blinded by primitive free market ideology than we are here in the UK.
American football has aggregate salary caps for players and a redistributive draft system - where the weakest teams have first pick of the best young talents in the following season - to ensure some degree of competitive balance and to prevent the total domination of a small number of franchises with huge fan bases.
Any suggestion that the Premier League should implement something similar would soon be denounced as revolutionary socialism. But the American sporting authorities understand that top-line revenues for a league are not everything.
To its credit, the Premier League is more financially egalitarian than some other European football leagues when it comes to the division of TV money. The inequality in Spain's La Liga - where Barcelona and Real Madrid hoover up the vast majority of broadcasting revenues due to their popularity - is grotesque. Things have slightly improved recently, though the Spanish system remains grossly imbalanced.
But the Premier League's egalitarianism only stretches so far. The "5 per cent of revenues" promise for redistribution to grassroots football made in 1999, whatever the protestations of the league, has not been met. As the football writer David Conn notes, the true figure is closer to 3.5 per cent.
And there have been no moves to rein in the richest clubs in order to look after the broader interests of the league and of football itself. A cache of documents leaked earlier this month led to claims that Manchester City, owned by Sheikh Mansour of Abu Dhabi, used financial skulduggery to get around the Uefa rules on clubs having to match their spending on players broadly with revenues.
Uefa has said it could re-open its previous investigation into the club's spending in light of the new information. But there has not been a peep from Scudamore about that - even though such rule-breaking, if it occurred, would be just as potentially damaging to the competitive balance of the Premier League as to pan-European competitions.
Nor has there been a peep from him, over the years, about any of the rogues who have, at various times, been able to take control of Premier League clubs. They've all been "fit and proper" as far as Scudamore has been concerned.
He who pays the piper calls the tune. And in the Premier League it is the big clubs who have called the tune ever since its foundation. The piper has been very well remunerated for his loyal and steadfast service to their narrow interests. And now he's getting a golden goodbye.

Tuesday, 13 November 2018

Goldman Sachs is implicated in history’s largest financial con – but will it be held accountable?

Even by Wall Street standards of gouging customers this was one hell of a skim.
In 2012 and 2013, the Malaysian government was raising $6.5bn (£5bn) from investors to establish a sovereign wealth fund and finance various domestic infrastructure investment projects. And the cut for Goldman Sachs - the most prestigious investment bank in the world - for arranging the fundraising from the global capital markets? Nearly ten per cent, or $600m.
Now we can have a guess as to why the Malaysian authorities were so insouciant about those extortionate fundraising costs: because they themselves were, apparently, going to loot the pot in one of the biggest frauds in history.
Around half of the fund has gone missing. According to the US Justice Department a fair amount has been pumped into luxury American real estate and shady art auction bids. Appropriately, some went into investing in Martin Scorsese's The Wolf of Wall Street. At one stage $680m mysteriously appeared in the bank account of the former Malaysian prime minister, Najib Razak, who chaired the fund's advisory board, and who is now charged in his own country with corruption.
Malaysian politicians, officials and financiers had effectively bought Goldman Sachs' blue chip reputation to pull in naive investors to the "1MDB" state investment fund. Ten per cent probably seemed a reasonable cut in the circumstances.
The question is: what did Goldman know about the theft? The bank claims today that it was completely oblivious. But the senior Goldman banker on the ground in Malaysia, Tim Leissner, certainly knew. He pleaded guilty in New York to financial crimes related to 1MDB last week, including bribery of officials to ensure Goldman was the sole fundraiser.
What's even more problematic for the bank is that Leissner told the court there was a "culture" at Goldman Sachs of bypassing internal compliance. That's backed up by US prosecutors, who say Goldman's business culture in the region was "highly focused on consummating deals, at times prioritising this goal ahead of the proper operation of its compliance functions".
Goldman has been a Teflon bank over the past decade. Scandals have slithered off it and nothing has really stuck. We found out in 2010 that Goldman Sachs financiers constructed derivatives to help the Greek government deceive the outside world about the true state of its finances prior to the country joining the single currency.
It was revealed in 2013 that, before the financial crisis, the bank had been deliberately designing mortgage-backed investment products to fail and then selling them to unwitting clients. There have been some large fines from regulators for malfeasance over the years but no senior resignations. The top brass have at every stage deplored the bad behaviour of underlings, but insisted they personally had no idea what was going on.
Lloyd Blankfein was one of the few Wall Street chief executives, along with Jamie Dimon at JP Morgan, to survive right through the financial crisis, collecting bonuses all the way. In 2007 Blankfein's total remuneration was $100m. His compensation in 2017: $22m. Clearly austerity in action.
But now Blankfein is implicated in the 1MDB scandal. Reports say he personally met the then Malaysian prime minister and Jho Low, the Malaysian financier accused of masterminding the theft, in New York in 2009.
Low was notorious in New York for his copious and ostentatious nightclub partying and outrageous spending. At the time, the New York Post quoted one person as saying: "Nobody spends their own money like that. It's just weird."
Is it really credible to say that this was all just a local problem, perpetrated by local rogue operatives? Did it really never occur to senior Goldman Sachs managers to wonder why the fees on the fundraising deal were so enormous? Even if it is unproven that top Goldman executives knew what was going on, what does it say about the culture of the bank that individuals like Leissner were employed there? Who is accountable for that culture?
The incoming Malaysian prime minister, Anwar Ibrahim, accuses Goldman Sachs the bank - not just the corrupt individuals who worked for it - of being "complicit" in the looting. And he says Goldman Sachs should return the $600m in fees.
We are about to discover whether the world's most politically connected investment bank - the former employer of dozens of senior public officials, from US treasury secretaries to governors of the Bank of England and the European Central Bank - can brush off being close to the heart of the world's largest financial con.
The answer will tell us something - one way or another - about how much reform there has been in finance in the decade since the crash.

Sunday, 11 November 2018

The worrying echoes of the First World War's beginnings in May's approach to Brexit

In 1909 the British Empire was locked in a frantic race with Germany for naval supremacy. The continent was carved into antagonistic power blocs of states and kingdoms, jealous of each other's colonies. The atmosphere was thick with anticipation of a major new European conflict.
In that year the journalist (and future Labour MP) Norman Angell came to the rescue, with a pamphlet designed to put this febrile talk of European war to bed. Angell's thesis was that the cost of a general war to the European economy was so manifestly enormous that it was highly unlikely to happen - and, if it did, would soon be over. The economic self-interest of nations - of the individual citizens and businesses in them - would block military conflict.
"What is the real guarantee of the good behaviour of one state to another?" Angell asked. "It is the elaborate interdependence which, not only in the economic sense, but in every sense, makes an unwarrantable aggression of one state upon another react upon the interests of the aggressor."
Angell's pamphlet was a bestseller. It was bulked out into a book called The Great Illusion, which was widely translated and published around the world. Its thesis was, of course, just about as misleading as any book's has ever been, either before or since.
We stand amid the ashes of Angell's thesis today as we mark the hundredth anniversary of the end of the First World War. Ten million soldiers and ten million civilians died between 28 July 1914 and Armistice Day, 11 November 1918. Economic self-interest did not save them.
A chapter by David Jacks, part of a new book released last week on the economics of the Great War, shows that the shadow of the conflict was enormous. The arteries of global commerce were thoroughly ruptured in 1914. World exports did not return to the pre-1914 growth path until the 1970s.
The extreme economic dislocation that followed the end of the First World War arguably prepared the ground for the Great Depression, mass unemployment and the rise of fascism that culminated in the Second World War.
The counterfactual question of how the lives of our great-grandparents, grandparents - indeed all of us born over the past 120 years - might have been different if the nations of Europe had not slithered over the lip of the cauldron of war in 1914 is mind-meltingly large. The lesson of Angell's error is, of course, that no forecast of how nations will behave can rest solely on an analysis of economic or commercial interest. But it's a lesson that we still find hard to absorb.
Sometimes we hear that a no-deal Brexit, which would see us crash out of the European Union chaotically next March, is most unlikely because it would plainly be economically destructive for all involved.
British firms understand it. European firms grasp it. The majority of MPs don't want it. Ergo, it's really not going to happen, however much it gets talked up. But accidents do happen. And especially in complex, yet fragile, geopolitical environments.
AJP Taylor has fallen out of fashion these days. But he was the first popular TV historian. And in the 1970s he enthralled British viewers (speaking to them plainly, without notes and without sophisticated production techniques) with his theory of the causes of the First World War. According to Taylor, the military powers in 1914 were tightly constrained by the practical logistics of moving soldiers and supplies around the continent by railway.
"All the mobilisation plans had been timed to the minute, months or even years before and they could not be changed. Modification in one direction would ruin them in every other direction…. Any alteration in the mobilisation plan meant not a delay for 24 hours but for at least six months before the next lot of timetables were ready."
These constraints, according to Taylor, meant that a single spark - the assassination of an Austrian Archduke in Sarajevo - exploded into a wildfire; this was "war by timetable".
Maybe, maybe not. Historians have been debating the causes of the First World War ever since it ended and have been unable to settle on a consensus.
Yet when we think about Theresa May's decision to trigger Article 50 in March 2017 - setting the two-year countdown running on our exit from the European Union - before she had even reached agreement within the Conservative Party, let alone the country, over what sort of post-Brexit vision to pursue, it is hard not to hear disturbing echoes of the Taylor thesis of calamity by timetable.
It's hard not to be reminded of how potentially dangerous are those leaders who have painted themselves, and their countries, into a corner.

Wednesday, 7 November 2018

INTERVIEW: Martin Taylor

Martin Taylor presses his fingertips together, leans back in his chair and stares into the distance in contemplation. I’ve just suggested to him that if he’d fired Bob Diamond two decades ago the whole history of the global financial crisis might have been different. After a microsecond of thought, Taylor breaks into a smile.

“I like to think the financial system didn’t hang on a binary choice of mine in 1998 – that would really be solipsism in the highest degree!” he chuckles.

Fair enough. But the Apprentice-style fire-or-not-fire decision scenario was real enough. Taylor himself described it in an article for the Financial Times a few years ago.

Taylor was the chief executive of the Quaker-founded British bank Barclays. And Bob Diamond’s investment banking division had blown up due to an unexpected sovereign default in Russia, forcing the group to register a large loss.

What made it worse was that tight trading limits for Russian debt set by the bank’s board had been deliberately bypassed by Diamond’s traders. The ambitious American banker, humbled, was offering his resignation.

Taylor turned it down, believing that Diamond was too important to lose. Diamond, of course, went on to drive Barclays into the very heart of the global credit storm that unleashed disaster on the world and the UK economy in 2008.

And he became one of the public villains of the crisis thanks to his spectacularly ill-judged suggestion before the Treasury Select Committee in 2011 that the “period of remorse” from bankers “needs to be over”.

“Looking back on it I think I made the wrong call,” Taylor tells me, sitting in his rather spartan office on the upper floors of the Bank of England’s City of London headquarters. “But it was more about Barclays than the financial world as a whole.”

Tieless, glasses nestling in his blazer pocket, wearing comfortable slacks, 66-year-old Taylor resembles a country solicitor more than a former City titan or the influential regulator he is now, as an external member of the Bank of England’s Financial Policy Committee (FPC). The only hint of financial services about him is a faint pinstripe on that dark blue blazer.

I’d wanted to interview Taylor for some time, and not only for his juicy insider knowledge on Barclays. He’s also been at the heart of some of the major post-financial crisis reforms and institutions.

He served on the Independent Commission on Banking (better known as the Vickers Commission, after its chair, Sir John Vickers) which was set up by George Osborne to decide whether or not to split up investment and retail banking in the wake of the crash.

And since 2013 he has been a member of the FPC, which sniffs out financial crises before they happen and orders banks to inflate their capital buffers when they seem to be getting too excited.

Taylor’s time with the committee is drawing to an end. He has informed the Treasury he will step down no later than June 2019. So perhaps it seemed a reasonable time for him to reflect a little in public, not just on his time on the FPC but on the decade since the crisis.

The first question is, how much has really changed in our financial system over the last 10 years? Taylor’s answer is unequivocal, trenchant even.

“I don’t know whether to laugh or be irritated by it but one of the things I’ve found about a lot of the writing about the tenth anniversary of the crisis is that lots of them basically said ‘nothing’s changed’. But actually almost everything has changed,” he says.

 “The industry has changed profoundly. The relationship of regulatory supervisors to industry has changed profoundly. The incentives have changed. The rules on pay have changed. The rules about liquidity in banking have changed. I think there’s been an extraordinary policy response. And I think there’s been some cultural change. Those who wrote it’s all the same, I don’t know where they’re looking. They don’t see the world that I see.”

I decide this is not a good time to divulge that I myself recently wrote an essay lamenting the timidity of the reform effort. We move on.

Lots of banks have been fined huge sums by regulators over various scandals, from interest rate rigging to insurance mis-selling. Plenty of bosses have departed under a black cloud. But virtually none of the top bankers during the crisis whose institutions foundered and incubated gross misconduct – not even Fred Goodwin of the Royal Bank of Scotland – have been struck off the Financial Conduct Authority’s “approved persons” list. Inclusion on it is needed to work in the industry.

Doesn’t that, I suggest, represent a gross failure of nerve by regulators, a blow to genuine accountability? Taylor’s not having it.

“I think on the whole that the people at the top of institutions that went wrong in the financial crisis, they lost their jobs, they lost a lot of money, they lost their reputations. I think the idea that they didn’t suffer is a strange one. Imagine what it’s like to be in their shoes – it’s not great actually,” he says.

“I don’t think having the FCA write something on your tombstone is necessarily the be all and end all.”

A new book on Barclays’ roller-coaster history by the writer Philip Augar relates that when Taylor was in charge he wanted to split off the troublesome investment banking division. Indeed, he resigned after failing to win the support of the board for his radical restructuring plans.

The Vickers Commission was asked to look into that very question, not just for Barclays but the entire UK banking sector. Yet Vickers stopped short of recommending a clean break and proposed a somewhat complicated “ringfence” of banks’ retail arms instead.

Why didn’t Vickers go the whole hog? Wasn’t this a missed opportunity, especially from Taylor’s perspective?

“I’ve done lots of work for government. If you’re on one of those committees, the ideal space to get to is somewhere where there’s an overlap on a Venn diagram and where what you think is a proper solution is something that government and parliament is going to accept,” explains Taylor.

So did he think that a full split proposal might not be welcomed by the government and risked being rejected?

“I was certainly conscious of it,” he says. “Finding a resolution to something doesn’t just mean recommending something. Recommending an ideal solution which government then does not enact is useless. Equally, finding an inadequate solution the government enacts is useless too.”

But he’s adamant that ringfencing, finally due to come in next year, is an acceptable solution to the problem of banks funding their casino trading habits with ordinary depositors’ savings.

“I remember when we reported it in September 2011 and said we wanted these things to be done by the beginning of 2019, a lot of people laughed and said, ‘This is for the long grass and it’s never going to happen’. And you know what? It’s three months away and it’s happened,” he says.

Yet, on the subject of Vickers, there’s been an unusually public disagreement between the Bank of England and Sir John on the question of banks’ capital requirements.

The FPC has mandated UK banks to have a capital buffer [shareholders’ equity to absorb potential losses] of at least 4 per cent of total assets. But Sir John, essentially, thinks that’s too low – and below what the Vickers Commission itself proposed.
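
To put that 4 per cent figure in concrete terms, here is a purely hypothetical balance sheet – the numbers are illustrative assumptions, not any actual bank’s – worked through as a minimal sketch:

```latex
% Hypothetical example only: illustrative numbers, not any real bank's balance sheet.
% A 4 per cent equity requirement against total assets means that a bank with
% 1 trillion pounds of assets must fund at least
\[
  0.04 \times \pounds 1{,}000\,\mathrm{bn} = \pounds 40\,\mathrm{bn}
\]
% of them with shareholders' equity, leaving the remaining 960bn pounds to be
% financed with deposits and other debt.
```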

What does Taylor think of the concerns of his old chair?

“John has been vocal about the systemic risk buffer. I’m comfortable that where we’ve got to, we’ve got a system with about as much equity as Vickers proposed, and arrived at in a slightly different way, and with an awful lot more debt that is convertible into equity,” he says.

“Would I prefer in an ideal world for some of that debt to be equity? Yes I would. But I think we’re in the right place.”

I point out that even Mervyn King, who was the Bank’s Governor when Taylor joined the FPC in 2013, is now saying that banks should have substantially more capital than the regulatory minimum.

“Mervyn’s a friend and I’ve known him a very long time. He was Governor of the Bank for 10 years and deputy governor for five and chief economist before that and he didn’t actually do it in those 20 years,” says Taylor, slightly mischievously.

“If you’re a macroprudential regulator your default setting is to fall asleep worrying about the banking system and to think a little bit more capital would be nice. And the next time you think maybe a little more. We don’t want more capital to make supervisors sleep better – we want more capital to have the appropriate amount of resilience in the system.”

Often central bankers and regulators are surprisingly gregarious in person but automatons when they write anything down. With Taylor it’s almost the other way around. As an interviewee he’s rather hesitant, diffident even. Our talk is punctuated by long pauses for thought. At a couple of points he asks, almost plaintively, “Don’t you think?”

Rather disappointingly, he refuses to talk at any length about Barclays, where activist investor Edward Bramson is now pressing the bank to get out of investment banking, just as Taylor himself urged all those years ago.

Yet a supremely self-confident intelligence shines out of Taylor’s speeches for the FPC, which are crowded with erudite and witty historical references to imperial China and the French revolution. Their substance is usually the magisterial brushing aside of the latest anti-regulation fallacy propagated by the financial lobby.

And there have been some Exocet newspaper articles. Along with the one that lifted the lid on the Bob Diamond episode at Barclays, there was a particularly memorable broadside in 2009 castigating his former banking colleagues for paying massive bonuses to staff out of illusory paper profits during the bubble.

His mordant conclusion: “The system was brought down not because risk management was deficient (though it was), nor because greed was rampant (though it was), but because bankers could not count.” 

Perhaps that unusual self-confidence with the pen reflects his origins in journalism, and his editing of the Lex investment column for the Financial Times in the early 1980s.

Before I leave, I suggest to Taylor that it’s inconceivable that anyone today could progress from journalism to senior banker to regulator in the way that he did. “It was inconceivable then!” he laughs.

“I had a very strange career. But I think people ought to change sector. One of the problems I found towards the end of my business career was I was dealing with people who knew everything about a rather narrow area. You want people like that – [but] you don’t want everyone to be like that. Because otherwise it’s quite hard to see where connections are made.”

Tuesday, 6 November 2018

State pension equality for women and men is justified - but more important is giving us all the tools to take back control of our retirement

Gender equality wasn't supposed to be financially painful for women. Yet that's how it feels for many of the women who have experienced the sharp rise in the female state pensionable age since 2010, which, as of yesterday, has finally been brought into line with the male age of 65.
Research by the Institute for Fiscal Studies last year found that the increase in the pension age has put upward pressure on the poverty rate of women in their early sixties. And a lobby group called Women Against State Pension Inequality (Waspi) has thrust the issue onto the agenda of journalists and MPs, not least the Labour leader Jeremy Corbyn.
The basic case for equalising the state pensionable age is not particularly controversial. The 1940 Old Age and Widows' Pensions Act introduced the 60/65 disparity in favour of women in an era of entrenched gender discrimination, when women were generally expected to devote themselves to raising children at home and men were seen as a family's main earner. The earlier pension age for women was conferred in an environment of extreme disadvantage in the workplace, when they tended to earn far less and to accumulate far fewer years of social insurance contributions.
But the structure of the labour market has changed dramatically in the decades since, with many more women in jobs. As recently as the early 1970s, the female working-age employment rate was just 55 per cent. Today it is at 71 per cent - a record high. The male employment rate, by comparison, is 80 per cent.
Of course, women continue to face disadvantages and discrimination in the workplace when it comes to pay rises and promotions, particularly those who return to work after having children, but there are few who argue that a lower female state pension age should be used as a partial compensation for all of this.
The fairness question over pensionable age equalisation relates to the handling of the transition. Did successive governments do enough to inform women that the change was coming, to enable them to plan for it? Was the transition, which was accelerated by the coalition government in 2011, too rapid? Should all women who have been affected be compensated? Or only those who are worst off? These are questions on which reasonable people can disagree.
Yet it would be unfortunate if this debate were to crowd out discussion of other, deeper-set problems with UK pensions. One of the reasons the state pension gender equalisation debate gets attention is that it is relatively easy to understand. But bigger injustices now actually lurk in the sphere of private, rather than state, pensions.
The introduction of "auto-enrolment" of workers into "defined contribution" workplace schemes in 2012 was a good reform, using people's natural inertia to nudge them into saving. The results have been impressive, with the proportion of employees covered shooting up from half to three-quarters. But, on its own, auto-enrolment was incomplete and might even prove counterproductive.
This is because there is rampant ignorance and confusion among the public about how the kind of pensions into which they are increasingly being enrolled actually function. A recent survey found that around a quarter of people aged 55 and over admit that they don't know how such pensions work and never check their pots. In a speech in 2016, even the Bank of England's chief economist, Andy Haldane, confessed to being baffled by pensions. "Conversations with countless experts and independent financial advisors have confirmed for me only one thing - that they have no clue either," he added.
So what is being done to rectify this terrifying knowledge gap? Not nearly enough is the answer.
The government has dithered over creating the promised "pension dashboard", which would let people monitor all their pension pots, from various employers, in one place, with consistent information on projected income streams, enabling them to plan for their retirement accordingly. The pensions industry has introduced a proposal for simpler and less confusing annual pension statements but is still, disgracefully, dragging its feet on the transparent disclosure of fund managers' fees.
We need a much broader government-sponsored financial literacy drive, to introduce people to the basics of compound interest, inflation, portfolio diversification, annuities and the fundamental importance of keeping costs down.
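One rough, purely illustrative calculation - the figures below are assumed for the sake of the sketch, not anyone's official projections - shows why those basics, and costs in particular, matter so much:
```latex
% Illustrative figures only - assumed for this sketch, not from any provider.
% A single 10,000 pound contribution compounding for 30 years at a 5% annual return:
\[
  \pounds 10{,}000 \times 1.05^{30} \approx \pounds 43{,}200
\]
% The same contribution with a 1 percentage point annual charge (a 4% net return):
\[
  \pounds 10{,}000 \times 1.04^{30} \approx \pounds 32{,}400
\]
% The seemingly small annual fee removes roughly a quarter of the final pot.
```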
We do not just have gender equality in the state pension age. There is also broad gender equality in the paralysing sense that pensions are mysterious and out of our control. All of us - women and men alike - need urgent assistance to rectify that.

Sunday, 4 November 2018

Sometimes the real economic gamble is too little government borrowing – not too much

With his dad jokes and fetish for spreadsheets, Philip Hammond does not fit the stereotype of a “gambler”.
But the Institute for Fiscal Studies (IFS) nevertheless argues that the chancellor rolled the dice in last week’s Budget and took a rather risky wager.
Instead of using his lower borrowing projection “windfall” from the official independent forecaster to reduce the deficit more rapidly, Hammond essentially spent it all on the health service, while leaving the overall path of government borrowing more or less unchanged.
He could have had a projected budget surplus in five years’ time, but instead there’s still set to be around £20bn of borrowing in 2023-24.
Virtually the entire UK news media took up this “gambler” theme in their headline coverage of the aftermath of the Budget.
Yet we should be extremely wary of this framing. Because it obscures the crucial truth that, in economics, the gamble is sometimes borrowing too little, not too much.
The IFS, to be fair, was using the phrase in a narrow sense of the chancellor jeopardising his chances of meeting his own self-imposed fiscal rules.
Those Office for Budget Responsibility (OBR) borrowing downgrades – whose origins remain somewhat mysterious, given that the official forecaster hasn’t upgraded its nominal GDP or growth forecasts, the most obvious explanations for the recent higher-than-expected tax receipts – could very well be reversed in future budgets.
Since 2010, most underlying borrowing revisions have been negative (implying more borrowing than previously expected) rather than positive for the public finances.
What the lord of forecasting (in this case OBR chairman Robert Chote) giveth, he also taketh away. He even warned as much last week.
And what would happen then? Would Mr Hammond really try to hike taxes while the government is walking the tightrope of a hung parliament? Would he cut public spending when the prime minister has told the country that austerity has ended? Isn’t it more likely that the result would be more borrowing? And what would happen to his fiscal rules then?
All this raises the question of whether or not the chancellor’s fiscal rules are sensible. If a man had resolved to jump off a building, we wouldn’t describe a decision to place obstacles between himself and the ledge as a “gamble” because it might mean him not achieving his suicidal goal.
Hammond’s rules, including a deficit below 2 per cent of GDP in 2020-21, are not suicidal. They are far less economically destructive than those of his predecessor George Osborne, who was insisting on running an absolute budget surplus in 2019-20, ignoring the advice of just about every independent public finance expert.
Yet there are other fiscal rules available. There is no reason to believe Hammond’s represents perfection. Indeed, it’s quite possible for a country to borrow indefinitely and still see the debt stock as a share of GDP decline, provided (roughly) that the nominal growth rate is higher than the deficit as a share of output.
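A rough sketch of that arithmetic – using the 2 per cent deficit figure from the chancellor’s own rule and an assumed 3.5 per cent nominal growth rate, purely for illustration – makes the point:
```latex
% Rough illustration only: the 3.5% nominal growth figure is an assumption,
% not an official forecast. With a total deficit of d (as a share of GDP) and
% nominal GDP growth of g, the debt-to-GDP ratio b evolves approximately as
\[
  \Delta b \approx d - g\,b ,
\]
% so it drifts towards a steady state of d/g. Borrowing 2 per cent of GDP a year
% with nominal growth of 3.5 per cent gives
\[
  b^{*} = \frac{d}{g} = \frac{0.02}{0.035} \approx 0.57 ,
\]
% i.e. a debt ratio settling near 57 per cent of GDP -- below the UK's current
% ratio of over 80 per cent -- even though the borrowing never stops.
```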
Labour’s own fiscal rule targets a day-to-day budget surplus in five years’ time, with a suspension if interest rates are still stuck close to zero – the point at which monetary policy and the Bank of England cannot reliably help to boost growth if we go into recession.
That’s a perfectly reasonable rule, consistent with stable public finances, and one which requires less consolidation – and allows more near-term borrowing – than the chancellor’s.
And then there’s the state of the overall economy to consider.
The OBR judges that there is now no slack in the UK economy, suggesting any additional borrowing would be inflationary. But the OBR may well be wrong about that. Several other credible forecasters think there remains an output gap. Oxford Economics puts it at more than one per cent.
And even if we were to accept that the economy is running roughly at capacity, an output gap could easily open up again if we have a chaotic Brexit. At that stage additional public spending would be an economically stabilising influence, just as it was during the last recession.
But the media’s wholesale adoption of the "gamble" framing from the IFS briefing – failing to put it in the specific context of the chancellor’s own chosen fiscal rules, and neglecting all questions of macroeconomic management – is evidence of what the Oxford professor and magisterial economics blogger Simon Wren-Lewis has rightly called "mediamacro".
A key element of mediamacro is the naive assumption that higher government borrowing is inherently dangerous and that lower borrowing is always praiseworthy.
This is a rule of thumb used by far too many political journalists, commentators, presenters, editors and producers. Some of them do it for ideological reasons, out of their desire for a smaller state and tax cuts. Some lazily accept the framing of politicians. But most ubiquitous and dangerous are those who consider themselves to be neutral and non-partisan yet still drift into looking at fiscal policy through this distorting prism.
It’s depressing and really rather shameful that after a decade of well-documented macroeconomic mistakes across the western world, it’s apparently still necessary to restate the truth that a national economy cannot be usefully compared to a household, and that the only kind of economic “gamble” some seem able to recognise is the one where the risk is more borrowing.

Monday, 29 October 2018

Sanctioning Saudi Arabia is risky, but oil prices will not leave the west over a barrel

How important is the kingdom of Saudi Arabia to the global economy? Does Riyadh hold the world's economic destiny in its hands? These questions are not academic given the profound uncertainty over how the Jamal Khashoggi case will play out in the coming days and over how western governments could respond if credible evidence emerges that the dissident Saudi journalist's killing was not a "mistake" made by "rogue" operatives but was, in fact, explicitly ordered by the all-powerful crown prince Mohammed bin Salman himself.
Donald Trump has threatened "very severe" consequences under those circumstances. And, for its part, the Saudi government also made it clear last week that it would not passively soak up western sanctions or other forms of punishment.
"The kingdom emphasises that it will respond to any measure against it with an even stronger measure," its Foreign Ministry said in a defiant statement. "The kingdom's economy has an influential and vital role in the global economy." Oil was not specifically mentioned. But then it doesn't really need to be.
The Saudi-led oil embargo of 1973, when Gulf states stopped sales to the likes of the US, the UK, Canada and Japan in retaliation for western support of Israel, was one of the most significant economic shocks to the global economy since the end of the Second World War.
It is branded on the memory of politicians and civil servants of a certain age, and on the inherited folk memory of the current generation. The Saudi embargo quadrupled world oil prices, pushed consumer inflation into double digits and tipped the US and states across Europe into painful recessions. Some argue that it even helped wreck the credibility of centre-left governments in the 1970s, clearing the way for the neoliberal revolution of the following decade.
So would we be going back to the 1970s? How much economic and political disruption would a Saudi oil embargo actually cause today? The best guess is that it would certainly be painful, but far less so than in the past.
The global energy market has evolved significantly over the past half century. Western countries have strategic reserves of oil and a wider range of suppliers. Recent years have highlighted the market's resilience in the face of handbrake turns in supply and demand.
The spike in oil prices after 2010, when they hit $125 a barrel, stimulated the domestic US shale oil and gas production sector. The industry grew so fast that domestic energy production today is almost 90 per cent of US consumption. A decade ago the US had net daily imports of 10 million barrels of oil and petroleum products. In 1973 the figure was 6.4 million. Today it is down to just 2.3 million.
When the oil price collapsed to around $30 in 2016, Saudi Arabia held off for a long time from supporting the global price by moderating production, precisely because it hoped the low price would help push highly indebted US shale producers into bankruptcy and restore the Saudi global market share for the long term. By and large that strategy was a failure, and US shale production survived.
It's true that the UK and western Europe are still heavily reliant on imported energy and therefore appear particularly vulnerable to a sudden jump in global oil prices. But the European share of renewables as a source of final energy consumption has also been rising rapidly, hitting 17 per cent in 2016. A spike in oil prices would be likely to accelerate this switch away from fossil fuels (just as the 1973 embargo encouraged western energy conservation measures such as the Nixon administration's 55mph US highway speed limit). Again, while this could actually be beneficial in the medium term for the west, it would hardly be in the Saudi economic interest.
The oil price has been rising since the middle of last year and is now close to a four-year high at $80. One Saudi newspaper columnist has suggested that, if faced with severe western sanctions, Riyadh could slash its roughly 10-million-barrel-a-day production by two-thirds, sending the global price back to $100, or perhaps even on to a record $400 a barrel.
Yet there was little of that kind of sabre-rattling at the Saudi "Davos in the Desert" business investment event in Riyadh last week. The Saudi business folk to whom I spoke were, instead, keen to see western alliances preserved and almost desperate for the present crisis to dissipate.
It's clear why. An act of economic warfare like an extreme oil production cut would destroy Mohammed bin Salman's "Vision 2030" economic reforms. The crown prince's $500bn dream of a high-tech city in the desert will never materialise without tapping western expertise, implying a free flow of knowledge, people and investment. And Saudi will not become a tourist destination, as the current leadership fervently hopes, if relations with the west utterly disintegrate. Saudi Arabia itself has the most to lose economically in any standoff.
There are certainly reasons why the west should tread carefully with Saudi Arabia, from state-to-state cooperation on terror intelligence, to considerations of geopolitical stability. But fears of a repeat of the 1970s oil embargo should not, whatever folk memory holds, be high on the list.