In a complex and interrelated financial world, in which so much information flies across wires, fibre-optic cables and satellite channels every day, the need to rely on heuristics – simple rules that govern your responses to the majority of everyday situations – is becoming ever greater. It is just as well, then, that research across a range of social sciences continues to confirm how powerful and effective such simple rules can be.
The value of simplification was famously captured some seven centuries ago in the maxim known as Occam’s Razor: when there are multiple solutions to a problem, pick the simplest one. The longer the inquest into the causes of the global credit crisis goes on, the clearer it becomes how seriously and how expensively both investors and regulators failed this most basic of tests.
The paradox that the world’s most egregious banking crisis should come after a period in which both the scale and complexity of the regulatory oversight of banks had mushroomed was highlighted with brutal clarity by the Bank of England’s executive director Andrew Haldane in his speech at the Jackson Hole banking symposium in August this year.
After citing a range of academic studies that support the power and value of heuristics in understanding the behaviour of complex systems, Mr Haldane went on to contrast the burgeoning number of rules and people employed by regulators in the financial sphere before the crisis with the striking and incontrovertible evidence of their failure to spot – let alone prevent – the looming crisis before it struck. More, in this as in so many areas of public life, actually meant less.
The trend towards proliferation continues with unabated vigour as politicians and regulators seek to close the stable door. So, for example, whereas in 1980 the UK boasted one regulator for every 11,000 employees in the financial sector, today the comparable figure is one for every 300. The original Glass-Steagall Act of 1933, which required banks to choose between traditional and investment banking, ran to 35 pages. The Dodd-Frank Act of 2010, which aims to put the genie of segregated banking at least partly back in the bottle, runs to 848 pages.
And that, as Mr Haldane pointed out, takes no account of the near 400 pieces of detailed rule-making by various agencies that the Act has spawned. Two years on from its passage through Congress, a third of these new rules have been finalised, adding a further 8,800 pages to the rulebook. The European Union, never to be outdone in any matter of bureaucratic follow-through, will meanwhile produce an estimated 60,000 pages of rules by the time its raft of new financial industry regulations has finished grinding through the enactment process.
McKinsey has calculated that it will take the creation of 70,000 new full-time jobs for European banks to be able to comply with the requirements of the proposed Basel 3 bank capital regime, given the huge quantities of data that will need to be produced and handed over to regulators. There is, of course, no evidence that this exercise in regulatory overkill will make regulation any more effective. Indeed, such evidence as exists points in the opposite direction: regulators will flounder under the sheer weight of complex information, failing – just as bank boards themselves did – to understand the true scale and nature of the risks that really mattered, rather than the myriad others that did not.
Mr Haldane’s suggestion was that a simple, common-sense rule – such as imposing a limit on the amount of leverage a bank can add to the value of the equity in its balance sheet – is far more likely to prevent future banking disasters than any number of complex risk-weighted metrics of the kind that, under Basel 2, so patently failed to prevent the 2008 crisis. (There is an inevitable irony in the fact that the two main epicentres of the debt crisis, sub-prime mortgage lending and more recently sovereign debt, were both assigned minimal capital-at-risk weightings under the Basel 2 regime.)
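The contrast between the two approaches can be made concrete with a small illustrative calculation. The sketch below is not from Mr Haldane's speech: the balance-sheet figures and risk weights are entirely hypothetical, chosen only to show how a Basel 2-style risk-weighted ratio can report a comfortable capital position for a bank that a simple leverage rule would flag as dangerously thin on equity.

```python
def leverage_ratio(equity, total_assets):
    """The simple rule: equity as a share of total, unweighted assets."""
    return equity / total_assets

def risk_weighted_ratio(equity, assets_by_class, risk_weights):
    """A Basel 2-style rule: equity over risk-weighted assets."""
    rwa = sum(assets_by_class[c] * risk_weights[c] for c in assets_by_class)
    return equity / rwa

# A hypothetical balance sheet: 4bn of equity against 100bn of assets,
# tilted towards sovereign debt and highly rated mortgage securities --
# the two asset classes the article notes carried minimal weightings.
equity = 4.0
assets = {"sovereign_debt": 40.0,
          "rated_mortgage_securities": 40.0,
          "corporate_loans": 20.0}
weights = {"sovereign_debt": 0.0,          # treated as risk-free
           "rated_mortgage_securities": 0.2,
           "corporate_loans": 1.0}

print(f"Leverage ratio:      {leverage_ratio(equity, sum(assets.values())):.1%}")
print(f"Risk-weighted ratio: {risk_weighted_ratio(equity, assets, weights):.1%}")
# The risk-weighted measure looks healthy (about 14%) even though the bank
# holds just 4 of equity for every 100 of assets -- 25 times levered.
```

The point of the sketch is that the risk-weighted number depends entirely on the weights, which proved badly wrong for exactly the assets that caused the crisis, while the leverage ratio needs no such judgement.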
Paul Volcker, the most famous central banker of his generation, made a similar point when he appeared last month to give evidence to the UK’s Parliamentary Commission on Banking Standards. The message of his testimony was this: however you seek to draw the line between banks’ retail services and their trading activities, make it a statutory prescription whose purpose is stated as clearly and as simply as possible. If in doubt, in other words, sharpen Occam’s razor.
Discretionary supervision is the last thing the system needs, as banks will always be able to claim they know more about their business than any regulator possibly can. (Mr Volcker amusingly recalled how, when he took over as chairman of the Federal Reserve, a prominent bank CEO came to see him to argue that his bank did not need any capital at all since “we always make a profit”.)
How and where to draw the line between conventional banking activities that justify having an implicit state guarantee and risky trading activity that should remain outside that ring of implicit protection has prompted a range of different proposed solutions in the US (Dodd-Frank), the UK (the Vickers report) and Europe. At heart, however, it remains the simplest of issues. The real risk from the existence of universal banks stems from the potentially contagious infection of a bank’s traditional public-service activities by the culture and skewed incentives of a trading operation.
The important distinction, as Mr Volcker pointed out, is between activities involving a customer, where bankers have a clear fiduciary responsibility, and trading activity with a counterparty, which is a dog-eat-dog activity where no such responsibility exists. The two need to be kept at arm’s length if we are to be sure of avoiding a rerun of the 2008 crisis. If that implies a shrinking of the banking sector, or better still a fragmentation of the industry, so be it – and so much the better if the process is dictated by market pressures rather than by the heavy hand of hapless or hubristic regulators.