Volkswagen, scion of German industry, has fallen off its pedestal after revelations that up to 11 million of its diesel cars were fitted with software designed to deceive regulators about levels of toxic NOx emissions.
It all began to unravel in May 2014, when West Virginia University's Center for Alternative Fuels, Engines & Emissions (CAFEE) discovered that two of the three diesel-powered cars it was road-testing emitted between 5 and 35 times the official NOx emissions standard, though in laboratory conditions they complied. This was the first whiff of a rat.
According to a California Air Resources Board (CARB) letter of 18 September, CARB took up the case and pursued investigations. Discussions ensued with VW, which solemnly carried out tests and provided fixes, which did not work when CARB tested them.
As CARB relates in its letter, on 3 September, after a year's prevarication, VW confessed. As the world now knows, large numbers of VW's diesel-powered motor cars were "designed and manufactured with a defeat device to bypass, defeat or render inoperative elements of the vehicle's emission control system" whilst meeting official testing procedures.
When the news became public, on 18 September, it was met with incredulity, and the VW share price dropped by about 25%. After defending an almost indefensible position for five days, VW's Chief Executive Martin Winterkorn resigned. Some began asking whether VW would survive.
Beyond VW there was collateral damage. The share prices of other car makers dropped, though less than VW's. A discussion began as to whether diesel technology, held responsible for large scale health problems, was a wrong turning that should be abandoned. And questions were asked whether, after a long string of governance failures in Germany, German industry was really the paragon represented by its good reputation.
But the intriguing question is: why did this happen?
We do not yet know what happened, but it is not unknown for major corporate scandals to have their genesis in the board room. Think of the Olympus and Toshiba scandals. That said, there is no evidence of active boardroom involvement, and Mr Winterkorn's protestations of surprise imply that he first learned of the problem very late in the day. But the composition of the supervisory board, which does not seem to have been chosen with skill sets as its primary concern, has echoes of the boards at Airbus at the time of the A380 crisis and at the UK's Co-operative Group when it almost collapsed.
If that is so, he must have been in the dark: a piece of deliberate skulduggery seems to have been going on under his nose, but without his knowledge.
This, sadly, is a very common state of affairs. We call it the Unknown Knowns problem. There are things that leaders would dearly love to know - but they cannot find out until it is too late. In our research, 85% of leaders were taken by surprise when a serious crisis engulfed their company. Yet most of these crises were caused by systemic failures that had lain unrecognised for years, sometimes decades.
The most telling example is that there have been at least 14 rogue traders, averaging one every eighteen months, since Nick Leeson broke Barings in 1995: most recently the London Whale, whose losses surfaced in 2012. JP Morgan sustained losses of $6 billion on positions said to amount to about $160 billion. All the rogue traders operated in an environment where risk teams are huge: JP Morgan's risk team ran to thousands, but they didn't spot the Whale; nor did the astute Jamie Dimon - until an even more astute hedge fund spotted his problem from the outside and began to trade on his misfortunes.
What seems to happen is that some combination of character, culture, leadership, targets, incentives, corner-cutting, complexity, groupthink - and the slippery slope from gently bending rules to breaking them - leads an individual or team to start doing something that, as Warren Buffett put it, you "wouldn't be happy to have written about on the front page of a national newspaper in an article written by an unfriendly but intelligent reporter." It doesn't help if regulators are not robustly independent.
Once the wrongdoing has begun, it is very hard for participants to confess - doing so will probably lead to unpleasant sanctions - so it continues. The hole gets deeper.
The wrongdoing is rarely known just to the participants. Others usually know, but are unwilling to rock the boat. This may be because the wrongdoer has higher status, or because they are in the same 'tribe'; but, as time passes, unwillingness becomes tacit complicity. Many may know things are wrong, but they won't tell anyone above them. Often the root causes are visible to a thoughtful, perceptive outside observer, such as a hedge fund or professional investor. We call these companies 'predictably vulnerable'.
There may be a potential whistle blower; but anyone who researches whistle blowing as an activity will discover that it is commonly career-ending, and at best merely frustrating. It takes courage and determination to blow the whistle; and it takes an exceptional leader to listen with an open mind and understand what a whistle blower is alleging.
This is one of the ways in which leaders find themselves in the dark. Breaking this silence is difficult. It takes an insider-outsider, as anthropologists term it, armed with trustworthiness, skill and an understanding of human behaviour, to learn what insiders think and know but won't tell. A sensitive investigation should uncover the root-cause behavioural and organisational risks that lead to Unknown Knowns, so that leaders can fix them before they cause more harm. And as our research also shows, leaders usually do have some time.
An investigation may uncover things you wouldn't be happy to have written about on the front page of a national newspaper. If so, you should listen and learn; and be grateful for the opportunity to deal with them before they blow up and destroy your personal reputation as well as that of your organisation.
- Reputability LLP
Monday, 21 September 2015
As organisations grow, so do teams. The work of Robin Dunbar, an evolutionary psychologist and anthropologist, suggests trouble starts as group size extends beyond about 150. As teams grow and multiply, so do team identities and purposes. Those in one team can easily come to see those in another as outsiders and rivals. Cooperation becomes more difficult as their interests increasingly conflict.
When we examine the entrails of crises, persistently asking the question “why?” we often find that the root causes were well known at mid-levels of the company. Sometimes the actual crisis was predictable, even predicted, from what one individual knew – but for a variety of reasons no message arrived in the consciousness of someone sufficiently senior to take action. Frequently, key information known at mid-levels was spread among individuals who did not share it - so the information was never joined up. We have dubbed this the 'Unknown Knowns' problem.
We analyse causes of failure such as these by reference to factors such as culture, incentives, structural silos and the resulting non-communication of information. Risk, psychology and sociology inform the analysis, but we have long suspected that anthropologists could enrich the analytical framework – if only they were interested in the business world.
Gillian Tett is a postgraduate-level anthropologist turned FT journalist, and some of her most perceptive writing has taken an anthropological look at business life. Her latest book, ‘The Silo Effect’, explicitly brings her anthropological training to bear on it. As you would expect of an experienced journalist, it is engagingly written.
Tett begins by introducing anthropology and summarises a few core anthropological insights. Three are crucial:
- Human groups develop ways of classifying and expressing thoughts, and these become embedded in their ways of thinking;
- These patterns help to entrench patterns of behaviour, often in a way that reinforces the status quo;
- These mental maps are partly recognised by group members but some parts are subliminal whilst others are ignored because they are thought “dull, taboo, obvious or impolite”, leaving some subjects beyond discussion.
But the outsider typically lacks crucial information that is available to an insider. Tett describes how anthropologists attempt to become ‘insider-outsiders’ with access to inside information whilst retaining the relative objectivity of the outsider. It is no accident that our methodology has much in common with what she describes. We face the same challenges: except that we also aim to help insiders to understand what outsiders can see when given access to insiders’ knowledge.
The balance of the book consists of case studies, written in Tett’s usual lucid style. Two of her studies of failure are built on her extensive knowledge of the financial crash of the Noughties. She dissects how UBS, the Bank of England and the host of financial market regulators, experts and economists managed not to see the crash coming. A third tells how Sony, then a world leader, reorganised itself into a series of separate business units, each with its own objectives. By creating what became silos, Sony lost internal cooperation - and its way.
Tett tentatively develops her theme to suggest an anthropological approach to mastering silos, from the outside as well as from within. She begins by describing how, with advice from Robin Dunbar, Facebook has set out to build structural bridges of friendship and trust between what might become silos, and to foster a culture that encourages experimentation and cooperation across potential frontiers, treating mistakes as opportunities to learn.
Tett’s second, contrasting tale tackles breaking down long-established silos. Her story concerns one of the most tribally structured professions: medicine. Because medicine is organised around disciplines, there is a wasteful temptation for every doctor to try to apply their particular skill to your symptoms, rather than beginning with an objective diagnosis and only then prescribing treatment, perhaps by another doctor. Tett tells how a perceptive question led Toby Cosgrove, CEO of Ohio’s Cleveland Clinic, to dismantle the Clinic’s disciplinary silos and deliver care centred on the patient’s need for a dispassionate diagnosis before the most appropriate treatment is prescribed.
But for me, Tett’s third tale was the most telling. She relates how a detached but interested outsider, a hedge fund, was able to deduce that JP Morgan’s Chief Investment Office was placing huge bets on credit derivatives – at a time when JP Morgan’s leaders and risk team were completely ignorant of what was going on under their noses, let alone its scale. The hedge fund, BlueMountain, profited from the insight when the London Whale’s positions unravelled. The episode cost JP Morgan more than $6 billion in losses on a series of holdings with a value Tett estimates at approaching $160 billion.
This story resonates with our experience. We regularly find that external analysis can identify organisations that seem blithely to be living on the edge of a cliff. As with real cliffs, it is rarely possible to predict when they will fail. But it is possible to predict why and with what consequences they will fail.
Leaders who seek what can be an uncomfortable foresight will usually have time to deal with the issues and avoid disgrace since consequences usually take time to emerge.
Astute long term investors can use such insights to avoid or improve vulnerable investments. And in those rare cases where the timing seems imminent, there may be opportunities to profit from another’s risk blindness.