The enduring nature of Murphy’s reputation is fascinating in its own right. I rather like the idea that the essence of risk engineering can be expressed by a household name. Whether what we know as his law is in fact to be attributed to Edward Murphy (the Wikipedia entry suggests a number of sources for the idea: https://en.wikipedia.org/wiki/Murphy%27s_law, seen 9/5/2020) is beside the point – that is the name people know it by, so let’s go with the flow rather than argue aetiology. Edward’s son Robert recalled his father saying of another person something to the effect of “If there’s more than one way to do a job, and one of those ways will result in disaster, then he will do it that way.”
The Wikipedia entry (ibid) refers to research showing statements of a related nature being made as early as 1866:
Mathematician Augustus De Morgan wrote on June 23, 1866: “The first experiment already illustrates a truth of the theory, well confirmed by practice, whatever can happen will happen if we make trials enough.”
and then a little later:
“… a report by Alfred Holt at an 1877 meeting of an engineering society.
It is found that anything that can go wrong at sea generally does go wrong sooner or later, so it is not to be wondered that owners prefer the safe to the scientific …. Sufficient stress can hardly be laid on the advantages of simplicity. The human factor cannot be safely neglected in planning machinery. If attention is to be obtained, the engine must be such that the engineer will be disposed to attend to it. ”
The common factor here is to be found in the words “if we make trials enough” and “sooner or later”. Edward Murphy seemed to be blighted by a person who could be relied on to make it sooner, not later. Risk engineers will recognise that these refer, respectively, to Exposure and Error Probability.
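The force of “if we make trials enough” can be shown with a little arithmetic. The sketch below is my own illustration, not from the article: it assumes independent trials with a constant per-trial error probability, and shows how a tiny Error Probability combined with large Exposure makes failure a near certainty.

```python
# Illustrative sketch (assumed figures): how Exposure (number of trials, n)
# and Error Probability (per-trial probability, p) combine. With enough
# trials, even a tiny p makes at least one failure almost certain --
# "whatever can happen will happen if we make trials enough".

def p_at_least_one_failure(p: float, n: int) -> float:
    """Probability of at least one failure in n independent trials."""
    return 1.0 - (1.0 - p) ** n

# A one-in-ten-thousand error, faced once a day for 30 years:
p = 1e-4          # per-trial error probability (assumed)
n = 365 * 30      # Exposure: one trial per day for 30 years
print(f"{p_at_least_one_failure(p, n):.2f}")  # roughly 0.67
```

Under these assumed numbers the chance of at least one failure over a working lifetime is about two in three – Holt’s “sooner or later” in miniature.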
This is simple enough, but underlying this is the elephant in the room: Holt’s “the human factor” and Edward Murphy’s “if there’s more than one way…he will do it that way”. The idea of the perfect person has a history that predates these sources. It goes back to the influence of the legal system in the industrial revolution, something of which I will write in a future article. If you think the idea is not relevant today, well – consider these cases.
Sometime in the late 1970s a coach taking skiers up to Falls Creek ski fields in Victoria from Mount Beauty (in the valley beneath) stopped to negotiate its way past another similar coach coming down the mountain. The sealed surface was narrow. The descending bus had no room to move further to the left because of the mountain side. The only room for movement, little though it was, was that for the bus ascending the mountain. The driver did what he could to make room for the descending bus. The shoulder of the road subsided and the bus tilted and slid. As the wheels got to the steep hillside, the bus tilted to the point where it rolled and rolled and rolled, eventually coming to rest some distance down the hill. Windows shattered, people were thrown out and chaos reigned. By evening some injured had yet to be treated. It is the nature of Risk that one may be absolutely certain this was not the first time that drivers had experienced this problem. Almost certainly it occurred every day of the ski season.
The response of those on the bus (a month or two later), heard by me first hand, was an annoyance with the driver because he had moved too far to the left (is that a value judgement I hear or a statement of the obvious?) and hence ‘caused the accident’. I don’t know whether he was prosecuted. I understand (unconfirmed) that subsequent to this changes were made to the road (perhaps increased width, edge barriers and the like).
On the 5th of June 2007 a truck collided with a train on the Kerang, Victoria, level crossing, killing 11 people and injuring others. See https://en.wikipedia.org/wiki/Kerang_train_accident (seen 12/5/2020).
A 1995 crash at the same site had resulted in a fatality and there had been “a number of near misses in the months leading up to the 2007 accident”. I have a first hand report from a railway labourer who worked that district and he told me that the maintenance crews were aware of the design problem (the road approached the rail at an angle and they had seen some ‘near misses’ when working there). They told their supervisor of this, but he appeared unimpressed and my informant thought it unlikely that he would pass on the information to his manager. This apparent lack of attention to reports is mentioned also in an article in the Age newspaper. “… calls for a major inquiry … seemingly forever from engineering experts and people who have lost loved ones in similar circumstances, have mostly gone unheeded”. In the same report is an example of the theory of perfect people in practice: “Transport Minister Lynne Kosky made clear when visiting the crash site yesterday that a big part of the Government’s response would be a renewed focus on driver behaviour.” In other words, the crossing would work well if only people behaved as intended and expected. On this occasion the driver of the truck was acquitted of all criminal charges by a Supreme Court jury. https://www.bendigoadvertiser.com.au/story/1853860/upgrade-victorias-level-crossing-infrastructure-says-coroner-in-kerang-rail-disaster-inquest-finding/ (seen 30/5/2020)
Subsequent changes to the crossing included improvements to warnings to drivers and a programme to address line-of-sight problems at level crossings.
See: Rail Safety Investigation Report No. 2007/09, Office of the Chief Investigator, Transport and Marine Safety Investigations
On 22nd February 2016 a motor coach from Ballarat hit the low (3m clearance) Montague Street overpass bridge in South Melbourne, peeling back the roof and injuring a number of passengers.
See https://www.theage.com.au/technology/accident-waiting-to-happen-20070609-ge5394.html (seen 30/5/2020).
https://en.wikipedia.org/wiki/Montague_Street_Bridge (seen 12/5/2020)
The driver was later prosecuted and jailed for over 5 years but a year later his conviction was overturned. I can only presume that at the time of the original prosecution what is popularly called the outrage factor influenced the legal process and later, with the passage of time, the legal system was no longer influenced by this. According to the Ballarat Courier newspaper, at the time of the Occurrence, it was the 226th time the bridge had been hit! There is even an unofficial website counting the hits. I recommend visiting this Wikipedia site. It is both amusing and depressing but it contains a unique collection of data that makes meaningful risk estimation entirely feasible.
In each of these three cases it is possible to see a reliance on (by the road designers) and assumption (by the legal system) of ‘proper’ behaviour of the road user. It’s an assumption that perfect people are normal, the corollary of which is that imperfect people are abnormal. There’s an expectation that in the event of a crash the courts will tend to find the person involved abnormal and therefore negligent and in so doing uncover the fact, not noticed prior to this even by their employer, that some anomalous people are, in fact, not perfect. Really? Is this a surprise? I would argue the opposite – that imperfect people are normal and perfect people are a figment of our collective imagination.
These cases are just three that come to mind. I’d ask you to consider how many alternative cases from around the world could have been used to illustrate the point I wish to make. I know, for instance, that buses rolling down mountain sides are not at all uncommon. Search the WWW for yourself.
Each of these cases concerns road design, but the problem they illustrate is not confined to road design. In our profession, road design is the province of civil engineers. Do we therefore blame them for this? I don’t think we should, not only because I have a long-standing dislike of ‘blame’ but also because we, as engineers, work within a very complex social and organisational system. Blaming engineers would overly simplify our understanding of the influences that lead to situations such as each of these.
As an illustration of these complex system interactions, the reasons for the Montague Street bridge clearance being as low as it is are explained in the Wikipedia entry (ibid). After construction (1914) flooding of the road under the bridge was a problem (let’s assume capable of engineering prediction and maybe also control) such that it became unusable by pedestrians. In 1934 the responsible council solved that problem by the engineering solution of raising the level of the road surface by two feet, thereby creating the current abnormally low clearance of three metres. The fact of the low clearance is announced to road users by “prominent signage”.
There’s a further problem in that the bridge then carried trains (and now a light rail track) and the bridge itself could be damaged by impact with possible adverse effect on these services.
A risk-literate engineer would consider the possible failure Mechanisms of this strategy:
- the driver does not know the height of their vehicle;
- the driver does not see the sign owing to attending to traffic conditions;
- the driver knows about the bridge but is distracted – their attention is elsewhere (phone call, crying baby, anxiety about a personal problem etc. etc.).
The probability of any one of these is influenced, for example, by the possibilities that a driver is entirely new to the area and has never seen a bridge of non-standard clearance before and that a driver is unaccustomed to driving high and wide vehicles, etc.
Looked at objectively rather than bureaucratically (we put a sign up) or legally (it’s easy to blame a driver), we would see that the potential for a bridge strike with serious consequences is self-evident to the responsible party and is therefore only a matter of probability, in other words a matter of when rather than if (recall Exposure – “if we make trials enough”). It is clearly not in the death and taxes category. The probability is that expected of human error in this situation: otherwise normal traffic and drivers operating in routine mode. For drivers familiar with this low clearance, it would also be a routine task. Enquiries would reasonably consider what proportion of traffic in general is tall vehicles, and what proportion of these or their drivers are from outside this area. A little investigation illuminates the problem – the risk factors that influence probability and Consequence Values. The answer will be somewhere in the range of expected error. I’d tentatively put it in the range of 10⁻² to 10⁻⁴ [unaware approaches to the bridge per vehicle]. However, not all of these will hit the bridge as at the last moment there is the possibility of recognition, the ‘goodness me’ moment and effective braking. It is only when all this fails that we have a collision. The collision in the case of the coach was spectacular and so reached the newspapers.
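The structure of such an enquiry can be sketched in a few lines. Every figure below is assumed for illustration – traffic volume, tall-vehicle fraction, recovery rate – except the unaware-approach range, which is the tentative 10⁻² to 10⁻⁴ suggested above. The point is the shape of the calculation, not the numbers: expected strikes = Exposure × error probability × probability that last-moment recovery fails.

```python
# Hypothetical back-of-envelope estimate of a bridge-strike rate.
# All input figures are assumptions for illustration, not real data.

def expected_strikes_per_year(vehicles_per_day: float,
                              tall_fraction: float,
                              p_unaware: float,
                              p_no_recovery: float) -> float:
    """Expected strikes/year = Exposure x error prob x failed-recovery prob."""
    tall_approaches_per_year = vehicles_per_day * 365 * tall_fraction  # Exposure
    return tall_approaches_per_year * p_unaware * p_no_recovery

# Assumed: 20,000 vehicles/day, 2% of them tall, mid-range unaware-approach
# probability (1e-3), and 1 in 10 unaware approaches ending in a collision.
print(round(expected_strikes_per_year(20_000, 0.02, 1e-3, 0.1), 1))
```

Even with these made-up inputs the answer comes out as strikes per year, not per century – which is exactly why a running tally of hits, like the unofficial counter mentioned above, is such valuable risk data.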
Ignoring the reality of the situation created all those years ago and the possible increase in vehicle heights and traffic volumes (Exposure) in the intervening time, the simple reaction is to plod through the legal process of prosecution (dangerous or reckless driving perhaps?), judgement, appeal etc. Newsreaders will announce that the police are investigating the cause of the accident and the unspoken subtext is that we can then all rest easily because this will be found and removed.
My point in drawing attention to these three cases is to use them as a springboard for questioning what the role of the Risk Engineering Society could perhaps be in promoting a wider community understanding of the realities of risk and the needs of its management. I think the wider community here includes the engineering profession in general, those who are not part of this society. Of course it also includes, in the context of these three cases, local government, the police and the legal system. The tragedy at Kerang draws attention to the needs of organisations to make use of the information that is present within them (“near miss”) but largely unrecognised, in other words management of risk practices as whole.
My experience as a practitioner in this field over some 40 years is that if it is possible to name one major obstacle to improved management of risk it is simply this – not understanding the subject of risk. Many times I despaired of making progress in my client organisation, so deeply ingrained and immovable were the unhelpful preconceptions of everyone about what ‘safety’ or even ‘risk’ was. These preconceptions were expressed practically in the role and activities of the safety department, which I regret to say were very often ineffectual. More recently (in my timescale at least) they have given rise also to the parallel activities of the risk department. Picture an organisation in which in head office there is a risk management department and in the real world of operations there is a safety department. And picture these two departments operating without reference to one another and basing their work on two entirely different concepts. It’s the stuff of comic opera. Reality is stranger than fiction.
Furthermore, in organisations employing professions with a science base (engineers, geologists, chemists etc.) there was, in my experience, almost always a conceptual gulf separating these professions from the risk or safety people. In the case of engineers especially, I mostly found that the politely unexpressed desire to have no involvement at all with the safety/risk people was easy to sense. The reason, I feel sure, is that the latter were seen to be akin to snake oil salesmen – all evangelism, no substance/science. I well remember attending a talk by the chief safety person from a well known major technology company from the USA, who was visiting Australia. He said that he took every opportunity to spend a few uninterrupted seconds with the senior engineers who ran the company to promote what he did and the values he represented. His favourite place was the lift, as his audience was captive there. How very depressing, but also how illuminating and illustrative of the gulf of which I write. That was over a decade ago and times change. With the overwhelming attention given to safety and risk assessment by legislation I have more recent experiences to indicate that similar senior engineers now believe that there must be something in what the safety department think, given the legislative promotion, and perhaps they should rely on them and their methods for guidance. The last occasion I became aware of this was after a disaster of biblical proportions involving a particular engineer’s profession. He shook his head and told me he and his colleagues all over the world felt they had been let down by the safety people.
I am picking up here on a point I made in the article on Murphy’s Law – “This means risk engineering is something all designers should do and our role as professed risk engineers is to both promote the science and application of the subject and assist designers ourselves.” I believe the task of promotion is actually greater than attending to the needs of designers, it applies to science-based professions in general as well as to managers, accountants and even lawyers. My experience with the legal profession is that they find the concept of risk and uncertainty hard to understand, coming as they do from the position of the perfect and reasonable person and the adversarial system that is the basis of legal practice.
To do this, we need a jointly held strong scientific foundation for our theory, our terminology, our thinking process and hence what we talk about and how. We need to spend time on this and hold scientific meetings (just as physicists do), seminars and conferences on this topic alone. When we have developed a sound and shared knowledge base, we can confidently move outwards and provide others with the concepts and vocabulary to make sense of their own profession wherever uncertainty is to be found. This is everywhere other than death and taxes, so the task is worth doing. If we don’t do this, who will stop the history of the world being punctuated by disasters of biblical proportions?
(This article was first published in the Australian Risk Engineering Society’s newsletter in May 2020)