What makes an engineer a Risk Engineer?

Thoughts on the distinction between an engineer working in the OHS or Risk Management field and a Risk Engineer, based on a paper I gave at the Risk Engineering Society conference in Brisbane, Australia, on 7 September 2023.

Summary

It is about forty years since the notion of risk engineering was introduced in Australia with the formation of the Risk Engineering Society of the then Institution of Engineers, Australia. A few years later an engineer asked, in a letter to the editor, what exactly it was that risk engineers did or offered. That question could still be asked. It is contended that, taken as a whole, risk engineers have lost knowledge of the underlying science of risk by failing to embrace the subject in their research and education roles, largely because of a general lack of awareness that a science of risk exists. A consequence of this lack of content and identity is that the difference between a person with a degree in engineering practicing in the safety and health or risk management fields and a Risk Engineer has been lost. I give examples in support of this contention and make suggestions for a professional structure that would invigorate both university education in risk and the profession.

……………………………………………………………………………………………………………………………………………………….

What makes an engineer?

The engineers I am talking about are people with undergraduate degrees in any branch of engineering or its associated sciences (such as physics, chemistry or geology) and who are involved in the efficient design and operation of engineered systems. We are people who apply the physical sciences to the benefit (mostly) of mankind. These physical sciences are based on ideas associated with Newton, Kelvin, Maxwell and Einstein.

By way of example, what makes an engineer a mechanical engineer? I’d say the study of the application of solid and fluid statics and dynamics, strength of materials, thermodynamics, metallurgy and chemistry, plus a solid dose of mathematics, including statistics.

Then what makes an engineer an electrical engineer? I’d say the study of much of the above plus the application of Maxwell’s equations.

And also, what makes an engineer an aeronautical engineer? I’d say the study of much of the above plus the application of fluid dynamics, thermodynamics, gas dynamics, stability and control, and aircraft performance.

So, by extension, what makes an engineer a risk engineer? I’d say the study, based on the physical sciences, of the propensity of engineered systems to produce probabilistic by-products in the form of unwanted adverse Outcomes and Consequences.

In short, engineers are nothing much without the science behind us, are we?

What can the physical sciences tell us about risk?

The physical science principles on which an understanding of risk can be based were evident in the 1970s and they have not passed their shelf life any more than Newton’s laws have. 

As a senior lecturer in engineering I became involved in discovering these principles when asked to develop a science-based curriculum for the study of “safety”. It became my job to find what there was and bring it together into a coherent whole. It is true that I have, since the late 1970s, worked with this foundation science, both academically and in practice, and have developed aspects of it, but the foundation itself, the underlying principles of the science, is not fundamentally “mine”.

In this search I uncovered authoritative source material that drew attention to four significant principles:

  1. Cause-effect thinking has been comprehensively discredited for objective use for a long, long time, initially by well-regarded philosophers. Cause-effect ideas have no role to play in a scientific understanding of risk because ‘causes’ are incapable of objective definition. Similarly, the term ‘accident’ cannot be defined objectively and is hence also not a candidate for use in a physical-sciences understanding of this field. See Hume, Collingwood and Haddon.
  2. For nearly 100 years the rather obvious point has been made that damage always arises from a transfer of energy. See the works by DeBlois, Gibson, Haddon and Surry.
  3. The natural sciences of botany, zoology, geology and medicine have all understood the need to focus attention on the processes leading to the phenomenon of interest; they have all learned that without this focus research is stultified. The knowledge base of these sciences does not rely on mathematical expression, unlike that of physics. See the paper by Haddon.
  4. Risk arises when a damage process could occur. It is characterised by uncertainty both in whether the process will occur and, if it does, in how serious the resulting consequence will be. See the book by Rowe for a comprehensive and unequalled treatment of this subject.

So, one must conclude that we should be focussing our attention on damage process models that necessarily involve an uncertain loss of control of the potentially damaging properties of an energy source. Our understanding of this in any given case can be both described objectively in words (what is the risk of interest?) and represented numerically by the relationship between the uncertainty and the consequence value (how significant is this risk?). This numerical relationship, expressed in a frequency vs consequence value graph (the risk diagram), has also been known for a long time. See the Management Oversight and Risk Tree (MORT) and also Rowe’s book.
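By way of illustration only, here is a minimal sketch in Python of what the risk diagram amounts to: each risk of interest is described in words, reduced to a (frequency, consequence value) point, and compared with a criterion line. The scenarios, figures and criterion line are assumptions of my own for the purpose of the example, not values taken from MORT or from Rowe.

    # Each risk of interest becomes one point on the risk diagram: a frequency
    # (per year) paired with a consequence value (in $). All figures below are
    # illustrative assumptions only.
    scenarios = [
        # (description, frequency per year, consequence value in $)
        ("small hydraulic oil release, cleaned up on shift", 1.0,      5_000),
        ("conveyor drive fire damaging the gallery",         0.05,   400_000),
        ("uncontrolled release from the tailings facility",  0.001, 50_000_000),
    ]

    def frequency_limit(consequence):
        """Assumed criterion line on the diagram: the tolerable frequency falls
        as the consequence value rises (a slope of -1 on log-log axes)."""
        return 10_000.0 / consequence

    for description, frequency, consequence in scenarios:
        verdict = "below" if frequency <= frequency_limit(consequence) else "above"
        print(f"({frequency}/yr, ${consequence:,}) {verdict} the criterion line: {description}")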

Process models should not feel strange to engineers. For example, Fault Tree Analysis is a logical statement of what could make an Event happen, and Event Analysis is a logical statement of what could then happen to produce damage. The Event that joins these two necessarily involves energy if the adverse consequences of interest include damage. My explanation of this very simple point has left some engineers in disbelief: astonished, bemused and confused, but no more astonished than I at their response. Why is this so?
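For readers who prefer to see the logic written down, the following sketch (Python, with made-up figures) joins the two analyses at a single energy-release Event: gate logic on the fault tree side gives the Event frequency, and branch probabilities on the event analysis side distribute that frequency across outcomes of differing severity. It illustrates the structure only and does not model any real system.

    # Fault tree side: a process upset arises from either of two initiators
    # (OR gate: frequencies of independent initiators add); the Event, an
    # uncontrolled energy release, occurs only if the upset also defeats the
    # relief system (AND gate). Independence is assumed throughout.
    f_control_fault = 1.5      # control system faults per year (assumed)
    f_operator_error = 0.5     # operator-induced upsets per year (assumed)
    p_relief_fails = 0.01      # relief system fails on demand (assumed)

    upset_frequency = f_control_fault + f_operator_error   # OR gate
    event_frequency = upset_frequency * p_relief_fails     # AND gate, Events per year

    # Event analysis side: given the Event, branch to outcomes of differing severity.
    outcomes = {"no damage": 0.7, "plant damage": 0.25, "injury": 0.05}
    for outcome, branch_probability in outcomes.items():
        print(f"{outcome}: {event_frequency * branch_probability:.4f} per year")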

It will be evident after only a little reflection that this foundation in the physical sciences allows for the objective definition of basic terms and for the estimation of Risk as a real number. It also contributes to an objective and structured understanding of Risk Factors and Risk Control Measures. And, crucially, it allows us to define a science-based curriculum as a foundation for calling ourselves Risk Engineers.

Is this teachable? Yes, eminently so. I have taught many hundreds of engineers over the years. One commented that this should be a compulsory subject for all engineers. Another said that his engineering colleagues at an overseas university were excited to see a different view of risk, and that it had changed his frame of mind in both life and work.

This science has been overlooked. 

Over the years the literature has been dominated by the journal Safety Science, and this in turn by authors from the fields of psychology, sociology and philosophy, whose efforts have satisfied what I see as a deeply rooted need for the subject of “accidents” and their “causes” to be obscured by complexity. I have no wish to demean the significant contribution of these authors in their own fields, but these social subjects do not have the character of the physical sciences that are so fundamental to the work of engineers. Their efforts have not achieved even a consensus on terminology. A further journal, that of the Society for Risk Analysis, has been similarly unsuccessful in establishing even a vestigial definition of the subject area. Even the meaning of the term Risk Analysis advanced by the SRA is quite at odds with how an engineer would typically understand it.

In short, engineers, as proponents of the methods of the physical sciences, have been absent from leadership roles in the meanderings of this field over the last 50 years. This absence is also notable in universities, and therefore in the content of undergraduate courses, postgraduate specialisation courses and research.

This absence is quite extraordinary in my view. It was engineers, faced with ensuring mankind was not obliterated by a failure in the testing of ICBM launch systems (namely an actual unwanted launch), who successfully developed and applied risk analysis, and who then also applied it to understanding risk in the moon landings. Chemical engineers, too, took these methods and developed a whole science for quantifying risk around petrochemical plants.

A consequence of this lack of content and history outside aerospace and chemical engineering is that the difference between a person with a degree in engineering practicing in the safety and health or risk management fields and a risk engineer has been lost.

Is there a problem with all of this?

I believe there is and I will use some case studies of high-profile examples to illustrate both the problem and the way in which the engineering profession could, arguably should, step up to what is now an empty stage.

  1. My first case is the B737 MAX, whose engine placement gives it an undesirable pitch-up tendency at high angles of attack, countered by an automated pitch control function (MCAS), which is by definition a mission-critical system. However, this system relied on input from a single angle of attack vane. No risk-literate engineer would fail to see the problem with this, yet both Boeing’s engineers and those of the Federal Aviation Administration, as regulator, did fail. This case tells me that professional engineers of any discipline need to be risk literate.
  2. My second case is one in which the (very large) main cargo door of a cargo plane opened in flight. Two such cases have occurred relatively recently on the same type of aircraft. This problem also arose with the DC-10 passenger aircraft of the 1970s and 1980s, and the design problem was solved then. This case tells me there is a role for both regulators and universities in ensuring that design lessons from one era survive into the current era.
  3. Thirdly, in 2015 a large tailings dam in South America failed, the Vale dam. Some time afterwards a senior mining engineer told me that he and colleagues around the world were left with the sense that their belief in the capability of safety practitioners to guide their organisations in the management of risk had been misplaced, and that they felt massively let down by them. Like my first case, this tells me there is a need for risk-literate engineers in every engineering discipline.
  4. The point of interest to me in the Space Shuttle Challenger case is that engineering advice not to launch the vehicle was overridden by management for opportunistic reasons. I have recently come to understand that NASA’s space vehicle engineering facility is located in a remote part of the country, well away from the main facilities, and that it is overlooked in lists of NASA’s more glitzy sites. This reinforces my perception that the public profile of professional engineers in general is minimal. We have an image problem.

I conclude that:

  • all engineers need a basic understanding of risk science, just as they do of Newton’s laws
  • we have a need for a stronger role for professional bodies and universities
  • we have an image/reputation problem

Compare this with the medical profession.  

Firstly, the science. Medical science is founded on disease process models, and their classifications are known around the world. Disease names are often in Latin to facilitate international understanding. The language used to explain different diseases is the same; only the story told with it differs from disease to disease. All medical undergraduates are taught this science, and as GPs they learn how to understand and manage commonly arising illnesses and when to refer their patients to specialists.

Secondly, the medical professional bodies create a strong specialist professional structure and universities provide the content and the research to develop it.  

Thirdly, they do not have an image problem. Is it not true that knowledge and ability lead to reputation, and that reputation, once gained, leads to influence?

We, by contrast, do not have a recognised body of knowledge with which to educate either our GPs or our specialists. Nor do we define a specialist risk engineer as someone who must have a deep knowledge of the science of risk as applied to their chosen specialist field.

How would Risk Science help our profession?

I have recently seen in LinkedIn posts what I regard as confusion about the meaning of such fundamental terms as Hazard, Risk and Consequence. This was also the case in the 1970s (I was there). That nothing has changed in 50 years highlights, by its absence, the essential role played by universities in developing knowledge and passing it on from one generation to the next. In my day a body of knowledge was called a curriculum. I mentioned earlier the curriculum of my undergraduate degree, each item of which had a universally recognised content with an accepted and largely unchanging scientific foundation.

My view is that damage process modelling, risk literacy and reliability mathematics should be in every engineering and applied science undergraduate curriculum. Every practicing engineer is, in effect, a GP, as the examples above illustrate. I would define the required engineering undergraduate curriculum as follows.

Curriculum and topics:

  A) Process models for adverse Consequences: energy-based processes, non-energy-based processes, Mechanisms, Outcomes, Consequences, generalised time sequence model
  B) Risk: definitions, characteristics, risk parameters, risk factors, risk analysis, risk estimation, risk control principles
  C) Statistics: descriptive statistics, probability, reliability mathematics

When Risk is properly defined as something other than a synonym for chance, and a very minor amount of logic is applied to this, not only can its typical character be discerned, as MORT showed, as a relationship between Frequency and Consequence Value, but the relationship between Frequency, Probability and Exposure can also be clearly recognised. This in turn allows Risk to be estimated as a numerical value with the units of (guess what?) $ per year, the same as Loss! We see that the only difference between Risk and Loss is due to Exposure.
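As a purely arithmetic illustration (the figures are my own assumptions, not data from any real system), the relationship reads as follows in Python:

    # Frequency = Probability x Exposure, and Risk = Frequency x Consequence Value,
    # which lands in $ per year, the same units as Loss. All figures are assumed.
    probability_per_exposed_hour = 1e-5   # chance the damage process starts, per hour of exposure
    exposure_hours_per_year = 2_000       # hours of exposure per year
    consequence_value = 250_000           # $ lost if the process runs to completion

    frequency_per_year = probability_per_exposed_hour * exposure_hours_per_year
    risk_dollars_per_year = frequency_per_year * consequence_value

    print(f"Frequency: {frequency_per_year:.3f} events per year")
    print(f"Risk:      ${risk_dollars_per_year:,.0f} per year")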

When this is combined with a generic damage process time model, a basic theory of Risk Analysis becomes evident. This provides an otherwise missing theoretical base for both FTA and EA, to the benefit of both. Those of you who understand the proper use of the Bow-Tie model (as distinct from its cause-based misuse) and who have read my work will see where the Bow-Tie model came from: my lecture theatre in the late 1970s.

A working knowledge of statistics is essential to populate Risk models sensibly with probabilities and to recognise the uncertainty associated with the probability estimates themselves. A good grounding in the well-developed mathematics of Reliability is equally important.
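To indicate the kind of reliability mathematics meant here, the sketch below applies an exponential failure model to an assumed service history and then crudely halves and doubles the estimated failure rate to show how uncertainty in that estimate flows through to the mission reliability. Every figure in it is an assumption for illustration only.

    import math

    # Assumed service history: 4 failures observed in 100,000 operating hours.
    failures_observed = 4
    operating_hours = 100_000
    rate = failures_observed / operating_hours  # point estimate of the failure rate, per hour

    mission_time = 1_000  # hours

    def reliability(failure_rate, t):
        """Probability of surviving to time t under an exponential failure model."""
        return math.exp(-failure_rate * t)

    # Crude sensitivity to uncertainty in the rate estimate: halve and double it,
    # for a single component and for two independent components in series.
    for factor in (0.5, 1.0, 2.0):
        r_single = reliability(rate * factor, mission_time)
        r_series = r_single ** 2  # both components must survive
        print(f"rate x {factor}: single {r_single:.4f}, two in series {r_series:.4f}")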

Hence, we have the structure of a curriculum and can identify the topics that give it substance.

How can we ensure that in another 50 years we are not in the same position as we are now?

We need to promote a professional structure to Engineers Australia and use this as a foundation for undergraduate and postgraduate education. At engineering undergraduate level, courses should include at least an awareness of Curriculum items A and B above, and reliability mathematics (item C) should be incorporated into the existing mathematics syllabus. This has more than one benefit, as it is also a necessity for the rational management of maintenance.

Professional Risk Engineers, as the Specialists who provide the detailed back-up needed by the GP engineers in any given industry, should achieve their status by postgraduate qualification.

There is at least one part of the world in which I have worked where there is a government requirement that a person who offers advisory services in the field of safety/risk must have a degree relevant to the industry in which they wish to practice and a minimum number of years of experience (I recall ten) working within that industry. Such rules underline the recognised significance of the work and promote a professional approach at all levels.

If Engineers Australia take action on establishing curriculum requirements, they will create a market to which universities will respond. This engagement with universities will stimulate research and education, ensuring that in another fifty years’ time we are not still wondering how to spell “Risk” and how to distinguish ourselves from safety practitioners and risk management practitioners.

I conclude that the ball is in the court of the Risk Engineering Society.  If the Society kicks the ball back into play and persists with Engineers Australia it will do much for both the profession and for the society we serve. 

References

Collingwood, R.G. (1938) On the So-Called Idea of Causation. Proceedings of the Aristotelian Society, 38, 85–112.

DeBlois, L. (1926) Industrial Safety Organization for Executives and Engineers. McGraw-Hill Book Company, New York.

Gibson, J.J. (1961) The Contribution of Experimental Psychology to the Formulation of the Problem of Safety: A Brief for Basic Research. In: Jacobs, H.H. et al., Behavioral Approaches to Accident Research. Association for the Aid of Crippled Children, New York.

Haddon, W. (1973) Energy Damage and the Ten Countermeasure Strategies. Journal of Trauma, 13(4), 321–331.

Hume, D. (1739) “we are never able … to discover any … necessary connection … which binds the effect to the cause” (see the Wikipedia entry).

Rowe, W.D. (1977) An Anatomy of Risk. John Wiley & Sons, New York.

Surry, J. (1971 reprint) Industrial Accident Research: A Human Engineering Appraisal. Labour Safety Council, Ontario, Canada.
