Definition
The word error has different meanings and usages depending on how it is conceptually applied. Senders and Moray (1991, p.25 [16]) define it as “something that has been done which was not intended by the actor, not desired by a set of rules or an external observer, or that led the task or system outside its acceptable limits”. Reason (1990, p.5 [15]) sees an error as “a generic term to encompass all those occasions in which a planned sequence of mental or physical activities fails to achieve its intended outcome, and when these failures cannot be attributed to the intervention of some chance agency”. Woods, Johannesen, Cook and Sarter (1994, p.2 [18]) define error as “a specific variety of human performance that is so clearly and significantly substandard and flawed when viewed in retrospect that there is no doubt that it should have been viewed by the practitioner as substandard at the time the act was committed or omitted”.
Most writers agree on the fundamental aspects of error, seeing it as the result of something that people do, or intend to do, that leads to outcomes different from what they expected. To be consistent with these views, error is defined here as an action or decision that results in one or more unintended negative outcomes (Strauch, 2004, p.21 [17]).
Theoretical frame
Theories of Error
Freud
Freud and his students believed that error was a product of a person's unconscious drives (Brenner, 1964 [5]). Those who erred were considered to be less effective, and possibly more deficient, than those who did not. The concept of “accident proneness” (Greenwood & Woods, 1919 [7]) was influenced by Freud's view of error; it held that certain people are more likely to commit errors than others because of their particular traits. However, later studies found serious methodological deficiencies in the initial studies on which much of the subsequent assumption of error proneness had been based. Lawton and Parker (1998, p.656 [10]) conclude in their article, “…it proved impossible to produce an overall stable profile of the accident-prone personality”.
Norman
Unlike Freud, contemporary error theorists consider the setting in which errors are committed. For instance, Norman (1988 [11]) studied both the cognitive and motor aspects of error and differentiated between two types of error: slips and mistakes. This classification system is also known as the hybrid classification.
Slips are action errors, or errors of execution, triggered by schemas: a person's experiences, memories and organised knowledge. Slips are unintended failures of execution that occur almost every day because attention is not fully applied to the task at hand. Most of us have encountered something similar to pouring orange juice instead of milk into our cereal bowl while reading the newspaper. The act was certainly not intended, but it was not attended to because attention was focused on the newspaper.
Mistakes are errors of thought in which a person's cognitive activities lead to actions or decisions that are contrary to what was intended. Mistakes can result from shortcomings of perception, memory, cognition and decision-making, and result in the failure to formulate the right intentions (Wickens & Hollands, 2000 [20]). To Norman, slips are errors that logically result from the combination of environmental triggers and schemas. Applying the lessons of slips to design, such as standardising the direction of rotation of window cranks in automobiles, would reduce the number of environmental triggers and therefore the likelihood of slips.
A real-life example is the crash of Singapore Airlines Flight SQ 006, which was ready for departure at Taipei's CKS Airport (ASN Database, 2000 [3]). The takeoff roll was executed as planned; however, in bad weather and low visibility, the pilot had deliberately taxied the Boeing 747 onto the adjacent runway, which was closed for maintenance, believing it to be the assigned one. As the aircraft powered down the runway, it crashed into maintenance vehicles and caught fire.
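Norman's distinction can be read as a simple decision rule: if the intention was sound but the execution failed, the error is a slip; if a flawed intention was executed faithfully, it is a mistake. The following minimal Python sketch illustrates that rule; the function and field names are invented for illustration and are not from the source.

```python
# Hypothetical sketch of Norman's slip/mistake distinction (not from the source).
from enum import Enum

class ErrorType(Enum):
    SLIP = "slip"        # right intention, faulty execution
    MISTAKE = "mistake"  # faulty intention, execution as planned

def classify_error(intention_was_appropriate: bool,
                   executed_as_intended: bool) -> ErrorType | None:
    """Classify an unintended negative outcome under Norman's two-way scheme."""
    if intention_was_appropriate and not executed_as_intended:
        return ErrorType.SLIP      # e.g. pouring juice instead of milk
    if not intention_was_appropriate and executed_as_intended:
        return ErrorType.MISTAKE   # e.g. taxiing onto the wrong runway as planned
    return None  # no error, or outside this two-way scheme

print(classify_error(True, False))   # ErrorType.SLIP
print(classify_error(False, True))   # ErrorType.MISTAKE
```

Applied to the examples above, the orange-juice incident classifies as a slip, while the SQ 006 taxi onto the closed runway classifies as a mistake.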
Rasmussen
Jens Rasmussen (1983 [13]) expanded the cognitive aspects of error that Norman and others described, defining three types of operator performance and three types of associated errors: skill-based, rule-based and knowledge-based. This is widely known as the skills, rules and knowledge (SRK) classification scheme. The SRK model describes three different levels of cognitive processing that an individual might use during task performance. Rasmussen [13] argues that a human operates at one of the three levels depending on the nature of the task and the level of experience with the particular situation. That is, when information is first perceived and interpreted in the processing system, it is processed cognitively at either the skill-based, rule-based or knowledge-based level, depending on the individual's degree of experience with the particular situation.
Skill-based performance errors are similar to Norman's slips: largely errors of execution. Individuals who process information at the skill-based level are extremely experienced with the task. They do not have to integrate informational cues and then interpret them; processing is done at a subconscious level. Reactions to cues are largely automatic, as in driving, where one can maintain a conversation on a mobile phone while still paying heed to pedestrians, other vehicles and traffic lights. Pure stimulus-response associations, developed at a neurological level, govern performance at the skill-based level (Wickens, Gordon & Liu, 1998 [19]). Errors at this level are usually errors of execution in which a person deviates from the normal course of action and performs an automatic behaviour associated with another cue, because attention was unknowingly diverted elsewhere. A skill-based error is an action chosen by the operator but not in accordance with the operator's intentions; it has similar properties to the substitution error type from the discrete classification scheme.
Rule-based performance is more advanced than skill-based: it applies rules to situations similar to those operators have encountered through experience and training. Rule-based performance errors result from the inability to recognise or understand the situations or circumstances encountered. Individuals who perform at the rule-based level are familiar with the task at hand but do not possess the depth of experience to perform it at a subconscious level. When cues are interpreted and recognised for meaning, the information is matched with previous experience, and the appropriate actions are then decided upon from rules stored in memory. Errors at this level often result from a misclassification of the situation and the application of the wrong rule. Rule-based errors can also occur from the failure to recognise a familiar pattern because a situational change masked the normal know-how of the task (Wickens, Gordon & Liu, 1998 [19]). This has often been the case with helicopter pilots who engage the cyclic lever instead of the rpm controller when a loss of power occurs: the situation is misclassified, often resulting in mast bumping.
Rasmussen maintains that the highest level of performance is knowledge-based. Rather than applying simple tasks and rules to situations similar to those previously encountered, the operator applies previously learnt information, or information obtained through previous experiences. Knowledge-based performance errors result from shortcomings in an operator's knowledge, or from limitations in the ability to apply existing knowledge to new situations. Individuals who operate at the knowledge-based level have no stored rules from previous experience to draw upon, and thus intelligent problem solving is required. After individuals assign meaning to what has been interpreted, they use working memory to identify what is happening. Wickens et al. [19] explain that mental models are often used to evaluate an action plan, and that extensive analysis of the situation and memory retrieval are used to plan the course of action. For instance, a student pilot with limited flying experience might be able to land an aircraft using the knowledge he already has. However, it would be virtually impossible for him to perform the landing in similar conditions after a failure of his navigational equipment, because he has neither faced such a problem nor been trained to react to it.
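As a rough illustration of the SRK scheme, the sketch below maps an operator's experience with a situation to a performance level and its typical error. The two boolean inputs and the wording of the typical errors are simplifying assumptions for demonstration, not part of Rasmussen's model itself.

```python
# Hedged sketch of Rasmussen's SRK levels as a decision rule on experience.

def srk_level(highly_practised: bool, has_stored_rule: bool) -> str:
    """Map an operator's familiarity with a situation to an SRK level."""
    if highly_practised:
        return "skill-based"      # automatic, subconscious execution
    if has_stored_rule:
        return "rule-based"       # match the situation to a stored if-then rule
    return "knowledge-based"      # novel situation: reason from first principles

# Typical associated error at each level (paraphrased from the text above):
typical_errors = {
    "skill-based": "execution slip (attention captured elsewhere)",
    "rule-based": "misclassified situation, wrong rule applied",
    "knowledge-based": "incomplete knowledge or faulty mental model",
}

level = srk_level(highly_practised=False, has_stored_rule=True)
print(level, "->", typical_errors[level])  # rule-based -> misclassified situation...
```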
Reason
Slips and mistakes offer a clear distinction between two kinds of errors we are prone to making in our daily lives. However, they are not precise enough for classifying errors in operational environments such as aviation. James Reason (1990 [15]) therefore further divided slips and mistakes into distinct categories such as capture errors, mode errors, lapses, rule-based mistakes and knowledge-based mistakes.
James Reason (1990 [15]) enlarged the focus of earlier definitions of error and further distinguished basic error types. He also distinguishes between mistakes and violations, both of which are errors of intent: mistakes result from inappropriate intentions or incorrect diagnoses of situations, whereas violations are deliberate deviations from rules or procedures. In his view, violations are not necessarily negative.
Different Types of Slips & Mistakes
Capture errors occur when a frequently performed task takes over the intended action. Wickens & Hollands [20] define capture errors as resulting from two primary conditions: (1) the intended action and the captured action share closely related characteristics; and (2) the action sequence is the product of automated behaviour and is therefore not closely monitored by attention. The earlier example of pouring orange juice instead of milk is itself a capture error.
Next is the description error, which covers errors where correct actions are applied to the wrong objects (Norman, 1988 [11]). In the early days, pilots of the Fokker F27 aircraft were prone to committing such an error owing to the flawed design of the brake and rudder pedals. The wheel brakes were situated just above the rudder pedals, and pilots intending to apply the brakes after landing frequently pressed the rudder pedals instead.
Where slips represent carrying out an incorrect action different from the intended one, lapses represent the failure to carry out an action at all (Wickens & Hollands, 2000 [20]). Reason (1997) describes lapses as actions resulting from plain forgetfulness, and notes that lapses occur all too frequently in maintenance and installation procedures, where the omission of a single step or task can be critical.
A category closely related to slips, which also shares the memory-failure characteristics of lapses, is the mode error. A mode error is one in which the action is correctly planned and executed, but the action is appropriate only in one particular mode; applied in a different mode, it cannot achieve the intended task. For simplicity, imagine stepping on the accelerator of a car with the intention of moving forward: the action is flawed if the gear is in “reverse” (Wickens & Hollands, 2000 [20]).
Reason [15] further sub-divided mistakes into two categories: knowledge-based mistakes and rule-based mistakes. Mistakes primarily refer to the proper execution of a flawed plan. Knowledge-based mistakes are the errors that occur when a situation is not correctly appreciated or the appropriate decision is not made, owing to a lack of the knowledge or expertise needed to understand the situation, which causes uncertainty to arise. Rule-based mistakes are made with confidence: an inappropriate procedure is chosen, either because the situation was poorly assessed or because the wrong rule was applied. These two sub-categories of mistakes closely parallel Rasmussen's SRK model.
Accident Causation Model
Reason's categorization of errors corresponds to Rasmussen's performance levels. He labels designers and managers, located at the “blunt end” of a system, as committing what he defines as “latent errors” within a system, while operators, located at the “sharp end”, commit what he calls “active errors”, which directly lead to accidents.
Reason originally produced an accident causation model to illustrate how company-related defenses and resident pathogens affect safety. To Reason, even though managerial and design errors are unlikely to lead directly to incidents and accidents, an examination of human error should assess the actions and decisions of the managers and designers at the blunt end at least as much as, if not more than, the actions of the system operators at the sharp end. He describes accidents and incidents as the result of ‘organisational’ flaws, within which four levels of human failure can be identified. The first of the four levels is the unsafe acts of operators, which ultimately lead to the accident. Commonly identified in aviation as operator error, this is where most accident investigations have focused their efforts, and consequently where most causal factors surface. After all, the causes of accidents are typically the actions or inactions of the frontline operators themselves.
Reason [15] described three additional levels of human failure within this concept of latent failures. The first involves the condition of the frontline operator as it affects performance. Referred to as Preconditions for Unsafe Acts, this level covers conditions such as mental fatigue and stress, as well as poor communication and coordination practices. It is important, however, to ask why communication and coordination breakdowns occur in the first place. This is perhaps where Reason's work departs from more traditional approaches to human error: the next level of human failure addresses supervision, and Reason's model does not halt at the supervisory level either, since the organisation's management itself can affect performance at all levels. If the human contribution to accidents is to be reduced beyond current levels, investigators and analysts must examine the accident sequence in its entirety and explore all possible factors. Finally, causal factors at the organisational level must be addressed to identify the root cause of any accident or incident.
His identification of the role of both design and managerial antecedents of error has influenced contemporary treatments of error. Reason's accident causation model has revolutionised thinking about human error and provided a guiding framework, although it is a simplified, classical theory with few details on how it is to be applied in real-world settings. Ideally, such failures would be detected early and corrected before a disaster occurs. For example, ICAO has formally adopted Reason's model of error for its member states to facilitate their understanding of human factors issues and aviation safety (ICAO, 1993 [9]).
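The defence-in-depth logic of Reason's model can be illustrated with a small simulation: an accident occurs only when failures line up across all four levels at once, so the accident rate is far lower than the failure rate of any single level. The four level names follow the text above, but the probabilities and the independence assumption are purely illustrative, not figures from Reason.

```python
# Illustrative Monte Carlo sketch of Reason's layered-defences idea.
import random

LEVELS = {                                  # assumed per-level failure rates
    "organisational influences": 0.05,
    "unsafe supervision": 0.05,
    "preconditions for unsafe acts": 0.10,
    "unsafe acts of operators": 0.10,
}

def trial(rng: random.Random) -> bool:
    """One operation: an accident requires every defensive layer to fail."""
    return all(rng.random() < p for p in LEVELS.values())

rng = random.Random(42)
n = 1_000_000
accidents = sum(trial(rng) for _ in range(n))
print(f"accident rate ~ {accidents / n:.2e}")  # far rarer than any single failure
```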
Error Investigation
Error investigations can have many objectives and purposes, depending on the investigator's perspective. Senders and Moray (1991 [16]) acknowledge that investigations can be conducted for a variety of purposes: “what is deemed to be the cause of an accident or error … depends on the purpose of the inquiry” (Senders & Moray, 1991, p. 106 [16]). Rasmussen, Pejtersen and Goodstein (1994 [14]) likewise contend that investigators examine system events from a variety of viewpoints.
The objective of an error investigation should be to mitigate future opportunities for error by identifying the critical errors and their antecedents, and eliminating them or reducing their influence in the system (Strauch, 2004, p. 25 [17]).
Error Prediction
If a system is used for a specific task by a number of different people, a variety of errors will occur. In principle, if we knew all the antecedent conditions of error, we could predict the timing and form of errors through theoretical or statistical analysis of the appropriate human learning curves. In practice, however, because there are so many possible causes of error, and the relationships between causes are complicated, we cannot predict timing with any precision (Senders & Moray, 1991 [16]). The more practical approach is to estimate the probability of human error for a wide range of actions, referred to as error rates. Although error rates can be estimated in principle, so far nobody is really sure how reliable these estimates are.
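The following sketch shows, under assumed observation counts, how an error rate might be estimated from observed performance and why such estimates are uncertain: when errors are rare, the confidence interval is wide relative to the estimate itself.

```python
# Minimal sketch of estimating a human error rate from observed performance,
# with a normal-approximation 95% confidence interval. The counts below are
# made-up illustration data, not figures from the source.
from math import sqrt

errors_observed = 12      # erroneous executions seen
opportunities = 10_000    # total task executions observed

p_hat = errors_observed / opportunities
se = sqrt(p_hat * (1 - p_hat) / opportunities)
low, high = p_hat - 1.96 * se, p_hat + 1.96 * se

print(f"estimated error rate: {p_hat:.4f}")
print(f"95% CI: [{max(low, 0):.4f}, {high:.4f}]")
# With rare errors the interval is wide relative to the estimate, which is
# one reason the reliability of predicted error rates is hard to establish.
```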
Error Management
“To err is human,” as the anonymous Latin saying goes. Everyone makes an error from time to time, and the complete elimination of human error is as impossible as the complete elimination of machine failure. What the best organizations can do is manage error effectively, decreasing the probability of errors and minimizing their consequences (Helmreich & Merritt, 1998 [8]).
Pérezgonzález (2007, pp.76-77 [12]) suggests that error-free behavior or systems are non-creative: to err is also an opportunity to learn and to redesign systems or procedures for better efficiency and safety. Managing error therefore means using all available data to understand the causes of errors and taking appropriate actions, including changing and improving policies, procedures and training programs, to reduce errors and to minimize the consequences of those that do occur.
Aviation accident investigators are taught to look at an incident or accident from multiple perspectives rather than starting from a fixed framework. A framework here refers to a conceptual structure of ideas which investigators can use as a tool to begin an investigation (Merriam-Webster Online Dictionary, 2008 [23]). However, frameworks can limit the evidence that is considered important, since they only accept inputs associated with the framework itself.
Perspectives
Today, many human error models and taxonomies exist for constructing a likely scenario of an accident.
The book A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System (HFACS) by Wiegmann and Shappell (2003) presents the many human error models investigators use today, organised into six different categories.
Cognitive Perspective
The cognitive perspective is said to be the most popular framework among investigators and analysts (hereafter simply “investigators”, since they are the ones doing the analysis). The Merriam-Webster Online Dictionary (2008) [23] defines “cognitive” as “of, relating to, being, or involving conscious intellectual activity (as thinking, reasoning, or remembering)”. This approach is structured around the assumption that the pilot's mind can be conceptualised as an information processing system: once a stimulus from the environment makes contact with the human senses, it prompts a series of mental operations culminating in a response.

[Figure: Model of information processing, from stimulus to response] (Wiegmann & Shappell, 2003 [23], adapted from Wickens' model, 1988 [21]).
The model depicts how information is processed through different stages, from stimulus to response. The environment provides stimuli to the five human senses; examples are sound waves or patterns of light and colour, which are converted into neural impulses and stored temporarily in the short-term sensory store. If adequate attention is given to a stimulus, information from the short-term sensory store is recalled and compared with the long-term memory store, which holds previously encountered patterns, to create a mental representation, or framework, of the current situation.
Individuals then decide, on the basis of the information they hold, whether to execute a response or to delay action until it is needed.
Here is an example of how the model works. Imagine something significant occurring, such as the smoke detector in the kitchen going off. A specific action, or several, is needed to avoid further ‘disaster’. Information passes through to the response execution stage, where suitable motor programmes are selected, enabling the individual to switch off the alarm as a first move. The process does not end there: a feedback loop monitors whether the action that stopped the alarm has fully solved the problem that triggered it in the first place, and actions are modified until the situation is resolved.
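The smoke detector scenario can be sketched as a loop over the model's stages, with perception re-checked after each response. The stage functions and the world state below are invented simplifications; this is only the control flow the model implies, not Wickens' model itself.

```python
# Toy sketch of the stimulus-to-response pipeline with a feedback loop.

def perceive(world: dict) -> str:
    """Stimulus reaches the senses and is interpreted."""
    return "alarm sounding" if world["alarm_on"] else "quiet"

def decide(percept: str) -> str:
    """Compare the percept with patterns held in long-term memory."""
    known_responses = {"alarm sounding": "silence alarm", "quiet": "do nothing"}
    return known_responses[percept]

def execute(action: str, world: dict) -> None:
    """Response execution: select and run a motor programme."""
    if action == "silence alarm":
        world["alarm_on"] = False  # note: this may not remove the smoke source

world = {"alarm_on": True, "smoke": True}
# Feedback loop: keep perceiving, deciding and acting until perception
# confirms the situation is resolved.
while perceive(world) != "quiet":
    execute(decide(perceive(world)), world)

print(world)  # alarm silenced, but 'smoke' persists and could re-trigger it
```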
Short-Term Memory
This term refers to information which is stored for a short time and then forgotten. The amount of information we can hold at any one time is limited, and it is usually retained for only a few seconds. A familiar example is reading a telephone number, forgetting it the next moment, and having to read it again before dialling.
The method of recall, and the type of sensory stimulus involved, affect the length of time that information is preserved. For example, iconic memory makes use of images and usually holds them for about 0.5-1.0 seconds (Campbell & Bagshaw, 2002 [6]). Echoic memory is formed from sounds associated with an object or action to be named and retains information for 2-8 seconds (Campbell & Bagshaw, 2002 [6]). Working short-term memory can preserve information for 10-20 seconds.
As a general rule of thumb, short-term memory's maximum capacity is about seven items. It is referred to as the “magic number seven” (plus or minus two) because short-term memory can hold only around seven chunks of information. The rule, however, says very little about how large or small a chunk must be for it to apply.
To a young child being taught numbers for the first time, the digits 1 2 3 4 5 6 7 are seven separate chunks, each remembered separately. To adults well versed in numbers, 1-7 would probably be a single chunk.
More recent studies suggest, however, that capacity depends not only on the number of chunks but on how long the items take to say aloud (their acoustic length). It appears that such information can be held in working memory if it can be spoken within 1.5-2.0 seconds, which puts slow speakers at a disadvantage (About Memory, 2003 [1]).
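A small example (not from the source) shows why chunking matters for the 7±2 rule: the same digit string exceeds the span as individual digits but fits comfortably once grouped into familiar chunks. The phone number is hypothetical.

```python
# Simple illustration of chunking: identical content, very different chunk counts.
digits = "02079460958"  # a hypothetical phone number

single_digit_chunks = list(digits)                       # 11 chunks
grouped_chunks = [digits[:3], digits[3:7], digits[7:]]   # 3 chunks

print(len(single_digit_chunks), "chunks:", single_digit_chunks)
print(len(grouped_chunks), "chunks:", grouped_chunks)
# 11 items strain the 7±2 span of short-term memory; 3 familiar groupings do not.
```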
Ergonomic Perspective
The ergonomic perspective is also known as the “systems perspective”. This approach suggests that humans are rarely, if ever, the sole cause of an accident; rather, accidents arise from a complicated relationship among several factors, including interactions between individuals, their machines and their general work environment.
The systems perspective is related to the SHEL model proposed by Edwards (1988), which illustrates the four basic elements necessary for successful man-machine integration and system design.

[Figure: The SHEL model] (Wiegmann & Shappell, 2003 [23], adapted from the SHEL model [24]).
S - Software
H - Hardware
E - Environment
L - Liveware
Software does not refer to computer software as commonly understood; it represents the rules and regulations that govern how a system operates (Wiegmann & Shappell, 2003, p.26).
Hardware refers to equipment, materials and physical assets.
Environment refers to the working conditions in which humans operate.
Liveware represents the humans themselves.
These four components interact with one another and rarely stand alone. In aviation, emphasis is placed largely on the human-machine interface, the hardware-liveware interaction, as evidenced by the many developments in cockpit displays, workload-reducing technologies and communication systems. Because the interaction of human, machine and environment is so important, aircraft development today incorporates human factors principles into the design process.
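The liveware-centred structure of the SHEL model can be sketched as data: liveware paired with each component, including other liveware (for example, crew-to-crew interaction). The descriptions attached to each component paraphrase the definitions above; representing interfaces as pairs is an illustrative choice, not part of the model's formal specification.

```python
# Hedged sketch of the SHEL components and the liveware-centred interfaces.
from enum import Enum

class SHEL(Enum):
    SOFTWARE = "rules and regulations"
    HARDWARE = "equipment and physical assets"
    ENVIRONMENT = "working conditions"
    LIVEWARE = "the human"

# The model is usually drawn with liveware at the centre, interfacing with
# every other component, including other liveware.
interfaces = [(SHEL.LIVEWARE, other) for other in SHEL]

for left, right in interfaces:
    print(f"{left.name}-{right.name}: {left.value} <-> {right.value}")
# LIVEWARE-HARDWARE is the human-machine interface stressed in aviation design.
```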
Psychosocial Perspective
This perspective takes a more humanistic approach than the other frameworks described here. Investigators who adopt this view see flight operations as a social endeavour involving interfaces among numerous individuals, including pilots, air traffic controllers, dispatchers, ground crew, maintenance personnel and flight attendants. It is remarkable that groups of people whose work is so unequal and, at first glance, seemingly irrelevant to one another work so closely together to ensure a high level of safety and productivity in the aviation industry. Even a private pilot is rarely entirely alone, as air traffic controllers are just the push of a button away.
These interfaces are the core of the psychosocial perspective. Emphasis is usually given to the pilot's role, on the belief that pilot performance is directly influenced by the nature and quality of communication among group members. These communications are in turn influenced by the behaviours, attitudes and personalities of the individuals within the group, as well as by the environment in which they work.
Given the diversity an individual has to cope with among co-workers every day, the accuracy and level of safety the industry enjoys today is remarkable. According to this approach, it is only when the balance of group dynamics and interpersonal interaction breaks down that errors and accidents happen.
In the past, psychosocial models were ignored by those in the aviation industry because they lacked the technical elements the industry is built around; the argument was that psychosocial approaches apply to any industry but aviation. Only in the past decade have investigators and analysts begun to study the interpersonal aspects of human performance when investigating errors.
One such industry-wide analysis of accidents found that over 70 percent of all accidents resulted from crew coordination and communication problems (Wiegmann & Shappell, 2003, cited from Lautman & Gallimore, 1987 [23]). This figure has become entrenched in the industry, with some studies putting the percentage as high as ninety.
Such findings are not confined to commercial aviation: many accident reports confirm the same pattern in military aviation, with aircrew coordination failures as the main cause (Wiegmann & Shappell, 2003, cited from Yacavone, 1993 [23]). The conclusions of these and other studies have prompted many engineering psychologists to improve the human-machine interface in design as the complexity of human interpersonal relationships increases. New findings constantly emerge, and new ideas must replace old ones if continuous improvement and a high level of safety are to be achieved.
As more incidents and accidents were traced to simple communication failures, more defences were aimed at cockpit communication, typically in the form of crew resource management (CRM) training. CRM's usefulness was widely recognised after United Airlines Flight 232, a DC-10, crash-landed at Sioux City in 1989 (Sioux City Crash [2]). Death seemed practically guaranteed for everyone on board, but Captain Al Haynes and his crew showed tremendous resourcefulness and crew management, saving over half of the lives on board. By contrast, CRM was not yet part of pilot training at the time of the Everglades crash of 1972, and that accident is a good example of the absence of CRM (Everglades Crash [4]).
CRM involves educating and training aircrew in good communication practices that enable individuals to interact, divide tasks and resolve conflicts more effectively when working in a high-workload environment. One issue CRM training addresses is the power-distance relationship between pilot and co-pilot: the challenge is to change captains' attitudes about their authority so that co-pilots and other lower-ranking personnel can challenge decisions when absolutely necessary.
Unlike today, this perspective was once disregarded even as a framework, perhaps because of the attention given to personality variables, which focus on individuals rather than on the crew coordination and communication of an entire system. Early models included the notion of individuals who are simply careless by nature and prone to accidents: some individuals were thought to be predisposed to committing errors and causing accidents. Today, the idea that accidents are unavoidable among certain individuals is not easy for theorists to accept, making such views unpopular.
Organisational Perspective
The organisational perspective has long been utilised not only in aviation but in many other industries. However, this approach has only recently been accepted in the aviation community, because in the early days attention was placed only on the aircraft and the individuals flying them. This change of viewpoint reflects a realisation of how complex aviation systems are and how that complexity plays a part in accidents and incidents. In fact, it is the attention this model gives to the fallibility of decision makers, supervisors and others in the organisational hierarchy that differentiates this perspective from the others.
Perhaps the most recognisable organisational theory is the model of human error called the “Domino Theory”, described by Bird in 1974. According to Bird's theory, an accident is the natural culmination of a series of events or circumstances which invariably occur in a fixed and logical order (Wiegmann & Shappell, 2003, p.38, cited in Heinrich et al., 1980, p.23 [23]). Much like a line of dominoes, the chain of human error starts with management's failure to control losses within the organisation, since managers are the ones responsible for identifying and assigning tasks, establishing standards, measuring performance and correcting deviations where necessary to ensure a smooth transition from inputs to outputs. Failure at any of these levels allows job-related factors, such as abnormal use of resources or inadequate skills and training, to appear. These are repeatedly referred to as root causes, and they give rise to the immediate causes that safety programmes have traditionally addressed. Immediate causes are the unsafe acts committed by employees or operators, such as misusing equipment, using tools without authorisation, violating procedures and operating unsafely. It is these immediate causes that lead to injuries and accidents.
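Bird's fixed-order cascade can be sketched as a list of dominoes that topple in sequence, where intervening at any earlier level (for instance, management correcting a deviation) prevents the levels downstream from falling. The level names paraphrase the paragraph above; the propagation rule is an illustrative assumption.

```python
# Toy sketch of the domino sequence: each failure enables the next, and
# restoring control at any earlier level breaks the chain.
DOMINOES = [
    "lack of management control",
    "basic causes (job and personal factors)",
    "immediate causes (unsafe acts and conditions)",
    "incident/accident",
    "injury or loss",
]

def run_chain(intervene_at: int | None = None) -> list[str]:
    """Topple dominoes in order; an intervention stops the cascade."""
    fallen = []
    for i, domino in enumerate(DOMINOES):
        if intervene_at is not None and i == intervene_at:
            break  # e.g. management corrects the deviation at this level
        fallen.append(domino)
    return fallen

print(run_chain())                 # full cascade ends in injury or loss
print(run_chain(intervene_at=2))   # removing a middle domino prevents the accident
```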
Supporting evidence
Refuting evidence
See also
Way forward (to do list)
Knowledge Management Space
Wiki of Science Team (contributors to this page)
Authors / Editors
Huili_LI
ZiZhanG NG
Esng
JDPerezgonzalez