Theoretical frame

Automation (n.) refers to "automatic operation", meaning operation with limited or nonexistent input by humans. Thus, it can apply to machinery, but also to techniques, processes, etc. In the context of the following discussion, however, the most appropriate meaning is that of "2. a mechanical device, operated electronically, that functions automatically, without continuous input from an operator" (Dictionary.com Unabridged, 2006 [3]).

Billings (1997 [1]) defines automation as the use of machines to perform tasks previously done by humans. Moray, Inagaki and Itoh (2000 [5]) define automation more specifically as "any sensing, detection, information-processing, decision making or control action that could be performed by humans but is actually performed by machines" (p.44).

Automated subsystems now perform an ever-larger share of the manual and cognitive tasks that operators previously performed themselves, ranging from minimal to complete system control. Parasuraman (2000 [9]), and Parasuraman, Sheridan and Wickens (2000 [10]), describe up to ten levels of automated control that designers can incorporate into a system. At the lowest level the operator performs all tasks; at the highest level the machine makes all decisions and takes all actions independently of, and without communicating with, the operator.
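As an illustration, the ten-level taxonomy can be sketched as a simple lookup table. The descriptions below paraphrase the levels proposed by Parasuraman, Sheridan and Wickens (2000 [10]); the function name is ours, not part of the original model:

```python
# Paraphrased sketch of the ten levels of automation described by
# Parasuraman, Sheridan and Wickens (2000). Level 1 leaves everything
# to the human; level 10 leaves everything to the machine.
LEVELS_OF_AUTOMATION = {
    1: "The computer offers no assistance; the human must do everything",
    2: "The computer offers a complete set of decision/action alternatives",
    3: "The computer narrows the selection down to a few alternatives",
    4: "The computer suggests one alternative",
    5: "The computer executes that suggestion if the human approves",
    6: "The computer allows the human a restricted time to veto before acting",
    7: "The computer executes automatically, then necessarily informs the human",
    8: "The computer informs the human after executing only if asked",
    9: "The computer informs the human after executing only if it decides to",
    10: "The computer decides everything and acts autonomously, ignoring the human",
}

def describe_level(level: int) -> str:
    """Return the paraphrased description of a given automation level (1-10)."""
    if level not in LEVELS_OF_AUTOMATION:
        raise ValueError("level must be between 1 and 10")
    return LEVELS_OF_AUTOMATION[level]
```

Designers would choose a level per function (information acquisition, analysis, decision, action) rather than one level for the whole system.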

Moray et al. (2000 [5]) believe that the optimal level of automation depends on such elements as the complexity of the system, the risk of a fault and the dynamics of an event. To avoid unnecessary and quite costly system shutdowns in situations that are not time critical, they suggest that operators, not automation, retain ultimate control of the system.
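This allocation suggestion can be caricatured in a few lines of code. The sketch below is a hypothetical illustration of the principle, not Moray et al.'s actual model; the function name and boolean parameter are ours:

```python
# Hypothetical illustration of Moray et al.'s (2000) suggestion:
# the operator keeps ultimate authority unless the fault is time
# critical, avoiding unnecessary and costly automatic shutdowns.
def final_authority(time_critical: bool) -> str:
    """Return who should hold the final shutdown decision."""
    if time_critical:
        # A time-critical fault may unfold faster than a human can
        # diagnose and respond, so automation may need to act.
        return "automation"
    # Otherwise the human operator retains ultimate control.
    return "operator"
```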

Automation advantages and disadvantages

The increased role of automation in systems has enhanced many aspects of system operations, but it has also introduced new antecedents to error that have contributed to incidents and accidents.

Some operators have demonstrated greater trust in the automation's ability to control the system than in their own abilities. This may lead to unquestioning acceptance of automation guidance, and to overlooking problems that the automation has failed to detect.

Advantages and Benefits

Wiener and Curry (1980 [14]) and Wiener (1989 [15]) examined the effects of automation in the aviation environment, and believed that these resulted from a combination of technological, economic and safety factors, not all of which had been realized. They also suggest that, because automation can perform many cognitively demanding tasks faster and more accurately than operators can, assigning such tasks to machines could reduce operator workload, enabling operators to attend to 'higher-level' activities such as monitoring.

Disadvantages and shortcomings

Automation has also brought about disadvantages and shortcomings that have adversely affected operator performance and increased opportunities for error. These stem from several factors:

  • User interface

Some automation applications have altered the sources of data that operators had depended upon for system performance feedback, thus reducing operators' awareness of system states (Norman, 1991 [8]; Billings, 1997 [1]).

  • Opacity

In the event of a system anomaly, operators' unawareness of the reasons for the automation's actions, or their inability to predict its next actions, degrades their ability to diagnose and respond appropriately (Woods, Johannesen, Cook & Sarter, 1994 [16]).

Furthermore, automation opacity makes operators reluctant to intervene should they become uncertain of the automation outcomes to expect (Sarter & Woods, 2000 [13]).

  • Monitoring, vigilance and situation awareness

Automation has helped to distance the operator from many system-related cues. Norman (1981 [6], 1988 [7]) believes that in automated systems operators may no longer directly observe the system. Instead they monitor the data that automated sensors detect and display, which may or may not effectively convey the needed information.

  • Workload redistribution

Automation has generally reduced operator workload, but it has often done so during already low-workload operating phases, while increasing it during already high-workload phases (Wiener, 1989 [15]). Wiener describes this redistribution of workload as 'clumsy automation', a phenomenon that increases rather than decreases opportunities for operator error.

  • Trust, bias and skill degradation

Automation often performs so well that operators' interactions with it change. As system automation increases, so does the number of tasks that are performed more accurately and reliably than operators could perform them. This has increased operator trust in the automation's ability to perform those tasks. Yet, as trust grows, operators' confidence in their own abilities to perform the same tasks may decrease (Lee & Moray, 1992 [4]).
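The interplay between trust and self-confidence can be sketched as a toy predictor. This is an illustration loosely based on Lee and Moray's (1992) finding that operators tend to rely on automation when their trust in it exceeds their confidence in their own manual control; the function and its 0-1 scales are our own simplification, not their published model:

```python
# Toy sketch (our simplification, not Lee & Moray's actual model):
# operators tend to choose automatic control when trust in the
# automation exceeds confidence in their own manual performance.
def likely_control_mode(trust_in_automation: float,
                        self_confidence: float) -> str:
    """Predict the control mode an operator is likely to adopt.

    Both inputs are assumed to lie on an arbitrary 0-1 scale.
    """
    if trust_in_automation > self_confidence:
        return "automatic"
    return "manual"
```

The hazard described in the text is the feedback loop this sketch implies: reliance on automation erodes manual skill, which lowers self-confidence, which in turn increases reliance further.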

  • Team performance

Researchers have suggested that automation can be considered a member of a multi-operator team, altering the role of the team members. Paris, Salas and Cannon-Bowers (2000 [11]) contend that automation can replace all or part of a team's functions, leading to restructured and redefined team member roles. Woods (1996 [17]) also notes, "introducing automated and intelligent agents into a larger system in effect changes the team composition. It changes how human supervisors coordinate their activities with those of the machine agents" (p.4).

Supporting evidence

Analysis of Aviation Automation

Automation in the aviation industry has been a key factor in reducing the workload associated with the manual tasks of flying and with pilots' cognitive processing. Automated cockpit systems are highly accurate and reliable, allowing flight crews to maintain a high level of situation awareness and thus increasing safety (Campbell & Bagshaw, 1999 [2]). Automation in aviation started before World War II, when autopilot systems were put in place to meet operational demands and to reduce pilot workload. With modern advances in technology, Global Positioning Systems, Primary Flight Displays and advanced cockpit autopilot systems have steadily increased flight safety while reducing pilot workload, which in turn increases situation awareness and reduces human-factor issues.

Cockpit automation shifts the conduct of a safe flight onto the interplay between pilots and computers. With automation, boredom can set in and arousal levels decrease significantly during the cruise phase, which can impair performance and reduce job satisfaction. Decreases in situation awareness are often observed in pilots during cruise, as complex automation allows the aircraft to fly independently under the Flight Management Computer (FMC) (Wiener & Curry, 1980 [14]). Pilot training is therefore critical, so that pilots learn about the automation and adjust their workload, or maintain a level of intellectual activity, in order to stay 'in the loop' of the automated systems on board an aircraft (Campbell & Bagshaw, 1999 [2]).

Automation has sometimes been treated as an end in itself rather than as a tool to enhance aircraft operations. Excessive reliance can result in automation complacency, whereby cross-checking of systems and situational monitoring of the FMC are greatly reduced because of a belief in the 'infallibility of cockpit automation' (Campbell & Bagshaw, 1999 [2]). Passive monitoring and attentional narrowing (pilots attending only once a visual or verbal alarm activates), encouraged by the multifunctional capability of the automated cockpit, can lead to the breakdown of the entire system; confusion follows, or time-critical decisions are omitted, resulting in fatal air accidents.

Anecdotal evidence suggests that pilot students and other GA pilots may bring automatic equipment into the cockpit in order to supply information they would like to have while flying but that the current cockpit configuration does not provide. Most of this equipment takes the form of a laptop computer or, more recently, a mobile phone. Exploring this need for automation, Perezgonzalez and Lee (2009 [12]) carried out an exploratory study on ab-initio pilot students and found that students value highly, in the following order, automatic features that display navigation charts while flying, increase airspace awareness while flying, have low running costs, display the flown track for post-flight analysis, help with pre-flight route planning, provide TCAS functionality, and are portable. Among the technologies available nowadays (e.g. computer-based flight management systems, mobile phones and tracking devices), most students would purchase an integrated flight management system if they could afford to do so.

References

1. BILLINGS Charles E (1997). Aviation automation: the search for a human-centered approach. Lawrence Erlbaum Associates (New Jersey, USA), 1997. ISBN 9780805821260.
2. CAMPBELL RD & M BAGSHAW (1999). Human performance and limitations in aviation (2nd ed). Blackwell Science (London, UK), 1999. ISBN 0-632-04986-3.
3. DICTIONARY.COM UNABRIDGED (2006). Based on the Random House Unabridged Dictionary. Retrieved on 28 September 2008.
4. LEE J & N MORAY (1992). Trust, control strategies and allocation of function in human-machine system. Ergonomics, 1992, vol.35, pp.1243-1270. ISSN 0014-0139.
5. MORAY N, T INAGAKI & M ITOH (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology: Applied, 2000, vol.6, pp. 44-58. ISSN 1076-898X.
6. NORMAN Donald A (1981). Categorization of action slips. Psychological Review, 1981, vol.88, pp.1-15. ISSN 0033-295X.
7. NORMAN Donald A (1988). The psychology of everyday things. Basic Books (New York, USA), 1988. ISBN 0465067093.
8. NORMAN Donald A (1991). Cognitive artifacts. In LM CARROLL [ed] (1991). Designing interaction: psychology at the human-computer interface. Cambridge University Press (New York, USA), 1991, pp.17-38. ISBN 0521409217.
9. PARASURAMAN Raja (2000). Designing automation for human use: empirical studies and quantitative models. Ergonomics, 2000, vol.43, pp.931-951. ISSN 0014-0139.
10. PARASURAMAN Raja, TB SHERIDAN & CD WICKENS (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man & Cybernetics, 2000, vol.30, pp.286-297. ISSN 0018-9472.
11. PARIS CR, E SALAS & JA CANNON-BOWERS (2000). Teamwork in multi-person systems: a review and analysis. Ergonomics, 2000, vol.43, pp.1052-1075. ISSN 0014-0139.
12. PEREZGONZALEZ Jose D & Seung Yong LEE (2009). New technologies for the student pilot. Aviation Education and Research Conference Proceedings, Blenheim (New Zealand), 2009 (July), pp.10-11. ISSN 1176-0729.
13. SARTER Nadine B & David D WOODS (2000). Team play with a powerful and independent agent: a full-mission simulation study. Human Factors, 2000, vol.39, pp.553-569. ISSN 0018-7208.
14. WIENER EL & RE CURRY (1980). Flight-deck automation: promises and problems. NASA Technical Memorandum 81206. NASA-Ames Research Center (California, USA), 1980.
15. WIENER EL (1989). Human factors of advanced technology (‘glass cockpit’) transport aircraft. NASA Technical Report 117528. NASA-Ames Research Center (California, USA), 1989.
16. WOODS David D, Leila J JOHANNESEN, Richard I COOK & Nadine B SARTER (1994). Behind human error: cognitive systems, computers, and hindsight. Wright-Patterson Air Force Base: Crew Systems Ergonomics Information Analysis Center (Ohio, USA), 1994.
17. WOODS David D (1996). Decomposing automation: apparent simplicity, real complexity. In Raja PARASURAMAN and M MOULOUA [ed] (1996). Automation and human performance: theory and applications. Lawrence Erlbaum Associates (New Jersey, USA), 1996, pp.3-17.

Wiki of Science Team (contributors to this page)

Authors / Editors

ZiZhanG NG

Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License