We live in an increasingly technological world. Automated systems can certainly make life easier, but they can also create complexity and uncertainty. Moreover, automation does not merely supplant human activity; it also transforms the nature of human work. This review examines an original account of this transformation: a link between automation technology and the sense that our actions cause effects on the outside world (the so-called sense of 'agency'). Accordingly, we first discuss the human factors issues related to automation technology, introducing in particular the out-of-the-loop performance problem. We then present recent findings about agency and propose that several recently developed psychological approaches to the self promise to enhance our understanding of the transformation induced by increased automation. Next, we address the controversial issue of measuring agency, particularly the necessary dissociation between explicit and implicit measures; we introduce the intentional binding effect as an implicit measure of agency and discuss the problems involved in generalizing this effect to more complex situations. Finally, we suggest that investigating authorship processing in the field of human-machine interaction may be fruitful, both for elaborating concrete design recommendations and for evaluating the potential of a human-machine interface (HMI) to satisfy the agency mechanism.

Automation and human control in complex systems

"The burning question of the near future will not be how much work a man can do safely, but how little." [85]

There is perhaps no facet of modern society in which the influence of automation technology has not been felt. Whether at work or at home, while travelling or while engaged in leisure pursuits, human beings are becoming increasingly accustomed to using and interacting with sophisticated computer systems designed to assist them in their activities. Even more radical changes are anticipated in the future, as computers increase in power, speed and "intelligence".

Attention has usually focused on the perceived benefits of new automated or computerized devices. This is perhaps not surprising, given the sophistication and ingenuity of design of many such systems (e.g., the automatic landing of a jumbo jet, or the docking of two spacecraft). The economic benefits that automation provides, or is perceived to offer, also tend to focus public attention on its technical capabilities. However, our fascination with the possibilities afforded by technology often obscures the fact that new computerized and automated devices also create new burdens and complexities for the individuals and teams of practitioners responsible for operating, troubleshooting and managing high-consequence systems. Whatever the merits of any particular automation technology, it is clear that automation does not merely supplant human activity but also transforms the nature of human work: the role of human operators may evolve from direct control to supervision. Understanding the characteristics of this transformation is vital for the successful design of new automated systems.

Automation and the OOL performance problem

When new automation is introduced into a system, or when the autonomy of automated systems increases, developers often assume that adding "automation" is a simple substitution of machine activity for human activity (the substitution myth, see [92]).
Empirical data on the relationship between people and technology suggest that this is not the case: traditional automation carries many negative performance and safety consequences stemming from the human out-of-the-loop (OOL) performance problem (see [22], [50]). Classically, the OOL performance problem leaves operators of automated systems handicapped in their ability to take over manual operations in the event of automation failure [22]. The OOL performance problem has been attributed to a number of underlying factors, including human vigilance decrements (see [7], [86]),