The Frankenstein Syndrome
Figure posted on 15.11.2017, 12:00 by Megan Field
Digital image presented at the 2017 Defence and Security Doctoral Symposium.
Automation of technology and systems across domains such as defence, nuclear, transportation and healthcare is forecast to increase dramatically in the coming decades, and rising levels of automation (LOA) are set to change the role of operators.
However, the shift from working directly with and within a system to a role characterised by supervision and (sometimes remote) surveillance brings a range of human-centred issues and limitations. These issues are not solely about how the operator copes with huge amounts of real-time data and information; they also concern how individuals react and behave towards computerised teammates. This is especially critical in military environments, such as static and mobile Command and Control (C2) centres. These facilities must accurately and appropriately analyse, fuse and display considerable amounts of C3I (Command, Control, Communications and Intelligence) material. The ability to trust (or mistrust) a system is, therefore, vital for human safety and mission success.
Nonetheless, human actions and behaviours are not formed in a ‘cognitive vacuum’ – they are influenced by the context of tasks, environments, prior experiences and memories. Trust formation with technology and automation is shaped by many antecedents, in a process similar to that by which humans endow others with levels of trust and confidence. These include prior knowledge, experiences with similar technology (or people), and the ways in which expectations, lack of transparency and failures can lead to mistrust.
This research seeks to explore the behaviours and attitudes of human operators, and how military culture shapes operator heuristics and naturalistic decision making. The qualitative inquiry will also probe whether these circumstances foster maladaptive behaviours that differ or deviate from those of civilian and defence personnel.