Salmon, P. M., McLean, S., Carden, T., King, B., Thompson, J., Baber, C., Stanton, N., & Read, G. J. M. (2022). It’s risk, Jim, but not as we know it: identifying the risks associated with future Artificial General Intelligence-based Unmanned Combat Aerial Vehicle systems. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 66(1), 560-564.
Abstract
The next generation of artificial intelligence, known as Artificial General Intelligence (AGI), could either revolutionise or destroy humanity. Human Factors and Ergonomics (HFE) has a critical role to play in the design of safe and ethical AGI; however, there is little evidence that HFE is contributing to AGI development programs. This paper presents the findings of a study that applied the Work Domain Analysis-Broken Nodes approach to identify the risks that could emerge in a future ‘envisioned world’ AGI-based unmanned combat aerial vehicle system. The findings demonstrate various potential risks, the most critical of which arise not from poor performance, but when the AGI attempts to achieve goals at the expense of other system values, or when the AGI becomes ‘super-intelligent’ and humans can no longer manage it. The urgent need for further work exploring the design of AGI controls is emphasised.
This paper is not available open access; please email the lead author, Paul Salmon (psalmon@usc.edu.au), to obtain a copy.