The role of Generative AI Test Ranges in protecting online services from the impact of malicious synthetic content
Christopher Leckie
University of Melbourne, Parkville, Victoria, Australia
Abstract
Generative Artificial Intelligence (AI) has made remarkable progress in recent years in its ability to automate the production of different types of content, such as long-format text, images, speech, music and video. While “off-the-shelf” generative AI tools are now widely available to assist in content generation for workplace and entertainment applications, their ready availability raises the question of how criminal and other hostile actors might use such tools for malign purposes.
Organisations such as Europol have already identified several scenarios in which generative AI could be used to automate content generation for activities such as fraud, disinformation campaigns and cyber attacks. Countering these malicious activities requires a new class of cyber defences that can detect synthetic content and identify its intent in specific application contexts.
To achieve this aim, we have been developing a Generative AI Test Range. Building on our experience with cyber security test ranges, it provides a closed environment in which online services, such as social media channels, can be emulated and subjected to attacks using malicious synthetic content. New defences can then be implemented and tested within the test range to assess their effectiveness against potential attacks.
Through the development of this generative AI test range, we can anticipate likely online threats of the future. This can inform the design of future services and prepare effective defences to mitigate the effects of malign content in a proactive rather than reactive manner.
Biography
Chris Leckie is a Professor with the School of Computing and Information Systems at the University of Melbourne, and is the Director of the University of Melbourne Academic Centre of Cyber Security Excellence (one of two such Commonwealth-funded centres in Australia). He has over three decades of research experience in artificial intelligence and machine learning, having led research teams at Telstra Research Laboratories, National ICT Australia (NICTA) and the University of Melbourne. His research on using machine learning for anomaly detection, fault diagnosis, cyber security and the life sciences has led to a range of operational systems used in industry, as well as over 300 articles published in leading international conferences and journals.