“Would you like to chat?” The Ethics of AI in Higher Education

Dr Craig Bellamy1, Mr Mohsin Murtaza1, Mr Ather Saeed1

1Charles Sturt University, Fitzroy, Australia


After many false dawns, AI may finally be gaining traction. Chatbots, natural language processing, robots, autonomous vehicles, and the combination of big data and AI are all finding applications in a myriad of commercial and other contexts. AI was once about explicit commands: what you put in was what you got out. Now it is about machine learning from big data, about machines that not only learn but can also make decisions.

This ability to make decisions poses numerous thorny ethical dilemmas: can an autonomous vehicle avoid collisions ‘ethically’? Can a chatbot impersonate a human for nefarious purposes? Can an autonomous military drone decipher images of illicit activity and then take action? These are not dystopian projections of a sci-fi future; rather, they are ethical issues that exist now within the province of AI and its applications.

Whilst ethicists have been quick to provide critique, debate, and numerous frameworks for an ethical AI future (indeed, the Australian Government has just proposed a “technology roadmap, a standards framework and a national AI Ethics Framework”, along with regulation in the space), higher education has been fairly quiet in debating the impacts of AI on teaching, research, and the broader higher education system. Indeed, while AI applications are not yet widely used in research, this could change quite rapidly, as has the use of ‘big data’ in research across both the digital humanities and the sciences.

Many ethical issues surround the foremost issue of IT ethics, privacy, but new issues also arise, particularly centred upon the interpretation of data using machine learning, transparency, and AI’s influence upon later research findings, their accreditation, and broader social influence. This is a particularly difficult issue because AI affords many benefits in terms of the researcher’s ability to deal with the scale and complexity of big data, along with the phenomena it records and represents. But there are things that machines are good at and things that people are good at, and this intersection of machines and people, including the ethics of interpretation and decision making, needs to be considered from the very emergence of AI in research and education.

This Birds of a Feather session proposes to discuss the ethics of AI, big data, and research, with the purpose of providing a rudimentary ethical framework for embryonic AI in research and teaching practice. This framework may be used as a stand-alone guide for researchers or ethics teachers, or as an addendum to existing research ethics, privacy, and data processing guidelines. During the BoF session, discussion materials, provocative examples, and talking points will be provided to draw on the experience of the audience and help develop the framework.


  • Bostrom, Nick, “Superintelligence: Paths, Dangers, Strategies”, Oxford University Press, 2014
  • Luckin, Rose, “Enhancing Learning and Teaching with Technology: What the Research Says”, Institute of Education Press (IOE Press), 2018
  • Pollit, Edward, “Budget 2018: National AI ethics framework on the way, Increased regulation signalled as part of $30m investment”, Australian Computer Society, https://ia.acs.org.au/article/2018/budget-2018–ai-boost-with-an-ethical-focus.html (Accessed 13 June 2018)
  • Seldon, Anthony, “The Fourth Education Revolution”, University of Buckingham Press, 2018


Dr Craig Bellamy is a Lecturer in IT and Ethics at Charles Sturt University’s Study Centre in Melbourne. He has a background in the Digital Humanities and has presented and published widely in the field.

Mohsin Murtaza is currently working at CSU’s Study Centre Melbourne as an Adjunct Lecturer and IT Course Coordinator. He has worked as a Lecturer at several Australian universities, including La Trobe, Central Queensland, and Federation University. He completed a Master of Telecommunication Engineering at La Trobe University and received the “Golden Key Award”.

Ather Saeed is a Course Coordinator for CSU (IT & Networking Programs) and is currently pursuing a PhD (thesis titled “Fault-tolerance in the Healthcare Wireless Sensor Networks”). He holds a Master of Information Technology and a Graduate Diploma (IT) from the University of Queensland, and a Master of Computer Science from the Canadian Institute of Graduate Studies. He has published several research papers in international journals.