Latham, Annabel (ORCID: https://orcid.org/0000-0002-8410-7950) and Goltz, Sean (2019) A Survey of the General Public's Views on the Ethics of using AI in Education. In: AIED 2019, Lecture Notes in Artificial Intelligence (Lecture Notes in Computer Science), vol. 11625. Springer Nature, Switzerland. ISBN 978-3-030-23203-0
Accepted Version
Available under License: In Copyright.
Abstract
Recent scandals arising from the use of algorithms for user profiling to further political and marketing gain have popularized the debate over the ethical and legal implications of using such 'artificial intelligence' in social media. The need for a legal framework to protect the general public's data is not new, yet it is not clear whether recent changes in data protection law in Europe, with the introduction of the GDPR, have highlighted the importance of privacy and led to a healthy concern from the general public over online user tracking and the use of their data. Like search engines, social media and online shopping platforms, intelligent tutoring systems aim to personalize learning and thus also rely on algorithms that automatically profile individual learner traits. A number of studies have been published on user perceptions of trust in robots and computer agents. Unsurprisingly, studies of AI in education have focused on efficacy, so the extent of learner awareness, and acceptance, of tracking and profiling algorithms remains unexplored. This paper discusses the ethical and legal considerations for, and presents a case study examining the general public's views of, AI in education. A survey was conducted of attendees at a national science festival event showcasing state-of-the-art AI technologies in education. Whilst most participants (77%) were worried about the use of their data, fewer than 8% of adults were 'not happy' being tracked in learning systems, as opposed to nearly two-thirds (63%) of the children surveyed.