News

EFI to fund PhD studentship for £3.2m project on Trustworthy Autonomous Systems

Researchers from EFI, from across the University and beyond, have come together to help ensure the trustworthiness of machines that act or make decisions on their own, from virtual assistants like Alexa to aircraft autopilots.

The project is part of the UKRI Trustworthy Autonomous Systems (TAS) programme, funded through the UKRI Strategic Priorities Fund and delivered by the Engineering and Physical Sciences Research Council (EPSRC). The TAS programme brings together research communities and key stakeholders to drive forward cross-disciplinary, fundamental research to ensure that autonomous systems are safe, reliable, resilient, ethical and trusted.

UKRI Trustworthy Autonomous Systems Programme

The TAS programme is a collaborative UK-based platform comprising Research Nodes and a Hub, united by the purpose of developing world-leading best practice for the design, regulation and operation of autonomous systems. The central aim of the programme is to ensure that autonomous systems are ‘socially beneficial’, protect people’s personal freedoms and safeguard physical and mental wellbeing.

TAS comprises seven distinct research strands, termed Nodes: trust, responsibility, resilience, security, functionality, verifiability, and governance and regulation. Each Node will receive just over £3 million in funding from UKRI to conduct its research.

Edinburgh’s Governance and Regulation Node

Led by Professor Subramanian Ramamoorthy from the University of Edinburgh’s School of Informatics and Edinburgh Centre for Robotics, a multidisciplinary consortium of researchers from six UK universities will develop a novel software engineering and governance methodology for the regulation of Trustworthy Autonomous Systems. By creating and testing new frameworks and tools for the certification, assurance and responsible design of TAS, the project will help us understand how such systems can be used safely, legally and ethically.

Professor Shannon Vallor, Director of EFI’s Centre for Technomoral Futures, is one of several Co-Investigators on the grant. The focus of Professor Vallor’s contribution is the development of a responsibility framework for integrating moral, legal, professional, organisational and developer responsibilities in the governance of autonomous systems.

The Edinburgh Futures Institute, the School of Philosophy, Psychology and Language Sciences and the School of Informatics will support a fully funded studentship linked to the EPSRC-UKRI project. Funded by the global investment firm Baillie Gifford, the PhD will begin in the academic year 2021/2022.

The studentship research will focus on the qualitative phase of the project, which elicits and maps the legal, ethical and computational factors that developers and regulators of autonomous systems must take into account. It will also help to construct a responsibility framework for autonomous systems that bridges the many different notions of moral, legal, professional, organisational and causal responsibility relevant to system governance.

The PhD student’s primary supervisor will be project Co-Investigator Shannon Vallor (Edinburgh Futures Institute and Department of Philosophy) with co-supervision from Principal Investigator Subramanian Ramamoorthy (School of Informatics). The selected candidate will also work closely with a Research Associate leading this project phase.

Details about this new EFI PhD opportunity and the application process can be found here: https://www.ed.ac.uk/ppls/philosophy/prospective/postgraduate/funding-research-students/baillie-gifford-phd-studentship-2021-2022

“The core motivation for this programme is that autonomous systems are becoming increasingly prevalent in society, including software decision-making systems in finance and medical diagnostics, conversational agents like Alexa, and physical systems ranging from home robots to self-driving cars and planes. Basically, Trustworthy Autonomous Systems covers the full range of the application of AI in real systems in our lives. Our specific project is on the governance and regulation of such systems.”

– Professor Subramanian Ramamoorthy

Principal Investigator of the TAS Governance and Regulation Research Node

A Collaborative and Multidisciplinary Project

The project is a collaboration with academics from across the University of Edinburgh as well as King’s College London, the University of Nottingham, Heriot-Watt University, University of Glasgow and the University of Sussex. The researchers will also work closely with a diverse group of external project partners, including Microsoft, BAE Systems, NASA Ames, Thales, NVidia, Optos, Digital Health & Care Institute (DHI), Legal and General’s Advanced Care Research Centre (ACRC), Civil Aviation Authority (CAA) Innovation Hub, NPL, DSTL, DRisQ, Imandra, Adelard Partners, Altran UK, MSC Software, Ethical Intelligence, Craft Prospect Ltd, Vector Four Ltd and SICSA.

The Node takes a multidisciplinary approach to its work, bringing together researchers with backgrounds in Computer Science and AI, Law, AI ethics, Social Studies of Information Technology and Design Ethnography. The diverse team offers a holistic perspective that combines technical, social science and humanities research to help ensure that autonomous systems can be trusted and integrated into society with confidence.

Related Links

Trustworthy Autonomous Systems Hub

UKRI Engineering and Physical Sciences Research Council (EPSRC)

Subramanian Ramamoorthy

Details on the project and funded studentship
