KING'S COLLEGE LONDON

Research Associate and Research Fellow

Location
London (Central), London (Greater) (GB)
Salary
Grade 6, £38,304 to £45,026, including LWA, or Grade 7, £46,292 to £54,534, including LWA.
Posted
Wednesday, 20 January 2021
End of advertisement period
Wednesday, 3 February 2021
Ref
013445
Academic Discipline
Life sciences
Contract Type
Fixed Term
Hours
Full Time

The two successful candidates will join a team of post-doctoral researchers in the UKRI Trustworthy Autonomous Systems Hub (TAS Hub, www.tas.ac.uk), of which King’s College London is a member. Led from the University of Southampton, with King’s College London and the University of Nottingham as partners, the TAS Hub aims to bring together the UK’s world-leading expertise in areas ranging from computing and robotics to the social sciences and humanities, to ensure that autonomous systems are trustworthy by design and default and can ultimately benefit society and industry.

Autonomous systems are technologies that gain information about their environment, learn, adapt and make decisions with little or no human control. They include automated software and ‘smart’ devices as well as self-driving cars, drones and robots. Autonomous systems are already used in many sectors of society and, given their increasing use, it is important to ensure that they are designed, built and deployed in a way that can be accepted and trusted by all.

Explanations can help human users understand and trust the choices of autonomous systems and interact with them in a safe and secure way. They can also help such systems interact with each other in a safe, secure and trusted way. Much recent work has been devoted to Explainable AI, which has so far focused on new AI techniques that enable users to understand, appropriately trust, and effectively manage the emerging generation of AI systems. Of particular, but not exclusive, interest is the area of explainable safety and security in autonomous systems, which remains largely uncharted territory, especially since it involves different stakeholders (i.e., the system’s developers, analysts, users and attackers, as well as legislators and policymakers) and is multi-faceted by nature (as it requires reasoning about the system model, the threat model and properties of security, privacy and trust, as well as concrete breaches, attacks, vulnerabilities and countermeasures). Against this background, we consider two distinct research roles:

Role 1 (TAS formal aspects): To provide all stakeholders with the different levels of certified assurance that they require, it is useful to employ formal methods. In order to provide, and reason about, explanations in trustworthy autonomous systems, it will be necessary to formalise and integrate explanations in the different phases of system development, from design to execution. It will also be necessary to identify reasonable trade-offs that allow explanations to be adapted so that they can be verified and accepted by the different stakeholders, while at the same time guaranteeing their validity and formality.

Role 2 (Human-TAS interaction): Since the main beneficiaries of the explanations provided by trustworthy autonomous systems will be their human users, designers will need to explicitly consider the human-centred aspects of the explanations the systems provide. This requires (i) eliciting the socio-technical requirements underlying the relationships between specific systems and their intended users, (ii) understanding the users’ profiles, expectations, potential cognitive and behavioural biases, and reasoning approaches, and (iii) finding suitable mechanisms to deliver the explanations. This will be achieved by carrying out user studies and applying other socio-technical approaches.

This is an exciting opportunity to participate in a new area of research and, in the process, to collaborate with our Hub partners (see www.tas.ac.uk). There will also be opportunities to shape and engage in focused, agile multidisciplinary projects in the different research streams of the TAS Hub.

Applicants will need to state their preference for Role 1 or Role 2. They will be highly experienced in autonomous systems and formal methods (Role 1) or in human-computer interaction (Role 2), and have a strong publication record in related areas. The research has a substantial multidisciplinary ambition, but a PhD in computer science, mathematics or engineering is essential.

Our staff and students come from all over the world and the Department is proud of its friendly and inclusive culture. Diversity is positively encouraged through a number of family-friendly policies, including the operation of a core-hours policy, the right to apply for flexible working, and support for staff returning from periods of extended absence, for example maternity leave. The Department of Informatics is committed to ensuring an inclusive interview process and will reimburse up to £250 towards any additional care costs (for a dependent child or adult) incurred as a result of attending an interview for this position.

For further information about the Department of Informatics at King’s, please see https://nms.kcl.ac.uk/luc.moreau/informatics/overview.pdf.

Contact: Professor Luca Viganò via email at luca.vigano@kcl.ac.uk or Professor Luc Moreau via email at luc.moreau@kcl.ac.uk.