KING'S COLLEGE LONDON

Postdoctoral researcher in Trustworthy Autonomous Systems

Location
London (Central), London (Greater) (GB)
Salary
Grade 6, £38,304 - £45,026 per annum inclusive of £3,500 London Weighting Allowance per annum
Posted
25 Sep 2020
End of advertisement period
25 Oct 2020
Ref
R6/1212/20-KN
Contract Type
Fixed Term
Hours
Full Time

We have a position for a post-doctoral research associate to work in the UKRI/EPSRC National Hub on Trustworthy Autonomous Systems, of which King's College London is a member. Led by the University of Southampton, with King's College London and the University of Nottingham as partners, the Hub aims to bring together the UK's world-leading expertise in areas ranging from computing and robotics to the social sciences and humanities, to ensure that Autonomous Systems are trustworthy by default and can ultimately benefit society and industry.


Autonomous Systems are technologies that gain information about their environment, learn, adapt and make decisions with little or no human control. They include automated software and 'smart' devices as well as self-driving cars, drones and robots. Autonomous Systems are already used in many sectors of society, and given their increasing use it is important to ensure that they are designed, built and deployed in a way that can be accepted and trusted by all. To ensure their adoption, and considering in particular the “right to explanation” addressed by the GDPR, Autonomous Systems should be able to explain their decisions and behaviour.
The objective of this PDRA position is to develop novel formal techniques to tackle explanations, governance and ethics for trust, safety, security and privacy in Autonomous Systems, considering, in particular, the provenance and flow of information. This is a challenging problem: the explainability of a system does not necessarily imply its security, and an explanation might even endanger the security of the system by revealing too much about how its workflows process information. The research will thus investigate, in the context of Trustworthy Autonomous Systems, the intersection of formal methods, cybersecurity and explainability with provenance, which describes the people, institutions, entities and activities involved in producing, influencing or delivering a piece of data, a document or an automated decision. The research will also require reasoning formally about the governance and ethical aspects of the way Trustworthy Autonomous Systems interact with their users, in order to understand how the behaviour of human users interacting with an Autonomous System might influence its decisions and ultimately endanger its security and trustworthiness.
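To make the notion of provenance concrete, the following is a minimal sketch, loosely in the spirit of the W3C PROV data model, of how a provenance record linking entities (data or decisions), activities (processes) and agents (people or systems) can support a rudimentary explanation of an automated decision. All class and example names here are illustrative assumptions, not artefacts of the Hub's research.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Agent:          # a person, institution or system bearing responsibility
    name: str

@dataclass(frozen=True)
class Activity:       # a process that uses and generates entities
    name: str

@dataclass(frozen=True)
class Entity:         # a piece of data, a document or an automated decision
    name: str

@dataclass
class ProvenanceGraph:
    # PROV-style relations, kept deliberately minimal for illustration
    generated_by: dict = field(default_factory=dict)     # Entity -> Activity
    attributed_to: dict = field(default_factory=dict)    # Entity -> Agent
    associated_with: dict = field(default_factory=dict)  # Activity -> Agent

    def record(self, entity: Entity, activity: Activity, agent: Agent) -> None:
        """Record that `activity`, run by `agent`, generated `entity`."""
        self.generated_by[entity] = activity
        self.attributed_to[entity] = agent
        self.associated_with[activity] = agent

    def explain(self, entity: Entity) -> str:
        """A rudimentary 'explanation': who and what produced this entity."""
        activity = self.generated_by[entity]
        agent = self.attributed_to[entity]
        return (f"{entity.name} was generated by {activity.name}, "
                f"attributed to {agent.name}")

# Hypothetical example: an automated decision traced back to its origin.
graph = ProvenanceGraph()
decision = Entity("loan-decision-42")
scoring = Activity("credit-scoring-run")
system = Agent("autonomous-scoring-system")
graph.record(decision, scoring, system)
print(graph.explain(decision))
# prints: loan-decision-42 was generated by credit-scoring-run, attributed to autonomous-scoring-system
```

The research questions above then ask, among other things, when such a record can be surfaced to a user without revealing so much about the workflow that security or privacy is compromised.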


For example, how can systems provide explanations on how information is handled that are both formal (and thus can be generated and checked automatically) and understandable for users so that they interact with the system in an appropriate way? How can formal explanations be tailored to the specific recipient of the explanation, which could be another system or a human being? How can we formally prove that explanations do not endanger security and privacy?

This is an exciting opportunity to participate in a new area of research and, in the process, collaborate with our prestigious project partners, which include key players in artificial intelligence, software engineering, software verification, cyber and physical security of critical infrastructures, electronic financial services and social media, to name a few. There will also be the opportunity to shape and engage in focused and agile multidisciplinary projects in the different research streams of the Trustworthy Autonomous Systems Hub.
You will need to be highly experienced in formal methods, cybersecurity and AI, and to have high-impact research publications in related areas. The research has a substantial multidisciplinary ambition, but a PhD in computer science, mathematics or engineering is essential.


Our staff and students come from all over the world and the Department is proud of its friendly and inclusive culture. Diversity is positively encouraged with a number of family-friendly policies, including the operation of a core hours policy, the right to apply for flexible working and support for staff returning from periods of extended absence, for example maternity leave. The Department of Informatics is committed to ensuring an inclusive interview process and will reimburse up to £250 towards any additional care costs (for a dependent child or adult) incurred as a result of attending an interview for this position.

For further information about the Department of Informatics at King's, please see https://nms.kcl.ac.uk/luc.moreau/informatics/overview.pdf.
