Senior Research Associate (Systems/ML Security), Trustworthy Autonomous Systems Node in Security

Location: Lancaster, Lancashire (GB)
Advertised: Monday, 7 December 2020
End of advertisement period: Monday, 1 February 2021
Contract Type: Fixed Term, Full Time

Ref: A3230


School of Computing and Communications
Salary: £34,804 - £40,322
Closing Date: 1 February 2021
Interview Date: 15 February 2021
Contract: Indefinite with end date

We are starting an exciting and wide-ranging fundamental research project that will address technical and social issues of security in order to develop Trustworthy Autonomous Systems (TAS). Lancaster’s Security Institute will lead the £3M EPSRC TAS Security (TAS-S) node for a 42-month period starting November 1, 2020. Lancaster is a top-10 UK university and is especially renowned for its unique socio-technical coverage of security. The TAS-S research also involves the Lancaster Intelligent, Robotic and Autonomous systems (LIRA) Centre and is partnered with Cranfield University (a leader in autonomous system research). It is additionally supported by a range of UK, EU and international academic, industry and policy partners including Airbus, BAE, Raytheon, Thales, NATO, CMU, EC-CONCORDIA/CODE, AIT, TTTech, Academia Sinica, Arthurs Legal, RISE, and the UK Coast Guard.   

For the TAS-S team, we are seeking multiple enthusiastic (and innovative!) post-doctoral researchers with solid research credentials and, ideally, experience of collaborative research projects. These positions, covering technical and sociological disciplines, will research security across Concepts, Machine Learning, Controls, Communication, Law and Sociology.

For this specific position, we seek candidates with broad interests in researching fundamental concepts in distributed systems security and in the secure use of Machine Learning. Our interest is broadly in providing novel approaches to distributed threat modelling, security risk assessment and the provision of secure ML. The intent is basic research into distributed/ML processes that can withstand attacks, and into developing secure infrastructures to support them.

This position offers an excellent opportunity to pursue basic (academic and applied) research, and to collaborate with a multi-disciplinary group of investigators on cutting-edge work. The role will encompass:

  • Pursuit of basic security research. 
  • Research engagement with the project’s internal and external stakeholders.
  • Participation in project workshops, seminars, dissemination and outreach events. 
  • Support on reports, presentations and research proposals.  

We will additionally provide PhD positions to support the Research Associate's research.

This is a challenging and highly rewarding role for someone interested in growing their academic/applied research career, and in playing a key part in a major research effort focused on the future of the security and trustworthiness of distributed systems. Further details of responsibilities are available in the job description.

You will join us on an indefinite contract. However, the role remains contingent on external funding which, for this position, will initially cover 18 months.

This is a full-time position expected to start August 1, 2021. 

For further information, please contact Professor Neeraj Suri.

We are committed to family-friendly and flexible working policies on an individual basis, as well as the Athena SWAN Charter, which recognises and celebrates good employment practice undertaken to address gender equality in higher education and research. We welcome applications from people in all diversity groups.