UNIVERSITY OF SURREY

Research Fellow in Machine Learning

Location
Guildford, United Kingdom
Salary
£31,302 to £39,609 per annum plus benefits
Posted
19 Feb 2019
End of advertisement period
20 Mar 2019
Ref
070818-R
Contract Type
Fixed Term
Hours
Full Time

Department of Electrical & Electronic Engineering

Fixed Term until February 2021
Interview Date:      To be confirmed

Applications are invited for a postdoctoral research fellow position at the Centre for Vision, Speech and Signal Processing (CVSSP), starting in April 2019, to work on machine learning and Internet of Things data analysis. The post is for a full-time researcher on a 22-month term (end date: February 2021).

The researcher will join a multidisciplinary group working on machine learning and Internet of Things for healthcare and smart cities. The post holder will join the EU H2020 IoTCrawler project team (https://iotcrawler.eu). The aim of this project is to develop distributed indexing, query and search mechanisms for Internet of Things resources and data. The University of Surrey’s research in this project is focused on time-series data analysis, pattern extraction and developing semantic query and search mechanisms for Internet of Things data.

You will join an established research group with a strong track record and excellent research infrastructure. CVSSP is a leading UK research centre in audio-visual signal processing, computer vision and machine learning. Our Centre is one of the largest in Europe, with over 150 researchers and a grant portfolio in excess of £23 million, bringing together a unique combination of cutting-edge AI and machine learning expertise. Our aim is to advance the state of the art in machine perception, combining artificial intelligence (AI) and multi-modal sensing to enable machines to see, hear and understand the world around them. The Centre has an outstanding track record of innovative research leading to technology transfer and exploitation in biometrics, the creative industries (film, TV, games, VR, AR), mobile communication, healthcare, robotics, autonomous vehicles and consumer electronics.

CVSSP promotes a friendly, supportive and inclusive research environment. As a Bronze Athena SWAN award holder, Stonewall Diversity Champion and Disability Confident employer, we are committed to supporting equality, diversity and inclusion.

Our team’s research has received several awards including: the Most Outstanding Innovation at Guildford’s Innovation Awards, HSJ 2018 Award for Improving Care with Technology, and an NHS Parliamentary Award.

You should hold a PhD in electrical engineering, computer science, mathematics, physics or a similar area. You will be responsible for designing and developing new techniques for analysing real-time IoT data (including healthcare, in-home sensory data and smart city data) to extract actionable information for healthcare and/or smart city applications. You should have solid experience in machine learning, time-series data analysis and probabilistic methods. You will be expected to develop your solutions in Python or, in some cases, MATLAB. You will work with the project team, attending project meetings and providing support to the project.

For informal enquiries, please contact Professor Payam Barnaghi at p.barnaghi@surrey.ac.uk. Please apply online using the button below.

If you are unable to apply online, please contact Mr Bradley Thomas via email: b.thomas@surrey.ac.uk.

Please note, it is University Policy to offer a starting salary equivalent to Level 3.6 (£31,302) to successful applicants who have been awarded, but are yet to receive, their PhD certificate. Once the original PhD certificate has been submitted to the local HR Department, the salary will be increased to Level 4.1 (£32,236).

For more information and to apply online, please download the further details and click on the 'apply online' button.

We acknowledge, understand and embrace diversity.