PhD Position in the Field of Explainable Artificial Intelligence

Employer
NORWEGIAN UNIVERSITY OF SCIENCE & TECHNOLOGY - NTNU
Location
Trondheim, Norway
Closing date
31 Mar 2021

About the position

We have a vacant PhD position at the Department of Engineering Cybernetics in the field of Explainable Artificial Intelligence.

The recent rapid advances of Artificial Intelligence (AI) hold promise for multiple benefits to society in the near future. AI systems are becoming ubiquitous and are disrupting industries such as healthcare, transportation, manufacturing, robotics, retail, banking, and energy. According to a recent European study, AI could contribute up to EUR 13.3 trillion to the global economy by 2030: EUR 5.6 trillion from increased productivity and EUR 7.73 trillion from opportunities related to consumer experience. However, before AI systems can be deployed in social environments and in industrial and business-critical applications, several challenges related to their trustworthiness must be addressed.

Most of the recent AI breakthroughs can be attributed to the subfield of Deep Learning (DL), based on Deep Neural Networks (DNNs), which has been fueled by the availability of high computing power and large datasets. Deep learning has received tremendous attention due to its state-of-the-art, or even superhuman, performance in tasks where humans were long considered far superior to machines, including computer vision and natural language processing. Since 2013, DeepMind has combined the power of DL with Reinforcement Learning (RL) to develop algorithms capable of learning to play Atari games from pixels, beating human champions at the game of Go, surpassing all previous approaches in chess, and learning to accomplish complex robotic tasks. Similarly, DL has been combined with Bayesian Networks (BNs), resulting in Deep Bayesian Networks (DBNs), a framework that dramatically increases the usefulness of probabilistic machine learning. Despite their impressive performance, DL models have important drawbacks: lack of transparency and interpretability, lack of robustness, and an inability to generalize to situations beyond their past experiences. These drawbacks are difficult to tackle due to the black-box nature of DNNs, which often have millions of parameters, making the reasoning behind their predictions incomprehensible even to human experts. In addition, there is a need to better understand societal expectations and what elements are needed to ensure societal acceptance of these technologies.

Explainable AI (XAI) aims to remedy these problems by developing methods for understanding how black-box models make their predictions and what their limitations are. The call for such solutions comes from the research community, industry and high-level policy makers, who are concerned about the impact of deploying AI systems in the real world in terms of efficiency, safety, and respect for human rights. For XAI to be useful in business-critical environments and applications, it should not be limited to algorithm design, because the experts who understand decision-making models best are not necessarily best placed to judge the usefulness and structure of explanations. XAI research must therefore be enhanced with models of how people understand explanations, and of when explanations are sufficient for trusting something or someone. Such models have been studied for many years by philosophers, social and cognitive psychologists, and cognitive scientists. It is evident that significant interdisciplinary contributions are needed for AI systems to be considered trustworthy enough for deployment in social environments and business-critical applications.

The EXAIGON (Explainable AI systems for gradual industry adoption) project (2020-2024) will deliver research and competence building on XAI, including algorithm design and human-machine co-behaviour, to meet society's and industry's standards for deployment of trustworthy AI systems in social environments and business-critical applications. Extracting explanations from black-box models will enable model verification, model improvement, learning from the model, and compliance with legislation.

EXAIGON aims to create an XAI ecosystem around the Norwegian Open AI-Lab, including researchers with diverse backgrounds and strong links to industry. The project is supported by seven key industry players in Norway, who will provide the researchers with use cases, including data, models and expert knowledge. All involved researchers will work closely with each other, the industry partners, and researchers already working on relevant topics at NTNU, thereby maximizing the project's impact and relevance to the real world.

You will report to Morten Breivik (Head of the Department).

Duties of the position

Supervised learning algorithms, in particular deep neural networks in their various forms, have been behind the unprecedented recent success of artificial intelligence. However, their interpretability diminishes very quickly as the complexity of the network architecture increases. To employ such algorithms in real-life applications, we envision the PhD student working on one or more of the following basic approaches (the list is not exhaustive):

  • develop ways to simplify neural networks, either by enriching the input space or by expressing them in an alternative interpretable form, such as a collection of piecewise affine representations (see the sketch after this list), and then develop theory to prove their stability
  • develop methods to inject physics or domain knowledge, thereby not only guiding the learning process of the networks but also reducing their complexity
  • develop analysis tools, such as symbolic regression, to gain better insight into the functional form of the mapping from the input space to the output space
  • develop approaches for using black-box machine learning methods with built-in sanity-check mechanisms
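
To illustrate the first approach above, here is a minimal, hypothetical sketch of the piecewise affine view of a ReLU network: around any given input, such a network acts as a single affine map, which can be extracted and inspected. The toy model, the input point, and the use of PyTorch (listed below as a preferred skill) are assumptions for illustration only, not the project's prescribed method.

    # Minimal illustrative sketch only: a ReLU network is piecewise affine,
    # so around any input x it reduces to f(x) = W_eff x + b_eff within the
    # local linear region. Extracting (W_eff, b_eff) gives one locally
    # interpretable view of the black-box model.
    import torch

    torch.manual_seed(0)
    net = torch.nn.Sequential(          # hypothetical toy network
        torch.nn.Linear(4, 16), torch.nn.ReLU(),
        torch.nn.Linear(16, 1),
    )

    x = torch.randn(4)                  # hypothetical input point
    W_eff = torch.autograd.functional.jacobian(net, x)  # local linear map, shape (1, 4)
    b_eff = net(x) - W_eff @ x                           # local offset

    # Within the same activation region, the affine surrogate reproduces the network.
    x_near = x + 1e-3 * torch.randn(4)
    print(net(x_near).item(), (W_eff @ x_near + b_eff).item())

Simplification and stability results of the kind mentioned in the first bullet would then be developed on top of this kind of local affine structure.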

The candidate will be expected to disseminate the findings through reputable journals and conference proceedings. More information about our related activities can be found at www.hybridmodelling.com.

Required selection criteria

The PhD position's main objective is to qualify for work in research positions. The qualification requirement is a completed master's degree or second degree (equivalent to 120 credits) with a strong academic background in engineering cybernetics, computer science, control engineering, applied mathematics, or a related discipline, or equivalent education, with a grade of B or better in terms of NTNU's grading scale. If you do not have letter grades from previous studies, you must have an equally good academic foundation. If you are unable to meet these criteria, you may be considered only if you can document that you are particularly suitable for education leading to a PhD degree.

The appointment is to be made in accordance with the regulations in force concerning State Employees and Civil Servants and the national guidelines for appointment as PhD candidate, postdoctoral fellow and research assistant.

Applicants are required to justify their candidacy by explicitly explaining their personal motivation and academic aptitude for pursuing a doctoral degree in this research field. Applicants who expect to complete their master's degree studies by summer 2021 may apply. Academic results, publications, relevant specialization, work or research experience, personal qualifications, and motivation will be considered when evaluating the applicants.

Excellent English skills, written and spoken, are required. Applicants from non-European countries where English is not the official language must present an official language test report. The following tests can be used as such documentation: TOEFL, IELTS or Cambridge Certificate in Advanced English (CAE) or Cambridge Certificate of Proficiency in English (CPE). Minimum scores are:

  • TOEFL: 600 (paper-based test), 92 (Internet-based test)
  • IELTS: 6.5, with no section lower than 5.5 (only Academic IELTS test accepted)
  • CAE/CPE: grade B or A.

Preferred selection criteria

  • Strong background in cybernetics, mathematics, statistics, machine learning and artificial intelligence
  • Excellent programming skills, preferably in Python
  • Excellent oral and written communication skills in English
  • Knowledge of partial differential equations, numerical methods and constrained optimization will be a plus
  • Some experience with commonly used machine learning frameworks such as PyTorch and TensorFlow

Personal characteristics

  • Motivated
  • Creative
  • Team player
  • Independent

We offer

Salary and conditions

PhD candidates are remunerated in code 1017, with a gross salary normally starting from NOK 482 200 per annum before tax, depending on qualifications and seniority. From the salary, 2% is deducted as a contribution to the Norwegian Public Service Pension Fund.

The period of employment is 3 years, or 4 years with 25% duty work.

The position is subject to external funding (the EXAIGON project on Explainable AI).

It is a prerequisite that you can be present at and accessible to the institution on a daily basis.

Appointment to a PhD position requires that you are admitted to the PhD programme in Engineering Cybernetics (https://www.ntnu.edu/studies/phtk) within three months of employment, and that you participate in an organized PhD programme during the employment period.

The engagement is to be made in accordance with the regulations in force concerning State Employees and Civil Servants, and the acts relating to Control of the Export of Strategic Goods, Services and Technology. Candidates who, based on an assessment of the application and its attachments, are found to be in conflict with the criteria of the latter act will be excluded from recruitment to NTNU.

After the appointment, you must expect that there may be changes in your area of work.

About the application

The application and supporting documentation to be used as the basis for the assessment must be in English.

Publications and other scientific work must accompany the application. Please note that applications are evaluated based only on the information available by the application deadline. You should ensure that your application shows clearly how your skills and experience meet the criteria set out above.

The application must include:

  • CV, certificates and diplomas
  • Research plan or project proposal 
    • A short presentation of the motivation for a PhD study. 
    • Why the applicant is suited for the position.
    • The applicant’s view of research challenges for the PhD position, as well as his/her theoretical and methodological approach to the challenges.
  • Academic works - published or unpublished - that you would like to be considered in the assessment (up to 5 works)
  • Names and addresses of three referees

Joint works will be considered. If it is difficult to identify your contribution to joint works, you must attach a brief description of your participation.

In the evaluation of which candidate is best qualified, emphasis will be placed on education, experience and personal suitability.

NTNU is committed to following evaluation criteria for research quality according to The San Francisco Declaration on Research Assessment - DORA.

General information

Working at NTNU

A good work environment is characterized by diversity. We encourage qualified candidates to apply, regardless of their gender, functional capacity or cultural background. 

The city of Trondheim is a modern European city with a rich cultural scene. Trondheim is the innovation capital of Norway, with a population of 200,000. The Norwegian welfare state, including healthcare, schools, kindergartens and overall equality, is probably the best of its kind in the world. Professional, subsidized day-care for children is easily available. Furthermore, Trondheim offers great opportunities for education (including international schools) and possibilities to enjoy nature, culture and family life, and has low crime rates and clean air.

As an employee at NTNU, you must at all times adhere to the changes that developments in the field entail and to the organizational changes that are adopted.

In accordance with The Public Information Act (Offentleglova), your name, age, position and municipality may be made public even if you have requested not to have your name entered on the list of applicants.

Further details about the position can be obtained from Professor Adil Rasheed (adil.rasheed@ntnu.no).

Please submit your application electronically via jobbnorge.no with your CV, diplomas and certificates. Applications submitted elsewhere will not be considered. A Diploma Supplement must be attached for European master's diplomas obtained outside Norway. Applicants from China are required to provide confirmation of their master's diploma from China Credentials Verification (CHSI).

Publications and any other work that the applicant wishes to be considered must also be enclosed.

If you are invited for an interview, you must include certified copies of transcripts and reference letters. Please refer to application number 2021/13146 when applying.

Application deadline: 31.03.2021.

NTNU - knowledge for a better world

The Norwegian University of Science and Technology (NTNU) creates knowledge for a better world and solutions that can change everyday life.

Department of Engineering Cybernetics (ITK)

Engineering cybernetics is the study of automatic control and monitoring of dynamic systems. We develop the technologies of tomorrow through close cooperation with industry and academia, both in Norway and internationally. The Department contributes to the digitalization, automation and robotization of society. The Department of Engineering Cybernetics is one of seven departments in the Faculty of Information Technology and Electrical Engineering.

Deadline: 31 March 2021
Employer: NTNU - Norwegian University of Science and Technology
Municipality: Trondheim
Scope: Full-time
Duration: Project
Place of service: Trondheim
