
Seven tips for implementing a careers chatbot ethically and effectively

Helping students make good academic choices fuelled by their career aspirations before they start studying is crucial – and a well-designed chatbot can be very useful

10 Oct 2022

Created in partnership with Arden University


In 2021, statistics from a major Ucas study stopped us in our tracks: “Two in five students believe that more information and advice would have led to better HE choices.” Almost half of students from more deprived backgrounds raised the same concern.

While student support services, flexible courses and pastoral care can all help once students have started, improving choices before starting a course of study is key.

Alongside traditional outreach approaches, universities shouldn’t be afraid to engage with emerging technologies, especially through formal partnerships that give them access to expertise they might not otherwise have. For example, we at Arden University have been working with CiCi – a NESTA- and DfE-funded start-up – to develop a careers chatbot that helps prospective students understand the courses available, how they relate to different careers and what those careers might actually be like.

Here, we outline seven tips for implementing a careers chatbot ethically and effectively:

1. Balance privacy with chatbot improvement

Privacy considerations need to be driven by the bot’s audience and how data will be used. Universities need to ensure that chatbots developed to engage with the public don’t require users to enter any identifying information. Privacy should be strengthened wherever possible, for example by creating a firewall between the chatbot, managed by external experts, and university staff, so that the only data shared is a dashboard of aggregate usage and anonymised, sanitised chatlogs. This still allows priorities for future research and development work to be identified.
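
To make that firewall concrete, here is a minimal sketch, in Python, of the kind of sanitisation and aggregation step that might sit between the chatbot platform and university staff. The field names, regular expressions and export format are illustrative assumptions, not a description of the system we built.

import re
from collections import Counter

# Illustrative patterns only; a production sanitiser would need to cover far more cases
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\b\+?\d[\d\s-]{8,}\d\b")

def sanitise_message(text: str) -> str:
    """Strip obvious identifiers before a chatlog leaves the chatbot platform."""
    text = EMAIL_RE.sub("[email removed]", text)
    text = PHONE_RE.sub("[phone removed]", text)
    return text

def build_shared_export(chatlogs: list[dict]) -> dict:
    """Produce the only artefacts shared with university staff: aggregate usage
    counts plus anonymised, sanitised message text."""
    usage = Counter(log["topic"] for log in chatlogs)
    anonymised = [sanitise_message(log["message"]) for log in chatlogs]
    return {"usage_by_topic": dict(usage), "anonymised_messages": anonymised}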

2. Help users find advice from outside your institution

In creating an ethical chatbot, it is essential that institutions are explicit about where they source careers information and that they refer to a wide range of external, independent sources, such as prospects.ac.uk, quizzes and government websites, as well as information about the institution’s own offerings. For users who want to talk to an adviser, the Arden bot we created will offer to connect them with the National Careers Service for free, impartial advice, to arrange a call with a university adviser for questions specifically about Arden courses, or to help them find qualified professionals for paid advice.

3. Integrate AI functionality with relevant data and technologies

Pure AI chatbots, such as OpenAI’s GPT-3, are not currently fit for use in careers advice, often generating results that are inconsistent and sometimes unethical. One potential solution, which we identified while developing ours, is to combine AI natural language processing with more structured elements, such as menu options for users to click through (testing suggests users prefer to click rather than type).
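
As a rough illustration of this hybrid pattern, the sketch below matches a typed question against a small set of keyword-based intents and falls back to clickable menu options when no confident match is found. The intents, keywords and menu labels are hypothetical; a real bot would use a proper natural language processing model rather than keyword counts.

# Hypothetical intents and keywords, for illustration only
INTENT_KEYWORDS = {
    "salaries": ["salary", "pay", "earn"],
    "related_courses": ["course", "degree", "study"],
    "speak_to_adviser": ["adviser", "advisor", "talk", "call"],
}

MENU_OPTIONS = ["Explore salaries", "Find related courses", "Speak to an adviser"]

def classify(query: str) -> str | None:
    """Very rough intent matching; stands in for an NLP model in this sketch."""
    text = query.lower()
    scores = {intent: sum(kw in text for kw in kws) for intent, kws in INTENT_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

def respond(query: str) -> dict:
    intent = classify(query)
    if intent is None:
        # No confident match: offer menu buttons instead of guessing
        return {"type": "menu", "options": MENU_OPTIONS}
    return {"type": "answer", "intent": intent}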

A careers chatbot should look to help users in two key ways: finding data-driven answers to questions; and referring users to carefully curated sources of careers information and advice, vetted by careers professionals.

Examples of questions that call for data-driven answers include: what vacancies exist for which jobs in my region? What are the salaries and skills needed? What courses relate to those careers and what do they cost? To answer these questions, we integrated the bot via application programming interfaces (APIs) – essentially, a way for two or more computer programmes to communicate – with a wide range of public and private datasets, surfacing information in accessible chunks and ensuring the chatbot has the latest information. When dealing with less data-driven questions, it is important to include resources such as external articles on careers information and advice covering common questions such as formatting CVs, preparing for interviews, and finding work experience and volunteering opportunities.
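
The sketch below shows the general shape of that API integration: query an external labour market data service at the moment a question is asked, then surface a few results as short chat messages. The endpoint, parameters and field names are placeholders, not a real provider.

import requests

def fetch_vacancies(region: str, job_title: str) -> list[dict]:
    """Query a (hypothetical) labour market API so answers reflect the latest data."""
    response = requests.get(
        "https://example-lmi-provider.test/vacancies",  # placeholder endpoint
        params={"region": region, "title": job_title},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["results"]

def to_chat_chunks(vacancies: list[dict], limit: int = 3) -> list[str]:
    """Surface results in short, accessible chunks rather than one long reply."""
    return [
        f"{v['title']} in {v['location']}: typical salary {v['salary']}"
        for v in vacancies[:limit]
    ]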

4. Incorporate coaching-style prompts

An ethical risk with a bot is that users may not appreciate the context of the careers data presented, or may not ask themselves the searching questions that a person would prompt. Our experience suggests an institution can mitigate this by incorporating coaching-style prompts and “soft language” throughout, while also pointing users to where they can get advice from a person. For instance, a section on decision-making can invite users to reflect on their career motivators and highlight the type of insight not available via standard labour market information data.
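
One simple way to express this in code is to pair each data-driven answer with a reflective question. The prompts below are examples of the sort of soft, coaching-style wording a careers professional might suggest; they are illustrative rather than taken from our bot.

import random

# Example coaching-style prompts; real wording would be written with careers professionals
COACHING_PROMPTS = [
    "Which of these roles fits best with what motivates you?",
    "What would a typical day in this career look like for you?",
    "Who could you talk to who already works in this field?",
]

def with_coaching_prompt(answer: str) -> str:
    """Pair a data-driven answer with an invitation to reflect on it."""
    return f"{answer}\n\n{random.choice(COACHING_PROMPTS)}"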

5. Don’t hide your humans!

Many of us will have experienced the frustrations of circular conversations with a corporate online chatbot. Chatbots should improve user experience by being available 24/7, empowering users to ask more questions than time may allow with a person, and by supporting direct access to useful labour market information and job insights. But there are many careers information and advice questions a bot will not be able to help with, so always provide an easy-to-access option to be referred to a human adviser.
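
A minimal sketch of keeping that human route visible: every reply carries a handoff action, and repeated unanswered questions trigger a direct referral. The threshold and wording are assumptions for illustration.

def reply_with_handoff(answer: str | None, failed_attempts: int) -> dict:
    """Always expose a route to a human adviser, and escalate after repeated misses."""
    if answer is None and failed_attempts >= 2:
        return {
            "text": "I can't answer that, but an adviser can.",
            "actions": ["Book a call with an adviser"],
        }
    return {
        "text": answer or "I'm not sure about that one.",
        "actions": ["Keep exploring", "Speak to a human adviser"],
    }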

6. Make it as accessible as possible

As with any tool, chatbots are only useful if they are widely accessible, via all types of devices – mobiles, tablets, desktops and laptops – and are well signposted, which is why the Arden careers chatbot, which is free for anyone to use, will be promoted through activities by our university’s outreach team. Tracking user data and experimenting with where the bot sits on the website are vital to ensuring it has impact.

7. Build in feedback and foster a culture of innovation to plan for the long term

Having launched a chatbot, keep in mind that it will need ongoing maintenance as machine learning models develop, technology advances and data interfaces change. Bot enhancements can also be identified by monitoring feedback: proactively asking users for it after a few minutes, capturing questions the bot doesn’t have answers to, and using evidence of usage and web analytics. This is the first of our chatbots, but starting small has enabled us to learn how to develop and support effective, ethical solutions and to ensure they are sustainable.
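
As a rough sketch of that feedback loop, the snippet below prompts for feedback a few minutes into a session and logs any question the bot cannot answer, so that development priorities come from real usage. The timing and class structure are assumptions, not the design of our bot.

import time

FEEDBACK_DELAY_SECONDS = 180  # assumed value for "after a few minutes"

class SessionFeedback:
    def __init__(self) -> None:
        self.started = time.time()
        self.feedback_requested = False
        self.unanswered_questions: list[str] = []

    def maybe_request_feedback(self) -> str | None:
        """Ask for feedback once, a few minutes into the session."""
        if not self.feedback_requested and time.time() - self.started > FEEDBACK_DELAY_SECONDS:
            self.feedback_requested = True
            return "How is this going so far? A quick thumbs up or down helps us improve."
        return None

    def log_unanswered(self, question: str) -> None:
        """Capture questions the bot couldn't answer, for future development."""
        self.unanswered_questions.append(question)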

Caroline Tolond is head of careers and employability at Arden University, UK.

Deirdre Hughes is co-founder of CiCi and an honorary associate professor at the University of Warwick’s Institute for Employment Research.

Chris Percy is co-founder of CiCi, as well as a strategy consultant and careers researcher with expertise in quantitative methods.

If you found this interesting and want advice and insight from academics and university staff delivered direct to your inbox each week, sign up for the THE Campus newsletter.
