
How to design and supervise GenAI-integrated doctoral research

Strategies for research supervisors and curriculum designers when embedding GenAI in research activities
Kate Abraham
Hult International Business School (Ashridge)
9 Feb 2026


Generative artificial intelligence (GenAI) is now embedded in our devices and, increasingly, in our processes – always ready with answers and willing to scaffold new solutions. It’s steadily evolving from a tool to a thinking partner, in research and in the classroom.

In executive doctoral education, the picture is more nuanced. Our students are industry executives, university instructors, consultants and government advisers. They interact with GenAI daily and develop policies for its use within their organisations.

These students approach research as an iterative process between humans and machines. But uncritical use of GenAI is now as great a risk as misuse. Supervisors report unease at students’ shallow engagement, describing cases where the student’s own thinking feels absent and the research output, however visually polished, lacks grounding.

In recent work, my colleagues and I introduced the concept of human-AI co-scholarship, reframing GenAI as a third actor in applied research. This shift demands new forms of academic support. We needed to go beyond prompt literacy to critical reflexivity, dialogic supervision and curriculum design that focuses on integrity, critical thinking and the ability to question GenAI’s outputs and their influence on the researcher.

Advice on supervising research

Feedback alone is not enough. Instead, encourage a dialogue that fosters psychological safety and open discussion about GenAI use. When students use GenAI tools to shape their thinking, help them to interrogate those contributions. In programme-level discussions, supervisors and candidates alike raise concerns about work that appears visually polished and conceptually complex, yet proves difficult to articulate or interrogate in depth.

Supervisors describe situations where students struggle to articulate their contribution to their field, while students’ time pressures, skills gaps and uncertainty lead to over-reliance on GenAI quick fixes. While ethical responsibility ultimately rests with the student-researcher, supervisors play a key role in setting the stage for open dialogue about these tensions in GenAI use.

Begin supervision sessions with reflexive check-ins, such as “What felt unresolved or surprising in your recent work?” or “How did your GenAI usage shape your research stance?” Encourage learners to compare their own analysis with GenAI outputs and reflect on differences in voice, nuance, validity and bias.

Discuss GenAI use early in the research process and agree on boundaries with your students. This should be a continuing dialogue, revisited as the research evolves. Let students explore different GenAI tools, comparing features, assumptions, strengths and limitations, to understand how each influences process and outcome.

Encourage students to record instances where GenAI aided or impeded their thinking. Use these reflections to collaborate on strategies that harness GenAI’s advantages, while maintaining awareness of when learners need to step up their thinking.

Support students in developing ownership and authority over their GenAI use. Ask reflective questions such as “What are you ignoring because GenAI outputs did not highlight it?” or “What assumptions are you making about GenAI’s authority?”

Model vulnerability by sharing your own discomfort and learning curve related to GenAI usage in the research process. For example, some supervisors prefer introducing GenAI later, once there is greater clarity on the research direction, rather than during initial conceptualisation. 

Being open about your preferences supports early discussions on boundary-setting. Supervisors, particularly those from more traditional institutions and disciplinary backgrounds, should also acknowledge generational or contextual gaps in their familiarity with GenAI when supervising executive doctoral students, whose professional environments are already GenAI-integrated. This acknowledgement can reduce the hierarchy-driven silence common in traditional doctoral education, validate diverse experiences of technology and reinforce both parties’ willingness to adapt.

Advice for curriculum designers

Doctoral education must teach thinking, not just tools. Reflexivity should be a core learning outcome, not a methodological footnote.

When designing curricula, include GenAI literacy modules that embed ethics and reflexive prompts. Reflection tools should tap into epistemic tension, for example: “What area of this research did I feel most uncertain about?” or “Is GenAI surfacing thinking that challenges research norms?”

At my university, the DBA Program Immersion is a core credit-bearing module at the start of the programme. It moves beyond plagiarism-only GenAI guidance to include reflection and reflexivity across different GenAI-embedded research activities. Embedding reflexivity and ethics in this context prompted students to examine their assumptions about research and GenAI’s role in it: their choice of tools, how reliant they are on them, and how those tools shape their sense of authority and agency during core coursework and initial conceptual design.

The goal is for students to possess the tools and capacity to continuously question their understanding of their GenAI-embedded research, express epistemic and ethical judgement, and own their work regardless of GenAI usage across each research stage.

Use feedback to promote curiosity rather than critique. By designing formats that encourage enquiry, educators can create a culture of thoughtful engagement with GenAI.

Encourage contrast and re-engagement throughout your modules, allowing students to revisit earlier choices after exposure to new contexts. Clearly define permissible levels of GenAI integration in assessment design, while allowing for emerging human-machine interactions that contribute new knowledge.

Recognising GenAI’s risk versus reward

For doctoral educators, the goal is to help researchers recognise when GenAI enhances insight versus when it risks distortion. Educators must accompany students as they learn to think critically, reflexively and with integrity using GenAI.

This contributes to a broader reimagining of executive doctoral education where human-AI co-scholarship becomes a lens for curriculum and institutional strategy. The next phase will explore this concept in practice through curriculum pilots, supervision protocols and case studies, designing spaces that adapt to GenAI-integrated contexts, with the human at the centre.

Kate Abraham is assistant dean at Hult International Business School.

