Biden’s AI executive order underlines need for student technology councils

Students must be allowed to participate in decisions about technology adoption that affect their education, careers and lives, says Mona Sloane

November 4, 2023

At long last, the US is getting back into the AI regulation game. The new AI Executive Order, unveiled on 30 October, shows that policymakers are taking seriously the many risks that AI systems can pose, from wrongful arrests to biased fraud detection and large-scale misinformation.

Importantly, the order highlights the need to support educators deploying AI tools in the classroom. Absent, however, are key constituents in education: students.

Young people have always been early adopters of technology, and the arrival of generative AI (such as ChatGPT) has shown that students often hold the knowledge and lived experience that decision-makers lack but which is crucial for deploying technology to its maximum potential.

Moreover, students are also experts in their own learning and are the primary stakeholders in the learning process. It is time for universities to establish student-led technology councils, empowering students to participate in the decision-making processes that affect their education, their careers, their lives.

These councils can bring fresh perspectives and a direct understanding of the challenges and opportunities technology presents in education. They can serve as a bridge between the student body and university administrators and educators, and help identify and prioritise students’ technology needs and concerns. This is ever more important in the context of AI, where the pervasiveness of large-scale data collection and predictive analytics can affect students’ lives far into the future.

Student-led technology councils can also act as a catalyst for transparency and accountability. They can help oversee the ethical considerations surrounding the use of AI and technology, helping universities navigate data privacy, algorithmic bias and the responsible use of student data.

The councils should focus on examining AI trends and advising key decision-makers about the student perspective on technology procurement and implementation. To that end, they should comprise students from diverse academic backgrounds and levels (perhaps appointed or elected by existing student organisations), and they should seek diverse input from the broader student body on technology issues.

Critics might argue that students lack the expertise to make informed decisions about technology procurement and adoption. But students are constant AI users, especially in educational settings: they are routinely required to interact with AI systems, from online learning platforms to AI-driven administrative tools, including in admissions.

However, both university leadership and educators must, of course, retain some form of control over the technology used in the institution, so that administration is efficient and learning goals can be achieved. For example, students might argue for expansive use of generative AI in essay writing. But while these systems can support the thought and learning process, letting AI write an essay prevents students from achieving the all-important learning goals of coming up with an argument, structuring it well and committing it to paper.

Within those limits, the student technology council should discuss and formally advise university leadership on how technology can be used responsibly and meaningfully. In particular, the council’s input should be mandatory regarding the potential procurement of technologies that directly or indirectly affect students. These include AI systems that predict student success, analyse student engagement or interact with students in the context of health and wellness, such as chatbots.

Council members should distil input from the broader student body while engaging in research on the functionalities and potential benefits and risks of any given technology. That includes liaising with university decision-makers who work directly with the technology, from educators to administrators and librarians. Drawing all this together, the council should then publish a well-reasoned recommendation.

The university leadership should formally acknowledge these recommendations and, where appropriate, pass them on to the right stakeholder group (whether that is educators, student services teams, admissions or other entities) and mandate that they be considered in decision-making. Whenever procurement and implementation decisions are not aligned with the recommendations, the responsible entity should explain why.

As well as providing students with important lessons in technology policy and governance, student technology councils would help to create a more inclusive and responsive environment that rapidly and equitably adapts to the evolving technological landscape in ways that align with the values and goals of higher education institutions. This is the kind of equitable and participatory AI use that President Biden envisions in his executive order.

All over the world, students have long demanded more participation in decision-making processes at universities, underlining the need for structural representation that is continuous, rather than ad hoc or tokenised. And, in some places, student governance has a long and successful history. For example, the honour system at my own institution, the University of Virginia, asks students to commit not to lie, cheat or steal. They sign a pledge with every piece of work they submit, and violations lead to expulsion. The system is enforced by a student-run committee that investigates allegations, provides counsel and works with accused students in their defence at trial – a process administered entirely by students.

Examples like this show that student leadership is no mere pedagogical exercise. Students are willing and ready to deeply engage in the design and administration of their own learning and university experience – and technology should not be exempt from their input.

Mona Sloane is assistant professor of data science at the School of Data Science and assistant professor of media studies in the Department of Media Studies at the University of Virginia (UVA). She runs Sloane Lab, a research group on the social impact of AI.
