HESA’s AI Observatory: What’s new in higher education (Dec. 8th, 2023)

Spotlight

Good afternoon all, 

In today’s newsletter, we share articles about governance structures to coordinate institutions’ responses to GenAI, student perspectives on the use of GenAI, the need to train students in the ethical use of AI, ways in which GenAI can support student wellbeing, the need for better training data for AI tools, and more.

Also, we are starting to plan our Winter 2024 AI Roundtables. If you’d like to suggest a topic or guest speaker, please reach out to us by email.

Policies & Guidelines Developed by Higher Education Institutions

Tags: Guidelines, Academic integrity, Pedagogy, Canada

Thompson Rivers University’s (Canada) Guide for Students lists appropriate and inappropriate uses of AI. Appropriate uses include using it with approval from an instructor, as a study aid to prepare for exams, as a study aid to improve understanding, and as an example for critical discussion. The website also provides a live chatbot for asking further questions. 

Tags: Guidelines, Research, North America

Duke University’s (United States) Artificial Intelligence Policies: Guidelines and Considerations state that “every course should have an AI policy” and that “instructors should update their plagiarism policies to include guidance on the use of generative AI text in their class”. Instructors have the discretion to define if, and when, GenAI may be used in their courses. “Establishing an AI policy for your class allows you to have meaningful discussions with students on this topic. Being specific about how AI is or isn’t allowed makes the rules clear for students and faculty if there are academic integrity violations”. The Duke Community Standard has been updated to include the unauthorized use of GenAI as a form of cheating. The guidelines provide examples of GenAI policies for instructors to use in their courses, as well as considerations to keep in mind when defining acceptable use. The continuum of policies includes prohibiting the use of GenAI, allowing use only with prior permission, allowing use only with acknowledgement, and allowing use freely with no acknowledgement. Finally, the guidelines state that the use of AI detection software is not recommended.

News & Research

Schroeder, R. Inside Higher Ed. November 27th, 2023. 

In this article, the author argues that GenAI has become “too important for universities to merely respond in an ad hoc way to new developments from a world filled with AI-enhanced program developers”. He states that institutions should identify and appoint a leader to coordinate and set the vision for their AI initiatives. “It will be important to create such a C-level leadership position at each university to coordinate the policy-making, research, application, and the internal and external teams of experts who will serve to chart the course of AI initiatives”. In addition, institutions should identify and appoint a committee of internal and external experts and representative stakeholders. This committee should include faculty and staff; top leadership and HR leaders from businesses, agencies, and other entities that regularly employ graduates from the university; and a librarian to curate the materials brought to and reviewed by the committee. Linkages must also be created to IT support units, public information offices, academic governance entities, and provost/VPAA, president/chancellor, and governing board offices. 

MacGregor, K. University World News. November 30th, 2023. 

The 2023 Global Student Survey produced by Chegg.org showed that 40% of students worldwide report using GenAI in their studies, and that students want training on AI tools. Motivations for students to use GenAI include “augmenting the quality of their education and boosting skills needed to compete in the global economy”. The main reasons for using GenAI were that it helps students learn faster (53%), it includes the ability to personalize learning (35%), it reduces the cost of extra tutoring (28%), and students are not always comfortable asking professors for help (23%). 47% of students who said they use GenAI for their studies reported being concerned about receiving incorrect or inaccurate information. “65% of students worldwide said they would like their curriculum to include training in AI tools relevant to their future career”, and “51% said there should be better guidance on the acceptable use of AI tools in assessments”. Respondents indicated they were mainly using GenAI for writing tasks, “and are not yet fully leveraging the technology for STEM subjects”. “Among students who use GenAI, 44% said that their understanding of complex concepts or subjects had improved, 33% said that their academic confidence improved, and 32% said their writing skills improved”. 

Walsh, J. Times Higher Education. November 30th, 2023. 

The author of this article, Head of Student Engagement Projects at the University of Galway (Ireland), shares how the University of Galway developed an AI student engagement platform to better support student well-being and success. This enables access to on-demand support at any time of day or night. The AI-driven virtual assistant “actively responds with the most up-to-date information on topics related to student life, ensuring clarity and understanding. From fees and registration queries to mapping existing support pathways, Cara [their AI chatbot] can also facilitate timely human support when needed”. Additionally, “through on-demand surveys, sentiment analysis and real-time analysis of trending topics and student queries, Cara has highlighted unexpected issues that [the University] would not have identified previously”. Finally, AI tools helped streamline administrative tasks and information management: “by liberating staff from repetitive queries, we have enabled them to build more meaningful relationships and deliver personalized support to students”. 

Joint Council for Qualifications. November 2023. 

The JCQ published guidance related to AI and protecting the integrity of qualifications. “Students are expected to demonstrate their own knowledge, skills and understanding as required for the qualification in question and set out in the qualification specification. This includes demonstrating their performance in relation to the assessment objectives for the subject relevant to the question/s or other tasks students have been set. Any use of AI which means students have not independently demonstrated their own attainment is likely to be considered malpractice. While AI may become an established tool at the workplace in the future, for the purposes of demonstrating knowledge, understanding and skills for qualifications, it’s important for students’ progression that they do not rely on tools such as AI.” It lists a series of examples of AI misuse, and states that “AI misuse constitutes malpractice as defined in the JCQ Suspected Malpractice: Policies and Procedures. The malpractice sanctions available for the offences of ‘making a false declaration of authenticity’ and ‘plagiarism’ include disqualification and debarment from taking qualifications for a number of years”. The document lists a series of recommendations for centres, such as explaining the importance of students submitting their own independent work, updating the centre’s malpractice/plagiarism policy to acknowledge the use of AI, and ensuring that teachers and assessors are familiar with AI tools. 

Shepperd, P. Jisc National Centre for AI. December 1st, 2023. 

In this blog post, the author addresses the following myths regarding GenAI: 1) that anyone can use GenAI; 2) that GenAI can create perfect and original content; 3) that AI detectors can distinguish between human and AI generated content, to check if learners have used AI; 4) that GenAI tools such as ChatGPT and Google Bard are learning from our prompts; 5) that qualification bodies have banned the use of all AI; 6) that GenAI has bias and can create false information; 7) that it is possible to redesign all assessments to outwit GenAI; and 8) that anything can be input into GenAI. 

Matei, S. A. Times Higher Education. November 28th, 2023. 

In this article, the author, who is associate dean of research and graduate education at Purdue University’s College of Liberal Arts, questions if the way GenAI tools are being trained is appropriate for use in the academy. Instead, he suggests creating AI agents that cater to academic needs. 

Barnabas, E. and Kotran, A. Honolulu Civil Beat. December 3rd, 2023. 

In this article, the authors make the case for the need to integrate AI training in education, starting as early as K-12: “To ensure a sustainable talent pipeline, AI education must be integrated into all aspects of our education system – not only in terms of coursework, but also in the ways teachers teach, and students learn – beginning at K-12, to prepare our youth for the age of AI”. “Nearly every practitioner in every field will need some level of AI comprehension to remain competitive within the workforce. […] Unfortunately, the demand for the talent pool that has this knowledge far outstrips the supply – highlighting the need for an education system that can build and sustain an AI talent pipeline”. 

Ward, D. et al. Inside Higher Ed. December 1st, 2023. 

Professors and administrators from five major public universities in North America collaborated to provide advice on how to move ahead with AI in the classroom. They argue that “the academy can no longer live in denial” and “must help students learn to use GenAI in effective, ethical ways in their learning and, eventually, in their careers and lives as citizens”. “The academy must step off the sidelines and into the scrum, especially as AI continues to change rapidly”. They provide simple steps for instructors to follow to begin moving students and classes into the age of GenAI: 1) creating syllabus language about AI use; 2) drafting a plan with students; 3) planning conversations about AI; 4) embedding exploration of ethical practice into assessments; 5) creating an exploratory assignment; and 6) taking this approach with all assignments. The authors then suggest five approaches to deepen the use of GenAI as a teaching and learning tool: 1) evaluating assignments through the lens of GenAI; 2) adding methods or reflection components to assignments; 3) allowing students to work on assignments and activities in class using AI; 4) doing oral checks of students’ understanding; and 5) adopting authentic assessments. 

Davis, J. Inside Higher Ed. November 30th, 2023. 

The author of this article, an assistant professor of philosophy at the University of Georgia, shares how ChatGPT has “introduced new tensions to professors’ dual roles as educators and assessors”. Indeed, he argues that “in order to avoid ChatGPT, professors are now tasked with introducing substantial changes to their assessment methods. They are all but forced to do this even when doing so is substantially less effective in terms of student learning. That is, ChatGPT has forced professors to place greater emphasis on their role as assessor over that of educator.”

Bourjaily, P. Times Higher Education. December 5th, 2023. 

In this article, an associate professor of instruction at the University of Iowa shares ways in which faculty can facilitate more effective use of ChatGPT for writing assignments. These include emphasizing the importance of input; creating an AI prompting plan; asking students to follow up to improve generated content; letting AI assist with hard-to-teach topics; stressing the importance of collaboration with AI; and taking the pressure off themselves.

More Information

Want more? Consult HESA’s Observatory on AI Policies in Canadian Post-Secondary Education.

Was this email forwarded to you by a colleague? Make sure to subscribe to our AI-focused newsletter so you don’t miss the next ones.
