HESA’s AI Observatory: What’s new in higher education (Nov. 3rd, 2023)

Spotlight

Good afternoon,

At last week’s AI Roundtable, we had the pleasure of hearing student perspectives on GenAI. Our panelists shared that they have noticed the same polarized views among students as those seen across faculty: while some students remain skeptical about the tools and prefer to stay away from them, others are fully embracing them and have started to experiment with GenAI beyond academic purposes, using these tools to increase efficiency in many facets of their lives.

One of our panelists, who started their program after the emergence of ChatGPT, made us realize that some students will only have known higher education with access to GenAI. 

GenAI seemed to be discussed at varying levels across institutions. In some cases, there was little to no direction on how GenAI could be used, leaving students confused about whether, and how, they could use the tools. Other institutions seemed to have developed plenty of resources; however, the question remained whether students were actually made aware of them.

Panelists also voiced concerns about equity and the inherent biases of GenAI tools. When asked for their opinion on institutions starting to use GenAI, they responded that institutions could experiment with the tools to increase their efficiency, but should be particularly cautious about the potential perpetuation of social inequities and the potential impacts on student access and success.

Finally, the panelists reiterated the importance for institutions to build clear and comprehensive policies regarding GenAI – and for students to be involved in the policy-making process. 

If you missed it, you can watch the recording of the panel here.

Next Roundtable Meeting

Date: Tuesday, November 21st, 2023
Time: 12:00-1:00 PM ET

Join us on November 21st, from 12:00 PM to 1:00 PM ET, for our next AI Roundtable meeting, focused on Inclusion. During this session, a series of guest speakers will discuss critical perspectives on GenAI tools and how they can perpetuate inequities, as well as how these tools can be used to support accessibility and inclusion for individuals living with disabilities within the higher education context. The session will be facilitated by Lan Keenan, President of the Schulich Disability Alliance at Dalhousie University’s Law School. We’ll share more information about our guest speakers in the upcoming weeks. Register now for free to save your spot!

If you missed our last AI Roundtable on Student Perspectives, you can watch the recording here.

Policies & Guidelines Developed by Higher Education Institutions

Tags: Guidelines, Academic integrity, Pedagogy, Canada

University of Manitoba’s (Canada) resources mention that instructors may use GenAI for pedagogical purposes. However, since the tools have not yet been vetted by the University for privacy or security, requiring students to use them is discouraged. The resources provide guidance for instructors who decide to ask or encourage students to use GenAI tools in their courses, such as never inputting confidential information into GenAI tools, offering alternative forms of assessment for students who are opposed to using AI tools, and more. “Instructors are strongly encouraged to speak to their students about what tools, if any, are permitted in completing assessments.” The University of Manitoba discourages the use of AI-detection tools.

Tags: Statement, Guidelines, Academic integrity, Pedagogy, Oceania

James Cook University’s (Australia) statement on GenAI and its use reads as follows: “At JCU, we measure our impact by the success of our students. Preparing our students for the future of work will mean ensuring they have the skills for lifelong learning, and we acknowledge that GenAI tools will become an embedded part of future ways of working. Our students’ success as graduates will be underpinned by proactive educational opportunities that allow them to engage with emerging and evolving technologies, including GenAI, with an emphasis on the development of the general capabilities and skills that foster their professional expertise, critical thinking, evaluation, and intellectual curiosity. At JCU, the use of GenAI in learning, teaching, and assessment is required to be ethical, pedagogically sound, transparent, and purposeful.” JCU’s online resources include guidance on designing assessments that minimize academic misconduct, investigating inappropriate use of GenAI, examples of how to work with AI, as well as ethical and integrity considerations.

Tags: Guidelines, Academic integrity, Pedagogy, North America

University of San Francisco’s (US) Guidelines for GenAI and Academic Integrity state that it is important for faculty to become familiar with GenAI tools and their applications in order to better determine clear criteria for student use specific to their course context. The guidelines mention that instructors should address parameters for GenAI use in syllabus statements or for specific assignments by providing a brief description of GenAI, as well as clear guidelines for its use, including limitations and consequences. More resources are available on USF’s Generative AI Hub.

Tags: Guidelines, Academic integrity, Canada

Algonquin College’s (Canada) Guidelines for students on the responsible use of GenAI mention that inappropriate use of GenAI tools can include: altering your writing style to the point that it is not recognizable as your own; using these tools to mask plagiarism; and submitting work you cannot sufficiently explain or understand as a result of using these tools. The guidelines also indicate how to properly cite GenAI.

Tags: Guidelines, Academic integrity, Canada

Emily Carr University of Art + Design’s (Canada) Guidelines for Working with Generative AI in your classes provide the following guidance: 1) talk openly about GenAI in your classes; 2) encourage students to consider or explore the ethical implications of GenAI; 3) help students understand the limits and limitations of these platforms; 4) make expectations clear in course outlines and assignments; 5) communicate how and when students should document their use of GenAI; and 6) help students protect their privacy when using GenAI. 

News & Research

Tyton Partners. Sponsored by Turnitin. Fall 2023.

Earlier in the fall, Tyton Partners conducted a survey on GenAI writing tools that reached over 1,000 higher education faculty and 1,600 current post-secondary students. The survey followed up on a previous one conducted in March 2023. Results show that the use of GenAI continues to grow: half of students and almost a quarter of faculty are regular users of GenAI. The report shows the top 10 uses of GenAI by students and faculty. The most common uses among daily student users include summarizing or paraphrasing text, organizing schedules, answering homework questions, and supporting job applications. Non-daily users use it mostly to understand difficult concepts. As for faculty, they are mostly running prompts through GenAI tools to see what students see, or teaching students to make effective use of GenAI tools. While half of students believe GenAI will have a positive impact on learning outcomes, faculty members remain more pessimistic, although less so than they were in March.

There are tensions between how students would like to use GenAI in their studies and what is permitted by their institution: 75% of students indicated that they are at least somewhat likely to continue using GenAI even if their institution or professor banned the technology.

Finally, faculty revealed that GenAI policies were currently primarily made at the institutional (25%) and individual course (21%) levels. “AI-using faculty are more likely to regulate GenAI (57% vs. 45% of non-AI using faculty) and less likely to ban it (7% vs. 23% of non-AI using faculty) indicating that using GenAI gives faculty a more nuanced perspective”.

Greenfield, N. University World News. October 20th, 2023.

Evaluating admission essays is subject to many human biases. Ten researchers developed an algorithm to “read college and university application essays and determine pro-social and leadership qualities”. Since “the algorithm uses the exact same judgement call for every essay”, they hope to reduce biases in admission processes. The correlations between human evaluation and the results produced by the algorithm when assessing seven personal qualities (pro-social purposes, leadership, learning, goal pursuit, intrinsic motivation, teamwork, and perseverance) were statistically significant.

Cline, S. Times Higher Education. July 20th, 2023.

In this article, Sara Cline, professor of biology at Athens State University, shares strategies for changing the way we approach discussion boards to prevent students from simply entering the prompt into ChatGPT. The strategies include: 1) using prompts that force a personal opinion; 2) having students include their source(s) as an attachment; 3) using current or local events; 4) having students take and caption a photo of themselves; 5) having students draw a diagram or chart; 6) having students build and explain a 3D model; 7) including timestamps from lecture videos; or 8) scrapping the discussion boards altogether.

Morsch, L., Gribbins, M., and Boles, E. The EvoLLLution. October 24th, 2023.

This article presents ways to design assignments with GenAI tools in mind so that it is possible to harness their potential while maintaining academic integrity. These strategies include: 1) encouraging student collaboration; 2) fostering intrinsic motivation; 3) assigning AI-assisted tasks; 4) critically analyzing AI output; and 5) encouraging ethical AI use.

McIntosh, A. EdTech. October 20th, 2023.

AI was one of the most talked-about topics at the recent EDUCAUSE annual conference in Chicago. Many sessions focused on the potential impact of AI on student learning outcomes. Presenters focused on how AI can enable adaptive learning by using data analysis to adapt courses each semester based on student needs. Additionally, AI-powered chatbots “can answer student questions at any hour of the day, and individualized AI tutors can provide customized academic support based on each student’s progress in a given class”.

More Information

Want more? Consult HESA’s Observatory on AI Policies in Canadian Post-Secondary Education.

Was this email forwarded to you by a colleague? Make sure to subscribe to our AI-focused newsletter so you don’t miss the next ones.
