HESA’s AI Observatory: What’s new in higher education (Sept. 22nd, 2023)

Spotlight

Good afternoon all,

If you’ve been reading our AI-focused emails in the past few weeks, you’ll know that we’ve been noting that higher education institutions are mainly focusing their policy efforts on developing guidelines about generative artificial intelligence (GenAI) in teaching and learning (T&L), while staying relatively silent on whether (and how) to integrate AI into business operations.

However, the lack of guidance doesn’t mean AI isn’t increasingly being used for operational purposes in institutions. We’ve already shared many articles highlighting how GenAI tools have started to be used for general administration, marketing, admissions, and course design. GenAI tools are also being used to better connect students with support services and to improve career services.

We’d love to hear how GenAI tools are currently being used in your institutions – outside T&L. If you have any success (or horror) stories to share, or if your institution is one of the few that has actually developed guidance on the use of GenAI for business operations, please send them our way (by email or via this online form). We’d love to dive deeper into the business side of things so that our AI Observatory can better support higher education institutions on that front.

Also, make sure to opt in by clicking here or on the button below if you want to keep receiving these weekly AI-focused emails. After the end of next week, we’ll only send them to those who subscribed to that specific list.

Next Roundtable Meeting

Date: Tuesday, September 26th, 2023
Time: 10:00-11:00AM ET

Last call to join us next Tuesday, on September 26th, from 10:00AM to 11:00AM ET, for our next Roundtable meeting focused on Pedagogy and curriculum. This meeting will be facilitated by Grant Potter, Instructional Designer at UNBC’s Centre for Teaching, Learning, and Technology. We’ll also welcome Paddy Shepperd, Senior AI Specialist at Jisc’s National Centre for AI, as guest speaker. Register now for free to save your spot!

In the meantime, catch up on our previous AI Roundtable meetings here.

Policies & Guidelines Developed by Higher Education Institutions

Tags: Guidelines, Academic integrity, Pedagogy, Canada

University of Guelph’s (Canada) Provisional Recommendations for the Use of Generative AI followed the release of an institutional Statement on Artificial Intelligence Systems, ChatGPT, Academic integrity. The latter states that acceptable use of AI should be determined by the course instructor, and that unauthorized use of AI constitutes academic misconduct. The provisional recommendations mention that instructors should clearly communicate to their students whether and to what extent GenAI use is acceptable in their course (via the course outline, in the learning management system, verbally in class, and in assessment descriptions). The recommendations also include tools to adapt assessments to incorporate GenAI. Finally, they state that AI detection software is not currently approved for use at UofG. UofG has also developed additional resources, such as a Tool for Determining Allowable Uses of AI with Writing Assignments and a list of Annotated Resources on GenAI.

Tags: Guidelines, Academic integrity, Pedagogy, Canada

University of Waterloo’s (Canada) guidance on Using ChatGPT and other Text-Generating Artificial Intelligence states that the use of GenAI should be guided by productivity and ethics. GenAI tools should help students learn and support the development of their skillset, rather than replace them in doing the work. GenAI tools should also only be used with permission, and students must be transparent about their use. The guidance also outlines the risks of using GenAI, such as the lack of sources, issues with copyright and intellectual property, privacy risks, and the replication of biases. UofWaterloo also provides resources on how to use GenAI in writing processes, such as when brainstorming, drafting, revising and editing, and documenting and citing.

Tags: Guidelines, Academic integrity, Pedagogy, North America

Yale University’s (US) AI Guidance includes the following recommendations for instructors: (1) Instructors should be direct and transparent about what tools students are permitted to use, and about the reasons for any restrictions; (2) Controlling the use of AI writing through surveillance or detection technology is probably not feasible; and (3) Changes in assignment design and structure can substantially reduce students’ likelihood of cheating—and can also enhance their learning. The guidance also recommends including an academic integrity statement in syllabi and addressing the use of GenAI tools in that statement. Yale also provides a list of Examples from Yale Instructors on how to incorporate AI in teaching.

Tags: Guidelines, Academic integrity, Pedagogy, North America

Ohio State University’s (US) guidance AI: Considerations for Teaching and Learning covers benefits and limitations of GenAI, and provides examples of how to use AI to support learning activities. It also encourages instructors to set clear expectations for students’ use of AI in their syllabi and to discuss these expectations with students openly; to design transparent assignments and activities, and share explicit assessment criteria for them; to involve students in decision-making to give them ownership of AI policies and assessment criteria for their courses; and to discuss the ethical implications of AI in real-world contexts beyond the classroom. The guidance also supports instructors in (re)designing assessments to promote academic integrity.

Tags: Guidelines, Academic integrity, Pedagogy, Sub-Saharan Africa

University of Cape Town’s (South Africa) guidance on Artificial Intelligence for Teaching & Learning includes two guides for staff, which cover the challenges of (re)designing assessments and ensuring academic integrity, as well as integrating GenAI in T&L, and a student guide focused on how to use GenAI ethically. In addition, UCT has established a Working Group on AI Tools in Education, which focuses on raising awareness about AI and communicating the latest developments on the issue, developing training and workshops for teaching with AI, promoting AI literacy, reviewing the academic misconduct policy and testing the use of AI detectors, collecting data on the use of AI in T&L, developing recommendations regarding AI governance, and exploring how AI could be used to improve student support.

News & Research

Plé, L. Times Higher Education. September 22nd, 2023.

In this article, the author, Director of Pedagogy and Head of the Centre for Educational and Technological Innovation at IÉSEG School of Management (France), argues that institutions shouldn’t be wondering whether to trust students with GenAI: “Posing such a question might lead to the use of AI detection tools whose effectiveness is, at best, limited. This can lead to unfair processes and decisions.” Rather, “trust must be built jointly with students. This is why looking beyond the simple use or misuse of AI in formal assignments and guiding students in the many varied, complex and ethical applications of GenAI is essential and will contribute to building trust”. Within his institution, administrators, professors and students collaborated in adapting their plagiarism policy to AI.

Jisc National Centre for AI in Tertiary Education. August 2023.

This report was developed following five discussion forums with tertiary education students in the UK. The goal of these discussions was to better understand how students are using GenAI tools, as well as the potential impact of these tools on their learning experience. The findings focus on five areas of interest: current usage, student concerns, the role of GenAI in assessments, how best to support students, and the significance of GenAI.

Sharma, Y. University World News. September 13th, 2023.

After an initial period of caution, Joe Qin, president of Lingnan University (Hong Kong), is now embracing GenAI and has recently purchased a license for the third version of ChatGPT for the whole university to use. His intention is to make sure students and faculty develop AI literacy and can use GenAI responsibly – especially since AI competency will become increasingly critical in the workplace. Qin, who admits to using ChatGPT himself, intends to extend the use of AI beyond T&L, including in research and university governance.

Jaschik, S. Inside Higher Ed. May 15th, 2023.

As higher education institutions increasingly incorporate AI into their operations, so do admissions offices. “Many institutions receive tens of thousands of applications each year, making it a Herculean task for human evaluators to sort through and assess each one. This is where AI comes into play – software can quickly and accurately analyze large swathes of data, enabling universities to streamline their admissions process and spend more time focusing on the finer aspects of student evaluation”. However, institutions remain cautious and agree that AI shouldn’t make the final call on whether to admit a given applicant.

Grove, J. Inside Higher Ed. September 22nd, 2023.

The author of this article interviewed multiple Nobel Prize winners to get their opinions on the use of GenAI in research. As in any field, opinions are mixed. While some believe that GenAI could help push the boundaries of what research can achieve and that researchers are clever enough to critically analyze any AI-produced output, others are concerned about phenomena such as hallucinations and the lack of transparency in the data sets being used. Some advocate for legislators to require that LLMs indicate the margin of certainty of the output produced, and acknowledge when sources reach contradictory conclusions.
