HESA’s AI Observatory: What’s new in higher education (Feb. 16th, 2024)

Spotlight

Good afternoon all, 

In today’s newsletter, we share articles covering a broad range of topics – from taking a whole institution approach to GenAI, to using GenAI in research. Some also cover how GenAI might impact assessments and feedback, and particular considerations related to using GenAI in creative work and work-based learning.

Also, ICYMI – HESA recently launched its AI Advisory Services. If your institution is struggling to develop its response to this new technology, or is simply in search of an extra hand, we might be able to help. Reach out to our team to discuss ways in which we could support your efforts.

Next Roundtable Meeting

Date: Tuesday, February 20th, 2024
Time: 12:00-1:00 PM ET

Join us next Tuesday, February 20th, from 12:00 PM to 1:00 PM ET, for our next AI Roundtable, which will focus on opportunities, risks and considerations surrounding the use of GenAI tools in research, through a research integrity lens. We will have the pleasure of welcoming Dr. Sarah Elaine Eaton, Associate Professor at the Werklund School of Education at the University of Calgary, Editor-in-Chief of the International Journal for Educational Integrity and Council member of the Committee on Publication Ethics (COPE), who will share preliminary reflections on the issue before we open up the discussion to participants. Participants will be asked to complete a short survey upon registration to help focus the discussion. This session will be facilitated by Amanda McKenzie, Director of the Office of Academic Integrity at the University of Waterloo. Register now (it’s free!) to save your spot!

If you missed last month’s AI Roundtable, you can watch the recording here.

Are you a faculty member or an instructional designer who has been using GenAI tools in teaching and learning in innovative and practical ways? We invite you to present during the “community poster” session at our Pedagogy and Curriculum roundtable in April (date TBD). Mention your interest here.

News & Research

Attewell, S. Jisc National Centre for AI. February 1st, 2024

Jisc put together a list of elements and questions to consider when adopting a comprehensive, institution-wide approach to operational AI. “Adopting a comprehensive, institution-wide approach to operational AI is vital to ensure that all departments and stakeholders are aligned and contributing to a unified vision. It facilitates the integration of AI across the institution and enhances innovation. By involving staff and students in the operational process, you can address diverse needs and perspectives, fostering an environment of collaborative learning and adaptation. This holistic approach helps in navigating the ethical and practical challenges associated with its deployment in an educational setting”. Their list of questions covers planning and leading, supporting student skills for an AI-enabled workplace, supporting staff skills to enhance AI benefits and efficiency, preserving academic integrity while developing student AI skills, committing to safe, ethical and responsible use, and providing students with equitable access to AI tools.

Robert, J. Educause. February 12th, 2024

Educause recently published its AI Landscape Study, for which it polled 910 individuals working within universities. The study highlights that most institutions are working on AI-related strategy, with only 11% of respondents saying that no one at their institution was working on an AI-related strategy. Respondents indicated that their institutions are providing AI-related training for faculty (56% of respondents), staff (49%) and students (39%). More than half of the respondents (56%) indicated they had been given responsibilities related to AI strategy. Stakeholders also shared that they lacked awareness of AI-related sentiments, strategy and policy across their institutions. Respondents indicated various appropriate uses for AI in higher education, such as providing personalized student support, acting as an administrative assistant, conducting learning analytics, and supporting digital literacy training. However, inappropriate uses included using outputs without human oversight, failing to disclose AI use, and failing to properly protect data security.

Van Noorden, R. and Perkel, J. M. Nature. September 27th, 2023

Nature surveyed more than 1,600 researchers about their perceptions of AI’s impact on research. “Focusing first on machine-learning, researchers picked out many ways that AI tools help them in their work. From a list of possible advantages, two-thirds noted that AI provides faster ways to process data, 58% said that it speeds up computations that were not previously feasible, and 55% mentioned that it saves scientists time and money. […] The survey results also revealed widespread concerns about the impacts of AI on science. From a list of possible negative impacts, 69% of the researchers said that AI tools can lead to more reliance on pattern recognition without understanding, 58% said that results can entrench bias or discrimination in data, 55% thought that the tools could make fraud easier and 53% noted that ill-considered use can lead to irreproducible research.”

Nicholson, S., Gibbs, B. and Chakraborty, M. Wonkhe. January 24th, 2024 

The authors of this article urge special caution when embracing GenAI, particularly in the areas of creative work and work-based learning. The use of GenAI tools might degrade the learner’s creative journey, and feeding creative work into GenAI tools may effectively place it in the public domain, which could raise intellectual property issues. As for work-based learning, confidentiality is sometimes crucial: many students will be asked to sign NDAs, and uploading any such material to GenAI tools could compromise the confidentiality agreed upon and become a cause for legal concern.

Felix, J. and Webb, L. UK Parliament Post. January 23rd, 2024

This note provides an overview of issues regarding the use of GenAI in education delivery and assessment. Amongst other things, it covers how GenAI can be used to reduce educators’ workloads, to personalize education and to provide educator training. It also lists concerns regarding the use of AI by educators in assessment (e.g., for grading exams and coursework). 

Bekker, M. University World News. February 8th, 2024

In this article, Martin Bekker, from the School of Electrical and Information Engineering at the University of the Witwatersrand, shares “a five-tier system to simplify thinking around permissions and prohibitions related to using LLMs for academic writing”. While tier 1 is techno-pessimistic, tiers 2 and 3 are technology-embracing, and tiers 4 and 5 lean towards AI hype. “For scientists across every branch of knowledge, mental panic and ossification remain our nemeses. We gain most by seeing neither cataclysmic doom nor total redemption in technology, but, instead, recalibrating a new technology’s value based on what it can change, and what it can’t”.

Erduran, S. University World News. February 10th, 2024

In this article, Sibel Erduran, professor of science education and Director for Research in the Department of Education at the University of Oxford, reflects on how AI is influencing science education. “Scientists are relying on AI tools in designing experiments, generating hypotheses and interpreting data. Practically all aspects of the knowledge generation practices of science, from writing of manuscripts to modelling of data, are being influenced by AI. […] It is imperative, then, that when university science education introduces students to AI, that it includes a component where they not only use the tools to solve scientific problems, but also have the space to ‘think’ about AI and its impact on science.” According to the author, students need to be exposed to AI early in their education. The author recommends that curricula have explicit components that deal with AI and its impact on science. She also states that the approach to innovation in the use of AI needs to be evidence-based.

Francis, N. University World News. February 10th, 2024

In this article, Dr. Nigel Francis, a lecturer in the School of Biosciences at Cardiff University, shares how GenAI can be leveraged to improve student assessment. Amongst other examples, he talks about personalizing assessments so they adapt to students’ learning patterns and performance, focusing more on the process than on the final product, and assessing soft skills. “The implementation of GenAI in assessments does necessitate a shift in pedagogical paradigms. It requires educators to become facilitators and co-learners, engaging with AI to curate and refine assessment strategies. […] Institutions must invest in training and infrastructure to fully realize the benefits of this technology.”

 

Warner, J. Inside Higher Ed. February 5th, 2024

Here, the author shares a reflection on how instructors provide feedback to students. “Along my multiyear journey of evolving my pedagogy, it became rather clear that in the context of what I was asking students to do during that period—write by using a prescription that results in a good grade—many of my comments were literally meaningless when it came to students improving as writers. The comments primarily existed to justify the grade, to explain my thinking that resulted in a particular score. […] This evolution immediately changed the kind of feedback I was giving as I shifted to a mode where I was responding not as a teacher evaluating an assignment to assign a grade, but as a reader who was responding to the text as readers do, with thoughts, feelings and ideas of our own.” He then concludes that “allowing machine learning algorithms (like ChatGPT) to evaluate student writing should be a nonstarter, because these algorithms cannot think, feel or communicate with intention”.


Schroeder, R. Inside Higher Ed. February 14th, 2024

This article is a great read if you want a refresher on the GenAI tools currently available and their capabilities. The author shares his favourite LLMs and how he makes use of them. “If you get started now by using generative AI to enhance five to 10 daily searches for information, by spring break you will be confident and skilled enough to teach others.”

More Information

Want more? Consult HESA’s Observatory on AI Policies in Canadian Post-Secondary Education.

Was this email forwarded to you by a colleague? Make sure to subscribe to our AI-focused newsletter so you don’t miss future editions.
