HESA’s AI Observatory: What’s new in higher education (Jan. 19th, 2024)

Spotlight

Good afternoon all, 

If you’ve been following the growth of our AI Observatory since its launch last August, you’ve probably noticed that we’ve built an extensive database of institutional policies, statements, guidelines, and recommendations developed across the country and around the world in response to GenAI in higher education.

In parallel, we’ve been keeping track of what these documents cover (or don’t). Combined with conversations with administrators and faculty, this gives us an in-depth understanding of how GenAI is being addressed around the world, the complexities surrounding the issue, and the good practices that have been implemented at other institutions.

If your institution is struggling to develop its response to this new technology, or is simply in search of an extra hand, we might be able to help. Take a look at our AI Advisory Services and reach out to our team to discuss ways in which we could support your efforts.

Next Roundtable Meeting

Date: Tuesday, January 30th, 2024
Time: 12:00-1:00 PM ET

Join us on Tuesday, January 30th, from 12:00 PM to 1:00 PM ET, for our first AI Roundtable meeting of the year. To kick off the new year, this session will focus on how post-secondary institutions, a little over one year after the release of ChatGPT, are reacting to this ever-evolving technology. Participants will be asked to complete a short survey upon registration, and the meeting will focus on unpacking the results and discussing the key trends and challenges that emerge. This session will be facilitated by Simon Bates, Vice-Provost and Associate Vice-President, Teaching and Learning, at the University of British Columbia. Register now (it’s free!) to save your spot!

If you missed last year’s AI Roundtables, you can watch the recordings here.

Policies & Guidelines Developed by Higher Education Institutions

Tags: Guidelines, Academic integrity, Pedagogy, Canada

Toronto Metropolitan University’s (Canada) Guidance on the Use of Generative Artificial Intelligence in Teaching and Learning mentions that “authorization for the use of GenAI in coursework is always at the discretion of the instructor and expectations should be communicated to students clearly in a syllabus or other course policy statement”, and that “unless explicitly communicated by the course instructor, the use of GenAI for coursework is not permitted”. It also states that “the Academic Integrity Office does not currently endorse the use of GenAI detection tools”. Educational developers and academic integrity specialists are available for consultation to assist instructors. TMU has also put together a FAQ as well as a collection of resources for the teaching community.

Tags: Guidelines, Academic integrity, Pedagogy, North America

University of Washington’s (United States) guidance on ChatGPT and other AI-based tools lists a series of strategies that can help instructors react to GenAI: 1) Set expectations; 2) Communicate the importance of college learning; 3) Acknowledge that struggle is part of learning; 4) Discuss the social, ethical, and practical issues surrounding AI; 5) Assess process as much as (or more than) product; 6) Design assignments that ask students to connect course content, class discussion, and lived experience; and 7) Consider teaching through AI-based tools. The guidance also provides some examples of how instructors might use AI to facilitate learning, such as 1) Think-pair-AI-share; 2) Evaluating AI output; 3) Improving upon/adapting AI-generated output; 4) Explaining the steps in an AI-generated solution; 5) Visualizing concepts with AI; and 6) Exploring AI in your field. The University also provides sample syllabus statements.

Tags: Guidelines, Pedagogy, Operations, North America

University of Central Florida’s (United States) guidance on Artificial Intelligence provides some ideas of how to react to GenAI in higher education, including 1) Leaning into the software’s abilities (e.g., re-envisioning writing, refining editing skills, writing rebuttals, evaluating for bias, teaching information literacy, asking the AI to role play, overcoming writer’s block, making the AI your teaching assistant, or creating sample test questions to study for tests); 2) Using the software to make teaching/faculty life easier (e.g., creating grading rubrics, writing correspondence, generating study guides, creating clinical case studies, or creating test questions); 3) Teaching ethics, integrity, and career-related skills; and 4) Attempting to neutralize the software (by customizing assignments to reduce the potential to rely on GenAI). UCF recommends that its faculty work with CoPilot rather than other LLM-based AI tools; only faculty (not students or staff) have access to CoPilot through their UCF email. The guidance also emphasizes the importance of developing AI fluency.

News & Research

Rummel, H. AZ Central. January 18th, 2024. 

Arizona State University just announced that it will become the first university to launch a formal partnership with OpenAI. Students and faculty members will receive full access to ChatGPT Enterprise. ASU’s Chief Information Officer emphasized that the university wants to make sure that everyone has equitable access to these tools. He also stated that data entered into the ChatGPT Enterprise system won’t be used for machine learning purposes. ASU will create an ethics committee to monitor the partnership.

Blackwell, A. and Swenson-Wright, Z. Times Higher Education. January 12th, 2024.

In this article, the authors argue that AI editing systems exploit scholars by appropriating their intellectual property. “As soon as a researcher uploads a manuscript [into an AI editing system], their intellectual property – original ideas, innovative variations on established theories, newly coined terms – is appropriated by the company and will be used, likely in perpetuity, to “predict” and generate text in similar papers edited by the service (or anyone using company-provided editing tools).” The authors state that publishers should be required to name the editing companies they outsource work to, and clarify whether any data is being used for AI training.

Clark, D. Plan B. January 2nd, 2024.  

In this blog post, the author shares his opinion that the tertiary education system in the UK is in need of a ‘fresh idea’ – “a University that uses AI to create and deliver high quality online education at relatively low cost”. That University would be “based on the competence model, with a focus on skills shortages”. The author proposes 25 initial ideas, which include targeting non-traditional students in terms of age and background, personalizing learning with AI, making learning multimodal from the start, giving every teacher a chatbot available 24/7, teaching in various languages aided by AI, automating feedback and assessment, and allowing the curriculum to be completed at one’s own pace. “The full array of courses (generated in part by AI), delivered partly by AI, assessed by AI should be the aim.”

Basken, P. Times Higher Education. January 3rd, 2024. 

When announcing he’d be stepping down as president of Southern New Hampshire University, Paul LeBlanc shared concerns about the US higher education sector not taking the implications of AI seriously enough. Prof. Siemens, from the University of Texas at Arlington and the University of South Australia, criticizes academic leadership for “being asleep at the wheel”: “There’s no foregone conclusion about how AI will shape universities, but there are people with conclusions selling services to us – and the fact that we’re not in that arena, that’s the part that alarms me most. […] Trying to answer it is a hell of a lot better than absorbing the answer that Big Tech offers me”.

Grove, J. Times Higher Education. January 16th, 2024. 

After a test round last summer, Elsevier just launched its new GenAI tool, Scopus AI, which will help researchers by quickly providing summaries of research from more than 27,000 academic journals. Scopus AI will provide “fast overviews of key topics” and has an “academic expert search which identifies leading experts in their fields”. 

Sebesta, J. and Davis, V. L. WICHE Cooperative for Educational Technologies. June 30th, 2023.

In April 2023, WCET conducted a survey in the US to determine how higher education institutions are currently using AI, what policies they have developed in response to this technology, and what opportunities and challenges relate to its use. Key findings include the following: Online and Distance Education Administrators and Staff, including Instructional Designers, are the primary roles leading this work on their campuses; the overwhelming majority of institutions do not offer incentives to encourage faculty to use AI, and a majority also reported no faculty development or training around AI; the majority of institutions lack an official strategy around the use of AI, but have developed or will be developing policies, primarily around academic integrity and instructional use; the highest percentage of existing, planned, or considered use of AI is for detecting AI-generated content/plagiarism (56% of respondents), editing (52%), and content creation (44%); the primary challenge to using AI was lack of expertise among faculty and administrators, followed closely by lack of policies and guidelines, and concerns about protecting academic integrity; and a majority of respondents identified both teaching critical digital skills and learner engagement as the top benefits of using AI to support instruction and learning.

Fischer, M. and Tasneem, A. EAB. January 9th, 2024. 

In this podcast, Michael Fischer and Afia Tasneem discuss how institutions’ perceptions of GenAI evolved over the past year. They share five promising uses for AI within academia: 1) preparing students for the future of work and changing workforce needs; 2) leveraging AI to provide real-time, personalized support to students across their life cycle; 3) enhancing productivity by boosting faculty and staff efficiency and optimizing costs; 4) maximizing enrolment and advancement yield with AI; and 5) expanding the frontiers of knowledge.

Sharma, Y. University World News. December 7th, 2023. 

At the end of November 2023, at the conference “Creative Fluency: Human flourishing in the age of AI”, many experts in AI, policy, and education warned that GenAI tools “are being incorporated into education systems without adequate consideration of the ramifications for education equality, learning outcomes, national sovereignty and culture”. There are fears that GenAI tools are starting to be used by instructors and students without an understanding of how they could impact education outcomes. There are also concerns that AI tools will reproduce hegemonic powers by imposing specific views and ways of thinking, without considering local ways of doing things. “There’s a strong need for AI education to consider the cultural context, ensuring the development and deployment of AI that reflects it, respects values and diversities of that nation”.

Jones, J. EAB. December 4th, 2023. 

In this blog post, the author explores opportunities and pitfalls of using AI in annual giving. Opportunities include using AI for content creation (e.g., tailoring marketing copy to different audiences), data analytics (e.g., identifying triggers most likely to convert prospective donors), brainstorming, and communications automation. A useful reminder is shared at the end of the blog post: “Remember these two things: you need quality data to feed it and expert professionals to manage it.”

Dietis, N. Times Higher Education. January 8th, 2024.  

In this article, the author, who is an assistant professor of pharmacology at the Medical School of the University of Cyprus, shares three practical applications of ChatGPT in the classroom to enhance the cognitive abilities of learners: 1) Problem-solving with ChatGPT; 2) Creative research brainstorming with ChatGPT; 3) Debate and argumentation training with ChatGPT. He details how each exercise works, as well as its advantages and considerations.

More Information

Want more? Consult HESA’s Observatory on AI Policies in Canadian Post-Secondary Education.

Was this email forwarded to you by a colleague? Make sure to subscribe to our AI-focused newsletter so you don’t miss the next ones.
