HESA’s AI Observatory: What’s new in higher education (Sept. 29th, 2023)

Spotlight

Good afternoon all,

Thanks to those of you who joined us for our AI Roundtable on Pedagogy and curriculum last Tuesday, and for sharing your knowledge and experience in the chat. 

It was interesting to learn more about the possibilities of GenAI in developing teaching and learning materials. From drafting learning objectives and lesson plans to building rubrics and quizzes, there are clearly many ways GenAI could support instructors. However, many questions remain, notably with respect to how best to achieve student learning outcomes. It is still early days, and more research is required to fully understand how the use of GenAI tools will affect student learning and the development of students' abilities and knowledge.

In the meantime, we can gain some great insight by continuing to cautiously experiment with these tools, and remaining open to seeing how they can improve some aspects of teaching and learning. For instance, GenAI can help develop individualized learning materials to better meet specific needs, or act as a helpful learning companion.

While it might be too early for a large-scale adoption, now is definitely the time to be having these discussions. Make sure to join us for our next AI Roundtable on Student perspectives, on October 24th (see below)!

In the meantime, we’ll continue sharing with you the latest developments on AI policies in higher education. See anything that could be added to our AI Observatory? Please share it with us by email or via this online form.

Finally – if you want to keep receiving these weekly AI-focused emails, this is your last call to opt in by clicking here or on the button below. Starting next week, we’ll only send them to those who subscribed to that specific list.

Next Roundtable Meeting

Date: Tuesday, October 24th, 2023
Time: 12:00-1:00 PM ET

Join us on October 24th, from 12:00PM to 1:00PM ET, for our next AI Roundtable meeting, focused on Student perspectives. During this session, a panel of students from across the country, representing a wide range of perspectives, will share their opinions on GenAI in higher education. The panel will be facilitated by our team at Higher Education Strategy Associates. We’ll share more information about our panelists in the upcoming weeks. Register now for free to save your spot!

If you missed our last AI Roundtable on Pedagogy and curriculum, you can watch the recording here.

Policies & Guidelines Developed by Higher Education Institutions

Tags: Guidelines, Academic integrity, Pedagogy, Canada

University of Northern British Columbia’s (Canada) guidelines for instructors on AI, machine learning, and generative technologies state that instructors can decide whether or not to allow the use of GenAI in their courses. They list a series of actions instructors can take to discourage students from using GenAI: updating the syllabus, talking with students about academic integrity, being transparent about assignments, reconsidering the approach to grading, shifting from extrinsic to intrinsic motivation, and using these technologies as educational tools. The guidelines also highlight the alarming issues surrounding AI-detection tools.

Tags: Guidelines, Academic integrity, Governance, Canada

Cégep du Vieux Montréal’s (Canada) Charter on Artificial Intelligence (in French) lists a series of principles and values that should guide the use (or not) of AI: respect for the individual and their privacy; transparency; justice; support for pedagogy and success; and AI literacy. The charter also includes a series of questions to ask in order to reflect on these principles and values.

Tags: Guidelines, Pedagogy, North America

Texas State University’s (United States) Artificial Intelligence (AI) in Academia: Resources for Faculty presents a list of opportunities and concerns related to AI and ChatGPT, as well as some general advice, for instance on how to update syllabi for ChatGPT. It also highlights the importance of teaching students how to use AI.

News & Research

Government of Canada. September 2023.

This document provides federal institutions with guidance on the use of GenAI. “Federal institutions should explore potential uses of GenAI tools for supporting and improving their operations. However, because these tools are evolving, they should not be used in all cases. Federal institutions must be cautious and evaluate the risks before they start using them. The use of these tools should be restricted to instances where risks can be effectively managed.” In deciding whether and how to use GenAI, federal institutions and public servants must consider the following principles: Fair, Accountable, Secure, Transparent, Educated, and Relevant. The guidance also presents a list of potential issues (e.g., with respect to the protection of information, biases, quality, autonomy, legal risks, and environmental impacts), along with best practices for responding to them.

Malmström, H., Stöhr, C. and A. Wanyu Ou. Chalmers Studies in Communication and Learning in Higher Education. 2023.

5,894 students from across Swedish universities were surveyed about their use of, and attitudes towards, AI for learning purposes. More than a third of respondents use ChatGPT regularly, and many students say that AI makes them more effective learners. While more than 60% believe that using chatbots during examinations is cheating, approximately the same proportion is against prohibiting AI in educational settings. “Asking a chatbot to write an entire text and submit that text as one’s own work is perceived as cheating but using AI to study and prepare for exams is not.” Alarmingly, most respondents didn’t know whether their institution had rules or guidelines regarding the responsible use of AI. Respondents also noted that chatbots can be helpful for students with disabilities, and that better integration of AI in higher education could make it possible to better support students’ individual learning needs.

Pôle montréalais d’enseignement supérieur en intelligence artificielle. August 23rd, 2023. 

On May 31st, nearly 130 representatives from higher education institutions belonging to the Pôle montréalais d’enseignement supérieur en intelligence artificielle gathered at the Université de Montréal to deliberate about AI in higher education and to produce concrete recommendations for institutions and decision-makers. Nine recommendations emerged, including dedicating sufficient resources to adapting GenAI tools to the needs of teaching and learning, bringing together experts from the education community to provide guidance on the development of AI in teaching and learning, and having higher education institutions adopt clear guidelines regarding GenAI.

Group of Eight Australia. September 19th, 2023. 

The principles developed by the Go8 to guide the approach to GenAI tools across their universities are the following: 1) Maintain academic excellence and integrity in teaching, learning, assessment, and research; 2) Promulgate clear guidelines for the appropriate use of GenAI by academic staff, researchers, and students; 3) Develop resources to empower students, academic staff, and research staff to engage productively, effectively, and ethically with GenAI; 4) Ensure equal access to GenAI; and 5) Engage in collaborative efforts to exchange and implement best practices as GenAI continues to evolve.

Ferguson, K. Western News. September 27th, 2023.

Western University has just announced the creation of what is believed to be the first-ever chief AI officer at a Canadian university. Mark Daley is an AI researcher and respected leader in neural computation, with experience in academic administration and as vice-president of research at the Canadian Institute for Advanced Research. Daley’s five-year term will begin on Oct. 15th. His mandate will be to develop and implement a university-wide AI strategy that supports the university’s academic mission and research objectives.

Coffey, L. Inside Higher Ed. September 28th, 2023.

One of the key ideas presented at the University of Central Florida’s Teaching and Learning With AI conference was to create “an AI advisory board that brings together students, faculty and staff for open conversations about the new technology”. Some institutions have already created their own AI boards, including the University of Louisville, Stanford University, Vanderbilt University, Northeastern University, and the University of Michigan. Interviewees agree that important factors in making such boards work include ensuring diversity of thought by including both faculty and staff, making sure to include students, looking to technology leaders, staying flexible rather than setting rigid rules, considering both an internal board and an external board, talking with other institutions to share and learn from one another, and focusing on what’s best for your institution.

Provost, A. Le Devoir. September 29th, 2023.

Some higher education institutions are starting to use DALIA, an AI tool that predicts students’ risk of failure and dropout. The tool helps identify students in difficulty so they can be connected with appropriate support. DALIA is already used by around 15 CÉGEPs in Québec, and it is expected to be deployed in around 40 institutions. Students must sign a consent form, and they can opt out at any time. At Collège Bois-de-Boulogne and Cégep Ahuntsic, more than 95% of students agreed to sign. The provincial student association, the Fédération étudiante collégiale du Québec, urges institutions to be cautious. For instance, such an AI tool shouldn’t be used for admissions, because it might reinforce discrimination; nor should it be made accessible to instructors, because it might lead to a Pygmalion effect.

Kim, J. Inside Higher Ed. September 29th, 2023.

In this piece, the author reflects on “All in on AI: How Smart Companies Win Big With Artificial Intelligence”, by Tom Davenport and Nitin Mittal. Even though the book itself says little about AI in higher education, the author argues that it prompts valuable reflection, notably on “what might a university that is all in on AI look like?”. Both the book and this piece emphasize the importance of getting a handle on data in order to better train AI. The author argues that for institutions to go all in on AI, the first step would be to put data governance and management at the core of the organizational leadership structure.

US Department of Education’s Office of Educational Technology. May 2023. 

Following listening sessions that were attended by more than 700 constituents, the US Department of Education produced a report to summarize key takeaways and share recommendations. The report states that policies are urgently needed to implement the following: 1) Leverage automation to advance learning outcomes while protecting human decision making and judgement; 2) Interrogate the underlying data quality in AI models to ensure fair and unbiased pattern recognition and decision making in educational applications, based on accurate information appropriate to the pedagogical situation; 3) Enable examination of how particular AI technologies, as part of larger edtech or education systems, may increase or undermine equity for students; and 4) Take steps to safeguard and advance equity, including providing for human checks and balances and limiting any AI systems and tools that undermine equity. The report’s recommendations are as follows: 1) Emphasize humans in the loop; 2) Align AI models to a shared vision for education; 3) Design using modern learning principles; 4) Prioritize strengthening trust; 5) Inform and involve educators; 6) Focus R&D on addressing context and enhancing trust and safety; and 7) Develop education-specific guidelines and guardrails.

Singh, S. Times Higher Education. September 29th, 2023.

Shweta Singh, Assistant Professor of Information Systems and Management at the University of Warwick, writes about how GenAI and its inherent biases can affect academic freedom. She also makes the case for better supporting academics who are doing research on AI.
