Spotlight
Good afternoon all,
In today’s newsletter, you can read about the UK Department for Education’s policy paper on GenAI in education; reflect on how GenAI tools can be used as assistive technologies to help students with disabilities; consult the results of surveys assessing both institutional leaders’ and students’ perspectives on GenAI tools; and discover new ways in which GenAI tools can be used by students, instructors and institutions to improve outcomes.
Next Roundtable Meeting
Date: Tuesday, November 21st, 2023
Time: 12:00–1:00 PM ET
Join us on November 21st, from 12:00 PM to 1:00 PM ET, for our next AI Roundtable meeting, focused on Inclusion. During this session, guest speakers will discuss critical perspectives on GenAI tools, including how they can perpetuate inequities, and will also address how these tools can be used to support accessibility and inclusion for individuals living with disabilities in the higher education context. This session will be facilitated by Lan Keenan, President of the Schulich Disability Alliance at Dalhousie University’s Law School. We’ll share more information about our guest speakers in the coming weeks. Register now for free to save your spot!
If you missed our last AI Roundtable on Student Perspectives, you can watch the recording here.
Policies & Guidelines Developed by Higher Education Institutions
Tags: Guidelines, Academic integrity, Pedagogy, Canada
The University of Saskatchewan’s (Canada) ChatGPT & Generative Artificial Intelligence webpage includes 10 Guidelines for Assessment Practice in a GenAI Environment: 1) learning about GenAI technologies; 2) discussing approaches and practices with colleagues; 3) designing assessments to meet course learning outcomes in ways that include acceptable uses of GenAI tools and/or reduce the likelihood of unacceptable uses; 4) explaining how students should or shouldn’t use GenAI, and the reasons for these expectations; 5) emphasizing to students the learning value of completing assessments as expected; 6) equipping students with the information and experiences they need to develop the skills and knowledge required to meet those expectations; 7) reassuring students that their questions are welcome; 8) following up on suspected academic misconduct; 9) protecting copyright by not submitting the work of others or of students to GenAI detection tools; and 10) reflecting on the results of these approaches to better plan for next time. The webpage also offers Recommendations for Ethical Use of Artificial Intelligence, which include acknowledging the use of GenAI tools, not listing them as authors, and recognizing the limits and biases of these tools. It also shares many resources for faculty and instructors, as well as students.
Tags: Policy, Academic integrity, Governance, North America
Boston University’s (United States) Faculty of Computing & Data Sciences developed its Generative AI Assistance Policy, which states that “students should learn how to use AI text generators and other AI-based assistive resources to enhance rather than damage their developing abilities as writers, coders, communicators, and thinkers”, and that “instructors should ensure fair grading for both those who do and do not use AI tools”. The policy lists a few limitations to an “otherwise embracing approach to AI tools”. Students shall: 1) give credit to AI tools whenever used, even if only to generate ideas rather than usable text or illustrations; 2) when using AI tools on assignments, add an appendix showing the entire exchange, a description of which AI tools were used, an explanation of how they were used, and an account of why they were used; 3) not use AI tools during in-class examinations or assignments, unless explicitly permitted or instructed; 4) employ AI detection tools and originality checks prior to submission, ensuring their work is not mistakenly flagged; and 5) use AI tools wisely and intelligently, aiming to deepen understanding of subject matter and to support learning. Instructors shall: 1) seek to understand how AI tools work, as well as their strengths and weaknesses, to optimize their value for student learning; 2) treat work by students who declare no use of AI tools as the baseline for grading; 3) use a lower baseline for students who declare use of AI tools; 4) employ AI detection tools to evaluate the degree to which AI tools were likely employed; and 5) impose a significant penalty for low-energy or unreflective reuse of material generated by AI tools, and assign zero points for merely reproducing AI-generated output.
News & Research
UK Department for Education. October 26th, 2023.
On October 26th, the UK Department for Education released its position on the use of GenAI in the education sector, in which it states that it aims to “identify opportunities to improve education and reduce workload using generative AI”. It argues that the education sector must adequately prepare students for changing workplaces, which includes teaching students how to properly and safely use technologies such as GenAI. Such teaching may include: “1) the limitations, reliability, and potential bias of GenAI; 2) how information on the Internet is organized and ranked; 3) online safety to protect against harmful or misleading content; 4) understanding and protecting IP rights; 5) creating and using digital content safely and responsibly; 6) the impact of technology, including disruptive and enabling technologies; and 7) foundational knowledge about how computers work, connect with each other, follow rules and process data”.
George-Briant, K. and Brown, A. Association for Learning Technology. October 17th, 2023.
In this article, the authors argue that GenAI tools “have the potential to revolutionize the way students with disabilities learn” by levelling the playing field. They claim that GenAI tools can serve as legitimate assistive technologies for students who face additional barriers, notably in organizing their thoughts, in writing, and more. The authors caution institutions against banning the use of GenAI, which could remove access to critical learning tools. They also note that a lack of guidance on how these tools can be used leads some students to avoid them altogether, even as legitimate assistive technologies, for fear of being accused of academic misconduct. Current institutional guidelines and policies, they observe, fail to mention GenAI tools as potential assistive technologies or to explain how they should be accommodated. Fixing this would require engaging directly with learners with disabilities and with accommodation offices.
The Chronicle of Higher Education. 2023.
Last summer, 404 college leaders responded to a survey conducted by The Chronicle of Higher Education about GenAI and its effect on the future of higher education. 78% of respondents agreed that GenAI tools offer an opportunity for higher education institutions to improve how they educate, operate, and conduct research. However, 57% agreed that GenAI tools also pose a threat to how institutions educate, operate, and conduct research. College leaders said they believed GenAI was now inevitable, but also felt that it is moving too fast. When asked which aspect of campus life would be most affected by GenAI, respondents indicated teaching (57%), research (14%), admissions (8%), and IT and cybersecurity operations (6%). Notably, respondents said that GenAI will have both a positive and a negative impact on teaching over the next five years; for other spheres, they generally expected more of a positive impact than a negative one. Views tended to be optimistic regarding how GenAI tools can aid student learning. Some attribute the mix of responses to differing levels of familiarity with the technology. Respondents were almost unanimous (95%) that their institution should teach students the basics of AI ethics and literacy, although fewer than half of institutions already had plans to do so. Respondents were also almost unanimous (98%) that GenAI tools will require instructors to rethink how they assess student work. Most institutions still hadn’t taken steps to develop policies to govern GenAI.
Byles, R. Lea, K., and Howe, R. University of Northampton. October 11th, 2023.
In May 2023, the University of Northampton Learning Technology team conducted a survey of students’ experience of GenAI. The survey received 129 responses from across faculties. Key findings include that most respondents were not, at the time, engaging with GenAI tools, notably due to ethical concerns and personal values. The remaining 38%, who were actively engaging with these tools, highlighted their usefulness for tasks such as generating new content, summarizing content, and editing text. Students who had not yet engaged with GenAI tools were more likely than current users to view the use of these tools in education as an unfair advantage. In addition, students who had used GenAI tools were more likely to agree that such tools should be made available to all, to ensure equity. Almost 60% of respondents believed that students trained in the use of AI tools would have access to more opportunities in the future. Although the University of Northampton had published guidelines on the use of AI tools, most respondents were not aware of them. With respect to instructors’ use of GenAI tools, many students expressed skepticism about AI-assisted grading, and opinions on AI-generated teaching content varied widely.
Schroeder, R. Inside Higher Ed. November 8th, 2023.
In this article, the author shares ways in which AI tools could be used to effectively improve outcomes within higher education institutions. These include, for example, using predictive AI to identify students who are most at risk of dropping out and providing them with proactive support. GenAI can also be used for research and writing, such as quickly synthesizing literature reviews, writing paper proposals, and drafting sections of manuscripts. It can support administrative tasks, such as drafting communications, generating insights from reports, compiling meeting agendas and minutes, and more. GenAI tutors can also answer student questions, provide customized explanations and feedback to students, or grade assignments, essays, and exams.
Michaels, W. WUNC. November 6th, 2023.
A North Carolina State University English professor asked his students to use ChatGPT to help them write their final essays, and then reflect on their experience. Students found that ChatGPT often went in directions they didn’t want it to, or produced content they didn’t like or that was outright false. One student complained that “it was like being matched up in a nightmare group project with the class slacker”. The professor says it is crucial to help students develop AI literacy so that they not only learn to use the tools, but also approach their outputs critically.
Hough, D. Association for Learning Technology. October 10th, 2023.
In this blog post, the author shares responsible uses, as well as limitations and things to avoid, when using GenAI for lectures, written work, problem-based learning, exams, tutorials, practicals, and written assessments. She also shares actions that can be taken to uphold academic integrity, such as properly citing the use of the tool, stating it in the acknowledgements section, providing a copy of the AI conversation as proof of responsible use, and including a web link to all other references cited.
Follmer, C. Times Higher Education. November 6th, 2023.
A University of Iowa adjunct instructor of business shares ways in which he is integrating AI into his classes this autumn. As an interactive learning tool for students, AI can serve as a teammate (e.g., to generate ideas, or to play devil’s advocate and generate counterpoints and different perspectives), as a mock audience/role player, or as a peer (e.g., for peer review). AI can also serve as an instructional aid for teachers, playing the role of either process auditor or scenario/prompt generator.
More Information
Want more? Consult HESA’s Observatory on AI Policies in Canadian Post-Secondary Education.
Was this email forwarded to you by a colleague? Make sure to subscribe to our AI-focused newsletter so you don’t miss the next ones.