As generative artificial intelligence (AI) becomes a regular part of our everyday lives, how have higher education institutions been adapting to its presence and use?
A new report, conducted in partnership with D2L and the Canadian Digital Learning Research Association (CDLRA), digs into some of the details of how faculty, staff and institutions are developing policies, regulations and guidelines around the use of generative AI.
“Generative AI in Canadian Post-Secondary Education: AI Policies, Possibilities, Realities, and Futures” includes 438 responses from faculty and admin across 126 publicly funded Canadian institutions.
In this post, we’ll walk through the high-level findings of the report, along with commentary from leaders in U.S. higher education on how those findings compare with what they’re seeing.
The 438 responses to open-ended questions about generative AI collected by the CDLRA show that:
The CDLRA has a handful of recommendations based on the survey, including:
The CDLRA found that many Canadian institutions are in the early stages of developing guidelines around the use of AI. Consequently, its use isn’t standardised among faculty or admin; instead, individuals are deciding for themselves whether and how to use it.
MJ Bishop, vice president for integrative learning design at the University of Maryland Global Campus (UMGC), has heard similar stories in the U.S. “Institutions are cautiously optimistic about the use of AI. Many are ‘watching and waiting’ to better understand the implications of these new tools before jumping into making large investments in technologies or wide-sweeping policy changes,” she said.
Bishop said UMGC has taken an institutional stance of embracing AI tools and supporting responsible use for faculty, staff and students.
“UMGC has had a very progressive academic integrity policy in place for several years. So far, it’s proving to be a fairly straightforward matter to incorporate this philosophy about AI into guidance for students about how to use the technologies to support rather than undermine their learning from our academic programs.”
Depending on your institution and the policies already in place, minor updates like UMGC’s could be all that’s required. Rather than creating an entirely new policy dedicated to AI, a small tweak to an existing document, such as an academic integrity policy, may be enough.
The study findings also show that faculty’s reception and use of AI vary, ranging from optimism to concern. While some want nothing to do with the technology, others have been using it to create lesson plans or rubrics and encouraging students to use it as an “expert on the side” or as an aid in writing.
Terry Di Paolo, vice-provost of e-learning at Dallas College, said the CDLRA findings resonate with the perspective of many faculty and admin in the U.S.
“It’s a similar mixed bag of views, and I think the conversation has been dominated by AI as a ‘bad’ disrupter for the higher education space,” he said.
“The narrative around AI in higher education seems to paint it as making unwanted and dangerous incursions, with a focus on academic dishonesty, the need for policy or guidelines, and in some quarters, the risk of AI replacing the faculty member’s role,” continued Di Paolo.
“It’s important to temper our conversation with a reality that is far removed from this imagined sense of AI turning the higher education space into a hotbed of lazy dishonesty and fast-food-esque education.”
Bettyjo Bouchey, vice provost, digital strategy and operations at National Louis University, believes that getting the delineation between AI and human interaction right can help offload mundane tasks from faculty, freeing up their time to provide more personalised learning.
“AI opens up the possibility of understanding our learners more deeply than we ever could before,” she said. “In doing so, we can plan personalised learning and support structures perfectly designed for each learner and their specific needs. That’s nearly impossible to do today.”
While the research found that many saw a future with AI in education as inevitable, there were still areas of concern, including:
Bishop is considering some of these concerns with her own use of AI. “As I’ve engaged in generative AI to support my work, I struggle with many of the same questions that are being raised by the report,” she said. “Particularly with respect to ethical practice and concerns about bias as well as the need for more transparency around where the information is coming from.”
While Bouchey also has concerns about bias, she has ideas on how to reduce it. “I believe bias can be mitigated by supervised machine learning. Thoughtful quality assurance plans can spot-check the answers, solutions and guidance we get from AI.”
One of the recommendations based on the study’s findings is for institutional leaders to make their stance, guidance and policies surrounding AI clear to faculty and staff. On top of this, the policies should support an environment of learning and experimentation when engaging with AI.
Bouchey notes that National Louis University is adapting policy language and creating a faculty and executive steering committee focused on leveraging AI for better teaching and learning.
“This fall, we’ll be partnering with another institution on a course they’ve designed that will enable our academic leaders to host communities of practice on the responsible and ethical use of AI in our classrooms,” she said. “We see it not only as a way of embracing the future but also as a direct responsibility we have to be training our learners for the future of work that will invariably involve the use of AI.”
Bishop agrees with Bouchey that training will play a part in the integration of AI in higher ed, specifically for faculty and staff. “I think we all could benefit from additional professional development on generative AI,” she said. “Particularly with respect to how we might leverage these tools to create more equitable learning opportunities and avoid the potential dystopic futures that AI critics envision.”
The role AI plays in the future of higher education is also important to Di Paolo. He believes institutions need to look beyond the classroom setting when considering the integration of AI into education.
“We need to consider AI’s impact on the wider operations of the college campus and the future world of work for college graduates,” he said.
“The very people college educators are getting to think critically and positively about AI as a societal tool are actually the population most at risk from the impact of AI in their career pathways and trajectories,” continued Di Paolo. “The real question we need to all be contending with is: Are we equipping our graduates for a future where the life course is disrupted by frontier technologies, including AI?”
While faculty and staff’s opinions on the future use of AI in higher ed vary, Bouchey believes we can lean on AI to help expand how faculty, staff and students learn and grow past historical limitations.
“We can be using AI to innovate higher education in unprecedented ways that our human minds can’t because we tend to come to brainstorming with our set of preconceived notions,” she said. “AI can not only collect and analyse data in nearly infinite ways, but also challenge our thinking and help us see new paths and new ways of being that could disrupt higher education at a much more rapid pace.
“We can be harnessing AI to keep us in a futurist mindset, helping us see it more clearly and rapidly, thereby helping us solve and respond to historical challenges in brand new ways.”
Check out all the findings from the report to get details on the use of generative AI in Canadian postsecondary education.