Using AI to enhance well-being for under-represented groups

A man meditating

Kiemute Oyibo, an assistant professor at York University’s Lassonde School of Engineering, is leveraging artificial intelligence (AI) and machine learning to build group-specific predictive models for different target populations and promote positive behaviour change.

Kiemute Oyibo

From reminders to take a daily yoga lesson to notifications about prescription refills, persuasive technology is an effective technique used in many software applications. Informed by psychological theories, this technology can be incorporated in many electronic devices to change users’ attitudes and behaviours, including habits and lifestyle choices related to health and well-being.

“People are receptive to personalized health-related messages that help them adopt beneficial behaviours they ordinarily find difficult,” says Oyibo.

“That is why I am designing, implementing and evaluating personalized persuasive technologies in the health domain with a focus on inclusive design, and tailoring health applications to meet the needs of under-represented groups.”

By considering the specific needs of these groups, Oyibo’s work has the potential to change the one-size-fits-all approach of software application design. “By excluding features which may discourage some populations from using certain health applications and focusing on their unique needs, such as the inclusion of cultural elements and norms, personalized health applications can benefit users from marginalized communities,” he explains. “Another method that can help improve user experience is participatory design. This enables under-represented groups, such as Indigenous Peoples, to be a part of the design and development of technology they will enjoy using.”

Through demographic studies, Oyibo is investigating the behaviours, characteristics, preferences and unique needs of different populations, including under-represented groups, throughout Canada and Africa. For example, he is examining cultural influences on users’ attitudes and acceptance of contact tracing applications – an approach that is unique for informing the design and development of public health applications.

“Group-specific predictive models that do not treat the entire target population as a monolithic group can be used to personalize health messages to specific users more effectively,” says Oyibo of his work, which is supported by a Natural Sciences and Engineering Research Council of Canada Discovery Grant.

In related work, Oyibo is collaborating with professors from Dalhousie University and industry partners at ThinkResearch to explore the application of persuasive techniques in the design of medical incident reporting systems, to improve their effectiveness in community pharmacies across Canada.

“There are a lot of near misses and incidents in community pharmacies across Canada that go unreported,” says Oyibo. “Apart from personal and administrative barriers, such as fear of consequences and lack of confidentiality in handling reports, the culture of little-to-no reporting reflects system design. We want to leverage persuasive techniques to enhance these systems and make them more motivating and valuable, to encourage users to report as many incidents and near misses as possible so that the community can learn from them. This will go a long way in fostering patient safety in community pharmacies across Canada.”

Oyibo’s work is part of a global effort to bridge the digital divide in health care and utilize technology to improve the lives of diverse populations.

York-developed safe water innovation earns international praise

Child drinking water from outdoor tap water well

The Safe Water Optimization Tool (SWOT), an innovative technology used to help humanitarian responders deliver safe water in crisis zones, developed by two professors in York University’s Lassonde School of Engineering and Dahdaleh Institute for Global Health Research, was recently highlighted as a success story in two international publications.

Syed Imran Ali

Built by Syed Imran Ali, an adjunct professor at Lassonde and research Fellow at the Dahdaleh Institute, in collaboration with Lassonde Associate Professor Usman Khan, the web-based SWOT platform generates site-specific and evidence-based water chlorination targets to ensure water remains safe to drink all the way to the point of consumption. It uses machine learning and process-based numerical modelling to generate life-preserving insight from the water quality monitoring data that is already routinely collected in refugee camps.
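The core modelling idea can be sketched in a few lines. The snippet below is an illustrative simplification, not the SWOT’s actual method: it assumes simple first-order chlorine decay with a hypothetical decay constant of the kind that might be fitted from routine monitoring data (the real tool combines machine learning with process-based numerical models).

```python
import math

def chlorination_target(decay_rate_per_hr, storage_hours, min_residual=0.2):
    """Initial free-chlorine target (mg/L) so that, assuming first-order
    decay C(t) = C0 * exp(-k * t), stored water still holds at least
    `min_residual` mg/L after `storage_hours` in the household."""
    return min_residual * math.exp(decay_rate_per_hr * storage_hours)

# Hypothetical decay constant, as might be fitted from field monitoring data
target = chlorination_target(decay_rate_per_hr=0.05, storage_hours=15)
```

Under these assumed numbers, the tap-stand dose must be roughly double the minimum household residual to survive 15 hours of storage.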

One of the SWOT’s funders, the U.K.-based ELRHA Humanitarian Innovation Fund, recently published a case study on the tool to serve as an example of a successful humanitarian innovation.

As a result of that publication, the SWOT was then highlighted as a success story in another case study, this time in the U.K. government’s latest white paper, titled “International development in a contested world: ending extreme poverty and tackling climate change.”

Water quality staff test chlorination levels in household stored water at the Jamam refugee camp in South Sudan. Photo by Syed Imran Ali.

“These international recognitions highlight the impact our research is having on public health engineering in humanitarian operations around the world,” explained Ali.

As his team works to scale up the SWOT globally, he believes these publications will help increase awareness of and confidence in the technology. “We’re excited to build new partnerships with humanitarian organizations and help get safe water to the people who need it most,” he said.

For more information about the Safe Water Optimization Tool, visit

To learn more about how this innovation is advancing, read this YFile story.

Federal grant supports innovative project to improve Canadian digital health care


A three-year grant totalling $500,000 will fund a collaborative project between York University Professor Maleknaz Nayebi and RxPx, a company that creates and supports digital health solutions.

Maleknaz Nayebi

Nayebi is a professor in the Lassonde School of Engineering’s Department of Electrical Engineering & Computer Science and a member of the Centre for Artificial Intelligence & Society (CAIS). CAIS unites researchers who are collectively advancing the state of the art in the theory and practice of artificial intelligence (AI) systems, governance and policy. The research includes a focus on AI systems addressing societal priorities in health care.

The funding, awarded by the Natural Sciences & Engineering Research Council of Canada’s Alliance Grant program, will support the development of the Digital Health Defragmenter Hub (DH2).

Alliance Grants support university researchers collaborating with partner organizations to “generate new knowledge and accelerate the application of research results to create benefits for Canadians.”

This collaborative project aims to address the intricate challenges within the Canadian digital health-care landscape by integrating advanced software engineering principles with machine-learning algorithms.

The project’s goal is to develop a software platform dedicated to digital health services. Currently, digital health services are designed and offered in isolation from other social, economic or health services, says Nayebi, adding that this results in a fragmented digital health-care landscape where many services overlap while many pain points and requirements remain unaddressed.

“Lack of co-ordination among providers, the inability of patients to choose services and make open decisions, the rigidity of the market toward digital innovations and the isolation of providers are known as the main barriers in the Canadian digital health-care ecosystem,” says Nayebi. “In this ecosystem, physicians act as service-supply-side monopolists, exercising significantly more power than their demand-side patients. A survey conducted by PricewaterhouseCoopers showed the unpreparedness of the ecosystem, where only 40 per cent could envision a collaboration with other organizations. This further leads to increased inequality within the health-care system. In contrast, 62 per cent of American-based active health-care organizations had a digital health component in their strategic plan.”

DH2 is a platform that brings open innovation to health care, allowing health-care providers to deliver personalized services to the public. The project aims to provide software and AI-based technology that makes digital health services more affordable and accessible to a broader population, integrates innovative business strategies for new entrants and low-end consumers, and creates a value network where all stakeholders benefit from the proliferation of innovative technologies.

“DH2 serves as a marketplace where not only can individuals with basic health-care services contribute, but it also features AI-driven matchmaking services, connecting patients with the specific demands of health-care providers and caregivers,” says Nayebi.
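In spirit, such matchmaking reduces to scoring candidate providers against a patient’s needs and ranking them. The sketch below is purely hypothetical (the names, fields and weights are invented, and DH2’s actual matchmaking is AI-driven rather than rule-based):

```python
# All names, fields and weights below are invented for illustration.
def match_score(patient_needs, provider):
    """Score a provider for a patient: service overlap plus an availability bonus."""
    overlap = len(set(patient_needs) & set(provider["services"]))
    return overlap + (0.5 if provider["accepting_patients"] else 0.0)

providers = [
    {"name": "ClinicA", "services": {"diabetes", "telehealth"}, "accepting_patients": True},
    {"name": "ClinicB", "services": {"physio"}, "accepting_patients": True},
]
patient_needs = ["diabetes", "mental_health"]

# Rank providers for this patient, best match first
ranked = sorted(providers, key=lambda p: match_score(patient_needs, p), reverse=True)
```

A learned model would replace the hand-set weights with scores estimated from outcomes data, but the ranking structure is the same.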

In this capacity, DH2 addresses the fragmentation of the wellness and health ecosystem by enabling users and user communities.

“DH2 goes beyond just connecting people; it also uses machine learning to help patients make informed decisions about their digital health-care options. Such platforms can act as the governing and strategic solution for leading market and innovation, and provide faster time to market by assisting providers in their deployment, distribution and monetization processes. They provide even access to information for all parties and effectively reduce inequalities.”

The platform also broadens the geographic diversity of participants, says Nayebi, establishing an ecosystem that enables quicker responses to disruptive events such as the COVID-19 pandemic.

Faculty of Liberal Arts & Professional Studies sheds light on new projects, global opportunities

Header banner for INNOVATUS

In this issue of Innovatus, you will read stories about how the Faculty of Liberal Arts & Professional Studies (LA&PS) is responding to the needs of our students with innovative new projects and programs to help them succeed in a rapidly changing world.

Dean J.J. McMurtry
Faculty of Liberal Arts & Professional Studies Dean J.J. McMurtry.

One such program is our 12 U Math waiver pilot class. After the COVID-19 lockdowns, it became clear that some students needed to catch up in math fundamentals. This prompted the development of the pilot class to help address the numeracy shortfall experienced by many incoming LA&PS students.   

We also know that students want paid work experience related to their field of study; this is one of the reasons paid co-op placements will replace internships and be available for all LA&PS programs starting September 2024.

And now, more than ever, we know global leaders need a global perspective. We’ve reactivated our fleet of summer abroad opportunities, offering seven study abroad courses in 2024.  

Finally, educators across universities are all grappling with artificial intelligence (AI). Learn more in this issue about how we are dealing with both the drawbacks and benefits of AI. 

Thank you to our entire LA&PS community for all the work you have put into making our teaching and pedagogy so great.  

I hope you enjoy learning more about some of the ways we are helping our staff, students and faculty.  

J.J. McMurtry
Dean, Faculty of Liberal Arts & Professional Studies 

Faculty, course directors and staff are invited to share their experiences in teaching, learning, internationalization and the student experience through the Innovatus story form, which is available at

In this issue:

LA&PS study abroad program evolves, expands its offerings
Students in LA&PS have opportunities – at home and abroad – to engage in global citizenship and learning.

Summer course opens door for students missing numeracy skills
A pilot program created to close the gap on math skills is adding up to success for students in LA&PS.

LA&PS opens conversation about academic honesty and artificial intelligence
A recent event to educate students about generative artificial intelligence, and the University’s policies, sparked meaningful discussions about the changing landscape of education.

It’s co-op programs, not internships, for liberal arts and professional studies students
The introduction of an optional paid co-op program will allow students to participate in work-integrated learning earlier in the educational journey.

LA&PS opens conversation about academic honesty and artificial intelligence 


By Elaine Smith 

With generative artificial intelligence (AI) top of mind for many members of the York community these days, the Faculty of Liberal Arts & Professional Studies (LA&PS) decided that Academic Honesty/Integrity Month was a perfect opportunity to discuss the topic with students. 

LA&PS held a tabling event at Vari Hall on Oct. 24 to educate students about generative AI, address the current parameters for using it in courses and build digital literacy around these emerging tools. They also posed scenarios involving AI so students could consider what is appropriate in various contexts. Approximately 150 students stopped to talk with faculty and staff on hand. 

“We’re really thinking about being proactive and connecting with students around academic honesty and AI in more engaging ways,” said Mary Chaktsiris, a historian and associate director of teaching innovation and academic honesty for LA&PS. “We hope that as a result of this event, students will reach out to instructors to talk about generative AI and connect with available supports at York.”

Students at a tabling event in Vari Hall learn more about academic integrity in the context of artificial intelligence.

Chaktsiris and the LA&PS academic honesty team co-led the Vari Hall event with Stevie Bell, head of McLaughlin College and an associate professor with the Writing Department. They had support from Michelle Smith, a learning innovation specialist, and academic honesty co-ordinators Namki Kang and Angelica McManus. Neil Buckley, associate dean of teaching and learning, and knowledgeable representatives from the Writing Centre, Peer-Assisted Study Sessions (PASS) and Student Numeracy Assistance Centre (SNACK) instructional teams were also on hand to converse with students. 

“We wanted students to get the facts about academic honesty and give them some guidance regarding AI now that the York Senate’s Academic Standards, Curriculum and Pedagogy (ASCP) Committee has given a policy clarification,” said Buckley. “It was an opportunity to inform students about this, because every student experiences AI in different contexts, and this is a domain that will be growing and growing.” 

ASCP states that “Students across York are not authorized to use text-, image-, code- or video-generating tools when completing their academic work unless explicitly permitted by a specific instructor in a particular course.” As part of a regular review process, a newly revised Senate Policy on Academic Honesty is expected to be announced in coming months. 

Bell noted, “In my experience with academic honesty since I began teaching writing in 2002, I’ve never found a student who wanted to cheat; they want to find out how to do things correctly. 

“So, we brought the conversation to Vari Hall. We wanted this event to be an inviting space for students to discuss AI openly, because the landscape is shifting. In some courses, professors suggest that students use it to do specific tasks, while in other courses, it’s a no-go zone. We wanted students to know how to talk to their professors about it. From talking to students in the Writing Department, I know they are very confused about if, when and how to use AI, so this was very generative for all.”

Students at a tabling event in Vari Hall.

Students had a variety of concerns to share at Vari Hall. Some wanted to talk specifically about academic honesty, while others were more interested in discussing generative AI. Faculty, too, are exploring AI, Buckley noted. For example, the Teaching Commons has a community of practice dedicated to discussing AI and how it is being used across campus, and recently held a Summit on Generative AI in Higher Education. With the use of AI expected to grow exponentially in the workplace, understanding how to use generative AI will be essential. 

“AI is already a tool in the workplace,” Bell said. “If you look at job postings on the Indeed site, for example, many of them request experience in using generative AI technology productively. As a result, in the Writing Centre, we’re looking at building digital literacies. Students need to understand generative AI’s incentives and motivations to tell you what you want to hear, and they need to learn to fact check. 

“The questions can become very nuanced. For instance, are you giving away a company’s proprietary information if you use it?” 

The success of the Vari Hall event inspired the LA&PS team and they would like to see the conversation continue. Bell has begun holding ongoing workshops at the Writing Centre with a student focus; the first one drew 75 people, including teaching assistants.  

“From a pedagogical perspective, connection and conversation are important parts of navigating the emergent aspects of AI,” Chaktsiris said. “More connections with students will be important to building digital literacies and helping navigate the shifting contexts of generative AI. A focus on connection and support also leans into more inclusive pedagogical practice. I hope there are more touch points for us to discuss AI and academic honesty more generally.” 

Students who have questions can turn to available LA&PS resources such as the Writing Centre, PASS, SNACK, peer mentors, academic advising and academic honesty co-ordinators to discuss generative AI and academic honesty in more detail.

York Circle Lecture Series presents experts on topical subjects

York Circle Lecture series

In collaboration with Jennifer Steeves, the York Circle Chair and associate vice-president research, the Office of Alumni Engagement invites the community to York University’s Keele campus for a new instalment of the York Circle Lecture series.

Beginning Nov. 25 from 9 a.m. to 1 p.m. at the Life Sciences Building, prominent faculty members will delve into a diverse array of compelling subjects, reflecting the defining themes of York University.

The York Circle Lecture Series is held four times a year and is open to York’s community, including alumni and friends. Tickets are $5 and include coffee, light snacks and lunch.

Each session will feature concurrent guest speakers; attendees will be asked to select one lecture from each session during registration.

10 a.m. sessions

Maxim Voronov

Maxim Voronov, professor, organizational behaviour and industrial relations, Schulich School of Business, presenting “The good, the bad, and the ugly of authenticity.”

Authenticity seems ever-present in today’s society, and it has become an important research topic among organizational scholars. Much of the time, both scholars and practitioners see authenticity as unambiguously good. But we need to acknowledge the darker side of authenticity and explore its implications. The purpose of this talk is to explore “the good, the bad and the ugly” of authenticity, shifting the focus away from authenticity as an attribute of people and things and toward unpacking the process by which people and things are cast as authentic. A particular focus will be on unpacking the contribution of authenticity to both social good and social harm.

Emilie Roudier

Emilie Roudier, assistant professor, School of Kinesiology & Health Science, Faculty of Health, presenting “Wildland fires: studying our blood vessels to better understand the impact on health.”

Over the past decade, the intensity and size of wildland fires have increased. Wildland fire seasons have lengthened, and these fires contribute to global air pollution. This presentation will highlight how wildland fire-related air pollution can impact our heart and blood vessels.

11:20 a.m. sessions

Usman Khan

Usman Khan, associate professor and department Chair, Department of Civil Engineering, Lassonde School of Engineering, presenting “Harnessing the power of AI for flood forecasting.”

Floods are the most frequent weather-related natural disasters, affecting the largest number of people globally, with economic damages in excess of $900 billion between 1994 and 2013. Globally, climate change and urbanization have led to an increase in floods in recent decades, and this trend is projected to continue in the coming years, including in Canada. Despite this, Canada is the only G7 country without nationwide flood forecasting systems, which are key to saving lives and reducing the damages associated with floods. Hydroinformatics – the study of complex hydrological systems by combining water science, data science and computer science – attempts to improve traditional flood forecasting through the use of advanced techniques such as artificial intelligence (AI). This talk will outline recent research in this area and plans to build a Canada-wide, open-source, real-time, operational flood forecasting system that harnesses the power of AI to improve our ability to predict and prepare for floods.

Antony Chum

Antony Chum, assistant professor, Canada Research Chair, School of Kinesiology & Health Science, Faculty of Health, presenting “The impact of recreational cannabis legalization on cannabis-related acute care in Ontario.”

This presentation will discuss the effects of cannabis legalization on cannabis-related acute care (emergency department visits and hospitalizations). The research conducted discovered specific impact patterns among different demographic groups. Additionally, the talk will delve into regional disparities and analyze the policy implications arising from the legalization process.

Since 2009, York Circle has showcased the ideas and research being generated by York University’s community. Topics come from every Faculty and have included discussions around gender issues, brain function, mental health, international aid, sports injuries, financial policy and many more evolving subjects.

New funds aid in AI methods to advance autism research


Professor Kohitij Kar, from York University’s Department of Biology in the Faculty of Science, is among 28 early-career researchers who received grants valued at $100,000 from Brain Canada’s Future Leaders in Canadian Brain Research program. His project will combine neuroscience and artificial intelligence (AI) studies of vision into autism research.

Kohitij Kar

Kar, a Canada Research Chair in Visual Neuroscience, combines machine learning and neuroscience to better understand visual intelligence. His new project funded by Brain Canada will explore these intersections in the context of autism.

“The ability to recognize other people’s moods, emotions and intent from their facial expressions differs in autistic children and adults,” says Kar. “Our project will introduce a new, vastly unexplored direction of combining AI models of vision into autism research – which could be used to inform cognitive therapies and other approaches to better nurture autistic individuals.”

Supported by prior funding from the Simons Foundation Autism Research Initiative, Kar’s research team at York University has been developing a non-human primate model of facial emotion recognition in autism. The machine learning-based models the team will use are artificial neural networks (ANNs), which mimic the way the brain operates and processes information. Kar will develop models that predict, at an image-by-image level, how primates represent facial emotions across different parts of their brain and how those representations are linked to their performance in facial emotion judgment tasks. The team will then use state-of-the-art methods they developed to fine-tune the ANNs, aligning them more closely with the performance of neurotypical brains and those of an autistic adult.

The second part of Kar’s project will focus on using the updated ANNs to reverse-engineer images that could potentially help autistic adults match their facial emotion judgments to those of neurotypically developed adults. This work builds on his previous research (published in the journal Science), which showed that ANNs can be used to construct images that broadly activate large populations of neurons, or selectively activate one population while keeping others unchanged, to achieve a desired effect on the visual cortex. In this project, he will shift the target objective from neurons to a clinically relevant behaviour.
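The underlying idea of steering an image toward a target model response can be illustrated with a toy example. The snippet below uses a linear readout as a stand-in for a deep ANN and performs gradient ascent on the image itself; every specific here (the readout, step size and iteration count) is invented for illustration, not taken from Kar’s work:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a trained ANN's behavioural readout: pixels -> score.
# (The real models are deep networks; a linear readout keeps the idea visible.)
w = rng.normal(size=64)

def score(img):
    return float(w @ img)

# Gradient ascent on the image itself: nudge pixels so the model's predicted
# behaviour moves toward a target, loosely analogous to reverse-engineering
# stimuli that steer a chosen readout.
img = np.zeros(64)
for _ in range(100):
    grad = w                       # d(score)/d(img) for a linear readout
    img += 0.01 * grad
    img = np.clip(img, -1.0, 1.0)  # keep pixel values in a valid range
```

With a deep network the gradient would come from backpropagation rather than a closed form, but the optimization loop is the same shape.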

Brain Canada’s Future Leaders in Canadian Brain Research program aims to accelerate novel and transformative research that will change the understanding of nervous system function and dysfunction and their impact on health. It has been made possible by the Canada Brain Research Fund, an arrangement between the Government of Canada (through Health Canada), the Brain Canada Foundation and the Azrieli Foundation, with support from the Erika Legacy Foundation, the Arrell Family Foundation, the Segal Foundation and the Canadian Institutes of Health Research.

Professor receives patent to improve AI machine learning


Steven Xiaogang Wang, a professor in York University’s Department of Mathematics & Statistics at the Faculty of Science, and a member of the Laboratory of Mathematical Parallel Systems, has had a U.S. patent approved for an algorithm that will reduce the training time of artificial intelligence (AI) machine learning (ML).

The patent, titled “Parallel Residual Neural Network Architecture and System and Method for Training a Residual Neural Network,” was inspired by a 2018 paper titled “Decoupling the Layers in Residual Network.” Both were based on collaborations with Ricky Fok, a former postdoctoral Fellow; Aijun An, a professor in the Department of Electrical Engineering & Computer Science; and Zana Rashidi, a former graduate research assistant who carried out some of the computing experiments.

Steven Wang

The now-patented algorithm, approved this year, was the result of six months of research at York; it was submitted to the United States Patent and Trademark Office in 2019. The algorithm’s framework rests on mathematical arguments that significantly reduce the training time of a machine learning model as it absorbs, processes and analyzes new information. It does so by using a mathematical formulation that allows residual networks – a widely used architecture for training AI models – to compute in parallel with each other, enabling faster, simultaneous learning.
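Conceptually, the decoupling can be pictured as follows: a standard residual network applies its blocks sequentially, so each block must wait for the previous one’s output, whereas decoupled branches all read the same input and can therefore be evaluated in parallel. This toy NumPy sketch shows only that structural difference, not the patented algorithm itself (the branch maps and sizes are invented):

```python
import numpy as np

def branch(x, seed):
    """Stand-in for one residual branch: a small fixed random linear map."""
    rng = np.random.default_rng(seed)
    W = 0.1 * rng.normal(size=(x.size, x.size))
    return W @ x

x = np.ones(8)

# Standard residual network: blocks compose sequentially,
# so block s must wait for the output of block s-1.
h_seq = x
for s in range(3):
    h_seq = h_seq + branch(h_seq, seed=s)

# Decoupled form: every branch reads the same input x, so all branches can
# be evaluated in parallel and summed (an approximation of the sequential
# composition when branch outputs are small).
h_par = x + sum(branch(x, seed=s) for s in range(3))
```

In the sequential form the work is inherently serial; in the decoupled form each `branch(x, seed=s)` call is independent and could be dispatched to a separate device.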

Wang’s desire to accelerate machine learning’s abilities is driven, in part, by a specific area of AI applications. “I want to apply all the algorithms I develop to health care,” Wang says. “This is my dream and mission.”

Wang has especially focused on using AI to improve care for seniors and that work has previously earned him the Queen Elizabeth II Platinum Jubilee Award from the House of Commons for initiatives during COVID-19 to mitigate the spread of the virus in long-term care facilities.

Wang plans to use the patented algorithm in ongoing projects that aim to provide smart monitoring of biological signals for seniors. For example, it could be used in long-term care to continuously monitor electrocardiogram signals at night and detect when a heart has stopped beating. To move towards that goal, Wang is also building an AI platform to complement those ambitions, which he expects to be ready in several years.

He is deeply invested in the social impact of AI as a member of the Centre for Artificial Intelligence & Society, a York organized research unit where researchers are collectively advancing the state of the art in the theory and practice of AI systems, governance and public policy.

“I can use machine learning to help long-term care facilities improve the quality of care, but also help out with the struggles of the Canadian health-care system,” says Wang.

Upskill digital storytelling through new course at Glendon


By Elaine Smith

Raiman Dilag, director of information technology services (ITS) at York University’s Glendon College, and his team are working to ensure their students have access to the most current technology to enhance their storytelling capabilities.

They will make this possible through an Academic Innovation Fund grant that allowed them to create a new eight-week extracurricular course – XR Storytelling in Extended Reality / XR Accroche Narrative en Réalité Étendue – that will provide interested Glendon students and faculty with an introduction to virtual reality (VR), augmented reality (AR), 360-degree cameras, podcasting and 3D printing. The course is not for credit, but those who complete it will earn a microcredential and a digital badge that can be affixed to their resumes and LinkedIn profiles.

Glendon 360 video screenshot
Glendon College offers a new course for all students that allows them to upskill digital storytelling. This photo is a screenshot from a video showing 360-degree photography. For another example, go here.

“While it’s expected for STEM students to be exposed to technological tools, at Glendon, we are deeply rooted in the liberal arts tradition,” Dilag said. “I saw the opportunity to complement resources currently in place, and enhance our students’ access to these and other new tools. Our students have stories to tell, and they benefit from sharing them using new media.”

For those on the outside looking in, the idea of using these tools can be confusing and/or daunting. VR and its sleek headsets can immerse users in another space, such as the Notre-Dame Cathedral in Paris before and after the fire. Fans of Pokémon Go know that AR allows users to employ a device to interact digitally with the real world, bringing images to life. 360-degree photography brings the viewer into the space, letting them experience that moment from all of the photographer’s points of view. Podcasts stream on digital devices and are excellent audio/video tools for storytelling, while 3D printing enables the creation, and customization, of 3D objects crafted in one’s imagination or modified from previous designs.

Each of these technologies, independently or in combination, are valuable for storytelling in a digital era.

The eight-week course created by the ITS team will familiarize students with these key tools and require them to work on a group project to show their facility with one or more of them. The project will also reinforce teamwork skills and, in true Glendon fashion, is conducted in English or French by the bilingual XR technology co-ordinator.

“I’d like students to think about the stories they want to tell,” said Dilag. “These are just tools; however, a course like this can open doors, because opportunities following graduation may be influenced by things beyond academics, such as exposure to any or all of these XR technologies.

“We’re all about the student experience, recruitment and retention. If this course helps them graduate more career-ready, it’s a great way for us to add value to their university, and post-graduation, experience.”

The in-person course is open to all Glendon students and will be offered during both the Fall and Winter terms. Dilag hopes the success of the course will lead to expansion for all York students.

The team has been planning the course since February: designing the curriculum, writing the proposal, purchasing necessary equipment and making the space attractive. The course will be conducted by the XR technology co-ordinator with oversight from Dilag.

“Let’s get technology in the hands of this dynamic generation and see what they can do,” Dilag said. “I think they’ll impress us.”

He is proud of his team’s work and reminds the larger community that the ITS department “is about more than resetting passwords. We aim to humanize technology, and to use it to enable the telling of great stories.”

XR Storytelling in Extended Reality / XR Accroche Narrative en Réalité Étendue begins the week of Oct. 16. Glendon students can register online.

Faculty who may be interested in the course can get in touch to discuss their needs and learning objectives.

Copyright could make or break artificial intelligence, says prof

Artificial intelligence: A human hand shakes a robot hand

Osgoode Hall Law School Professor Carys Craig is stepping into the role of associate dean of research and institutional relations at a time when her own scholarship is posing fundamental questions about society’s next technological shift and the future of artificial intelligence (AI).

Carys Craig

Craig, who will serve in this new role over the next year, argues in a series of recent articles and presentations that the text and data mining required to train AI systems could clash with copyright law, significantly hampering the technology’s development or changing its direction.

“The way I see it now, copyright is the thing that could make or break AI,” she said. “The question about whether training AI involves making copies that constitute copyright infringement is an enormous issue.”

An internationally recognized intellectual property (IP) law expert, Craig said that, more than ever, the interaction of law and technology is forcing legal scholars to re-examine legal principles and concepts that may have been taken for granted – especially when it comes to AI. In many areas, she added, Osgoode is at the forefront of that process.

“When you have a technology that has this paradigm-shifting capacity,” she observed, “suddenly you’re looking at the law and you’re thinking, well, we always knew, for example, what copyright laws protected when something was authored, but we didn’t really know what an author was because we hadn’t really encountered that question before.

“We want to future-proof the law,” she added, “but we need to understand that the law will evolve and so we need to look at how technology is shaping the law, as well as the potential for the law to shape technology.”

In her new role as associate dean, research and institutional relations, Craig said she will be focused on continuing to enhance Osgoode’s vibrant research community post-pandemic, supporting its researchers, communicating their successes and improving the public’s understanding of the importance of legal research.

In particular, Craig said, she will work hard to ensure that students feel that they’re an integral part of Osgoode’s research community. The law school’s curriculum aids in that by ensuring students are engaged in legal research and scholarly writing, especially with its upper-year writing requirement, editorial opportunities with its leading law journals and the annual JD Research Symposium.

“It’s easy for them to get caught up in their studies and their deadlines and their exams and their grades, but being at Osgoode gives you a real opportunity to participate in an intellectual community,” she said. “We want to see the students here and engaged in that scholarly conversation.”

In addition to her role as an associate dean, Craig is director of IP Osgoode, which explores legal governance issues at the intersection of IP and technology. She is also editor-in-chief of the Osgoode Hall Law Journal. A recipient of multiple teaching awards, she is often invited to share her work and expertise with academic audiences, professional organizations, policymakers and the press. Her publications are regularly cited in legal arguments and judicial opinions, including in several landmark rulings by the Supreme Court of Canada.

To read some of Craig’s recent work on AI and copyright law, visit