Two cutting-edge projects on AI-and-human interaction awarded major grants

Two York University projects led by Lassonde School of Engineering Professors Michael Jenkin (Electrical Engineering and Computer Science) and Jinjun Shan (Earth and Space Science and Engineering) were awarded funding from the Department of National Defence’s Innovation for Defence Excellence and Security (IDEaS) program under Innovation Networks in October 2019. York’s securing two of the six grants from the IDEaS program speaks to the University’s leadership in this area.

Each contribution is worth close to $1.5 million. With the support of other funders, the two projects are collectively worth close to $5 million.

“The technological, social and economic benefits of these two projects are profound. The discoveries from these projects will have a lasting impact on Canadian society by training highly qualified personnel over a range of technical skills,” said Professor Rui Wang, interim vice-president research & innovation at York University.

Both Lassonde projects involve human interaction with AI or robots, and advance key technologies

Jenkin’s winning project: Developing trust among soldiers, civilians and robots

The first project is led by Jenkin, a core member of Vision: Science to Applications (VISTA) and a member of the Centre for Innovation in Computing at Lassonde and the Centre for Vision Research at York. Professors James Elder (Electrical Engineering and Computer Science) and Debra Pepler (Department of Psychology), also at York, are co-investigators. Industrial partners include CrossWing Inc., CloudConstable Inc. and Kaypot Inc.

Although the IDEaS program’s investment in this project is $1.48 million over three years, the total project cost is anticipated to be $2.3 million over the same period.

The objective of this project, titled “SENTRYNET: Developing trust between soldiers, civilians and robots,” is to explore the development of methodologies and technologies that will enable autonomous robots to interact with the public and respond to situations in a manner that maintains control and security of the local environment.

Key outcomes of this project will be:

  • To develop sensing for human-robot interaction to detect human visitors and identify potential threats by creating person detection and recognition algorithms, and sensor positioning strategies to enable the robot to act as a human guard;
  • To develop technology to monitor visitors to a facility to better characterize their behaviour during engagement of trust;
  • To develop technology to enhance human-robot interaction to support audio interaction and avatar-based interaction with the public; and
  • To develop operational mechanisms, sensing and communications strategies to engender trust between human security personnel and robots deployed in the field.

“This project will contribute to the advancement of knowledge in the field of autonomous systems related to trust and barriers to adoption. It would also train personnel over a wide range of technical skills including computer vision, robotics and human behaviour,” said Jenkin.

Shan’s winning project: Human-machine cooperation with autonomous systems

The second project is led by Shan, with York Professor Robert Allison (Electrical Engineering and Computer Science) as co-investigator. The total value of this project is $2.45 million, with $1.49 million funded by the IDEaS program. Industrial collaborators include Imagine 4D, the Computer Research Institute of Montreal (CRIM) and C3 Human Factors Consulting Inc.

The objective of this project, titled “Effective human-machine cooperation with intelligent adaptive autonomous systems,” is to advance key technologies in the area of autonomous systems, including intelligent adaptive systems, automated task execution, high-precision navigation and control, and effective human-machine interaction.

The key outcomes will be:

  • To improve the accuracy of real-time predictive models for operator stress and fatigue;
  • To generate a dataset for validating human-automation trust models;
  • To investigate the use of realistic immersive environments as a means to improve situational awareness, reduce fatigue and maintain calibrated human-automation trust levels; and
  • To develop high-precision navigation strategies for autonomous systems for path planning, collision and obstacle avoidance with minimal human overseer input.

Shan’s drone lab: Spacecraft Dynamics Control and Navigation Laboratory (SDCNLab)

“This project will provide a platform for autonomous systems related research in multiple science and engineering disciplines, and provide solutions to trust model development and validation, operator state monitoring, human-machine interaction, high-accuracy navigation and artificial intelligence,” said Shan.

IDEaS program provides creative thinkers with structure and support

The Innovation for Defence Excellence and Security (IDEaS) program is an investment of $1.6 billion over 20 years aimed at meeting the demands of today’s complex global defence and security environment. The program enables Canada to deliver the capabilities needed for a strong and agile military by providing financial support to foster innovation through contracts, contribution agreements and grants. The IDEaS program helps innovators by supporting analysis, funding research, and developing processes that facilitate access to knowledge. It will also support testing, integration, adoption, and acquisition of creative solutions for Canada’s defence and security communities.

For more information on IDEaS, visit the program’s website. To learn more about Jenkin, visit his Faculty profile page; to learn more about Shan, visit his website. A video of Shan’s SDCNLab and a related YFile story about his drone lab are also available online.