York researchers invent novel computer vision system for automatic hockey videography

Researchers at York University have developed a deep learning-based computer vision system for attentive puck tracking (APT) that automatically follows the play in a hockey game and delivers a dynamically zoomed video feed similar to an NHL broadcast feed.

The project, a collaboration between PhD candidate Hemanth Pidaparthy and Professor James Elder at York University’s Centre for Vision Research (CVR), will be presented on Jan. 10 at the 2019 IEEE Winter Conference on Applications of Computer Vision (WACV) in Hawaii.

While hockey involves a large playing surface, instantaneous play is typically localized to a smaller region of the ice. Live spectators attentively shift their gaze to follow play, and professional sports videographers pan their cameras to mimic this process. Unfortunately, manual videography is economically prohibitive below the elite level.

The APT system takes as input a wide-field 4K video feed and automatically tracks the play, allowing an HD 3x-zoomed video feed to be dynamically cropped and re-targeted to a spectator’s display device (e.g. an iPad). This allows family and friends to enjoy high-quality video recordings of amateur games in the comfort of their homes and provides coaches and players with the ability to review play.

[Video still: Original 4K video. Green dot: automatically estimated puck location. Blue rectangle: instantaneous region of interest.]

[Video still: HD 3x-zoomed view, re-targeted for a standard display device.]
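The cropping step described above can be sketched in a few lines. The following is a minimal illustration, not the researchers’ implementation: it assumes a 4K source frame, a per-frame puck estimate, and a 3x-zoom window whose centre is smoothed over time (a simple exponential moving average standing in for a videographer’s smooth pan). All function and parameter names here are hypothetical.

```python
# Illustrative sketch only: compute a clamped, temporally smoothed
# 3x-zoom crop window around an estimated puck position in a 4K frame.
# Names and the smoothing scheme are assumptions, not from the paper.

FRAME_W, FRAME_H = 3840, 2160                  # 4K source resolution
CROP_W, CROP_H = FRAME_W // 3, FRAME_H // 3    # 3x zoom -> 1280x720 window

def crop_window(puck_x, puck_y, prev_center=None, alpha=0.2):
    """Return ((left, top, right, bottom), new_center) for the crop.

    alpha is the smoothing weight for an exponential moving average of
    the window centre, so the virtual camera pans smoothly rather than
    jumping with every new puck estimate.
    """
    cx, cy = float(puck_x), float(puck_y)
    if prev_center is not None:
        px, py = prev_center
        cx = (1 - alpha) * px + alpha * cx
        cy = (1 - alpha) * py + alpha * cy
    # Clamp the centre so the window never leaves the frame.
    cx = min(max(cx, CROP_W / 2), FRAME_W - CROP_W / 2)
    cy = min(max(cy, CROP_H / 2), FRAME_H - CROP_H / 2)
    left, top = int(cx - CROP_W / 2), int(cy - CROP_H / 2)
    return (left, top, left + CROP_W, top + CROP_H), (cx, cy)
```

In use, each video frame would supply a new puck estimate, the returned rectangle would be cropped from the 4K frame, and the crop scaled to the viewer’s display; the running centre is fed back in as `prev_center` on the next frame.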

“Without this technology, it is really not possible to enjoy watching a game of amateur hockey on a standard display, as the puck is too small to be visible,” explained Pidaparthy.

A prototype of the system is currently operating at the Canlan Ice Sports facility at York’s Keele Campus, where it is being used to record the York Lions varsity home games for the 2018-19 season. A provisional patent application has been filed, and Innovation York and York’s Vision: Science to Applications (VISTA) program are supporting the researchers in commercializing the technology. The project was awarded funding from the VISTA Prototyping Fund to accelerate development.

“This is a good example of how AI (artificial intelligence) can be used to simplify and enrich our lives. In this case, greater online access to community sporting events can cut down on vehicle traffic while keeping us socially engaged,” said Elder, who holds the York Research Chair in Human and Computer Vision and is jointly appointed to both the Department of Psychology and the Department of Electrical Engineering & Computer Science at York.

The research was supported by the Natural Sciences & Engineering Research Council (NSERC) Idea to Innovation program and by the NSERC CREATE Training Program in Data Analytics & Visualization, led at York by Elder.

The researchers are currently enhancing their system to automatically extract video summaries of game highlights, and plan to extend their research to other sports such as soccer in the spring.
