
Research Projects

Current Projects

Off-Road Multi-Vehicle Autonomy for Challenging Outdoor Environments

This project studies and develops demonstrably practical methods and algorithms for reconnaissance activities using multiple robotic vehicles, with a particular focus on especially harsh and uncertain terrains (e.g., rocky, forested, unstructured, off-road) and on the problem of maintaining secrecy from potential intruders in such operations. The proposed research includes work on vehicle navigation, robot teaming, learning-based control, and terrain assessment, as well as the concepts of opacity and unobservability for maintaining secrecy, where some vehicles may be only semi-autonomous, remotely driven, and/or supervised by human operators. In addition to this fundamental research, the project plan includes integration on actual mobile robots and field testing in representative environments. Although reconnaissance is the primary interest of the research partners associated with this project, it is anticipated that this research may also find application in other Canadian “off-road” scenarios, including mining automation, forestry, and agriculture, as well as in planetary exploration and science. The project also aims to provide unique HQP training experiences on multiple fronts: student researchers will have the opportunity not only to collaborate across several disciplines, but also to take theory into practice through access to state-of-the-art mobile robots and field testing environments.
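As a concrete illustration of the terrain-assessment thread, the following is a minimal sketch in Python of converting an elevation grid into a traversability cost map by blending local slope and roughness. The weighting scheme, the synthetic terrain, and the function names are hypothetical illustrations for this page, not the project's actual algorithms.

```python
# Minimal sketch of grid-based terrain assessment for off-road navigation.
# The elevation data and weightings are hypothetical illustrations, not the
# project's actual method.
import numpy as np

def traversability_cost(elevation: np.ndarray, cell_size: float = 1.0,
                        w_slope: float = 0.7, w_rough: float = 0.3) -> np.ndarray:
    """Combine local slope and roughness into a [0, 1] traversal cost map."""
    # Local slope magnitude from finite differences of the elevation grid.
    gy, gx = np.gradient(elevation, cell_size)
    slope = np.hypot(gx, gy)

    # Roughness as deviation of each cell from its 3x3 neighbourhood mean.
    padded = np.pad(elevation, 1, mode="edge")
    neigh_mean = sum(padded[i:i + elevation.shape[0], j:j + elevation.shape[1]]
                     for i in range(3) for j in range(3)) / 9.0
    roughness = np.abs(elevation - neigh_mean)

    # Normalize each term and blend; values near 1.0 mean hard to traverse.
    def norm(a):
        return a / a.max() if a.max() > 0 else a
    return np.clip(w_slope * norm(slope) + w_rough * norm(roughness), 0.0, 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dem = rng.normal(0.0, 0.2, (50, 50)).cumsum(axis=0)  # synthetic terrain
    cost = traversability_cost(dem)
    print("mean traversal cost:", round(float(cost.mean()), 3))
```

A cost map like this is one common input to off-road path planning, where the planner trades off path length against terrain difficulty.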

Project Partners

NSERC Collaborative R&D Program

General Dynamics Land Systems Canada (GDLS-C)

Defence R&D Canada (DRDC) Suffield

Smart Environments for Improved Patient Care in Acute and Non-Acute Healthcare Facilities


The goal of this project is to initiate research into a novel process for improved patient care in acute and non-acute healthcare facilities. The process is based on the deployment of a smart environment that enables nurses and other healthcare professionals to more effectively and efficiently monitor in-patients at risk of harm from events such as falls. The smart environment comprises a variety of sensors (vision, audio, proximity, wearable), automated data processing, and Artificial Intelligence and Machine Learning (AI/ML) methods, combined with a mobile interface that allows practitioners to easily and seamlessly access and interpret the data. The project will involve researchers at Queen’s, as well as end users at Scarborough Health Network (SHN), who are eager to provide the environment for this development. The first phase of the project will deploy an initial subset of the process to yield tangible improvements in safer healthcare delivery; its success will serve as a springboard for further and larger funding opportunities. Ultimately, this project is intended as a first step towards modern, efficient, effective, data-driven patient monitoring in acute and non-acute healthcare facilities at reduced cost.
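To make the monitoring pipeline concrete, here is a minimal sketch, assuming a simple rule-based fusion of detector outputs into a fall-risk alert. The sensor signals, weights, and threshold are hypothetical placeholders rather than the project's actual AI/ML methods.

```python
# Minimal sketch of fusing multi-sensor observations into a patient fall-risk
# alert. Signal names, weights, and the threshold are hypothetical placeholders;
# the project's actual AI/ML pipeline is not specified here.
from dataclasses import dataclass

@dataclass
class Observation:
    sensor: str        # e.g., "vision", "audio", "proximity", "wearable"
    signal: str        # e.g., "bed_exit", "impact_sound", "no_motion"
    confidence: float  # detector confidence in [0, 1]

# Illustrative weights for how strongly each signal suggests a fall event.
SIGNAL_WEIGHTS = {
    "bed_exit": 0.3,
    "impact_sound": 0.5,
    "sudden_height_drop": 0.6,
    "no_motion": 0.4,
}

def fall_risk_score(observations: list) -> float:
    """Combine weighted detector confidences into a single risk score."""
    score = sum(SIGNAL_WEIGHTS.get(o.signal, 0.0) * o.confidence
                for o in observations)
    return min(score, 1.0)

def maybe_alert(observations: list, threshold: float = 0.7) -> None:
    """Notify the care team's mobile interface when risk crosses a threshold."""
    score = fall_risk_score(observations)
    if score >= threshold:
        # In a deployed system this would push to the practitioners' devices.
        print(f"ALERT: possible fall (risk={score:.2f}) -> notify nursing station")

maybe_alert([
    Observation("vision", "sudden_height_drop", 0.9),
    Observation("audio", "impact_sound", 0.8),
])
```

In practice the weighted sum would likely be replaced by a learned model, but the structure, many noisy detectors feeding one interpretable alert on a mobile interface, is the same.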

Image Analysis and AI for Structural Inspection


This research program is a multi-disciplinary partnership between the City of Kingston and Ingenuity Labs that will investigate robotic inspection technologies as an integral step towards turning the Kingston Third Crossing into a smart structure.

The goal of this proposed research program is to begin development of a high-resolution digital image analysis and artificial intelligence system to support real-time bridge inspection. The proposed system will be partnered with advanced robotic systems developed in parallel. The digital image acquisition and analysis system will be paired with onboard machine vision to identify anomalies, along with their locations, for real-time artificial intelligence (AI) based screening and decision making, or for later analysis by engineers. The research project will take place in three phases: (i) literature survey, (ii) sensor and algorithm selection, and (iii) initial prototype development.
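As an illustration of the onboard screening step, the sketch below flags dark, elongated regions as crack candidates in a bridge image and reports their pixel locations using OpenCV. The thresholds, the elongation heuristic, and the file name are assumptions for demonstration, not the system under development.

```python
# Minimal sketch of onboard anomaly screening: flag dark, elongated regions
# (crack candidates) in a bridge image and report their pixel locations.
# Thresholds and heuristics are illustrative, not the project's method.
import cv2

def crack_candidates(image_path: str, min_area: int = 50):
    """Return bounding boxes (x, y, w, h) of possible crack-like regions."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)

    # Adaptive thresholding picks out locally dark structures such as cracks.
    binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                   cv2.THRESH_BINARY_INV, 35, 10)

    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        elongation = max(w, h) / max(1, min(w, h))
        # Keep regions that are large enough and crack-like (long and thin).
        if cv2.contourArea(c) >= min_area and elongation >= 4:
            boxes.append((x, y, w, h))
    return boxes

# Example usage: report candidate locations for AI screening or engineer review.
# for box in crack_candidates("deck_span_03.jpg"):  # hypothetical file name
#     print("candidate at", box)
```

A classical filter like this is the kind of cheap first pass that could run onboard, with flagged regions handed off to heavier AI-based screening or to engineers.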

 

Project Researchers

Neil Hoult (PI) 
Brian Surgenor
Xiaodan Zhu

Project Partner

City of Kingston

Gesture-Based Semi-Autonomous Vehicle Control


With smart and autonomous vehicles around the corner, significant advances are being made in the integration of sensing, machine learning, and advanced control systems in vehicles. Modern vehicles currently contain 60-100 sensors on average, and this number is expected to rise to 200 sensors per car by 2020. These sensing technologies, however, are mostly aimed at monitoring vehicle components (e.g., engine, transmission), vehicle expendables (e.g., fuel, fluids, tire pressure), and road/driving conditions, paying little attention to monitoring passengers and drivers, despite studies indicating that a considerable portion of vehicle accidents are caused by driver distraction. Accordingly, we believe that through in-vehicle monitoring of drivers, critical factors such as passenger safety and quality of experience can be significantly improved. In this project, we propose the use of cameras to monitor drivers and passengers with the aim of using gestures to interact with vehicles and automatically control factors such as ambient conditions (e.g., music, temperature) and, especially in the case of autonomous vehicles, driving parameters such as driving style (aggressive vs. conservative), speed, direction, and others.
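To illustrate the interaction concept, here is a minimal sketch of dispatching recognized gestures to vehicle subsystems, with driving-parameter changes gated behind autonomous mode. The gesture vocabulary, commands, and confidence threshold are invented for this example; the camera-based gesture recognizer itself is assumed to exist upstream.

```python
# Minimal sketch of mapping recognized in-cabin gestures to vehicle commands.
# Gesture labels, commands, and the confidence gate are hypothetical; the
# camera-based recognizer (e.g., a trained classifier) is out of scope here.
GESTURE_COMMANDS = {
    "swipe_up": ("climate", "temperature +1"),
    "swipe_down": ("climate", "temperature -1"),
    "rotate_cw": ("media", "volume +1"),
    "palm_open": ("media", "pause"),
    "two_finger_forward": ("driving", "style: aggressive -> conservative"),
}

def dispatch(gesture: str, confidence: float, autonomous: bool = False) -> str:
    """Turn a recognized gesture into a command, gated by confidence and mode."""
    if confidence < 0.8:
        return "ignored: low-confidence recognition"
    subsystem, command = GESTURE_COMMANDS.get(gesture, (None, None))
    if subsystem is None:
        return "ignored: unknown gesture"
    # Driving-parameter changes are only honoured in autonomous mode.
    if subsystem == "driving" and not autonomous:
        return "ignored: driving commands require autonomous mode"
    return f"{subsystem}: {command}"

print(dispatch("swipe_up", 0.93))                  # climate: temperature +1
print(dispatch("two_finger_forward", 0.91))        # ignored (manual driving)
print(dispatch("two_finger_forward", 0.91, True))  # driving style change
```

Gating driving-related commands behind autonomous mode reflects the safety distinction the project draws between ambient controls and driving parameters.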

Project Researchers
Ali Etemad
Joshua Marshall

Virtual Reality Plant for Chemical Process Design

Virtual Reality (VR) has the ability to immerse users in a simulated environment and provide them with experiential learning opportunities. Most undergraduate Chemical Engineering students are required to design a chemical plant for their capstone design project without ever having visited or interacted with a full-scale processing plant. This project will design and develop an interactive VR chemical processing plant, which will be used as an educational tool for chemical engineers completing their capstone design project at Queen’s University. The goals of the project include evaluating the effects of VR on student comprehension, retention, and chemical process design competency. The VR educational tool will give students the ability to view and interact with the unit operations inside a chemical processing plant without special training, expensive protective equipment, or security clearance. Students will complete a number of challenges in VR, will be evaluated on their comprehension, and will be invited to provide feedback on the effectiveness of the VR tool.

Two versions of the VR application will be developed. Eye-tracking capabilities will be incorporated into the first version to track student interaction with the VR environment and assess their level of expertise. The subsequent version will use the eye-tracking and other data sources to train a neural network that can determine a student’s level of expertise and progress in real time. An artificially intelligent assistant will be developed to help guide students in this second iteration of the VR tool. The impact of VR and the artificially intelligent assistant on student learning will be evaluated.
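As a sketch of the expertise-estimation idea, the example below trains a tiny single-layer neural (logistic) classifier on synthetic eye-tracking features such as fixation duration and saccade rate. The features, data, and labels are placeholders invented for this illustration, not the project's model or dataset.

```python
# Minimal sketch of expertise estimation from eye-tracking features:
# (fixation duration, saccade rate, dwell on the active unit operation).
# All data, features, and labels here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(42)

# Synthetic training data: experts fixate longer on relevant equipment and
# wander less; novices show the opposite pattern.
n = 200
novice = rng.normal([0.3, 0.8, 0.2], 0.1, (n, 3))
expert = rng.normal([0.7, 0.3, 0.8], 0.1, (n, 3))
X = np.vstack([novice, expert])
y = np.concatenate([np.zeros(n), np.ones(n)])

# One-layer logistic model trained by gradient descent.
w, b = np.zeros(3), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(expert)
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

def expertise(features):
    """Return estimated probability that the gaze pattern is expert-like."""
    return float(1.0 / (1.0 + np.exp(-(np.asarray(features) @ w + b))))

print(f"novice-like gaze -> P(expert) = {expertise([0.3, 0.8, 0.2]):.2f}")
print(f"expert-like gaze -> P(expert) = {expertise([0.7, 0.3, 0.8]):.2f}")
```

A real-time estimate like this is what would let the planned AI assistant adapt its guidance to a student's current level of expertise.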

Project Researcher 

Paul Hungler (PI)