With smart and autonomous vehicles around the corner, significant advances are being made in the integration of sensing, machine learning, and advanced control systems in vehicles. Modern vehicles currently contain 60-100 sensors on average, and this number is expected to rise to 200 sensors per car by 2020. These sensing technologies, however, are mostly aimed at monitoring vehicle components (e.g., engine, transmission), vehicle expendables (e.g., fuel, fluids, tire pressure), and road/driving conditions, paying little attention to drivers and passengers, despite studies indicating that a considerable portion of vehicle accidents are caused by driver distraction. Accordingly, we believe that in-vehicle monitoring of drivers can significantly improve critical factors such as passenger safety and quality of experience. In this project, we propose the use of cameras to monitor drivers and passengers, with the aim of using gestures to interact with the vehicle and to automatically control factors such as ambient conditions (e.g., music, temperature) and, especially in the case of autonomous vehicles, driving parameters such as driving style (aggressive vs. conservative), speed, and direction, among others.
Ali Etemad (PI), Joshua Marshall