
Access Type

WSU Access

Date of Award

January 2022

Degree Type

Thesis

Degree Name

M.S.

Department

Mechanical Engineering

First Advisor

Jerry C. Ku

Abstract

Advanced Driver Assistance Systems (ADAS) and autonomous vehicles have become increasingly popular in recent years. These systems rely heavily on information from multiple sensors to perceive and understand the environment in which the vehicle operates, and sensor fusion is the backbone of such multi-sensor systems. To improve the overall accuracy of object detection and sensor fusion, this thesis proposes a deep learning-based (CNN) sensor fusion approach for a camera-radar system, trained on simulated sensor data, to address blind-spot monitoring on large Class-8 trucks. One of the biggest problems with deep learning networks is that training them requires massive labeled datasets; this study addresses that problem by proposing the use of simulated sensor data. The proposed network is a hybrid in which features are extracted with ResNet-50 and object detections and bounding-box scores are estimated with a YOLOv2 network. An Extended Kalman Filter is proposed for radar track fusion and tracking, and a pseudo-inverse method is used for the spatial alignment of the radar and camera data. The overall algorithm of object detection, classification, tracking, and track fusion is tested using a sufficiently calibrated Model-in-the-Loop (MIL) approach. The proposed algorithm and network structure demonstrated good accuracy and real-time performance on the MIL platform: the camera network achieved a classification score of 0.9 and the radar network around 0.8. The proposed solution was also tested in two different virtual scenarios, daytime and nighttime, and performed well in both.
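The thesis itself is not downloadable here, so the implementation details of the pseudo-inverse spatial alignment are not available. As a minimal sketch of how such an alignment between radar and camera detections might look, the snippet below fits a linear map from radar ground-plane coordinates to image pixels using the Moore-Penrose pseudo-inverse; the affine model, function names, and calibration correspondences are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def fit_radar_to_image_map(radar_xy, pixels_uv):
    """Fit an affine map from radar ground-plane coordinates to image
    pixels via the Moore-Penrose pseudo-inverse (least-squares fit).

    radar_xy : (N, 2) radar detections in the vehicle frame (metres)
    pixels_uv: (N, 2) matching detections in the camera image (pixels)

    Returns a 2x3 matrix M such that [u, v]^T ~= M @ [x, y, 1]^T.
    """
    n = radar_xy.shape[0]
    # Homogeneous radar coordinates, stacked as a 3 x N matrix.
    X = np.vstack([radar_xy.T, np.ones(n)])
    UV = pixels_uv.T  # 2 x N
    # Least-squares solution: M = UV @ pinv(X).
    return UV @ np.linalg.pinv(X)

def project_radar_point(M, xy):
    """Map a single radar detection (x, y) into pixel coordinates."""
    return M @ np.array([xy[0], xy[1], 1.0])

# Hypothetical calibration correspondences for illustration only.
radar_xy = np.array([[10.0, -2.0], [15.0, 0.5], [20.0, 3.0], [8.0, 1.0]])
pixels_uv = np.array([[320.0, 400.0], [640.0, 360.0],
                      [900.0, 330.0], [700.0, 420.0]])
M = fit_radar_to_image_map(radar_xy, pixels_uv)
print(project_radar_point(M, (12.0, 0.0)))
```

A true perspective projection of the ground plane is a homography rather than an affine map, so this sketch is only a first-order approximation; the thesis may use a different formulation.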
