Visual and Thermal Data for Pedestrian and Cyclist Detection

Sarfraz Ahmed, M. Nazmul Huda, Sujan Rajbhandari, Chitta Saha, Mark Elshaw, Stratis Kanarachos

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    6 Citations (SciVal)

    Abstract

    With the continued advancement of autonomous vehicles and their deployment on public roads, accurate detection of vulnerable road users (VRUs) is vital for ensuring safety. To provide higher levels of safety for these VRUs, an effective detection system should be employed that can correctly identify VRUs across all types of environments (e.g. varied VRU appearance, crowded scenes) and conditions (e.g. fog, rain, night-time). This paper presents optimal methods of sensor fusion for pedestrian and cyclist detection using Deep Neural Networks (DNNs) for higher levels of feature abstraction. Typically, visible-light sensors have been utilized for this purpose. Recently, thermal sensor systems, or a combination of visual and thermal sensors, have been employed for pedestrian detection with advanced detection algorithms. DNNs have provided promising results for improving the accuracy of pedestrian and cyclist detection because they are able to extract features at higher levels than typical hand-crafted detectors. Previous studies have shown that, amongst the several sensor fusion techniques that exist, Halfway Fusion has provided the best results in terms of accuracy and robustness. Although sensor fusion and DNN implementations have been used for pedestrian detection, considerably less research has been undertaken for cyclist detection.
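    The Halfway Fusion idea mentioned above can be sketched numerically: each modality (visible and thermal) is processed by its own feature extractor, and the two mid-level feature maps are concatenated along the channel axis before a shared channel-mixing layer continues the network. The sketch below uses NumPy with illustrative tensor shapes (batch 1, 64 channels, 32x32 spatial) chosen purely as assumptions for demonstration; it is not the architecture from the paper itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-ins for mid-level feature maps produced by two separate
    # sub-networks: one for the visible image, one for the thermal image.
    # Shape convention: (batch, channels, height, width).
    visible_feat = rng.standard_normal((1, 64, 32, 32))
    thermal_feat = rng.standard_normal((1, 64, 32, 32))

    # Halfway Fusion: concatenate the two streams along the channel axis,
    # yielding a (1, 128, 32, 32) fused feature map.
    fused = np.concatenate([visible_feat, thermal_feat], axis=1)

    # A 1x1 convolution is equivalent to a per-pixel channel mixing; here it
    # projects the 128 fused channels back down to 64 shared channels.
    w = rng.standard_normal((64, 128)) * 0.01  # (out_channels, in_channels)
    mixed = np.einsum("oc,bchw->bohw", w, fused)

    # ReLU non-linearity before the remaining shared detection layers.
    out = np.maximum(mixed, 0.0)

    print(fused.shape, out.shape)  # (1, 128, 32, 32) (1, 64, 32, 32)
    ```

    The key design point of halfway (feature-level) fusion is that fusion happens after each modality has learned its own low-level representation but before the final detection head, in contrast to early fusion (stacking raw images) or late fusion (merging detection scores).
    
    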

    Original language: English
    Title of host publication: Towards Autonomous Robotic Systems - 20th Annual Conference, TAROS 2019, Proceedings
    Publication status: Published (VoR) - 17 Jul 2019
