
Master theses

Current and past ideas and concepts for master theses.

Radar-Image depth completion for scene perception

Subject

Depth estimation is a fundamental task for scene understanding in autonomous driving and robotics navigation. While learning-based supervised approaches for monocular depth estimation have achieved good performance in outdoor scenarios [1], LiDAR sensors provide the most accurate depth information. However, the depth maps generated by these devices are sparse compared to the dense images captured by RGB cameras, and this sparsity significantly limits the performance of LiDAR-based applications. Image-guided methods that predict dense depth maps from sparse LiDAR data via LiDAR-camera fusion have shown significant improvements over conventional depth-only techniques [2, 3].
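To make the task concrete, the sketch below shows how a LiDAR (or radar) point cloud is typically projected into the camera frame with the standard pinhole model, yielding the kind of sparse depth map that completion networks consume together with the RGB image. The intrinsics, extrinsics, and image size are illustrative placeholders, not values tied to any particular dataset.

    import numpy as np

    def project_to_sparse_depth(points, K, T_cam_sensor, height, width):
        """Project 3D sensor points into the image plane as a sparse depth map.

        points       : (N, 3) array of x, y, z in the sensor (LiDAR/radar) frame.
        K            : (3, 3) camera intrinsics matrix.
        T_cam_sensor : (4, 4) extrinsics mapping sensor frame -> camera frame.
        Returns a (height, width) depth map that is zero where no point projects.
        """
        # Transform points into the camera frame (homogeneous coordinates).
        pts_h = np.hstack([points, np.ones((points.shape[0], 1))])
        pts_cam = (T_cam_sensor @ pts_h.T).T[:, :3]

        # Keep only points in front of the camera.
        pts_cam = pts_cam[pts_cam[:, 2] > 0]

        # Pinhole projection: (u, v, 1)^T ~ K @ (x, y, z)^T.
        uvw = (K @ pts_cam.T).T
        u = np.round(uvw[:, 0] / uvw[:, 2]).astype(int)
        v = np.round(uvw[:, 1] / uvw[:, 2]).astype(int)
        z = pts_cam[:, 2]

        # Discard projections that fall outside the image.
        valid = (u >= 0) & (u < width) & (v >= 0) & (v < height)
        u, v, z = u[valid], v[valid], z[valid]

        # Keep the nearest point when several land on the same pixel.
        depth = np.zeros((height, width))
        order = np.argsort(-z)  # far points first, so near points overwrite
        depth[v[order], u[order]] = z[order]
        return depth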
Recently, driven by the widespread adoption of radar sensors in the automotive and robotics industries, researchers have attempted to address the problem with radar-camera fusion [4]. Still, the use of radar for depth completion has not been thoroughly explored, because radar measurements are even sparser and noisier than LiDAR ones. On the other hand, radars have proven more robust in harsh environments and are generally more cost-effective, which explains their wide use in automotive and robotics applications.
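Because an automotive radar return carries almost no elevation information, a common preprocessing trick reported in the radar depth-completion literature is to smear each projected radar point over a vertical strip before feeding it to the network. The sketch below applies such an extension to a projected sparse depth map; the strip length is an illustrative choice, not a value taken from the cited works.

    import numpy as np

    def extend_radar_points(sparse_depth, extend_px=40):
        """Vertically extend each radar return in a projected sparse depth map.

        Automotive radars measure azimuth and range well but elevation poorly,
        so each return is smeared upward over a strip of extend_px pixels
        (an illustrative value) to give the network denser guidance.
        """
        extended = sparse_depth.copy()
        rows, cols = np.nonzero(sparse_depth)
        for r, c in zip(rows, cols):
            top = max(0, r - extend_px)
            # Only fill pixels that are still empty, keeping original returns.
            strip = extended[top:r, c]
            strip[strip == 0] = sparse_depth[r, c]
        return extended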

Kind of work

In this study, we will first investigate existing radar-camera depth-completion approaches. Building on these, we will develop a self-learning fusion approach for scene reconstruction in adverse environmental conditions; a minimal sketch of the self-supervised signal typically used in this setting is given after the list below.

  • Master Thesis internship @ IMEC (6 months)
  • Optionally preceded by a summer internship (1-3 months) @ IMEC (the summer internship alone is not possible)
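As a pointer to the self-learning direction mentioned above, self-supervised depth methods in the spirit of [4] replace dense ground truth with a photometric reprojection signal: the predicted depth and relative camera pose are used to warp a neighbouring frame into the current view, and the warped image is compared to the observed one. The PyTorch sketch below shows that warping loss in its simplest form; the depth and pose inputs are assumed to come from hypothetical networks, and the interface is illustrative rather than any published method's API.

    import torch
    import torch.nn.functional as F

    def photometric_loss(img_ref, img_src, depth, K, T_src_ref):
        """Warp img_src into the reference view and compare it to img_ref.

        img_ref, img_src : (B, 3, H, W) consecutive video frames.
        depth            : (B, 1, H, W) predicted depth for the reference frame.
        K                : (B, 3, 3) camera intrinsics.
        T_src_ref        : (B, 4, 4) relative pose from reference to source frame.
        """
        B, _, H, W = depth.shape
        device, dtype = depth.device, depth.dtype

        # Pixel grid in homogeneous coordinates, shape (B, 3, H*W).
        ys, xs = torch.meshgrid(
            torch.arange(H, device=device, dtype=dtype),
            torch.arange(W, device=device, dtype=dtype),
            indexing="ij",
        )
        ones = torch.ones_like(xs)
        pix = torch.stack([xs, ys, ones], dim=0).reshape(1, 3, -1).expand(B, -1, -1)

        # Back-project to 3D with the predicted depth, then move to source frame.
        cam = torch.linalg.inv(K) @ pix * depth.reshape(B, 1, -1)
        cam_h = torch.cat([cam, torch.ones(B, 1, H * W, device=device, dtype=dtype)], dim=1)
        cam_src = (T_src_ref @ cam_h)[:, :3]

        # Project into the source image and normalise to [-1, 1] for grid_sample.
        proj = K @ cam_src
        uv = proj[:, :2] / proj[:, 2:].clamp(min=1e-6)
        u = 2.0 * uv[:, 0] / (W - 1) - 1.0
        v = 2.0 * uv[:, 1] / (H - 1) - 1.0
        grid = torch.stack([u, v], dim=-1).reshape(B, H, W, 2)

        warped = F.grid_sample(img_src, grid, padding_mode="border",
                               align_corners=True)
        # Plain L1 penalty; published methods typically add SSIM and masking.
        return torch.abs(img_ref - warped).mean()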


    Framework of the Thesis

    In Collaboration with
    Dr. André Bourdoux (IMEC) Andre.Bourdoux@imec.be
    Dr. Seyed Hamed Javadi (IMEC) hamed.javadi@imec.be

    References:
    [1] S.F. Bhat, I. Alhashim, and P. Wonka. “AdaBins: Depth estimation using adaptive bins”. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pages 4009–4018, 2021.
    [2] M.A.U. Khan, D. Nazir, A. Pagani, H. Mokayed, M. Liwicki, D. Stricker, and M.Z. Afzal, “A Comprehensive Survey of Depth Completion Approaches”, Sensors, vol. 22, no. 18, 6969, 2022. https://doi.org/10.3390/s22186969
    [3] C. Fu, C. Mertz and J. M. Dolan, "LIDAR and Monocular Camera Fusion: On-road Depth Completion for Autonomous Driving," 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 2019, pp. 273-278, doi: 10.1109/ITSC.2019.8917201.
    [4] S. Gasperini, P. Koch, V. Dallabetta, N. Navab, B. Busam and F. Tombari, "R4Dyn: Exploring Radar for Self-Supervised Monocular Depth Estimation of Dynamic Scenes," 2021 International Conference on 3D Vision (3DV), London, United Kingdom, 2021, pp. 751-760, doi: 10.1109/3DV53792.2021.00084.

    Expected Student Profile

  • Following an MSc in a field related to one or more of the following: Electrical Engineering, Computer Science, or Applied Computer Science.
  • Experience with image processing, signal processing, and computer vision. Some knowledge of radar concepts is a plus.
  • Experience with machine learning and statistics.
  • Strong programming skills (Python).
  • Interest in developing state-of-the-art machine learning methods and conducting experiments.
  • Ability to write scientific reports and communicate research results at conferences in English.
    Promotor

    Prof. Hichem Sahli

    +32 (0)2 629 2916

    hsahli@etrovub.be
