Subject
Gold-standard, marker-based Motion Capture (MoCap) systems such as Vicon are expensive and lack portability, which limits their adoption in clinical settings. Recent advances in low-cost depth sensors and Deep Learning (DL) have enabled affordable and portable marker-based MoCap solutions [1]. In this context, we developed HoloMoCap [2], a portable Augmented Reality system that captures motion accurately and in real time. The system uses the HoloLens depth sensor to track reflective markers placed on the subject's skin. However, the depth sensor runs at only 5 Frames Per Second (FPS), which is too low for clinical applications.
Kind of work
Objective
The objective of this thesis is to enhance the current version of HoloMoCap by increasing its frame rate, ideally to 30 FPS. This will be achieved by developing a new motion-tracking method that leverages not only the depth sensor but also the other HoloLens cameras (the RGB camera and the grayscale cameras).
Description of Work
- Literature review.
- Getting familiar with the HoloLens, Unity, and HoloMoCap.
- Development of the tracking method. Two main approaches will be explored to achieve a higher frame rate:
  o Exploiting optical flow from the RGB sensor [3].
  o Using the two lateral grayscale cameras for triangulation [4,5].
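To make the two candidate approaches concrete, the sketch below illustrates both building blocks in isolation: a single-window Lucas-Kanade step, which could propagate a marker's 2D image position through intermediate frames between depth-sensor updates, and linear (DLT) triangulation of a marker observed by two cameras. All images, camera matrices, and pixel coordinates are synthetic placeholders, not HoloLens data or calibration values; this is an illustrative sketch of the general techniques, not the HoloMoCap implementation.

```python
import numpy as np

# --- Approach 1: optical flow (single-window Lucas-Kanade) ---------------
def lucas_kanade(I1, I2, x, y, win=7):
    """Estimate the (dx, dy) displacement of the patch centred at (x, y)
    between two consecutive frames I1 and I2."""
    h = win // 2
    p1 = I1[y - h:y + h + 1, x - h:x + h + 1].astype(float)
    p2 = I2[y - h:y + h + 1, x - h:x + h + 1].astype(float)
    Ix = np.gradient(p1, axis=1)        # spatial gradients of the patch
    Iy = np.gradient(p1, axis=0)
    It = p2 - p1                        # temporal difference
    # Normal equations of the brightness-constancy constraint Ix*u + Iy*v = -It
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)

# --- Approach 2: stereo triangulation (DLT) ------------------------------
def triangulate(P1, P2, uv1, uv2):
    """Linear triangulation of one 3D point from two 3x4 projection
    matrices and the point's pixel coordinates in each image."""
    A = np.array([uv1[0] * P1[2] - P1[0],
                  uv1[1] * P1[2] - P1[1],
                  uv2[0] * P2[2] - P2[0],
                  uv2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)         # solution: last right singular vector
    X = Vt[-1]
    return X[:3] / X[3]                 # de-homogenize

# Demo 1: a Gaussian blob shifted by (0.5, 0.3) px between two frames.
yy, xx = np.mgrid[0:64, 0:64]
blob = lambda cx, cy: np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / 50.0)
dx, dy = lucas_kanade(blob(32, 32), blob(32.5, 32.3), 32, 32)

# Demo 2: two synthetic pinhole cameras 10 cm apart observing one marker.
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])
X_true = np.array([0.2, -0.1, 1.5])
proj = lambda P, X: (P @ np.append(X, 1.0))[:2] / (P @ np.append(X, 1.0))[2]
X_est = triangulate(P1, P2, proj(P1, X_true), proj(P2, X_true))
```

In the noiseless demo, the flow estimate recovers the sub-pixel shift and the triangulated point matches the ground-truth marker position; in practice a production system would add coarse-to-fine pyramids, outlier rejection, and the real HoloLens camera calibration.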
Framework of the Thesis
References
[1] Ceglia A. et al. (2025). Real-time, accurate, and open-source upper-limb musculoskeletal analysis using a single RGBD camera: An exploratory hand-cycling study. Comput. Biol. Med. 184, 109.
[2] Zaccardi S. et al. (2025). HoloMoCap: Real-Time Clinical Motion Capture with HoloLens 2. Proc. IEEE AIxVR 2025.
[3] Lim S. and El Gamal A. (2001). Optical flow estimation using high frame rate sequences. Proc. IEEE Int. Conf. on Image Processing (ICIP 2001), pp. 925-928, vol. 2. doi: 10.1109/ICIP.2001.958646.
[4] Kunz C. et al. (2020). Infrared marker tracking with the HoloLens for neurosurgical interventions. Current Directions in Biomedical Engineering 6, 20200027.
[5] Gsaxner C. et al. (2021). Inside-out instrument tracking for surgical navigation in augmented reality. Proc. 27th ACM Symposium on Virtual Reality Software and Technology (VRST '21). doi: 10.1145/3489849.3489863.
Expected Student Profile
Programming experience in Python and C#/C++ is required. Experience with the Unity game engine is preferred. The candidate should be interested in computer vision and augmented reality.