Integration of Leap Motion sensor with camera for better performance in hand tracking

Khadijeh Mahdikhanlou, Hossein Ebrahimnezhad

Article ID: 3020
Vol 5, Issue 2, 2024
DOI: https://doi.org/10.54517/m.v5i2.3020
Received: 22 October 2024; Accepted: 27 November 2024; Available online: 10 December 2024; Issue release: 31 December 2024



Abstract

In this paper, we propose a framework for hand tracking in human-computer interaction applications. Leap Motion is a popular interface in virtual reality and computer games. In this study, we evaluate the merits and drawbacks of this device; its limitations restrict the user's free movement. The purpose of this study is to find an optimal way to use Leap Motion. We propose a framework that estimates the hand pose in a larger space around the Leap Motion sensor. The framework integrates Leap Motion with a camera, with the two devices placed at different positions so that the hand is captured from multiple views. The experiments are designed around common tasks in human-computer interaction applications. The findings demonstrate that the proposed framework enlarges the space in which precise interaction is possible.
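The abstract does not give implementation details, so the following is only a minimal sketch of one plausible fusion scheme consistent with the description: trust the Leap Motion estimate near the sensor, where its tracking is precise, and lean on a camera-based estimate farther away. The helper functions, the reliable-radius value, and the distance-based weighting are all assumptions introduced for illustration, not the authors' method.

```python
# Minimal sketch (not the authors' implementation): blend a Leap Motion hand-pose
# estimate with a camera-based estimate expressed in the same coordinate frame.
# The sensor-reading helpers below are hypothetical placeholders.

import numpy as np

LEAP_RELIABLE_RADIUS_MM = 400.0  # assumed radius of Leap Motion's precise zone


def read_leap_palm_position():
    """Hypothetical: return the palm position (x, y, z) in mm from Leap Motion, or None."""
    ...


def estimate_palm_from_camera(frame, extrinsics):
    """Hypothetical: camera-based 3D palm estimate, transformed into Leap coordinates."""
    ...


def fuse(leap_pos, cam_pos):
    """Weight the two estimates: favor Leap Motion near the sensor, the camera farther out."""
    if leap_pos is None:
        return None if cam_pos is None else np.asarray(cam_pos)
    if cam_pos is None:
        return np.asarray(leap_pos)
    dist = np.linalg.norm(leap_pos)  # distance of the palm from the Leap Motion origin
    w = np.clip(1.0 - dist / LEAP_RELIABLE_RADIUS_MM, 0.0, 1.0)  # weight for the Leap estimate
    return w * np.asarray(leap_pos) + (1.0 - w) * np.asarray(cam_pos)
```

In such a scheme, the camera would need to be calibrated against the Leap Motion coordinate frame (the `extrinsics` argument above) so that both estimates can be compared and blended directly.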


Keywords

Leap Motion; hand tracking; human-computer interaction; 3D hand pose dataset; deep learning




Copyright (c) 2024 Khadijeh Mahdikhanlou, Hossein Ebrahimnezhad

License URL: https://creativecommons.org/licenses/by/4.0/


This site is licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).