Development of a Biomechatronic Device for Motion Analysis Through an RGB-D Camera


Francesca Pristerà https://orcid.org/0000-0003-1671-1520
Alessandro Gallo https://orcid.org/0000-0002-7989-5921
Salvatore Fregola https://orcid.org/0000-0001-5636-3490
Alessio Merola https://orcid.org/0000-0002-8728-2084

Keywords

Motion analysis, Smart rehabilitation, Home rehabilitation, Biomechatronic device

Abstract

This work investigates the validity and reliability of a novel biomechatronic device providing an interactive Augmented Reality (AR) environment for neuromotor rehabilitation. An RGB-depth camera and a telemonitoring/remote-signaling module are the main components of the device, together with a PC-based interface. The interactive environment, which implements optimized body motion-capture algorithms and novel methodologies for human body motion analysis, enables neuromotor rehabilitation treatments that adapt to the performance and individual characteristics of the patient. The RGB-depth camera module is implemented through Microsoft Kinect and ORBBEC ZED2K devices; the telemonitoring module for teleassistance and therapy supervision is implemented as a cloud service.


Within the body motion tracking module, the abduction and adduction movements of the limbs of the full-body structure are tracked and the joint angles are measured in real time; the most distinctive feature of the tracking module is the monitoring of trunk and shoulder posture during the exercises performed by the patient. Indeed, the device recognizes an incorrect position of the patient's body that could compromise the objective of the exercise. The recognition of an incorrectly performed exercise triggers an alert to both the patient and the physician, in order to maximize the effectiveness of the treatment based on the user's potential and to improve the quality of the biofeedback.
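As an illustration, the real-time joint-angle measurement and trunk-posture check described above can be sketched from the 3D keypoints returned by a skeleton tracker. This is a minimal sketch, not the device's actual implementation: the keypoint coordinates, the assumed camera-frame up axis, and the `TILT_LIMIT` threshold are all illustrative.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (degrees) at joint b, formed by segments b->a and b->c,
    given 3D keypoint positions (e.g., shoulder, elbow, wrist)."""
    u = np.asarray(a, float) - np.asarray(b, float)
    v = np.asarray(c, float) - np.asarray(b, float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def trunk_tilt(shoulder_mid, hip_mid):
    """Tilt of the trunk segment from the vertical axis, in degrees."""
    t = np.asarray(shoulder_mid, float) - np.asarray(hip_mid, float)
    vertical = np.array([0.0, 1.0, 0.0])  # camera-frame up axis (assumed)
    cos = np.dot(t, vertical) / np.linalg.norm(t)
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Example keypoints in meters: elbow bent at a right angle,
# trunk leaning slightly forward of vertical.
elbow_angle = joint_angle([0.3, 1.4, 0.0], [0.3, 1.1, 0.0], [0.6, 1.1, 0.0])

TILT_LIMIT = 15.0  # illustrative threshold for the posture alert
tilt = trunk_tilt([0.05, 1.5, 0.0], [0.0, 1.0, 0.0])
posture_alert = tilt > TILT_LIMIT  # raised when the trunk leans too far
```

In a real pipeline, such angles would be computed every frame from the tracker's skeleton stream, and a sustained threshold violation, rather than a single-frame one, would generate the alert to patient and physician.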


The experimental tests, carried out by reproducing several neuromotor exercises in the interactive environment, show that the recognition and feature extraction of the joints and segments of the patient's musculoskeletal structure, as well as the detection of incorrect posture during exercises, achieve good performance under the different experimental conditions.


The developed device is a valid tool for patients affected by chronic disability, and it could be extended to neurodegenerative diseases in their early stages. Thanks to the enhanced interactivity in augmented reality, the patient can overcome some difficulties in interacting with the most common IT tools and technologies while performing rehabilitation at home. The physician can also check the results in real time and customize the care pathway.


The enhanced interactivity provided by the device during rehabilitation sessions increases both the patient's motivation and the continuity of care, and it supports low-cost remote assistance and telemedicine by optimizing therapy costs.


The key points are:



i) making rehabilitation motivating for the patient, who becomes a "player";

ii) optimizing effectiveness and costs;

iii) enabling low-cost remote assistance and telemedicine.

