
BACHELOR'S THESIS (TRABAJO DE FIN DE GRADO)

GRADO EN INGENIERÍA ELECTRÓNICA INDUSTRIAL

Automatic Estimation System of Dynamic Knee Alignment Based on Depth Video

Author

Carlos Bailón Romacho

Supervisors

Miguel Damas Hermoso
Héctor Pomares Cintas

Department

Arquitectura y Tecnología de Computadores

Granada, July 2016


AUTOMATIC ESTIMATION SYSTEM OF DYNAMIC KNEE ALIGNMENT BASED ON DEPTH VIDEO

Carlos Bailón Romacho

PALABRAS CLAVE: alineamiento de rodilla · ángulo de proyección frontal · marcadores reflectantes · ligamento cruzado anterior · síndrome patelofemoral · sensor de profundidad · Kinect · análisis de vídeo en 2D

RESUMEN: La medida del grado de alineamiento de la articulación de la rodilla es uno de los indicadores más utilizados para medir el riesgo de sufrir lesiones en el complejo rotuliano, especialmente en el ligamento cruzado anterior (LCA). Para estimar dicho grado de alineamiento, un método ampliamente utilizado es la medida de la proyección sobre el plano frontal (APPF) del ángulo de la rodilla. Sin embargo, los procedimientos tradicionales para medirlo presentan limitaciones prácticas, que derivan en altos costes y grandes inversiones de tiempo al evaluar múltiples sujetos. En este trabajo se presenta un novedoso sistema de análisis en vídeo, orientado al apoyo de los expertos en la medida dinámica del APPF de un modo económico y sencillo. El sistema desarrollado emplea el sensor de profundidad incluido en el sensor Kinect V2 para monitorizar marcadores retro-reflectantes situados sobre las articulaciones de la cadera, rodilla y tobillo del paciente, para proporcionar una estimación automática del ángulo formado por las mismas. La información registrada por el sensor es procesada y gestionada por una aplicación de ordenador que simplifica el trabajo del experto y acelera el análisis de los resultados.

KEYWORDS: knee alignment · Frontal Plane Projection Angle · reflective markers · anterior cruciate ligament · patellofemoral pain syndrome · depth sensor · Kinect · 2D video analysis

ABSTRACT: Knee alignment measurements are one of the most widely used indicators of knee-complex injuries, especially of the anterior cruciate ligament (ACL). The knee Frontal Plane Projection Angle (FPPA) is widely used as a 2D estimation of knee alignment. However, traditional procedures to measure this angle suffer from practical limitations, which lead to high costs and huge time investments when evaluating multiple subjects. This work presents a novel video analysis system aimed at supporting experts in the dynamic measurement of the FPPA in a cost-effective and easy-to-use way. The system employs the Kinect V2 depth sensor to track retro-reflective markers attached to the patient's leg joints, providing an automatic estimation of the angle formed by the hip, knee and ankle joints. The information registered by the sensor is processed and managed by a computer application that simplifies the expert's work and expedites the analysis of the test results.


UNIVERSIDAD DE GRANADA

INGENIERÍA ELECTRÓNICA INDUSTRIAL

AUTHORIZATION FOR THE DEFENCE OF THE BACHELOR'S THESIS

Mr. Miguel Damas Hermoso and Mr. Héctor Pomares Cintas, lecturers of the Departamento de Arquitectura y Tecnología de Computadores of the Universidad de Granada, as supervisors of the Bachelor's Thesis entitled "Automatic Estimation System of Dynamic Knee Alignment Based on Depth Video", carried out by the student Mr. Carlos Bailón Romacho,

CERTIFY that the aforementioned Bachelor's Thesis has been carried out and written by said student, and authorize its presentation.

Granada,

Signed: Miguel Damas Hermoso          Signed: Héctor Pomares Cintas


UNIVERSIDAD DE GRANADA

INGENIERÍA ELECTRÓNICA INDUSTRIAL

AUTHORIZATION FOR DEPOSIT IN THE DEPARTAMENTO DE ARQUITECTURA Y TECNOLOGÍA DE COMPUTADORES

I, Carlos Bailón Romacho, with national identity document (DNI) 75164237E, author of the Bachelor's Thesis entitled "Automatic Estimation System of Dynamic Knee Alignment Based on Depth Video", carried out at the Universidad de Granada,

AUTHORIZE the deposit of this Thesis in the Secretary's Office of the Departamento de Arquitectura y Tecnología de Computadores, and its viewing subject to the signing of a confidentiality agreement.

Granada,

Signed: Carlos Bailón Romacho


Preface

When I decided to take on this project, I aimed to do something special, something through which I could make my small contribution to the world. It has been a long road to reach this point. The amount of work has been huge, since I have had to deal with several disciplines from many areas, such as physical rehabilitation, medical knee injuries, video analysis technologies, computer programming, data processing and database organization. In addition, the research nature of the project makes it difficult to capture the whole project in this document.

Through these words, I would like to emphasize the effort put into this work, as well as the difficulty of bringing together knowledge from all the aforementioned areas, and the courage to employ the technologies which provided the best solution to the problems faced during the project, even though they were new to me.

I would also like to note that this Degree Thesis is part of an extensive research project, which opens the way to a new research line on which I hope to keep working in the coming years. Without further delay, I hope the reader enjoys this work as much as I have enjoyed its development.

Carlos Bailón Romacho


Acknowledgements

I would like to thank the people who supported me throughout the development of this work: Miguel Damas and Héctor Pomares, who encouraged me to take on this project; Oresti Baños and José A. Moral-Muñoz, whose wise advice and revisions have made this possible; Salvador Moreno, for putting up with me during long working hours; Antonio Fernández and Dioni González, for their interest and support when conceiving the project; Pablo Molina, who gave me detailed information about the tests; my family, because they trusted and supported me at all times; and my friends, who have made me believe in the greatness of this project.

Thanks a lot, this is only the beginning.


Contents

List of Figures

List of Tables

List of Codes

1 Introduction
  1.1 Motivation and Context
  1.2 Objectives
  1.3 Structure

2 State of the Art
  2.1 Rehabilitation Assessment Systems
  2.2 Motion Capture Systems
    2.2.1 Inertial Sensors
    2.2.2 3D Video Motion Tracking
    2.2.3 2D Offline Video Analysis
  2.3 Kinect Sensor
    2.3.1 Healthcare
    2.3.2 Biomechanics
    2.3.3 Robotics
    2.3.4 Retail Industry
    2.3.5 Education

3 Test Procedure
  3.1 Single Leg Landing Test
  3.2 Other Knee Alignment Tests


4 System Description
  4.1 Overview
  4.2 Kinect V2 Sensor
    4.2.1 Operation Principle
    4.2.2 Technical Specifications
  4.3 Retro-reflective Markers
  4.4 Automatic FPPA Measurement
    4.4.1 Marker Tracking
    4.4.2 FPPA Computing

5 Computer Application
  5.1 Application Description
    5.1.1 Start Window
    5.1.2 Database Window
    5.1.3 Main Window
    5.1.4 Trend Window
  5.2 Application Implementation
    5.2.1 Kinect APIs
    5.2.2 Image Construction
    5.2.3 Algorithm Implementation
    5.2.4 Graph Visualization
    5.2.5 Database Management
    5.2.6 Extra Features
  5.3 Data Post-processing
  5.4 System Robustness

6 Evaluation

7 Conclusions and Future Work
  7.1 Conclusions
  7.2 Future Work

Appendix

References


List of Figures

1.1 Distribution (percentages) of injuries by body part for games and practices for 15 sports [1].
1.2 Characteristic joint movements at the "position of no return" [2].
1.3 Frontal Plane Projection Angle [3].
1.4 Investments in healthcare devices during the last 15 years [4].
2.1 Number of documents indexed in the Scopus database with "rehabilitation system" keywords
2.2 Xsens body suit for motion tracking
2.3 3D motion capture laboratory setting. A high number of cameras are employed [5].
2.4 3D motion tracking cameras: (a) Vicon Bonita, (b) OptiTrak Prime 13.
2.5 Kinovea software main screen [6].
2.6 10-meter walkway gait analysis. Comparison between Kinect and Optotrak [7].
3.1 Diagram of the lateral view of a SLL test scenario set-up
3.2 SLL test procedure: (a) marker placing, (b) test start position, (c) test end position with FPPA displayed
4.1 Proposed system structure and elements
4.2 Kinect V2 Sensor [8].
4.3 Time-of-Flight operation principle
4.4 Spherical retro-reflective markers used in the project


4.5 SLL data measured with markers (blue line) and without markers (red line) during the same experiment
4.6 Bland-Altman plot of the measurements with and without markers. The three dotted lines indicate the mean (x̄) and the Bland-Altman "limits of agreement" (x̄ ± 1.96σX)
4.7 FPPA measurement algorithm
4.8 Classification of markers: (a) reflective element in the scene which is not part of a marker, not tracked, (b) knee marker pixel, (c) ankle marker pixel
4.9 Representation of the two vectors used to compute the FPPA and the reference system for the algorithm
5.1 Application structure
5.2 Application Start Window
5.3 Database Window: (a) database editor with some example users included, (b) dialog window for new patient register
5.4 Main Window of the Single Leg Landing test
5.5 Trial icons: Green tick (correct), Yellow alert (many frames missing) and Red cross (incorrect).
5.6 Results Dialog Window shown when the test is finished
5.7 Historical data analysis window. Patient 1 and the Single Leg Landing test are selected.
5.8 High-level architecture of the Kinect Sensor
6.1 Bland-Altman plot for the agreement analysis between measurement methods. The three dotted lines indicate the mean (x̄) and the Bland-Altman "limits of agreement" (x̄ ± 1.96σX)
7.1 (a) View Database button click, (b) profile creation dialog
7.2 (a) Profile selection for editing and deleting, (b) confirmation of user deletion
7.3 (a) Test window with sensor unplugged, (b) test window working
7.4 Results window
7.5 Historic window


List of Tables

4.1 Kinect V2 sensor specifications
5.1 Coordinate systems of Kinect sensor
6.1 Case study results. Angle values are expressed in degrees (°).
6.2 Inter-rater reliability between Kinovea and the proposed system.


List of Codes

5.1 XAML example code of a ComboBox creation
5.2 Initialization of the Kinect sensor and acquisition of the frame
5.3 Image construction from the IR frame
5.4 Acquisition of the estimated joint position (3D) and mapping to Depth Space Point (2D)
5.5 Computing of the marker's coordinates
5.6 Computing of the FPPA angle value
5.7 Construction of a data graph with the Dynamic Data Display library
5.8 Database management class
5.9 Custom method for row searching into database
5.10 Bitmap encoding method
5.11 Process to bind data to GUI element properties
5.12 Export of the data to a CSV file
5.13 Export of the data to an Excel file
5.14 Matlab script for CSV data extraction


1 Introduction

1.1 Motivation and Context

Knee-joint injuries are considered among the most common injuries in sports practitioners. According to a study performed by the National Collegiate Athletic Association (NCAA) [1], lower-extremity injuries account for 58.3% of all injuries suffered during sport practice (Figure 1.1). Moreover, knee ligament tears rank second among lower-limb injuries [9].

Two of the most remarkable disorders are the anterior cruciate ligament (ACL) injury and the patellofemoral pain syndrome (PFPS). In particular, the ACL injury stands out because of the need for surgery in most cases and the long rehabilitation periods. It is also associated with potential long-term complications, including chronic knee instability, meniscus tears, cartilage injury, and the development of osteoarthritis [10]. In addition, after ACL surgery, less than 50% of patients return to sports within 1 year, less than 65% return within 2 years, and 11% cease sports practice altogether [11, 12].

ACL injuries have become a public health concern, particularly given the number of young individuals involved in competitive sports, the increasing incidence of tears in pediatric patients, and the long-term consequences of the injuries. In addition, ACL injuries not only have a big impact on sport practice, but also a remarkable economic impact. The estimated long-term mean lifetime cost to society is high: over $38,121 per patient after reconstructive surgery and $88,538 for those managed with rehabilitation [13].


Figure 1.1: Distribution (percentages) of injuries by body part for games and practices for 15 sports [1].

Furthermore, it is also known that completing a prevention program in time can produce a quantifiable reduction in the risk of ACL injury (52% in females and 85% in males, according to [14]). Other research by Kato et al. [15] found that specific training would bring about a 41% decrease in knee valgus¹ during jump-shot landing. All the aforementioned consequences of ACL injuries have led to a growing interest in the identification of the risk factors for the injury, so that it can be avoided whenever possible. With this aim, one of the essential tasks is the identification of the mechanisms of ACL injury.

¹ For the difference between knee valgus and knee varus, see Section 3.1.

According to some studies [10], a non-contact mechanism of ACL injury occurs in 70–80% of cases. One of the most commonly described mechanisms of injury, especially in girls, is the landing from a jump [16]. It involves landing with the hip extended and the knee in a valgus position, together with internal rotation of the tibia and a pronated foot (the so-called "position of no return") [16, 17]. This position produces a high knee abduction moment, resulting in the collapse of the knee, and it has been demonstrated to provoke ACL injury [2]. Figure 1.2 depicts the joint movements at this position.

Figure 1.2: Characteristic joint movements at the "position of no return" [2].

Based on these statements, an interesting field of research is the set of techniques used to perform a dynamic evaluation of knee alignment, with the aim of identifying excessive knee valgus and landing abnormalities. In order to quantify knee alignment, Wilson et al. [18] introduced the Frontal Plane Projection Angle (FPPA), reporting good within-day reliability in their study (ICC² = 0.88). Many subsequent studies [3, 19, 20] have demonstrated that the FPPA has a direct relationship with the knee alignment measurement, and therefore with the ACL injury risk (and also with other previously mentioned injuries, such as PFPS).

² Intraclass Correlation Coefficient: a measure of the agreement between the observations of different raters when evaluating the same set of measurements.

Figure 1.3 shows the composition of the FPPA. It is obtained as the projection of the knee angle onto the frontal plane of the body, and corresponds to the deviation of the hip–knee line from the knee–ankle line. It can be measured during both static and dynamic tasks but, for ACL injury risk assessment, it is more useful during dynamic movements, as they simulate real sport movements. We have focused on those which involve landing from a jump, since that is one of the aforementioned mechanisms of ACL injury.
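In terms of the frontal-plane joint coordinates, and denoting the hip, knee and ankle positions by h, k and a respectively, the FPPA can be written as the deviation of the hip–knee line from the knee–ankle line. The following expression is a standard formulation consistent with the definition above (written here for clarity, not quoted from [18]):

\[
\mathrm{FPPA} = 180^{\circ} - \arccos\frac{(\mathbf{h}-\mathbf{k})\cdot(\mathbf{a}-\mathbf{k})}{\lVert\mathbf{h}-\mathbf{k}\rVert \, \lVert\mathbf{a}-\mathbf{k}\rVert}
\]

so that a perfectly aligned hip–knee–ankle line yields 0°, and larger values indicate a larger frontal-plane deviation (valgus or varus).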

As will be shown in Chapter 2, currently used techniques for measuring the FPPA are either excessively expensive or time-consuming. In this context, there is a need for an FPPA measurement tool that combines low cost with promptness and provides an automatic measurement, thus improving its accuracy.

Figure 1.3: Frontal Plane Projection Angle [3].

The employment of new devices and software in healthcare disciplines can cope with some of the limitations introduced by human error during medical procedures, and it has been growing over the last years. Figure 1.4 shows the increasing investment in healthcare device development in the United States during the last 15 years, which has doubled since 2013 (the dotted part of the 2015 data is the forecast for the rest of that year). Some factors attributable to this growing interest are:

• The demand by healthcare users for novel systems for the early identification of the risk of suffering injuries and disorders, in order to prevent their appearance. Prehabilitation [21] is a concept that has been employed by athletes in an effort to reduce the risk of injuries, and consists of improving functional capacity before encountering an injury.

• The need to reduce healthcare costs [22]. Saving time when performing biomechanical analyses and using low-cost devices will greatly reduce the investment made in these procedures.

• The appearance of new technologies that have not previously been used within healthcare disciplines [23].


Figure 1.4: Investments in healthcare devices during the last 15 years [4].

1.2 Objectives

In the light of the utility of the FPPA measurement as well as the potential of new technologies, this work presents a 2D automatic video analysis system, intended to support experts in the dynamic measurement of the FPPA by using a cost-effective, easy-to-use, single-camera solution, in contrast to the existing technologies used to perform this analysis. The main tasks of this project are:

• The tracking of reflective markers attached to a person's body.

• The real-time determination of the FPPA based on the position of the markers.

• The acquisition and representation of the FPPA values during a test performance.

• The analysis of the online results when a test is finished, and of the offline results stored in an internal database, all through a computer application designed ad hoc.

The system makes use of the depth sensor included in the Kinect V2 sensor to track the reflective markers. The system has been designed not only to expedite the acquisition of the test results, but also to provide a framework that simplifies the analysis and management of the results, through the previously mentioned computer application.


1.3 Structure

This document is divided into seven chapters:

• Chapter 1: Introduction. The current chapter. It consists of three sections. The first, Motivation and Context (Section 1.1), establishes the theoretical bases of the project and describes the importance of its development. The second, Objectives (Section 1.2), contains a summary of the objectives proposed at the beginning of the thesis. The last section, Structure (Section 1.3), presents the structure and composition of the thesis.

• Chapter 2: State of the Art. This chapter shows the recent proliferation of systems for supporting the diagnosis of injuries and rehabilitation treatments (Section 2.1). It then presents a review of the current motion capture systems (Section 2.2), oriented to FPPA measurement, and the different application fields of the Kinect sensor (Section 2.3).

• Chapter 3: Test Procedure. In this chapter, we describe the procedure of the Single Leg Landing test (Section 3.1) and the FPPA measuring method during it. We also present an overview of all the tests in which the FPPA can be measured (Section 3.2).

• Chapter 4: System Description. This chapter describes the system presented in this work (Section 4.1). It also presents a description and analysis of the Kinect V2 sensor (Section 4.2) and the retro-reflective markers (Section 4.3), justifying their usage. Finally, it describes the algorithm developed for the automatic FPPA measurement (Section 4.4).

• Chapter 5: Computer Application. This chapter provides a description of the application for data management (Section 5.1) and the implementation of the system and the programming code (Section 5.2). It also provides a proposal for data post-processing (Section 5.3) and explains the system robustness approach (Section 5.4).

• Chapter 6: Evaluation. This chapter contains the results of the evaluation carried out to check the reliability of the system against a 2D video analysis software.

• Chapter 7: Conclusions and Future Work. The last chapter summarizes the conclusions reached in this thesis (Section 7.1) and makes a proposal for future work (Section 7.2).


2 State of the Art

In this chapter we show the recent proliferation of systems for assisting healthcare experts in rehabilitation treatments and injury diagnosis, as well as in the decision-making process (Section 2.1). In Section 2.2, we describe the existing techniques for motion capture. Finally, the main fields of application of the Kinect sensor are described in Section 2.3.

2.1 Rehabilitation Assessment Systems

When we talk about biomedical engineering, we refer to the field of study that combines medicine and physical therapy with computer science, electronics and other similar fields. The main goal of this discipline is to develop the technology needed to improve and simplify everything related to medicine. The applications of biomedical engineering are so wide-ranging that they could be the topic of a thesis on their own. Therefore, in this section we give an overview of its application in the physical rehabilitation field.

This field of study has proliferated greatly in recent years, due to the development of new technologies and the reduction of costs. In the literature, we can find many works related to rehabilitation assessment systems. Figure 2.1 depicts the number of documents indexed in the Scopus database with the "rehabilitation system" keywords, from 1990 to 2015. It shows a total of 47,115 documents, growing from 483 in 1990 to 3,663 in 2015, and reaching a maximum in 2014, with 4,147 documents indexed.


Figure 2.1: Number of documents indexed in the Scopus database with "rehabilitation system" keywords

Those systems have been developed for multiple platforms: computer-based as well as mobile- and web-based. One example of a novel rehabilitation system that makes use of electronic devices and digital technologies to improve its efficiency is the one presented in [24], a web-based rehabilitation system to help the patient's recovery away from the clinic. In [25], a computer-based system is developed which combines diagnostic and therapeutic tools to measure intracranial pressure in children. Another example is mDurance [26], a mobile health system that supports experts in the functional assessment of trunk endurance. Digital technologies have also been applied to the next step of the injury diagnosis chain: the support of the decision-making process. In this field, there exist systems that manage the data acquired by sensors and produce recommendations for health experts, based on the clinical history of the patient and on previously established knowledge bases and models. Some examples can be found in the literature [27, 28, 29].


2.2 Motion Capture Systems

In this section we describe the main existing techniques for measuring the joint angles of the body. We establish a classification based on the technology employed: inertial sensors (Section 2.2.1), 3D motion tracking systems (Section 2.2.2) and 2D offline video analysis software (Section 2.2.3).

2.2.1 Inertial Sensors

Probably, the first approach that comes to mind when thinking about real angle measurement is the use of inertial sensors, whose increasing miniaturization makes them suitable for placement on any part of the human body. Inertial measurement units (IMUs) are devices that use a combination of triaxial accelerometers, gyroscopes and sometimes also magnetometers to register the movement of a specific body (in this context, the human body). They allow us to characterize the movement and posture of the human body through the acceleration, the angular velocity and the position with respect to the Earth's magnetic field monitored by the IMU [30]. With suitable data-processing algorithms and filters, very accurate data on the 3D rotations and relative positions of the joints can be obtained. Several studies performed in recent years have employed this technology to measure the knee valgus angle [31, 32, 33].

Their major advantages include their reduced size and accurate data acquisition [34]. However, IMUs have several disadvantages in our field of application. The first is that they typically suffer from accumulated error: as the guidance system continually adds detected changes to its previously calculated position, any measurement errors accumulate from point to point, leading to a drift. Another disadvantage is known as gimbal lock, and consists in the loss of one degree of freedom. It occurs when the axes of two of the three gimbals are driven into a parallel configuration, thus "locking" the system into rotation in a degenerate 2D space. However, there exist algorithms capable of compensating for this error, such as Madgwick's algorithm [35]. Finally, another drawback when using IMUs during dynamic tasks is unintended sensor displacement [36, 37], which can introduce a substantial error, as well as the time consumed when repeating the test.
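As a rough numerical illustration of the accumulated-error problem (the bias value and sampling rate below are assumed for the example and are not taken from the cited studies), the following C# sketch integrates a constant gyroscope bias and shows how it grows linearly into an orientation drift:

    using System;

    class GyroDriftExample
    {
        static void Main()
        {
            double biasDegPerSec = 0.1; // assumed constant gyroscope bias (deg/s)
            double dt = 0.01;           // assumed sampling period: 100 Hz
            double driftDeg = 0.0;      // integrated orientation error (deg)

            // Integrating the (biased) angular rate step by step, as an IMU
            // orientation filter does, accumulates the bias over time.
            for (double t = 0.0; t < 60.0; t += dt)
                driftDeg += biasDegPerSec * dt;

            // After one minute the estimate has drifted by about 6 degrees.
            Console.WriteLine($"Drift after 60 s: {driftDeg:F1} deg");
        }
    }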


Some examples of commercial IMUs commonly used in biomechanics are:

• Shimmer. This device, developed in 2006 by Intel, was released in 2008, and nowadays its distribution and development are handled by Shimmer Research, a division of the company Realtime Technologies. The device consists of a motherboard to which the different available sensors are attached. One of those sensors is the IMU [38].

• Physilog. A device developed by the GaitUp company, fully oriented to motion tracking for biomechanics applications [39].

• Xsens. This device provides a body suit (Figure 2.2) that includes 17 interconnected sensors to provide an accurate representation of the body motion. Although it reduces the problem of sensor displacement, the price increases considerably [40].

Figure 2.2: Xsens body suit for motion tracking


2.2.2 3D Video Motion Tracking

Optical systems use video cameras to capture the movement of special markers which are attached directly to the surface of the subject's body. Although this implies placing external elements on the body, as with inertial sensor-based systems, the size and weight of the markers are so small that the subject's movement is not restricted in any way.

The markers can be either passive or active. Passive markers [41] are made of retro-reflective material that reflects light generated near the camera lens; they are usually illuminated with infrared (IR) light. Active markers [42] are pulsed LED markers that emit IR light rather than reflecting it. These systems triangulate positions by illuminating one LED at a time very quickly, or multiple LEDs together with software that identifies them by their relative positions.

In order to obtain the data, the subject is surrounded by calibrated cameras. Each camera extracts 2D coordinate information for each marker, and the set of 2D data captured by the independent cameras is then analyzed to generate the 3D coordinates of the markers [43].

Figure 2.3: 3D motion capture laboratory setting. A high number of cameras are employed [5].

Those systems provide accurate 3D motion data of the subject. However, their use requires considerable financial and spatial resources. The need to place several cameras around the subject forces the room to be quite large (Figure 2.3 depicts a laboratory set up as a motion capture room, with a high number of cameras and a large, clear space). In addition, each individual camera is very expensive, so the required economic investment is very high. Furthermore, these systems require complex data processing in order to generate the 3D coordinates. This can be too time- and resource-consuming, making this technique impractical for most clinical settings, including FPPA measurement.

Some examples of 3D motion capture video systems are the following:

• Vicon. These cameras make use of passive markers to track the position and motion of the subject. The manufacturer also offers its own software tools with integrated data processing [44]. Figure 2.4a shows the Vicon "Bonita" camera, whose individual cost starts at $3,000 [45].

• OptiTrak. This system offers the option of employing active markers, thus removing the light source from the camera. Software tools are provided too [46]. Figure 2.4b shows the OptiTrak "Prime 13" camera, whose individual cost is around $2,000 [45].

Figure 2.4: 3D motion tracking cameras: (a) Vicon Bonita, (b) OptiTrak Prime 13.

Although this technology is not widely used for knee alignment measurement, it is considered the gold standard in motion tracking, so it is frequently used as a comparison standard to validate other 2D systems.

2.2.3 2D Offline Video Analysis

In the light of the advantages of using optical sensors, and in an attempt to deal with the amount of resources required by 3D motion capture systems, 2D video analysis techniques have proliferated for simple biomechanical analyses. One of the main researchers in this field is Allan G. Munro, who not only demonstrated the reliability of 2D motion analysis [47], but also related it to the assessment of knee injury risk, thus supporting many of the statements on which this thesis is based.

Many studies have introduced 2D video analysis of the FPPA measurement in a variety of tests and have contrasted its validity against the existing 3D techniques. A recent study [19] assessed the reliability of this technique and also established the associated measurement error, in order to provide information for making informed judgments about individual FPPA values; it also emphasizes the need for further investigation of these screening tools. In [20], this method is specifically evaluated as a screening tool for ACL injury, and the results were promising. Other studies [48, 49] also show the contributions of the hip and the ankle to the FPPA measurement.

However, it is worth considering the results obtained in [3], which state that the 2D estimation of the FPPA reflected 23% to 30% of the variance of the 3D kinematic measurements during the activities included in that study. Nevertheless, although 2D analysis is not a substitute for 3D measurements of lower-limb kinematics, the previously mentioned research also proved 2D analysis to be useful for screening the knee-joint FPPA in order to identify subjects at high risk of injury. For example, a recent study [50] validates the use of 2D analysis during a single-leg landing, the main test used in this project, as explained in Section 3.1.

Figure 2.5: Kinovea software main screen [6].

This type of analysis is performed as follows: the subject's movement is recorded with a digital video camera. The video sequences are then loaded into specific software (see Figure 2.5), which allows the user to play the recordings in slow motion and freeze them at the desired frame. The software then allows the user to place markers and lines over the frozen image and to estimate the angle subtended by those lines. This approach, although effective, has a major limitation: time consumption. Being an offline procedure, it demands a great deal of time, especially when evaluating multiple subjects, because it requires two phases: sequence recording and manual video processing. In addition, human errors increase due to the need to freeze the video at the correct frames and to position the markers manually, which makes the process much less reliable, since it depends on the expert's skill at these tasks. It has other potential disadvantages, such as the high resource consumption involved in storing the video files. Some software tools for performing 2D biomechanical video analysis are:

• Kinovea [6]. This software is an open-source project under the GPL license. It is the one used as the gold standard for validating the system presented in this work. Its main screen is shown in Figure 2.5.

• Quintic [51]. It offers a wide range of software tools depending on the objective of the analysis. Used in [19].

• Simi MotionTwin [52]. Allows the comparison of multiple videos. Used in [48].

• DgeeMe [53]. Freeware software. Used in [20].

• eHAB V3 [54]. Oriented to telerehabilitation. Used in [49].


2.3 Kinect Sensor

Kinect is a motion-sensing input device developed by Microsoft, originally for use with the Xbox video game consoles. Later, a new sensor with improved features was developed, oriented towards computer application development. The device includes different sensors, most of them optical, which enable users to control and interact with their console or computer through a natural user interface, using gestures and spoken commands. Although it was originally intended to work as a game console input device, its potential has triggered its use in a wide range of scenarios. In the following, we describe the main areas of application of the Kinect sensor and the main works carried out in each one.

2.3.1 Healthcare

One of the main application fields of the Kinect sensor is medicine and healthcare. It has proved to be a very useful device that can cope with several limitations of many medical procedures. For example, in [55], the Kinect sensor is used to track human respiratory motion through the tracking of special markers attached to a lycra shirt. Another work [56] recognizes human facial patterns in order to support the evaluation of the subject's facial paralysis. Along the same lines, in [57] a system for Parkinson's disease assessment is developed. Human body modeling has also been used to construct a model of the human head, in order to develop a motion tracking system for brain PET [58].

2.3.2 Biomechanics

This is the topic most closely related to this work. Human motion tracking for rehabilitation and biomechanical studies has been evaluated in many works [59] and the results, although good, all agree on the lack of accuracy of Kinect's pose-estimation algorithm when performing quantitative analyses. Because of this, one of the main applications of the Kinect sensor has been the extraction of human gait features and the support of gait analysis [60, 61, 7]. This is displayed in Figure 2.6, which shows a comparison of the analysis using the Kinect sensor and the Optotrak system. Another work [62] makes use of Kinect's depth sensor for assessing spinal loading.


Figure 2.6: 10-meter walkway gait analysis. Comparison between Kinect and Optotrak [7].

2.3.3 Robotics

Computer vision applications have spread since the development of the Kinect sensor. Its accurate depth images have been used as a tool for simultaneous localization and mapping (SLAM) approaches, and several works have evaluated and modeled the sensor [63, 64].

2.3.4 Retail Industry

Some recent works have developed virtual dressing rooms using the Kinect sensor [65, 66]. These systems work as interactive mirrors in which people can try on garments virtually, without having them physically, thus accelerating the process. They make use of the Kinect sensor to create a virtual model of the human body and fit the clothes onto it.

2.3.5 Education

The Kinect sensor has become a useful tool in educational programs, as it provides an interactive Human-Machine Interface (HMI). The KinectEDucation [67] platform provides Kinect-based applications oriented towards delivering active learning experiences within the classroom environment.


3 Test Procedure

In this chapter, we describe in detail the main test used in this work to perform the measurement of the FPPA (Section 3.1). Then, we present an overview of the other tests available in the literature to measure knee alignment (Section 3.2). It is worth noting that those tests can easily be incorporated into the presented system, and doing so is part of the future work.

3.1 Single Leg Landing Test

There exists a wide range of tests in which knee alignment can be measured, regardless of the physical condition of the assessed subject. The results obtained for a given patient help experts to determine their status with regard to ACL injury risk, as mentioned in Section 1.1.

Several functional tests to measure the FPPA under dynamic conditions can be found in the literature. The aim of those tests is to simulate specific movements encountered during sport practice and to examine the FPPA of the knee during them. In this way, the effect of real movements on the knee can be analyzed without having to track the subject during the full sport practice.

As explained in [68], unilateral landings are a more common ACL injury mechanism than bilateral landings, because all the weight of the body is held by one leg, creating a high knee abduction moment (Section 1.1). In order to simulate these landings, the single leg landing (SLL) test requires the subject to perform a unilateral step-landing task.

In this work we have followed the method proposed in [69] for the SLL test. The only material needed (apart from the system presented in this work) is a 30-cm-high bench. The procedure is the following: the subject is asked to stand on the bench on one leg and to step off the bench with the opposite leg. The subject has to land on a mark positioned on the floor, 30 cm from the bench, and hold the position for at least 2 seconds. During the test, subjects have to keep their hands on their hips and ensure that the contralateral leg makes no contact with any other surface. The sensor is placed at the subject's knee level, 2 meters away from the landing target and aligned perpendicular to the frontal plane. Figure 3.1 displays a diagram of the lateral view of the SLL test scenario set-up.

Figure 3.1: Diagram of the lateral view of a SLL test scenario set-up

It is recommended to perform some practice trials before the test, to familiarize subjects with the requested task. Performing several landings during the test is also recommended, and the final result is then extracted from the individual FPPA values (e.g. their average).

For a correct FPPA measurement, three retro-reflective markers are placed on the hip, knee and ankle joints, and are used to identify those points during the measurement. Whereas the bench height, the landing target separation and the sensor height are recommended indications for a correct test performance, the markers must be strictly placed and fixed on the anatomical points described in [18] to obtain a reliable FPPA measurement. Those anatomical points are: (i) the anterior superior iliac spine, (ii) the middle of the tibiofemoral joint and (iii) the middle of the ankle mortise. The marker alignment defines two lines, whose projected angle is recorded as the FPPA. It is usually measured at the point of maximum flexion, and the value obtained is considered the result of the test; however, as will be described in later sections, the presented system allows the user to analyze the FPPA value during the whole test.

Finally, we need to distinguish the direction in which the knee deviates [70]. If the knee moves towards the subject's sagittal plane (towards the body center), it is known as knee valgus (the most common situation), and if the knee moves outwards, it is known as knee varus. According to the aforementioned scientific literature, there are established normal FPPA values for men and women in SLL tests. The average FPPA value for men (± confidence interval) is 4.9 ± 3.5°. For women, values are significantly higher, reaching 7.1 ± 2.5°. Likewise, according to [18], those values should be symmetrical for both legs, within the range of 1–9° for males and 5–12° for females.

Figure 3.2a shows the position of the three retro-reflective markers. Figures 3.2b and 3.2c show the starting and ending positions of the test. In Figure 3.2c, the FPPA has been displayed, showing an example of knee valgus; the markers are displayed as red crosses.

Figure 3.2: SLL test procedure: (a) marker placing, (b) test start position, (c) test end position with FPPA displayed


3.2 Other Knee Alignment Tests

As mentioned at the beginning of Section 3.1, several functional tests to dynamically measure the FPPA can be found in the literature. In this work, the SLL test has been implemented due to its promptness and simplicity, saving time for experts and making it easier for subjects to understand the procedure, as well as for the researcher to acquire a complete dataset. However, in this section we describe other examples of commonly used tests, which simulate other body interactions during sport practice.

• Single Leg Squat (SLS) [3, 18, 71]. This is another commonly used test. The subject has to stand on one leg and perform a squat, lowering the trunk and thus flexing the knee; then, the subject returns to the initial position. During the squat, the subject's hands have to stay on the hips and the contralateral leg cannot make contact with the floor. The maximum knee flexion angle varies across the literature, but it is not recommended to descend beyond 45°. The squat speed should be moderate, at a rate of 15 squats per minute.

• Vertical Drop Jump (VDJ) and Single Leg Vertical Drop Jump (SLVDJ) [2, 72]. This test can be performed either with one leg or with both legs. It comprises dropping off a bench and immediately performing a maximal vertical jump, moving the arms freely. It may include an extrinsic motivation, such as an overhead goal hanging above the ground, which has to be touched by the subject at the maximum jump height.

• Drop Jump Landing (DJL) [69]. This task is performed in the same way as the VDJ. The difference between them is that in the VDJ the FPPA is measured during the reaction phase before the maximal jump, while in the DJL it is measured at the landing of the jump.


4 System Description

In this chapter we present a general overview of the proposed system in Section 4.1. The hardware employed is described in Sections 4.2 (Kinect V2 sensor) and 4.3 (retro-reflective markers); the latter section includes a comparison between the detected joint positions with and without markers, in order to justify their use. Finally, the FPPA measurement procedure and algorithms are explained in Section 4.4.

4.1 Overview

In Chapter 2 we discussed the limitations of 2D video analysis techniques when measuring the FPPA and the excessive resource requirements of 3D motion tracking video systems. In the light of the limitations of existing techniques, and taking into account the objectives proposed for this project, in this section we present an innovative system to support practitioners during the performance of functional tests to measure the knee FPPA. The proposed system tries to cope with the aforementioned limitations, offering an automatic 2D marker tracking system which provides real-time FPPA computation and processing with no need for offline analysis.

Figure 4.1 provides a representation of the system structure. It is composed of three retro-reflective markers, which are attached to the subject's leg joints at the anatomical points described in Section 3.1. Those markers are tracked by a depth sensor, which captures IR video frames. The raw data is seamlessly transmitted to a computer application, which computes the FPPA based on the position of the markers.


Figure 4.1: Proposed system structure and elements

The application not only processes the data, but also provides a framework to support experts during test performance. This includes a functional user interface (UI), the graphical representation of the FPPA values during the whole test (not only the maximum value, as is done in existing 2D analysis software), and the storage of the data in a local database.

The sensor chosen for this project is Microsoft's Kinect V2. This sensor, which will be thoroughly described in the next section, has two main characteristics that make it stand out, in the context of our objectives, over other high-end cameras. The first is ease of use: the Software Development Kit (Kinect for Windows SDK 2.0 [73]) provides tools and APIs that facilitate the management of the sensor data and the development of Kinect-based applications. The second is low cost: the Kinect sensor's current price plus the necessary computer adapter is less than $200 [74], which is far cheaper than the other high-end cameras mentioned in Section 2.2.2. Taking its functionality into account, this makes it more cost-effective than the previously mentioned high-end systems.


Finally, the last feature achieved by the project is time-effectiveness. Combining sensor and application, the system offers an accurate real-time FPPA measurement and easy data management, thus greatly reducing the time invested in these analyses, in contrast to existing 2D video analysis techniques.

4.2 Kinect V2 Sensor

The Kinect sensor is a device developed by Microsoft. It contains a range of sensors,

which can be used together or individually, and each of them provides a different func-

tionality. There are two existing versions of Kinect sensor. The first one (Kinect V1),

was released in 2010 with two models, one for the Xbox game console and the other

(with improved features) oriented to its use for computer application development. The

second version of the sensor (Kinect V2) was released in 2014 with major improvements,

especially for the depth sensor. In this work, the Kinect V2 is used.

The device is shown in Figure 4.2. Depth cameras offer several advantages over

traditional intensity sensors, working in low light levels, giving a calibrated scale esti-

mate, being color and texture invariant, and resolving silhouette ambiguities in pose.

In addition, the grayscale image obtained provides visualization without the complex

computation and algorithms required to produce color [75]. In this work we only make

use of the depth sensor. For that reason, only this one will be described in detail, while only a brief overview of the remaining sensors will be given.

Figure 4.2: Kinect V2 Sensor [8].

In this section we will describe the operating principle of the Kinect’s depth sensor

(Section 4.2.1) and give a brief overview of its hardware specifications (Section 4.2.2).


4.2.1 Operation Principle

The Kinect’s depth sensor is an infrared CMOS camera [76], which works in combination

with three infrared (IR) emitters. The depth information is then provided for each pixel

by the Time-of-Flight (ToF) approach [77, 78, 79]. This technology is based on measuring

the time that light emitted by an illumination unit requires to travel to an object and

back to the sensor. First of all, the emitter sends an IR pattern, modulated by the

Continuous Wave (CW) Intensity Modulation approach. This results in a square wave

sent to the scene. Then, the light is reflected and travels back to the CMOS camera.

Figure 4.3: Time-of-Flight operation principle

Figure 4.3 depicts the operation principle of this technique. Each pixel of the CMOS

camera is divided into four parts, and each one has a photo-sensitive element (usually a single-photon avalanche diode, SPAD), which converts the incoming light into a current that fills a memory element (e.g. a capacitor). Each of the four memory elements (C1, C2, C3 and C4) has a switch that takes a sample of the same width as the current pulse, but each one phase-stepped by 90°. This results in four samples per pixel and per frame

(Kinect works at 30 fps). As the reflected light will have a delay which depends on the

distance traveled, each memory element will fill with a different charge depending on

the distance. Measuring the electrical charges accumulated during the samples (Q), we

can calculate the phase shift between the illumination and the reflection (ϕ), and the distance

(d):

Q_i = \int_0^{t_p} i(t)\,dt \qquad (4.1)


\varphi = \arctan\left(\frac{Q_3 - Q_4}{Q_1 - Q_2}\right) \qquad (4.2)

d = \frac{c}{4\pi f}\,\varphi \qquad (4.3)

where tp denotes the pulse width, c denotes the speed-of-light constant (3 ·108 m/s) and

f the frequency of the modulated IR pattern. These values allow us to also calculate

pixel intensity (A, amplitude of the reflected signal):

A = \frac{\sqrt{(Q_1 - Q_2)^2 + (Q_3 - Q_4)^2}}{2} \qquad (4.4)

This method yields a depth value and an intensity value for each pixel, which will be used to compose the IR image in the following sections. An interesting

feature of Kinect V2 sensor is that it has a built-in ambient light rejection, where each

pixel individually detects when its memory elements are saturated – this means that the

IR intensity is higher than a previously established limit – and resets this pixel in the

middle of an exposure. This allows us to use the sensor in any situation, regardless of

the light conditions.
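To make the relation between the four charge samples and the resulting depth and intensity values more tangible, the following minimal C# sketch evaluates Equations 4.2–4.4 for a single pixel. The charge values and modulation frequency are illustrative assumptions and the snippet is not part of the Kinect SDK; Math.Atan2 is used as a quadrant-safe arctangent.

using System;

// Illustrative evaluation of Eqs. 4.2-4.4 for one ToF pixel (not Kinect SDK code).
class TofPixelExample
{
    static void Main()
    {
        double q1 = 0.82, q2 = 0.34, q3 = 0.61, q4 = 0.27;  // sampled charges (arbitrary assumed units)
        double f = 16e6;                                     // assumed modulation frequency (Hz)
        const double c = 3e8;                                // speed of light (m/s)

        double phi = Math.Atan2(q3 - q4, q1 - q2);                              // Eq. 4.2
        double d = c / (4 * Math.PI * f) * phi;                                 // Eq. 4.3
        double a = Math.Sqrt(Math.Pow(q1 - q2, 2) + Math.Pow(q3 - q4, 2)) / 2;  // Eq. 4.4

        Console.WriteLine($"phase = {phi:F3} rad, distance = {d:F2} m, amplitude = {a:F3}");
    }
}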

4.2.2 Technical Specifications

The main technical specifications of the sensor are shown in Table 4.1. It is worth noting

that, although Kinect’s depth range extends to 8 m, measurements lose accuracy farther than 4.5 m [8]. That is not a problem for our project, since the SLL test requires the camera to be closer. To demarcate the space covered by the sensor during the SLL test, we calculate the frontal plane within the Kinect field of view (FOV) at 2 m – the distance from subject to sensor during the SLL test, see Figure 3.1 –. Taking into account that the sensor will be at the subject’s knee level (∼0.5 m), and the previously mentioned FOV, the frontal

plane area can be obtained:

\mathrm{Area} = \left(2\,\tan\frac{70^\circ}{2} \cdot 2\,\mathrm{m}\right) \times \left(\tan\frac{60^\circ}{2} \cdot 2\,\mathrm{m} + 0.5\,\mathrm{m}\right) = 2.8\,\mathrm{m} \times 1.65\,\mathrm{m} \qquad (4.5)


Specification                     Value
Sensor dimensions                 24.9 x 6.6 x 6.7 cm
RGB camera resolution             1920 x 1080 @ 30 Hz
Depth camera resolution           512 x 424 @ 30 Hz
Depth range                       0.5 – 8 m
Field of view                     70° (horiz.) x 60° (vert.)
Skeleton joints detected          26 joints
Number of skeletons detected      up to 6
USB standard                      3.0

Table 4.1: Kinect V2 sensor specifications

This area is more than enough for tracking the lower limb joints. Furthermore,

examining the resolution of the depth sensor (512 x 424 pixels), we find that each pixel covers 5.47 mm, so we have sub-centimetre sensitivity in marker tracking.
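As a quick cross-check of these numbers, the following C# sketch reproduces the frontal-plane area of Equation 4.5 and the resulting width covered by each depth pixel. The distances and field-of-view values are taken from the text above; the code itself is only an illustrative calculation.

using System;

// Illustrative cross-check of Eq. 4.5 and the per-pixel coverage quoted above.
class FovCoverageExample
{
    static void Main()
    {
        double distance = 2.0;        // subject-to-sensor distance during the SLL test (m)
        double sensorHeight = 0.5;    // sensor placed at the subject's knee level (m)
        double hFovDeg = 70.0, vFovDeg = 60.0;

        double width = 2 * Math.Tan(hFovDeg / 2 * Math.PI / 180) * distance;              // ~2.80 m
        double height = Math.Tan(vFovDeg / 2 * Math.PI / 180) * distance + sensorHeight;  // ~1.65 m
        double mmPerPixel = width / 512 * 1000;                                           // ~5.47 mm

        Console.WriteLine($"{width:F2} m x {height:F2} m, {mmPerPixel:F2} mm per pixel");
    }
}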

We also want to highlight the frame rate of the sensor. The normal frame rate is 30 fps, but it can drop to 15 fps in low-light conditions, which involves increasing the exposure time of the cameras in order to capture as much light as possible. Although at first sight this might seem useful, such a frame rate is excessively low and would cause too much test data to be missed.

4.3 Retro-reflective Markers

The Kinect sensor is well-known for being a markerless system. To achieve this, the

Kinect SDK provides a pose estimation algorithm, which estimates the position of the

body joints. For this task, Kinect’s algorithm uses single depth images obtained by the depth sensor and computes 3D coordinates of human body joints via trained and randomized decision tree forests [80]. This is a remarkable feature in the original field of application of the Kinect sensor: the gaming industry.

This algorithm has also been applied in the rehabilitation field (Section 2.3), with remarkable results. However, some studies [81, 82, 83] demonstrate that, although the results of Kinect sensor usage in qualitative studies have been promising,


its pose estimation algorithm is not accurate enough for quantitative and precise clinical

applications. Some approaches have been proposed in the literature to overcome this

issue and increase depth sensor precision [81, 84]. However, Kinect’s pose estimation does not capture precise anatomical differences between users. These differences, although not relevant in qualitative analysis, are critical in our application, as they can mean several degrees of deviation from the real FPPA values. Finally, another important fact to mention is that, when Kinect’s algorithm does not detect a joint, it infers its position. The problem is that, when many joints are not detected, the accuracy of the algorithm’s human body model decreases, and in our system only the lower limb joints will be inside the sensor FOV.

To cope with this situation, our approach is the use of retro-reflective markers placed on the anatomical points described in Section 3.1, which correspond to the real positions of the hip, knee and ankle joints. Those markers must be attached by a health expert, and are tracked by the depth sensor to obtain the real position of the joints. The markers used in this project [85], whose diameter is smaller than 9 mm, are shown in Figure 4.4.

Figure 4.4: Spherical retro-reflective markers used in the project

Although the use of markers might be seen as a step back from markerless technol-

ogy, they provide important improvements to our project. They not only give experts

certainty that the system is tracking exactly the desired points, but also allow for track-

ing points that are not joints (e.g. the breastbone) and are used in other biomechanical

measurements. Nevertheless, as will be explained in the next section, the pose estimation algorithm is still used to avoid tracking any reflective element in the scene that is not a marker.

Finally, we have studied the agreement between the measurements of the system with and without markers. With this aim, three single leg landings have been performed and


measured with both methods at the same time; the data obtained in both situations are represented in Figure 4.5.

Figure 4.5: SLL data measured with markers (blue line) and without markers (red line) during the same experiment

As can be seen, the pose estimation algorithm can detect significant changes in the FPPA angle, but the data obtained with it do not even come close to those obtained with the markers. To quantify the agreement between the measures, we first calculated the root-mean-square error (RMSE) of each of the three trials performed:

\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(\alpha_i - \hat{\alpha}_i\right)^2} \qquad (4.6)

where n denotes the number of samples, \alpha_i the FPPA value measured with markers and \hat{\alpha}_i the value measured without markers. The RMSE values obtained for each trial are, respectively, 9.7226°, 7.8922° and 7.8808°. Taking into account the range of the data values, such high RMSE values are unacceptable for this application, since it requires higher precision. In order to ensure the reliability of the results, the images of the marker-based method have been manually analyzed and contrasted.
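For reference, the RMSE of Equation 4.6 can be computed with a few lines of C#. The angle arrays below are placeholder values, not the actual trial data, and the helper name is ours.

using System;

// Sketch of the RMSE computation of Eq. 4.6 between two FPPA series (placeholder data).
class RmseExample
{
    static double Rmse(double[] withMarkers, double[] withoutMarkers)
    {
        double sum = 0;
        for (int i = 0; i < withMarkers.Length; i++)
        {
            double diff = withMarkers[i] - withoutMarkers[i];
            sum += diff * diff;
        }
        return Math.Sqrt(sum / withMarkers.Length);
    }

    static void Main()
    {
        double[] marker = { 178.4, 175.1, 169.8, 165.2, 171.0 };      // with markers (example values)
        double[] markerless = { 177.0, 171.3, 160.5, 158.9, 168.2 };  // without markers (example values)
        Console.WriteLine($"RMSE = {Rmse(marker, markerless):F2} deg");
    }
}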

For a better comparison, we have also used the Bland-Altman plot [86]. This method analyses the agreement between two measurement methods by plotting their differences against the mean values of the measurements. The result is shown in Figure 4.6.


Figure 4.6: Bland-Altman plot of the measurements with and without markers. The three dotted lines indicate the mean (x̄) and the Bland-Altman “limits of agreement” (x̄ ± 1.96σ_X)

We can see that, when changes in FPPA are small (near a 0 FPPA value), both methods produce similar measurements (difference ≈ 0). However, when the average of the measurements grows (higher FPPA values), the difference between the measurements increases considerably, reaching differences higher than 10°. This means that the markerless algorithm can detect the changes in the FPPA angle, but its sensitivity is lower than that obtained using the markers (the algorithm detects the angle variations, but with very low accuracy). Furthermore,

the value extracted as the result of the test is the maximum FPPA angle, which is precisely where the markerless method has the worst accuracy. This makes the pose estimation algorithm

not accurate enough for our application.
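For completeness, the limits of agreement drawn in Figure 4.6 follow from the mean and standard deviation of the paired differences; the following generic sketch shows that computation (mean difference ± 1.96σ) on placeholder data and is not the exact script used to generate the figure.

using System;
using System.Linq;

// Generic Bland-Altman bias and limits-of-agreement computation on placeholder data.
class BlandAltmanExample
{
    static void Main()
    {
        double[] a = { 178.4, 175.1, 169.8, 165.2, 171.0 };   // with markers (example values)
        double[] b = { 177.0, 171.3, 160.5, 158.9, 168.2 };   // without markers (example values)

        double[] diff = a.Zip(b, (x, y) => x - y).ToArray();
        double bias = diff.Average();   // mean difference
        double sd = Math.Sqrt(diff.Select(d => (d - bias) * (d - bias)).Sum() / (diff.Length - 1));

        Console.WriteLine($"bias = {bias:F2} deg, limits = [{bias - 1.96 * sd:F2}, {bias + 1.96 * sd:F2}] deg");
    }
}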

Reflective markers have no restriction on shape or size. In the next section, a tech-

nique to make the marker position independent of its shape and size will be presented.

However, it is recommended not to use high-volume markers that project out from the subject’s body, because during joint rotations the measured position could deviate from the real joint position.


4.4 Automatic FPPA Measurement

In this section we describe the main procedure of the automatic FPPA measurement

and explain the algorithms designed and implemented with this aim. The first step is to

track the markers, and then the FPPA is computed based on the markers’ coordinates.

4.4.1 Marker Tracking

This is the first task to address for the FPPA measurement. Figure 4.7 shows the flow

diagram for the algorithm developed.

Figure 4.7: FPPA measurement algorithm

The Kinect sensor has a sampling rate of 30 Hz, so each second it captures 30 video frames. Each frame sent to the application is composed of a 512 × 424 array (according to the sensor’s resolution), which contains a 16-bit IR intensity value for each pixel. The procedure of extracting the IR values from the InfraredFrame object and constructing the image is explained in Section 5.2.2. For that reason, in this section we are going to start from the previously mentioned array with its values already normalized to a scale from


0 to 1. The procedure explained below is applied to each data frame.

The first step is to detect reflective elements in the scene. Once the IR frame has been received and normalized, its values are high-pass filtered, so that those values which exceed a sufficiently high intensity threshold are considered a reflective element – these pixels are candidates for being part of a marker. The cutoff value (intensity threshold) has been chosen based on a prior analysis; however, as the markers’ pixels have an IR intensity value much higher than the rest of the image’s pixels, the threshold selection does not require high accuracy. In this work, an intensity threshold of 0.45 has been used (on the 0 to 1 scale).

Once reflective elements have been detected and their corresponding pixels filtered,

it is time to extract their 2D coordinates. The data array transmitted by the sensor stores the pixel data from the top left corner of the sensor’s field of view to its bottom right corner. Therefore, we set the origin of our reference system at the top left corner of the image (see Figure 4.9). To extract the coordinates, we need to identify which pixel row and column correspond to the evaluated pixel, so we make use of the pixel’s index into the frame array, which ranges between 0 (top left corner pixel) and 217,087 (bottom right corner pixel). The latter value follows from the image’s resolution

(512×424 pixels). The pixel’s coordinates are computed as follows:

(x, y) = \left(i \bmod 512,\ \left\lceil \frac{i}{512} \right\rceil\right) \qquad (4.7)

where i denotes the pixel’s index and 512 the width of the data array (horizontal reso-

lution of the sensor).

For the calculation of the horizontal coordinate (x), we need to get the column of the pixel within the array. As the number of columns is 512, we apply the modulo operation to get the remainder after the division of the pixel’s index (i) by 512. Likewise, we need to get the row of the pixel to calculate the vertical coordinate (y); therefore, we apply the ceiling operation to get the smallest integer greater than or equal to i/512 (the pixel’s row).
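The index-to-coordinate mapping of Equation 4.7 can be restated directly in code; the helper name below is ours, and the actual application performs the same two operations in Section 5.2.3.

using System;

// Restatement of Eq. 4.7: flat frame-array index -> (x, y) pixel coordinates of a 512 x 424 frame.
static class PixelIndexExample
{
    static void ToCoordinates(int index, out double x, out double y)
    {
        x = index % 512;                   // column (modulo operation)
        y = Math.Ceiling(index / 512.0);   // row (ceiling operation, as in Eq. 4.7)
    }

    static void Main()
    {
        double x, y;
        ToCoordinates(1030, out x, out y);
        Console.WriteLine($"x = {x}, y = {y}");   // 1030 mod 512 = 6, ceil(1030/512) = 3
    }
}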

At this point, we have a selection of the image’s pixels that belong to a reflective

element. Thus, each reflective element is formed of a group of selected pixels. The next

step is to eliminate those groups of pixels that do not correspond to a marker and, for


the rest, identify which pixel corresponds to each marker, and here is where the Kinect’s

pose estimation algorithm is applied (see Section 4.3). The estimated hip, knee and

ankle 3D coordinates are obtained from Kinect’s body frame and mapped from a camera space point (3D) to a depth space point (2D). However, as the mapping will be thoroughly

explained in Section 5.2.3, we are going to continue the explanation with the estimated

joint coordinates already obtained.

This step consists in setting a neighborhood of pixels around each estimated joint coordinate. If a previously selected pixel is inside a neighborhood, it belongs to the marker of that joint. If the pixel is not inside any neighborhood, it does not correspond to a marker and is not processed. By applying this process, we obtain the group of pixels related to each marker and ignore any reflective element in the scene that is not part of a marker, thus reducing the algorithm workload and improving its robustness. Figure 4.8 depicts this method.

After classifying the reflective pixels, a two-dimensional point cloud is stored for each

marker (group of pixels). Then, we need to obtain the marker’s coordinates. In or-

der to increase accuracy and make it independent of the subject’s distance to the sensor

and the marker’s size, the midpoint of each pixel group is computed:

(x, y)_{marker} = \frac{1}{n}\sum_{i=1}^{n}(x_i, y_i) \qquad (4.8)

where n denotes the number of pixels in the marker’s pixel cloud.

Figure 4.8: Classification of markers: (a) reflective element in the scene which is not part of a marker, not tracked, (b) knee marker pixel, (c) ankle marker pixel


4.4.2 FPPA Computing

Once the markers’ coordinates have been obtained, those three tracked points give us enough information to calculate the FPPA. As it is a projected angle, it can be calculated

based on the 2D coordinates of the markers. With this aim, we create two vectors from

the aforementioned points:

\vec{v}_1 = (x_{hip} - x_{knee},\ y_{hip} - y_{knee}) \qquad (4.9)

\vec{v}_2 = (x_{ankle} - x_{knee},\ y_{ankle} - y_{knee}) \qquad (4.10)

\vec{v}_1 goes from the knee to the hip and \vec{v}_2 goes from the knee to the ankle. As the FPPA is a relative angle and depends only on those vectors, this measurement is robust against accidental camera rotations. Finally, the FPPA is simply computed from the angle between the two vectors:

\mathrm{FPPA} = 180^\circ - \arccos\frac{\vec{v}_1 \cdot \vec{v}_2}{|\vec{v}_1|\,|\vec{v}_2|} \qquad (4.11)

Figure 4.9 displays the positions of the two vectors and the reference system of the

algorithm.

Figure 4.9: Representation of the two vectors used to compute FPPA and the referencesystem for the algorithm
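Putting Equations 4.9–4.11 together, the FPPA can be obtained from the three 2D marker positions with a plain dot-product computation, as in the sketch below. Variable names and the sample coordinates are illustrative; the application itself relies on the Vector.AngleBetween helper shown in Section 5.2.3.

using System;

// Eqs. 4.9-4.11: FPPA from the 2D hip, knee and ankle marker coordinates (illustrative).
static class FppaExample
{
    static double Fppa(double hipX, double hipY, double kneeX, double kneeY,
                       double ankleX, double ankleY)
    {
        double v1x = hipX - kneeX, v1y = hipY - kneeY;        // knee -> hip (Eq. 4.9)
        double v2x = ankleX - kneeX, v2y = ankleY - kneeY;    // knee -> ankle (Eq. 4.10)

        double dot = v1x * v2x + v1y * v2y;
        double norms = Math.Sqrt(v1x * v1x + v1y * v1y) * Math.Sqrt(v2x * v2x + v2y * v2y);
        return 180.0 - Math.Acos(dot / norms) * 180.0 / Math.PI;   // Eq. 4.11, in degrees
    }

    static void Main()
    {
        // Placeholder marker positions in depth-image pixel coordinates.
        Console.WriteLine(Fppa(250, 100, 245, 200, 260, 300).ToString("F1"));
    }
}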


5 Computer Application

This chapter presents a detailed description of the functionality of the application designed to help specialists manage the sensor data and assess the different tests proposed (Section 5.1). Then, it describes how the application and the algorithm have been implemented (Section 5.2). Finally, it presents a proposal for data post-processing in Section 5.3 and gives details of the system robustness in Section 5.4.

5.1 Application Description

In recent years, the use of wearable devices and mobile software in healthcare disciplines has become more common due to constant technological improvement. This is known as mHealth [87]. It could have been an interesting platform on which to develop our system. However, a desktop computer application was finally designed for the following reasons:

• The Kinect sensor is designed to be connected to a computer (it communicates via USB), so a mobile app would need an intermediary device, which would make the system more complicated and work against the ease-of-use objective.

• Kinect SDK provides useful libraries and tools for programming Windows Presen-

tation Foundation (WPF) applications, which are computer-based.

• Since the visualization of the image during the test performance is a key point of the project, it is better to have a larger screen, which is not possible on many mobile devices.


In this section, the computer application of the system is thoroughly described. Fig-

ure 5.1 shows a diagram of the interconnection between the windows of the application,

the dialog windows used to introduce or extract data, and the database connections. In

the following, all the windows and features of the application are described step by step.

Figure 5.1: Application structure

5.1.1 Start Window

The screen shown in Figure 5.2 is the first one displayed after the application starts. It is used to navigate through the application windows and features. First of all, when the application is executed, it looks for the local database file and the profiles directory. If they do not exist, it creates the database file with two tables (tests and users) and creates

the personal directory (Profiles) within the application folder. The users table stores

personal information about the patients registered in the database. The tests table

stores the results of each test. Both tables are related by a personal ID assigned to each

patient. A further description of the database will be provided in Section 5.2.5.

This window has six buttons to access the performance windows of six tests. However, as the implementation of each test is quite similar, only the Single Leg Landing test has been included in this project. The View Database control opens the database editor, and

the Historic Trends control allows the visualization of the stored results.


Figure 5.2: Application Start Window

5.1.2 Database Window

On first use, the first option to select in the main screen is the View Database button. The user is then directed to a new screen (Figure 5.3a), which displays the users table of the database. Its main feature is a table (made with the DataGrid WPF control and the DataTable class), where the user can visualize all the patients stored in the database and their personal information. The data included in this table are the columns displayed in Figure 5.3a, and the patients are sorted by an unambiguous personal ID, which is automatically assigned when the patient is added to the database and will never change, regardless of any edits to the patient’s information.

To add a new patient, the expert has to click on the New Profile button. This opens the Dialog Window shown in Figure 5.3b, which has text and numeric fields to introduce the patient’s information, as well as two RadioButtons to select the patient’s sex. Pressing the OK button will add the patient to the database and assign its personal ID.


Figure 5.3: Database Window: (a) database editor with some example users included, (b) dialog window for new patient registration

Once a new profile is created, a new directory with the patient’s name is created inside the Profiles directory. This personal directory includes two folders: Data

and Screenshots. The first one stores the .csv files with the data extracted from the tests, and the second one stores the screenshots automatically taken for each test when the maximum FPPA value is reached. As will be explained in Section 5.4, the dialog window includes security and robustness checks to ensure that the expert can neither introduce a new patient with a name that already exists in the database nor introduce text in numeric fields or non-alphanumeric characters, to avoid problems when using this information.

If the expert wants to edit a patient’s information or remove a user from the database, the first step is to manually select the user from the table. Only one user can be selected at a time. When the Edit Profile button is selected, the dialog of Figure 5.3b is opened, but this time each field is filled with the selected user’s information. Pressing the OK button will update the database with the new information and rename the personal folder (if the name changed). In order to delete a user, clicking the Delete Profile button will remove the patient from the database together with their personal directory (and all its information). When this option is selected, a message box shows up asking for confirmation, to avoid involuntary information loss.


5.1.3 Main Window

Once the users have been added to the database, the expert can start the test performance by selecting the desired test in the Start Window. In this project only the SLL test has been implemented, so its screen is shown in Figure 5.4.

Figure 5.4: Main Window of the Single Leg Landing test

The first element to notice is the status bar at the bottom of the window. It provides information at all times about the sensor’s connection status and the progress of the test. The first task is the sensor connection. If the sensor is not connected, the image turns black and the status bar displays ”Sensor not detected”. Once the sensor is plugged in, the image displays the sensor’s IR image and the status bar displays ”Sensor working correctly”. The whole control panel is disabled until the sensor is working.

The next step is the profile selection. The control panel buttons will not work until a patient is selected. To perform the selection, the expert has to open the Combo box at the top of the control panel, which displays a list with the names of the users registered in the database. Once the name is selected, the next step is the leg selection – this test is a unilateral landing, so the FPPA is measured on only one leg – using the


Radio buttons at the bottom of the control panel.

With the patient and leg selected, the expert can now start the test. Before explaining the test execution, we first describe the angle image visualized. For a better FPPA visualization, a see-through Canvas control has been positioned over the IR image, with the same resolution. Onto this canvas, some shapes are placed representing the markers, the vectors and the computed angle (see Figure 5.4). Three red ellipses have been placed over the markers; two yellow lines have been placed joining the markers (the lowest one is longer in order to display the angle shape correctly); and the angle is represented by a green circle sector placed between the upper line and the extension of the lower one. As the shapes are attached to the marker coordinates, their position is dynamically updated as the subject moves. If the expert changes the leg selection, the shapes are displayed on the other leg. Finally, if any marker is not tracked, the shapes related to it are not displayed, making it possible to check whether the markers are correctly tracked.

5.1.3.1 Test Execution

To perform a test, the expert just needs to click on the Start button, and the system will start registering data and showing the real-time FPPA value at the bottom of the control panel. The expert has full control over the duration of the test, as it can be stopped at any time by pressing the Stop button. When it is clicked, some information is displayed on the screen. First of all, the trial number is increased, so the text above the buttons changes to the current trial, as does the text of the status bar. Then, the real-time FPPA value disappears (it is only shown during the test execution), and the maximum FPPA angle reached (the test result) is shown to the right of the finished trial. It is accompanied by some extra information:

• Trial status icon. This icon has three possible options, shown in Figure 5.5:

Figure 5.5: Trial icons: Green tick (correct), Yellow alert (many frames missing) and Red cross (incorrect).


– Green tick. This means that the test has been correctly executed.

– Yellow alert. Each frame in which any marker is not detected has no FPPA information, since one marker’s coordinates are missing. In this situation, the data is ignored. The system keeps a counter of both the number of frames received and the number of frames ignored; if the number of ignored frames is higher than 20% of the total number of frames received, a message box shows up alerting the user of the number of frames ignored, and the alert icon is placed, meaning that this trial has limited validity (a minimal sketch of this check is shown after this list).

– Red cross. This icon appears in case the final FPPA angle value is 0, which means that the trial has been incorrectly performed or registered.

• Knee orientation. It is necessary to determine the orientation of the knee (varus or valgus) in addition to the absolute value of its angle. The system obtains this information by evaluating the orientation of the two FPPA vectors. This information is included in the database and shown to the right of the trial icon, except when the trial has been incorrect (red cross).
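The frame-validity check mentioned for the yellow alert can be summarized as follows; the counter names and structure are an illustrative reconstruction of the behaviour described above, not the literal application code.

// Illustrative reconstruction of the trial-validity check: the trial gets the yellow
// alert icon when more than 20% of the received frames lack a tracked marker.
class TrialValidityExample
{
    private int framesReceived;
    private int framesIgnored;

    public void OnFrame(bool allMarkersTracked)
    {
        framesReceived++;
        if (!allMarkersTracked)
            framesIgnored++;    // frame carries no FPPA value and is skipped
    }

    public bool HasLimitedValidity()
    {
        return framesReceived > 0 && framesIgnored > 0.2 * framesReceived;
    }
}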

There are three possible trials to perform in each test, and the result will be the highest value among them. However, a new test can be started at any moment by clicking on the New Test button, provided that the test execution is stopped; so although the maximum number of trials is three, the expert can perform fewer. The same applies to the results visualization (View Results button): the results can be analyzed after each trial.

5.1.3.2 Results Visualization

Once the test is finished, the results can be analyzed by pressing the View Results button, which opens the Results Dialog, shown in Figure 5.6.

This window includes three controls. The first one is a Combo box that allows the selection of the trial data to visualize (only trials already performed can be selected). The second one is a graph which displays the FPPA values reached during the whole trial against the time stamp. The graph changes when another trial is selected, and it is interactive, as the user can zoom it in and out and move through it, with the


chance to fit it into the window by clicking the right mouse button.

Figure 5.6: Results Dialog Window shown when the test is finished

The last one is the Save Results button. It is the final step of the test, and it triggers the storage of the following information:

• Trial data. The trial data is stored in a memory buffer during the test execution. When this button is clicked, the data is saved to a comma-separated values (.csv) file, which can be opened with a spreadsheet program and imported into other software for offline post-processing of the data (e.g. MATLAB, Octave, LabVIEW or R). This file is stored in the Data folder of the personal directory.

• Screenshot. The system fills a secondary bitmap each time the maximum FPPA is reached. When this button is clicked, this bitmap is saved to the Screenshots folder of the personal directory and encoded as a Portable Network Graphics (.png) file. This allows the expert to keep a screenshot of the image at the maximum knee abduction point for further analysis.


• Test results. Finally, the result information is stored in the tests table of the database. The information stored is the following: the patient’s ID, test name, date of

the performance, maximum FPPA angle reached, leg selection and knee orientation

(varus or valgus).

5.1.4 Trend Window

This is the last feature of the application, and it can be accessed through the Start

Window. In this window, the expert can inspect the patient’s historical data. In order

to visualize any result, the database has to have information stored. The window opened

is shown in Figure 5.7.

Figure 5.7: Historical data analysis window. Patient 1 and the Single Leg Landing test are selected.

This window contains a table which displays the data stored in the tests table of the database. It contains all the information about all patients and all tests, and is sorted by ID, test and date. Above it, there are two Combo boxes which contain pull-down lists with the test names and patient names, respectively. Any combination of user and test


can be selected from the lists, including the option ”All”, and the information shown in

the table will update according to the selection. This allows the expert to visualize the

results for every user, a single user, a single test or a single user and test.

When the last option is selected (a single user and a single test), the historical information of this user and test is displayed in the chart, for each leg. This gives the expert the chance to graphically inspect the evolution of the user, and lays the foundation for a future decision-making assessment system (Section 7.2).

The last button of this window is Save to EXCEL file. This button opens a Save

File Dialog that allows the expert to export the data shown in the table to an EXCEL

file, for further analysis.

5.2 Application Implementation

In this section we explain how the previously described application and the FPPA measurement algorithm have been implemented, and describe the resources used and the main characteristics of the programming code. The code generated is too long to explain in full in this text; therefore, only its main parts will be described. The whole code is included in the extra documentation attached to this thesis.

The application has been implemented with the Microsoft .NET Framework 4.5.2, which provides hardware platform independence (programs written for the .NET Framework execute in a managed software environment rather than directly on the hardware). The Graphical User Interface (GUI) has been designed with the Windows Presentation Foundation (WPF) framework, which allows us to create an application with a wide range of pre-defined GUI elements, called controls.

For the implementation, various programming languages have been studied and employed; they are described here:

• C# [88]. Object-oriented language developed by Microsoft within its .NET frame-

work. It has been used for the implementation of the algorithms, since Kinect

APIs are widely used and documented for this language. Microsoft provides a

great range of libraries to work with C# within Windows Presentation Founda-


tion (WPF) applications.

• XAML [89]. Declarative markup language based on XML and developed by Mi-

crosoft. It is widely used within WPF applications, and in our project it has been

employed to design the User Interface (UI) of the application.

• SQLite [90]. Relational database management language, based on SQL. It has

been used with a library for C# to perform the queries needed to manage the local

database of the system.

One important task of this project has been learning these languages, whose learning curves have been notably steep. For programming and compil-

ing the code, the Integrated Development Environment (IDE) used is Microsoft Visual

Studio 2015 [91]. The structure followed for programming is described in the following

paragraph.

Each window of the application is created from a XAML code file, which defines

the GUI through the previously mentioned controls. Some examples of controls used in

this project are Grids, StackPanels, WrapPanels, ComboBoxes, TextBlocks, etc. An

example of the use of XAML code to create the combo box of the Results Window (see

Figure 5.6) is the following:

<StackPanel Margin="275.5,0,0,0">
    <TextBlock FontWeight="Bold" FontSize="14">
        Select Trial to plot data
    </TextBlock>
    <ComboBox Name="cmbGraphic" Margin="0,10"
              SelectionChanged="ComboBox_SelectionChanged">
        <ComboBoxItem x:Name="trial1Item">Trial 1 Data Plot</ComboBoxItem>
        <ComboBoxItem x:Name="trial2Item">Trial 2 Data Plot</ComboBoxItem>
        <ComboBoxItem x:Name="trial3Item">Trial 3 Data Plot</ComboBoxItem>
    </ComboBox>
</StackPanel>

Code 5.1: XAML Example code of a ComboBox creation

The XAML code file of each window only defines the GUI, so it needs a C# code file which manages the operation of the controls (e.g. what happens when a button is clicked). This is called the code-behind. The algorithms are also implemented in these files. The result is one XAML code file and one C# code-behind file per window. As the XAML code structure is always similar to the previous example, it is not explained further in this text.

Finally, the SQLite code is used within the C# files and managed by the SQLite library, which allows us to insert SQLite queries directly into the C# code. In the following subsections, we explain how the main features of the application have been implemented. It is worth noting that, although C# is an object-oriented language, for this application we have combined object-oriented and event-driven programming, as will be explained below.

5.2.1 Kinect APIs

The first step of the implementation is the connection and initialization of the sensor.

The Kinect for Windows SDK 2.0 provides three different API sets [92] that can be used to create Kinect-based applications. The Windows Runtime APIs support the de-

velopment of Windows Store applications; the .NET APIs support the development of

WPF applications and the native APIs support applications that require native code.

In this project, as the application designed is WPF-based, the .NET APIs have been

used. Figure 5.8 displays the high-level architecture of the Kinect V2 sensor.

Once the API has been installed in our Visual Studio project as a resource, we can

perform the initialization of the sensor. This process has the following components:

1. Sensor. This is the first level in the initialization. We need to declare an object of

the KinectSensor class, which represents a single physical sensor. It is necessary to declare one object per sensor used. Our sensor has been named kinect.

2. Source. Each sensor has different data sources (RGB camera, depth sensor, mi-

crophone...). Those objects expose metadata about the source. When different

sources are used (as in our case), it is not necessary to declare the source object.

3. Reader. At least one reader needs to be declared inside each source, since it gives

access to data frames. When a reader is opened and receives a frame, it holds it


until it is disposed. That gives us unlimited access to the frame but, on the other hand, we cannot acquire more frames until the current one is disposed. We have to be careful and dispose of the frames according to the frame rate of the sensor (30 Hz). Multiple readers can be created on a single source, and they can be paused when we do not need them. As we use multiple sources, we have declared an object (msFrameReader) of the MultiSourceFrameReader class.

Figure 5.8: High-level architecture of the Kinect Sensor

4. Frame. The frame reception of each reader is controlled by an event of the reader.

Each frame gives access to its data and metadata through the FrameDescription class.

The process of the initialization is the following: the objects kinect, msFrameReader

and irFrameDescription are created. Then, each time we open the Main Window they

are set-up with the following methods and event:

1. KinectSensor.GetDefault(). Acquires the sensor.

2. kinect.Open(). Opens the connection with the sensor. If the window is closed, it is necessary to close the connection with the kinect.Close() method.

3. kinect.OpenMultiSourceFrameReader(). Selects the frames that will be used by


the msFrameReader. In this case they are the Infrared (for IR data acquisition) and Body (for the pose estimation algorithm) frames.

4. kinect.InfraredFrameSource.FrameDescription. Gets the metadata of the

Infrared frame, in order to set up the bitmap on which the image will be encoded.

5. msFrameReader event (MultiSourceFrameArrived). It is an event for the arrival of a frame. It triggers every time a frame is sent by the sensor, so all the algorithms for data processing

are included in this event.

using Microsoft.Kinect;

public partial class MainWindow : Window

{

KinectSensor kinect;

MultiSourceFrameReader msFrameReader;

FrameDescription irFrameDescription;

public MainWindow ()

{

kinect = KinectSensor.GetDefault ();

kinect.Open ();

msFrameReader =

kinect.OpenMultiSourceFrameReader(

FrameSourceTypes.Infrared | FrameSourceTypes.Body

);

irFrameDescription =

kinect.InfraredFrameSource.FrameDescription;

msFrameReader.MultiSourceFrameArrived +=

msFrameReader_MultiSourceFrameArrived;

}

Code 5.2: Initialization of the Kinect sensor and acquisition of the frame

For the acquisition of the data, we have to set up the MultiSourceFrameArrived

event. Within the event, we declare the MultiSourceFrame object, and through it we

acquire the InfraredFrame and the BodyFrame:

msFrame.InfraredFrameReference.AcquireFrame()

msFrame.BodyFrameReference.AcquireFrame()


In order to ensure the correct disposal of the frames, we acquire them within a using block. Once the frames have been acquired, we can implement the algorithms.

5.2.2 Image Construction

To construct the video image shown in the application, we have to create a bitmap with the same characteristics as the received frame. The bitmap is updated every time a frame arrives, since it is the image shown in the application. To extract the characteristics of the frame, we use the metadata stored in the FrameDescription object:

bitmap = new WriteableBitmap(irFrameDescription.Width,

irFrameDescription.Height, 96.0, 96.0, PixelFormats.Gray32Float, null)

we have defined the width, height, dots per inch and pixel format of the bitmap. To create the image, we have to extract the intensity value of each pixel from the frame, scale it and write it to the bitmap. However, to skip the intermediate storage of the data and the explicit writing operation on the bitmap, we are going to use the underlying buffers of the frame and the bitmap. First, we access the underlying buffer used by the system to store the frame data:

KinectBuffer irBuffer = irFrame.LockImageBuffer()

and use our own method ProcessInfraredFrameData(). This method performs the construction of the image. We create two pointers, one pointing to the IR frame data (frameData) and the other pointing to the bitmap back buffer (backBuffer) position in memory. Once the pointers have been created and the bitmap has been locked, we extract the intensity values of the IR frame one by one and normalize them to a scale from 0 to 1 (for its utility, see Section 4.4). Finally, the normalized values are copied to the bitmap back buffer. As this accesses native memory directly, it requires unsafe compilation. The bitmap is bound to an Image control in the XAML code, so each time it changes, the image displayed in the application is updated.


private const float irSourceValueMax = ushort.MaxValue;

private const float irSourceScale = 0.75f;

private const float irOutputValueMin = 0.01f;

private const float irOutputValueMax = 1.0f;

private unsafe void ProcessInfraredFrameData(IntPtr irFrameData ,

uint irFrameDataSize)

{

ushort* frameData = (ushort*) irFrameData;

bitmap.Lock ();

float* backBuffer = (float *) bitmap.BackBuffer;

for (int i=0; i<(int)( irFrameDataSize /

irFrameDescription.BytesPerPixel ); ++i)

{

backBuffer[i] = Math.Min(irOutputValueMax ,

(( frameData[i] / irSourceValueMax * irSourceScale)

* (1.0f - irOutputValueMin )) + irOutputValueMin );

}

bitmap.AddDirtyRect(new Int32Rect(0, 0,

bitmap.PixelWidth , bitmap.PixelHeight ));

bitmap.Unlock ();

}

Code 5.3: Image construction from the IR frame

5.2.3 Algorithm Implementation

In this section we show how the algorithm designed in Section 4.4 is implemented. The

first step is the intensity filtering. It is implemented by an

if (backBuffer[i] > 0.45f)

condition inside the ProcessInfraredFrameData method. The next step is the extraction of the coordinates of the filtered pixels. It is implemented in the same method:

double MarkerYpos = Math.Ceiling(i / 512.0d);

double MarkerXpos = i % 512.0d;

using the ceiling and modulo operators. Then, we have to classify the pixels among the

joints using the pose estimation algorithm, so we have to extract the estimated 3D joint


positions. First of all, we have to declare a Joint class object, which is extracted from

the BodyFrame. Then we map the 2D position of the joint from its 3D position, and

extract its coordinates. Here we show an example for the hip joint:

if (body.IsTracked)
{
    Joint hipJoint = isRightLeg ? body.Joints[JointType.HipRight]
                                : body.Joints[JointType.HipLeft];

    if (hipJoint.TrackingState == TrackingState.Tracked)
    {
        DepthSpacePoint hipDsp =
            kinect.CoordinateMapper.MapCameraPointToDepthSpace(hipJoint.Position);
        hipXpos = hipDsp.X;
        hipYpos = hipDsp.Y;
    }
}

Code 5.4: Acquisition of the estimated joint position (3D) and mapping to a DepthSpacePoint (2D)

The coordinate mapping is necessary since the coordinate systems for the Body Frame

and the Infrared Frame are different. Table 5.1 shows the differences between the three coordinate systems of the Kinect sensor. The CoordinateMapper class provides simple

methods to map between them.

Name              Frames    Dimensions  Units   Range        Origin
ColorSpacePoint   Color     2           pixels  1920 x 1080  Top left corner
DepthSpacePoint   Infrared  2           pixels  512 x 424    Top left corner
CameraSpacePoint  Body      3           meters  -            Depth camera

Table 5.1: Coordinate systems of Kinect sensor

Once the data has been mapped and the estimated joint coordinates extracted, we

proceed to the classification by the

if ((Math.Abs(MarkerXpos - hipXpos) < 30) &&
    (Math.Abs(MarkerYpos - hipYpos) < 30))

condition, which defines a square neighborhood in which pixels are considered part of


a marker (example for the hip joint). With the marker pixels of one joint selected, we

can calculate the midpoint of those pixels (considered as the marker position) with the

GetMarkerCoordinates method:

private double GetMarkerCoordinates(List<double > _list)

{

double [] _buffer = _list.ToArray ();

double _average = new double ();

for (int i=0; i<_buffer.Length; i++) {_average += _buffer[i];}

double _markerPos = _average / _buffer.Length;

return _markerPos;

}

Code 5.5: Computing of the marker’s coordinates

The pixels are stored in a List object because we do not know in advance how many of them will be stored, so we cannot fix the size of an array prior to the data storage.

Then, the list is converted to an array and its average value is computed. This method

is applied to each marker and each coordinate (six times). Finally, when the coordinates

are acquired, the FPPA value can be obtained:

private void AngleComputing ()

{

upperVector = new Vector(hipMarkerXpos - kneeMarkerXpos ,

hipMarkerYpos - kneeMarkerYpos );

lowerVector = new Vector(ankleMarkerXpos - kneeMarkerXpos ,

ankleMarkerYpos - kneeMarkerYpos );

if (testRunning)

{

FPPAangle = 180 -

Vector.AngleBetween(upperVector , lowerVector );

bool kneeValgus = isRightLeg ? (FPPAangle < 180 ?

true : false) : (FPPAangle > 180 ? true : false);

if (FPPAangle > 180) FPPAangle = 360 - FPPAangle;

if (FPPAangle > FPPAangleMax && FPPAangle != 180)

{

FPPAangleMax = FPPAangle;

FPPAangle = Math.Round(FPPAangle , 1);

FPPAangleMax = Math.Round(FPPAangleMax , 1);

}

}

}

Code 5.6: Computing of the FPPA angle value


The vectors created are those shown in Figure 4.9. The FPPA angle is computed

based on the angle between them, and finally some corrections and rounding are applied.

5.2.4 Graph Visualization

For the data visualization in the Results Window as well as the historical representation

of the patient’s data, we have used the open-source library Dynamic Data Display [93],

which provides interactive visualization of the data.

The first step is the creation of a ChartPlotter class object in the XAML file; this

will be our graph object. Then, each trial’s data is stored in a float array (data), even if it has already been saved to the .csv file. For the visualization, we have to create another array which contains the time scale of the trial (timeScale). As we know the time span between samples (30 Hz sampling rate), we can create a time array of the same length as the data array.

The methods in this library only work with IEnumerable class objects. For that

reason, first of all we need to transform our arrays into EnumerableDataSource<> objects. The two objects created are then combined into a CompositeDataSource object. Those

objects will be used to create the two graph objects, of the classes CirclePointMarker

(data markers of the graph) and LineGraph (line connecting the markers of the graph).

Finally the graph is created with the AddLineGraph() method and fitted to the view

with the ViewPort.FitToView() method.

The process for the creation of the historical data chart is the same. The only

difference is the data source. For the results graph, the source is the array stored in

memory with the trial data (online). For the historical graph, the source is the data

stored in the database (offline). The graph creation method is the following:


using Microsoft.Research.DynamicDataDisplay;

using Microsoft.Research.DynamicDataDisplay.DataSources;

using Microsoft.Research.DynamicDataDisplay.PointMarkers;

private void PlotDataChart(float[] data)

{

float timeSpan = (1f / 30f);

float [] timeScale = new float[data.Length ];

for (int i=0; i<data.Length; i++) {

timeScale[i] = timeSpan *(i); }

EnumerableDataSource<float > xData =

new EnumerableDataSource<float >( timeScale );

EnumerableDataSource<float > yData =

new EnumerableDataSource<float >(data);

xData.SetXMapping(X => X);

yData.SetYMapping(Y => Y);

CompositeDataSource dataComp =

new CompositeDataSource(xData , yData);

CirclePointMarker marker = new CirclePointMarker {

Size = 5, Fill = Brushes.Red };

graphChart.AddLineGraph(dataComp , new Pen(Brushes.Blue , 1d)

,marker , new PenDescription("Angle value during test"));

graphChart.Viewport.FitToView ();

}

Code 5.7: Construction of a data graph with the Dynamic Data Display library

5.2.5 Database Management

Throughout this text we have referred to the database in which the test result data and the patient profile information are stored. The system storage functionality relies on a local SQLite database [90] deployed on the user’s computer storage disk, within the directories of the application. The SQLite database engine is different from client/server SQL database engines such as MySQL, Oracle, PostgreSQL or SQL Server. It provides internal storage in a single file, and it has been chosen due to the characteristics of our project, such as the small amount of stored data and the absence of concurrent queries. This database engine fits the system characteristics, since the amount of data managed is able to fit in a single disk file and there will not be more than one user sending queries. For these reasons, it offers a good trade-off between performance and simplicity, making it better for our system than a client/server database engine.


The database created for the system is called "SLLdatabase". It contains two tables to store all the information already mentioned:

• users. This table stores the personal information of all the patients registered in the application. In order to identify them by an unambiguous key, a unique personal ID is associated with each patient when they are added to the database. This field has the PRIMARY KEY property and is assigned automatically and auto-incremented when a new user is added. Thus, the fields in this table are:

– ID (integer primary key). ID number of the patient.

– Name (string). Name of the patient.

– Age (int). Age of the patient.

– Sex (string). Male or female.

– Height (real). Height of the patient in cm.

– Weight (real). Weight of the patient in kg.

– Comments (string). Any comment about the patient's health condition.

The REAL data type in SQLite is equivalent to the double data type in C#.

• tests. This table stores all the information related to the test results. It is related to the users table through the ID field. However, in this table the ID is not a PRIMARY KEY, since it is extracted from the patient information when a test is performed. The information stored in this table is the following:

– ID (int). ID number of the patient.

– Test (string). Name of the test performed.

– Date (string). Date of the performance with the format DD-MM-YYYY.

– Angle (real). Result of the test.

– Leg (string). Evaluated leg (only for unipodal tests).

– Comments (string). Any comment about the test (for example, varus or

valgus).


The SQLite library provides several classes to handle database management. However, in order to simplify the process, a custom Database class has been created:

using System.Data;
using System.Data.SQLite;
using System.Windows.Controls;

class Database
{
    public SQLiteConnection Connection { get; set; }
    public SQLiteDataAdapter Adapter { get; set; }
    public SQLiteCommand Command { get; set; }
    private DataTable table;
    private DataGrid grid;

    public void ExecuteCommand(string _command)
    {
        Connection.Open();
        SQLiteCommand command = new SQLiteCommand(_command, Connection);
        command.ExecuteNonQuery();
        Connection.Close();
    }

    public void ExecuteGridCommand(string command,
        DataTable _table, DataGrid _dataGrid)
    {
        table = _table;
        grid = _dataGrid;
        ExecuteCommand(command);
        if (table != null)
        {
            table.Clear();
            Adapter.Fill(table);
            grid.ItemsSource = table.DefaultView;
        }
    }
}

Code 5.8: Database management class

The first things to analyze are the classes provided by the SQLite library:

• SQLiteConnection. It contains a string representing the connection to the database, that is, the source of the data when a query is sent. Our connection string is the following:

"Data Source=data\\SLLdatabase.sqlite;Version=3;"


• SQLiteDataAdapter. In most cases, the information in the database is not only stored but also displayed in tables or grids. This class allows us to adapt the queried data to a DataTable object, which will fill a DataGrid XAML control.

• SQLiteCommand. It contains another string with the query command written in the SQLite language. The ExecuteNonQuery() and ExecuteScalar() methods allow us to execute those SQLite commands.

The two methods of the class simplify the execution of SQLite commands. The difference between them is that the ExecuteCommand method only executes the command, while the ExecuteGridCommand method also updates the DataGrid after the execution, so it is designed for those situations that require the visualization of the database table.

The first step of the execution is the definition of the connection. Then, every time we send a query, the connection must be opened. Once opened, we need to create an SQLite command, providing the command string and the connection. Right after, we execute the command with the ExecuteNonQuery method and close the connection.

Table Creation

The creation of the database is implemented in the Start Window code-behind, at application start-up. Once the database object is created and the connection has been established, we use the ExecuteCommand method to create the two tables with the following SQLite commands:

"CREATE TABLE users (ID INTEGER PRIMARY KEY, Name VARCHAR(20), Age INT,

Sex VARCHAR(6), Height REAL, Weight REAL, Comments VARCHAR(100))"

"CREATE TABLE tests (ID INT, Test TEXT, Date VARCHAR(20), Angle REAL,

Leg VARCHAR(5), Comments TEXT)"
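As an illustration, a minimal sketch of how these commands could be issued through the custom Database class at start-up is shown below. It is not taken from the original listing: the InitializeDatabase method name is hypothetical, and IF NOT EXISTS is added only so the sketch can be run repeatedly (the application itself checks for the database file beforehand, as described in Section 5.4).

// Sketch: create the database file (if needed) and the two tables at start-up.
private void InitializeDatabase(Database db)
{
    db.Connection = new SQLiteConnection(
        "Data Source=data\\SLLdatabase.sqlite;Version=3;");

    db.ExecuteCommand("CREATE TABLE IF NOT EXISTS users (ID INTEGER PRIMARY KEY, " +
        "Name VARCHAR(20), Age INT, Sex VARCHAR(6), Height REAL, Weight REAL, " +
        "Comments VARCHAR(100))");

    db.ExecuteCommand("CREATE TABLE IF NOT EXISTS tests (ID INT, Test TEXT, " +
        "Date VARCHAR(20), Angle REAL, Leg VARCHAR(5), Comments TEXT)");
}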

Data Display

To display the data in both the Database Window and the Historic Window, we use the SQLiteDataAdapter. The first step is to set up the adapter with a SELECT SQLite command:


db.Adapter = new SQLiteDataAdapter("SELECT * FROM users", db.Connection);

This command selects all the information from the users table. Then, we fill a DataTable

object with the information queried:

dbTable = new DataTable();

db.Adapter.Fill(dbTable);

Finally, we set the table as the source of the elements displayed on the DataGrid control

of the XAML code file:

dbGrid.ItemsSource = dbTable.DefaultView;

Table Filling

In order to fill the tables with information, we follow the same process as in the table creation, but we change the SQLite command. To insert new information, we use the Profile Dialog shown in Section 5.1.2. We extract the data introduced in the TextBox controls and insert them into the database by executing the following command:

"INSERT INTO users (Name, Age, Sex, Height, Weight, Comments) VALUES (
@name, @age, @sex, @height, @weight, @comments)"

Each value preceded by an @ symbol is a parameter. We can substitute them with our data using the following command (example):

db.Command.Parameters.AddWithValue("@name", "Patient 1");
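Putting the pieces together, a minimal sketch of the whole insertion step could look as follows; the TextBox names (nameBox, ageBox, sexBox, heightBox, weightBox, commentsBox) are hypothetical and only indicate where the values come from.

// Sketch: build and execute the parameterized INSERT for a new profile.
db.Connection.Open();
db.Command = new SQLiteCommand(
    "INSERT INTO users (Name, Age, Sex, Height, Weight, Comments) " +
    "VALUES (@name, @age, @sex, @height, @weight, @comments)", db.Connection);
db.Command.Parameters.AddWithValue("@name", nameBox.Text);
db.Command.Parameters.AddWithValue("@age", int.Parse(ageBox.Text));
db.Command.Parameters.AddWithValue("@sex", sexBox.Text);
db.Command.Parameters.AddWithValue("@height", double.Parse(heightBox.Text));
db.Command.Parameters.AddWithValue("@weight", double.Parse(weightBox.Text));
db.Command.Parameters.AddWithValue("@comments", commentsBox.Text);
db.Command.ExecuteNonQuery();
db.Connection.Close();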

The update process is similar to the insertion. First of all, we have to select a user from the DataGrid and click the Edit Profile button. Then, we extract the patient's data from the DataTable using a DataRowView object, which allows us to access the cells of each table row. The Profile Dialog fields are filled with this information, and when OK is pressed, any field whose information has changed is updated in the database by simply changing the SQLite command:


"UPDATE users SET Name =’@name’, Age = @age, Sex = ’@sexx’, Height =

@height, Weight = @weight, Comments = ’@comments’ WHERE ID = @id"

Finally, for the user removal process, we only need to extract the patient's ID from the table and use the following command, as sketched below:

"DELETE from users where ID = @id"

Row Search

When test results are written to the tests table, each row is identified by the patient ID, the test name, the date and the evaluated leg. However, if the same test is performed by the same user, with the same leg and on the same date, the data has to be overwritten, so that two rows with the same identifying information, which would lose their uniqueness, are never created.

For that reason, we have designed a method which counts all the rows of the table matching the specified information and returns a bool indicating whether the row exists or not. For this task, it sends a query to the database through the ExecuteScalar method, which returns the value of the counter. If the counter value is 0, no rows exist with this information.

public bool RowExists(object ID, string date, string leg)
{
    db.Connection.Open();
    db.Command = new SQLiteCommand(
        "SELECT COUNT(*) FROM tests WHERE ID = @id AND " +
        "Test = 'Single Leg Landing' AND Date = @date AND " +
        "Leg = @leg", db.Connection);
    db.Command.Parameters.AddWithValue("@id", ID);
    db.Command.Parameters.AddWithValue("@date", date);
    db.Command.Parameters.AddWithValue("@leg", leg);
    // COUNT(*) is returned by SQLite as a 64-bit integer
    long userCount = (long)db.Command.ExecuteScalar();
    db.Connection.Close();
    return userCount > 0;
}

Code 5.9: Custom method for row searching into database
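As an illustration of how this check can drive the Save Results logic, the following sketch chooses between updating the existing row and inserting a new one. The exact commands used by the application are not reproduced in the thesis, so the SQL and the variable names (patientID, maxFPPA, comments) are only indicative.

// Sketch: overwrite the existing row if it exists, otherwise insert a new one.
string sql = RowExists(patientID, date, leg)
    ? "UPDATE tests SET Angle = @angle, Comments = @comments " +
      "WHERE ID = @id AND Test = 'Single Leg Landing' AND Date = @date AND Leg = @leg"
    : "INSERT INTO tests (ID, Test, Date, Angle, Leg, Comments) " +
      "VALUES (@id, 'Single Leg Landing', @date, @angle, @leg, @comments)";

db.Connection.Open();
db.Command = new SQLiteCommand(sql, db.Connection);
db.Command.Parameters.AddWithValue("@id", patientID);
db.Command.Parameters.AddWithValue("@date", date);
db.Command.Parameters.AddWithValue("@leg", leg);
db.Command.Parameters.AddWithValue("@angle", maxFPPA);
db.Command.Parameters.AddWithValue("@comments", comments);
db.Command.ExecuteNonQuery();
db.Connection.Close();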


5.2.6 Extra Features

Here we describe how the extra features of the project have been implemented. We call them "extra" because, although they are not essential for achieving the project's objectives, they provide useful functions that enhance the expert's work.

Screenshots

As mentioned earlier, when the maximum FPPA angle is reached during a test, a screenshot of the image is taken and stored. For this task, we define a second bitmap (screenShotBitmap) with the same characteristics as the image bitmap. Every time the FPPA angle exceeds the previous maximum value (new maximum reached), we copy the pixels of the image bitmap to an array (data):

int stride = bitmap.BackBufferStride;

byte[] data = new byte[stride * bitmap.PixelHeight];

bitmap.CopyPixels(data, stride, 0);
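The copied pixels then have to be written into screenShotBitmap so that the frame is preserved while the live image keeps updating. This intermediate step is not shown in the thesis listing; the following line is only a sketch, assuming both bitmaps are WriteableBitmap objects of identical size and format.

// Sketch: freeze the current frame into the screenshot bitmap.
screenShotBitmap.WritePixels(
    new Int32Rect(0, 0, bitmap.PixelWidth, bitmap.PixelHeight),
    data, stride, 0);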

Once the test has finished, the screenshot is only saved if the Save Results button is pressed (the same procedure as for the database storage). When it is pressed, it triggers the following method, which encodes the bitmap with a BitmapEncoder object and writes it to a FileStream, which is disposed of when finished:

private void BitmapEncoding(string path, string directory)
{
    if (!Directory.Exists(directory)) {
        Directory.CreateDirectory(directory); }

    using (FileStream fs = new FileStream(path, FileMode.Create))
    {
        BitmapEncoder encoder = new PngBitmapEncoder();
        encoder.Frames.Add(BitmapFrame.Create(screenShotBitmap));
        encoder.Save(fs);
    }
}

Code 5.10: Bitmap encoding method


GUI Data Binding

When designing the GUI, much of the text displayed on the windows shows data that changes during the execution of a test. In order to avoid changing those TextBlock controls manually, we use so-called data binding. It is the easiest way to bring data from the code-behind to the GUI layer.

First of all, we have to implement the INotifyPropertyChanged interface in our window class. By doing that, the objects bound to the XAML controls are capable of alerting the GUI layer of changes to their properties. Then, we bind an object to the property of a control. For example, the trial text:

<TextBlock FontSize="22" Text="Binding TrialNumber,

UpdateSourceTrigger=PropertyChanged"/>

Finally, we declare the PropertyChanged event and a property (TrialNumber) that raises it whenever the bound value changes:

public event PropertyChangedEventHandler PropertyChanged;

public int TrialNumber
{
    get { return trialNumber; }
    set
    {
        if (trialNumber != value)
        {
            trialNumber = value;
            PropertyChanged?.Invoke(this,
                new PropertyChangedEventArgs("TrialNumber"));
        }
    }
}

Code 5.11: Process to bind data to GUI element properties
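For these bindings to resolve, the window instance must also be exposed as the data context of the XAML layer. A minimal sketch of that step, typically placed in the window constructor:

// Sketch: make the window itself the source object of the {Binding ...} expressions.
// Placed right after InitializeComponent() in the window constructor.
DataContext = this;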

Data Exporting

There are two situations in which we export data to external files. The first one is the .csv file export, where we export all the test data so that it remains available for the offline post-processing described in Section 5.3. The comma-separated value file is a string of values separated by a special character, in our case the semicolon character (;).


Each time we receive frame data, the angle value is appended to a string (fileData) with the corresponding semicolon, and a data counter is increased:

fileData += (FPPAangle.ToString() + "; ");

dataCounter++;

When each trial is finished, the data is written to the .csv file by the SaveDataToCSV method:

private string filePath = @"data\SLLdata.csv";

private void SaveDataToCSV()
{
    fileData = dataCounter.ToString() + ";" + fileData;
    if (TrialNumber == 1) {
        File.WriteAllText(filePath, fileData + Environment.NewLine); }
    else File.AppendAllText(filePath, fileData + Environment.NewLine);
    fileData = null;
    dataCounter = 0;
}

Code 5.12: Export of the data to a CSV file

The data counter is used to identify the number of elements of each row (each trial), since the amount of data may differ between trials. The other situation is the Export to EXCEL file option for the historical data in the Historic Window. To manage these files, we use the Microsoft.Office.Interop library, which allows us to interoperate with Office files. We copy the elements of the DataTable object adapted from the database to the EXCEL sheet:


using Excel = Microsoft.Office.Interop.Excel;

private void excelButton_Click(object sender, RoutedEventArgs e)
{
    Excel.Application app = new Excel.Application { Visible = false };
    Excel.Workbook wb = app.Workbooks.Add(
        System.Reflection.Missing.Value);
    Excel.Worksheet sheet = wb.Sheets[1];
    sheet.Name = "Test data";

    // Write the column headers in the first row
    for (int i = 0; i < dbTable.Columns.Count; i++)
        sheet.Cells[1, i + 1] = dbTable.Columns[i].ColumnName;

    // Write the table contents, one database row per sheet row
    for (int i = 0; i < dbTable.Rows.Count; i++)
    {
        for (int j = 0; j < dbTable.Columns.Count; j++)
            sheet.Cells[i + 2, j + 1] = dbTable.Rows[i][j];
    }

    wb.Close(true);
    app.Quit();
}

Code 5.13: Export of the data to an EXCEL file

5.3 Data Post-processing

As explained in the previous section, the system writes all the test data to a comma-separated value file which is stored in the personal directory of the patient. This file is intended to give the chance of performing an offline post-processing of the results for one or more users and tests.

This post-processing has not been included in this thesis due to time restrictions. However, we have developed a script for Matlab which extracts the information from the .csv file and stores it in a matrix that can be used to represent or process the data. The changes between commas, dots and semicolons are necessary because the application exports the data with commas as decimal separators and semicolons as value separators (e.g. 12,3; 11,5), whereas Matlab reads it with dots as decimal separators and commas as value separators (e.g. 12.3, 11.5). For the reading, the file is copied to another .csv file (because of the number format change) and mapped to memory, from where it is read.


% FILE SETUP
% Map the file and adapt it to a simple csv format
copyfile('SLLdata.csv', 'SLLdataMatlab.csv');
file = memmapfile('SLLdataMatlab.csv', 'writable', true);
comma = uint8(',');
dot = uint8('.');
semicolon = uint8(';');
% Change commas to dots and semicolons to commas if required
if sum(file.Data == dot) == 0
    file.Data(transpose(file.Data == comma)) = dot;
end
file.Data(transpose(file.Data == semicolon)) = comma;
clear file

% DATA EXTRACTION
% The first cell of each row indicates the number of data values in the row
size1 = csvread('SLLdataMatlab.csv',0,0,[0,0,0,0]);
size2 = csvread('SLLdataMatlab.csv',1,0,[1,0,1,0]);
size3 = csvread('SLLdataMatlab.csv',2,0,[2,0,2,0]);
% Each row represents the data of one trial
trial1 = csvread('SLLdataMatlab.csv',0,1,[0,1,0,size1]);
trial2 = csvread('SLLdataMatlab.csv',1,1,[1,1,1,size2]);
trial3 = csvread('SLLdataMatlab.csv',2,1,[2,1,2,size3]);
% Set the time scale of each trial (sampling at 30 Hz)
time1 = [0 : 1/30 : (size(trial1,2)-1)/30];
time2 = [0 : 1/30 : (size(trial2,2)-1)/30];
time3 = [0 : 1/30 : (size(trial3,2)-1)/30];

Code 5.14: Matlab script for CSV data extraction

5.4 System Robustness

Since the developed system will be used by external users, it is exposed to several errors and failures due to incorrect usage. During the application development, we have invested a considerable amount of time in improving the system robustness against possible errors. In this section we describe some examples of situations where the robustness has been improved.

The first example is within the Database Window. To edit or delete a profile, the user needs to have a profile selected. If not, the corresponding buttons do not trigger


any method. Another example is the Profile Dialog. We have to ensure that the correct data type is inserted in all fields, and that all fields are filled (except for the comments field, which can be empty). To ensure this, we check the length of the text input in every field and use the TryParse method on the age, height and weight fields, as sketched below. If any field is empty or one of those fields contains a non-numeric character, the system shows an alert message and prompts the user to correct the information.
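A minimal sketch of this validation, assuming hypothetical control names (nameBox, ageBox, heightBox, weightBox), is the following:

// Sketch: validate the Profile Dialog fields before saving the profile.
int age;
double height, weight;
bool valid =
    nameBox.Text.Length > 0 &&
    int.TryParse(ageBox.Text, out age) &&
    double.TryParse(heightBox.Text, out height) &&
    double.TryParse(weightBox.Text, out weight);

if (!valid)
{
    MessageBox.Show("Please fill all the fields with valid values.");
    return;
}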

The rest of the main examples are in the Test Window. First of all, the control panel is disabled while the sensor is not connected. Then, no button works until a user is selected from the ComboBox list. When the Start button is clicked and the test is running, all buttons are disabled except for the Stop one. Finally, once a test has begun, the leg selection is also disabled until the New Test button is clicked.

In addition, the creation of any file, such as the database file or the personal directories, first checks whether the file already exists, in order to decide between creating a new file and overwriting the existing one.

Finally, it is worth noting that all these aspects constitute a first approach to protection against SQL injection. SQL injection consists in the creation or alteration of SQL commands by an external user, which could be used to expose or overwrite hidden data or to execute dangerous commands at system level on the computer that hosts the database.
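To illustrate the point, the following sketch contrasts a vulnerable, concatenated command with the parameterized form used throughout the application; the nameBox control is hypothetical.

// Vulnerable sketch: user input is concatenated into the SQL text, so an input
// such as  '); DROP TABLE users; --  would alter the command itself.
string unsafeSql = "INSERT INTO users (Name) VALUES ('" + nameBox.Text + "')";

// Parameterized form: the value is bound as data and is never parsed as SQL.
SQLiteCommand safeCmd = new SQLiteCommand(
    "INSERT INTO users (Name) VALUES (@name)", db.Connection);
safeCmd.Parameters.AddWithValue("@name", nameBox.Text);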


6 Evaluation

This chapter describes the evaluation carried out to check the reliability of the developed system and to validate it against a widely used 2D video analysis software: Kinovea [6]. This tool was presented in Section 2.2.3. It allows the user to analyze video frames, freeze them and manually mark the body joints to obtain the joint angles.

To perform the study, ten volunteers, five males and five females ranging from 18 to 36 years old, were recruited to be evaluated in a Single Leg Landing test by an external physical therapist using both Kinovea and the proposed system. Before performing the evaluation, the volunteers were informed about the research aims, risks and benefits of participation. The execution of the test was recorded simultaneously with the Kinect sensor and with a digital color video camera. Three attempts were performed by each participant with a rest time of 30 seconds. The SLL test procedure was explained to the subjects before performing the evaluation, ensuring that they fully understood it. The procedure followed is the one explained in Section 3.1. The marker placement was done by the physical therapist on the anatomical points proposed in the aforementioned section.

The first part of this evaluation aims at estimating the inter-rater reliability between both measurement systems. The results of each patient in each trial are gathered in Table 6.1. As can be observed, the results obtained through both methods are generally similar, which reflects the utility of the developed system. Measurements performed with the Kinovea software do not have decimal places, since this software limits the values obtained to integer numbers.


Patient ID      1     2     3     4     5     6     7     8     9    10
Age            26    22    21    23    36    18    31    25    32    24
Trial 1 (S)  12.4  13.5  14.9  14.1  18.7  20.2  10.8   9.7  13.4  17.1
Trial 1 (K)    12    14    15    14    18    21    12     9    13    17
Trial 2 (S)  17.6  21.4  12.8  15.1  15.8  16.2   6.7  12.6    15    21
Trial 2 (K)    18    22    14    16    16    17     5    12    15    21
Trial 3 (S)  18.5  16.3   9.6  23.3  18.2  17.4   8.5   8.7  12.4  18.7
Trial 3 (K)    18    16    10    23    18    17     8     9    12    19
Average (S)  16.2  17.1  12.4  17.5  17.6  17.9   8.7  10.3  13.6  18.9
Average (K)    16  17.3    13  17.7  17.3  18.3   8.3    10  13.3    19

(K) Kinovea
(S) Proposed system

Table 6.1: Case study results. Angle values are expressed in degrees (°).

To perform a formal statistical analysis we consider three indicators: the Intraclass Correlation Coefficient (ICC, ρ) [94], Cronbach's α estimator [95] and the Bland-Altman plot, with its "limits of agreement" statistics for continuous variables [96]. These methods are widely used in the clinical domain to evaluate the agreement between two measurement methods. The results of this analysis have been obtained using the SPSS Statistics v.22 software, and are shown in Table 6.2.

Variable           ICC (ρ)*   95% CI of ICC   Cronbach's α
FPPA on SLL test   0.914      0.796-0.975     0.909

*ICC (ρ) was calculated using a one-way random model

Table 6.2: Inter-rater reliability between Kinovea and the proposed system.

First of all, the ICC (ρ) and its confidence intervals (CI) are calculated for the inter-rater reliability trials. According to [97], an ICC (ρ) value under 0.3 reflects poor inter-rater reliability; the range from 0.31 to 0.7 represents bad to moderate reliability; values from 0.71 to 0.9 reflect good reliability; and values higher than 0.9 are considered excellent. The results obtained for this evaluation (Table 6.2) can therefore be categorized as


excellent inter-rater reliability.

For Cronbach's α, a value less than 0.5 is considered unacceptable, 0.5 to 0.59 is poor, 0.6 to 0.69 is questionable, 0.7 to 0.79 is acceptable, 0.8 to 0.89 is good and 0.9 to 1 is excellent [98]. The value obtained in the present evaluation (0.909) therefore indicates an excellent reliability between the two methods.
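For completeness, the Cronbach's α value reported here follows the standard estimator from [95], applied with k = 2 measurement methods, where σ²_{Y_i} is the variance of the measurements of method i and σ²_X is the variance of their sum:

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)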

Finally, the Bland-Altman plot, already used in Section 4.3, helps to understand the agreement between the methods. The result obtained is shown in Figure 6.1. The plot shows that all the measurements fall within the 95% limits of agreement, which does not suggest any relevant disagreement. Moreover, the mean value of the differences is almost 0, which indicates that there is no systematic bias between the two measurement methods.

Figure 6.1: Bland-Altman plot for the agreement analysis between measurement methods. The three dotted lines indicate the mean of the differences (x̄) and the Bland-Altman "limits of agreement" (x̄ ± 1.96σ_X).
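For reference, the limits of agreement drawn in the plot are obtained from the differences d_i between the n paired measurements of the two methods as:

\bar{d} = \frac{1}{n}\sum_{i=1}^{n} d_i, \qquad
s_d = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(d_i - \bar{d}\right)^2}, \qquad
\text{limits of agreement} = \bar{d} \pm 1.96\, s_d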

Although the results of the study are promising, a study including a higher number of participants would be required to further confirm these findings. For future evaluations, we propose to perform a randomised clinical trial with a much higher number of participants. Another interesting study would be a usability study with the System Usability Scale (SUS) [99], using a Likert scale [100] in place of the usual numeric scale.


7 Conclusions and Future Work

7.1 Conclusions

The measurement of knee alignment has been demonstrated to be a good indicator of the anterior cruciate ligament (ACL) injury risk. In order to quantify this measurement, the Frontal Plane Projection Angle (FPPA) is widely used. In this project, we have carried out a research effort, performing an extensive study of the state of the art in current FPPA measurement techniques and extracting their limitations. The key limitations found are high costs and great time investments.

In order to cope with these limitations, this work presents an automatic 2D video analysis system to support experts during dynamic measurements of the FPPA. The system requires placing three retro-reflective markers on the lower limb joints of the subjects, and it makes use of a cost-effective depth sensor to automatically track these markers and to compute and display the real-time measurement of the FPPA. The system also provides a computer application from which experts can manage the test performance, store the test results in a local database, perform an online analysis of them through graphic representations and export the information to external programs. This system significantly simplifies the routine of the expert, expedites the analysis of the results and helps to manage the information obtained when evaluating multiple subjects. An initial evaluation of the system has been performed in order to show the potential of this tool. The results of the study show an excellent reliability, which suggests a good acceptance of the system among the physical therapy community.


From a personal point of view, there are further cross-objectives that have been accomplished throughout the development of this thesis:

• The integration of several disciplines (some of them previously unknown to me) within the context of the project, performing extensive research on them.

• The selection of the best solution for the presented challenge, using novel technologies and devices, such as the Kinect V2 sensor.

• The learning of new programming languages (C#, XAML) and the use of multiple tools during the development (Kinect SDK, Microsoft Visual Studio, Matlab, SPSS Statistics), as well as a relational database management system (SQLite).

• The development of a computer application intended to be used by external users, which therefore includes an extensive robustness layer that prevents the user from introducing wrong information or triggering elements that could lead to a program crash.

• The high potential of the work, which allows it to become a successful line of research leading to publications in scientific journals and conferences.

7.2 Future Work

Given the results of this project and all its future possibilities, we propose some future work that can be carried out as a continuation of the project:

1. Decision support. The system is intended to autonomously give the experts indications about the subject's ACL injury risk and possible actions to take, based on both the current and previous FPPA values existing in the database, as well as on a possible knowledge base built from experts' indications and research results.

2. Application development. Many improvements can be introduced in the computer application. The first one is to add the other tests shown on the Start Window, which has been skipped in this work due to the similarity of the tests and the time restrictions. Another one is the analysis of the spread of the measurements, in order to automatically detect outliers and potentially wrong results and recommend


the performance of additional tests. This does not mean that the application is in an incomplete state; on the contrary, it can already be used to perform the FPPA analysis during the SLL test. However, it shows the prospects of this tool. Finally, but not least important, the visual appearance of the application is being improved.

3. Marker tracking. We are working on the improvement of the marker tracking procedure. A possible approach would be to use three markers per joint, placed around the joint, and compute the center of mass of the markers. This could be useful for ensuring that minor joint rotations do not cause marker occlusion and for estimating these rotations, in order to study their contribution to the FPPA. We are also working on the automatic detection of which leg wears the markers, of whether the hands are correctly kept on the hips, and of whether the contralateral leg makes any contact with the floor.

Finally, I would like to highlight the scope of this project. Its main intention is to simplify as much as possible the work of health experts, in order to contribute to the introduction of novel technologies in the healthcare field and to the improvement of quality of life. A paper presenting the first approach of this project has been submitted to the 10th International Conference on Ubiquitous Computing & Ambient Intelligence (UCAmI), and we are looking forward to the notification of acceptance. A patent of the system has also been proposed, which is currently under study.


Appendix

User Manual

This manual describes the guidelines for the use of the system's computer application. The first step after opening it is the creation of new user profiles. Once the application has been started, click on the View Database button (Figure 7.1a). Inside the database window, click on New Profile to open a dialog where the personal information of the profile has to be introduced (Figure 7.1b). Remember to use the indicated units for the information; otherwise, a message box will show up alerting you about it. Click OK to create the profile.

Figure 7.1: (a) View Database button click, (b) profile creation dialog


This window also allows the editing and deletion of users. For this task, first select the desired profile (Figure 7.2a) from the database and then click on Edit Profile or Delete Profile. In the first case, the profile creation dialog will show up filled with the profile's personal information. In the second one, a message box will show up asking for confirmation of the deletion (Figure 7.2b).

Figure 7.2: (a) profile selection for editing and deleting, (b) confirmation of userdeletion

Once the users have been created, the test performance can be started. Click on the arrow button at the top right of the database window to return to the start window. There you can select any of the offered tests. We are going to show the example of the "Single Leg Landing" test. Click on the Single Leg Landing button to go to the test window (Figure 7.3b). This window is divided into two sides: the left side shows the image of the camera, and the right side shows the control panel.

The first step is the sensor connection. The status bar at the bottom of the window gives information about the sensor's connection status and the test progress. The camera image will remain black until the sensor is plugged into the computer (Figure 7.3a).


Figure 7.3: (a) Test window with sensor unplugged, (b) test window working

If the markers are not properly tracked at the beginning, ask the subject to perform some trial movements closer to the camera. The markers' image will appear in a few seconds. To perform a test, the first step is to select the desired user from the list at the top of the control panel. Then, select the evaluated leg at the bottom of the panel and click Start to begin the test. While the test is running, click the Stop button to finish the trial. The results of the test will appear in the middle of the control panel. You can start a new test at any moment by clicking on New Test. Once the test is finished, clicking on View Results will redirect to the results window (Figure 7.4).

Figure 7.4: Results window


This window shows a graph with the evolution of the FPPA during the test. Each trial has its own graph, which can be selected from the list at the top of the window. Finally, to save the results, click on Save Results at the top right corner of the window. The results will be stored in the database.

To examine the results of all the profiles included in the database, return to the start window and click on Historical Trends. In this window (Figure 7.5), select any combination of user and test from the two lists at the top of the window. If an individual user and test are selected, the chart at the bottom reflects the evolution of the results of this test for this patient. To export the results shown in the table to an external EXCEL file, click on Save to EXCEL file. A dialog window will ask you for the directory of the file.

Figure 7.5: Historic window


List of Acronyms

ACL Anterior Cruciate Ligament

API Application Programming Interface

DJL Drop Jump Landing

FOV Field Of View

GPL General Public License

GUI Graphic User Interface

HMI Human-Machine Interface

ICC Intraclass Correlation Coefficient

IDE Integrated Development Environment

IMU Inertial Measurement Unit

IR Infrared

NCAA National Collegiate Athletic Association

PFPS Patellofemoral Pain Syndrome

RMSE Root-Mean-Square Error

SLL Single Leg Landing

SLS Single Leg Squat

SUS System Usability Scale

ToF Time-of-Flight

VDJ Vertical Drop Jump

WPF Windows Presentation Foundation


References

[1] Hootman, J.M., Dick, R., Agel, J.: Epidemiology of collegiate injuries for 15 sports: Summary and recommendations for injury prevention initiatives. Journal of Athletic Training 42(2) (June 2007) 311–319

[2] Hewett, T.E., Meyer, G.D., Ford, K.R., Heidt, R.S., Colosimo, A.J., McLean,S.G., van den Bogert, A.J., Patterno, M.V., Succop, P.: Biomechanical measuresof neuromuscular control and valgus loading of the knee predict anterior cruciateligament injury risk in female athletes. The American Journal of Sports Medicine33(4) (2005) 492–501

[3] Willson, J.D., Davis, I.S.: Utility of the frontal plane projection angle in females with patellofemoral pain. Journal of Orthopaedic & Sports Physical Therapy 38(10) (2008) 606–615

[4] SVB: Silicon Valley Bank. Available online: http://www.svb.com/. Accessed: 2016-05-08

[5] National University of Singapore. Available online:http://www.comp.nus.edu.sg/vision/resources/sensor lab.html. Accessed: 2016-05-15

[6] Kinovea Association: Kinovea. Available online: http://www.kinovea.org/. Ac-cessed: 2016-03-20

[7] Geerse, D.J., Coolen, B.H., Roerdink, M.: Kinematic validation of a multi-Kinectv2 instrumented 10-meter walkway for quantitative gait assessments. PLOS ONE13 (October 2015)

[8] Lower, B., Relyea, B.: Programming Kinect for Windows V2: Jump Start. Avail-able online: https://mva.microsoft.com/en-US/training-courses/programming-kinect-for-windows-v2-jump-start. Accessed: 2016-02-17

[9] Soufiane Boufous, R.D., Finch, C.: A profile of hospitalisations and deaths due to sport and leisure injuries: Sports injury report. NSW Injury Risk Management Research Centre (September 2006)

[10] Acevedo, R.J., Rivera-Vega, A., Miranda, G., Micheo, W.: Anterior cruciateligament injury: Identification of risk factors and prevention strategies. CurrentSports Medicine Reports 13(3) (May 2014) 186–191


[11] Feller, J., Webster, K.E.: Return to sport following anterior cruciate ligamentreconstruction. International Orthopaedics 37 (2013) 285–290

[12] Hewett, T.E., Stasi, S.L.D., Myer, G.D.: Current concepts for injury preventionin athletes after anterior cruciate ligament reconstruction. American Journal ofSports Medicine 41(1) (January 2013) 216–224

[13] III, R.C.M., Koenig, L., Kocher, M.S., Dall, T.M., Gallo, P., Scott, D.J., Jr.,B.R.B., Group, M.K., Spindler, K.P.: Societal and economic impact of anteriorcruciate ligament tears. The Journal of Bone & Joint Surgery 95(19) (October2013) 1751–1759

[14] Sadoghi, P., von Keudell, A., Vavken, P.: Effectiveness of anterior cruciate liga-ment injury prevention training programs. The Journal of Bone & Joint Surgery94(9) (May 2012) 769–776

[15] Kato, S., Urabe, Y., Kawamura, K.: Alignment control exercise changes lowerextremity movement during stop movement in female basketball players. Knee15(4) (August 2008) 199–304

[16] Boden, B.P., Dean, G.S., Feagin, J.A.J., Garrett, W.E.J.: Mechanisms of anteriorcruciate ligament injury. Orthopaedics 23(573-578) (2000)

[17] Hewett, T.E., Meyer, G.D.: The mechanistic connection between the trunk, hip,knee, and anterior cruciate ligament injury. Exercise & Sport Sciences Reviews39(5) (October 2011) 161–166

[18] Willson, J.D., Ireland, M.L., Davis, I.: Core strength and lower extremity align-ment during single leg squats. Medicine & Science in Sports & Exercise (2006)

[19] Munro, A., Herrington, L., Carolan, M.: Reliability of 2-dimensional video as-sessment of frontal-plane dynamic knee valgus during common athletic screeningtasks. Journal of Sport Rehabilitation 21 (2012) 7–11

[20] McLean, S.G., Walker, K., Ford, K.R., Meyer, G.D., Hewett, T.E., van den Bogert,A.J.: Evaluation of a two dimensional analysis method as a screening and evalu-ation tool for anterior cruciate ligament injury. British Journal of Sport Medicine39 (2005) 355–362

[21] Ditmyer, M.M., Topp, R., Pifer, M.: Prehabilitation in preparation for orthopaedicsurgery. Orthopaedic Nursing 21(5) (2002) 43–54

[22] Lee, E.O., Emanuel, E.J.: Shared decision making to improve care and reducecosts. New England Journal of Medicine 368(1) (2013) 6–8

[23] Microsoft: Kinect for Windows V2. Available online:https://developer.microsoft.com/es-es/windows/kinect. Accessed: 2016-05-08

[24] Chen, L., Yan, N., Kiang, M., Muth, A.S., Krishna, K.S.: Innomotion: a web-based rehabilitation system helping patients recover and gain self-awareness oftheir body away from the clinic. In: CHI ’14 Extended Abstracts on HumanFactors in Computing Systems. (2014) 233–238


[25] Exo, J., Kochanek, P.M., Adelson, P.D., Greene, S., Clark, R.S.B., Bayir, H., Wisniewski, S.S., Bell, M.J.: Intracranial pressure monitoring systems in children with traumatic brain injury: combining therapeutic and diagnostic tools. Pediatric Critical Care Medicine 12(5) (September 2011) 560–565

[26] Banos, O., Moral-Munoz, J.A., Diaz-Reyes, I., Arroyo-Morales, M., Damas, M.,Herrera-Viedma, E., Hong, C.S., Lee, S., Pomares, H., Rojas, I., Villalonga, C.:mdurance: A novel mobile health system to support trunk endurance assessment.Sensors Journal 15 (2015) 13159–13183

[27] Gross, D.P., Zhang, J., Steenstra, I., Barnsley, S., Haws, C., Amell, T., McIntosh, G., Cooper, J., Zaiane, O.: Development of a computer-based clinical decision support tool for selecting appropriate rehabilitation interventions for injured workers. Journal of Occupational Rehabilitation 23 (March 2013) 597–609

[28] Informes de evaluación de tecnologías sanitarias: Patient decision aids tool for breast cancer. Technical report, Innovation and Science Ministry of Spain (September 2007)

[29] Galvin, J., Levac, D.: Facilitating clinical decision-making about the use of virtualreality within paediatric motor rehabilitation: Describing and classifying virtualreality systems. Developmental Neurorehabilitation 14(2) (2011) 112–122

[30] Olivares, A., Olivares, G., Mula, F., Gorriz, J.M., Ramirez, J.: Wagyromag:Wireless sensor network for monitoring and processing human body movement inhealthcare applications. Journal of Systems Architecture 57 (2011) 905–915

[31] Favre, J., Jolles, B.M., Aissaoui, R., Aminian, K.: Ambulatory measurement of3d knee joint angle. Journal of Biomechanics 41 (2008) 1029–1035

[32] Hu, W., Charry, E., Umer, M., Ronchi, A., Taylor, S.: An inertial sensor system formeasurements of tibia angle with applications to knee valgus/varus detection. In:2014 IEEE Ninth International Conference on Intelligent Sensors, Sensor Networksand Information Processing (ISSNIP). (2014)

[33] Tik-Pui Fong, D., Chan, Y.: The use of wearable inertial motion sensors in humanlower limb biomechanics studies: A systematic review. Sensors Journal 10 (2010)11556–11565

[34] Chambers, R., Gabbett, T.J., Cole, M.H., Beard, A.: The use of wearable mi-crosensors to quantify sport-specific movements. Journal of Sports Medicine 45(2015) 1065–1081

[35] Madgwick, S.O.H., Harrison, A.J.L., Vaidyanathan, R.: Estimation of IMU andMARG orientation using a gradient descent algorithm. In: 2011 IEEE Interna-tional Conference on Rehabilitation Robotics (ICORR). (2011) 1–7

[36] Banos, O., Toth, M.A., Damas, M., Pomares, H., Rojas, I.: Dealing with theeffects of sensor displacement in wearable activity recognition. Sensors Journal14(6) (2014) 9995–10023


[37] Banos, O., Damas, M., Pomares, H., Rojas, I.: On the use of sensor fusion toreduce the impact of rotational and additive noise in human activity recognition.Sensors Journal 12(6) (2014) 8039–8054

[38] Burns, A., Greene, B.R., McGrath, M.J., O’Shea, T.J., Kuris, B., Ayer, S.M.,Stroiescu, F., Cionca, V.: SHIMMER - A wireless sensor platform for noninvasivebiomedical research. Sensors Journal 10(9) (2010) 1527–1534

[39] GaitUp: Physilog. Available online: http://www.gaitup.com/products/#RTK.Accessed: 2016-05-12

[40] XSENS: Xsens IMU. Available online: www.xsens.com. Accessed: 2016-05-12

[41] Stoll, J., Ren, H., Dupont, P.E.: Passive markers for tracking surgical instrumentsin real-time 3-d ultrasound imaging. IEEE Transactions on Medical Imaging 31(3)(2012) 563–575

[42] Maletsky, L.P., Sun, J., Morton, N.A.: Accuracy of an optical active-markersystem to track the relative motion of rigid bodies. Journal of Biomechanics 40(2007) 682–685

[43] Vicon Motion Systems Ltd.: Essentials of motion capture. (2002)

[44] Vicon Motion Systems Ltd.: Vicon Cameras. Available online:http://www.vicon.com/. Accessed: 2016-05-15

[45] Vicon Motion Systems Ltd.: Vicon Bonita brochure. Available online:http://www.vicon.com/file/vicon/bonita-brochure.pdf. Accessed: 2016-03-12

[46] NaturalPoint Inc.: Optitrak. Available online:http://http://www.optitrack.com/hardware/. Accessed: 2016-03-12

[47] Munro, A.G.: The use of two-dimensional motion analysis and functional perfor-mance tests for assessment of knee injury risk behaviours in athletes. PhD thesis,School of Health Sciences, University of Salford, Salford, UK (July 2013)

[48] Bittencourt, N.F.N., Ocarino, J.M., Mendonca, L.D., Hewett, T.E., Fonseca, S.T.:Foot and hip contributions to high frontal plane knee projection angle in athletes:A classification and regression tree approach. Journal of Orthopaedic & SportsPhysical Therapy 42(12) (December 2012) 996–1004

[49] Wyndow, N., De Jong, A., Rial, K., Tucker, K., Collins, N., Vicenzino, B., Rusell,T., Crossley, K.: The relationship of foot and ankle mobility to the frontal planeprojection angle in asymptomatic adults. Journal of Foot and Ankle Research 9(3)(2016)

[50] DiCesare, C.A., Bates, N.A., Myer, G.D., Hewett, T.E.: The validity of 2-dimensional measurement of trunk angle during dynamic tasks. The InternationalJournal of Sports Physical Therapy 9(4) (August 2014) 420–427

[51] Quintic Consultancy Ltd.: Quintic video analysis software. Available online:http://www.quintic.com/software/. Accessed: 2016-03-20


[52] Simi Reality Motion Systems GmbH: Quintic Video Analysis Software.Available online: http://www.simi.com/en/products/movement-analysis/simi-motiontwin.html. Accessed: 2016-03-20

[53] GeeWare: DgeeMe. Available online: http://dgeeme.software.informer.com/. Ac-cessed: 2016-03-20

[54] NeoRehab: eHAB V3. Available online: http://www.neorehab.com/. Accessed:2016-03-20

[55] Ernst, F., Saß, P.: Respiratory motion tracking using Microsoft’s Kinect v2 cam-era. Current Directions of Biomedical Engineering 1 (2015) 192–195

[56] Gaber, A., Taher, M.F., Wahed, M.A.: Automated grading of facial paralysisusing the Kinect v2: A proof of concept study. In: International Conference ofVirtual Rehabilitation (ICVR2015), Valencia, Spain. (June 2015)

[57] Noonan, P.J., Howard, J., Hallet, W.A., Gunn, R.N.: Repurposing the MicrosoftKinect for Windows v2 for external head motion tracking for brain PET. Physicsin Medicine & Biology 60 (2015) 8753–5766

[58] Rocha, A.P., Choupina, H., Fernandes, J.M., Rosas, M.J., Vaz, R., Silva Cunha, J.P.: Kinect v2 based system for Parkinson's disease assessment. In: 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). (2015)

[59] Alabbasi, H., Gradinaru, A., Moldoveanu, F., Moldoveanu, A.: Human motiontracking & evaluation using Kinect V2 sensor. In: 5th IEEE International Confer-ence on E-Health and Bioengineering - EHB 2015. (November 2015)

[60] Yang, L.: 3D Sensing and Tracking of Human Gait. PhD thesis, School of ElectricalEngineering and Computer Science, Faculty of Engineering, University of Ottawa(2015)

[61] Dotabadi, E., Taati, B., Parra-Dominguez, G.S., Mihailidis, A.: A markerlessmotion tracking approach to understand changes in gait and balance: A casestudy. In: Proceedings of the Rehabilitation Engineering and Assistive TechnologySociety of North America Annual Conference. (June 2013)

[62] Ning, X., Guo, G.: Assessing spinal loading using the kinect depth sensor: Afeasibility study. IEEE Sensors Journal 13(4) (April 2013) 1139–1140

[63] Tolgyessy, M., Hubinsky, P.: The kinect sensor in robotics education. In: Proceed-ings of 2nd International Conference on Robotics in Education, Vienna, Austria.(September 2011)

[64] Fankhauser, P., Bloesch, M., Rodriguez, D., Kaestner, R., Hutter, M., Siegwart,R.: Kinect v2 for mobile robot navigation: Evaluation and modeling. In: Pro-ceedings of the IEEE International Conference on Advanced Robotics (ICAR).(2015)


[65] Kotan, M., Oz, C.: Virtual mirror with virtual human using kinect sensor. In: 2nd International Symposium on Innovative Technologies in Engineering and Science (ISITES2014). (June 2014)

[66] Giovanni, S., Choi, Y.C., Huang, J., Khoo, E.T., Yin, K.: Virtual try-on usingkinect and hd camera. In: 11th International Fall Workshop on Vision, Modeling,and Visualiza-tion 2006 (VMV 2006), IOS, Aachen, Germany. (2006)

[67] Kiskio, J.: KinectEDucation. Available online:http://www.kinecteducation.com/. Accessed: 2016-06-15

[68] Faude, O., Junge, A., Kindermann, W., Dvorak, J.: Injuries in female soccerplayers: A prospective study in the german national league. The American Journalof Sports Medicine 33(11) (2005) 1694–1700

[69] Herrington, L., Munro, A.: Drop jump landing knee valgus angle; normative datain a physically active population. Physical Therapy in Sport 11 (2010) 56–59

[70] Kamath, A.F., Israelite, C., Horneff, J., Lotke, P.A.: Editorial: What is varus or valgus knee alignment?: A call for a uniform radiographic classification. Clinical Orthopaedics and Related Research 468(6) (2010) 1702–1704

[71] Zeller, B.L., McCrory, J.L., Kibler, B., Uhl, T.L.: Differences in kinematics andelectromyographic activity between men and women during the single-legged squat.The American Journal of Sports Medicine 31(3) (2003) 449–456

[72] Stensrud, S., Myklebust, G., Kristianslund, E., Bahr, R., Krosshaug, T.: Corre-lation between two-dimensional video analysis and subjective assessment in eval-uating knee control among elite female team handball players. British Journal ofSports Medicine 45 (2011) 589–595

[73] Microsoft: Kinect for Windows SDK 2. Available online:https://developer.microsoft.com/en-us/windows/kinect/tools. Accessed: 2016-03-17

[74] Microsoft: Kinect for Windows developer bundle. Available online:https://www.microsoftstore.com/store/msusa/en-US/pdp/Kinect-for-Windows-Developer-Bundle/productID.314513600. Accessed: 2016-02-10

[75] Wham, R.M.: Three-Dimensional Kinematic Analysis Using the Xbox Kinect.PhD thesis, University of Tennessee, Knoxville (May 2012)

[76] Andersen, M.R., Jensen, T., Lisouski, P., Mortensen, A.K., Hansen, M.K., Gregersen, T., Ahrendt, P.: Kinect depth sensor evaluation for computer vision applications. Technical report, Department of Engineering, Aarhus University (2012)

[77] Li, L.: Time-of-flight camera - an introduction. Technical report, Texas Instruments (2014)


[78] Amon, C., Fuhrmann, F.: Evaluation of the spatial resolution accuracy of the face tracking system for Kinect for Windows V1 and V2. In: 6th Congress of the Alps-Adria Acoustics Association. (2014)

[79] Sarbolandi, H., Lefloch, D., Kolb, A.: Kinect range sensing: Structured-light versus time-of-flight Kinect. Computer Vision and Image Understanding 139 (October 2015) 1–20

[80] Shotton, J., Fitzgibbon, A., Cook, M., Sharp, T., Finocchio, M., Moore, R., Kip-man, A., Blake, A.: Machine Learning for Computer Vision. In: Real-Time Hu-man Pose Recognition in Parts from Single Depth Images. Volume 411 of Studiesin Computational Intelligence. Springer Berlin Heidelberg (2013) 119–135

[81] Bonnechere, B., Scholukha, V., Moiseev, F., Rooze, M., Van Sint, J.S.: FromKinect to anatomically-correct motion modelling: Preliminary results for humanapplication. In: Games for Health: Proceedings of the 3rd european conferenceon gaming and playful interaction in health care, Springer Fachmedien Wiesbaden(2013) 15–26

[82] Wiedmann, L.G., Planinc, R., Nemec, I., Kampel, M.: Performance evaluation of joint angles obtained by the Kinect v2. In: IET International Conference on Technologies for Active and Assisted Living (TechAAL). (2015)

[83] Obdrzalek, S., Kurillo, G., Ofli, F., Bajcsy, R., Seto, E., Jimison, H., Pavel, M.: Accuracy and robustness of Kinect pose estimation in the context of coaching of elderly population. In: Annual International Conference of the IEEE Engineering in Medicine and Biology Society, IEEE (August 2012) 1188–1193

[84] Giblin, S., Meldrum, D., McGroarty, M., O’Brien, S., Rand, S., Smith, S., Grogan,K., Wetterling, F.: Bone length calibration can significantly improve the measure-ment accuracy of knee flexion angle when using a marker-less system to capturethe motion of countermovement jump. In: IEEE-EMBS International Conferenceon Biomedical and Health Informatics (BHI). (February 2016) 382–397

[85] Optitrak: Motion Capture Markers. Available online:https://www.optitrack.com/products/motion-capture-markers/. Accessed:2016-04-02

[86] Giavarina, D.: Understanding Bland Altman analysis. Biochemia Medica 25(2)(2015) 141–151

[87] Kumar, S., Nilsen, W., Pavel, M., Srivastava, M.: Mobile health: Revolutionizinghealthcare through transdisciplinary research. Computer 1 (2013) 28–35

[88] Microsoft: Visual C#. Available online: https://msdn.microsoft.com/es-es/library/kx37x362.aspx. Accessed: 2016-02-10

[89] Microsoft: XAML. Available online: https://msdn.microsoft.com/es-es/library/ms752059(v=vs.100).aspx. Accessed: 2016-02-10

[90] SQLite: SQLite. Available online: http://www.sqlite.org/. Accessed: 2016-02-10


[91] Microsoft: Visual Studio 2015. Available online:https://www.visualstudio.com/es-es/downloads/download-visual-studio-vs.aspx.Accessed: 2016-02-10

[92] Microsoft: Kinect API Overview. Available online:https://msdn.microsoft.com/en-us/library/dn782033.aspx. Accessed: 2016-02-10

[93] Microsoft: Dynamic Data Display. Available online:https://dynamicdatadisplay.codeplex.com. Accessed: 2016-05-28

[94] Bland, J.M., Altman, D.G.: A note on the use of the intraclass correlation coefficient in the evaluation of agreement between two methods of measurement. Computers in Biology and Medicine 20(5) (1990) 337–340

[95] Cronbach, L.J.: Coefficient alpha and the internal structure of tests. Psychometrika 16(3) (1951) 297–334

[96] Bland, J.M., Altman, D.G.: Statistical methods for assessing agreement between two methods of clinical measurement. The Lancet 327(8476) (1986) 307–310

[97] Prieto, L., Lamarca, R., Casado, A.: La evaluación de la fiabilidad de las observaciones clínicas: el coeficiente de correlación intraclase. Medicina Clínica 110 (1998) 142–145

[98] Gliem, J.A., Gliem, R.R.: Calculating, interpreting, and reporting Cronbach's alpha reliability coefficient for Likert-type scales. In: Midwest Research-to-Practice Conference in Adult, Continuing, and Community Education. (2003)

[99] Lewis, J.R., Sauro, J.: The factor structure of the system usability scale. In:Human Centered Design, Springer (2009) 94–103

[100] Vagias, W.M.: Likert-type scale response anchors. Clemson International Institute for Tourism & Research Development (2006)
