DESIGN, DEVELOPMENT, AND IMPLEMENTATION OF THE DEBRISAT DEBRIS
CATEGORIZATION SYSTEM
By
JOE KLEESPIES
A THESIS PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
MASTER OF SCIENCE
UNIVERSITY OF FLORIDA
2018
© 2018 Joe Kleespies
To my family, friends, and colleagues
ACKNOWLEDGMENTS
First and foremost, I would like to thank my family for their continued support. They
have been with me through every step of my collegiate career, providing constant support and
criticism when needed. I thank my family for always encouraging me to reach higher and fulfill
my greatest potential in both academics and in life.
Additionally, I would like to especially thank my advisor, mentor, and co-chair Dr.
Norman Fitz-Coy for his guidance and wisdom throughout both my undergraduate and graduate
studies at the University of Florida. Dr. Fitz-Coy’s consistently high expectations have served as
a gold standard adopted by his students and researchers. I thank him for never accepting a
problem or excuse without a solution in hand, for his constant push to document everything, and
for his invaluable insight academically, professionally, and personally. To my other committee
members Dr. Herman Lam and Dr. Janise McNair, thank you for your guidance and support
throughout this process.
To my Space Systems Group colleagues, especially Bungo Shiotani, thank you for your
friendship, advice, and the late nights in the lab and on the town. To my friends, thank you for
sticking with me through it all and for always being just a call or text away.
Finally, I would like to acknowledge the Federal Aviation Administration (FAA) Office
of Commercial Space Transportation (CST) for their support of the design, development, and
implementation of the DebriSat Debris Categorization System.
TABLE OF CONTENTS
Page
ACKNOWLEDGMENTS ...............................................................................................................4
LIST OF TABLES ...........................................................................................................................7
LIST OF FIGURES .........................................................................................................................9
LIST OF ABBREVIATIONS ........................................................................................................13
ABSTRACT ...................................................................................................................................15
CHAPTER
1 INTRODUCTION ..................................................................................................................17
Background on Orbital Debris ................................................................................................17
Breakup Models and HVI Tests .............................................................................19
DebriSat Overview .................................................................................................22
Motivation for the Debris Categorization System ..................................................25
2 LITERATURE SURVEY .......................................................................................................27
Background on Databases .......................................................................................................27
Example Database Solutions ..................................................................................................30
Direct vs. Indirect BLOB Storage ..........................................................................................34
3 REQUIREMENTS AND SYSTEM OVERVIEW .................................................................38
DebriSat Post-Impact Phase ...................................................................................................38
Debris Categorization System Requirements .........................................................................54
Debris Categorization System Overview ................................................................................55
4 FRONT-END LAYER ...........................................................................................................58
Front-End Layer Overview .....................................................................................58
Control Pages ..........................................................................................................61
Data Pages ..............................................................................................................72
5 BACK-END LAYER ...........................................................................................................106
Back-End Layer Overview ...................................................................................................106
Database Structure ................................................................................................109
Performance Characterization ..............................................................................110
6 CONCLUSIONS AND FUTURE WORK ...........................................................................116
APPENDIX DCS TABLE STRUCTURES.................................................................................117
LIST OF REFERENCES .............................................................................................................138
BIOGRAPHICAL SKETCH .......................................................................................................141
LIST OF TABLES
Table Page
1-1 A Comparison of SOCIT4 and DebriSat Target and Test Parameters. .............................24
2-1 Pros and Cons of Relational and Non-Relational Databases. ............................................29
2-2 Columns Used in the Chabot PHOTOCD_BIB Table. ......................................................31
2-3 Pros and Cons of Direct and Indirect BLOB Storage. .......................................................35
3-1 List of Materials Used to Categorize Post-Impact Debris .................................................45
3-2 List of Colors Used to Categorize Post-Impact Debris ......................................................46
3-3 List of Shapes Used to Categorize Post-Impact Debris .....................................................46
3-4 Mass Balances Used During the Mass Measurement Process ...........................................47
3-5 Equations Used to Calculate Physical Characteristics .......................................................53
3-6 High-Level System Requirements for the DCS. ................................................................55
4-1 Source Structure for the Front-End Layer. ........................................................................60
4-2 Description of Status Indicator Icons on the DCS Debris Page.........................................69
4-3 Description of Status Indicator Icons on the DCS Foam Page. .........................................72
4-4 Panel ID Encoding Scheme. ..............................................................................................81
4-5 Options for Foam Assessment Fields. ...............................................................................82
4-6 Summary of Options for Debris Material, Shape, and Color Fields. .................................99
5-1 Technical Specifications of the SSG Server ....................................................................107
5-2 Description of DCS Database Tables ..............................................................................109
5-3 Query Execution Times for BLOB Upload. ....................................................................111
5-4 Query Execution Times for 3D Size Measurement Upload. ...........................................112
5-5 Query Execution Times for Full Database Selection with BLOB Fields. .......................113
5-6 Query Execution Times for Specific Row Selection with BLOB Fields.........................113
5-7 Query Execution Times for BLOB Selection on Various Rows. ....................................114
A-1 Column Structure of the “dcs_activity” Database Table. ................................................117
A-2 Column Structure of the “dcs_announcements” Database Table. ...................................117
A-3 Column Structure of the “dcs_debris” Database Table. ..................................................117
A-4 Column Structure of the “dcs_foam” Database Table. ....................................................134
A-5 Column Structure of the “dcs_users” Database Table. ....................................................137
LIST OF FIGURES
Figure Page
1-1 Cataloged Man-Made Objects Orbiting Earth. A) 1963. B) 2013. ....................................17
1-2 Monthly Number of Objects in Earth Orbit by Object Type. ............................................18
1-3 The SOCIT4 U.S. Transit Satellite Target. A) Target with Phenolic Skin Removed.
B) Target with Phenolic Skin Installed. .............................................................................21
1-4 AMR Distributions of the Cosmos 2251 Iridium 33 Debris Fragments. A)
Distribution of Cosmos 2251 Fragments. B) Distribution of Iridium 33 Fragments.........22
1-5 A View of DebriSat Test Article Assembled Before the LHVI Test. ...............................23
2-1 Schema Organization Used for the Chabot Project. ..........................................................30
2-2 A Screenshot of One of the Chabot Front-End User Interface Web Forms. .....................32
3-1 Views of the DebriSat HVI Test Chamber. A) Downrange View of Test Chamber
Before Impact. B) Uprange View of Test Chamber After Impact. C) Downrange
View of Test Chamber After Impact. D) Example Fragment Collected in Test
Chamber After Impact. ......................................................................................................38
3-2 High-Level Overview of the DebriSat Post-Impact Phase Workflow. ..............................39
3-3 Overview of Post-Impact Phase Detection Procedures. ....................................................40
3-4 Grid Used in Foam Panel Preparation. A) Foam Preparation Grid Usage on Foam
Panel. B) Full Foam Preparation Grid Definition Regions with Axes. .............................41
3-5 Stitched Foam Panel X-Ray Images. A) Stitched Binary X-Ray Image. B) Processed
Binary X-Ray Image Highlighting Embedded Debris Fragments. ....................................42
3-6 Embedded Fragment Location Process During Extraction Stage. A) Stitched Binary
X-Ray Image Projected onto Panel to Highlight Embedded Fragment Locations. B)
Use of Pins to Physically Mark Locations of Embedded Fragments on the Panel. ...........43
3-7 Example of Extracted Debris Fragments. A) Foam Chunks. B) Foam Panels. .................43
3-8 Overview of the DebriSat Assessment Process. ................................................................44
3-9 Screenshot of the DebriSat Mass Measurement GUI. .......................................................47
3-10 DebriSat Mass Measurement Procedure. ...........................................................................48
3-11 DebriSat 2D Imaging System. A) 2D Imaging System Physical Apparatus. B) 2D
Imaging System Control GUI. ...........................................................................................49
3-12 DebriSat 2D Size Measurement Process............................................................................49
3-13 DebriSat 3D Imaging System Physical Apparatus. ...........................................................50
3-14 3D Imaging System. A) Camera and Axis Designations. B) Azimuth Angles. ................51
3-15 3D Imaging System Space Carving. A) Visualization of Complete Space of
Azimuth-Elevation Pairs. B) Visualization of the Space Carving Technique. ..................51
3-16 DebriSat 3D Imaging System GUI. ...................................................................................52
3-17 DebriSat 3D Size Measurement Process............................................................................52
3-18 Top-Level Structure of the DCS. .......................................................................................55
3-19 Access Control Scheme Between the DCS and the Off-Campus DebriSat
Characterization and Processing Facility. ..........................................................................56
4-1 Primary Page Structure of the Front-End User Interface. ..................................................58
4-2 Screenshot of the DCS Login Page. ...................................................................................61
4-3 Screenshot of the DCS Home Page. ..................................................................................63
4-4 Screenshot of the DCS Activity Page. ...............................................................................64
4-5 Screenshot of the Result of a Filter on the DCS Activity Page. ........................................65
4-6 Screenshot of the DCS Debris Page...................................................................................67
4-7 Screenshot of the Results of a Filter on the DCS Debris Page. .........................................68
4-8 Screenshot of the DCS Foam Page. ...................................................................................70
4-9 Screenshot of the Results of a Filter on the DCS Foam Page. ...........................................71
4-10 High-Level Overview of Foam Processing with DCS Roles.............................................73
4-11 Detailed Foam Processing Procedure on the DCS. ............................................................74
4-12 Screenshot of the DCS Add Foam Page. ...........................................................................76
4-13 Updated Foam Form on the DCS Add Debris Page. A) Add Debris Page for the
“Panel” Foam Type. B) Add Debris Page for the “Chunk (M)” Foam Type. ...................77
4-14 Screenshots of the Updated Foam Form on the DCS Add Foam Page. A) Foam Form
with the “Chunk (L)” Foam Type. B) Foam form with the “Pillar” Foam Type. .............79
4-15 Illustrations of Soft-Catch Foam Panel Layups. A) Foam Panels in the HVI Test
Chamber. B) Division of Foam “Areas” Relative to DebriSat in the HVI Test
Chamber. ............................................................................................................................80
4-16 Screenshot of the “LOCATION” Section on the Add Foam Page Populated Data and
Showing Form Validation on the “Foam Panel ID” Field. ................................................80
4-17 Illustration of Basic Code Structure of the Add Foam Data Page. ....................................83
4-18 Screenshot of the View Foam Page for a New Foam Record............................................85
4-19 Illustration of Basic Code Structure of the View Foam Data Page. ..................................86
4-20 Screenshot of the Edit Foam Page. ....................................................................................89
4-21 Screenshot of the View Foam Page with All Fields Populated. ........................................90
4-22 Screenshot of the “View Revision” Field on the View Foam Page. ..................................91
4-23 Screenshot of the Verify Foam Page for a Medium Foam Chunk. ....................................92
4-24 Screenshot of the Updated “USER DATA” Section After Verification. ...........................93
4-25 High-Level Overview of Debris Characterization Process with DCS Roles. ....................93
4-26 Detailed Debris Characterization Procedure on the DCS. .................................................94
4-27 Screenshot of the DCS Add Debris Page. ..........................................................................95
4-28 Screenshot of the “IDENTIFICATION” and “LOCATION” Fields on the Add
Debris Page for an “Embedded” Debris Fragment Associated with a Foam “Panel”
Record. ...............................................................................................................................97
4-29 Screenshot of the “IDENTIFICATION” and “LOCATION” Sections on the Add
Debris Page for an “Embedded” Debris Fragment Associated with a Medium Foam
Chunk. ................................................................................................................................98
4-30 Example of Material Repetition Prevention on the DCS Add Debris Page. .....................99
4-31 Imaging Sections on the DCS Add Debris Page Configured for 3D Fragments. ............100
4-32 Screenshot of the Main Debris Form for Broken Fragments. ..........................................102
4-33 Screenshot of the View Debris Page with Identification, Location, and Assessment. ....103
4-34 Screenshot of a Fully Populated View Debris Page. .......................................................105
5-1 Illustration of the DCS Back-End Layer Structure. .........................................................106
5-2 Plot of Resulting Data from Data Analysis Application Test Query. ..............................115
LIST OF ABBREVIATIONS
2D Two-dimensional
3D Three-dimensional
AAS American Astronautical Society
ACID Atomicity, Consistency, Isolation, and Durability
ACSA Average Cross-Sectional Area
AEDC Arnold Engineering Development Complex
AMR Area to Mass Ratio
B-Tree Balanced Search Tree
BASE Basically Available, Soft-State, and Eventually Consistent
BLOB Binary Large Object
CFRP Carbon Fiber Reinforced Polymer
CGI Common Gateway Interface
CSS Cascading Style Sheets
DAS Debris Assessment Software
DCS Debris Categorization System
DOD Department of Defense
DWR Department of Water Resources
EMR Energy to Mass Ratio
GEO Geostationary Earth Orbit
GRC General Research Corporation
GUI Graphical User Interface
HTML Hypertext Markup Language
HVI Hypervelocity Impact
IAC International Astronautical Congress
IIS Internet Information Services
J2EE Java Platform Enterprise Edition
JSC Johnson Space Center
LC Characteristic Length
LDAP Lightweight Directory Access Protocol
LEO Low Earth Orbit
LHVI Laboratory Hypervelocity Impact
MLI Multi-Layer Insulation
MSSQL Microsoft SQL Server
NAS Network Attached Storage
NASA National Aeronautics and Space Administration
ODBC Open Database Connectivity
ORDEM Orbital Debris Engineering Models
PDB Protein Data Bank
PHP Hypertext Preprocessor
RAID Redundant Array of Independent Disks
RCS Radar Cross Section
RCSB Research Collaboratory for Structural Bioinformatics
SOCIT Satellite Orbital Debris Characterization Impact Test
SSG Space Systems Group
SSN Space Surveillance Network
SQL Structured Query Language
UF University of Florida
UFAD University of Florida Active Directory
VPN Virtual Private Network
Abstract of Thesis Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Master of Science
DESIGN, DEVELOPMENT, AND IMPLEMENTATION OF THE DEBRISAT DEBRIS
CATEGORIZATION SYSTEM
By
Joe Kleespies
May 2018
Chair: Herman Lam
Co-Chair: Norman Fitz-Coy
Major: Electrical and Computer Engineering
DebriSat is an ongoing experiment to update existing orbital debris breakup models used
by the National Aeronautics and Space Administration (NASA) and the Department of Defense
(DOD). It is based on a laboratory hypervelocity impact (LHVI) test conducted in 2014 to generate
new breakup data that reflect modern satellite manufacturing processes and materials. The
DebriSat project is currently in the post-impact phase where debris fragments generated from the
LHVI test are collected, characterized, and recorded. The initial estimate for the total number of
debris fragments with a minimum linear dimension of 2 mm produced by the LHVI test was
85,000; as of February 2018, over 140,000 debris fragments have been collected, far
surpassing the initial estimate, and the number continues to grow. After debris fragments are
collected, they are characterized and catalogued using up to 327 unique data and metadata fields
ranging from mass to full two-dimensional (2D) and three-dimensional (3D) point cloud
representations. These 327 fields multiplied by the 100,000+ debris fragments being collected
posed a classic Big Data management challenge for the project. In response, the Debris
Categorization System (DCS) was designed, developed, and implemented.
The DCS is a database-driven solution designed to complement and streamline the
workflow used in the DebriSat post-impact phase. The workflow involves the characterization of
the debris fragments as well as the archival of the debris fragments, characterization data, and
processes used.
CHAPTER 1
INTRODUCTION
Background on Orbital Debris
On October 4, 1957, Sputnik 1 became the first man-made object launched into space.
Six years later in July 1963, the United States Space Surveillance Network (SSN) had recorded
616 man-made objects in space. As of January 1, 2013, the SSN cataloged over 23,000 man-
made objects orbiting the Earth. [1] Figure 1-1 provides a visual comparison of the cataloged
objects in low Earth orbit (LEO) in 1963 versus the cataloged objects in 2013.
Figure 1-1. Cataloged Man-Made Objects Orbiting Earth. A) 1963. B) 2013. Courtesy of NASA.
Source: J. Liou and D. Shoots, “Fifty Years Ago,” Orbital Debris Quarterly News, vol. 17, no. 3,
pp. 2-3, Jul. 2013.
This exponential growth in cataloged objects is driven by an increasing number of
launches year over year, but mostly by on-orbit satellite fragmentations from collisions,
explosions, or intentional breakup events. The first orbital fragmentation event occurred in June
1961 and caused the number of cataloged objects to balloon by 400%. Since then, 10 of the
4,500+ space missions launched after 1957 have been responsible for at least 31% of all
cataloged objects. [2] Two notable examples of these significant fragmentation events are the
intentional breakup of the Fengyun-1C spacecraft in 2007 and the Iridium 33-Cosmos 2251
collision in 2009. On January 11, 2007, the Fengyun-1C spacecraft was intentionally destroyed
through a hypervelocity collision with a ballistic object launched from Earth. The resulting
breakup generated over 2,300 trackable objects thus creating one of the most severe debris
clouds in history. [3] On February 10, 2009, the first ever collision between two satellites in orbit
occurred when the Cosmos 2251 collided with and completely fragmented the Iridium 33. The
collision produced 1,366 trackable objects as of August 26, 2009 and continues to produce
hundreds of conjunctions with operational Iridium and Orbcomm satellites. [4]
Figure 1-2 shows the number of objects tracked by the SSN in Earth orbit over time. [5]
As shown, the population of tracked objects exploded between 2006 and 2010 due to large
breakup events including the Fengyun-1C breakup and the Iridium 33-Cosmos 2251 collision. These
events have led to a current population of around 19,000 objects. Also shown in Figure 1-2 is a
concerningly consistent growth of the orbital object population over time despite the natural
decay and deorbiting shown by the small dips in the plot (e.g., as shown between 1988 and 1990
in the figure).
Figure 1-2. Monthly Number of Objects in Earth Orbit by Object Type. Courtesy of NASA.
Source: P. Anz-Meador and D. Shoots, “Monthly Number of Objects in Earth Orbit by Object
Type,” Orbital Debris Quarterly News, vol. 22, no. 1, p. 10, Feb. 2018.
Years before the Fengyun-1C or Iridium 33-Cosmos 2251 breakup events, in 1978,
NASA’s Donald J. Kessler published his ideas on his now-famous Kessler Syndrome. [6] The
Kessler Syndrome postulates that when the density of objects in orbit becomes high enough,
collisions between these objects will begin to cascade such that debris generated from one
collision causes further collisions that ultimately result in a feedback runaway by which Earth’s
orbit becomes so polluted with debris that it is no longer safe to conduct space missions. The
Kessler Syndrome has started to manifest as debris generated from significant fragmentation
events such as the Fengyun-1C or the Iridium 33-Cosmos 2251 event begin to collide with one
another and create additional debris. [7] As a result, research into ways to mitigate the worsening
orbital debris environment has become prevalent in recent years. There is a need to provide
a reliable description of the orbital debris environment as the importance of risk assessment and
conjunction analysis increases.
Breakup Models and HVI Tests
One of the primary tools used to define and assess the orbital debris environment is the
breakup model. Breakup models are based on debris measurements and calculations such as the
area-to-mass ratio (AMR), average cross-sectional area (ACSA), characteristic length, mass,
shape, etc. In general, objects in Earth orbit larger than 10 cm are tracked via radar, telescopes,
and other ground-based techniques and cataloged by the SSN with information such as radar
cross section (RCS). Objects smaller than 10 cm are too small to track using current methods;
thus, these smaller objects are represented statistically. [8] [9] [10] These statistical
representations help estimate general sizes and number of debris in the environment; however,
these representations cannot be used to discern the AMR, ACSA, characteristic length, shape, or
material. [11] Measurements and calculations such as AMR, ACSA, and characteristic length are
used to predict radar measurements of these sub-10 cm debris, ultimately to be applied in the
same assessment and analysis methods used for cataloged debris larger than 10 cm. To
determine AMR, ACSA, characteristic length, shape, and material, ground-based hypervelocity
impact (HVI) testing is required.
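For reference, these quantities are commonly defined in the breakup-model literature roughly as follows (a sketch only; the exact equations used by the project are listed in Table 3-5, and the symbols here are illustrative):

\[
L_C = \frac{X + Y + Z}{3}, \qquad \mathrm{ACSA} \approx \frac{1}{N}\sum_{i=1}^{N} A_i, \qquad \mathrm{AMR} = \frac{\mathrm{ACSA}}{m},
\]

where \(X\), \(Y\), and \(Z\) are a fragment's three largest orthogonal dimensions, \(A_i\) are the projected cross-sectional areas measured from \(N\) viewing directions, and \(m\) is the fragment's mass.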
The first orbital debris breakup models were based on data from the 1964 Atlas tank
explosion tests and the 1970 Bess collision tests. [12] [13] In the 1990s, the four-test Satellite
Orbital Debris Characterization Impact Test (SOCIT) series and the Battelle tests in Europe were
conducted. [11] [14] NASA used the data from the SOCIT series to develop the NASA Standard
Breakup Model, which would go on to be used in many of NASA’s tools for orbital debris
definition, assessment, and analysis such as Orbital Debris Engineering Models (ORDEM) and
Debris Assessment Software (DAS). [15]
The SOCIT series of tests were conducted at Arnold Engineering Development Complex
(AEDC) between 1991 and 1992 and consisted of a pre-test that targeted a satellite mock-up to
calibrate the test range, followed by four main tests, SOCIT1 through SOCIT4, which targeted a
solar panel, a second satellite mock-up, a fourth-stage adapter, and a Transit satellite, respectively. The
SOCIT4 dataset was chosen for use in the 1993 NASA Standard Breakup Model because it was
derived from a flight-ready target. The SOCIT4 Transit satellite target, shown in Figure 1-3, was
suspended in the test chamber using cables and was surrounded by 10 soft-catch foam stacks to
catch debris fragments during the test (see the bottom and right side of Figure 1-3B). After the
test, a majority (27.81 kg of the total 34.5 kg target) of the debris consisted of 100 large
fragments recovered from the floor of the test chamber. Almost all of the soft-catch foam stacks
remained intact; fewer than 5% were destroyed. [11]
Figure 1-3. The SOCIT4 U.S. Transit Satellite Target. A) Target with Phenolic Skin Removed.
B) Target with Phenolic Skin Installed. Courtesy of NASA. Source: P. Krisko, M. Horstman, and
M. L. Fudge, "SOCIT4 Collisional-Breakup Test Data Analysis: With Shape and Materials
Characterization," Advances in Space Research, vol. 41, no. 7, pp. 1138-1146, Oct. 2007.
The intact stacks were first sent to the General Research Corporation (GRC) for debris
extraction analysis. The GRC extracted some of the intact stacks using high-pressure water
reduction. Later, water-reduced fragments and unreduced stacks were sent to Kaman Sciences
for further extraction and analysis. Kaman Sciences used manual extraction techniques to reduce
intact stacks and ultimately created a spreadsheet-based database containing the 100 major
fragments initially found in the test chamber, water-reduced fragments processed by the GRC,
and manually reduced fragments processed by Kaman Science resulting in a total of 4,762
recorded debris fragments. [11] The Kaman database included 25 data fields including shape and
material, which were determined through visual inspection. The shapes from the Kaman database
were used in ACSA calculations for the 1993 NASA Standard Breakup Model and the scaled
cumulative number vs. mass derived by Kaman was used to create a collision breakup size
distribution in the 1993 NASA Standard Breakup Model. [11]
After the SOCIT series, several low-velocity impact tests were conducted at Kyushu
University in Japan to create a low-velocity collision model based on the 1998 NASA Standard
Breakup Model. [16] The Kyushu impact tests were conducted at velocities between 100 m/s and
200 m/s and utilized high-speed cameras to capture fragmentation events after collisions. These
low-velocity impact tests were conducted mainly to complement the NASA Standard Breakup
Model with data consistent with collisions in geostationary Earth orbit (GEO).
DebriSat Overview
Based on optical, radar, and in-situ measurements of the 18,000+ SSN-cataloged objects
orbiting Earth and LHVI data from ground-based tests such as the SOCIT series and Kyushu
University tests, the 1993 NASA Standard Breakup Model has been able to model breakups of
satellites built using older processes and materials (i.e., primarily metals) well; however, there
are some discrepancies in the NASA Standard Breakup Model when modeling breakups of
satellites built with more modern processes and materials such as carbon fiber reinforced
polymer (CFRP), Kevlar, and multi-layer insulation (MLI). For example, Figure 1-4 shows the
distributions of debris fragments from the Iridium 33-Cosmos 2251 collision, comparing
observed SSN catalog data to the NASA Standard Breakup Model prediction. [17]
Figure 1-4. AMR Distributions of the Cosmos 2251 Iridium 33 Debris Fragments. A)
Distribution of Cosmos 2251 Fragments. B) Distribution of Iridium 33 Fragments. Courtesy of
NASA. Source: J. Liou and P. Anz-Meador, "An Analysis of Recent Major Breakups in the Low
Earth Orbit Region," National Aeronautics and Space Administration, Houston, TX, USA, Rep.
NASA/JSC-CN-19713, 2010.
Figure 1-4A shows a good agreement between observed SSN data and the NASA
Standard Breakup Model prediction for Cosmos 2251 fragments. Figure 1-4B, however, shows a
discrepancy of about a factor of three between observed SSN data and the NASA Standard
Breakup Model prediction for Iridium 33 fragments. On the one hand, Cosmos 2251 was an
older Russian communications satellite, fabricated using older processes and materials, that had
been defunct for a few years prior to the collision with Iridium 33. On the other hand, the Iridium
33 was a functional modern satellite, part of the Iridium communications constellation, before
the Iridium 33-Cosmos 2251 collision; it was fabricated using modern materials and processes
such as aluminum, CFRP, Kevlar, and MLI. This discrepancy is an example of the limitations of
the current NASA Standard Breakup Model, where biases are introduced for breakups involving
modern materials such as CFRP, Kevlar, and MLI, in which many high-AMR debris fragments are
produced. [17]
Figure 1-5. A View of DebriSat Test Article Assembled Before the LHVI Test. Courtesy of
author.
The DebriSat project is the most recent major LHVI test since the SOCIT series and
Kyushu tests and is a collaboration between NASA, DOD, FAA, The Aerospace Corporation,
Jacobs Engineering, and the University of Florida (UF) to produce new ground-based breakup
data for fragmentation events of satellites built with modern materials and manufacturing
processes. The goal is to use this newly generated data to update the NASA and DOD breakup
models and address discrepancies and biases similar to those shown in Figure 1-4. The DebriSat
test article, shown in Figure 1-5, was designed and assembled at UF as a representative 50 kg-
class low Earth orbit (LEO) satellite using emulated flight hardware and other modern materials
and processes, such as CFRP, Kevlar, and MLI. [18] In April 2014, the DebriSat LHVI test was
conducted at AEDC with a 570 g projectile traveling at a speed of 6.8 km/s to emulate an on-
orbit collision between the satellite and a large piece of debris. The energy of the collision was
13.2 MJ, completely fragmenting both the projectile and the target. Table 1-1 compares the
target and test parameters of the SOCIT4 test and the DebriSat test.
Table 1-1. A Comparison of SOCIT4 and DebriSat Target and Test Parameters. [19]
Parameter SOCIT4 (Transit) DebriSat
Target Mass (kg) 34.5 56
Projectile Shape Sphere Hollow Cylinder
Projectile Material Aluminum Aluminum
Projectile Diameter (cm) 4.7 8.6
Projectile Mass (g) 150 570
Impact Speed (km/s) 6.1 6.8
Energy to Mass Ratio (J/g) 81 235
Propulsion System No Yes
Attitude Control Magnetic Hysteresis Rods Reaction Wheels and Magnetorquers
External Heat Protection Aluminized Mylar Multi-Layer Insulation (MLI)
Composite Materials No Yes
Emulated Components Solar Cell Batteries Majority of Components
As shown in Table 1-1, DebriSat was made using composites and other modern
materials, was a larger target than the Transit satellite used in SOCIT4, used a larger projectile,
and ultimately achieved a much higher energy-to-mass ratio (EMR) than SOCIT4 did.
Furthermore, the DebriSat LHVI test used many more soft-catch foam panels than were used in
SOCIT4, to provide complete coverage of the test chamber where DebriSat was mounted. [19]
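As a consistency check on Table 1-1, the tabulated EMR values follow directly from the projectile kinetic energy divided by the target mass:

\[
E_{\mathrm{DebriSat}} = \tfrac{1}{2} m_p v^2 = \tfrac{1}{2}(0.570\ \mathrm{kg})(6800\ \mathrm{m/s})^2 \approx 13.2\ \mathrm{MJ}, \qquad \mathrm{EMR} = \frac{13.2 \times 10^{6}\ \mathrm{J}}{56{,}000\ \mathrm{g}} \approx 235\ \mathrm{J/g},
\]
\[
E_{\mathrm{SOCIT4}} = \tfrac{1}{2}(0.150\ \mathrm{kg})(6100\ \mathrm{m/s})^2 \approx 2.8\ \mathrm{MJ}, \qquad \mathrm{EMR} = \frac{2.8 \times 10^{6}\ \mathrm{J}}{34{,}500\ \mathrm{g}} \approx 81\ \mathrm{J/g}.
\]

The DebriSat figure also reproduces the 13.2 MJ impact energy quoted earlier in this section.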
The DebriSat project is currently in the post-impact phase of operation where debris
fragments down to 2 mm in size are collected, characterized, and recorded. The initial estimate
for the total number of target debris fragments was 85,000. As of February 2018, over 140,000
debris fragments have been collected, far surpassing the initial estimate. The number of debris
fragments continues to grow and already accounts for more than 28 times the number of debris
fragments recorded from the SOCIT4 test. Those who work on the DebriSat project collecting,
characterizing, and recording debris fragments follow a strict set of procedures developed in
conjunction with a database system to extract, assess, measure, record, and verify each debris
fragment.
Motivation for the Debris Categorization System
Each DebriSat debris fragment has 327 unique data and metadata fields ranging from
mass to full two-dimensional (2D) and three-dimensional (3D) point clouds. Furthermore, there
are an additional 65 data fields for each soft-catch foam panel from the LHVI test. The large
number of data fields and types multiplied by the 100,000+ debris fragments being collected and
500+ soft-catch foam panels being processed posed a classic Big Data management challenge for
the project. While previous LHVI tests such as the SOCIT series and Kyushu tests utilized
spreadsheet databases to store their data, the sheer scale of DebriSat’s dataset warranted a much
more robust solution. Early estimates for DebriSat’s final dataset size were on the order of 14-
20 TB. Furthermore, DebriSat’s data management solution needed to standardize the data entry
process to eliminate the errors and other inefficiencies that, at times, led to error rates of up to
50% in previous LHVI test series. Thus, the DebriSat Debris Categorization System (DCS) was designed,
developed, and implemented to provide a front-end user interface layer to guide users through
the DebriSat post-impact phase characterization procedures while also providing a robust back-
end service layer to securely store the tens of terabytes of data produced by the project with
permanence (10-20 years and longer) and serve that data to stakeholders quickly and efficiently.
This thesis presents an end-to-end description of the conception, design, development,
and implementation of the DebriSat DCS. The origins and justification for the DCS were presented
in Chapter 1. Next, a literature survey of similar systems is presented in Chapter 2. Then, in
Chapter 3, the requirements for the DCS are defined and the front-end/back-end dichotomy of the
system is outlined. In Chapter 4 and Chapter 5, detailed descriptions of the design, development,
and implementation of the front-end interface layer and back-end service layer are presented.
Lastly, in Chapter 6, conclusions and lessons learned from the project are summarized.
CHAPTER 2
LITERATURE SURVEY
Background on Databases
In general, there are two types of databases: relational databases and non-relational
databases. For the past 30 years, relational databases have dominated the market; however, with
the relatively recent explosion of Big Data and large, unstructured datasets, non-relational
databases have become more and more popular. [20] A relational database consists of a set of
tables that fit structured data into an array of pre-defined data categories represented as columns
in these tables. Each row in a relational database table corresponds to a unique structured data
point. The relational database model was first defined by Edgar Codd in 1970 [21] and has been
used in a wide array of industries and applications ranging from web hosting to protein
sequencing where transactions require high precision and ACID (atomicity, consistency,
isolation, and durability) compliance. In ACID compliance, atomicity means updates are
performed completely or not at all, consistency means no part of a transaction can break the rules
of the database, isolation means transactions are run independently of other concurrent
transactions, and durability means that completed transactions will persist. [22] Furthermore, due
to their structured nature, relational databases leverage the Structured Query Language (SQL) to enable
detailed querying and reporting on stored data. Ultimately, relational databases provide a large
feature set and excellent data integrity; however, there are some limitations. Enforcing ACID
compliance on every transaction makes relational databases slower than non-ACID compliant
systems. Additionally, non-regular data input into relational databases must be reformatted to fit
the pre-defined tabular structure, making the storage of unstructured data cumbersome. Finally,
relational databases scale very well vertically (i.e., running the database on more powerful and
expensive hardware to increase performance or storage); however, there is a point during scaling
where a database must be scaled horizontally (i.e., distributed across multiple machines) to scale
further. Due to the structured and self-contained nature of the relational database model,
relational databases do not perform well in distributed setups. [22]
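To make the ACID and SQL discussion concrete, the sketch below shows a single atomic transaction and a simple reporting query written in generic SQL; the table and column names (fragments, activity_log, etc.) are hypothetical and not taken from any particular system, and exact transaction syntax varies slightly between engines.

```sql
-- Hypothetical tables: 'fragments' (one row per item) and 'activity_log' (audit trail).
-- Atomicity: either both statements below take effect, or neither does.
BEGIN TRANSACTION;

UPDATE fragments
SET    mass_g = 0.412,
       status = 'MASS_RECORDED'
WHERE  fragment_id = 'DS-000123';

INSERT INTO activity_log (fragment_id, action, performed_at)
VALUES ('DS-000123', 'MASS_RECORDED', CURRENT_TIMESTAMP);

COMMIT;  -- durability: once committed, the changes persist

-- Structured data also enables detailed querying and reporting:
SELECT material,
       COUNT(*)    AS fragment_count,
       AVG(mass_g) AS avg_mass_g
FROM   fragments
GROUP  BY material
ORDER  BY fragment_count DESC;
```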
The introduction of smart devices and the Internet of Things paradigm have caused the
number of data types and total amount of available data to grow exponentially over the past
decade. The rapid development of low-cost, low-power sensors, the explosion in utilization of
real-time chat and social media applications such as Facebook and Twitter, and the new focus on
web analytics, real-time data analysis, and business insight have flooded the market with
unstructured data that relational databases struggle to process. As a result, non-relational or
“NoSQL” databases have grown out of the need to store and process this unstructured data. Non-
relational databases support simpler data models and scale very well horizontally, allowing the
implementation of large datacenters of machines to support the influx of petabytes of data
generated by billions of smart devices. Non-relational databases are schema-free (i.e., do not
have a pre-defined structure) and therefore support almost any type of data or document.
However, non-relational databases place much more responsibility on the application
programmer due to the simpler nature of the database and are not ACID-compliant. Rather, non-
relational databases are BASE (Basically Available, Soft-State, and Eventually Consistent)
compliant. That is, non-relational systems are available most of the time, but not always; the
state of a non-relational system may change without a transaction due to node updates; and a
non-relational system will be eventually consistent, but not immediately after a transaction. [23]
As a result of the less-stringent BASE restrictions on transactions, non-relational databases are
very quick; however, this speed comes at the expense of guaranteed data integrity. Table 2-1
outlines the individual pros and cons of relational and non-relational database systems.
Table 2-1. Pros and Cons of Relational and Non-Relational Databases. [23]
Relational – Pros: works with structured data; supports strict ACID transactional consistency; built-in data integrity; large ecosystem; relationships via constraints; limitless indexing; strong SQL; OLTP and OLAP.
Relational – Cons: does not scale out horizontally (concurrency and data size) – only vertically, unless sharding is used; data is normalized, meaning lots of joins affecting speed; difficulty in working with semi-structured data; schema-on-write; high cost.
Non-Relational – Pros: works with semi-structured data (e.g., JSON, XML); scales out horizontally; high concurrency, high-volume random reads and writes; massive data stores; schema-free, schema-on-read; supports documents with different data fields; high availability; low cost; simplicity of design: no “impedance mismatch”; finer control over availability; speed, due to not having to join tables.
Non-Relational – Cons: weaker or eventual consistency (BASE) instead of ACID; limited support for joins; data is denormalized; does not have built-in data integrity (i.e., must do in code); no relationship enforcement; limited indexing; weak SQL; limited transaction support; slow mass updates.
Ultimately, for applications with structured data and requirements for high-precision and
strong data integrity, relational databases are the most useful solution. For applications with a
large amount of unstructured data that must be highly available or processed in real-time and
where a lower level of data integrity is acceptable, non-relational databases are the faster,
simpler, and more efficient option. That is, the decision to use either a relational database or non-
relational database depends entirely on the requirements of the intended application.
Example Database Solutions
Over the past decade, there have been a few implementations of database solutions for
various projects with similarly large and diverse datasets to DebriSat’s dataset. For example, the
Chabot project started at the University of California Berkeley in 1995 aimed to implement a
database solution to streamline the process of browsing and requesting images from the
California Department of Water Resources (DWR) collection of 500,000+ images of State Water
Project facilities. [24] The Chabot project began with a list of five requirements to replace the
existing, manual image retrieval system with a better system that includes: (i) an advanced
relational database for images and data, (ii) large-scale storage for images, (iii) on-line browsing
and retrieval of images, (iv) a flexible, easy-to-use retrieval system, and (v) retrieval of images
by content.
Figure 2-1. Schema Organization Used for the Chabot Project. Courtesy of Virginia Ogle and
Michael Stonebraker. Source: V. Ogle and M. Stonebraker, "Chabot: Retrieval from a Relational
Database of Images," Computer, vol. 28, no. 9, pp. 40-48, 1995.
The new system needed to store and track image data and metadata such as the date the
photo was taken, the subject of the photo, and the location where the photo was taken; therefore,
the system needed to handle a variety of data types such as text, numerical data, time, and
location. Furthermore, the Chabot team calculated that the 500,000+ images and textual data
would require approximately 2.5 terabytes of storage. Ultimately, the Chabot team implemented
a new system based on the POSTGRES relational database engine. Figure 2-1 outlines the
schema organization used for the Chabot system. Textual information was stored in the
TECH_RPT_BIB table, videos and associated metadata were stored in the VIDEO_BIB table,
and photos and associated metadata were stored in the PHOTOCD_BIB table.
Furthermore, each table in the Chabot schema consisted of a set of data fields and
attributes represented as columns in each table. Table 2-2 describes the columns used to classify
photos and metadata stored in the PHOTOCD_BIB table.
Table 2-2. Columns Used in the Chabot PHOTOCD_BIB Table. [24]
Column Name Data Type Description
Abstract text abstract (for documents)
Title text title (of document)
Comments text comments
Disknum text Photo-CD number
Imgnum integer image number on the CD
Id text DWR ID number
doc_type text nature, art, legal, etc.
Copyright text copyright information
Indexer text person creating db entry
Organization text who commissioned photo
Category text DWR category – “SWP”, etc.
Subject text DWR subject – “The Delta”
Location text one of 9 California regions
Description text a description of the image
job_req_num text DWR job request ID
Photographer text photographer
Filmformat text “35 mm slide”
Perspective char16 aerial – ground – close-up
Color char C (color) B (black & white)
Orientation char H (horizontal) V (vertical)
Histogram text color histogram
entry_date abstime date of db entry
shoot_date abstime date photo was taken
Oid oid POSTGRES object ID
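As an illustration of how a schema like Table 2-2 translates into a table definition, a minimal sketch in PostgreSQL-style SQL is shown below. This is not the Chabot team's actual DDL; only a subset of the columns is shown, and modern type names (e.g., timestamp rather than the legacy abstime) are substituted.

```sql
-- Minimal sketch of a PHOTOCD_BIB-like table (subset of the columns in Table 2-2).
CREATE TABLE photocd_bib (
    obj_id       serial PRIMARY KEY,  -- stand-in for the POSTGRES object ID (oid)
    disknum      text,                -- Photo-CD number
    imgnum       integer,             -- image number on the CD
    id           text,                -- DWR ID number
    doc_type     text,                -- nature, art, legal, etc.
    category     text,                -- DWR category, e.g., 'SWP'
    subject      text,                -- DWR subject, e.g., 'The Delta'
    location     text,                -- one of 9 California regions
    description  text,                -- description of the image
    photographer text,
    perspective  char(16),            -- aerial / ground / close-up
    color        char(1),             -- 'C' (color) or 'B' (black & white)
    orientation  char(1),             -- 'H' (horizontal) or 'V' (vertical)
    entry_date   timestamp,           -- date of db entry
    shoot_date   timestamp            -- date photo was taken
);

-- Browsing and retrieval then reduce to ordinary SQL queries, for example:
SELECT disknum, imgnum, description
FROM   photocd_bib
WHERE  subject = 'The Delta' AND color = 'C';
```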
In addition to the back-end POSTGRES relational database engine, the Chabot system
also included a front-end user interface to browse, query, and request data from the DWR
dataset. The front-end user interface consisted of several web forms with data fields for each of
the columns listed in Table 2-2. Figure 2-2 shows a screenshot of one of the front-end user
interface web forms.
Figure 2-2. A Screenshot of One of the Chabot Front-End User Interface Web Forms. Courtesy
of Virginia Ogle and Michael Stonebraker. Source: V. Ogle and M. Stonebraker, "Chabot:
Retrieval from a Relational Database of Images," Computer, vol. 28, no. 9, pp. 40-48, 1995.
To store the terabytes of textual and rich data, the Chabot team implemented a two-tier
storage solution. The first tier is a high-speed tier consisting of expensive, high-speed drives.
This tier is used to store textual data and small thumbnails of images and videos. The second tier
is a low-speed tier consisting of cheap, low-speed tapes that take approximately 2 minutes to
retrieve raw images and videos. This two-tiered storage approach allows users to browse for
images and videos quickly and defers the retrieval time for full-size images and videos to after a
user makes an official request for the data.
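A rough sketch of how this browse-then-request pattern looks at the query level is shown below; the thumbnail and tape_locator columns, along with the literal values, are hypothetical additions for illustration and are not part of the published Chabot schema.

```sql
-- Tier 1 (fast disks): browsing touches only textual metadata and small inline thumbnails.
SELECT id, subject, shoot_date, thumbnail
FROM   photocd_bib
WHERE  location = 'North Coast'
ORDER  BY shoot_date DESC;

-- Tier 2 (slow tape): only after an official request is the locator for the
-- full-resolution image resolved, deferring the roughly 2-minute retrieval cost.
SELECT tape_locator
FROM   photocd_bib
WHERE  id = 'DWR-004217';
```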
Ultimately, the Chabot system was implemented successfully and was able to streamline
the request process for the DWR. The Chabot project started with a list of requirements, around
which a relational database solution was designed, developed, and implemented. Furthermore,
the Chabot schema was also used on several geographical and environmental datasets from other
research projects at UC Berkeley.
Another example database system is the Research Collaboratory for Structural
Bioinformatics (RCSB) Protein Data Bank (PDB). The RCSB PDB is a worldwide repository for
3D structure data of macromolecules. [25] The RCSB PDB requires a high level of technical
quality and reliability of its data as it serves as an authority on protein sequence data used in a
variety of industries and research areas. As a result, the RCSB PDB was developed using a
relational database model to ensure ACID compliance for every transaction. Furthermore, the
RCSB PDB was developed to work with both the MySQL and IBM DB2 database engines to
enable the system to be compatible with many varying application environments. Similar to the
Chabot database system, the RCSB PDB consists of a back-end relational database layer and a
front-end user interface. However, the RCSB PDB also includes an object-relational Java
Platform Enterprise Edition (J2EE) connector middle layer to enable interfacing with external
systems and software. For example, to increase the robustness and breadth of the RCSB PDB
dataset, the system includes several external references to objects in other database systems such
as Swiss-Prot, GenBank, and PubMed. [25] [26] Using several data loaders written in Java, the
RCSB PDB leverages its J2EE connector interface to update its dataset and ensure uniformity by
loading data from external databases using the stored external references. The RCSB PDB
leverages the built-in data integrity and broad feature set of the relational database model to
provide a robust, precise library of protein sequence structures, models, and metadata used
worldwide.
Direct vs. Indirect BLOB Storage
Relational databases typically store table data in individual physical files for each table
on the host operating system’s filesystem and use data structures such as the Balanced Search Tree
(B-Tree) to organize the data within these files. This makes storing and retrieving structured textual
data very fast and very efficient. [27] However, storing a binary large object (BLOB) in a
relational database is more complex and performance-intensive. BLOBs are used to store files
and documents such as images or videos as raw bytes. These BLOBs are stored in the same B-
Tree organized files as the rest of the textual data but are typically much longer. For example, a
column storing a textual username is only a few bytes long while a standard BLOB is 64
kilobytes long. Storing BLOBs dramatically increases the size of the physical files used to store
table data, which impacts the storage and retrieval performance for the table. Storing BLOBs
directly with other textual table data is called direct BLOB storage. Alternatively, BLOBs can be
stored outside of a database either as a file on some filesystem or as a BLOB in another database
and linked to using a simple textual field in the source database. This method of BLOB storage is
called indirect BLOB storage.
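The two approaches can be contrasted with a short, hypothetical schema sketch; the VARBINARY(MAX) type follows Microsoft SQL Server syntax (other engines use types such as BYTEA or LONGBLOB), and the table and path names are invented for illustration.

```sql
-- Direct BLOB storage: the raw bytes live inside the table row, so they share
-- the table's transactions, constraints, and backups.
CREATE TABLE images_direct (
    image_id     VARCHAR(32) PRIMARY KEY,
    captured_at  DATETIME,
    image_data   VARBINARY(MAX)   -- raw image bytes stored in the database
);

-- Indirect BLOB storage: the table stores only a textual link (e.g., a file path);
-- the bytes live on an external filesystem or in another store.
CREATE TABLE images_indirect (
    image_id     VARCHAR(32) PRIMARY KEY,
    captured_at  DATETIME,
    image_path   VARCHAR(260)     -- e.g., '\\nas01\images\IMG-000123.png' (hypothetical)
);
```

With the indirect layout, only the image_path string participates in database transactions; the file itself sits outside the database's control.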
Direct BLOB storage is attractive because it maintains absolute ACID compliance with
the rest of the table data. That is, transactions will apply to the textual data and BLOB data
together, all at once or not at all; the BLOB data will conform to the same rules and
will remain consistent with the rest of the textual data; and the BLOB data will persist with the
rest of the textual data. However, direct BLOB storage decreases the query performance of the
database and requires extra effort to mitigate these performance issues. Indirect BLOB storage is
attractive because it is much faster and more flexible than direct BLOB storage and allows documents
and files to be accessed directly in their physical storage location. However, indirect BLOB
storage cannot guarantee full ACID compliance for the BLOB data. If a file or document is
moved, renamed, or deleted from its physical location, the relational link in the source database
is broken and the BLOB data becomes decoupled from the rest of the table data. Table 2-3
encapsulates some of the pros and cons of direct and indirect BLOB storage for relational
databases.
Table 2-3. Pros and Cons of Direct and Indirect BLOB Storage.
Direct – Pros: absolute ACID compliance with the rest of the data; BLOB data cannot be orphaned from the dataset; backups automatically include the BLOB data; more powerful querying leveraging textual fields to extract specific BLOB data.
Direct – Cons: increases the size of the database; portability becomes complex; more complex code required to retrieve BLOB data; decreased query performance on the database table; no external access to BLOB data.
Indirect – Pros: faster query performance on the database’s textual data; faster BLOB access via filesystem; access to BLOB data outside of the context of the database; smaller database size; simple portability; simple code to retrieve BLOB data.
Indirect – Cons: full ACID compliance not guaranteed (i.e., BLOB data may not persist with the dataset); BLOB data not automatically included in backups; less powerful querying (i.e., additional steps required to utilize query results to relate to stored BLOB data).
The debate around direct versus indirect BLOB storage has been around as long as
BLOBs and relational databases have existed. In 2007, Microsoft published a study to address the
debate. [28] Microsoft researchers found that, in general, for BLOBs larger than 1 MB, indirect
storage resulted in faster query execution times and minimal disk loads. For BLOBs smaller than
256 kB, the Microsoft researchers found that direct storage resulted in faster query execution
times and minimal disk loads. For BLOB sizes between 256 kB and 1 MB, performance depended
on the specific application and database structure. In industry, the major technology companies
(e.g., Facebook, Twitter, Google, Microsoft, etc.) have all addressed this challenge in varying
ways. For example, Facebook operates on a custom implementation of MySQL cluster and
developed their own Haystack and f4 BLOB storage solutions. [29] [30] Haystack and f4 define
Facebook’s hybrid solution to BLOB storage that leverages both direct and indirect BLOB
storage. Haystack and f4 enable Facebook to implement direct BLOB storage on a wide array of
machines across several datacenters to serve “hot” data and indirect BLOB storage on cold
storage devices such as tape drives to store “warm” or “cold” data. For example, a recently
uploaded photo receiving many read requests in the few days after it is uploaded is considered
“hot” data and is stored on faster, high-performance machines that can serve the data quickly.
The same uploaded file with very few read requests weeks or months later is considered “warm”
or “cold” data and is stored on slower, lower-performance machines. This custom storage
implementation allows Facebook to prioritize data that is needed immediately while still
maintaining the integrity of “warm” and “cold” data without the need for expensive, high-
performance storage machines.
Other companies such as Twitter have implemented their own custom BLOB storage
solutions. In Twitter’s case, their Blobstore architecture is primarily an implementation of
indirect BLOB storage. [31] Twitter stores BLOBs such as images and videos uploaded in tweets
in primary filesystems on an array of storage servers. These BLOBs are then externally linked to
in databases storing the rest of the textual data for users’ tweets. In Microsoft SharePoint, every
file that is uploaded is stored as an embedded BLOB in a database table. [28] Ultimately,
many of these companies end up implementing a custom, in-house-developed BLOB storage
solution specifically tailored to their application. Furthermore, many of these companies such as
Facebook and Twitter do not necessarily require absolute ACID compliance or data perpetuity.
In the end, the decision to implement direct BLOB storage, indirect BLOB storage, or some
hybrid of the two depends entirely on the requirements of the intended application.
CHAPTER 3
REQUIREMENTS AND SYSTEM OVERVIEW
DebriSat Post-Impact Phase
Before the DebriSat laboratory HVI test was conducted in 2014 at AEDC, a section of the
interior walls of the test chamber was covered with an array of foam panels as shown in Figure
3-1A to act as a soft-catch arena to capture debris fragments generated by the test. Following the
LHVI test, many of the soft-catch panels were destroyed and the test chamber was littered with
debris as shown in Figure 3-1B, Figure 3-1C, and Figure 3-1D. After the HVI test, the project
entered the post-impact phase of operation where soft-catch foam panels and debris were
collected from the test chamber, packaged and annotated based on the location where they were
found in the chamber, and shipped to UF for characterization. [32]
Figure 3-1. Views of the DebriSat HVI Test Chamber. A) Downrange View of Test Chamber
Before Impact. B) Uprange View of Test Chamber After Impact. C) Downrange View of Test
Chamber After Impact. D) Example Fragment Collected in Test Chamber After Impact. Courtesy
of NASA.
At UF, the post-impact phase foam and debris fragment processing workflow consists of
three stages: detection, extraction, and characterization. Figure 3-2 provides a high-level
description of these three stages. [33] [34]
[Figure 3-2 contents: Detection (foam preparation, X-ray image acquisition, foam panel entry, image stitching, fragment detection, post X-ray processing); Extraction (foam verification, extraction fragment entry); Characterization (assessment of material, 2D/3D, color, and shape; measurement of mass and size; calculation of density and volume, ACSA, AMR, and characteristic length; fragment modification; fragment verification).]
Figure 3-2. High-Level Overview of the DebriSat Post-Impact Phase Workflow. Courtesy of
author.
In the detection stage, soft-catch foam panels are prepared for X-ray imaging by
collecting loose debris fragments on top of the panels, prepared foam panels are X-ray imaged,
and X-ray images are processed to detect embedded fragments in foam panels. Figure 3-3
overviews the detailed procedure for the detection stage, including foam preparation and X-ray imaging.
[Figure 3-3 contents: open a new box or retrieve the next panel; check the box contents (foam bundle or full panel, foam chunks, or bags of dust); close and label boxes of dust and move them to the dust/fragment room; remove the bundle or panel from the box; label the panel and take pictures; collect, bag, and label loose debris on the panel; sweep and bag remaining dust; X-ray the prepared panel and run object detection on the X-ray image; decompose and extract fragments depending on whether a chunk is smaller than 10 cm or 25 cm; process foam chunks; create and X-ray mixed panels for larger chunks; enter records in the database.]
Figure 3-3. Overview of Post-Impact Phase Detection Procedures. Courtesy of author.
There are three classes of foam for the detection stage: full panels (panels with at least
2/3 of their original size intact), broken panels (also referred to as foam chunks), and foam dust.
Currently, foam dust characterization is outside the scope of the UF DebriSat characterization
effort; therefore, foam dust is packaged and stored for later processing. Full panels are first
labeled with a Foam ID and pictures are taken of the top, bottom, and side faces of each panel.
Next, a grid is placed on each foam panel (see Figure 3-4A) to identify the regions of the foam
(see Figure 3-4B) where loose debris down to 2 mm in size is collected, bagged, and labeled.
Figure 3-4. Grid Used in Foam Panel Preparation. A) Foam Preparation Grid Usage on Foam
Panel. B) Full Foam Preparation Grid Definition Regions with Axes. Courtesy of Moises Rivero.
After full foam panels have been prepared, they are X-ray imaged to determine the
location of fragments embedded within the panel. Due to the size of the X-ray detector, a total of
12 individual X-ray images are taken and later stitched together to form a full X-ray image of
each panel. The first 6 X-ray images are taken on one half of a panel; the panel is then rotated
180° about an axis normal to its largest face, and the remaining 6 X-ray images are taken on the
other half. Once a full X-ray image is stitched together, it is converted into a binary image using
dynamic thresholding, and a custom object detection algorithm is executed on the stitched binary
image to highlight embedded fragments. Figure 3-5 shows an example of a stitched binary image
and the resulting processed binary image highlighting embedded debris fragments.
Figure 3-5. Stitched Foam Panel X-Ray Images. A) Stitched Binary X-Ray Image. B) Processed
Binary X-Ray Image Highlighting Embedded Debris Fragments. Courtesy of Bungo Shiotani.
For foam chunks (i.e., foam panels that have broken into small chunks), a slightly
different preparation and X-ray process is followed. Foam chunks are divided into three classes
based on their size (i.e., their largest dimension): less than 10 cm, between 10 cm and 25 cm, and
greater than 25 cm. For foam chunks less than 10 cm in size, foam information such as color,
density, pattern, etc. on the chunks is not recorded, the chunks are not X-ray imaged, and the
chunks are immediately decomposed to release debris fragments. All the debris fragments
collected from foam chunks less than 10 cm in size are considered loose fragments similar to
those collected on the surface of full foam panels during foam panel preparation. Foam
information on foam chunks greater than 10 cm in size is recorded before the chunks are
decomposed to release debris fragments. Furthermore, pictures are taken of foam chunks larger
than 10 cm in size. Foam chunks larger than 25 cm in size are collected to form a “faux” panel
that is later X-ray imaged to highlight the location of embedded fragments through the same X-
ray process used to X-ray full foam panels.
In the extraction stage of the DebriSat post-impact phase characterization process, debris
fragments embedded in full foam panels and foam chunks are extracted. For full foam panels, the
associated processed binary X-ray image (e.g., see Figure 3-6A) is retrieved and projected onto
the panel to highlight the identified locations of embedded debris fragments. Pins are placed in
the panel to physically mark the locations of the embedded debris fragments as shown in Figure
3-6B. Then the areas around these markers are decomposed to expose each debris fragment.
Figure 3-6. Embedded Fragment Location Process During Extraction Stage. A) Stitched Binary
X-Ray Image Projected onto Panel to Highlight Embedded Fragment Locations. B) Use of Pins
to Physically Mark Locations of Embedded Fragments on the Panel. Courtesy of Bungo
Shiotani.
For foam chunks less than 25 cm in size, the foam chunks are decomposed individually to
expose embedded debris fragments. For foam chunks larger than 25 cm in size, a faux panel is
created and X-ray imaged. The extraction process for foam chunks larger than 25 cm in size is
the same as the extraction process for full foam panels. Figure 3-7 shows examples of embedded
debris fragments being extracted from foam chunks and full foam panels.
Figure 3-7. Example of Extracted Debris Fragments. A) Foam Chunks. B) Foam Panels.
Courtesy of Moises Rivero.
After extraction, debris fragments are stored while they await characterization.
Eventually, the stored debris fragments are retrieved and enter the characterization stage of the
DebriSat post-impact phase workflow. The characterization stage involves three processes:
assessment, measurement, and calculation. During assessment, identification and location data
for each fragment is recorded and debris fragments are visually inspected to determine material,
shape, and color. Figure 3-8 outlines the assessment process.
Figure 3-8. Overview of the DebriSat Assessment Process. Courtesy of Bungo Shiotani.
The debris fragment entry established during the assessment process is used throughout
the rest of the DebriSat post-impact phase workflow in the measurement and calculation
characterization processes.
[Figure 3-8 contents, DebriSat Fragment Entry & Assessment Process: log in to the computer, VPN, and DCS; click “Add Debris” on the DCS; fragments smaller than 2 mm go to the “Too Small” bin; enter box number and source information and, if there is related foam, the Foam ID; select the fragment as “2D” or “3D” based on whether its height is at least 3 mm; if the fragment is metal, select “METAL” as the primary material, otherwise select the primary material, and select second and third materials if present; select shape and color; select “Foam Attached” and “Intact Part” where applicable; click “Add Debris,” print a barcode, and place it on the bag; place the fragment in the “To be Massed” bin, or in the “Broken” bin for the broken fragment process; at the end of a shift, click “Logout,” close the browser, and log off the VPN and computer.]
Prior to the DebriSat post-impact phase, a comprehensive list of materials used in
DebriSat’s construction and their associated densities was created to categorize post-impact
debris. This list of materials is outlined in Table 3-1.
Table 3-1. List of Materials Used to Categorize Post-Impact Debris
Material Density (g/cm³)
Aluminum 2.700
Carbon Fiber Reinforced Polymer 1.550
Copper 8.938
Epoxy 1.050
Glass 2.510
Kapton Tape 1.420
Kevlar 1.440
Multi-Layered Insulation 0.772
Printed Circuit Board 1.860
Plastic 1.250
Solar Cells 5.320
Silicone 1.080
Stainless Steel 7.900
Titanium 4.400
Additionally, a comprehensive list of colors used in DebriSat’s construction was also
created to categorize post-impact debris. This list of colors is outlined in Table 3-2. The colors in
Table 3-2 are RGB-exact. Included in this list are the anodization colors of each of DebriSat’s
main bays. Finally, a list of debris fragment shape categories was developed based on small-scale
LHVI testing performed by the HVI group at NASA Johnson Space Center (JSC). These shape
categories are intended to cover all possible shapes of debris recovered during the post-impact
phase. Additionally, each shape category is also defined analytically. For example, a debris
fragment is considered a needle if its length is at least 7 times its width. Table 3-3 outlines these
six shape categories and provides example images of each category used for quick categorization
during the assessment substage.
Table 3-2. List of Colors Used to Categorize Post-Impact Debris
Color Example (RGB-Exact)
Black
Purple
Clear (Glass)
Red
Green
Royal Blue
Gold
Silver
Light Blue
White
Magenta
Yellow
Orange
Table 3-3. List of Shapes Used to Categorize Post-Impact Debris
Shape Example
Straight Rod/Needle/Cylinder
Bent Rod/Needle/Cylinder
Flat Plate
Bent Plate
Nugget/Parallelepiped/Spheroid
Flexible
After debris fragments are assessed for material, color, and shape, they enter the
measurement substage of characterization. There are two measurements taken in the
measurement substage: mass and size. During mass measurement, debris fragments are measured
on one of four different mass balances depending on their size and the required precision. Table
3-4 describes the four mass balances used during the mass measurement process.
Table 3-4. Mass Balances Used During the Mass Measurement Process
Mass Balance Capacity (g) Precision (g)
Microbalance (BM-22) 5.0 0.000001
Milligram Balance (PGL-203) 200.00 0.001
Milligram Balance (CY-510) 510.0 0.001
Centigram Balance (CG-6102) 6100.0 0.01
To automate the operation of the mass balances during the mass measurement process,
the mass balances are controlled through a custom graphical user interface (GUI) that
automatically configures and queries the selected mass balance. [33] The use of a GUI
minimizes human contact with the balances to preserve the accuracy of the balances, helps to
reduce human errors and streamline the measurement process, and enables additional
measurements for temperature and humidity to be recorded. Figure 3-9 shows a screenshot of the
mass measurement GUI.
Figure 3-9. Screenshot of the DebriSat Mass Measurement GUI. Courtesy of Bungo Shiotani.
Figure 3-10 outlines the full mass measurement process including balance calibration,
operation, and debris fragment handling.
Figure 3-10. DebriSat Mass Measurement Procedure. Courtesy of Bungo Shiotani.
During the size characterization process, debris fragments are categorized either as 2D or
3D fragments. Fragments categorized as 2D are either needle-shaped fragments whose length is
at least 7 times the width or flat-plate-shaped fragments whose thickness is less than 25% of the
length; fragments not fitting these criteria are characterized as 3D fragments.
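As an illustration, the categorization rule just described can be expressed as a short function; this is only a sketch, and the function name, parameter names, and units are hypothetical rather than part of the DCS.

    <?php
    // Minimal sketch of the 2D/3D categorization rule described above. Dimensions
    // are assumed to be ordered so that $length >= $width >= $thickness (mm).
    function categorizeFragment(float $length, float $width, float $thickness): string
    {
        // Needle-shaped: length at least 7 times the width.
        if ($length >= 7 * $width) {
            return '2D';
        }
        // Flat-plate-shaped: thickness less than 25% of the length.
        if ($thickness < 0.25 * $length) {
            return '2D';
        }
        // Everything else is size characterized with the 3D imaging system.
        return '3D';
    }

    echo categorizeFragment(14.0, 1.5, 0.8); // prints "2D" (needle criterion)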
To determine the length characteristics of 2D debris fragments, a custom 2D imaging
system was developed; it consists of a single point-and-shoot camera positioned directly above a
lighting stage to capture both frontlit and backlit images of debris fragments. [33] [34] [35] [36]
A calibration ring whose dimensions are known is also placed on the stage to provide a
measurement reference (i.e., pixel-to-mm conversion). Finally, a precision 90-degree wedge
mirror is located at the edge of the lighting stage to enable measurement of the height of both
debris fragments and the calibration ring. Similar to the mass measurement system, the 2D
imaging system is controlled through a GUI. The 2D imaging system and its associated GUI are
pictured in Figure 3-11 and the process is shown in Figure 3-12.
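For illustration, the pixel-to-mm conversion provided by the calibration ring amounts to a single scale factor; the ring diameter and pixel counts below are example values only, not the actual system parameters.

    <?php
    // Illustrative sketch of the pixel-to-mm conversion enabled by the calibration
    // ring; all numeric values are examples.
    $ringDiameterMm = 25.0;   // known physical diameter of the calibration ring
    $ringDiameterPx = 500.0;  // diameter of the ring as measured in the image
    $mmPerPixel     = $ringDiameterMm / $ringDiameterPx;

    $fragmentLengthPx = 340.0; // fragment extent measured in pixels
    $fragmentLengthMm = $fragmentLengthPx * $mmPerPixel;
    printf("Fragment length: %.2f mm\n", $fragmentLengthMm); // 17.00 mm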
Figure 3-11. DebriSat 2D Imaging System. A) 2D Imaging System Physical Apparatus. B) 2D
Imaging System Control GUI. Courtesy of Bungo Shiotani.
Figure 3-12. DebriSat 2D Size Measurement Process. Courtesy of Bungo Shiotani.
[Figure 3-12 contents, DebriSat Size Measurement Process (2D): setup (log in to the computer and VPN, open MATLAB and the 2D Imager GUI, log in with Gatorlink credentials, click “Connect Camera,” turn on the camera, place the focus pattern on the backlight, and check that the image is focused); measurement (scan the Debris ID to be processed, carefully place the fragment and calibration ring on the imaging boat, click “Capture Images,” confirm the objects are in frame, click “Measure,” check the edge and height detection results, click “Upload” when valid, and return the fragment to its bag); end of work session (click “Disconnect Camera,” turn off the camera, close the 2D Imager GUI and MATLAB, and log off the VPN and computer).]
Fragments categorized as 3D (i.e., debris fragments that do not meet the 2D fragment
criteria) are size characterized using an in-house developed 3D imaging system. The 3D imaging
system consists of six cameras (labeled cameras A through F) equally spaced along a 90° arc and
a turntable stage used to image fragments. [33] [34] [36] Debris fragments are placed on the turntable
stage and imaged at 21 azimuthal positions from all six cameras. The “01” azimuth angle is
imaged twice, once at the beginning of the imaging process and once at the end. Figure 3-13
shows the 3D imaging system physical apparatus.
Figure 3-13. DebriSat 3D Imaging System Physical Apparatus. Courtesy of Bungo Shiotani.
Figure 3-14 shows a schematic of the 3D imaging system camera and axis designations as
well as the different azimuth angles.
Figure 3-14. 3D Imaging System. A) Camera and Axis Designations. B) Azimuth Angles.
Courtesy of Bungo Shiotani.
The 3D imaging system acquires images of the debris fragment on the stage at each
azimuth and elevation pair; to verify no movement of the fragment relative to the turntable stage
during the process, a total of 126 images of the fragment are acquired where the last six images
are compared to the first six. Once it has been verified that no motion of the fragment relative to
the turntable has occurred, a space-carving algorithm is used to produce a 3D representation of
the debris fragment. [36] The space-carving algorithm starts with a digital block and carves out
the silhouette of the debris fragment at each azimuth-elevation pair as shown in Figure 3-15.
Figure 3-15. 3D Imaging System Space Carving. A) Visualization of Complete Space of
Azimuth-Elevation Pairs. B) Visualization of the Space Carving Technique. Courtesy of Bungo
Shiotani.
(Steps illustrated in Figure 3-15B: 1) bound the object; 2) discretize; 3) project and carve; 4) carve and iterate through all cameras.)
The 3D imaging system is also controlled using a custom GUI to automate the stage
rotation, image capture, and analysis steps of the 3D size characterization process. The 3D
imaging system GUI is shown in Figure 3-16.
Figure 3-16. DebriSat 3D Imaging System GUI. Courtesy of Bungo Shiotani.
Figure 3-17 outlines the complete 3D size characterization procedure utilizing the 3D
imaging system and associated GUI.
Figure 3-17. DebriSat 3D Size Measurement Process. Courtesy of Bungo Shiotani.
[Figure 3-17 contents, DebriSat Size Measurement Process (3D): setup (log in to the computer and VPN, open MATLAB and the 3D Imager GUI, log in with Gatorlink credentials, calibrate the 3D imager if it is not already calibrated, click “Connect Cameras,” turn on the lights and cameras, place the focus pattern on the turntable, and check that the images are focused); measurement (scan the Debris ID to be processed, carefully place the fragment on the center of the turntable, click “Capture Images,” confirm the objects are in frame, check the images, click “Measure,” run the space-carving check or batch process, record in the log, click “Upload,” and return the fragment to its bag); end of work session (click “Disconnect Cameras,” turn off the cameras, close the 3D Imager GUI and MATLAB, and log off the VPN and computer).]
Upon completion of the space carving process, several physical size characteristics are
determined using the measurement data. These physical characteristics are characteristic length
(LC), average cross-sectional area (ACSA), area-to-mass ratio (AMR), bulk density, and volume.
Table 3-5 lists the equations used to compute these physical characteristics for both 2D and 3D
fragments. Data and distributions from these physical characteristics are some of the primary
factors that will be used to update the NASA and DOD breakup models.
Table 3-5. Equations Used to Calculate Physical Characteristics
Characteristic Length: 2D and 3D: $L_C = (X_{DIM} + Y_{DIM} + Z_{DIM})/3$
Average Cross-Sectional Area: 2D: $ACSA = \frac{\text{pixel area}}{2} + \frac{\text{perimeter} \times Z_{DIM}}{4}$; 3D: average of projected areas [37]
Area-to-Mass Ratio: 2D and 3D: $AMR = ACSA/\text{mass}$
Bulk Density: 2D and 3D: $\rho = \text{mass}/\text{volume}$
Volume: 2D: $V = \text{pixel area} \times Z_{DIM}$; 3D: volume of the 3D point cloud [36]
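To show how the Table 3-5 relations fit together, the sketch below evaluates them for a 2D fragment; the variable names, units, and numeric values are illustrative and are not the DCS field names.

    <?php
    // Sketch of the Table 3-5 calculations for a 2D fragment (illustrative values).
    $xDim = 12.0;  $yDim = 8.0;  $zDim = 1.0;   // measured dimensions
    $pixelArea = 80.0;                          // projected area from the 2D imager
    $perimeter = 38.0;                          // projected perimeter
    $mass = 0.25;                               // measured mass

    $characteristicLength = ($xDim + $yDim + $zDim) / 3.0;    // LC
    $acsa   = $pixelArea / 2.0 + ($perimeter * $zDim) / 4.0;  // average cross-sectional area
    $amr    = $acsa / $mass;                                  // area-to-mass ratio
    $volume = $pixelArea * $zDim;                             // 2D volume estimate
    $bulkDensity = $mass / $volume;                           // bulk density

    printf("LC = %.2f, ACSA = %.2f, AMR = %.2f, volume = %.2f, density = %.4f\n",
           $characteristicLength, $acsa, $amr, $volume, $bulkDensity);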
Finally, after debris fragments have completed the entire characterization process, they
are subject to a final verification process. During verification, debris fragments and their
associated recorded data are reviewed for accuracy by an independent technician (i.e., a
technician that has not worked with the fragment under review). This verifier ensures the
recorded assessment data (i.e., material, shape, and color) matches their visual inspection of the
debris fragment. The verifier also ensures the numerical records for mass, temperature, humidity,
dimensions, and calculations have the correct number of significant digits and that these numbers
are realistic for the size of the fragment under review. Once a debris fragment is verified, its
records are locked from further modification and the debris fragment is physically placed in
long-term storage.
In addition to verification, repeatability and reproducibility tests are conducted after
every 1,000 verified fragments to ensure the characterization process and the equipment used during
the characterization process are functioning properly. The repeatability and reproducibility tests
consist of subjecting a randomly selected debris fragment from the previous 1,000 verified
debris fragments to the full characterization process a second time. The test is passed or failed
based on how closely the repeated measurements reproduce the originally recorded values.
Debris Categorization System Requirements
The activities performed in the post-impact phase of the DebriSat project involve the (i)
detection, (ii) extraction, and (iii) characterization of fragments resulting from the DebriSat
LHVI test. The objectives of the DebriSat project are to collect, characterize, catalog, and store
100% of fragments with the largest dimension greater than or equal to 2 mm. Given these
activities and objectives, the DebriSat post-impact phase workflow and its sub-processes were
designed, developed, implemented, and tested. To facilitate the recording and flow of data
through the post-impact phase workflow, the Debris Categorization System (DCS) was designed,
developed, implemented, and tested. The design of the DCS began with a set of requirements
derived from the DebriSat post-impact phase goals and objectives. Table 3-6 lists the high-level
system requirements for the DCS.
The procedures in the post-impact phase workflow were designed and developed in
parallel with the DCS. The DCS was designed to complement the post-impact phase workflow
and provide a framework to guide the technicians through each assessment and characterization
process. Furthermore, the DCS was designed to track and store a wide range of additional
metadata on each recorded debris fragment and foam panel to create a detailed history and log of
each fragment and foam panel as they progress through the post-impact phase workflow. Perhaps
most importantly, the DCS was designed to ensure a high level of data integrity and permanence
of the DebriSat dataset. Since the DebriSat dataset will be utilized many years into the future, the
dataset must have good integrity and perpetuity.
Table 3-6. High-Level System Requirements for the DCS.
ID Requirement Description
1 Facilitate entry and recording of identification, assessment, and characterization data for
soft-catch foam panels used in the DebriSat HVI test.
2 Facilitate entry and recording of identification, assessment, and characterization data for
debris fragments produced by the DebriSat HVI test.
3 Facilitate verification and validation of all data.
4 Facilitate regular backups of all recorded data.
5 Secure access to all recorded data and allow only authorized users to add, modify, and
verify recorded data.
6 Record all actions executed on recorded data, when these actions were executed, and
who executed them.
7 Facilitate the centralized storage and permanence of recorded data and the transfer of
this data between stakeholder organizations.
Debris Categorization System Overview
Given the system requirements listed in Table 3-6, a top-level structural overview of the
DCS was generated. The DCS consists of two main layers: a front-end user interface layer and a
back-end service layer. Figure 3-18 illustrates this top-level structure.
[Figure 3-18 contents: a local server on the University of Florida campus hosts the front-end user interface (PHP scripts to query data; HTML and JavaScript to print forms and tables) and the back-end services (a MySQL database foundation to execute queries and a daily task-scheduled Windows backup service); network attached storage on the University of Florida campus provides remote storage shares to store the periodic backups performed by the task-scheduled backup service.]
Figure 3-18. Top-Level Structure of the DCS. Courtesy of author.
The front-end layer and back-end layer of the DCS are both hosted on the Space Systems
Group (SSG) server on the UF campus. The front-end layer of the DCS consists of a web-based
user interface built with hypertext markup language (HTML), hypertext preprocessor (PHP), and
JavaScript. This user interface provides data entry forms for each stage of the post-impact phase
workflow and several reporting and querying functionalities for basic data analysis. The back-
end layer of the DCS consists of a database engine used to store data and process queries. The
back-end layer also includes a data backup service, which clones the DebriSat dataset daily and
keeps a 14-day backup of these clones. Additionally, a network attached storage (NAS) device
located on the UF campus in a different building than the SSG server is used to store the most
recent 14 days of backed up data in an off-site location.
The physical DebriSat debris characterization process is conducted at an external facility
off the main UF campus. Because the DCS is hosted locally on the main UF campus on the UF
network behind the university firewall, a virtual private network (VPN) is used to connect
computers located at the external DebriSat processing facility to the UF network. Authorized
users access this VPN using their personal UF credentials. Figure 3-19 illustrates this access
scheme.
[Figure 3-19 contents: the DS-PROCESSING-A and DS-PROCESSING-B computers at the off-campus DebriSat processing site connect through the UF firewall over a virtual private network to the DCS server on the University of Florida campus, which hosts the back-end database and the front-end web service.]
Figure 3-19. Access Control Scheme Between the DCS and the Off-Campus DebriSat
Characterization and Processing Facility. Courtesy of author.
The DCS was implemented in two phases, a rapid development phase and an operational
phase. At the beginning of the DebriSat post-impact activities, the primary goal was to get the
physical debris characterization procedures and the DCS implemented and online as quickly as
possible, so debris fragment processing could begin. During this rapid-development phase,
several convenient design decisions were made in the name of expediency rather than long term
scalability. For example, the MySQL database engine was the only database engine implemented
for the DCS, and image data was stored using the indirect BLOB storage method. MySQL is
well suited to development environments with rapidly changing structures and fields; however, there
are some concerns about its long-term scalability, and indirect BLOB storage sacrifices
data integrity for speed. Furthermore, during the rapid-development phase, data field and format
requirements from the DebriSat stakeholders changed frequently as processes and procedures
were developed. The DCS was developed in parallel with these physical procedures, which
allowed these changes in data fields and formats to be implemented quickly and efficiently.
After the workflow of DebriSat post-impact activities was solidified (i.e., a complete set
of requirements, data fields, and data formats was established), the DCS entered into the
operational phase. The primary goal of the operational phase is to ensure the longevity,
perpetuity, and integrity of the DCS and the DebriSat dataset. In the operational phase, the
expedient design decisions made earlier in the rapid-development phase were addressed. For
example, support for additional database engines was added to the DCS and the direct BLOB
storage method for image data was adopted and implemented.
CHAPTER 4
FRONT-END LAYER
Front-End Layer Overview
The DCS front-end layer is a web-based user interface written in HTML, PHP, and
JavaScript. The front-end user interface consists of 14 main user-facing pages: 6 control pages
and 8 data pages. Figure 4-1 outlines the structure of these 14 main pages.
[Figure 4-1 contents: control pages login.php, home.php, activity.php, debris.php, foam.php, and logout.php; data pages add_debris.php, view_debris.php, edit_debris.php, verify_debris.php, add_foam.php, view_foam.php, edit_foam.php, and verify_foam.php.]
Figure 4-1. Primary Page Structure of the Front-End User Interface. Courtesy of author.
Users of the DCS first gain access to the system through the login page where they use
their UF Active Directory (UFAD) Gatorlink credentials to login. Once logged in, users are
greeted with the home page, which contains summary information on the system such as recent
activity and system announcements. From the home page, users can browse to the third-tier
control pages (i.e., the Debris page, Foam page, and Activity page). On the Activity page, users
can view and filter a full list of all the actions executed on the DCS (i.e., additions,
modifications, verifications, and deletions of debris and foam). On the Debris page, users can
view and filter a full list of all the records for debris fragments. From the Debris page, users can
also access the Add Debris and View Debris data pages. The Add Debris page allows users to
create a new debris fragment record. The View Debris page allows users to view all the data
associated with an individual debris record. From the View Debris page, users browse to the Edit
Debris and Verify Debris pages. On the Edit Debris page, users modify the data associated with
an individual debris record. On the Verify Debris page, users review and verify (i.e., lock
records) for individual debris records or submit comments on why a particular debris record
cannot be verified. From the Foam page, users view and filter a full list of all the records for soft-
catch foam panels and chunks. Similar to the debris data pages, users browse to the Add Foam,
View Foam, Edit Foam, and Verify Foam pages from the Foam page to perform the same
functions as the debris data pages for foam.
In addition to the 14 main pages of the DCS front-end user interface, there are several
auxiliary files, folders, and pages that are referenced in the 14 main pages. Table 4-1 outlines the
full source structure of the DCS front-end layer. The “barcode” folder contains modified third-
party source files used to generate barcodes used to label debris fragments throughout the
characterization process. The “images” folder contains interface images such as headers, logos,
and icons. The “styles” folder contains the Cascading Style Sheets (CSS) definition for the front-
end layer pages, which defines the spacing and sizing of various page elements such as message
boxes, page divisions, and form fields. The “debris_images” and “foam_images” folders are used
to store debris and foam images uploaded through the user interface and external measurement
systems. In the main front-end layer source folder, the “globals.php” file is used to store the
global variables used by other pages such as database engine login information and version
information. The “debris_form.php” and “foam_form.php” files contain the definitions of the
main forms used on the debris and foam data pages. The debris and foam data pages reference
these files to display the structure of the main form, then additional code unique to each data
page is used to modify and fill the referenced form. In this way, if an aspect of the main form
needs to be changed (e.g., a new data field needs to be added), the form structure is changed in
one place and propagated across all pages. The “view_debris_blob.php” and
“view_foam_blob.php” files are used to retrieve BLOBs stored directly in the database and
display them to the user directly in the browser.
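As a sketch of how a directly stored image can be returned to the browser in the spirit of “view_debris_blob.php,” the snippet below fetches one BLOB column and streams it with the appropriate content type; the connection parameters and the column names are assumptions, not the actual DCS schema.

    <?php
    // Sketch only: retrieve a directly-stored image BLOB and display it in the
    // browser. Connection parameters and column names are hypothetical.
    $db = new mysqli('localhost', 'dcs_user', 'password', 'dcs');

    $debrisId = $_GET['id'] ?? '';
    $stmt = $db->prepare('SELECT frontlit_image FROM dcs_debris WHERE debrisId = ?');
    $stmt->bind_param('s', $debrisId);
    $stmt->execute();
    $stmt->store_result();
    $stmt->bind_result($imageBlob);

    if ($stmt->fetch() && $imageBlob !== null) {
        header('Content-Type: image/jpeg');  // tell the browser an image follows
        echo $imageBlob;                     // stream the BLOB bytes directly
    } else {
        http_response_code(404);
        echo 'Image not found.';
    }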
Table 4-1. Source Structure for the Front-End Layer.
File/Folder Description
barcode Folder used to store source files for barcode generation.
barcode.php Source file used for barcode generation.
debris_images Folder used to store uploaded debris-related images.
foam_images Folder used to store uploaded foam-related images.
images Folder used to store interface images such as headers and icons.
styles Folder used to store Cascading Style Sheets (CSS) configurations.
style.css Primary CSS configuration for the front-end interface.
activity.php Source file for the Activity control page.
add_debris.php Source file for the Add Debris data page.
add_foam.php Source file for the Add Foam data page.
debris.php Source file for the Debris control page.
debris_form.php Source file for the main form used on all debris data pages.
edit_debris.php Source file for the Edit Debris data page.
edit_foam.php Source file for the Edit Foam data page.
foam.php Source file for the Foam control page.
foam_form.php Source file for the main form used on all foam data pages.
footer.php Source file for the main footer used on all pages.
globals.php Source file used to store global variables such as database login data.
header.php Source file for the main header used on all pages.
home.php Source file for the Home page.
links.php Source file used for the main page links used on all pages.
login.php Source file used for the Login page.
logout.php Source file used to securely log users out of the DCS.
verify_debris.php Source file for the Verify Debris page.
verify_foam.php Source file for the Verify Foam page.
version.php Source file for the Version page to display the full version history.
view_debris.php Source file for the View Debris page.
view_debris_blob.php Source file used to retrieve and display directly-stored debris BLOBs.
view_foam.php Source file used for the View Foam page.
view_foam_blob.php Source file used to retrieve and display directly-stored foam BLOBs.
All pages on the DCS front-end layer are written using a combination of HTML,
JavaScript, and PHP. HTML is used to define the structure and content of each page. JavaScript
is used to dynamically hide, show, or change HTML elements based on user selections. PHP is
used to connect to the database engine on the DCS back-end layer, retrieve data, and insert
retrieved data into HTML elements to be displayed to the user. PHP is also used to process form
submissions that insert, update, or delete data stored in the database engine.
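A minimal sketch of this pattern is shown below: PHP queries the back-end database and the results are inserted into an HTML table for display. The connection parameters and column names are illustrative, not the actual DCS configuration.

    <?php
    // Sketch only: query the database and render the results as HTML.
    $db = new mysqli('localhost', 'dcs_user', 'password', 'dcs');
    $result = $db->query('SELECT debrisId, boxNumber, creator FROM dcs_debris LIMIT 10');
    ?>
    <table>
      <tr><th>Debris ID</th><th>Box</th><th>Creator</th></tr>
      <?php while ($row = $result->fetch_assoc()): ?>
        <tr>
          <td><?php echo htmlspecialchars($row['debrisId']); ?></td>
          <td><?php echo htmlspecialchars($row['boxNumber']); ?></td>
          <td><?php echo htmlspecialchars($row['creator']); ?></td>
        </tr>
      <?php endwhile; ?>
    </table>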
Control Pages
The first control page DCS users encounter is the Login page. Figure 4-2 shows a
screenshot of the Login page.
Figure 4-2. Screenshot of the DCS Login Page. Courtesy of author.
On the Login page, users are prompted to login to the DCS using their UF Gatorlink
username and password. The login page uses PHP’s built-in support for the Lightweight
Directory Access Protocol (LDAP) in combination with the open-source OpenLDAP framework
to connect to the UFAD system using the user’s credentials. To establish communication with
the UFAD system, which uses a self-signed certificate, the “TLS_REQCERT” parameter in OpenLDAP’s “ldap.conf”
configuration file is set to “never” to keep the LDAP client from checking the UFAD self-signed
certificate. The DCS does not store user credentials; rather, user credentials are passed through
directly to the UFAD system for authentication. If the user’s credentials are valid, the UFAD
system responds with some basic information about the user such as name and UFID number. If
the user’s credentials are invalid, the UFAD system responds with an error. Once the name and
UFID of the user is retrieved, this information is compared to the information stored in the
“dcs_users” table of the DCS database engine. If the “dcs_users” table contains a record with the
user’s matching name and UFID, the user is authorized to access the DCS. Furthermore, the
“dcs_users” table is queried to determine whether the user is an administrator or not. Once the
user is verified as an authentic user, the Login page creates a unique PHP session for the user and
stores the user’s information and default preferences such as Gatorlink username and
administrator status in cookie-based PHP session variables, which can be accessed globally
throughout the DCS. Finally, the Login page redirects the authenticated user to the Home page of
the DCS.
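The sketch below outlines this pass-through authentication flow using PHP’s LDAP functions; the directory address, bind format, and the column checked in “dcs_users” are placeholders rather than the actual UFAD or DCS configuration.

    <?php
    // Sketch of pass-through LDAP authentication followed by a dcs_users check.
    // Server address, bind DN format, and column names are placeholders.
    $username = $_POST['username'];
    $password = $_POST['password'];

    $ldap = ldap_connect('ldaps://ldap.example.edu');
    ldap_set_option($ldap, LDAP_OPT_PROTOCOL_VERSION, 3);

    // Credentials are passed straight through to the directory; nothing is stored.
    if (@ldap_bind($ldap, $username . '@example.edu', $password)) {
        // Authentication succeeded: check authorization against the dcs_users table.
        $db = new mysqli('localhost', 'dcs_user', 'password', 'dcs');
        $stmt = $db->prepare('SELECT isAdmin FROM dcs_users WHERE username = ?');
        $stmt->bind_param('s', $username);
        $stmt->execute();
        $stmt->bind_result($isAdmin);
        if ($stmt->fetch()) {
            session_start();                      // create the PHP session
            $_SESSION['username'] = $username;    // stored in session variables
            $_SESSION['isAdmin']  = (bool) $isAdmin;
            header('Location: home.php');         // redirect to the Home page
            exit;
        }
    }
    echo 'Login failed.';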
Once logged in, DCS users are greeted with the Home page. A screenshot of the DCS
Home page is shown in Figure 4-3. The Home page is the first page to present users with links to
the rest of the control pages (i.e., Activity, Debris, Foam, and Logout). The Home page also lists
the 5 most recent activity events executed on the DCS to provide a quick glance at the state of
the system. Clicking on the “View All Activity” button below the 5 most recent activity events
will redirect users to the main Activity control page. Additionally, the Home page lists system-
wide announcements stored in the “dcs_announcements” table of the DCS database engine that
users can post to communicate with everyone using the DCS. Administrators have the option to
show or hide announcements posted by users by using the trash can icon in the lower right corner
of announcements or the “Unhide Hidden Announcements” button, which modifies the
“isHidden” parameter of announcements stored in the “dcs_announcements” table.
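Hiding an announcement reduces to flipping its “isHidden” flag; a one-query sketch is shown below, with the identifier column name assumed.

    <?php
    // Sketch only: an administrator hides an announcement by setting isHidden.
    // The "id" column name is an assumption.
    $db = new mysqli('localhost', 'dcs_user', 'password', 'dcs');
    $announcementId = (int) ($_POST['announcementId'] ?? 0);
    $stmt = $db->prepare('UPDATE dcs_announcements SET isHidden = 1 WHERE id = ?');
    $stmt->bind_param('i', $announcementId);
    $stmt->execute();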
Figure 4-3. Screenshot of the DCS Home Page. Courtesy of author.
From the Home page, users can browse to either the Activity, Debris, or Foam control
pages. The Activity page provides a paginated list of every activity event executed on the DCS
filtered by index, the ID of the subject of the activity event (i.e., Debris ID or Foam ID), the type
of the subject of the activity event (i.e., Debris or Foam), the type of activity event (i.e., addition,
modification, verification, or deletion), the user who executed the activity event, and the
timestamp of when the activity event was executed. Figure 4-4 shows a screenshot of the activity
page.
Figure 4-4. Screenshot of the DCS Activity Page. Courtesy of author.
All activity events executed on the DCS are logged in the “dcs_activity” table of the DCS
database engine. The Activity page queries the data stored in the “dcs_activity” table to display
results similar to those shown in Figure 4-4. Because all the data stored in the “dcs_activity”
table is textual, it can be accessed and queried very quickly. Therefore, the Activity page queries
all rows and all columns of the “dcs_activity” table, calculates the total number of pages based
on the user’s selection for “Rows per Page,” and paginates the resulting dataset accordingly. As
shown in Figure 4-4, there were 203,724 activity events executed at the time of the screenshot.
Furthermore, the queried data can be filtered using the form fields located above the column
headers of the main table. For example, if a user wanted to filter the data for only the activity
associated with debris fragment DS134725, the user would enter “DS134725” in the “SUBJ. ID”
filter field and click the add filter icon on the right side of the filter fields. Figure 4-5 shows the
result of this example filter.
Figure 4-5. Screenshot of the Result of a Filter on the DCS Activity Page. Courtesy of author.
As shown in Figure 4-5, the debris fragment DS134725 had 5 activity events associated
with it. The activity data is filtered by adding “WHERE” clauses to the base query used to
retrieve all the activity data. For example, the base query to select all the activity data is
“SELECT * FROM `dcs_activity`” and the query to retrieve the filtered data shown in
Figure 4-5 is “SELECT * FROM `dcs_activity` WHERE `subjectId` LIKE
‘%DS134725%’.” These “WHERE” clauses are added to the main select query on the Activity
page when filter form fields are populated, and a filter is added. Filters can be removed by
clicking the remove filter icon directly underneath the refresh filter icon on the right side of the
filter form fields. Finally, each Subject ID is hyperlinked in the main table on the Activity page.
Clicking on a Subject ID hyperlink will redirect the user to the associated View Debris or View
Foam page, depending on the Subject Type.
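The sketch below illustrates how such filters and the “Rows per Page” pagination could be layered onto the base query quoted above; apart from the “dcs_activity” table and “subjectId” column, the parameter names are illustrative.

    <?php
    // Sketch: build the Activity query from optional filters plus pagination.
    $db = new mysqli('localhost', 'dcs_user', 'password', 'dcs');

    $query  = 'SELECT * FROM `dcs_activity`';
    $params = [];
    $types  = '';

    // Each populated filter field appends a WHERE clause to the base query.
    if (!empty($_GET['subjectId'])) {
        $query   .= ' WHERE `subjectId` LIKE ?';
        $params[] = '%' . $_GET['subjectId'] . '%';
        $types   .= 's';
    }

    // Paginate according to the user's "Rows per Page" selection.
    $rowsPerPage = max(1, (int) ($_GET['rowsPerPage'] ?? 25));
    $page        = max(1, (int) ($_GET['page'] ?? 1));
    $query      .= ' LIMIT ? OFFSET ?';
    $params[]    = $rowsPerPage;
    $params[]    = ($page - 1) * $rowsPerPage;
    $types      .= 'ii';

    $stmt = $db->prepare($query);
    $stmt->bind_param($types, ...$params);
    $stmt->execute();
    $result = $stmt->get_result();   // requires the mysqlnd driver
    while ($row = $result->fetch_assoc()) {
        // ... render one activity row ...
    }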
The Debris page provides a paginated list of the records for each debris fragment in the
characterization process filtered by Debris ID, the box number where the debris fragment was
found, the user who last modified the record, the timestamp of when the record was modified,
and by the status of each record data category (i.e., identification, location, assessment, mass,
and measurement). A screenshot of the Debris page is shown in Figure 4-6. The Debris page
queries data from the “dcs_debris” table of the DCS database engine. On the Debris page, only
the current revision of each debris fragment record is queried; therefore, there is one row for
each unique debris fragment in the main table on the Debris page. Unlike the “dcs_activity” table
used on the Activity page, the “dcs_debris” table contains both textual and BLOB data;
therefore, querying the “dcs_debris” dataset must be done carefully to ensure results are retrieved
and displayed quickly. However, data related to each field of the “dcs_debris” table must be
queried to accurately compute the characterization status of each record to populate the status
indicator fields on the right side of the main table on the Debris page. To achieve quick response
time, only the relevant textual columns of the “dcs_debris” table are selected for each debris
record. Specifically, BLOB fields are not selected; rather, the associated textual backup path
fields for each BLOB field are queried instead. This results in a fully-textual dataset; however, if
a BLOB data field is missing it will not show up in the status calculation on the Debris page.
This tradeoff was made because querying BLOB data with the rest of the textual data for over
100,000 debris records would cause the response time of the Debris page to be untenably high.
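A sketch of this tradeoff is shown below: the BLOB columns are omitted from the select list and the stage status is derived from textual fields, including the backup-path fields that stand in for the BLOBs. The column names are illustrative.

    <?php
    // Sketch: select only textual columns and derive status flags from them.
    $db = new mysqli('localhost', 'dcs_user', 'password', 'dcs');

    // BLOB columns are deliberately omitted from the column list.
    $sql = 'SELECT debrisId, boxNumber, massMeasurement, frontlitImagePath, backlitImagePath
            FROM dcs_debris';
    $result = $db->query($sql);

    while ($row = $result->fetch_assoc()) {
        // The imaging stage is treated as complete when the backup paths recorded for
        // the image BLOBs are populated, even though the BLOBs themselves are not read.
        $imagingComplete = !empty($row['frontlitImagePath']) && !empty($row['backlitImagePath']);
        $massComplete    = $row['massMeasurement'] !== null && $row['massMeasurement'] !== '';
        // ... render the status indicator icons for this debris record ...
    }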
Figure 4-6. Screenshot of the DCS Debris Page. Courtesy of author.
Once the fully-textual “dcs_debris” dataset is retrieved from the DCS database engine,
the total number of pages is calculated based on the user’s selection of “Rows per Page” and the
results are paginated accordingly in the same way the results on the Activity page are. As shown
in Figure 4-6, the total number of debris fragments recorded in the DCS at the time of the
screenshot was 144,177 debris fragments.
Figure 4-7. Screenshot of the Results of a Filter on the DCS Debris Page. Courtesy of author.
Similar to the Activity page, the Debris page has several filtering fields at the top of the
main table. In addition to textual filters for Debris ID, Box Number, Creator, and Timestamp, the
Debris page contains additional filter fields for the status indicators of each stage of the
characterization process. For example, the user could filter the data on the Debris page to only
show debris fragments that have completed up to the mass measurement stage of the
characterization process by checking the first three status indicator boxes for location,
assessment, and mass measurement. This filtering configuration and the resulting data is shown
in Figure 4-7. Green checks in the status fields indicate the debris fragment has completed that
stage of characterization while red X’s in the status fields indicate the debris fragment has not
completed that stage of characterization. Table 4-2 outlines the meanings of each of the status
indicator icons at the top of each of the status indicator columns in the main table on the Debris
page.
Table 4-2. Description of Status Indicator Icons on the DCS Debris Page.
Icon Description
Identification and location fields populated. Extraction stage completed.
Assessment fields populated. Assessment stage completed.
Mass measurement fields populated. Mass measurement completed.
Dimension fields populated. Dimension measurement completed.
Picture fields populated. Imaging completed.
Image processing fields populated. Image processing completed.
Verification status. Lock indicates record has been verified and locked.
Similar to the Activity page, all the Debris IDs are hyperlinked. Clicking on a Debris ID
hyperlink redirects the user to the associated View Debris page. In addition to the filtering form
fields, the Debris page contains a Debris ID field at the top of the page that is automatically
selected when a user browses to the page. This Debris ID field allows users to type in a Debris
ID or scan a Debris ID from a barcode and instantly be redirected to the View Debris page for
the entered Debris ID. Finally, from the Debris page, users can browse to the Add Debris data
page to insert new records for new debris fragments.
Finally, the Foam page provides a paginated list of the records for each soft-catch foam
panel filtered by Foam ID, the box the foam was found in, the last user to modify the foam
record, the timestamp of when the foam record was last modified, and by the status of each foam
record data category (i.e., identification, location, assessment, pictures, and X-ray). A screenshot
of the Foam page is shown in Figure 4-8.
Figure 4-8. Screenshot of the DCS Foam Page. Courtesy of author.
The Foam page queries data from the “dcs_foam” table of the DCS database engine.
Similar to the “dcs_debris” table, the “dcs_foam” table contains both textual and BLOB data. To
ensure quick data retrieval and response, the same method and tradeoff used on the Debris page
is used on the Foam page. All the textual data fields in the “dcs_foam” table are selected, and the
BLOB data fields are ignored. Instead, the textual backup path fields associated with each BLOB
field are used to determine the completion status of the characterization stages that utilize the
BLOB fields.
Figure 4-9. Screenshot of the Results of a Filter on the DCS Foam Page. Courtesy of author.
Similar to the Activity and Debris pages, the Foam page contains several filter fields at
the top of the main table. Also, as with the Debris page, the Foam page contains filter fields for
the various data categories associated with each foam record. For example, a user can filter the
foam results to show all the foam records that have been through all the characterization
procedures for foam but have not yet been verified by selecting all the status indicator filter
fields except the verification indicator. Figure 4-9 shows this filter configuration and the filtered
results. Table 4-3 outlines the meanings of each of the status indicator icons on the Foam page.
Table 4-3. Description of Status Indicator Icons on the DCS Foam Page.
Icon Description
Identification and location fields populated. Extraction stage completed.
Assessment fields populated. Assessment stage completed.
Picture fields populated. Imaging completed.
X-ray fields populated. X-ray completed.
Verification status. Lock indicates record has been verified and locked.
Similar to the Activity and Debris pages, the Foam page hyperlinks all the Foam IDs.
Clicking on a Foam ID hyperlink redirects the user to the associated View Foam page. Also, as
with the Debris page, the Foam page includes a Foam ID field at the top of the Foam page,
which allows users to quickly and directly browse to the View Foam page for a specified Foam
ID.
Data Pages
The Data pages of the DCS include the Add Debris, Add Foam, View Debris, View
Foam, Edit Debris, Edit Foam, Verify Debris, and Verify Foam pages. These DCS data pages are
structured and organized based on the DebriSat post-impact phase workflow and are designed to
facilitate the data entry and verification of the different tasks of the debris characterization
process. In general, the debris characterization process begins with the preparation and
processing of soft-catch foam panels. Figure 4-10 provides a high-level illustration of the foam
preparation process based on the DebriSat post-impact phase workflow emphasizing the role of
the DCS in the process.
[Figure 4-10 contents: begin foam processing; retrieve a foam panel or chunk > 10 cm; create a new foam entry in the DCS with location and assessment information; take pictures of the foam panel or chunk; edit the foam entry in the DCS with the new pictures; prepare the foam panel or chunk > 25 cm for X-ray; X-ray the foam panel or chunk > 25 cm; edit the foam entry in the DCS with the new X-ray imaging; verify the foam entry in the DCS.]
Figure 4-10. High-Level Overview of Foam Processing with DCS Roles. Courtesy of author.
In Figure 4-10, the octagonal blocks are the subprocesses performed on the DCS. That is,
the DCS is used to (i) create new foam records with identification, location, and assessment
information, (ii) update existing foam records with pictures of foam panels or chunks, (iii) update
existing foam records with X-ray imaging data, and (iv) verify foam records. The complete
detailed procedure for foam processing on the DCS is shown in Figure 4-11. This detailed
process includes descriptions of the octagonal subprocesses from Figure 4-10.
[Figure 4-11 contents: enter the box number; select the foam type; select the section and, for Section 5, the row and area; for foam that is not a chunk, check whether a Panel ID is available and enter it; select color, pattern, and foam density; upload pictures; upload raw X-ray images for panels and, where applicable, for “Chunk (L)” foam; verify the foam record.]
Figure 4-11. Detailed Foam Processing Procedure on the DCS. Courtesy of author.
After a foam panel, pillar, or foam chunk larger than 10 cm in size is retrieved, a new
foam record in the DCS is created using the Add Foam page. Figure 4-12 shows a screenshot of
the Add Foam page. All the DCS data pages are based on either the “foam_form.php” or
“debris_form.php” files. Each data page references either “foam_form.php” or
“debris_form.php” to define the default structure of the main form used on all foam or debris
data pages. The form used on all foam data pages defined in “foam_form.php” is broken down
into the following sections: “IDENTIFICATION,” “LOCATION,” “ASSESSMENT,” “RAW X-RAY
IMAGES,” “X-RAY PROCESSING SYSTEM,” “PROCESSED X-RAY IMAGES,”
“PROCESSED X-RAY DATA,” and “COMMENTS.” These sections are implemented using
HTML fieldsets, which containerize tables and data defined in each section. Data pages such as
the Add Foam page import the default foam form and use JavaScript to modify the form for the
specific page’s application after the foam form has been imported. Furthermore, the default foam
form includes several built-in JavaScript functions used to dynamically show, hide, or update
form elements based on user selections. These JavaScript functions are typically executed when
a form element is changed. For example, selecting an option in the Foam Type drop-down menu
will cause the “foam_type_check()” function to be executed and selecting an option in the
Section drop-down menu will cause the “section_check()” function to be executed. These
functions, in turn, modify HTML parameters such as “style” in various form elements to show,
hide, resize, or relocate the form elements. Furthermore, these functions can be called directly in
the page source to change form elements once imported.
When users create a new foam record for a retrieved foam panel or chunk, they are
required to populate all the fields in the “IDENTIFICATION,” “LOCATION,” and
“ASSESSMENT” sections of the Add Foam page. In the “IDENTIFICATION” section, the user
enters the box number from which the foam panel or chunk was retrieved. In the “LOCATION”
section, the user must first select the type of foam (i.e., panel, chunk between 10 cm and 25 cm,
chunk greater than 25 cm, or pillar). When users select the panel type, the Add Foam page
executes the “foam_type_check()” JavaScript function built into the main foam form. This
function checks the foam type the user selects and updates the form accordingly. When users
select the “Panel” foam type, the “Foam Panel ID Available” field is shown in the
“LOCATION” section, the “RAW X-RAY IMAGES” section is updated to remove the option to
select the number of raw X-ray images and all 12 raw X-ray image fields are shown because full
foam panels always produce 12 raw X-ray images, and the “PROCESSED X-RAY IMAGES”
section is updated to include a field for the stitched raw X-ray image, which is required when
there is more than one raw X-ray image.
Figure 4-12. Screenshot of the DCS Add Foam Page. Courtesy of author.
Figure 4-13A shows a screenshot of the updated foam form for the “Panel” foam type.
When users select the “Chunk (M)” foam type (i.e., for medium foam chunks between 10 cm and
25 cm in size), the “Foam Panel ID Available” field is hidden from the “LOCATION” section
because foam chunks do not show the Panel IDs and the “RAW X-RAY IMAGES,” “X-RAY
PROCESSING SYSTEM,” “PROCESSED X-RAY IMAGES,” and “PROCESSED X-RAY
DATA” sections are hidden because foam chunks between 10 cm and 25 cm in size are not X-
rayed. Figure 4-13B shows a screenshot of the updated foam form for the “Chunk (M)” foam
type.
Figure 4-13. Updated Foam Form on the DCS Add Foam Page. A) Add Foam Page for the
“Panel” Foam Type. B) Add Foam Page for the “Chunk (M)” Foam Type. Courtesy of author.
When users select the “Chunk (L)” foam type (i.e., for large foam chunks larger than 25
cm in size), the “Foam Panel ID Available” field is hidden from the “LOCATION” section
because foam chunks do not show the Panel IDs and the X-ray sections are left to their default
state. The field to select the number of raw X-ray images is shown in the “RAW X-RAY
IMAGES” section because large foam chunks greater than 25 cm in size are organized into
“mixed” panels and X-rayed together; therefore, the number of X-ray images for the associated
foam chunk can vary depending on the size of the chunk and the “mixed” panel. Figure 4-14A
shows a screenshot of the updated foam form for the “Chunk (L)” foam type. Finally, when users
select the “Pillar” foam type, the “Foam Panel ID Available” field is shown in the “LOCATION”
section because foam “pillars” have Panel IDs written on them and the X-ray sections are left to
their default state. Similar to the large foam chunks and “mixed” panels, the number of raw X-
ray images of a foam “pillar” can vary depending on the size of the “pillar;” therefore, the
“Number of Raw X-Ray Images” field is shown in the “RAW X-RAY IMAGES” section. Figure
4-14B shows a screenshot of the updated foam form for the “Pillar” foam type.
For the “Chunk (L)” (i.e. foam chunks greater than 25 cm in size) and “Pillar” foam
types, the user can select the number of raw X-ray images to be between 1 and 12 in the “RAW
X-RAY IMAGES” section. If the “Number of Raw X-Ray Images” field is 1, only one raw X-
ray image upload field will be displayed and the “Stitched Raw X-Ray Image” field will be
hidden in the “PROCESSED X-RAY IMAGES” section. If the “Number of Raw X-Ray Images”
field is greater than 1, the selected number of raw X-ray image fields will be shown in the “RAW
X-RAY IMAGES” section and the “Stitched Raw X-Ray Image” field will be shown in the
“PROCESSED X-RAY IMAGES” section.
Figure 4-14. Screenshots of the Updated Foam Form on the DCS Add Foam Page. A) Foam
Form with the “Chunk (L)” Foam Type. B) Foam form with the “Pillar” Foam Type. Courtesy of
author.
After users select the foam type, the section of the HVI test chamber where the foam
panel or chunk was found must be selected. This location is captured from information written
on the foam packaging during cleanup of the HVI test chamber. There are several available
options for the “Section” field such as “Front Door,” “Section 4,” and “Section 5.” DebriSat and
all the soft-catch foam panels were mounted in “Section 5” of the HVI test chamber before
impact. Furthermore, “Section 5” was divided into 5 rows beginning with “Row 5” at the
uprange end of “Section 5” and ending with “Row 1” at the downrange end of “Section 5.”
Additionally, the 5 rows in “Section 5” were broken down into 3 “areas:” “left,” “middle,” and
“right.” Figure 4-15 illustrates the organization of DebriSat and the soft-catch foam panels in the
HVI test chamber.
Figure 4-15. Illustrations of Soft-Catch Foam Panel Layups. A) Foam Panels in the HVI Test
Chamber. B) Division of Foam “Areas” Relative to DebriSat in the HVI Test Chamber. Courtesy
of author.
In Figure 4-15A, the arrow represents the direction of the projectile during the HVI test.
In Figure 4-15B, the foam panel categories labeled “L” correspond to the “Left” area, “M”
corresponds to the “Middle” area, and “R” corresponds to the “Right” area. All foam collected
from “Section 5” must also include information on the “Row” and “Area.” When users select the
“Section 5” option in the “Section” data field, the Add Foam page executes the
“section_check()” JavaScript function imported with the main foam form to dynamically show
the required “Row” and “Area” fields based on the value in the “Section” field as shown in
Figure 4-16.
Figure 4-16. Screenshot of the “LOCATION” Section on the Add Foam Page Populated with Data
and Showing Form Validation on the “Foam Panel ID” Field. Courtesy of author.
For “Panel” and “Pillar” foam types, users can select whether a Panel ID is written and
visible on the foam panel or pillar. If the user checks the “Foam Panel ID Available” checkbox,
the “foam_panel_id_check()” JavaScript function is executed to dynamically show the “Foam
Panel ID” text field where the user can enter a Panel ID. Panel IDs were written on each
individual foam “Panel” or “Pillar” before they were organized into bundles and mounted in the
HVI test chamber. Each Panel ID is encoded with location information on each panel. Panel IDs
are encoded using the format “SG-RrC-L.” Table 4-4 describes the encoding scheme.
Table 4-4. Panel ID Encoding Scheme.
Indicator Description
S Shot Number [1 / 2]
G Panel Group [L = Left / R = Right / F = Floor / C = Ceiling]
R Map Row (major) [1 - 5]
r Map Row (minor) [1 / 2]
C Map Column [1 – 8]
L Panel Layer [numbered from top of foam bundle to bottom (plywood)]
After the HVI test, Panel IDs written on foam panels may have been destroyed or obscured.
If the Panel ID is available and visible, it is recorded in the foam record; if not, the Panel ID is
omitted. In general, there are very few textual data fields on the DCS data pages. Whenever
possible, data fields are configured as drop-down menus to force users to select from a preset list
of options. However, for fields such as Panel ID, a complete list of available Panel IDs is not
available; therefore, users must enter the data manually. To ensure consistent entries with minimal
error, several form validation techniques are implemented on the DCS data pages to force users to
enter data in specific formats with specific lengths. For the Panel ID field on the foam data pages,
the HTML “pattern” attribute is used to define a set of acceptable textual patterns for the “Foam
Panel ID” text field. As shown in Figure 4-16, if a user enters a value that differs from the
prescribed length or pattern set, an error is displayed when the user attempts to submit the form.
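Server-side validation of the same format could look like the sketch below; the regular expression is only one reasonable reading of the “SG-RrC-L” encoding in Table 4-4, not the exact pattern used by the DCS.

    <?php
    // Sketch: validate a Panel ID against the "SG-RrC-L" format from Table 4-4.
    // S: shot (1/2), G: group (L/R/F/C), R: major row (1-5), r: minor row (1/2),
    // C: column (1-8), L: layer number. The exact DCS pattern may differ.
    function isValidPanelId(string $panelId): bool
    {
        return (bool) preg_match('/^[12][LRFC]-[1-5][12][1-8]-[0-9]+$/', $panelId);
    }

    var_dump(isValidPanelId('1L-512-3'));  // bool(true)
    var_dump(isValidPanelId('3X-999-A'));  // bool(false)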
After all of the fields in the “LOCATION” section of the Add Foam page are populated,
the “Color,” “Pattern,” and “Foam Density” fields in the “ASSESSMENT” section must be
filled. The “Color” field describes the overall color of the foam, the “Pattern” field describes any
patterns printed on the foam, and the “Foam Density” field describes the density of the foam.
The foam density increases with foam bundle layer. Table 4-5 lists the options for each of these
fields.
Table 4-5. Options for Foam Assessment Fields.
Color: Green, Tan, Terra Cotta, Gray, Neutral
Pattern: None, Black Checkerboard, Black Diamond, Blue Diamond, Thin Black Clover, Thin Blue Clover, Blue Teardrop, Red Teardrop, Blue “H,” Black “H,” Red “H”
Foam Density: Low, Medium, High
Once all of the data fields in the “IDENTIFICATION,” “LOCATION,” and “ASSESSMENT”
sections have been populated on the Add Foam page, users can submit the form using the “Add
Foam” button located at the bottom of the form to create the new foam record. Once the “Add
Foam” button is pressed, the form data populated in the main foam form is sent via the HTTP
POST method to the web server; the action of this POST request is the “add_foam.php” file. The
source code of each DCS data page, including “add_foam.php,” follows a similar structure.
Figure 4-17 outlines this code structure for the “add_foam.php” file. The language used for each
element of the code structure in Figure 4-17 is color coded.
Global PHP Variables and Functions (imported from globals.php)
PHP Script to Check for Valid PHP Session
Header (imported from header.php)
Links (imported from links.php)
PHP Script to Process Add Foam POST Request
JavaScript Functions for Dynamic Form Modification (imported from foam_form.php)
Main Foam Form HTML Structure (imported from foam_form.php)
JavaScript Scripts to Modify the Structure of the Main Foam Form for the Add Foam Page
Footer (imported from footer.php)
Figure 4-17. Illustration of Basic Code Structure of the Add Foam Data Page. Courtesy of
author.
At the beginning of each file, there are some PHP scripts to import all the global
variables and ensure the user is logged in and has a valid PHP session active. Next, the header
and main links HTML definitions are imported from the “header.php” and “links.php” auxiliary
files. After the header and links, there is a PHP script to process POST requests from the Add
Foam form. After the POST request processing script, the main foam form is imported from the
“foam_form.php” file. This import includes all the JavaScript functions used to dynamically
change and validate form elements based on user selections and the HTML definition of the
main foam form structure. After the main foam form is defined, page-specific JavaScript is used
to modify the form structure specific to the Add Foam page. For example, the default foam form
structure defined in “foam_form.php” does not include an “Add Foam” button, so JavaScript is
used to add this button to the form. Finally, the HTML definition for the footer is imported from
the “footer.php” auxiliary file.
After the Add Foam form is submitted, the Add Foam page is refreshed with the new
POST request. Before the main foam form is loaded in the “add_foam.php” file, a PHP script to
process the data from the POST request is executed. This PHP script parses all of the foam form
data from the new POST request and creates a new record in the “dcs_foam” table of the DCS
database engine. When a new foam record is created, a new Foam ID is created to label the foam
panel, pillar, or chunk with. To avoid Foam ID conflicts if multiple foam records are created
concurrently, the newly created foam record only contains a randomly-generated unique ID, not
a Foam ID. This unique ID is generated by the database engine and is temporarily stored by the
PHP script in the “add_foam.php” file as the local variable named “uniqueId.” After the initial
foam record is created, the “dcs_foam” table is queried to find the next available Foam ID, and
the initial foam record is updated with the new Foam ID and all the foam form data from the
POST request using the temporarily-stored unique ID. Because the unique IDs are generated by
the database engine, which can query the set of all existing unique IDs, it is guaranteed that these
unique IDs will never be repeated within the same table.
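A minimal sketch of this two-step insertion is shown below, assuming a mysqli connection in $db. The “uniqueId” column name, the MAX()-based Foam ID lookup, and the “F”-prefixed Foam ID format are illustrative assumptions; only the overall two-step approach is taken from the text.

<?php
// Step 1: create a placeholder row identified only by a database-generated
// unique ID (assumes a "uniqueId" column on "dcs_foam").
$uniqueId = $db->query('SELECT UUID()')->fetch_row()[0];
$stmt = $db->prepare('INSERT INTO dcs_foam (uniqueId) VALUES (?)');
$stmt->bind_param('s', $uniqueId);
$stmt->execute();

// Step 2: query for the next available Foam ID, then update the placeholder
// row with the new Foam ID and the submitted foam form data.
$maxNumber = (int) $db->query(
    'SELECT MAX(CAST(SUBSTRING(foamId, 2) AS UNSIGNED)) FROM dcs_foam'
)->fetch_row()[0];
$foamId    = sprintf('F%05d', $maxNumber + 1);   // hypothetical Foam ID format
$boxNumber = $_POST['boxNumber'];

$stmt = $db->prepare('UPDATE dcs_foam SET foamId = ?, boxNumber = ? WHERE uniqueId = ?');
$stmt->bind_param('sss', $foamId, $boxNumber, $uniqueId);
$stmt->execute();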
Once a foam record has been created for a foam panel, pillar, or chunk, the record will
appear in the results on the Foam page and all the data from the foam record will be available to
view on the View Foam page. The View Foam page for a foam record can be accessed either
through a hyperlink on the Activity page, a hyperlink on the Foam page, or by entering the
associated Foam ID in the “Foam ID” field on the Foam page. The View Foam page utilizes the
same main foam form used on the Add Foam page as well as some metadata related to the foam
record such as the record’s author, timestamp, and revision. This metadata is automatically
generated using data from the user’s session and from the database engine after the Add Foam
form is submitted. A screenshot of the View Foam page is shown in Figure 4-18. The screenshot
in Figure 4-18 shows a foam record immediately after its initial creation with the minimum
“IDENTIFICATION,” “LOCATION,” and “ASSESSMENT” fields populated.
Figure 4-18. Screenshot of the View Foam Page for a New Foam Record. Courtesy of author.
As shown in Figure 4-18, the View Foam page has an additional section called “USER
DATA,” which contains metadata about the specific revision of the foam record being shown.
That is, the “USER DATA” section lists the user who created the revision and the timestamp at
which the revision was created. Furthermore, when a foam record is verified, the “Verifier” and
“Verification Timestamp” fields are populated on all revisions of the foam record with the user
who verified the record and the timestamp at which the verification occurred. In the
“IDENTIFICATION” section, the View Foam page displays the “Foam ID” and “Revision #” of
the foam record in addition to the standard “Project” and “Box #” fields. All the fields on the
View Foam page are greyed out and locked for editing using the HTML “disabled” attribute.
Any unpopulated fields (e.g., the picture fields and X-ray fields in Figure 4-18) are highlighted.
Finally, the “Edit Foam” and “Verify Foam” buttons are added at the end of the form to redirect
users to the Edit Foam and Verify Foam pages respectively. All these modifications to the
standard main foam form are implemented by first importing the form from “foam_form.php”
and using JavaScript after the import to add or modify form elements.
Global PHP Variables and Functions (imported from globals.php)
PHP Script to Check for Valid PHP Session
Header (imported from header.php)
Links (imported from links.php)
PHP Script to Process Verify Foam POST Request
PHP Script to Process Edit Foam POST Request
PHP Script to Query Database for Foam Record Data
JavaScript Functions for Dynamic Form Modification (imported from foam_form.php)
Main Foam Form HTML Structure (imported from foam_form.php)
JavaScript Scripts to Modify the Structure of the Main Foam Form for the View Foam Page
Footer (imported from footer.php)
Figure 4-19. Illustration of Basic Code Structure of the View Foam Data Page. Courtesy of
author.
The code structure of the “view_foam.php” file, shown in Figure 4-19, is similar to that of the
“add_foam.php” file; however, instead of a PHP script to process foam addition requests, there
are two PHP scripts to process foam modification and foam verification requests from the Edit
Foam and Verify Foam pages, respectively, as well as an additional PHP script to query the
“dcs_foam” table of the DCS database engine for the foam record’s data. The View Foam page processes
foam modification and verification requests so that users are automatically redirected to the
View Foam page to view the requested changes and/or status messages about the request. The
data from the foam record query PHP script is injected into the main foam form using a
combination of PHP and JavaScript after the foam form is fully imported and drawn.
After an initial foam record is created with “IDENTIFICATION,” “LOCATION,” and
“ASSESSMENT” data, the foam record must be edited to include pictures of the foam panel,
pillar, or chunk and X-ray images and data of the foam panel, pillar, or large chunk. Typically,
users upload foam pictures through the DCS while X-ray images and data are uploaded to the
DCS through an external X-ray management system; however, both pictures and X-ray images
can be uploaded through the DCS front-end layer. To edit an existing foam record, users click
the “Edit Foam” button at the bottom of the View Foam page for the foam record they are
attempting to edit. The user is then redirected to the Edit Foam page. A screenshot of the Edit
Foam page is shown in Figure 4-20. Unlike the View Foam page, the Edit Foam page does not
include the “USER DATA” section; however, the Edit Foam page does show the “Foam ID”
and “Revision #” fields in the “IDENTIFICATION” section. The “Foam ID” and “Revision
#” fields are disabled to prevent the user from modifying the values. Furthermore, the “Revision #”
field is automatically incremented to show the revision that will be added if the Edit Foam form
is submitted. Similar to the Add Foam and View Foam pages, the Edit Foam page imports the
main foam form from “foam_form.php.” Also, as with the “view_foam.php” file, the
“edit_foam.php” file includes a PHP function before the import of the main form to query the
“dcs_foam” table of the DCS database engine for the foam record’s data and a combination of
PHP and JavaScript after the foam form to populate each data field. Finally, the rest of the foam
form fields are enabled to allow the user to make and submit modifications to the foam record
data.
Once a user has made the necessary modifications to the foam record data (e.g., uploaded
pictures and X-ray images), the user clicks the “Submit New Revision” button to execute the
modification. The Edit Foam form then submits a POST request containing all the new foam
form data to the View Foam page where one of the PHP scripts in the “view_foam.php” file
processes the POST request. When the foam modification POST request is submitted to the
View Foam page, the foam modification PHP script parses the data from the POST request and
creates a new record in the “dcs_foam” table with the associated Foam ID and foam form data.
When images are uploaded through the foam modification POST request (i.e., pictures or X-ray
images), they are stored in a temporary location on the server while they wait to be processed.
During POST request processing, the uploaded files are first read into variables and escaped
using the PHP “fread()” and “addslashes()” functions. The resulting variables are binary
representations of the uploaded files, which can be uploaded to the associated BLOB fields on
the “dcs_foam” table of the DCS database engine. Additionally, the temporary files are also
renamed based on the Foam ID and file upload field and are copied to the “foam_images” folder
on the SSG server as a backup to the BLOB data stored in the DCS database engine. The textual
paths to these backups are also stored on the “dcs_foam” table of the DCS database engine. The
revision number of this new record is incremented by 1 from the previous foam record.
Furthermore, the “currentRevision” data field is set to “true” for the new revision and “false” for
all previous revisions to denote the new revision as the current revision of the foam panel, pillar,
or chunk.
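A minimal sketch of this image-handling step is given below, assuming a mysqli connection in $db and the Foam ID of the record being edited in $foamId. The upload field and column names (“pictureFront,” “pictureFrontBackupPath”) are placeholders rather than the actual DCS foam columns; only the fread()/addslashes() escaping, the backup copy, and the BLOB-plus-path update are taken from the text.

<?php
// Read the temporary uploaded file into a variable and escape it so the
// binary data can be embedded in the UPDATE query.
$field   = 'pictureFront';                      // placeholder upload field name
$tmpPath = $_FILES[$field]['tmp_name'];
$handle  = fopen($tmpPath, 'rb');
$blob    = addslashes(fread($handle, filesize($tmpPath)));
fclose($handle);

// Copy the temporary file to the "foam_images" backup folder, renamed by
// Foam ID and upload field, and keep the relative path for the record.
$extension  = pathinfo($_FILES[$field]['name'], PATHINFO_EXTENSION);
$backupPath = "foam_images/{$foamId}_{$field}.{$extension}";
copy($tmpPath, $backupPath);

// Store both the BLOB and the textual backup path on the new revision.
$db->query("UPDATE dcs_foam
            SET {$field} = '{$blob}', {$field}BackupPath = '{$backupPath}'
            WHERE foamId = '{$foamId}' AND currentRevision = 'true'");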
Figure 4-20. Screenshot of the Edit Foam Page. Courtesy of author.
Figure 4-21. Screenshot of the View Foam Page with All Fields Populated. Courtesy of author.
Figure 4-21 shows a screenshot of the View Foam page after all the data fields have been
populated for a foam panel. For each file upload field, a button is shown that, when pressed,
redirects the user to the “view_foam_blob.php” page to display the uploaded BLOB data stored
on the “dcs_foam” table. Additionally, each file upload field includes a hyperlink to the backup
file stored in the “foam_images” folder next to the buttons.
At the top of the View Foam page, there is also a “View Revision” field, which allows
users to see all the previous revisions for a foam record. A screenshot of the “View Revision”
field is shown in Figure 4-22. Selecting a previous revision from this drop-down field will
refresh the View Foam page and show the foam record data for the selected revision.
Figure 4-22. Screenshot of the “View Revision” Field on the View Foam Page. Courtesy of
author.
Finally, once all the foam form fields have been populated, the foam record is ready for
verification. Verification cannot be performed by the user who created the current revision.
Therefore, if the user accessing the View Foam page is the same user who created the current
revision, the “Verify Foam” button at the bottom of the View Foam page remains disabled.
However, if the user accessing the View Foam page is not the same user who created the current
revision, the form is checked for completeness and the “Verify Foam” button is enabled. Once a
user clicks the “Verify Foam” button, they are redirected to the Verify Foam page. A screenshot
of the Verify Foam page is shown in Figure 4-23. The Verify Foam page includes a section with
instructions on how to review and verify a foam record. Similar to the Edit Foam page, the
Verify Foam page does not include the “USER DATA” section from the View Foam page;
however, the Verify Foam page does disable all foam form elements, in the same manner as the
View Foam page, to prevent modification. Finally, the Verify Foam page includes an additional
“VERIFIER COMMENTS” section where verifiers can add comments about errors in a foam
record that prevent verification. Users can either click the “Verify Foam” button to verify the
foam record if all of the displayed data is correct or the “Submit Verifier Comments” button to
submit comments if the foam record cannot be verified.
Figure 4-23. Screenshot of the Verify Foam Page for a Medium Foam Chunk. Courtesy of
author.
Once a foam record is verified, it is locked and can no longer be edited. After
verification, debris collected and extracted from the foam panel, pillar, or chunk can be
characterized. Also, the “USER DATA” section of the View Foam page is updated to show the
verifier and verification timestamp. A screenshot of the updated “USER DATA” section is
shown in Figure 4-24.
Figure 4-24. Screenshot of the Updated “USER DATA” Section After Verification. Courtesy of
author.
After foam processing, the debris characterization process begins on debris fragments
collected and extracted from processed foam panels, pillars, and chunks. Figure 4-25 provides a
high-level illustration of the debris characterization process based on the DebriSat post-impact
phase workflow emphasizing the role of the DCS in the process.
Figure 4-25. High-Level Overview of Debris Characterization Process with DCS Roles. Courtesy of author. (The workflow steps shown in the figure are: Begin Characterization; Extract or Collect Debris from Foam; Bag Individual Debris and Label with Location; Create New Debris Entry in DCS with Location Information; Retrieve a Bagged Debris Fragment; Assess Debris Material, Shape, Color, and Size; Edit Debris Entry in DCS with New Assessments; Measure Debris Mass and Size; Edit Debris Entry in DCS with New Measurements; Verify Debris Entry.)
In Figure 4-25, the rectangles with angled corners are the subprocesses performed on the
DCS. That is, the DCS is used to (i) create new debris records with identification and location
information, (ii) modify debris records with assessment information, (iii) modify debris records
with mass and size measurement information, and (iv) verify debris records. The detailed
procedure for debris characterization on the DCS is shown in Figure 4-26.
Figure 4-26. Detailed Debris Characterization Procedure on the DCS. Courtesy of author. (The flowchart runs from Start to End through steps and decisions including: Enter Box Number; Select Source; Related Foam?; Check Related Foam; Select Foam ID; Select Section; Section 5?; Select Row; Select Area; Source and Foam Type decisions (Embedded; Loose or Dust; Chunk (M); Panel, Chunk (L), or Pillar); Select Primary Material; Select Grid Location; Second Material?; Select Second Material; Third Material?; Select Third Material; Select Shape; Select Color; Select Debris Type; Foam Attached?; Check Foam Attached; Intact Part?; Check Intact Part; Broken Fragment?; Check Broken Fragment; Upload Mass Measurements; Upload Size Measurements or Upload Broken Capture; and Verify Debris Record.)
After debris fragments are collected, extracted, and bagged, users create an initial debris
record for each fragment using the Add Debris page. A screenshot of the Add Debris page is
shown in Figure 4-27.
Figure 4-27. Screenshot of the DCS Add Debris Page. Courtesy of author.
Similar to the DCS foam data pages, the debris data pages use a main debris form
defined in the “debris_form.php” file. The main debris form consists of 9 sections implemented
as HTML fieldsets: “IDENTIFICATION,” “LOCATION,” “ASSESSMENT,”
“MEASUREMENT SYSTEMS,” “MEASUREMENTS,” “IMAGING CAPTURES,”
“IMAGING ANALYSES,” “IMAGING DATA,” and “COMMENTS.” Similar to the main foam
form, the main debris form includes built-in JavaScript functions to dynamically control various
form elements. When users create an initial debris record, they are required to populate all the
fields in the “IDENTIFICATION” and “LOCATION” sections. In the “IDENTIFICATION”
section, users first enter the box number where the debris fragment was found. In the
“LOCATION” section, users first select the “Source” type of the debris. There are three options
for the “Source” field: “Embedded,” “Loose,” and “Dust.” An “Embedded” fragment is one that
was embedded in and extracted from a foam panel, pillar, or chunk. A “Loose” fragment is one
that was collected from the surface of a foam panel or found on the ground in the HVI test
chamber. Finally, a “Dust” fragment is one found in the dust swept from the HVI test chamber or
from the dust swept from the foam preparation tables after foam panels are prepared for X-ray
imaging. After selecting an option in the “Source” field, users can select whether the debris
fragment is associated with a foam panel, pillar, or chunk with a foam record in the DCS by
checking the “Related Foam” checkbox. Most debris fragments will be associated with a foam
record; however, there are some cases where debris fragments are not associated with a foam
record. For example, an embedded debris fragment extracted from a foam chunk less than 10 cm
in size will be considered “Embedded,” but will not have a related foam record. Checking the
“Related Foam” checkbox calls the “related_foam_check()” JavaScript function imported with
the main debris form, which reveals the “Foam ID” and “Foam Type” fields as well as the “View
Foam” and “View X-Ray” buttons. Users then select the related Foam ID from the “Foam ID”
drop-down field. When a user selects a Foam ID from the “Foam ID” field, the “Box #,”
“Section,” and, if the section is “Section 5,” the “Row,” and “Area” fields are automatically
populated with data from the foam record associated with the selected Foam ID as shown in
Figure 4-28. If the debris fragment does not have a related foam record, the user must populate
the “Section” and, if the “Section” is “Section 5,” the “Row” and “Area” fields. Furthermore, if
the debris fragment “Source” is “Embedded” with a related foam record of type “Panel,”
“Pillar,” or “Chunk (L),” the “Foam Grid Location” field is shown, and the user is required to
select the grid location where the debris fragment was collected or extracted. The “Foam Grid
Location” field is also shown in Figure 4-28. If the debris fragment “Source” is not “Embedded”
or if the related “Foam Type” is “Chunk (M),” the “Foam Grid Location” field is not shown.
This case is shown in Figure 4-29.
Figure 4-28. Screenshot of the “IDENTIFICATION” and “LOCATION” Fields on the Add
Debris Page for an “Embedded” Debris Fragment Associated with a Foam “Panel” Record.
Courtesy of author.
Figure 4-29. Screenshot of the “IDENTIFICATION” and “LOCATION” Sections on the Add
Debris Page for an “Embedded” Debris Fragment Associated with a Medium Foam Chunk.
Courtesy of author.
The “View Foam” button redirects the user to the View Foam page for the selected
related Foam ID. The “View X-Ray” button redirects the user to the “view_foam_blob.php”
page to view the “Processed Binary X-Ray Image” BLOB.
After the “IDENTIFICATION” and “LOCATION” sections have been populated, a new
debris record is created by clicking the “Add Debris” button at the bottom of the Add Debris
page. Typically, assessment is also completed on debris fragments before the initial debris record
is submitted. In the “ASSESSMENT” section, users must first select a material from the
“Primary Material” field. If a debris fragment consists of multiple materials, up to three materials
may be selected by using the “Second Material” and “Third Material” fields. To ensure a
material cannot be selected multiple times in the material fields, each time either the “Primary
Material,” “Second Material,” or “Third Material” fields are modified, a check is performed by
another JavaScript function imported with the main debris form, which disables previously-
selected materials in the rest of the material fields. An example of this check to prevent material
repetition is shown in Figure 4-30. After material(s) are selected, the shape of the debris
according to the shape categories listed in Table 3-3 must be selected. Additionally, the color of
the debris according to the colors listed in Table 3-2 must be selected. Table 4-6 outlines the
options for material, shape, and color.
Figure 4-30. Example of Material Repetition Prevention on the DCS Add Debris Page. Courtesy
of author.
Table 4-6. Summary of Options for Debris Material, Shape, and Color Fields.
Material: Aluminum (AL), Carbon Fiber Reinforced Polymer (CFRP), Copper (CU), EPOXY, GLASS, Kapton (KAP), Kevlar (KEV), Multi-Layer Insulation (MLI), Printed Circuit Board (PCB), PLASTIC, Solar Cell (SCEL), Silicone (SIL), Stainless Steel (SS), Titanium (TI), METAL
Shape: Strt. Rod/Ndl/Cyl, Bent Rod/Ndl/Cyl, Flat Plate, Bent Plate, Nug/Para/Sphe, Flexible
Color: Black, Clear, Gold, Green, Light Blue, Royal Blue, Magenta, Orange, Purple, Red, Silver, White, Yellow
The “METAL” option for the material fields is used as a placeholder during assessment.
After mass and size measurements are performed on “METAL” debris fragments, the mass and
density are used to select a more specific metal. Furthermore, once a material is selected, a
placeholder density is inserted into the “Density” field. After the material, shape, and color are
selected, the “Debris Type” must be selected. Debris fragments are characterized as “2D” or “3D”
fragments depending on their size and shape. A 2D debris fragment is a “needle-like” fragment
whose length is at least 7 times its width or a “flat-plate” fragment whose thickness is less than
25% of its length. A 3D fragment is any fragment that meets neither criterion. 2D and
3D fragments are measured using two different external measurement systems, which generate
different images and files. Therefore, when the “Debris Type” field is modified, the “IMAGING
CAPTURES,” “IMAGING ANALYSES,” and “IMAGING DATA” sections are modified to
reflect the correct file upload fields for the associated external measurement system. The imaging
sections shown in Figure 4-27 are configured for 2D debris fragments. The imaging sections
shown in Figure 4-31 are configured for 3D debris fragments.
Figure 4-31. Imaging Sections on the DCS Add Debris Page Configured for 3D Fragments.
Courtesy of author.
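The 2D/3D criteria above reduce to a simple dimensional check. A minimal sketch is given below; the function name and the assumption that the dimensions are ordered as length ≥ width ≥ thickness are illustrative and not taken from the DCS source.

<?php
// Classify a debris fragment as "2D" or "3D" using the criteria described above.
function classifyDebrisType(float $length_mm, float $width_mm, float $thickness_mm): string
{
    // "Needle-like": length at least 7 times the width.
    if ($length_mm >= 7 * $width_mm) {
        return '2D';
    }
    // "Flat-plate": thickness less than 25% of the length.
    if ($thickness_mm < 0.25 * $length_mm) {
        return '2D';
    }
    // Anything that meets neither criterion is a 3D fragment.
    return '3D';
}

// Example: a 20 mm x 2 mm x 1 mm sliver classifies as 2D.
echo classifyDebrisType(20.0, 2.0, 1.0);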
After a “Debris Type” is selected during assessment, the optional “Foam Attached” field
is checked if there is foam fused or attached to the debris fragment. The “Intact Part” field is
checked if the debris fragment is a complete part from DebriSat. Finally, if the debris fragment
has broken into multiple pieces during storage or handling, the “Broken Fragment” field is
checked. For broken debris fragments, only mass measurements are taken. No size
measurements or imaging are taken. Therefore, once the “Broken Fragment” field is checked, the
“MEASUREMENTS” section is modified to show only the mass measurement fields, and the size
measurement imaging fields are replaced with the “Broken Capture” field. Figure 4-32 shows a
screenshot of the main debris form for a broken fragment.
All the fields in the “MEASUREMENT SYSTEMS” and “MEASUREMENTS” sections
of the main debris form are always disabled to prevent user modification. These fields are
automatically populated by the external mass and size measurement systems. Modification of
these fields requires use of the external measurement systems. Furthermore, while the file upload
fields can be used to manually upload size measurement images and files, these fields are also
automatically populated by the external size measurement system. The “MEASUREMENT
SYSTEMS” section contains metadata fields such as “Balance Type” and “Balance Software
Version” to track which external measurement systems were used to measure the mass and size of
the debris fragment and their hardware and software states during the measurement process.
The “add_debris.php” code structure is the same as the “add_foam.php” code structure.
There is a PHP script used to handle an Add Debris POST request when the “Add Debris” button
is clicked. Additionally, the same method of using a unique ID to create an initial debris record,
then later updating the record with the debris form data is used on the debris data pages to avoid
Debris ID conflicts during concurrent debris record additions.
Figure 4-32. Screenshot of the Main Debris Form for Broken Fragments. Courtesy of author.
After a new debris record is created using the Add Debris page, a popup is generated that
redirects the user to the “barcode.php” page, which generates a barcode with the new Debris ID
that the user can print out and use to label the bag containing the debris fragment.
Similar to the foam records, debris records can be viewed through the View Debris page.
The View Debris page can be accessed via hyperlinks on the Activity or Debris pages or via the
“Debris ID” field at the top of the Debris page. A screenshot of the View Debris page for a
debris fragment with completed identification, location, and assessment information is shown in
Figure 4-33.
Figure 4-33. Screenshot of the View Debris Page with Identification, Location, and Assessment.
Courtesy of author.
Similar to the View Foam page, the View Debris page contains a “View Revision” field
and the “USER DATA” section to display metadata about the debris record. At the top of the
View Debris page, the barcode generated after the initial debris record was created is shown and
can be reprinted by clicking on the barcode. All of the fields in the main debris form are disabled
to prevent user modification and the “Edit Debris” and “Verify Debris” links are available at the
bottom of the page to redirect the user to the Edit Debris and Verify Debris pages. The Edit
Debris and Verify Debris pages operate identically to the Edit Foam and Verify Foam pages,
using the main debris form instead of the main foam form. Figure 4-34 shows a screenshot of a
fully populated View Debris page.
Both the foam and debris data pages were designed to be modular. For example, each
data page uses auxiliary files such as “header.php,” “links.php,” “foam_form.php,”
“debris_form.php,” and “footer.php” to generate its forms and structure.
Additionally, all of the DCS data pages were designed to work with multiple database engines.
During the rapid-development phase of the DCS, only the MySQL database engine was
implemented. However, in the production phase, both MySQL and Microsoft SQL Server
(MSSQL) were implemented. Because the query languages for MySQL and MSSQL are
different, separate queries to retrieve foam or debris record data and to handle data additions and
modifications had to be written. To minimize code size and limit complexity, the same PHP
variable names and functions were used for both MySQL and MSSQL. For example, on the
View Debris page, the variable “$queryResult” is used to store the result of the “SELECT” query
to retrieve a debris record’s data. The value of this variable differs for MySQL and MSSQL;
however, the “call_user_func_array()” PHP function is used to dynamically select the correct
parsing function to use for the query result depending on whether MySQL or MSSQL is selected.
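A minimal sketch of this engine-independent parsing is shown below; it assumes $databaseEngine and $queryResult are set by the engine-specific query code, and the surrounding variable names (other than $queryResult and call_user_func_array()) are illustrative.

<?php
// Select the engine-appropriate fetch function and its argument list.
if ($databaseEngine === 'mysql') {
    $fetchFunction = 'mysqli_fetch_assoc';              // $queryResult is a mysqli_result
    $fetchArgs     = array($queryResult);
} else {
    $fetchFunction = 'sqlsrv_fetch_array';              // $queryResult is an sqlsrv statement
    $fetchArgs     = array($queryResult, SQLSRV_FETCH_ASSOC);
}

// The same downstream parsing loop then works for both database engines.
while ($row = call_user_func_array($fetchFunction, $fetchArgs)) {
    echo $row['debrisId'] . "\n";
}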
Figure 4-34. Screenshot of a Fully Populated View Debris Page. Courtesy of author.
CHAPTER 5
BACK-END LAYER
Back-End Layer Overview
The DCS back-end layer consists of a web server, file server, and a database engine.
These three services are hosted on the SSG server located on the UF campus. In addition to the
SSG server, there is a NAS device located on the UF campus in a different building to act as an
“off-site” backup of the data stored on the SSG server.
Figure 5-1. Illustration of the DCS Back-End Layer Structure. Courtesy of author. (The figure shows the web server, file server, and database server hosted on the SSG Server, the SSG Backup NAS connected via SMB on the UF campus network, and the data entry workstations and the mass and size measurement systems at the off-campus characterization facility connecting over the UF VPN via HTTP and ODBC, respectively.)
The web server, file server, database engine, and backup server all communicate with
each other internally on the SSG server and the UF network. At the DebriSat off-campus
characterization facility, there are several data entry workstations and several external mass and
size measurement systems, which communicate with the back-end layer hosted on the UF
campus through the UF VPN. The data entry workstations communicate directly with the web
server via HTTP. The web server hosts the DCS front-end layer user interface. The external mass
and size measurement systems communicate directly with the database engine through the Open
Database Connectivity (ODBC) standard. An ODBC connector installed on the measurement
system control workstations allows the systems to execute queries directly, without having to use
the front-end layer user interface. Figure 5-1 provides an illustration of this back-end layer
structuring.
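For illustration only, the sketch below shows how a measurement-system workstation could write a mass measurement directly over ODBC. It uses PHP’s PDO ODBC driver with a hypothetical DSN and placeholder measurement values; the actual measurement systems use their own ODBC connectors and control software outside the DCS PHP codebase, and only the column names are taken from Appendix Table A-3.

<?php
// Illustrative direct-ODBC update of mass measurement fields on "dcs_debris";
// the DSN name, credentials, and measurement values are placeholders.
$pdo = new PDO('odbc:DCS_MYSQL', 'dcs_user', 'dcs_password');

$stmt = $pdo->prepare(
    'UPDATE dcs_debris
     SET mass_g = ?, balanceType = ?, balanceSoftwareVersion = ?
     WHERE debrisId = ? AND currentRevision = ?'
);
$stmt->execute(array('0.0123', 'ExampleBalance', '1.0', 'DS142255', 'true'));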
The SSG server is a Dell PowerEdge T620 server with specifications listed in Table 5-1.
Table 5-1. Technical Specifications of the SSG Server
Specification Value
Model Dell PowerEdge T620
Operating System Microsoft Windows Server 2016
Processor (CPU) 12-Core Intel Xeon E5-2620 @ 2.10 GHz
Memory (RAM) 32 GB DDR3L @ 1600 MHz
Storage (HDD) 36 TB @ RAID-6 Double Parity
The SSG server holds a total of 12 drives. One of the drives is a 1 TB drive used to store
the operating system and application data. The remaining 11 drives are 4 TB drives configured in
a Redundant Array of Independent Disks (RAID)-6 array used to store DebriSat and DCS data.
The RAID-6 configuration is a double parity configuration, which allows 2 of the 11 drives to
fail without losing data. In addition to this RAID configuration, Windows Task Scheduler and
Windows Backup are used to create daily backups of all data stored on the SSG server to an “off-
site” backup NAS device. The backup NAS is a Synology DS1813+ NAS server. There are six 4
TB disks installed on the Synology DS1813+ in a RAID-5 configuration, which allows a single
disk failure before data is lost. The total available storage on the backup NAS is 18 TB. Daily
backups are stored for up to 14 days on the backup NAS, which allows up to two weeks for
recovery.
The web server hosted on the SSG server uses Windows Server 2016’s built-in Internet
Information Services (IIS) 10 web hosting service. The DCS front-end layer site is configured to
serve pages on the “dcs.ssg.mae.ufl.edu” host and to use anonymous authentication to allow IIS
users to view served pages. Additionally, PHP 7.1 is installed and configured for use with IIS
using the Common Gateway Interface (CGI) protocol. PHP is used to support database engine
connectivity on the DCS front-end layer interface. To enable uploads of large imaging files from
the front-end layer, several default settings in the “php.ini” configuration file are modified. The
“post_max_size” and “upload_max_filesize” parameters are changed to “0” from their default
values to allow unlimited-size file uploads. Furthermore, the “memory_limit” and
“max_allowed_packet” parameters are changed to “999M” from their default values to allow
large enough packets to be sent between the database engine and front-end layer for BLOB data
to be uploaded to and read from the database engine. Finally, the MySQL 5.7.18 and Microsoft
SQL Server 2016 database engines are installed to support storage and querying of DebriSat
data. MySQL is currently the primary database engine used for the DCS. The MySQL InnoDB
storage engine is used for all the DCS tables. InnoDB was chosen because it balances high
reliability and high performance. InnoDB is ACID-compliant, allows row-level locking, supports
“FOREIGN KEY” checks to maintain data integrity, and optimizes table storage on disk based
on primary keys and indices.
The default MySQL configuration settings are not sufficient to store the full table sizes
and row lengths; therefore, several settings are modified from their default values to support
larger table and row sizes. The “innodb_page_size” parameter is changed from the default “16K”
to “64K” to increase the size of InnoDB data pages. This allows larger row lengths, which
enables all BLOB data to be stored inline with textual data for each DCS record. Additionally,
the increased “innodb_page_size” requires an increased “buffer_pool_size” and
“innodb_log_file_size.” The “buffer_pool_size” is increased to “27G,” which allows up to 80%
of the RAM on the SSG server to be used for the InnoDB buffer pool. The
“innodb_log_file_size” is increased to “7G,” which is approximately 25% of the InnoDB buffer
pool size. A larger allocation for the InnoDB buffer pool increases performance and caching on
the MySQL database engine, which is needed to handle the direct storage of BLOBs in the DCS
tables.
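A minimal sketch for confirming these settings at runtime is shown below, assuming a mysqli connection in $db; it only reads the server variables (innodb_page_size, in particular, can only be set when the MySQL data files are first initialized) and uses the full server variable names.

<?php
// Read back the tuned InnoDB settings described above.
$result = $db->query("SHOW VARIABLES WHERE Variable_name IN
    ('innodb_page_size', 'innodb_buffer_pool_size', 'innodb_log_file_size')");

while ($row = $result->fetch_assoc()) {
    echo $row['Variable_name'] . ' = ' . $row['Value'] . "\n";
}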
Database Structure
The DCS database is divided into 5 tables used to store activity, announcements, debris
data, foam data, and user data. Table 5-2 describes each of the main DCS tables.
Table 5-2. Description of DCS Database Tables
Table Description
dcs_activity Table to store all activity events executed on the DCS.
dcs_announcements Table to store all announcements posted for view on the Home page.
dcs_debris Table to store all debris records.
dcs_foam Table to store all foam records.
dcs_users Table to store names, UFIDs, and statuses of authorized DCS users.
Each DCS database table is constructed from a series of data columns, which each new
row or record fills with unique data. All the DCS database tables are self-contained and do not
require references to external tables or databases. The Appendix contains tables that list the full
column structures for each of the DCS database tables. In the tables in the Appendix, the “Type”
column lists the data types for each DCS table column. The “int” type refers to the integer data
type, the “varchar” type refers to the variable character array type, the “datetime” type is a
special type for formatting timestamps, the “decimal” type refers to the decimal-formatted data
type, and the “longblob” type refers to the longblob data type, which can hold BLOBs up to 4
GB in size. In the tables in the Appendix, the “Size” column lists the lengths of each DCS
database table column in characters. For example, a “varchar” with a length of “8” can hold a
string with up to 8 characters. The “Null” column lists whether the database field allows a value
of “NULL.” The “Default Value” column defines the default value used to fill each database
field if another value is not specified. A “Default Value” of “None” means the database field is
filled with an empty string of length 0 by default. Finally, the “Indexed” column lists whether
each database field is indexed by the database engine or not.
Most of the columns in the DCS database tables are “varchar” types and simply store
text. Numeric data fields such as “revisionNumber” or “xDimension_mm” use “int” or
“decimal” data types. However, in some cases, the varchar data type is used for numeric fields
such as “mass_g.” This is because the number of decimal places in the “mass_g” field changes
based on the mass balance used during mass measurement; therefore, the column cannot be
defined using a static “decimal” data type with a fixed number of decimal places. Additionally,
each measurement or calculation column includes the units in the name of the column. For
example, the AMR data field’s name is “areaToMassRatio_mm^2/g.”
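To make these column conventions concrete, an abbreviated sketch of the “dcs_debris” definition is given below, reconstructed from a few of the columns listed in Appendix Table A-3; the exact DDL used for the DCS is not reproduced here, and the keys and other details are assumptions.

<?php
// Abbreviated, illustrative DDL for a few representative "dcs_debris" columns
// (assuming a mysqli connection in $db); column names, types, and sizes follow
// Appendix Table A-3, but the keys and defaults shown are assumptions.
$sql = "
CREATE TABLE dcs_debris (
    `index`                   INT(11)       NOT NULL AUTO_INCREMENT,
    debrisId                  VARCHAR(8)    DEFAULT NULL,
    revisionNumber            INT(11)       NOT NULL DEFAULT 0,
    currentRevision           VARCHAR(5)    NOT NULL DEFAULT 'true',
    creationTimestamp         DATETIME      NOT NULL DEFAULT CURRENT_TIMESTAMP,
    mass_g                    VARCHAR(13)   DEFAULT NULL,  -- decimal places vary by balance
    xDimension_mm             DECIMAL(6,3)  DEFAULT NULL,
    `areaToMassRatio_mm^2/g`  DECIMAL(10,3) DEFAULT NULL,  -- units encoded in the column name
    frontlitCapture           LONGBLOB,                    -- BLOB stored inline with the row
    frontlitCaptureBackupPath VARCHAR(100)  DEFAULT NULL,
    PRIMARY KEY (`index`),
    KEY idx_debrisId (debrisId)
) ENGINE=InnoDB";

$db->query($sql);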
Performance Characterization
During the rapid-development implementation phase of the DCS, one of the “convenient”
design decisions made was to store imaging data from X-ray imaging and size measurement
indirectly by first storing the images on disk and then inserting relative textual links to these files in
the DCS database. This decision enabled quick development and flexibility; however, it
decreased the integrity and perpetuity of the dataset. In the operational implementation phase of
the DCS, this design decision was revisited and the DCS was modified to store imaging data
directly in the DCS database with the rest of the textual data to ensure data integrity and
permanence. One of the tradeoffs of this decision was decreased query performance, especially
when selecting full table datasets or directly-stored imaging data.
To characterize the decrease in performance due to direct BLOB storage, a study was
conducted using a 2 MB test image to simulate a typical image produced by one of the size
measurement systems. Table 5-3 lists the query performance for a single upload of the test image
BLOB and 126 uploads of the test image BLOB to simulate the upload of a full 3D measurement
of a debris fragment.
Table 5-3. Query Execution Times for BLOB Upload.
Test Query Execution Time
Single Image BLOB Upload 2.09 sec
126 Image BLOB Upload 261.67 sec (4.36 min)
As shown in Table 5-3, the upload time for a single BLOB is relatively small and the
amount of time to upload n image BLOBs can be linearly extrapolated (126 × 2.09 sec ≈ 263 sec,
consistent with the measured 261.67 sec). Image BLOB uploading is
primarily performed by the 2D and 3D size measurement systems. To modularize the upload
process and to minimize overall upload time, the upload process implemented on the 2D and 3D
size measurement systems is divided into multiple queries. First, the textual data is inserted into
the DCS database. Next, the captured images are used to update the newly created revision.
Finally, the imaging analyses and data are used to complete the revision. For 2D size
measurements, there are only 2 image captures, 4 imaging analyses, and 2 imaging data uploads.
In general, the 2D size measurement upload process takes approximately 30 seconds to
complete. For 3D size measurement, however, there are 126 imaging captures, 3 imaging
analyses, and 1 very large imaging data upload. The 3D size measurement upload process takes
approximately 30 minutes to complete, which is 60 times longer than 2D size measurement. To
offset this increased upload time, 3D size measurements are uploaded in a batch process
executed overnight. Table 5-4 outlines the upload performance for a few batch processes.
Table 5-4. Query Execution Times for 3D Size Measurement Upload.
Date  # of Uploads  Debris ID: Upload Time
01/30/2018  1  DS142255: 1050.02 sec (17.50 min)
01/30/2018  7  DS142251: 1289.94 sec (21.49 min); DS142256: 2253.56 sec (37.56 min); DS142260: 2587.40 sec (43.12 min); DS142266: 2701.36 sec (45.02 min); DS142274: 2895.79 sec (48.26 min); DS142275: 2963.62 sec (49.39 min); DS142276: 4581.59 sec (76.36 min)
02/01/2018  3  DS142243: 1052.77 sec (17.55 min); DS142260: 964.98 sec (16.08 min); DS142275: 986.37 sec (16.43 min)
As shown in Table 5-4, the average upload time for 3D size measurement imagery is
around 30 minutes. The variation in upload time can be attributed to the variation in size of the
debris fragments being measured. For example, a small debris fragment with a smaller point
cloud may upload quicker than a larger debris fragment with a large point cloud. These upload
times are acceptable for overnight batch processing and the use of batch uploading is one method
used to mitigate the performance impact of direct BLOB storage.
The original study to characterize query performance also considered the selection and
download of DCS database data. Using the same 2 MB test image, a full database selection query
on 100,000 rows was tested with various configurations of populated BLOB fields. First, the
query was tested with no BLOB fields populated. Next, the query was tested with 1 row
populated with 126 BLOB fields populated, simulating a fully characterized 3D debris fragment.
Then, the query was tested with 5 rows with 126 BLOB fields populated simulating 5 fully-
characterized 3D debris fragments. Finally, the query was modified to only select all columns
except the BLOB columns in a table with five rows with 126 BLOB fields populated. Table 5-5
lists the results of these four tests on the full database selection query.
Table 5-5. Query Execution Times for Full Database Selection with BLOB Fields.
Test  Execution Time
Full Database Selection with No BLOB Fields Populated: 0.01 sec
Full Database Selection with One Row with 126 BLOB Fields Populated: 14.69 sec
Full Database Selection with Five Rows with 126 BLOB Fields Populated: Timeout
Selection of All Columns Except BLOB Fields with Five Rows with 126 BLOB Fields Populated: 0.09 sec
As shown in Table 5-5, the query execution time increases significantly when rows with
populated BLOB fields are added. After just 5 rows with 126 populated BLOB fields are
added, the query times out and is not able to return a result. However, with targeted querying, the
non-BLOB textual fields can still be selected very quickly.
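A minimal sketch of such a targeted query is shown below, assuming a mysqli connection in $db; the column list is abbreviated but uses names from Appendix Table A-3.

<?php
// Select only textual (non-BLOB) columns so a full-table query stays fast.
$sql = "SELECT debrisId, revisionNumber, primaryMaterial, shape, identifyingColor, mass_g
        FROM dcs_debris
        WHERE currentRevision = 'true'";

$result = $db->query($sql);
while ($row = $result->fetch_assoc()) {
    echo $row['debrisId'] . ': ' . $row['primaryMaterial'] . ', ' . $row['mass_g'] . " g\n";
}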
To test this phenomenon further, another query to select 5 specific rows with various
configurations of populated BLOB fields was tested. First, five specific rows were selected with
no BLOB fields populated. Next, 5 specific rows were selected with 126 BLOB fields populated in
each row. Finally, 5 specific rows were selected with 126 BLOB fields populated in each row;
however, only the non-BLOB textual columns were selected. The results of this test are shown in
Table 5-6.
Table 5-6. Query Execution Times for Specific Row Selection with BLOB Fields.
Test  Execution Time
5 Rows Selected with No BLOB Fields Populated: 0.01 sec
5 Rows Selected with 126 BLOB Fields Populated in All Rows: 56.63 sec
5 Rows Selected with 126 BLOB Fields Populated in All Rows, Only Non-BLOB Textual Columns Selected: 0.01 sec
As shown in Table 5-6, 5 BLOB-populated rows can be selected in about a minute, and the
textual data alone can still be selected very quickly.
Finally, a query to select only the BLOB fields for various row configurations was tested.
First, only the BLOB columns were selected from 1 row with 126 BLOB fields populated. Next,
only the BLOB columns were selected from 5 rows with 126 BLOB fields populated. Finally, a
single image was selected from 1 row with 126 BLOB fields populated. The results of this test
are shown in Table 5-7.
Table 5-7. Query Execution Times for BLOB Selection on Various Rows.
Test  Execution Time
BLOB Column Selection from 1 Row with 126 BLOB Fields Populated: 16.41 sec
BLOB Column Selection from 5 Rows with 126 BLOB Fields Populated: 52.64 sec
Single BLOB Column Selection from 1 Row with 126 BLOB Fields Populated: 0.17 sec
As shown in Table 5-7, all the BLOB columns can be selected from a single row in about
17 seconds, all the BLOB columns from 5 rows can be selected in about a minute, and a single
BLOB field can be selected in less than a second. Ultimately, the decreased performance from
direct BLOB storage can be mitigated by writing targeted queries that avoid selecting BLOB
columns if not absolutely necessary.
To test DCS database performance in a realistic data analysis application, a query to
select the mass and AMR for metal debris fragments was executed. The query picks 1,826 debris
records out of over 144,000 in less than a second with a total query execution time of 0.7066
seconds. The data from this query can be used to produce an insightful distribution of debris
fragments by mass and AMR as shown in Figure 5-2.
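The exact query used for Figure 5-2 is not reproduced here, but a sketch of this kind of analysis query is shown below, assuming a mysqli connection in $db and assuming the material column stores the abbreviations listed in Table 4-6.

<?php
// Select mass and area-to-mass ratio for current revisions of metal fragments.
$sql = "SELECT debrisId, primaryMaterial, mass_g, `areaToMassRatio_mm^2/g` AS amr
        FROM dcs_debris
        WHERE currentRevision = 'true'
          AND primaryMaterial IN ('AL', 'CU', 'SS', 'TI', 'METAL')
          AND mass_g IS NOT NULL
          AND `areaToMassRatio_mm^2/g` IS NOT NULL";

$result = $db->query($sql);
while ($row = $result->fetch_assoc()) {
    printf("%s (%s): %s g, %.3f mm^2/g\n",
           $row['debrisId'], $row['primaryMaterial'], $row['mass_g'], $row['amr']);
}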
The testing of these various queries produced insights into how a decrease in
performance from direct BLOB storage could be mitigated. The tests targeted at full database
selection and the data analysis test query with populated BLOB fields revealed that the direct
selection of populated BLOB fields is the primary factor increasing query execution time.
Therefore, on the DCS front-end layer, all the large selection queries similar to those on the
Debris page were modified to select only textual fields, resulting in very quick performance for
the selection of the entire DebriSat debris dataset. Ultimately, a combination of clever query
structuring and table indexing allows the DCS to operate using the direct BLOB storage method
while maintaining a high level of querying performance.
Figure 5-2. Plot of Resulting Data from Data Analysis Application Test Query. Courtesy of author. (Log-log scatter plot titled “Mass vs. Area-to-Mass Ratio by Metal Material - 2/22/18,” with Mass (g) on the horizontal axis, Area-to-Mass Ratio (mm^2/g) on the vertical axis, and series for the AL, CU, METAL, SS, and TI materials.)
CHAPTER 6
CONCLUSIONS AND FUTURE WORK
After the DebriSat LHVI test in 2014, there was an immediate need to establish a debris
characterization process and a data management solution. The DCS was designed, developed,
and implemented in parallel with the physical DebriSat post-impact phase workflow in response
to this need. This parallel design, development, and implementation created a synergistic effect
between the DCS and the post-impact phase characterization process. The structure of the DCS
was able to mirror the flow of the characterization process, which enables a high level of user
efficiency and minimizes human error. The front-end, back-end dichotomy of the
DCS allows the segmentation and modularization of the system, so that individual
modules and subsystems can be modified and upgraded independently. Additionally, the DCS
successfully implemented a storage and backup solution that ensures a high level of data
integrity and permanence, preserving the DebriSat dataset for years into the future. The final
DebriSat dataset stored within the DCS will be the most comprehensive modern spacecraft
debris dataset ever recorded.
Moving forward, the design of the DCS can be further refined to increase the
performance of external measurement systems to help reduce the total time of the debris
characterization process. Furthermore, the current DCS codebase includes support for the
MSSQL database engine; however, this database engine has not been tested as extensively as the
MySQL database engine. In the future, sufficient testing can be performed with the MSSQL
database engine to enable usage of the DebriSat dataset in a wider variety of application
environments.
APPENDIX
DCS TABLE STRUCTURES
Table A-1. Column Structure of the “dcs_activity” Database Table.
Name Description Type Size Null Default Value Indexed
index Automatically-incrementing index for each row. int 11 No None Yes
project The project the activity occurred on. Options are DebriSat or DebrisLV. varchar 8 No None No
subjectId The ID of the subject of the activity event. Options are Foam ID or Debris
ID.
varchar 8 No None Yes
subjectType The type of the subject of the activity event. Options are Foam or Debris. varchar 6 No None Yes
action The type of activity executed. Options are ADD, EDIT, VOID, or VERIFY varchar 6 No None Yes
user The Gatorlink username of the user who executed the activity event. varchar 30 No None Yes
timestamp The timestamp at which the activity occurred. datetime N/A No CURRENT_TIMESTAMP No
Table A-2. Column Structure of the “dcs_announcements” Database Table.
Name Description Type Size Null Default Value Indexed
index Automatically-incrementing index for each row. int 11 No None Yes
user The Gatorlink username of the user who posted the announcement. varchar 100 No None No
timestamp The timestamp at which the announcement was posted. datetime N/A No CURRENT_TIMESTAMP No
announcement The content of the posted announcement. varchar 1000 No None No
isHidden Value determines whether an announcement is hidden on the Home page. varchar 5 No false No
Table A-3. Column Structure of the “dcs_debris” Database Table.
Name Description Type Size Null Default Value Indexed
index Automatically-incrementing index for each row. int 11 No None Yes
project The associated project for the debris record. Options
are DebriSat or DebrisLV.
varchar 8 No None No
debrisId The Debris ID of the debris record. varchar 8 Yes NULL Yes
revisionNumber The revision number of the debris record. int 11 No 0 Yes
currentRevision Defines whether the row is the current revision of the
debris record.
varchar 5 No true Yes
boxNumber The box number where the debris fragment was
found.
varchar 5 No None Yes
creator The Gatorlink username of the user who created the
debris record or revision.
varchar 30 No None Yes
creationTimestamp The timestamp at which the debris record or revision
was created.
datetime N/A No CURRENT_TIMESTAMP No
verified The verification status of the debris record. varchar 5 No false Yes
verifier The Gatorlink username of the user who verified the
debris record.
varchar 30 Yes NULL Yes
verificationTimestamp The timestamp at which the debris record was
verified.
datetime N/A Yes NULL No
source The source type of the debris fragment. varchar 8 No None Yes
relatedFoam Whether or not a debris record has a related foam
record.
varchar 5 No false Yes
foamId The Foam ID of the related foam record. varchar 6 Yes NULL Yes
foamType The type of related foam record. varchar 6 Yes NULL Yes
section The section of the HVI test chamber where the debris
fragment was found.
varchar 11 No None Yes
row The row of “Section 5” in the chamber where the
debris fragment was found.
varchar 5 Yes NULL Yes
area The area of “Section 5” in the chamber where the
debris fragment was found.
varchar 6 Yes NULL Yes
foamGridLocation The foam grid location where the debris fragment was
collected or extracted.
varchar 2 Yes NULL Yes
primaryMaterial The primary material of the debris fragment. varchar 7 No None Yes
secondMaterial The secondary material of the debris fragment. varchar 7 No None Yes
thirdMaterial The tertiary material of the debris fragment. varchar 7 No None Yes
shape The shape of the debris fragment. varchar 17 No None Yes
identifyingColor The color of the debris fragment. varchar 10 No None Yes
debrisType Whether the debris fragment is a 2D or 3D fragment. varchar 2 No None Yes
foamAttached Whether there is foam attached to the debris fragment
or not.
varchar 5 No false Yes
intactPart Whether the debris fragment was found to be an intact
part from DebriSat.
varchar 5 No false Yes
brokenFragment Whether the debris fragment is a broken fragment or
not.
varchar 5 No false Yes
balanceType The type of mass balance used during mass
measurement.
varchar 7 Yes NULL Yes
balanceSoftwareVersion The software version of the mass measurement system
used during mass measurement.
varchar 100 Yes NULL Yes
xyMillimeterToPixelRatio The millimeter-to-pixel ratio used to calculate the x
and y dimensions during 2D size measurement.
varchar 5 Yes NULL No
zMillimeterToPixelRatio The millimeter-to-pixel-ratio used to calculate the z
dimension during 2D size measurement.
varchar 5 Yes NULL No
voxelResolution The voxel resolution used to calculate the x, y, and z
dimensions during 3D size measurement.
decimal 4,3 Yes NULL No
mass_g The mass of the debris fragment in grams. varchar 13 Yes NULL No
temperature_C The room temperature during mass measurement in
Celsius.
decimal 5,2 Yes NULL No
humidity_% The room humidity during mass measurement in
percent.
decimal 5,2 Yes NULL No
xDimension_mm The x dimension of the debris fragment in
millimeters.
decimal 6,3 Yes NULL No
yDimension_mm The y dimension of the debris fragment in
millimeters.
decimal 6,3 Yes NULL No
zDimension_mm The z dimension of the debris fragment in millimeters. decimal 6,3 Yes NULL No
charcteristicLength_mm The calculated characteristic length of the debris
fragment in millimeters.
decimal 6,3 Yes NULL No
volume_mm^3 The calculated volume of the debris fragment in mm3. decimal 9,3 Yes NULL No
density_g/mm^2 The calculated density of the debris fragment in
g/mm2.
decimal 12,6 Yes NULL No
averageCrossSectionalArea_mm^2 The calculated average cross-sectional area of the
debris fragment in mm2.
decimal 10,3 Yes NULL No
areaToMassRatio_mm^2/g The calculated area-to-mass ratio of the debris
fragment in mm2/g.
decimal 10,3 Yes NULL No
frontlitCapture The BLOB data for the frontlit image taken during 2D
size measurement.
longblob N/A Yes NULL No
frontlitCaptureBackupPath The relative path to the disk backup of the frontlit
image taken during 2D size measurement.
varchar 100 Yes NULL No
backlitCapture The BLOB data for the backlit image taken during 2D
size measurement.
longblob N/A Yes NULL No
backlitCaptureBackupPath The relative path to the disk backup of the backlit
image taken during 2D size measurement.
varchar 100 Yes NULL No
heightDetection The BLOB data for the height detection image created
during 2D size measurement.
longblob N/A Yes NULL No
heightDetectionBackupPath The relative path to the disk backup of the height
detection image created during 2D size measurement.
varchar 100 Yes NULL No
edgeDetection The BLOB data for the edge detection image created
during 2D size measurement.
longblob N/A Yes NULL No
edgeDetectionBackupPath The relative path to the disk backup of the edge
detection image created during 2D size measurement.
varchar 100 Yes NULL No
ringCalibration The BLOB data for the ring calibration image created
during 2D size measurement.
longblob N/A Yes NULL No
ringCalibrationBackupPath The relative path to the disk backup of the ring calib.
image created during 2D size measurement.
varchar 100 Yes NULL No
debrisAnalysis The BLOB data for the debris analysis image created
during 2D size measurement.
longblob N/A Yes NULL No
debrisAnalysisBackupPath The relative path to the disk backup of the debris
analysis image created during 2D size measurement.
varchar 100 Yes NULL No
pointCloud2D The BLOB data for the 2D point cloud created during
2D size measurement.
longblob N/A Yes NULL No
pointCloud2DBackupPath The relative path to the disk backup of the 2D point
cloud created during 2D size measurement.
varchar 100 Yes NULL No
regionProperties The BLOB data for the regionprops file created during
2D size measurement.
longblob N/A Yes NULL No
regionPropertiesBackupPath The relative path to the disk backup of the
regionprops file created during 2D size measurement.
varchar 100 Yes NULL No
captureA01 The BLOB data for the image at A01 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA01BackupPath The relative path to the disk backup of the image at
A01 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB01 The BLOB data for the image at B01 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB01BackupPath The relative path to the disk backup of the image at
B01 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC01 The BLOB data for the image at C01 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC01BackupPath The relative path to the disk backup of the image at
C01 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD01 The BLOB data for the image at D01 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD01BackupPath The relative path to the disk backup of the image at
D01 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE01 The BLOB data for the image at E01 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE01BackupPath The relative path to the disk backup of the image at
E01 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF01 The BLOB data for the image at F01 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF01BackupPath The relative path to the disk backup of the image at
F01 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA02 The BLOB data for the image at A02 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA02BackupPath The relative path to the disk backup of the image at
A02 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB02 The BLOB data for the image at B02 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB02BackupPath The relative path to the disk backup of the image at
B02 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC02 The BLOB data for the image at C02 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC02BackupPath The relative path to the disk backup of the image at
C02 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD02 The BLOB data for the image at D02 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD02BackupPath The relative path to the disk backup of the image at
D02 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE02 The BLOB data for the image at E02 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE02BackupPath The relative path to the disk backup of the image at
E02 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF02 The BLOB data for the image at F02 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF02BackupPath The relative path to the disk backup of the image at
F02 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA03 The BLOB data for the image at A03 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA03BackupPath The relative path to the disk backup of the image at
A03 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB03 The BLOB data for the image at B03 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB03BackupPath The relative path to the disk backup of the image at
B03 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC03 The BLOB data for the image at C03 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC03BackupPath The relative path to the disk backup of the image at
C03 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD03 The BLOB data for the image at D03 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD03BackupPath The relative path to the disk backup of the image at
D03 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE03 The BLOB data for the image at E03 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE03BackupPath The relative path to the disk backup of the image at
E03 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF03 The BLOB data for the image at F03 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF03BackupPath The relative path to the disk backup of the image at
F03 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA04 The BLOB data for the image at A04 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA04BackupPath The relative path to the disk backup of the image at
A04 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB04 The BLOB data for the image at B04 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB04BackupPath The relative path to the disk backup of the image at
B04 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC04 The BLOB data for the image at C04 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC04BackupPath The relative path to the disk backup of the image at
C04 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD04 The BLOB data for the image at D04 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD04BackupPath The relative path to the disk backup of the image at
D04 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE04 The BLOB data for the image at E04 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE04BackupPath The relative path to the disk backup of the image at
E04 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF04 The BLOB data for the image at F04 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF04BackupPath The relative path to the disk backup of the image at
F04 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA05 The BLOB data for the image at A05 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA05BackupPath The relative path to the disk backup of the image at
A05 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB05 The BLOB data for the image at B05 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB05BackupPath The relative path to the disk backup of the image at
B05 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC05 The BLOB data for the image at C05 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC05BackupPath The relative path to the disk backup of the image at
C05 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD05 The BLOB data for the image at D05 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD05BackupPath The relative path to the disk backup of the image at
D05 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE05 The BLOB data for the image at E05 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE05BackupPath The relative path to the disk backup of the image at
E05 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF05 The BLOB data for the image at F05 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF05BackupPath The relative path to the disk backup of the image at
F05 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA06 The BLOB data for the image at A06 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA06BackupPath The relative path to the disk backup of the image at
A06 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB06 The BLOB data for the image at B06 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB06BackupPath The relative path to the disk backup of the image at
B06 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC06 The BLOB data for the image at C06 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC06BackupPath The relative path to the disk backup of the image at
C06 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD06 The BLOB data for the image at D06 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD06BackupPath The relative path to the disk backup of the image at
D06 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE06 The BLOB data for the image at E06 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE06BackupPath The relative path to the disk backup of the image at
E06 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF06 The BLOB data for the image at F06 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF06BackupPath The relative path to the disk backup of the image at
F06 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA07 The BLOB data for the image at A07 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA07BackupPath The relative path to the disk backup of the image at
A07 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB07 The BLOB data for the image at B07 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB07BackupPath The relative path to the disk backup of the image at
B07 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC07 The BLOB data for the image at C07 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC07BackupPath The relative path to the disk backup of the image at
C07 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD07 The BLOB data for the image at D07 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD07BackupPath The relative path to the disk backup of the image at
D07 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE07 The BLOB data for the image at E07 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE07BackupPath The relative path to the disk backup of the image at
E07 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF07 The BLOB data for the image at F07 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF07BackupPath The relative path to the disk backup of the image at
F07 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA08 The BLOB data for the image at A08 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA08BackupPath The relative path to the disk backup of the image at
A08 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB08 The BLOB data for the image at B08 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB08BackupPath The relative path to the disk backup of the image at
B08 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC08 The BLOB data for the image at C08 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC08BackupPath The relative path to the disk backup of the image at
C08 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD08 The BLOB data for the image at D08 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD08BackupPath The relative path to the disk backup of the image at
D08 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE08 The BLOB data for the image at E08 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE08BackupPath The relative path to the disk backup of the image at
E08 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF08 The BLOB data for the image at F08 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF08BackupPath The relative path to the disk backup of the image at
F08 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA09 The BLOB data for the image at A09 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA09BackupPath The relative path to the disk backup of the image at
A09 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB09 The BLOB data for the image at B09 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB09BackupPath The relative path to the disk backup of the image at
B09 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC09 The BLOB data for the image at C09 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC09BackupPath The relative path to the disk backup of the image at
C09 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD09 The BLOB data for the image at D09 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD09BackupPath The relative path to the disk backup of the image at
D09 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE09 The BLOB data for the image at E09 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE09BackupPath The relative path to the disk backup of the image at
E09 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF09 The BLOB data for the image at F09 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF09BackupPath The relative path to the disk backup of the image at
F09 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA10 The BLOB data for the image at A10 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA10BackupPath The relative path to the disk backup of the image at
A10 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB10 The BLOB data for the image at B10 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB10BackupPath The relative path to the disk backup of the image at
B10 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC10 The BLOB data for the image at C10 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC10BackupPath The relative path to the disk backup of the image at
C10 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD10 The BLOB data for the image at D10 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD10BackupPath The relative path to the disk backup of the image at
D10 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE10 The BLOB data for the image at E10 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE10BackupPath The relative path to the disk backup of the image at
E10 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF10 The BLOB data for the image at F10 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF10BackupPath The relative path to the disk backup of the image at
F10 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA11 The BLOB data for the image at A11 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA11BackupPath The relative path to the disk backup of the image at
A11 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB11 The BLOB data for the image at B11 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB11BackupPath The relative path to the disk backup of the image at
B11 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC11 The BLOB data for the image at C11 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC11BackupPath The relative path to the disk backup of the image at
C11 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD11 The BLOB data for the image at D11 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD11BackupPath The relative path to the disk backup of the image at
D11 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE11 The BLOB data for the image at E11 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE11BackupPath The relative path to the disk backup of the image at
E11 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF11 The BLOB data for the image at F11 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF11BackupPath The relative path to the disk backup of the image at
F11 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA12 The BLOB data for the image at A12 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA12BackupPath The relative path to the disk backup of the image at
A12 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB12 The BLOB data for the image at B12 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB12BackupPath The relative path to the disk backup of the image at
B12 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC12 The BLOB data for the image at C12 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC12BackupPath The relative path to the disk backup of the image at
C12 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD12 The BLOB data for the image at D12 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD12BackupPath The relative path to the disk backup of the image at
D12 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE12 The BLOB data for the image at E12 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE12BackupPath The relative path to the disk backup of the image at
E12 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF12 The BLOB data for the image at F12 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF12BackupPath The relative path to the disk backup of the image at
F12 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA13 The BLOB data for the image at A13 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA13BackupPath The relative path to the disk backup of the image at
A13 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB13 The BLOB data for the image at B13 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB13BackupPath The relative path to the disk backup of the image at
B13 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC13 The BLOB data for the image at C13 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC13BackupPath The relative path to the disk backup of the image at
C13 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD13 The BLOB data for the image at D13 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD13BackupPath The relative path to the disk backup of the image at
D13 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE13 The BLOB data for the image at E13 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE13BackupPath The relative path to the disk backup of the image at
E13 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF13 The BLOB data for the image at F13 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF13BackupPath The relative path to the disk backup of the image at
F13 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA14 The BLOB data for the image at A14 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA14BackupPath The relative path to the disk backup of the image at
A14 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB14 The BLOB data for the image at B14 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB14BackupPath The relative path to the disk backup of the image at
B14 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC14 The BLOB data for the image at C14 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC14BackupPath The relative path to the disk backup of the image at
C14 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD14 The BLOB data for the image at D14 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD14BackupPath The relative path to the disk backup of the image at
D14 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE14 The BLOB data for the image at E14 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE14BackupPath The relative path to the disk backup of the image at
E14 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF14 The BLOB data for the image at F14 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF14BackupPath The relative path to the disk backup of the image at
F14 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA15 The BLOB data for the image at A15 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA15BackupPath The relative path to the disk backup of the image at
A15 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB15 The BLOB data for the image at B15 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB15BackupPath The relative path to the disk backup of the image at
B15 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC15 The BLOB data for the image at C15 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC15BackupPath The relative path to the disk backup of the image at
C15 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD15 The BLOB data for the image at D15 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD15BackupPath The relative path to the disk backup of the image at
D15 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE15 The BLOB data for the image at E15 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE15BackupPath The relative path to the disk backup of the image at
E15 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF15 The BLOB data for the image at F15 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF15BackupPath The relative path to the disk backup of the image at
F15 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA16 The BLOB data for the image at A16 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA16BackupPath The relative path to the disk backup of the image at
A16 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB16 The BLOB data for the image at B16 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB16BackupPath The relative path to the disk backup of the image at
B16 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC16 The BLOB data for the image at C16 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC16BackupPath The relative path to the disk backup of the image at
C16 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD16 The BLOB data for the image at D16 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD16BackupPath The relative path to the disk backup of the image at
D16 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE16 The BLOB data for the image at E16 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE16BackupPath The relative path to the disk backup of the image at
E16 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF16 The BLOB data for the image at F16 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF16BackupPath The relative path to the disk backup of the image at
F16 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA17 The BLOB data for the image at A17 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA17BackupPath The relative path to the disk backup of the image at
A17 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB17 The BLOB data for the image at B17 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB17BackupPath The relative path to the disk backup of the image at
B17 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC17 The BLOB data for the image at C17 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC17BackupPath The relative path to the disk backup of the image at
C17 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD17 The BLOB data for the image at D17 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD17BackupPath The relative path to the disk backup of the image at
D17 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE17 The BLOB data for the image at E17 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE17BackupPath The relative path to the disk backup of the image at
E17 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF17 The BLOB data for the image at F17 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF17BackupPath The relative path to the disk backup of the image at
F17 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA18 The BLOB data for the image at A18 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA18BackupPath The relative path to the disk backup of the image at
A18 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB18 The BLOB data for the image at B18 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB18BackupPath The relative path to the disk backup of the image at
B18 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC18 The BLOB data for the image at C18 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC18BackupPath The relative path to the disk backup of the image at
C18 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD18 The BLOB data for the image at D18 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD18BackupPath The relative path to the disk backup of the image at
D18 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE18 The BLOB data for the image at E18 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE18BackupPath The relative path to the disk backup of the image at
E18 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF18 The BLOB data for the image at F18 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF18BackupPath The relative path to the disk backup of the image at
F18 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA19 The BLOB data for the image at A19 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA19BackupPath The relative path to the disk backup of the image at
A19 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB19 The BLOB data for the image at B19 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB19BackupPath The relative path to the disk backup of the image at
B19 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC19 The BLOB data for the image at C19 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC19BackupPath The relative path to the disk backup of the image at
C19 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD19 The BLOB data for the image at D19 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD19BackupPath The relative path to the disk backup of the image at
D19 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE19 The BLOB data for the image at E19 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE19BackupPath The relative path to the disk backup of the image at
E19 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF19 The BLOB data for the image at F19 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF19BackupPath The relative path to the disk backup of the image at
F19 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA20 The BLOB data for the image at A20 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA20BackupPath The relative path to the disk backup of the image at
A20 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB20 The BLOB data for the image at B20 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB20BackupPath The relative path to the disk backup of the image at
B20 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC20 The BLOB data for the image at C20 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC20BackupPath The relative path to the disk backup of the image at
C20 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD20 The BLOB data for the image at D20 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD20BackupPath The relative path to the disk backup of the image at
D20 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE20 The BLOB data for the image at E20 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE20BackupPath The relative path to the disk backup of the image at
E20 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF20 The BLOB data for the image at F20 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF20BackupPath The relative path to the disk backup of the image at
F20 taken during 3D size measurement.
varchar 100 Yes NULL No
captureA21 The BLOB data for the image at A21 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureA21BackupPath The relative path to the disk backup of the image at
A21 taken during 3D size measurement.
varchar 100 Yes NULL No
captureB21 The BLOB data for the image at B21 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureB21BackupPath The relative path to the disk backup of the image at
B21 taken during 3D size measurement.
varchar 100 Yes NULL No
captureC21 The BLOB data for the image at C21 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureC21BackupPath The relative path to the disk backup of the image at
C21 taken during 3D size measurement.
varchar 100 Yes NULL No
captureD21 The BLOB data for the image at D21 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureD21BackupPath The relative path to the disk backup of the image at
D21 taken during 3D size measurement.
varchar 100 Yes NULL No
captureE21 The BLOB data for the image at E21 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureE21BackupPath The relative path to the disk backup of the image at
E21 taken during 3D size measurement.
varchar 100 Yes NULL No
captureF21 The BLOB data for the image at F21 taken during 3D
size measurement.
longblob N/A Yes NULL No
captureF21BackupPath The relative path to the disk backup of the image at
F21 taken during 3D size measurement.
varchar 100 Yes NULL No
xyPlane The BLOB data for the x-y plane image created
during 3D size measurement.
longblob N/A Yes NULL No
xyPlaneBackupPath The relative path to the disk backup of the x-y plane
image created during 3D size measurement.
varchar 100 Yes NULL No
xzPlane The BLOB data for the x-z plane image created
during 3D size measurement.
longblob N/A Yes NULL No
xzPlaneBackupPath The relative path to the disk backup of the x-z plane
image created during 3D size measurement.
varchar 100 Yes NULL No
yzPlane The BLOB data for the y-z plane image created
during 3D size measurement.
longblob N/A Yes NULL No
yzPlaneBackupPath The relative path to the disk backup of the y-z plane
image created during 3D size measurement.
varchar 100 Yes NULL No
pointCloud3D The BLOB data for the 3D point cloud created during
3D size measurement.
longblob N/A Yes NULL No
pointCloud3DBackupPath The relative path to the disk backup of the 3D point
cloud created during 3D size measurement.
varchar 100 Yes NULL No
brokenCapture The BLOB data for the broken capture taken of
broken fragments.
longblob N/A Yes NULL No
brokenCaptureBackupPath The relative path to the disk backup of the broken
capture taken of broken fragments.
varchar 100 Yes NULL No
comments User comments on a debris record. varchar 500 Yes NULL No
verifierComments Comments from verifiers during verification. varchar 500 Yes NULL No
uniqueId A unique ID used during row insertion. varchar 36 No None Yes
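Every file-bearing column in Table A-3 follows the same pattern: a longblob column that stores the file itself in the database, paired with a varchar(100) column that records the relative path of the file's disk backup. A minimal MySQL sketch of this pattern is shown below; the table and key names are illustrative assumptions only, while the column names, types, sizes, and nullability are taken from Table A-3.

CREATE TABLE dcs_debris_sketch (                             -- illustrative name, not the actual table name
  frontlitCapture            LONGBLOB,                       -- frontlit image stored in the database
  frontlitCaptureBackupPath  VARCHAR(100) DEFAULT NULL,      -- relative path to its disk backup
  backlitCapture             LONGBLOB,                       -- backlit image stored in the database
  backlitCaptureBackupPath   VARCHAR(100) DEFAULT NULL,      -- relative path to its disk backup
  uniqueId                   VARCHAR(36)  NOT NULL,          -- unique ID used during row insertion
  KEY idx_uniqueId (uniqueId)                                -- uniqueId is indexed per Table A-3
);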
Table A-4. Column Structure of the “dcs_foam” Database Table. Name Description Type Size Null Default Value Indexed
index Automatically-incrementing index for each row. int 11 No None Yes
project The project related to the foam record. Options
are DebriSat or DebrisLV.
varchar 8 No None Yes
foamId The Foam ID of the foam record. varchar 6 Yes NULL Yes
revisionNumber The revision number of the foam record. int 11 No 0 Yes
currentRevision Whether the row is the current revision of the
foam record.
varchar 5 No true Yes
boxNumber The box number where the foam was found. varchar 5 No None Yes
creator The Gatorlink username of the user who created
the foam record or revision.
varchar 30 No None Yes
creationTimestamp The timestamp at which the foam record or
revision was created.
datetime N/A No CURRENT_TIMESTAMP No
verified The verification status of the foam record. varchar 5 No false Yes
verifier The Gatorlink username of the user who verified
the foam record.
varchar 30 Yes NULL Yes
verificationTimestamp The timestamp at which the foam record was
verified.
datetime N/A Yes NULL No
foamType The type of the foam. Options are panel, medium
chunk, large chunk, or pillar.
varchar 12 No None Yes
section The section of the test chamber where the foam was
found.
varchar 11 No None Yes
row The row of “Section 5” where the foam was found. varchar 5 Yes NULL Yes
area The area of “Section 5” where the foam was found. varchar 6 Yes NULL Yes
foamPanelIdAvailable Whether the foam has a visible Panel ID. varchar 5 No false Yes
foamPanelId The foam Panel ID. varchar 8 Yes NULL Yes
color The color of the foam. varchar 11 No None Yes
pattern The pattern printed on the foam. varchar 18 No None Yes
foamDensity The density of the foam. varchar 6 No None Yes
topPicture The BLOB data for the top picture taken of the foam. longblob N/A Yes NULL No
topPictureBackupPath The relative path to the disk backup of the top picture
of the foam.
varchar 100 Yes NULL No
sidePicture The BLOB data for the side picture taken of the foam. longblob N/A Yes NULL No
sidePictureBackupPath The relative path to the disk backup of the side picture
of the foam.
varchar 100 Yes NULL No
bottomPicture The BLOB data for the bottom picture taken of the
foam.
longblob N/A Yes NULL No
bottomPictureBackupPath The relative path to the disk backup of the bottom
picture of the foam.
varchar 100 Yes NULL No
numberOfRawXrayImages The number of uploaded raw X-ray images. int 11 Yes NULL No
rawXrayImage1 The BLOB data for the raw X-ray image 1 taken
during X-ray imaging.
longblob N/A Yes NULL No
rawXrayImage1BackupPath The relative path to the disk backup of the raw X-ray
image 1 taken during X-ray imaging.
varchar 100 Yes NULL No
rawXrayImage2 The BLOB data for the raw X-ray image 2 taken
during X-ray imaging.
longblob N/A Yes NULL No
rawXrayImage2BackupPath The relative path to the disk backup of the raw X-ray
image 2 taken during X-ray imaging.
varchar 100 Yes NULL No
rawXrayImage3 The BLOB data for the raw X-ray image 3 taken
during X-ray imaging.
longblob N/A Yes NULL No
rawXrayImage3BackupPath The relative path to the disk backup of the raw X-ray
image 3 taken during X-ray imaging.
varchar 100 Yes NULL No
rawXrayImage4 The BLOB data for the raw X-ray image 4 taken
during X-ray imaging.
longblob N/A Yes NULL No
rawXrayImage4BackupPath The relative path to the disk backup of the raw X-ray
image 4 taken during X-ray imaging.
varchar 100 Yes NULL No
rawXrayImage5 The BLOB data for the raw X-ray image 5 taken
during X-ray imaging.
longblob N/A Yes NULL No
rawXrayImage5BackupPath The relative path to the disk backup of the raw X-ray
image 5 taken during X-ray imaging.
varchar 100 Yes NULL No
rawXrayImage6 The BLOB data for the raw X-ray image 6 taken
during X-ray imaging.
longblob N/A Yes NULL No
rawXrayImage6BackupPath The relative path to the disk backup of the raw X-ray
image 6 taken during X-ray imaging.
varchar 100 Yes NULL No
rawXrayImage7 The BLOB data for the raw X-ray image 7 taken
during X-ray imaging.
longblob N/A Yes NULL No
rawXrayImage7BackupPath The relative path to the disk backup of the raw X-ray
image 7 taken during X-ray imaging.
varchar 100 Yes NULL No
rawXrayImage8 The BLOB data for the raw X-ray image 8 taken
during X-ray imaging.
longblob N/A Yes NULL No
rawXrayImage8BackupPath The relative path to the disk backup of the raw X-ray
image 8 taken during X-ray imaging.
varchar 100 Yes NULL No
rawXrayImage9 The BLOB data for the raw X-ray image 9 taken
during X-ray imaging.
longblob N/A Yes NULL No
rawXrayImage9BackupPath The relative path to the disk backup of the raw X-ray
image 9 taken during X-ray imaging.
varchar 100 Yes NULL No
rawXrayImage10 The BLOB data for the raw X-ray image 10 taken
during X-ray imaging.
longblob N/A Yes NULL No
rawXrayImage10BackupPath The relative path to the disk backup of the raw X-ray
image 10 taken during X-ray imaging.
varchar 100 Yes NULL No
rawXrayImage11 The BLOB data for the raw X-ray image 11 taken
during X-ray imaging.
longblob N/A Yes NULL No
rawXrayImage11BackupPath The relative path to the disk backup of the raw X-ray
image 11 taken during X-ray imaging.
varchar 100 Yes NULL No
rawXrayImage12 The BLOB data for the raw X-ray image 12 taken
during X-ray imaging.
longblob N/A Yes NULL No
rawXrayImage12BackupPath The relative path to the disk backup of the raw X-ray
image 12 taken during X-ray imaging.
varchar 100 Yes NULL No
xrayProcessingSoftwareVersion The software version of the X-ray imaging system used
during X-ray imaging.
varchar 100 Yes NULL No
stitchedRawXrayImage The BLOB data for the stitched raw X-ray image
created during X-ray processing.
longblob N/A Yes NULL No
stitchedRawXrayImageBackupPath The relative path to the disk backup of the stitched raw
X-ray image created during X-ray processing.
varchar 100 Yes NULL No
binaryXrayImage The BLOB data of the binary X-ray image created
during X-ray processing.
longblob N/A Yes NULL No
binaryXrayImageBackupPath The relative path to the disk backup of the binary X-ray
image created during X-ray processing.
varchar 100 Yes NULL No
processedRawXrayImage The BLOB data of the processed raw X-ray image
created during X-ray processing.
longblob N/A Yes NULL No
processedRawXrayImageBackupPath The relative path to the disk backup of the processed
raw X-ray image created during X-ray processing.
varchar 100 Yes NULL No
processedBinaryXrayImage The BLOB data of the processed binary X-ray image
created during X-ray processing.
longblob N/A Yes NULL No
processedBinaryXrayImageBackupPath The relative path to the disk backup of the processed
binary X-ray image created during X-ray processing.
varchar 100 Yes NULL No
processedXrayData The BLOB data of the processed X-ray data produced
by the X-ray processing system.
longblob N/A Yes NULL No
processedXrayDataBackupPath The relative path to the disk backup of the processed
X-ray data produced by the X-ray processing system.
varchar 100 Yes NULL No
comments User comments on the foam record. varchar 500 Yes NULL No
verifierComments Verifier comments created during verification. varchar 500 Yes NULL No
uniqueId A unique ID used during foam record insertion. varchar 36 No None Yes
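The header and revision-tracking columns of Table A-4 translate directly into MySQL DDL. The sketch below is illustrative only and abridged: the image, X-ray, and comment columns are omitted, and the primary-key and secondary-key declarations are assumptions, while the column names, types, sizes, nullability, and default values follow Table A-4. A DATETIME default of CURRENT_TIMESTAMP requires MySQL 5.6.5 or later.

CREATE TABLE dcs_foam (
  `index`            INT(11)     NOT NULL AUTO_INCREMENT,              -- automatically-incrementing row index
  project            VARCHAR(8)  NOT NULL,                             -- DebriSat or DebrisLV
  foamId             VARCHAR(6)  DEFAULT NULL,                         -- Foam ID of the record
  revisionNumber     INT(11)     NOT NULL DEFAULT 0,                   -- revision number of the record
  currentRevision    VARCHAR(5)  NOT NULL DEFAULT 'true',              -- whether this row is the current revision
  boxNumber          VARCHAR(5)  NOT NULL,                             -- box number where the foam was found
  creator            VARCHAR(30) NOT NULL,                             -- Gatorlink username of the record creator
  creationTimestamp  DATETIME    NOT NULL DEFAULT CURRENT_TIMESTAMP,   -- when the record or revision was created
  verified           VARCHAR(5)  NOT NULL DEFAULT 'false',             -- verification status of the record
  uniqueId           VARCHAR(36) NOT NULL,                             -- unique ID used during record insertion
  PRIMARY KEY (`index`),
  KEY idx_foamId (foamId),
  KEY idx_uniqueId (uniqueId)
);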
Table A-5. Column Structure of the “dcs_users” Database Table.
Name Description Type Size Null Default Value Indexed
index Automatically-incrementing index for each row. int 11 No None Yes
ufid The UFID of the user. varchar 8 No None No
name The full name of the user. varchar 100 No None No
gatorlinkUsername The Gatorlink username of the user. varchar 30 No None No
isAdministrator Whether the user is an administrator or not. varchar 5 No false No
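Because Table A-5 is short, its full column structure can be written out as MySQL DDL. This is a sketch only: the primary-key declaration is an assumption (Table A-5 marks only the index column as indexed), while the column names, types, sizes, nullability, and defaults follow the table above.

CREATE TABLE dcs_users (
  `index`            INT(11)      NOT NULL AUTO_INCREMENT,   -- automatically-incrementing row index
  ufid               VARCHAR(8)   NOT NULL,                  -- the UFID of the user
  name               VARCHAR(100) NOT NULL,                  -- the full name of the user
  gatorlinkUsername  VARCHAR(30)  NOT NULL,                  -- the Gatorlink username of the user
  isAdministrator    VARCHAR(5)   NOT NULL DEFAULT 'false',  -- whether the user is an administrator
  PRIMARY KEY (`index`)
);

A typical lookup against such a table would filter on gatorlinkUsername to retrieve the user's isAdministrator flag.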
LIST OF REFERENCES
[1] J. Liou and D. Shoots, “Fifty Years Ago,” Orbital Debris Quarterly News, vol. 17, no. 3,
pp. 2-3, Jul. 2013.
[2] D. Whitlock, E. Stansbery, and N. Johnson, “History of On-Orbit Satellite Fragmentations,
14th Edition,” National Aeronautics and Space Administration, Houston, TX, USA, Rep.
NASA/TM-2008-214779, 2008.
[3] N. Johnson, E. Stansbery, J. Liou, M. Horstman, C. Stokely, and D. Whitlock, “The
Characteristics and Consequences of the Break-Up of the Fengyun-1C Spacecraft,”
presented at the 58th International Astronautical Congress (IAC), Hyderabad, India, Sep.
24-28, 2007.
[4] T. Kelso, "Analysis of the Iridium 33-Cosmos 2251 Collision," presented at the 10th
Advanced Maui Optical and Space Surveillance Technologies Conference, Maui, HI, USA,
Sep. 2, 2009.
[5] P. Anz-Meador and D. Shoots, “Monthly Number of Objects in Earth Orbit by Object
Type,” Orbital Debris Quarterly News, vol. 22, no. 1, p. 10, Feb. 2018.
[6] D. Kessler and B. Cour-Palais, "Collision Frequency of Artificial Satellites: The
Creation of a Debris Belt," Journal of Geophysical Research: Space Physics, vol. 83, no.
A6, pp. 2637-2646, Jun. 1978.
[7] D. Kessler, N. Johnson, J. Liou, and M. Matney, “The Kessler Syndrome: Implications to
Future Space Operations,” presented at the 33rd Annual American Astronautical Society
(AAS) Guidance and Control Conference, Breckenridge, CO, USA, Feb. 6-10, 2010.
[8] C. Stokely, J. Foster, E. Stansbery, J. Benbrook, and Q. Juarez, "Haystack and HAX Radar
Measurements of the Orbital Debris Environment; 2003," National Aeronautics and Space
Administration, Houston, TX, USA, Rep. NASA/JSC-62815, 2006.
[9] J. Hyde, “As Flown Shuttle Orbiter Meteoroid/Orbital Debris Assessment, Phase I–
Shuttle/Mir Missions: STS 71, 76, 79, 81, 84, 86, 89 & 91,” National Aeronautics and
Space Administration, Houston, TX, USA, Rep. NASA/JSC-28768, 2000.
[10] J. Hyde, "As-Flown Shuttle Orbiter Meteoroid/Orbital Debris Assessment, Phase II,"
National Aeronautics and Space Administration, Houston, TX, USA, Rep. NASA/JSC-
29070, 2000.
[11] P. Krisko, M. Horstman, and M. L. Fudge, "SOCIT4 Collisional-Breakup Test Data
Analysis: With Shape and Materials Characterization," Advances in Space Research, vol.
41, no. 7, pp. 1138-1146, Oct. 2007.
[12] J. Edwards, "Range Safety Considerations Related to Atlas Tank Fragmentation for the
Penetration Aids Program for PMR," General Dynamics/Astronautics, West Falls Church,
VA, USA, Rep. GD/ASJ-0048 18, 1963.
[13] T. Bess, "Mass Distribution of Orbiting Man-Made Space Debris," National Aeronautics
and Space Administration, Washington, DC, USA, Rep. NASA/TN D-8108, 1975.
[14] W. Fucke and H. Sdunnus, "Population Model of Small Size Space Debris," Battelle-
Institut, Frankfurt, Germany, Rep. 9266/90/D/MD, pp. 67-698, 1993.
[15] R. Reynolds, A. Bade, P. Eichler, A. Jackson, P. Krisko, M. Matney, D. Kessler, and P.
Anz-Meador, "NASA Standard Breakup Model 1998 Revision." Lockheed Martin Space
Operations, Houston, TX, USA, Rep. LMSMSS-32532, 1998.
[16] T. Hanada, "Developing a Low-Velocity Collision Model Based on the NASA Standard
Breakup Model," Space Debris, vol. 2, no. 4, pp. 233-247, Jun. 2003.
[17] J. Liou and P. Anz-Meador, "An Analysis of Recent Major Breakups in the Low Earth
Orbit Region," National Aeronautics and Space Administration, Houston, TX, USA, Rep.
NASA/JSC-CN-19713, 2010.
[18] J. Liou, S. Clark, N. Fitz-Coy, T. Huynh, J. Opiela, M. Polk, B. Roebuck, R. Rushing, M.
Sorge, and M. Werremeyer, "DebriSat-A Planned Laboratory-Based Satellite Impact
Experiment for Breakup Fragment Characterizations," presented at the 6th European
Conference on Space Debris, Darmstadt, Germany, Apr. 22-25, 2013.
[19] E. Ausay, A. Cornejo, A. Horn, K. Palma, T. Sato, B. Blake, and F. Pistella, "A Comparison
of the SOCIT and DebriSat Experiments," presented at the 7th European Conference on
Space Debris, Darmstadt, Germany, Apr. 18-21, 2017.
[20] N. Jatana, S. Puri, M. Ahuja, I. Kathuria, and D. Gosain, "A Survey and Comparison of
Relational and Non-Relational Databases," International Journal of Engineering Research
& Technology, vol. 1, no. 6, 2012.
[21] E. Codd, "A Relational Model of Data for Large Shared Data Banks," Communications of
the ACM, vol. 13, no. 6, pp. 377-387, 1970.
[22] N. Leavitt, "Will NoSQL Databases Live Up to Their Promise?," Computer, vol. 43, no. 2,
pp. 12-14, 2010.
[23] J. Serra, “Relational Databases vs. Non-Relational Databases,” presented at the 20th Annual
Enterprise Data World (EDW) Conference, San Diego, CA, USA, Apr. 17-22, 2016.
[24] V. Ogle and M. Stonebraker, "Chabot: Retrieval from a Relational Database of Images,"
Computer, vol. 28, no. 9, pp. 40-48, 1995.
[25] N. Deshpande, K. Addess, W. Bluhm, J. Merino-Ott, W. Townsend-Merino, Q. Zhang, and
C. Knezevich, "The RCSB Protein Data Bank: A Redesigned Query System and Relational
Database Based on the mmCIF Schema," Nucleic Acids Research, vol. 33, no. 1, pp. D232-
D237, 2005.
[26] A. Bairoch and R. Apweiler, “The SWISS-PROT Protein Sequence Database and its
Supplement TrEMBL in 2000,” Nucleic Acids Research, vol. 28, no. 1, pp. 45-48, 2000.
[27] D. Comer, "Ubiquitous B-Tree," ACM Computing Surveys (CSUR), vol. 11, no. 2, pp. 121-
137, 1979.
[28] R. Sears, C. Van Ingen, and J. Gray, "To BLOB or Not to BLOB: Large Object Storage in
a Database or a Filesystem?," Microsoft Corporation, Redmond, WA, USA, Rep. MSR-
TR-2006-45, 2007.
[29] D. Beaver, S. Kumar, H. Li, J. Sobel, and P. Vajgel, "Finding a Needle in Haystack:
Facebook's Photo Storage," OSDI, vol. 10, pp. 1-8, 2010.
[30] S. Muralidhar, W. Lloyd, S. Roy, C. Hill, E. Lin, W. Liu, and S. Pan, "f4: Facebook’s
Warm Blob Storage System," presented at the 11th USENIX Conference on Operating
Systems Design and Implementation, Broomfield, CO, USA, Oct. 6-8, 2014.
[31] A. Bigian, "Blobstore: Twitter’s In-House Photo Storage System," Twitter, Inc., San
Francisco, CA, USA, 2012.
[32] M. Rivero, J. Kleespies, K. Patankar, N. Fitz-Coy, J-C. Liou, M. Sorge, T. Huynh, J.
Opiela, P. Krisko, and H. Cowardin, "Characterization of Debris from the DebriSat
Hypervelocity Test," presented at the 66th International Astronautical Congress (IAC),
Jerusalem, Israel, Oct. 12-16, 2015.
[33] M. Rivero, B. Shiotani, M. Carrasquilla, N. Fitz-Coy, J. Liou, M. Sorge, T. Huynh, J.
Opiela, P. Krisko, and H. Cowardin, "DebriSat Fragment Characterization System and
Processing Status," presented at the 67th International Astronautical Congress (IAC),
Guadalajara, Mexico, Sep. 26-30, 2016.
[34] B. Shiotani, M. Rivero, M. Carrasquilla, S. Allen, N. Fitz-Coy, J. Liou, and T. Huynh,
"Characterizing DebriSat Fragments: So Many Fragments, So Much Data, and So Little
Time," presented at the 68th International Astronautical Congress (IAC), Adelaide,
Australia, Sep. 25-29, 2017.
[35] M. Moraguez, K. Patankar, N. Fitz-Coy, J. Liou, M. Sorge, H. Cowardin, J. Opiela, and P.
Krisko, "An Imaging System for Automated Characteristic Length Measurement of
DebriSat Fragments," presented at the 66th International Astronautical Congress (IAC),
Jerusalem, Israel, Oct. 12-16, 2015.
[36] B. Shiotani, T. Scruggs, R. Toledo, N. Fitz-Coy, J. Liou, M. Sorge, T. Huynh, J. Opiela, P.
Krisko, and H. Cowardin, "Imaging Systems for Size Measurements of DebriSat
Fragments," presented at the 68th International Astronautical Congress (IAC), Adelaide,
Australia, Sep. 25-29, 2017.
[37] T. Scruggs, “Average Cross-Sectional Area Calculation of DebriSat Fragments,” M.S.
thesis, Department of Electrical and Computer Engineering, University of Florida,
Gainesville, FL, 2017.
BIOGRAPHICAL SKETCH
Joe Kleespies grew up in Florida, where he was able to experience the wonder of the Space
Coast and shuttle launches from his backyard. In 2012, Joe began his undergraduate studies at the
University of Florida. Because of his interest in space, technology, and electronics, Joe chose to
pursue electrical engineering as his major. During his undergraduate studies, Joe made the Dean’s
List every semester, was a forward on the UF club hockey team, and was a member of the Small
Satellite Design Club. In April 2014, Joe was invited to participate in the DebriSat HVI test through
the Small Satellite Design Club. After attending the HVI test and helping clean up the test chamber,
Joe became involved in the Space Systems Group, where he continued his work with DebriSat
and went on to lead several other design projects, such as SwampSat II and SABRE.
In May 2016, Joe graduated with his Bachelor of Science in electrical and computer
engineering from the University of Florida. In the fall of 2016, he returned to the University of
Florida to pursue a Master of Science in electrical and computer engineering. Joe is the project
lead for the SwampSat II mission, a CubeSat project that aims to characterize very low frequency
waves in the upper ionosphere, scheduled to launch in mid-2019. Joe has interned for Siemens
Industry, Inc. and Analytical Graphics, Inc., and he plans to take a position as a systems engineer
this summer.