
Page 1: Situational Awareness and Visual Analytics for Emergency Response and Training (rmaciejewski.faculty.asu.edu/papers/2008/HST_MUTC.pdf)

Situational Awareness and Visual Analytics for Emergency Response and Training

Ross Maciejewski, SungYe Kim, Deen King-Smith, Karl Ostmo, Nicholas Klosterman, Aravind K. Mikkilineni, David S. Ebert, Edward J. Delp, Timothy F. Collins

Purdue University Regional Visualization and Analytics Center, Purdue University

West Lafayette, IN 47906
Email: rmacieje|inside|dkingsmi|kostmo|nkloster|amikkili|ebertd|ace|[email protected]

Abstract—Many emergency response units are currently faced with restrictive budgets which prohibit their use of technology both in training and in real-world situations. Our work focuses on creating an affordable, mobile, state-of-the-art emergency response test-bed through the integration of low-cost, commercially available products. We have developed a command, control, communications, surveillance and reconnaissance system that will allow small-unit exercises to be tracked and recorded for evaluation purposes. Our system can be used for military and first responder training, providing the nexus for decision making through the use of computational models, advanced technology, situational awareness and command and control. During a training session, data is streamed back to a central repository, allowing commanders to evaluate their squads in a live action setting and assess their effectiveness in an after-action review. In order to effectively analyze this data, an interactive visualization system has been designed in which commanders can track personnel movement, view surveillance feeds, listen to radio traffic, and fast-forward/rewind event sequences. This system provides both 2-D and 3-D views of the environment while showing previously traveled paths, responder orientation and activity level. Both stationary and personnel-worn mobile camera video feeds may be displayed, as well as the associated radio traffic.

I. INTRODUCTION

Most large-scale emergency operation center systems and tools are designed only to coordinate support and provide basic information for first responders and decision makers during a disaster. The next generation of systems needs to be a true visual analytics environment that supports planning, prediction, detection, mitigation, monitoring, and crisis management. This comprehensive environment requires integrated visual and traditional systematic analysis of massive data, including improved strategies for exploratory visual analysis, hypothesis testing and user-specific presentation of relevant information as a basis for actionable decision making. To this end, we have developed a new system incorporating the principles of visual analytics into a command and control environment for enhanced situational awareness.

Visual analytics is defined as the science of analytical reasoning facilitated by interactive visual interfaces [1]. Our work facilitates the visual analytics framework by providing a command and control environment from which first responders may analyze real-time training scenarios and provide interactive feedback to participants. Our work integrates data from multiple sources into a seamless environment for tracking in-field personnel in an urban environment. Furthermore, this work needs to be available on displays with varying modalities to allow operatives both in the field and at the command center access to relevant data.

As such, this work employs both desktop PCs and mobile devices such as PDAs and cell phones. Images of our application running on mobile devices can be seen in Figure 1. Currently, our work has focused on emergency response training in our local testbed facility. Our testbed consists of integrated video (stationary and active) surveillance and GPS to GPS-denied tracking. Our most recent training session consisted of an active shooter in a school scenario. In this scenario, our deployed visual analytics tool was used to direct participants in the scenario, as well as provide feedback during after-action review.

Fig. 1. Our system running on mobile devices: Smartphone (left), OQO 02 device (right).

II. RELATED WORK

The primary thrust of this work is to create visual analytics tools for the enhancement of situational awareness. As such, relevant related work can be classified into two areas: visualization of sensor data and visual analytics on mobile devices.

A. Visualization of sensor data

In monitoring first responders for in-field training scenarios and real-world incidents, timely and accurate visualization of sensor data is a crucial component of situational awareness. Fan and Biagioni [2] described approaches to process and interpret data gathered by sensor networks for geographic information systems. These approaches combine database management technology, geographic information systems, web development technology, and human-computer interaction design to visualize the data gathered by wireless sensor networks. Koo et al. [3] designed software to analyze multi-sensor data for pipeline inspection of gas transmission. The information gathered by sensors is parsed and converted before it is saved in a database. Gunnarsson et al. [4] introduced a mobile augmented reality prototype for visual inspection of hidden structures on mobile devices using data collected from a mobile wireless sensor network. Pattath et al. [5] implemented an interactive visual analytic system to visualize sensor network data on PDAs.

B. Visual analytics on mobile devices

While the visualization of sensor data is a crucial component, our work specifically focuses on the presentation and analysis of data on handheld devices. Such an application is different from visual analytics on common desktop systems because of the restricted display space and computing resources of mobile devices. Furthermore, we also focus on simplified views for enhanced situational awareness. While much work on visualization on mobile devices has focused on 3D rendering, 2D graphics and visualization can be just as efficient in the case of information visualization. In fact, there are many applications that utilize the 2D capabilities of mobile devices in fields such as geographic information systems [6], [7], entertainment, education, business, and industry. Moreover, OpenVG [8] and Mobile 2D Graphics (M2G) are boosting the development of 2D applications that are scalable across any screen size.

Further work on visualization on mobile devices extends their capabilities from a visualization tool to a visual analytics tool. Sanfilippo et al. [9] introduced InfoStar [10], an adaptive visual analytics platform for mobile devices. Their work was demonstrated at Super Computing 2004 (SC2004), providing information such as maps, schedules, exhibitor lists, and visual exploration tools to conference attendees. Similarly, the work by Pattath et al. [5] provided a visual analytic tool for the visualization of network and sensor data gathered from the Purdue Ross-Ade Stadium during football games. Finally, Herrero et al. [11] presented an intrusion detection system for visual analysis of high-volume network traffic data streams on mobile devices.

III. SYSTEM DESIGN

Figure 2 shows the abstraction of our system structure. Our system integrates commodity hardware products into a real-time tracking and surveillance module for use in emergency response training. We primarily focus on the utilization of various types of datasets such as images/videos, 3D models, sensor data, and radio traffic. Wearable components in the system were chosen based on factors of weight, size and cost in order to accommodate first responders in the field.

An initial visual analytics system was designed for a command center position in which the training evaluator, or dispatcher, could track personnel in the field. This initial tool was then ported to a mobile application and tailored for the use of in-field personnel. This conversion involved determining the appropriate representation of the data for rapid, in-field cognition on a small-screen mobile device. The data created for desktop systems could not be used for mobile devices without further preprocessing because of the limits of mobile devices in terms of memory, bandwidth, and screen resolution.
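As an illustration of the kind of preprocessing this conversion requires, the sketch below thins a densely sampled agent path so that only points that visibly move the glyph on a small screen are kept. The function name, distance-based strategy, and parameters are our own assumptions for illustration; the paper does not specify its preprocessing algorithm.

```python
import math

def thin_path(points, min_pixel_dist, scale):
    """Keep only points that move at least min_pixel_dist on screen.

    points:         list of (x, y) world coordinates
    min_pixel_dist: minimum on-screen movement worth drawing, in pixels
    scale:          world units per screen pixel
    """
    if not points:
        return []
    kept = [points[0]]
    for p in points[1:]:
        last = kept[-1]
        # Distance the glyph would move on screen if we drew this point.
        screen_dist = math.hypot(p[0] - last[0], p[1] - last[1]) / scale
        if screen_dist >= min_pixel_dist:
            kept.append(p)
    # Always keep the final position so the glyph reflects the agent's
    # current location, even if the last movement was small.
    if kept[-1] != points[-1]:
        kept.append(points[-1])
    return kept
```

A thinned path like this needs far less memory and bandwidth while looking identical at the target screen resolution.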

Our mobile visual analytics tool consists of a 2D/3D visualization component which shows personnel-related information, situational and static scene-related information, integrated multi-media playback functionality for personnel outfitted with cameras, and fast-forward/rewind capability for reviewing events.

Fig. 2. System diagram.

A. Tracking

Tracking of agents in the field was accomplished through the use of standard GPS devices attached to the USB hub of the personnel's wearable in-field computer. Indoor tracking utilizes a wireless 802.11 network array which estimates the position of a user's wearable tag. Transitions from indoor to outdoor tracking hardware are handled through a data smoothing algorithm which assesses the accuracy of the data being returned by both the indoor tracking tag and the GPS device. Location data is updated approximately once per second. Such a speed is acceptable, as many emergency response operations, such as clearing a building, require slow, methodical exploration of a structure.
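The indoor/outdoor handoff described above could be sketched as an accuracy-weighted blend of the two position sources. The accuracy field on each fix and the inverse-variance weighting below are assumptions made for illustration; the paper does not detail its smoothing algorithm.

```python
def fuse_position(indoor, gps):
    """Blend indoor-tag and GPS fixes by their reported accuracies.

    indoor, gps: (x, y, accuracy_m) tuples, or None if that source
                 is currently unavailable (e.g. GPS-denied indoors).
    Returns a fused (x, y), or None if neither source has a fix.
    """
    fixes = [f for f in (indoor, gps) if f is not None]
    if not fixes:
        return None
    # Inverse-variance weighting: the more precise source dominates,
    # so the estimate slides smoothly between indoor and outdoor modes.
    weights = [1.0 / (f[2] ** 2) for f in fixes]
    total = sum(weights)
    x = sum(w * f[0] for w, f in zip(weights, fixes)) / total
    y = sum(w * f[1] for w, f in zip(weights, fixes)) / total
    return (x, y)
```

Running this once per incoming sample matches the roughly 1 Hz update rate described above.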

B. Video

Along with the tracking system, each agent is also equipped with a head-mounted camera to record the scene from their point of view. The camera is attached to the portable computer the agent is carrying, and data is streamed across the network on a channel separate from the tracking. Stationary cameras are also positioned around the facility. Data from all cameras may be viewed in real-time and after-action settings. During live sessions, network bandwidth limits the replay of streaming videos to approximately five wireless cameras.


C. Database

Both the tracking and video data are stored in a database to provide for after-action review of a training session. Challenges addressed include retrieving and storing session data client-side while maintaining interactivity, effective utilization of network bandwidth, and the real-time reconstruction of continuous-time activity from discrete stored events. All video and position information is temporarily stored locally on the participant's wearable computer. In the event of network loss, information is pushed to the database once a network connection is re-established. All information is stored on the wearable computer, which can hold over two hours' worth of session data.
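The store-and-forward behavior described in this paragraph can be sketched as a local queue that is flushed whenever the network is reachable. Class and method names below are illustrative, not taken from the system.

```python
from collections import deque

class SessionBuffer:
    """Buffer samples locally; upload when the network allows."""

    def __init__(self, send_fn):
        self.pending = deque()   # locally stored, not yet uploaded
        self.send_fn = send_fn   # uploads one record; raises on failure

    def record(self, sample):
        # The local store is the source of truth: every sample is kept
        # on the wearable computer first, then opportunistically pushed.
        self.pending.append(sample)
        self.flush()

    def flush(self):
        # Upload in order; on network loss, stop and retry later so no
        # sample is dropped or reordered.
        while self.pending:
            try:
                self.send_fn(self.pending[0])
            except ConnectionError:
                return
            self.pending.popleft()
```

After a network outage, calling `flush()` again drains everything accumulated in the meantime, matching the re-established-connection behavior described above.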

Prior to each session, personnel are assigned equipment, and the database must capture this relational information. Figure 3 shows a screenshot of the entity relationship interface. Here we show the initialization of a session. The upper left box shows all previously created sessions. An event coordinator can go back and modify entity relationships in a previously recorded session if mistakes occur. Participants' names are entered into the database and found in the middle box on the screen. Known equipment and past participants are already populated into the list of choices so that the event coordinator can quickly assign agents to their given devices, or add new agents and equipment as necessary. Stationary camera positions can be assigned at each session, allowing users of our facility to modify the setup for their specific needs.

Fig. 3. Entity relationship interface.

D. Visual Analytics

Once a session begins, the database pulls and stores all data from both the sensor networks and video networks. Our visual analytics tool pulls data directly from the tracking and video components to prevent data delays, and displays this data on the corresponding 2D or 3D model of the building. In after-action review, all data is pulled from the database.

The display capability has been tailored to the responders and their roles, and provides a succinct, quickly understood summary of relevant information extracted from all information acquired. System capabilities include the rewinding and fast-forwarding of a session, model displays in both a 2D and 3D environment, path tracking of multiple agents, and camera view selection.

Figure 4 (Left) shows the initial screen for the visual analytics software. Users may pick from any previously recorded session for playback. If there is an active session currently running, that session's name will appear at the top of the list as assigned from the interface shown in Figure 3. Note the control bar on the left. Users have the option of playing the radio traffic associated with the session. Agent movement can be fast-forwarded or rewound. The '+' and '-' buttons can be used to speed up or slow down the rate of play; however, the associated video playback is currently limited to 1x speed.
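The playback controls described above, variable-rate session playback with video capped at 1x, can be sketched as a session clock. The class name and the doubling/halving rate steps are assumptions for illustration.

```python
class PlaybackClock:
    """Session clock for after-action review playback."""

    def __init__(self):
        self.session_time = 0.0  # seconds into the recorded session
        self.rate = 1.0          # '+' and '-' buttons adjust this

    def speed_up(self):          # '+' button
        self.rate *= 2.0

    def slow_down(self):         # '-' button
        self.rate /= 2.0

    def tick(self, wall_dt):
        """Advance the session by wall_dt real seconds at the current rate."""
        self.session_time += wall_dt * self.rate

    def video_rate(self):
        # Agent glyphs can move at any rate, but the associated video
        # playback is limited to 1x speed, as noted above.
        return min(self.rate, 1.0)
```

Rewinding would be the same mechanism with a negative rate; the sketch keeps only the forward case for brevity.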

In Figure 4 (Middle) we show the normal viewing mode of our interface. Users can see a 2D overview of a building floorplan. Agents are shown as either circles or directional arrowheads (when directional information is available). The third shape represents the positions of the stationary cameras within the building. Also visible is a colored line trailing behind each agent's glyph. This line represents the path traveled by the agent, helping commanders better assess the movements of their agents.

In Figure 4 (Right) we show the 3D viewing mode of our interface. Depending on the level of detail of the building model, texture information can be displayed, as well as relevant height information, providing commanders with more details on their surroundings. Again, in the 3D mode, the agent's previously traveled path can be seen.

Video traffic associated with the stationary cameras and the head-mounted cameras on the agents can be retrieved by left-clicking on an agent's glyph. The video may be turned off by clicking on the glyph again. If the user chooses to fast-forward or rewind in the program, the open videos will also jump to the associated points. All video, radio and tracking data are time-synchronized for proper playback.
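The time synchronization described here amounts to looking up, for the current playback timestamp, the latest stored sample in each timestamped stream (position fixes, video frames, radio segments). A minimal sketch, assuming each stream is stored sorted by timestamp:

```python
import bisect

def sample_at(stream, t):
    """Return the payload of the latest sample at or before time t.

    stream: list of (timestamp, payload) pairs, sorted by timestamp.
    Returns None if t precedes the first sample.
    """
    times = [ts for ts, _ in stream]
    # Binary search for the insertion point of t, then step back one
    # entry to get the most recent sample not after t.
    i = bisect.bisect_right(times, t)
    if i == 0:
        return None
    return stream[i - 1][1]
```

Calling this with the same session timestamp against the position, video, and radio streams keeps all three aligned during fast-forward and rewind.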

IV. FIRST RESPONDER CASE STUDY

Currently, our system is deployed at a decommissioned local elementary school near Purdue University. The floorplan can be seen in Figure 5. To assess the capabilities of our system, we conducted a live active-shooter scenario in the school, in which a group of personnel were instructed to search the building.

Each member of the response team was outfitted as seen in Figure 6. Each member is given a head-mounted camera, two tracking tags (one positioned on the head), and a GPS tag positioned on the fanny pack. The fanny pack at the waist contains the portable OQO 02 device that is used for video processing and recording as well as for mobile visualization; see Figure 1 (Right). Figure 7 shows a screen capture of a live session. Here we show multiple video displays from both mobile and stationary cameras.


Fig. 4. From left to right: session selection (Left), an agent's path being tracked in a 2D model (Middle), 3D view (Right).

Fig. 5. Floorplan of our training area.

V. CONCLUSION

This work demonstrates the applicability of a mobile visual analytics system in a command and control environment. We have successfully utilized low-cost components in the creation of a real-time tracking and video surveillance network. Our visual analytics tools could be ported to other areas where asset monitoring through location and video is necessary. In the future, we plan to extend this work, modifying our tracking system for quick deployment in areas where no pre-existing wireless network structures exist. We also plan to implement video tracking algorithms that will send alerts to our visual analytics system when an entity (known or unknown) enters a camera's viewing area, as well as add wearable system components that will monitor the health status of first responders.

ACKNOWLEDGMENTS

The authors would like to thank the Department of Homeland Security and Army CERDEC, as well as the Indiana National Guard.

Fig. 6. Participant gear for first responder training.

REFERENCES

[1] J. J. Thomas and K. A. Cook, Eds., Illuminating the Path: The R&D Agenda for Visual Analytics. IEEE Press, 2005.

[2] F. Fan and E. S. Biagioni, "An approach to data visualization and interpretation for sensor networks," in HICSS '04: Proceedings of the 37th Annual Hawaii International Conference on System Sciences (HICSS '04) - Track 3. Washington, DC, USA: IEEE Computer Society, 2004, p. 30063.1.

[3] S. O. Koo, H. D. Kwon, C. G. Yoon, W. S. Seo, and S. K. Jung, "Visualization for a multi-sensor data analysis," in Proceedings of the International Conference on Computer Graphics, Imaging and Visualization, 2006, pp. 57–63.

[4] A.-S. Gunnarsson, M. Rauhala, A. Henrysson, and A. Ynnerman, "Visualization of sensor data using mobile phone augmented reality," in ISMAR '06: IEEE/ACM International Symposium on Mixed and Augmented Reality, 2006, pp. 233–234.

[5] A. Pattath, B. Bue, Y. Jang, D. S. Ebert, X. Zhong, A. Ault, and E. Coyle, "Interactive visualization and analysis of network and sensor data on mobile devices," in VAST '06: Proceedings of the IEEE Symposium on Visual Analytics Science and Technology. IEEE Computer Society, 2006.

Fig. 7. Shooter in a school training scenario.

[6] M. Masoodian and D. Budd, "Visualization of travel itinerary information on PDAs," in AUIC '04: Proceedings of the Fifth Conference on Australasian User Interface. Darlinghurst, Australia: Australian Computer Society, Inc., 2004, pp. 65–71.

[7] L. Chittaro, "Visualizing information on mobile devices," Computer, vol. 39, no. 3, pp. 40–45, 2006.

[8] Khronos Group, http://www.khronos.org/.

[9] A. Sanfilippo, R. May, G. Danielson, B. Baddeley, R. Riensche, S. Kirby, S. Collins, S. Thornton, K. Washington, M. Schrager, J. V. Randwyk, B. Borchers, and D. Gatchell, "An adaptive visual analytics platform for mobile devices," in SC '05: Proceedings of the 2005 ACM/IEEE Conference on Supercomputing. Washington, DC, USA: IEEE Computer Society, 2005, p. 74.

[10] InfoStar, http://www.sc-conference.org/sc2004/infostar.html.

[11] A. Herrero, E. Corchado, and J. M. Saiz, "MOVICAB-IDS: Visual analysis of network traffic data streams for intrusion detection," in IDEAL, 2006, pp. 1424–1433.