
Page 1

IDGA presents the 8th Annual training conference

MULTI-SENSOR AND INTELLIGENCE FUSION

November 16 – 18, 2009 • Washington, D.C. Metro Area

New for 2009! Tactical persistent surveillance from sensor to knowledge dissemination.

Look inside for the complete speaker roster!

All new speakers for 2009 include:

• William R. Smith, SES, Deputy Program Executive Officer, PEO Soldier

• Col Phillip Chudoba, USMC, PM Intelligence Systems, Marine Corps Systems Command

• Dr. Amy Vanderbilt, Program Manager, Information Processing Techniques Office, DARPA

• CAPT David R. Luber, USN, Deputy Program Manager for ISR, Expeditionary Maneuver Warfare & Combating Terrorism, S&T Directorate, ONR

• Darlene Minick, Director of Imagery Intelligence, National Reconnaissance Office

• Dr. David Boyd, Division Director, Command, Control & Interoperability, S&T Directorate, DHS

Don’t miss your opportunity to discuss these crucial issues:

• Performance metrics for digital imaging sensors, EO fused systems, target recognition, and scene and situation characterization

• Multi-sensor integration to enable persistent ISR: new architectures for tactical persistent surveillance from sensor to knowledge dissemination, including Green Devil and MCISR-E

• Quantification of algorithm performance based on the desired visual information transfer sought through fusion

Media Partners:

Register Now! Call 1.800.882.8684 or visit www.imagefusionsummit.com

Page 2

Who You Will Meet:

At IDGA’s 8th Annual Image Fusion Summit, you will have the unique opportunity to interact and network with representatives of the relevant military, government, academic, and service-provider organizations with the following responsibilities and job functions:

• Program Manager

• S&T Director

• Fusion Technologist

• Test & Evaluation Director

• Imagery Specialist

• Electro-Optical Device R&D Director

• IR & I2 Technologist

Here’s what past attendees have said about IDGA’s Image Fusion Summits:

“To the point, informative, and challenging to industry” - Technical Writer, US Army Special Operations

“Well presented and full of information, easily related to real world applications” - Engineer, Hoffman Engineering

“Extremely interesting, lots of great information on current developments” - Project Manager, Department of National Defence

“Thought provoking” - Engineer, Raytheon


Dear Colleague,

As we enter the 8th year of continuous operations in OIF/OEF, the need for our military to exploit all possible technological advancements rapidly and effectively, at home and abroad, remains at the forefront. Asymmetric warfare requires that our warfighters have real-time situational awareness of the “battlefield.” Evolving image fusion systems enable major advances in ISR. While the field of multi-sensor fusion has made many advances in recent years, many challenges remain for the immediate future.

IDGA’s Image Fusion Summit will examine next-generation technologies, systems, and platforms in a forum that brings together both solution providers and end-users.

By participating in this summit, you will have the unique opportunity to interact with a multitude of senior-level professionals to discuss, brainstorm, and network in order to define methodologies and initiatives, while forging potential solutions and future partnerships. The end goal is to advance possible solutions that give our warfighters advanced situational awareness in the most efficient manner.

IDGA’s 8th Annual Image Fusion Summit is your best opportunity of the year to:

• Hear about new architectures for tactical persistent surveillance from sensor to knowledge dissemination

• Learn about the current needs and challenges of our warfighters from PEO Soldier

• Understand concepts, current techniques, algorithms, and performance metrics for characterizing objects in imagery

• Discuss applications of multi-variate visualization techniques to multi-spectral imagery

Don’t delay! Take the time now to block off November 16 – 18, 2009 in your calendar, and reserve your place among your peers and key leaders! Register today by logging on to www.imagefusionsummit.com or calling 1-800-882-8684.

I look forward to seeing you in November!

Very Respectfully,

Monica Mckenzie
Program Director, [email protected]

PS: Don’t miss the Master Class on Image Fusion. See pg. 3 for details!

About IDGA

The Institute for Defense & Government Advancement (IDGA) is a non-partisan, information-based organization dedicated to the promotion of innovative ideas in public service and defense. We bring together speaker panels comprised of military and government professionals while attracting delegates with decision-making power from military, government, and defense industries. For more information, please visit us at www.idga.org.

Log On & Stay Connected!

Be sure to add www.imagefusionsummit.com to your “Favorites” on your internet browser and visit us regularly for the latest updates:

• Event agenda

• Speaker faculty

• Social and networking activities

• Download Center featuring speaker presentations and white papers

• Sponsors and Exhibitors

Page 3

Image Fusion Master Class
Monday, November 16, 2009

This Master Class is designed to explore the latest advancements in performance metrics and algorithms for image/multi-sensor fusion. You will also have the chance to explore how rapid changes in non-traditional warfare have shifted the focus of fusion systems toward assessing the human landscape as well as the physical landscape, and to ascertain how human-provided inputs can augment traditional sensor systems.

Scene Understanding and Situation Awareness
8:15 am – 10:15 am

This Master Class is designed to help participants understand the latest developments in automated and semi-automated methods for Scene Understanding and Situation Awareness for military, intelligence, and civil applications. The intended audience includes users who are tasked with developing, using, or evaluating techniques and systems for scene and situation understanding and for detecting, assessing, tracking, and prosecuting threat activity. It is also for military or industry representatives involved in policy-making who need to learn the basic concepts, issues, and realistic capabilities of tools and methods for image analysis and situation and threat assessment.

What will be covered:

• Template- and model-based target recognition concepts and techniques

• Methods for adaptive evidence accrual, context display, and exploration

• Algorithms for scene/situation hypothesis generation, evaluation, and selection

• Performance metrics for target recognition and scene and situation characterization

How you will benefit:

• Gain an understanding of concepts, current techniques, algorithms, and performance metrics

• Learn the application of performance metrics for characterizing objects in imagery by combining object literal and induced features together with contextual information, both within the image and in collateral information (to include other sensed or descriptive data)

Session leader: Dr. Alan Steinberg, Georgia Tech Research Institute

Automated methods for Scene Understanding/Situation Awareness
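For readers who want a concrete feel for the template-based recognition concepts this session surveys, here is a minimal sketch, assuming plain NumPy and a synthetic scene, of target localization by normalized cross-correlation. It is an illustration of the general technique only, not material drawn from the class.

```python
# Illustrative sketch: template-based target recognition via normalized
# cross-correlation (NCC). Scene and target below are synthetic toy data.
import numpy as np

def normalized_cross_correlation(image: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Slide `template` over `image` and return an NCC score map in [-1, 1]."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    out_h, out_w = image.shape[0] - th + 1, image.shape[1] - tw + 1
    scores = np.zeros((out_h, out_w))
    for y in range(out_h):
        for x in range(out_w):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            scores[y, x] = (p * t).sum() / denom if denom > 0 else 0.0
    return scores

# Toy usage: embed a structured target in a noisy scene and recover its location.
rng = np.random.default_rng(0)
scene = rng.normal(0.0, 0.1, size=(64, 64))
target = np.zeros((8, 8))
target[2:6, 2:6] = 1.0                      # simple square "target" signature
scene[20:28, 30:38] += target               # plant it at row 20, column 30
ncc = normalized_cross_correlation(scene, target)
print("best match at", np.unravel_index(ncc.argmax(), ncc.shape))  # expect ~(20, 30)
```

In practice, model-based recognizers extend this idea with pose, scale, and context handling; the brute-force loop above is only meant to make the matching concept tangible.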


Registration and Coffee: 7:30 am – 8:15 am

Lunch: 12:30 pm – 1:30 pm

In-Depth Objective Evaluation of Image Fusion
10:30 am – 12:30 pm

Image fusion has already gained acceptance as a useful tool in night and improved vision applications where multiple sensors are available, with a wealth of different algorithms proposed for the task. But how can we really know which algorithms to choose for each application, and what can we expect from them when they encounter real data? This workshop will focus on in-depth fusion evaluation that allows us to make informed decisions based on objective qualification, as well as quantification, of algorithm performance based on the desired visual information transfer sought through fusion.

What will be covered:

• Theory of visual information transfer during the image fusion process and its breakdown into tractable categories

• Measures of the various information transfer processes taking place during fusion in order to qualify its performance

• Creation of customized metrics to evaluate a specifically desired outcome of image fusion

How you will benefit:

• Gain an understanding of the underlying information transfer processes taking place during image fusion

• Understand how more informed qualifications of fusion performance can be achieved

• Learn how to construct objective metrics for specific image fusion goals

Session leader: Dr. Vladimir Petrovic, Research Associate, Imaging Science, University of Manchester (UK)

Understanding underlying information transfer processes during image fusion
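As a concrete illustration of the objective-evaluation idea described above, the following sketch scores how much of each source image's edge information survives in a fused result. The gradient-preservation formulation and weighting here are a deliberate simplification assumed for illustration, not the metric taught in the workshop.

```python
# Minimal sketch of an objective fusion quality score: measure how well the
# fused image preserves the gradient (edge) information of each source,
# weighted by source edge strength. Simplified, assumption-level formulation.
import numpy as np

def gradient_magnitude(img: np.ndarray) -> np.ndarray:
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def fusion_information_transfer(src_a: np.ndarray, src_b: np.ndarray,
                                fused: np.ndarray, eps: float = 1e-8) -> float:
    """Return a score in [0, 1]; 1 means all source edge strength is preserved."""
    ga, gb, gf = map(gradient_magnitude, (src_a, src_b, fused))
    # Per-pixel preservation: how close the fused gradient is to each source's.
    pres_a = np.minimum(gf, ga) / (np.maximum(gf, ga) + eps)
    pres_b = np.minimum(gf, gb) / (np.maximum(gf, gb) + eps)
    # Weight each source by its own edge strength (strong edges matter more).
    num = (pres_a * ga + pres_b * gb).sum()
    den = (ga + gb).sum() + eps
    return float(num / den)

# Toy usage: "fuse" two synthetic images by simple averaging and score it.
a = np.zeros((32, 32)); a[:, 16:] = 1.0          # vertical edge
b = np.zeros((32, 32)); b[16:, :] = 1.0          # horizontal edge
fused_avg = 0.5 * (a + b)
print("average-fusion score:", round(fusion_information_transfer(a, b, fused_avg), 3))
```

Customizing a metric for a specific application amounts to choosing which information (edges, contrast, spectral content) to measure and how to weight it, which is the kind of trade-off the workshop examines.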

Improving Established Image Fusion Algorithms: Image Fusion System-on-a-Chip
1:30 pm – 3:30 pm

In 1991, the Departments of Electrical Engineering and Neuroscience at UC Berkeley took on the challenge of implementing notions of biological image processing in silicon devices. Those studies, published in a variety of engineering journals, outlined a number of conceptual advances. Reading through the published work, it became clear that it could be translated and implemented into highly efficient algorithms that could lead to reduced size, weight, and power consumption in silicon-based devices.

This work, originally supported by grants from the Office of Naval Research, has evolved over the last decade, becoming ever more efficient. Initially implemented on VME boards, the technology has been reduced in size, weight, and power consumption to our current system, which can operate on two 1280 x 1024 video streams at 60 fps while consuming less than 0.5 W. We currently supply fusion technology to many of the leading night vision corporations. The next step will be the implementation of this technology, integrated along with the back ends of two cameras and a display driver, in a single ASIC: truly an image fusion system-on-a-chip.

What will be covered:

• Hear about how algorithms can lead to reduced size, weight, and power consumption in silicon-based devices

• Implementation integrated along with the back ends of two cameras and a display driver in a single ASIC: truly an image fusion system-on-a-chip

How you will benefit:

• Learn about an image fusion system-on-a-chip

• Gain an understanding of best practices for applying algorithms for efficient outcomes

Session Leader: Dr. Frank Werblin, University of California, Berkeley

*see website for update on session topic

Image fusion system-on-a-chip
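To make the fusion-on-a-chip discussion concrete, here is a generic two-band, pixel-level fusion sketch of the kind such low-power hardware commonly implements: average the low-pass bands and keep the stronger per-pixel detail. This is a hypothetical illustration under stated assumptions, not the Berkeley algorithm described above.

```python
# Minimal two-band fusion sketch: low-pass "base" plus high-pass "detail",
# fusing two frames by averaging bases and selecting the stronger detail.
import numpy as np

def box_blur(img: np.ndarray, radius: int = 2) -> np.ndarray:
    """Windowed mean with edge padding (naive loops; hardware would use running sums)."""
    k = 2 * radius + 1
    padded = np.pad(img.astype(float), radius, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = padded[y:y + k, x:x + k].mean()
    return out

def fuse_two_band(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Average the low-pass bands; keep the stronger high-pass detail per pixel."""
    base_a, base_b = box_blur(frame_a), box_blur(frame_b)
    detail_a, detail_b = frame_a - base_a, frame_b - base_b
    fused_base = 0.5 * (base_a + base_b)
    fused_detail = np.where(np.abs(detail_a) >= np.abs(detail_b), detail_a, detail_b)
    return fused_base + fused_detail

# Toy usage with two small synthetic "sensor" frames standing in for visible and thermal video.
rng = np.random.default_rng(2)
visible = rng.random((32, 32))
thermal = rng.random((32, 32))
print("fused frame shape:", fuse_two_band(visible, thermal).shape)
```

The appeal of this style of algorithm for an ASIC is that every operation is local and fixed-point friendly, which is what makes sub-watt, full-frame-rate fusion plausible.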

Human-Centered Multi-INT Fusion
3:45 pm – 5:45 pm

The traditional role of data fusion has involved the use of physical sensors to observe physical targets, in effect trying to characterize the physical landscape for situational awareness. In this workshop we will explore how rapid changes in non-traditional warfare have changed the focus of fusion systems to try to assess the human landscape as well as the physical landscape.

What will be covered:

• Humans acting as new sources of information

• Human analysts supporting the analysis process (advanced visualization and sonification interfaces)

• Humans acting as an ad hoc community of analysts (“crowd-sourcing” of the analysis process)

How you will benefit:

• Understand the dynamics of an ad hoc community of observers

• Ascertain how human-provided inputs augment traditional sensor systems

Session Leader: Dr. David Hall, Professor, The Center for Network-Centric Cognition and Information Fusion, Pennsylvania State University

Augmenting traditional sensor systems
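As a toy illustration of treating humans as additional sources of information, the sketch below pools a physical sensor's detection confidence with crowd-sourced human reports using a simple weighted log-odds combination. The function names and weights are hypothetical and are not drawn from the session.

```python
# Illustrative toy: fuse a sensor's detection probability with human
# observer reports ("soft sensors") via weighted log-odds pooling.
import math

def log_odds(p: float) -> float:
    return math.log(p / (1.0 - p))

def fuse_confidences(sensor_p: float, human_reports: list[float],
                     sensor_weight: float = 1.0, human_weight: float = 0.5) -> float:
    """Combine detection probabilities into one fused probability in (0, 1)."""
    total = sensor_weight * log_odds(sensor_p)
    total += sum(human_weight * log_odds(p) for p in human_reports)
    return 1.0 / (1.0 + math.exp(-total))

# Toy usage: one sensor hit at 0.6 plus three observers reporting 0.7, 0.8, and 0.55.
print(round(fuse_confidences(0.6, [0.7, 0.8, 0.55]), 3))
```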


Page 4

Tuesday, November 17, 2009
MAIN SUMMIT, DAY 1

7:15 Registration and Coffee

8:15 Chairperson’s Welcome & Opening Remarks

8:30 PEO Soldier Perspective
• Designing, developing, procuring, fielding, and sustaining virtually everything the Soldier wears or carries
• Operating to increase combat effectiveness, to save lives, and to improve quality of life
William R. Smith, SES, Deputy Program Executive Officer, PEO Soldier

9:10 Marine Corps Intelligence, Surveillance, and Reconnaissance Enterprise (MCISR-E)
• Conceptual overview
• Multi-sensor integration to enable persistent ISR
• Future intelligence capability roadmap
Colonel Phillip C. Chudoba, USMC, Program Manager, Intelligence Systems, Marine Corps Systems Command

9:50 Networking Break

10:35 Urban Leader Tactical Response, Awareness & Visualization (ULTRA-Vis)
• Techniques to create/disseminate/display geo-registered icons and actionable combat information for Fire Team Leaders/Dismount Warfighters in real time over an existing soldier radio network
• Integration with a low-profile, see-through display to prototype and demonstrate multi-modal, icon-based command and control in a non-line-of-sight, urban environment
Dr. Amy Vanderbilt, Program Manager, IPTO, DARPA

11:15 How Imagery Products Affect Ground Soldier Systems
• Current usage challenges
• Lessons learned
Master Sergeant (P) Marcus Griffith, USA, PM Ground Soldier

11:55 Lunch

1:00 A New Architecture for Tactical Persistent Surveillance: From Sensor to Knowledge Dissemination
• Green Devil, an experiment conducted at Empire Challenge, demonstrates the utility of image and image/SIGINT fusion in building an actionable tactical picture
• Wide-area surveillance sensors, high-resolution airborne spot sensors, tower-based sensors, unattended ground sensors, and RF
• Measurement of the ability of the PISR network to detect inserted behaviors of interest
Captain David R. Luber, USN, Deputy Program Manager for ISR, Expeditionary Maneuver Warfare & Combating Terrorism, S&T Directorate, ONR

1:40 Fusion to Counter the Improvised Explosive Device: JIEDDO Perspective
• Updates from the S&T department
• Applications on the battlefield
• Future challenges
Julia Erdley, Science and Technology Advisor, Joint IED Defeat Organization

2:20 Networking Break

3:05 Ultra Vision – The View to the Future
• Image fusion today
• What Marines need to see
• "Vision" for the future
George Gibbs, Technologist, IWS Strategic Business Team, Marine Corps Systems Command

3:45 Application of Multi-Variate Visualization Techniques to Multi-spectral Imagery
• Summary of existing multi-variate display techniques
• Demonstration of their use on multi-spectral imagery
• Evaluating applicability and user needs for practical use of these techniques
Mark Livingston, PhD, Research Scientist, Information Technology Division, Naval Research Laboratory

4:25 End of Day One

Wednesday, November 18, 2009
MAIN SUMMIT, DAY 2

7:30 Registration and Coffee

8:15 Opening Remarks

8:30 NRO Enterprise Integration: Imagery Intelligence Fusion Efforts
• Strategic objectives for enhanced imagery intelligence sharing
• Technology upgrades and tools for NRO collaboration
Darlene Minick, Director of Imagery Intelligence, National Reconnaissance Office

9:10 PEO Soldier Equipment: Sensors and Lasers
• PM Soldier Sensors and Lasers’ development, production, and fielding of advanced sensor and laser devices that provide Soldiers with improved lethality, mobility, and survivability
• Updates on future challenges and needs
Lieutenant Colonel Joseph A. Capobianco, USA, PM Sensors & Lasers, PEO Soldier

9:50 Networking Break

10:35 Update on Applications of Image/Multi-Sensor Fusion Data from the Science and Technology Directorate, DHS
• Use of imagery data within the DHS
• Challenges faced when creating optimal situational awareness
Dr. David Boyd, Division Director, Command, Control, and Interoperability, Department of Homeland Security

11:15 AquaQuIPS and Sea Mist Sensor and Imagery Data Fusion Results from Trident Warrior 08 and 09
• Automated, real-time sensor and image data fusion
• Tracking uncooperative and EMCON-silent ("dark") surface ships
• Why the AquaQuIPS data fusion algorithm has been so successful
• Future objective: Automated Abnormal Behavior
Dr. James H. Wilson, United States Naval Academy Foundation, MULTI-INT Sensor and Image Data Fusion Thesis Advisor

11:55 Lunch

1:00 Low Light Level Limiting Resolution and MTF of Various Digital Imaging, Image Intensified, and EO Fused Systems
• Performance metrics for digital imaging sensors and EO fused systems
• Utility/validity of measuring spatial limiting resolution and MTF of various electro-optical systems from high-light to low-light conditions
• Detailed line shape analysis of spatial resolution profiles for low light level images using manual and COTS machine-vision-based processing algorithms
Dr. Joseph Estrera, Senior VP and Chief Technology Officer, L-3 Electro-Optical Systems

1:40 Air Multispectral Imaging (AMI)
• All-source image fusion
• Component enablers and integration
• Application
Dr. Darrel G. Hopper, Principal Electronics Engineer, Air Force Research Laboratory

2:20 Networking Break

2:50 Man-Portable Power Sources
• Challenges faced with creating power sources for our warfighters
• Updates on current projects
Mike Brundage, Chief of Power Sources Branch, US Army CERDEC, Army Power Division

3:30 Tracking from Commercial Satellite Imagery: SPAWAR
• RAPIER program update
• MDA imagery applications
Heidi Buck, Intelligence, Surveillance, and Reconnaissance Office, Space and Naval Warfare Systems Command

4:10 End of Main Summit


Agenda highlights: Opening Keynote • Theater Update • Green Devil • Morning Keynote – NRO • DHS Perspective • Afternoon Keynote • Power Sources from CERDEC