Look inside for the
complete speaker roster!
November 16 – 18, 2009 • Washington, D.C. Metro Area
MULTI-SENSOR AND INTELLIGENCE FUSION

All new speakers for 2009 include:
• William R. Smith, SES, Deputy Program Executive Officer, PEO Soldier
• Col Phillip Chudoba, USMC, PM Intelligence Systems, Marine Corps Systems Command
• Dr. Amy Vanderbilt, Program Manager, Information Processing Techniques Office, DARPA
• CAPT David R. Luber, USN, Deputy Program Manager for ISR, Expeditionary Maneuver Warfare & Combating Terrorism, S&T Directorate, ONR
• Darlene Minick, Director of Imagery Intelligence, National Reconnaissance Office
• Dr. David Boyd, Division Director, Command, Control & Interoperability S&T Directorate, DHS

Don't miss your opportunity to discuss these crucial issues:
• Performance metrics for digital imaging sensors, EO fused systems, target recognition, and scene and situation characterization
• Multi-sensor integration to enable persistent ISR: new architectures for tactical persistent surveillance from sensor to knowledge dissemination, including Green Devil and MCISR-E
• Quantification of algorithm performance based on the desired visual information transfer sought through fusion

Media Partners:
Register Now! Call 1.800.882.8684 or visit www.imagefusionsummit.com
IDGA presents the 8th Annual training conference
New for 2009! Tactical persistent surveillance
from sensor to knowledge dissemination.
Who You Will Meet:
At IDGA's 8th Annual Image Fusion Summit, you will have the unique opportunity to interact and network with representatives from relevant military, government, academic, and service-provider organizations with the following responsibilities and job functions:
• Program Manager
• S&T Director
• Fusion Technologist
• Test & Evaluation Director
• Imagery Specialist
• Electro-Optical Device R&D Director
• IR & I2 Technologist
Here’s what past
attendees have said
about IDGA’s Image
Fusion Summits:
“To the point, informative,
and challenging to
industry”
- Technical Writer, US Army
Special Operations
“Well presented and full
of information, easily
related to real world
applications”
- Engineer, Hoffman
Engineering
“Extremely interesting, lots
of great information on
current developments”
- Project Manager, Department
of National Defence
“Thought provoking”
- Engineer, Raytheon
Dear Colleague,
As we enter the 8th year of continuous operations in OIF/OEF, the need for our military to exploit all possible technological advancements rapidly and effectively, at home and abroad, remains at the forefront. Asymmetric warfare requires that our warfighters have real-time situational awareness of the “battlefield.” Evolving image fusion systems enable major advances in ISR. While the field of multi-sensor fusion has made many advances in recent years, many challenges remain for the immediate future.
IDGA's Image Fusion Summit will examine next-generation technologies, systems, and platforms in a forum that brings together both solution providers and end users.
By participating in this summit, you will have the unique opportunity to interact with a multitude of senior-level professionals to discuss, brainstorm, and network in order to define methodologies and initiatives, while forging potential solutions and future partnerships. The end goal is to advance solutions that give our warfighters advanced situational awareness in the most efficient manner.
IDGA's 8th Annual Image Fusion Summit is your best opportunity of the year to:
• Hear about new architectures for tactical persistent surveillance, from sensor to knowledge dissemination
• Learn about the current needs and challenges of our warfighters from PEO Soldier
• Understand concepts, current techniques, algorithms, and performance metrics for characterizing objects in imagery
• Discuss applications of multi-variate visualization techniques to multi-spectral imagery
Don't delay! Take the time now to block off November 16 – 18, 2009 in your calendar and reserve your place among your peers and key leaders! Register today by logging on to www.imagefusionsummit.com or calling 1-800-882-8684.
I look forward to seeing you in November!
Very Respectfully,
Monica Mckenzie
Program Director, IDGA
Monica.mckenzie@idga.org
PS: Don’t miss the Master
Class on Image Fusion.
See pg. 3 for details!
About IDGA
The Institute for Defense & Government Advancement (IDGA) is a non-partisan, information-based organization dedicated to the promotion of innovative ideas in public service and defense. We bring together speaker panels composed of military and government professionals while attracting delegates with decision-making power from the military, government, and defense industries. For more information, please visit us at www.idga.org.
Log On & Stay Connected!
Be sure to add www.imagefusionsummit.com to your “Favorites” on your internet browser and visit us regularly
for the latest updates:
• Event agenda
• Speaker faculty
• Social and networking activities
• Download Center featuring speaker presentations and white papers
• Sponsors and Exhibitors
Image Fusion Master Class
Monday, November 16, 2009
Scene Understanding and Situation Awareness
This Master Class is designed to explore the latest advancements in performance metrics and algorithms for image and multi-sensor fusion. You will also have the chance to explore how rapid changes in non-traditional warfare have shifted the focus of fusion systems toward assessing the human landscape as well as the physical landscape, and to ascertain how human-provided inputs can augment traditional sensor systems.
8:15 am – 10:15 am
This Master Class is designed to help participants understand the latest developments in automated and semi-automated methods for Scene Understanding and Situation Awareness for military, intelligence, and civil applications. The intended audience includes users who are tasked with developing, using, or evaluating techniques and systems for scene and situation understanding and for detecting, assessing, tracking, and prosecuting threat activity. It is also intended for military or industry representatives involved in policy-making who need to learn the basic concepts, issues, and realistic capabilities of tools and methods for image analysis and for situation and threat assessment.
What will be covered:
• Template- and model-based target recognition concepts and techniques (a minimal sketch appears just after this list)
• Methods for adaptive evidence accrual, context display, and exploration
• Algorithms for scene/situation hypothesis generation, evaluation, and selection
• Performance metrics for target recognition and for scene and situation characterization
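To make the template-based target recognition bullet concrete, here is a minimal, illustrative Python sketch (not the instructor's material): it slides a small template over a cluttered scene, scores each location with normalized cross-correlation, and reports detections above a hypothetical threshold.

```python
# Hypothetical sketch: template-based target recognition scored with normalized
# cross-correlation (NCC). The scene, template, and 0.8 threshold are illustrative.
import numpy as np

def normalized_cross_correlation(image, template):
    """Slide `template` over `image` and return an NCC score map in [-1, 1]."""
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    scores = np.zeros((ih - th + 1, iw - tw + 1))
    for r in range(scores.shape[0]):
        for c in range(scores.shape[1]):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            scores[r, c] = (p * t).sum() / denom if denom > 0 else 0.0
    return scores

def detect(scores, threshold=0.8):
    """Return (row, col) locations whose NCC score clears the threshold."""
    return list(zip(*np.where(scores >= threshold)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scene = rng.normal(0.0, 0.1, size=(64, 64))   # clutter-only background
    target = np.zeros((9, 9))
    target[2:7, 2:7] = 1.0                        # toy "target" signature
    scene[20:29, 30:39] += 2 * target             # embed the target in clutter
    hits = detect(normalized_cross_correlation(scene, target))
    print(f"{len(hits)} detection(s), first at {hits[0] if hits else None}")
```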
How you will benefit:
• Gain an understanding of concepts, current techniques, algorithms, and performance metrics
• Learn the application of performance metrics for characterizing objects in imagery by combining object literal and induced features together with contextual information, both within the image and in collateral information (to include other sensed or descriptive data)
Session leader: Dr. Alan Steinberg, Georgia Tech Research Institute
7:30 am – 8:15 am: Registration and Coffee
12:30 pm – 1:30 pm: Lunch
In-Depth Objective Evaluation of Image Fusion
10:30 am – 12:30 pm
Image fusion has already gained acceptance as a useful tool in night vision and improved-vision applications where multiple sensors are available, and a wealth of different algorithms has been proposed for the task. But how can we really know which algorithms to choose for each application, and what to expect from them when they encounter real data? This workshop will focus on in-depth fusion evaluation that allows us to make informed decisions based on objective qualification, as well as quantification of algorithm performance based on the desired visual information transfer sought through fusion.
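The kind of objective scoring discussed here can be illustrated with a short sketch. The following Python example computes a simple gradient-preservation score in the spirit of published edge-based fusion metrics; the weighting scheme and the 0-to-1 scale are illustrative assumptions, and the metrics taught in the workshop may differ.

```python
# Illustrative, edge-based fusion quality score: how much of each input's edge
# content survives into the fused image. Inputs are registered grayscale arrays.
import numpy as np
from scipy import ndimage

def gradient_magnitude(img):
    img = np.asarray(img, dtype=float)
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    return np.hypot(gx, gy)

def edge_preservation(src, fused, eps=1e-6):
    """Per-pixel ratio of edge strength carried from a source into the fused image."""
    gs, gf = gradient_magnitude(src), gradient_magnitude(fused)
    return np.minimum(gs, gf) / (np.maximum(gs, gf) + eps)

def fusion_quality(img_a, img_b, fused, eps=1e-6):
    """Edge preservation from each input, weighted by that input's edge strength
    (0 = no input edge content retained, 1 = perfect retention)."""
    wa, wb = gradient_magnitude(img_a), gradient_magnitude(img_b)
    qa = edge_preservation(img_a, fused, eps)
    qb = edge_preservation(img_b, fused, eps)
    return float((wa * qa + wb * qb).sum() / (wa + wb + eps).sum())
```

Scores near 1 indicate that most of the edge information present in the source imagery was transferred into the fused result; comparing scores across candidate algorithms on the same input pair is one simple form of objective qualification.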
What will be covered:
• Theory of visual information transfer during the image fusion process and its breakdown into tractable categories
• Measures of the various information transfer processes taking place during fusion, in order to qualify its performance
• Creation of customized metrics to evaluate the specifically desired outcome of image fusion
How you will benefit:
• Gain an understanding of the underlying information transfer processes taking place during image fusion
• Understand how more informed qualifications of fusion performance can be achieved
• Learn how to construct objective metrics for specific image fusion goals
Session leader: Dr. Vladimir Petrovic, Research Associate, Imaging Science, University of Manchester (UK)
Improving Established Image Fusion Algorithms: Image Fusion System-on-a-Chip
1:30 pm – 3:30 pm
In 1991 the Departments of Electrical Engineering and Neuroscience at UC Berkeley took on the challenge of implementing notions of biological image processing in silicon devices. Those studies, published in a variety of engineering journals, outlined a number of conceptual advances. Reading through the published work, it became clear that it could be translated into highly efficient algorithms leading to reduced size, weight, and power consumption in silicon-based devices. This work, originally supported by grants from the Office of Naval Research, has evolved over the last decade, becoming ever more efficient. Initially implemented on VME boards, the technology has been reduced in size, weight, and power consumption to our current system, which can operate on two 1280 x 1024 video streams at 60 fps while consuming less than 0.5 W. We currently supply fusion technology to many of the leading night vision corporations. The next step will be the implementation of this technology, integrated along with the back ends of two cameras and a display driver, in a single ASIC: truly an image fusion system-on-a-chip.
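For readers who want a concrete software reference point, the sketch below performs a standard Laplacian-pyramid fusion of two registered grayscale frames. It is a generic textbook technique offered only for orientation, not the biologically inspired algorithm implemented in the ASIC described above.

```python
# A standard Laplacian-pyramid fusion sketch for two registered grayscale frames
# (e.g., an image-intensified frame and a thermal frame). Select-max on detail
# bands, average on the low-pass residual. Illustrative only.
import numpy as np
from scipy import ndimage

def _down(img):
    return ndimage.gaussian_filter(img, sigma=1.0)[::2, ::2]

def _up(img, shape):
    out = np.zeros(shape)
    out[::2, ::2] = img
    return ndimage.gaussian_filter(out, sigma=1.0) * 4.0

def laplacian_pyramid(img, levels=4):
    pyr, current = [], np.asarray(img, dtype=float)
    for _ in range(levels):
        low = _down(current)
        pyr.append(current - _up(low, current.shape))  # detail band
        current = low
    pyr.append(current)                                # low-pass residual
    return pyr

def fuse(img_a, img_b, levels=4):
    pa, pb = laplacian_pyramid(img_a, levels), laplacian_pyramid(img_b, levels)
    fused = [np.where(np.abs(a) >= np.abs(b), a, b) for a, b in zip(pa[:-1], pb[:-1])]
    fused.append(0.5 * (pa[-1] + pb[-1]))              # average the residuals
    out = fused[-1]
    for band in reversed(fused[:-1]):                  # rebuild fine-to-coarse
        out = _up(out, band.shape) + band
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.random((128, 128))   # stand-in for an image-intensified frame
    b = rng.random((128, 128))   # stand-in for a thermal frame
    print(fuse(a, b).shape)      # (128, 128)
```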
What will be covered:
• How algorithms can lead to reduced size, weight, and power consumption in silicon-based devices
• Implementation integrated along with the back ends of two cameras and a display driver in a single ASIC: truly an image fusion system-on-a-chip
How you will benefit:
• Learn about an image fusion system-on-a-chip
• Gain an understanding of best practices for applying algorithms for efficient outcomes
Session Leader: Dr. Frank Werblin, University of California, Berkeley
*see website for update on session topic
Human-Centered Multi-INT Fusion
3:45 pm – 5:45 pm
The traditional role of data fusion has involved the use of physical sensors
to observe physical targets, in effect trying to characterize the physical
landscape for situational awareness. In this workshop we will explore how
rapid changes in non-traditional warfare have changed the focus of fusion
systems to try to assess the human landscape as well as the physical
landscape.
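As a toy illustration of the "humans as soft sensors" idea (not Dr. Hall's actual framework), the hypothetical sketch below combines independent yes/no observer reports in log-odds form, weighting each report by an assumed hit rate and false-alarm rate; all numbers are placeholders.

```python
# Minimal sketch of fusing human observer reports as "soft sensors": independent
# yes/no reports are combined in log-odds form, weighted by each observer's
# assumed reliability. Prior, hit rates, and false-alarm rates are illustrative.
import math

def fuse_reports(prior, reports):
    """
    prior   : prior probability that the reported event is real
    reports : list of (saw_it, p_hit, p_false_alarm) tuples, one per observer
    returns : posterior probability after folding in every report
    """
    log_odds = math.log(prior / (1.0 - prior))
    for saw_it, p_hit, p_fa in reports:
        if saw_it:
            log_odds += math.log(p_hit / p_fa)                  # positive report
        else:
            log_odds += math.log((1.0 - p_hit) / (1.0 - p_fa))  # negative report
    return 1.0 / (1.0 + math.exp(-log_odds))

# Three observers with different reliabilities; two report the activity.
posterior = fuse_reports(0.05, [(True, 0.7, 0.2), (True, 0.6, 0.3), (False, 0.8, 0.1)])
print(f"posterior probability: {posterior:.2f}")
```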
What will be covered:
• Humans acting as new sources of information
• Human analysts supporting the analysis process (advanced visualization and sonification interfaces)
• Humans acting as an ad hoc community of analysts (“crowd-sourcing” of the analysis process)
How you will benefit:
• Understand the dynamics of an ad hoc community of observers
• Ascertain how such human-provided inputs can augment traditional sensor systems
Session Leader: Dr. David Hall, Professor, The Center for Network-Centric Cognition and Information Fusion, Pennsylvania State University
Tuesday, November 17, 2009
MAIN SUMMIT, DAY 1
7:15 Registration and Coffee
8:15 Chairperson’s Welcome & Opening Remarks
8:30 PEO Soldier Perspective
• Designing, developing, procuring, fielding, and sustaining virtually everything the Soldier wears or carries
• Operating to increase combat effectiveness, save lives, and improve quality of life
William R. Smith, SES, Deputy Program Executive Officer, PEO Soldier
9:10 Marine Corps Intelligence, Surveillance, and Reconnaissance Enterprise (MCISR-E)
• Conceptual overview
• Multi-sensor integration to enable persistent ISR
• Future intelligence capability roadmap
Colonel Phillip C. Chudoba, USMC, Program Manager, Intelligence Systems, Marine Corps Systems Command
9:50 Networking Break
10:35 Urban Leader Tactical Response, Awareness & Visualization (ULTRA-Vis)
• Techniques to create/disseminate/display geo-registered icons and actionable combat information for Fire Team Leaders and dismounted warfighters in real time over an existing soldier radio network
• Integration with a low-profile, see-through display to prototype and demonstrate multi-modal, icon-based command and control in a non-line-of-sight urban environment
Dr. Amy Vanderbilt, Program Manager, IPTO, DARPA
11:15 How Imagery Products Affect Ground Soldier Systems
• Current usage challenges
• Lessons learned
Master Sergeant (P) Marcus Griffith, USA, PM Ground Soldier
11:55 Lunch
1:00 A New Architecture for Tactical Persistent Surveillance: From Sensor to Knowledge Dissemination
• Green Devil, an experiment conducted at Empire Challenge, demonstrates the utility of image and image/SIGINT fusion in building an actionable tactical picture
• Wide-area surveillance sensors, high-resolution airborne spot sensors, tower-based sensors, unattended ground sensors, and RF
• Measurement of the ability of the PISR network to detect inserted behaviors of interest
Captain David R. Luber, USN, Deputy Program Manager for ISR, Expeditionary Maneuver Warfare & Combating Terrorism, S&T Directorate, ONR
1:40 Fusion to Counter the Improvised Explosive Device: JIEDDO Perspective
• Updates from the S&T department
• Applications on the battlefield
• Future challenges
Julia Erdley, Science and Technology Advisor, Joint IED Defeat Organization
2:20 Networking Break
3:05 Ultra Vision – The View to the Future
• Image fusion today
• What Marines need to see
• "Vision" for the future
George Gibbs, Technologist, IWS Strategic Business Team, Marine Corps Systems Command
3:45 Application of Multi-Variate Visualization Techniques to Multi-Spectral Imagery
• Summary of existing multi-variate display techniques (a minimal false-color example follows this session listing)
• Demonstration of their use on multi-spectral imagery
• Evaluating applicability and user needs for practical use of these techniques
Mark Livingston, PhD, Research Scientist, Information Technology Division, Naval Research Laboratory
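As a pointer to what multi-variate display techniques can look like in practice, the sketch below builds a false-color composite from three bands of a multi-spectral cube with per-band contrast stretching. Band indices and percentile limits are placeholders, and the session itself covers a broader set of techniques.

```python
# Illustrative false-color composite: map three spectral bands of a
# multi-spectral cube to R, G, B after a per-band percentile stretch.
import numpy as np

def stretch(band, low_pct=2, high_pct=98):
    """Percentile contrast stretch to [0, 1] so dim bands stay visible."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    return np.clip((band - lo) / (hi - lo + 1e-9), 0.0, 1.0)

def false_color(cube, bands=(3, 2, 1)):
    """cube: (rows, cols, n_bands) array; bands: indices mapped to R, G, B."""
    return np.dstack([stretch(cube[..., b]) for b in bands])

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    cube = rng.random((128, 128, 6))        # stand-in for a 6-band image
    rgb = false_color(cube)
    print(rgb.shape, rgb.min(), rgb.max())  # (128, 128, 3), values within [0, 1]
```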
4:25 End of Day One
Wednesday, November 18, 2009
MAIN SUMMIT, DAY 2
7:30 Registration and Coffee
8:15 Opening Remarks
8:30 NRO Enterprise Integration: Imagery Intelligence Fusion Efforts
• Strategic objectives for enhanced imagery intelligence sharing
• Technology upgrades and tools for NRO collaboration
Darlene Minick, Director of Imagery Intelligence, National Reconnaissance Office
9:10 PEO Soldier Equipment: Sensors and Lasers
• PM Soldier Sensors and Lasers' development, production, and fielding of advanced sensor and laser devices that provide Soldiers with improved lethality, mobility, and survivability
• Updates on future challenges and needs
Lieutenant Colonel Joseph A. Capobianco, USA, PM Sensors & Lasers, PEO Soldier
9:50 Networking Break
10:35 Update on Applications of Image/Multi-Sensor Fusion Data from the Science and Technology Directorate, DHS
• Use of imagery data within DHS
• Challenges faced when creating optimal situational awareness
Dr. David Boyd, Division Director, Command, Control, and Interoperability, Department of Homeland Security
11:15 AquaQuIPS and Sea Mist Sensor and Imagery Data Fusion Results from Trident Warrior 08 and 09
• Automated, real-time sensor and image data fusion
• Tracking uncooperative and EMCON-silent ("dark") surface ships
• Why the AquaQuIPS data fusion algorithm has been so successful
• Future objective: Automated Abnormal Behavior
Dr. James H. Wilson, United States Naval Academy Foundation, MULTI-INT Sensor and Image Data Fusion Thesis Advisor
11:55 Lunch
1:00 Low Light Level Limiting Resolution and MTF of Various Digital Imaging, Image Intensified, and EO Fused Systems
• Performance metrics for digital imaging sensors and EO fused systems
• Utility/validity of measuring spatial limiting resolution and MTF of various electro-optical systems from high-light to low-light conditions (a brief measurement sketch follows this session listing)
• Detailed line-shape analysis of spatial resolution profiles for low-light-level images using manual and COTS machine-vision-based processing algorithms
Dr. Joseph Estrera, Senior VP and Chief Technology Officer, L-3 Electro-Optical Systems
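For context on the MTF measurements referenced above, the hedged sketch below derives an MTF curve from an edge spread function sampled across a knife-edge target; real laboratory procedures (slanted-edge oversampling, noise handling, calibrated targets) involve additional steps not shown, and the 10% cutoff is only one common convention.

```python
# Hedged sketch: derive a modulation transfer function (MTF) from an edge
# spread function (ESF) across a knife-edge target, then read off a
# limiting frequency where modulation first drops below 10%.
import numpy as np
from scipy.special import erf

def mtf_from_edge(esf):
    """ESF -> LSF (derivative) -> windowed |FFT| normalized to its DC value."""
    lsf = np.gradient(np.asarray(esf, dtype=float))
    spectrum = np.abs(np.fft.rfft(lsf * np.hanning(lsf.size)))
    return spectrum / spectrum[0]

if __name__ == "__main__":
    # Synthetic blurred edge: an ideal step convolved with a Gaussian blur (sigma = 3 px).
    x = np.arange(256)
    esf = 0.5 * (1 + erf((x - 128) / (np.sqrt(2) * 3.0)))
    mtf = mtf_from_edge(esf)
    freqs = np.fft.rfftfreq(esf.size)          # spatial frequency in cycles per pixel
    limit = freqs[np.argmax(mtf < 0.1)]        # first frequency below 10% modulation
    print(f"limiting frequency ~ {limit:.3f} cycles/pixel")
```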
1:40 Air Multispectral Imaging (AMI)
• All-source image fusion
• Component enablers and integration
• Application
Dr. Darrel G. Hopper, Principal Electronics Engineer, Air Force Research Laboratory
2:20 Networking Break
2:50 Man-Portable Power Sources
• Challenges faced with creating power sources for our warfighters
• Updates on current projects
Mike Brundage, Chief of Power Sources Branch, US Army CERDEC, Army Power Division
3:30 Tracking from Commercial Satellite Imagery: SPAWAR
• RAPIER program update
• MDA imagery applications
Heidi Buck, Intelligence, Surveillance, and Reconnaissance Office, Space and Naval Warfare Systems Command
4:10 End of Main Summit