
Multisensor data fusion for defense application


  1. MULTISENSOR DATA FUSION FOR DEFENSE APPLICATION
     Othman Sidek and S.A. Quadri
     Collaborative µ-electronic Design Excellence Centre, Universiti Sains Malaysia
  2. DATA FUSION
     • Data fusion is a problem-solving technique based on the idea of integrating many answers to a question into a single, best answer.
     • It is the process of combining inputs from various sensors to provide a robust and complete description of an environment or process of interest.
     • It is a multilevel, multifaceted process dealing with the automatic detection, association, correlation, estimation, and combination of data and information from single and multiple sources.
     • "Properly said, fusion is neither a theory nor a technology in its own right. It is a concept which uses various techniques pertaining to information theory, artificial intelligence and statistics [1]."
     [1] D. L. Hall and J. Llinas, "An introduction to multisensor data fusion", Proceedings of the IEEE, Vol. 85, No. 1, pp. 6-23, Jan 1997.
  3. MULTISENSOR DATA FUSION APPLICATIONS
     Military applications:
     • Location and characterization of enemy units and weapons
     • High-level inferences about the enemy situation
     • Air-to-air and surface-to-air defense
     • Ocean monitoring
     • Battlefield intelligence
     • Strategic warning
     Non-military applications:
     • Condition-based maintenance
     • Detection of system faults
     • Robotics
     • Medical applications
     • Environmental monitoring
     • Location and identification of natural phenomena
  4. DATA FUSION APPLICATIONS IN ESTIMATION PROBLEMS
     Application      | Dynamic system | Sensor types
     Process control  | Chemical plant | Pressure, temperature, flow rate, gas analyzer
     Flood prediction | River system   | Water level, rain gauge, weather radar
     Tracking         | Spacecraft     | Radar, imaging system
     Navigation       | Ship           | Sextant, log, gyroscope, accelerometer, global navigation satellite system
  5. • Data fusion plays a critical and fundamental role in defense and national security, particularly in surveillance and in intelligence analysis for timely situational awareness.
     • Intelligence analysis is inherently a multi-modal, multi-source data fusion operation.
     • Raw intelligence can come from a multitude of sources, such as:
     • SIGINT => Signals intelligence: intelligence gathered by the interception of signals.
     • COMINT => Communications intelligence: interception of communications between people.
     • ELINT => Electronic intelligence: interception of electronic signals not directly used in communication.
  6. • On the battlefield, data fusion may involve integrating data from several sensors:
     • Radars
     • Infrared sensors
     • Acoustic sensors
     • For arms control and verification, data fusion involves integrating data from multispectral satellite images, ground-based sensors, and on-site inspections for better analysis.
  7. RSTAKA FUNCTIONS
     • Reconnaissance, Surveillance, Target Acquisition, and Kill Assessment functions, in which data from a variety of sensors must be integrated to deal effectively with many targets of different types.
     • In the defense domain, a target such as a tank or a missile carrier may be located in desert, swampy, or rolling terrain.
     • Targets can exhibit behaviors such as moving, rotating a turret, firing, erecting a missile, or launching a missile.
     • Certain behaviors are only appropriate for certain types of targets. A target exhibiting a behavior in an environment generates a signature.
     • A launcher firing a missile generates a number of signatures: visual, radar, thermal, acoustic, and seismic.
     • However, the appearance of these signatures may be affected by the environment in which they occur.
  8. A surveillance spacecraft may have a set of sensors to track the status of different critical subsystems. It is of great importance to be able to fuse information from these sensors to create a global picture of the health of the spacecraft, which allows an impending failure to be predicted and corrected before it reaches criticality.
  9. DATA FUSION METHODOLOGY
     • Data fusion requires combining expertise in two areas: sensors and information integration.
     • Data fusion is essentially an information integration problem.
     • Data fusion can be categorized into three main classes based on the level of information abstraction used for fusion:
     (i) Raw data fusion: primarily limited to fusion of homogeneous modalities. It requires the different data sets to be properly aligned for correlation at the lowest level of information granularity.
     (ii) Feature vector fusion: a feature extractor is co-located with the sensor, and the feature vectors are sent to the fusion node.
     (iii) Decision fusion: combines information at the highest level of abstraction, but requires significant computational capabilities in the sensor nodes.
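As a minimal illustration of the third class, decision fusion can be as simple as majority voting over the local detections reported by each sensor node. The sketch below (in Python, for illustration only; the sensor scores and thresholds are invented, not from the slides) shows local decisions made at the nodes and combined at the fusion center:

```python
# Decision fusion sketch: each sensor node makes a local detect/no-detect
# decision; the fusion node combines them by majority vote.
# Readings and thresholds are illustrative values, not from the slides.

def local_decision(reading, threshold):
    """Decision made at the sensor node: True means 'target detected'."""
    return reading >= threshold

def majority_vote(decisions):
    """Fusion node: declare a detection if most sensors agree."""
    return sum(decisions) > len(decisions) / 2

# Three heterogeneous sensors (e.g. radar, IR, acoustic) with local thresholds
readings = [0.82, 0.40, 0.67]    # normalized detection scores
thresholds = [0.50, 0.50, 0.50]

decisions = [local_decision(r, t) for r, t in zip(readings, thresholds)]
fused = majority_vote(decisions)
print(decisions, fused)  # [True, False, True] True
```

Only one-bit decisions cross the network here, which is why this class demands the least communication but the most processing at the sensor nodes.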
  10. • Multisensor data fusion provides significant advantages over single-source data:
      • Improves accuracy
      • Improves precision
      • Supports effective decision making
      [Figure: schematic diagram of data fusion]
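The accuracy and precision gains can be seen in a textbook inverse-variance weighting sketch (Python, illustrative numbers, not code from the slides): fusing two independent measurements of the same quantity yields an estimate whose variance is smaller than either input's.

```python
# Inverse-variance weighted fusion of two independent measurements
# of the same quantity. Numeric values are illustrative.

def fuse(z1, var1, z2, var2):
    """Return the fused estimate and its variance."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    z = (w1 * z1 + w2 * z2) / (w1 + w2)  # more weight to the less uncertain sensor
    var = 1.0 / (w1 + w2)                # fused variance is below both inputs
    return z, var

z, var = fuse(10.2, 4.0, 9.6, 1.0)
print(z, var)  # estimate lies closer to the more precise sensor; var = 0.8 < min(4.0, 1.0)
```

The fused variance 1/(1/var1 + 1/var2) is always smaller than the smaller input variance, which is the precision improvement the slide refers to.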
  11. MISSILE GUIDANCE SYSTEM
      • A system which evaluates flight information, correlates it with target data, determines the desired flight path of the missile, and communicates the necessary commands to the missile flight control system.
      • The missile's target accuracy is a critical factor for its effectiveness.
      • Guidance systems improve missile accuracy by improving its "Single Shot Kill Probability" (SSKP).
      • Guidance systems are divided into different categories:
      • Go-Onto-Target (GOT): a GOT missile can engage either a moving or a fixed target.
      • Go-Onto-Location-in-Space (GOLIS): a GOLIS weapon is limited to a stationary target.
      • The trajectory that a missile takes while attacking a moving target depends on the movement of the target.
  12. • In every GOT system there are three subsystems:
      • Target tracker
      • Missile tracker
      • Guidance computer
      • The way these three subsystems are distributed between the missile and the launcher results in two different categories:
      • Remote Control Guidance: the guidance computer and target tracker are on the launching platform.
      • Homing Guidance: the guidance computer and target tracker are on board the missile itself.
  13. SIMULATION OF TARGET TRACKING AND ESTIMATION USING DATA FUSION
      Objective: target tracking and estimation of a moving object.
      Sensors required: multiple sensors
      => Position estimation sensors
      => Velocity estimation sensors
      Why heterogeneous multisensors?
      => It is not possible to deduce a comprehensive picture of the entire target scenario from each piece of evidence alone.
      => Each sensor is characterized by inherent limitations in its technical features.
      Coordinate system selected: Cartesian coordinate system.
      Technique applied: multisensor data fusion using the Kalman filter.
  14. KALMAN FILTER
      • The Kalman filter produces estimates of the true values of measurements and their associated calculated values by:
      • Predicting a value,
      • Estimating the uncertainty of the predicted value,
      • Computing a weighted average of the predicted value and the measured value.
      • The most weight is given to the value with the least uncertainty.
      • The estimates produced by the method tend to be closer to the true values than the original measurements.
      • The weighted average has a better-estimated uncertainty than either of the values that went into it.
      • The Kalman filter uses:
      • The system's dynamics model (i.e., physical laws of motion),
      • Known control inputs to that system,
      • Measurements from sensors,
      to form an estimate of the system's varying quantities (its state) that is better than the estimate obtained by using any one measurement alone.
      [Figure: Kalman filter model. Squares represent matrices. Ellipses represent multivariate normal distributions (with the mean and covariance matrix enclosed). Unenclosed values are vectors. F_k is the state transition model, applied to the previous state x_{k-1}; B_k is the control-input model, applied to the control vector u_k; w_k is the process noise, assumed to be drawn from a zero-mean multivariate normal distribution with covariance Q_k.]
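The predict/update cycle described above can be sketched in a minimal one-dimensional form. This is an illustrative Python reduction (the deck's own simulation is in MATLAB), with scalar state, unit models, and assumed noise variances:

```python
# Minimal 1-D Kalman filter: predict with the state model, then update
# with the measurement, weighting each by its uncertainty.
# F, Q, H, R below are assumed illustrative values.

def kalman_step(x, P, z, F=1.0, Q=0.01, H=1.0, R=1.0):
    """One predict/update cycle for a scalar state.
    x, P : prior state estimate and its variance
    z    : new measurement
    F, Q : state transition model and process-noise variance
    H, R : measurement model and measurement-noise variance
    """
    # Predict
    x_pred = F * x
    P_pred = F * P * F + Q
    # Update: the Kalman gain gives the most weight to the least uncertain value
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1 - K * H) * P_pred
    return x_new, P_new

x, P = 0.0, 1.0                      # poor initial guess, high uncertainty
for z in [1.1, 0.9, 1.05, 0.95]:     # noisy measurements around 1.0
    x, P = kalman_step(x, P, z)
print(x, P)  # estimate moves toward the measurements as the variance shrinks
```

Each step computes exactly the weighted average the slide describes: the gain K is large when the prediction is uncertain relative to the sensor, and small otherwise.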
  15. Simulation has been carried out with a two-dimensional state model (position and velocity) of the moving object along the x, y, and z directions. The program is executed in the MATLAB environment.
  16. As shown in the figure, the estimate obtained with the state-vector fusion method using the Kalman filter is closer and more accurate with respect to the actual track.
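State-vector fusion of this kind can be sketched as a covariance-weighted combination of two independent track estimates. The following Python sketch uses hypothetical numbers and assumes uncorrelated track errors (a simplification; it is not the deck's MATLAB code):

```python
# State-vector fusion sketch: two trackers each produce a state estimate
# (position, velocity) with a covariance; the fused track weights each
# estimate by the inverse of its covariance. Values are illustrative.
import numpy as np

def fuse_tracks(x1, P1, x2, P2):
    """Covariance-weighted fusion of two independent track estimates."""
    P1_inv = np.linalg.inv(P1)
    P2_inv = np.linalg.inv(P2)
    P = np.linalg.inv(P1_inv + P2_inv)   # fused covariance
    x = P @ (P1_inv @ x1 + P2_inv @ x2)  # fused state
    return x, P

x1 = np.array([10.0, 1.0]); P1 = np.diag([4.0, 0.5])  # track from position sensor
x2 = np.array([9.5, 1.2]);  P2 = np.diag([1.0, 2.0])  # track from velocity sensor
x, P = fuse_tracks(x1, P1, x2, P2)
print(x, np.diag(P))  # fused state [9.6, 1.04]; both variances shrink
```

Each fused component ends up closer to whichever tracker was more certain about it, which is why the fused curve hugs the actual track more tightly than either sensor alone.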
  17. Sample code
      % Missile_Launcher tracking Moving_Object using Kalman filter
      clear all
      %% Define our meta-variables (i.e. how long and how often we will sample)
      duration = 10; % how long the Moving_Object flies
      dt = .1;       % the Missile_Launcher continuously looks for the Object-in-motion,
                     % but we'll assume it is just repeatedly sampling over time at a fixed interval
      %% Define update equations (coefficient matrices): a physics-based model
      A = [1 dt; 0 1];  % state transition matrix: expected flight of the Moving_Object (state prediction)
      B = [dt^2/2; dt]; % input control matrix: expected effect of the input acceleration on the state
      C = [1 0];        % measurement matrix: the expected measurement given the state
      %% Define main variables
      u = 1.5;        % acceleration magnitude
      Q = [0; 0];     % initialized state -- two components: [position; velocity] of the Moving_Object
      Q_estimate = Q; % estimate of the initial location of the Moving_Object
      Moving_ObjectAccel_noise_mag = 0.05; % process noise: variability in the Moving_Object's acceleration
      Q_loc = [];      % ACTUAL Moving_Object flight path
      vel = [];        % ACTUAL Moving_Object velocity
      Q_loc_meas = []; % Moving_Object path that the Missile_Launcher sees
      %% Simulate what the Missile_Launcher sees over time
      figure(2); clf
      figure(1); clf
      % Generate the Moving_Object flight
      Moving_ObjectAccel_noise = Moving_ObjectAccel_noise_mag * [(dt^2/2)*randn; dt*randn];
      Q = A * Q + B * u + Moving_ObjectAccel_noise;
      .........................
      pause
      end
      % Plot the theoretical path of the Missile_Launcher that doesn't use the Kalman filter
      plot(0:dt:t, smooth(Q_loc_meas), '-g.')
      %plot(0:dt:t, vel, '-b.')
      %% Do Kalman filtering
      % Initialize estimation variables
      .........................
      % Plot the results
      figure(2);
      plot(tt, Q_loc, '-r.', tt, Q_loc_meas, '-k.', tt, Q_loc_estimate, '-g.'); % data measured by the Missile_Launcher
      .........................
      % Combined position estimate
      mu = Q_loc_estimate(T);    % mean
      sigma = P_mag_estimate(T); % standard deviation
      y = normpdf(x, mu, sigma); % pdf
      y = y / max(y);
      hl = line(x, y, 'Color', 'g'); % or use hold on and a normal plot
      axis([Q_loc_estimate(T)-5 Q_loc_estimate(T)+5 0 1]);
      % Actual position of the Moving_Object
      plot(Q_loc(T));
      ylim = get(gca, 'ylim');
      line([Q_loc(T); Q_loc(T)], ylim.', 'linewidth', 2, 'color', 'b');
      legend('state predicted', 'measurement', 'state estimate', 'actual Moving_Object position')
      pause
      end