[Figure: Joint Directors of Laboratories (JDL) data fusion process model. Level 1 processing results in a database containing estimates of the position, velocity, attributes, and identity of a limited number of physical entities.]

The output y is a row vector of length D in which element y_j indicates the confidence that the input data from the multiple-sensor set belong to class j. At time k, the output decision d(k) is the class that satisfies the maximum-confidence criterion of the following equation, with the sensor weights normalized so that they sum to one:

d(k) = \arg\max_{j} y_j(k), \qquad \sum_{i=1}^{N} w_i = 1

This implementation of weighted decision fusion permits future extension in two ways. First, it provides a path to the use of confidence values as inputs from each sensor, which would allow the fusion process to utilize fuzzy logic within the structure. Second, it enables the incorporation of an adaptive mechanism that can modify the sensor weights as data are processed through the system.

Bayesian Inference

Bayes' theorem16-18 serves as the basis for the Bayesian inference technique for identity fusion. This technique provides a method for computing the a posteriori probability of a particular outcome based on previous estimates of the likelihood and additional evidence. Bayesian inference assumes that a set of D mutually exclusive and exhaustive hypotheses or outcomes exists to explain a given situation.

In the decision-level multisensor fusion problem, Bayesian inference is implemented as follows. A system exists with K sensors that provide decisions on membership in one of D possible classes. The Bayesian fusion structure uses a priori information on the probability that a particular hypothesis exists and the likelihood that each sensor is able to classify the data to the correct hypothesis. The inputs to the structure are (1) P(O_j), the a priori probabilities that object j exists (or, equivalently, that a fault condition exists); (2) P(D_k = i | O_j), the likelihood that sensor k will classify the data as belonging to any one of the D hypotheses; and (3) D_k, the input decisions from the K sensors. The Bayesian combination rule is described by the following equation:

P(O_j \mid D_1, \ldots, D_K) = \frac{P(O_j) \prod_{k=1}^{K} P(D_k \mid O_j)}{\sum_{i=1}^{D} P(O_i) \prod_{k=1}^{K} P(D_k \mid O_i)}

The output is a vector with element j representing the a posteriori probability that the data belong to hypothesis O_j.
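To make the weighted decision fusion rule above concrete, the following is a minimal Python sketch under the stated assumptions (the function name weighted_decision_fusion and the example numbers are illustrative, not from the source): it normalizes the sensor weights so that they sum to one, forms the fused confidence vector y as the weighted sum of per-sensor class confidences, and selects the maximum-confidence class as the decision.

```python
import numpy as np

def weighted_decision_fusion(confidences, weights):
    """Fuse per-sensor class confidences using normalized sensor weights.

    confidences: (N, D) array; row i holds sensor i's confidence in each of the D classes.
    weights:     (N,) array of sensor weights; normalized here so that sum_i w_i = 1.
    Returns the fused confidence vector y (length D) and the decision d = argmax_j y_j.
    """
    confidences = np.asarray(confidences, dtype=float)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()      # enforce the constraint sum_i w_i = 1
    y = weights @ confidences              # y_j = sum_i w_i * c_{i,j}
    d = int(np.argmax(y))                  # maximum-confidence decision
    return y, d

# Example: N = 3 sensors, D = 4 classes (illustrative numbers only)
conf = [[0.7, 0.1, 0.1, 0.1],
        [0.5, 0.3, 0.1, 0.1],
        [0.2, 0.2, 0.5, 0.1]]
w = [2.0, 1.0, 1.0]
y, d = weighted_decision_fusion(conf, w)
print(y, d)
```

In this sketch, allowing the rows of the confidence matrix to come directly from each sensor (rather than hard 0/1 decisions) corresponds to the first extension mentioned above, and re-estimating the weight vector online corresponds to the second.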
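The Bayesian combination rule can be sketched in the same way. In this assumed illustration (the function name bayesian_decision_fusion, the confusion-matrix representation of the likelihoods, and the example values are hypothetical), each sensor supplies a hard decision D_k, the likelihoods P(D_k = i | O_j) are stored per sensor, and the posterior over the D hypotheses is obtained by multiplying the prior by the likelihood of each observed decision and normalizing.

```python
import numpy as np

def bayesian_decision_fusion(priors, likelihoods, decisions):
    """Compute the a posteriori probabilities P(O_j | D_1, ..., D_K).

    priors:      (D,) array of a priori probabilities P(O_j).
    likelihoods: (K, D, D) array; likelihoods[k, i, j] = P(D_k = i | O_j),
                 i.e. the probability that sensor k declares class i when O_j is true.
    decisions:   length-K sequence of declared class indices D_k.
    Returns the posterior vector; its argmax gives the fused identity declaration.
    """
    posterior = np.asarray(priors, dtype=float).copy()
    for k, d_k in enumerate(decisions):
        posterior *= likelihoods[k, d_k, :]   # multiply in P(D_k | O_j) for every hypothesis j
    posterior /= posterior.sum()              # normalize over the D hypotheses
    return posterior

# Example: K = 2 sensors, D = 2 hypotheses (illustrative numbers only)
priors = [0.5, 0.5]
likelihoods = np.array([[[0.9, 0.2],          # sensor 1: P(D_1 = i | O_j)
                         [0.1, 0.8]],
                        [[0.7, 0.3],          # sensor 2: P(D_2 = i | O_j)
                         [0.3, 0.7]]])
post = bayesian_decision_fusion(priors, likelihoods, decisions=[0, 0])
print(post)  # posterior probability of each hypothesis given both sensor decisions
```

The division by the sum over hypotheses implements the denominator of the combination rule, so the output vector sums to one and element j is the a posteriori probability of hypothesis O_j.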