Abstract
In large-scale information fusion networks, the statistical dependencies between nodes are rarely known. Failing to account for this in the design of distributed estimation schemes may lead to statistical inconsistencies and may even cause the estimators at some nodes in the network to diverge. Among the approaches proposed to address this problem, covariance intersection and its generalization are perhaps the best known. Owing to its relationship with the statistical divergence known as Chernoff information, the latter approach is referred to here as Chernoff fusion. Applying Chernoff fusion requires tuning of the fusion parameters, which in some applications may introduce considerable overhead. This work is concerned with the practical application of Chernoff fusion to distributed estimation. We provide a technique for fast fusion of particle filters based on a linear variant of Chernoff fusion, as well as a recursive technique for tuning the underlying mixing weights. The viability of the proposed schemes is demonstrated in applications such as distributed object tracking, cooperative robot localization, and image classification with ensembles of convolutional neural networks.
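To make the fusion rule concrete, below is a minimal sketch of covariance intersection, the Gaussian special case of Chernoff fusion, for two local estimates whose cross-correlation is unknown. The scalar search for the mixing weight (minimizing the trace of the fused covariance) is an illustrative choice and is not the recursive tuning scheme or the linear particle-filter variant proposed in the paper; all function and variable names are hypothetical.

```python
# Sketch: covariance intersection of two Gaussian estimates with unknown
# cross-correlation. Illustrative only; not the paper's proposed algorithm.
import numpy as np
from scipy.optimize import minimize_scalar

def covariance_intersection(xa, Pa, xb, Pb):
    """Fuse estimates (xa, Pa) and (xb, Pb) without knowing their correlation."""
    Pa_inv, Pb_inv = np.linalg.inv(Pa), np.linalg.inv(Pb)

    def fused(omega):
        # Convex combination of the information (inverse-covariance) forms.
        P = np.linalg.inv(omega * Pa_inv + (1.0 - omega) * Pb_inv)
        x = P @ (omega * Pa_inv @ xa + (1.0 - omega) * Pb_inv @ xb)
        return x, P

    # Choose the mixing weight that minimizes the trace of the fused covariance.
    res = minimize_scalar(lambda w: np.trace(fused(w)[1]),
                          bounds=(0.0, 1.0), method="bounded")
    return fused(res.x)

# Example: two 2-D position estimates with complementary uncertainty shapes.
x_fused, P_fused = covariance_intersection(
    np.array([1.0, 2.0]), np.diag([4.0, 1.0]),
    np.array([1.5, 1.8]), np.diag([1.0, 4.0]))
print(x_fused, P_fused)
```

The fused covariance produced this way is guaranteed consistent for any admissible cross-correlation, which is the property that motivates using this family of rules in networks where dependencies between nodes are unknown.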
Original language | English |
---|---|
Article number | 102877 |
Journal | Digital Signal Processing: A Review Journal |
Volume | 107 |
DOIs | |
State | Published - 1 Dec 2020 |
Keywords
- Chernoff fusion
- Distributed information fusion
- Distributed particle filtering
- Ensembles of neural networks
ASJC Scopus subject areas
- Signal Processing
- Computer Vision and Pattern Recognition
- Statistics, Probability and Uncertainty
- Computational Theory and Mathematics
- Electrical and Electronic Engineering
- Artificial Intelligence
- Applied Mathematics