Multi-Look SAR ATR Using Two-Level Decision Fusion of Neural Network and Sparse Representation
By Xuan Li, Chun-Sheng Li, and Pengbo Wang
Progress In Electromagnetics Research M, Vol. 46, 89-100, 2016
Abstract
Decision fusion has so far contributed little to pose estimation, and SAR ATR calls for combining feature fusion with decision fusion. Motivated by these two observations, this paper proposes a new multi-look SAR ATR method, aided by pose estimation and based on two-level decision fusion of a neural network and sparse representation, to improve recognition performance. The first-level decision fusion combines the pose estimates produced by the neural network and by sparse representation. Under the resulting pose constraint, the two models are then applied to multi-look SAR ATR, and the second-level decision fusion produces the final recognition result. Several experiments on the MSTAR dataset show that the proposed method achieves acceptable recognition results.
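The abstract's two-level scheme can be read as two steps: first, fuse the pose estimates from the neural network and the sparse-representation model into a single pose; second, use that pose to constrain each classifier's class scores on every look and fuse the constrained scores across classifiers and looks into the final decision. The Python sketch below illustrates this flow under stated assumptions; the specific fusion rules (confidence-weighted circular mean, a ±15° pose window, a sum rule), the function names, and the synthetic inputs are all illustrative assumptions, not the authors' implementation.

```python
# A minimal, hypothetical sketch of two-level decision fusion with a pose constraint.
import numpy as np

N_CLASSES = 3            # e.g. number of MSTAR target classes
POSE_WINDOW_DEG = 15.0   # assumed half-width of the pose-constraint window


def fuse_pose(pose_nn, conf_nn, pose_sr, conf_sr):
    """First-level fusion: confidence-weighted circular mean of the two pose estimates (degrees)."""
    weights = np.array([conf_nn, conf_sr], dtype=float)
    angles = np.deg2rad([pose_nn, pose_sr])
    vec = weights @ np.column_stack([np.cos(angles), np.sin(angles)])
    return np.rad2deg(np.arctan2(vec[1], vec[0])) % 360.0


def pose_constrained_scores(raw_scores, template_poses, fused_pose):
    """Zero out class scores whose matching template pose falls outside the pose window."""
    diff = np.abs((template_poses - fused_pose + 180.0) % 360.0 - 180.0)
    constrained = np.where(diff <= POSE_WINDOW_DEG, raw_scores, 0.0)
    total = constrained.sum()
    if total == 0.0:     # no template survives the constraint: fall back to a uniform score vector
        return np.full(raw_scores.shape, 1.0 / raw_scores.size)
    return constrained / total


def second_level_fusion(per_look_scores):
    """Second-level fusion: sum rule over both classifiers and all looks, then arg-max."""
    fused = per_look_scores.sum(axis=(0, 1))
    return int(np.argmax(fused)), fused / fused.sum()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_looks = 4
    per_look = np.empty((n_looks, 2, N_CLASSES))   # looks x {NN, SR} x classes

    for k in range(n_looks):
        # Assumed per-look outputs: each classifier reports a pose estimate plus a confidence.
        pose_nn, conf_nn = 42.0 + rng.normal(0.0, 3.0), 0.8
        pose_sr, conf_sr = 40.0 + rng.normal(0.0, 5.0), 0.6
        fused_pose = fuse_pose(pose_nn, conf_nn, pose_sr, conf_sr)       # first-level fusion

        # Synthetic class scores and the pose of the template behind each score.
        template_poses = rng.uniform(0.0, 360.0, N_CLASSES)
        template_poses[1] = (fused_pose + rng.normal(0.0, 2.0)) % 360.0  # class 1 is pose-consistent
        for c in range(2):                                               # c = 0: NN, c = 1: SR
            raw = rng.dirichlet(np.ones(N_CLASSES))
            per_look[k, c] = pose_constrained_scores(raw, template_poses, fused_pose)

    label, posterior = second_level_fusion(per_look)                     # second-level fusion
    print("fused decision:", label, "posterior:", np.round(posterior, 3))
```

The sum rule in the second stage is only one possible combination rule; a product (Bayesian) rule or majority vote over looks would fit the same two-level structure.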
Citation
Xuan Li, Chun-Sheng Li, and Pengbo Wang, "Multi-Look SAR ATR Using Two-Level Decision Fusion of Neural Network and Sparse Representation," Progress In Electromagnetics Research M, Vol. 46, 89-100, 2016.
doi:10.2528/PIERM15092304