Patch-based sparse reconstruction for electrical impedance tomography
Abstract
Purpose
Electrical impedance tomography (EIT) is a technique for reconstructing the conductivity distribution inside a subject by injecting currents at its boundary and measuring the resulting voltage changes. Image reconstruction for EIT is a nonlinear inverse problem that is ill-posed, and the generalized inverse operator is ill-conditioned. As a result, the solutions for EIT are not unique and are highly sensitive to measurement noise.
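To make the ill-conditioning concrete, the following minimal sketch (not taken from the paper) builds a synthetic linearized sensitivity matrix mapping conductivity changes to boundary-voltage changes and compares a plain generalized inverse with a Tikhonov-regularized one; the matrix sizes, noise level and regularization weight are illustrative assumptions only.

```python
# Illustrative sketch of the ill-conditioned linearized EIT inverse (assumed shapes).
import numpy as np

rng = np.random.default_rng(0)

m, n = 208, 1024                                   # e.g. 208 boundary measurements, 1024 pixels (assumed)
J = rng.standard_normal((m, n)) * np.logspace(0, -6, m)[:, None]  # rapidly decaying scales -> ill-conditioned
x_true = np.zeros(n)
x_true[100:140] = 1.0                              # a small conductivity perturbation
v = J @ x_true + 1e-4 * rng.standard_normal(m)     # noisy boundary-voltage changes

# Plain generalized inverse: noise is amplified along directions with small singular values.
x_pinv = np.linalg.pinv(J) @ v

# Tikhonov-regularized solution: (J^T J + lam * I) x = J^T v
lam = 1e-3
x_tik = np.linalg.solve(J.T @ J + lam * np.eye(n), J.T @ v)

print("condition number of J :", np.linalg.cond(J))
print("error, generalized inv:", np.linalg.norm(x_pinv - x_true))
print("error, Tikhonov       :", np.linalg.norm(x_tik - x_true))
```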
Design/methodology/approach
This paper develops a novel image reconstruction algorithm for EIT based on patch-based sparse representation, in which sparsifying-dictionary optimization and image reconstruction are performed alternately. Two patch-based sparsity models, namely square-patch sparsity and column-patch sparsity, are discussed and compared with global sparsity.
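The sketch below (a simplified illustration, not the authors' implementation) shows the flavour of such an alternating scheme: square patches, or alternatively whole image columns, are gathered from the current conductivity image, sparse-coded over a dictionary by orthogonal matching pursuit, and the dictionary is refreshed with a MOD-style least-squares update. The patch size, number of atoms, sparsity level and the omission of the EIT data-fit step are assumptions made for brevity.

```python
# Simplified sketch of patch-based sparse coding with alternating dictionary updates.
import numpy as np

def square_patches(img, p):
    """Collect all non-overlapping p-by-p square patches as columns (square-patch sparsity)."""
    h, w = img.shape
    cols = [img[i:i + p, j:j + p].ravel()
            for i in range(0, h - p + 1, p)
            for j in range(0, w - p + 1, p)]
    return np.stack(cols, axis=1)

def column_patches(img):
    """Treat each image column as one patch (column-patch sparsity)."""
    return img.copy()

def omp(D, y, k):
    """Greedy orthogonal matching pursuit: k-sparse code of y over dictionary D."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

def alternate(img, p=4, atoms=32, k=3, iters=5, seed=0):
    """Alternate sparse coding of square patches with a MOD-style dictionary update."""
    rng = np.random.default_rng(seed)
    Y = square_patches(img, p)                      # patches stacked as columns
    D = rng.standard_normal((p * p, atoms))
    D /= np.linalg.norm(D, axis=0)
    for _ in range(iters):
        X = np.stack([omp(D, Y[:, j], k) for j in range(Y.shape[1])], axis=1)
        D = Y @ np.linalg.pinv(X)                   # dictionary update (MOD step)
        D /= np.linalg.norm(D, axis=0) + 1e-12      # renormalize atoms
    return D, X

if __name__ == "__main__":
    img = np.add.outer(np.sin(np.linspace(0, 3, 32)), np.cos(np.linspace(0, 3, 32)))
    D, X = alternate(img)
    print("dictionary:", D.shape, "codes:", X.shape)
```

In a full reconstruction loop, the sparse-coded patches would be reassembled into an image estimate and fitted back to the measured boundary voltages before the next dictionary update; that data-fit step is omitted here.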
Findings
Both simulation and experimental results indicate that the patch-based sparsity method improves the quality of image reconstruction and tolerates a relatively high level of noise in the measured voltages.
Originality/value
The EIT image is reconstructed using patch-based sparse representation. Square-patch sparsity and column-patch sparsity are proposed and compared. Sparsifying-dictionary optimization and image reconstruction are performed alternately. The new method tolerates a relatively high level of noise in the measured voltages.
Keywords
Acknowledgements
The authors gratefully acknowledge the financial support from the National Natural Science Foundation of China (61402330, 61301246, 61301244 and 61601324), the PhD Programs Foundation of Ministry of Education of China (20131201120002) and the Natural Science Foundation of Tianjin Municipal Science and Technology Commission (15JCQNJC01500).
Citation
Wang, Q., Zhang, P., Wang, J., Chen, Q., Lian, Z., Li, X., Sun, Y., Duan, X., Cui, Z., Sun, B. and Wang, H. (2017), "Patch-based sparse reconstruction for electrical impedance tomography", Sensor Review, Vol. 37 No. 3, pp. 257-269. https://doi.org/10.1108/SR-07-2016-0126
Publisher
Emerald Publishing Limited
Copyright © 2017, Emerald Publishing Limited