A measure of the information content of EIT data

Physiol Meas. 2008 Jun;29(6):S101-9. doi: 10.1088/0967-3334/29/6/S09. Epub 2008 Jun 10.

Abstract

We ask: how many bits of information (in the Shannon sense) do we get from a set of EIT measurements? Here, the information in measurements (IM) is defined as the decrease in uncertainty about the contents of a medium due to a set of measurements. This decrease in uncertainty is quantified by the change from the inter-class model, q, defined by the prior information, to the intra-class model, p, given by the measured data (corrupted by noise). IM is measured by the expected relative entropy (Kullback-Leibler divergence) between the distributions q and p, and corresponds to the channel capacity in an analogous communications system. Based on a Gaussian model of the measurement noise, $\Sigma_n$, and a prior model of the image element covariances, $\Sigma_x$, we calculate $IM = \tfrac{1}{2}\sum_i \log_2(\mathrm{SNR}_i + 1)$, where $\mathrm{SNR}_i$ is the signal-to-noise ratio of each independent measurement, calculated from the prior and noise models. As an example, we consider saline tank measurements from a 16-electrode EIT system with a 2 cm radius non-conductive target, and calculate IM = 179 bits. Temporal sequences of frames are also considered, and formulae for IM as a function of temporal image element correlations are derived. We suggest that this measure may allow novel insights into questions such as distinguishability limits, optimal measurement schemes and data fusion.
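The formula above is the Gaussian channel capacity summed over independent measurement channels, so it can be evaluated numerically once the two covariances are in hand. The sketch below is a minimal illustration, assuming the image-element prior has already been mapped into measurement space (e.g. as Sigma_s = J @ Sigma_x @ J.T for a linearized sensitivity matrix J; that mapping is our assumption, not a detail stated in the abstract): jointly diagonalizing the signal and noise covariances yields the per-channel SNRs, and summing 1/2 log2(1 + SNR_i) gives IM in bits.

```python
import numpy as np
from scipy.linalg import eigh

def information_in_measurements(Sigma_s, Sigma_n):
    """Shannon information (bits) carried by one frame of noisy measurements.

    Sigma_s : (m, m) covariance of the noise-free measurement signal.
              Obtaining it from the image-element prior Sigma_x (e.g. as
              J @ Sigma_x @ J.T for a linearized sensitivity J) is an
              assumed modelling step, not spelled out in the abstract.
    Sigma_n : (m, m) covariance of the measurement noise (Gaussian model).
    """
    # Generalized eigenvalues of (Sigma_s, Sigma_n): whitening the noise
    # and diagonalizing the signal splits the data into independent
    # channels whose eigenvalues are exactly the per-channel SNRs.
    snr = eigh(Sigma_s, Sigma_n, eigvals_only=True)
    snr = np.clip(snr, 0.0, None)  # guard against tiny negative round-off
    # IM = 1/2 * sum_i log2(SNR_i + 1), the Gaussian channel capacity.
    return 0.5 * np.sum(np.log2(1.0 + snr))

# Toy usage: a synthetic 208-measurement frame (the measurement count of a
# 16-electrode adjacent-drive scheme) with white noise.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m = 208
    A = rng.standard_normal((m, m)) / np.sqrt(m)
    Sigma_s = A @ A.T                 # synthetic signal covariance (SPD)
    Sigma_n = 0.01 * np.eye(m)        # white measurement noise
    print(f"IM = {information_in_measurements(Sigma_s, Sigma_n):.1f} bits")
```

The generalized eigendecomposition is a convenient shortcut here: rather than explicitly whitening with $\Sigma_n^{-1/2}$ and then diagonalizing, a single call to `scipy.linalg.eigh(Sigma_s, Sigma_n)` returns the same per-channel SNRs.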

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Electric Impedance
  • Humans
  • Information Theory*
  • Time Factors
  • Tomography / methods*