Detecting Breast Cancer with a New Algorithm

What does remote sensing for camouflaged enemy ground vehicles have to do with breast cancer diagnosis? By next year, perhaps plenty. The Office of Naval Research has developed a smart sensor-fusion algorithm, modeled on the "unsupervised" learning of the human visual/brain system, together with a 200-channel hyperspectral remote sensing capability, for use as a passive electro-optical/infrared ground surveillance system. The same method has now shown success in detecting the heat radiated by abnormally reproducing breast cancer cells.

Hyperspectral sensors sweep up enormous quantities of data, but their usefulness has been limited by our ability to pull the important information out of that clutter; the algorithm that processes the data is the critical factor. Last year the Under Secretary of Defense for Science and Technology asked ONR to examine the potential of remote sensing to improve breast cancer diagnosis. The single-pixel unsupervised classification algorithm developed by Dr. Harold Szu and Mr. James Buss, based on the Lagrange Constraint Neural Network (LCNN) and multiple spectral measurements per pixel, was initially designed to increase the effectiveness of surveillance systems; it now promises to enhance the sensitivity and accuracy of breast cancer testing.

Abnormally reproducing cells demand greater nutrition through an increased blood supply, generating higher concentrations of heat in specific areas. Applying their algorithm, Szu and Buss are able to classify the infrared heat distribution given off by these cells.

A truly unsupervised per-pixel algorithm must be based on information derived directly from the spectral data alone. To reveal the hidden spectral features contained in a single-pixel image data vector X = [A]S, one has to invert the matrix equation without knowing either the breast-medium heat-transfer matrix (MTF) [A] or the heat source S, both of which vary from pixel to pixel. While ONR's space-variant imaging algorithm, which follows the spectral data vector analysis and the physics constraint of thermodynamic free-energy minimization, has achieved sub-pixel accuracy, other statistical Independent Component Analysis (ICA) methodologies suffer from a pixel-averaging blurring effect. This is because averaging over neighborhood pixels implicitly assumes an identical MTF [A] for space-invariant imaging, which would be true only for a tumor already so large that automatic target detection is no longer needed.
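The space-variant point can be illustrated numerically. The sketch below (not ONR's LCNN code; the matrix sizes and values are invented for illustration) builds the forward model x = [A]s for two neighboring pixels whose transfer matrices and sources both differ, then shows that inverting each pixel with its own matrix is exact, while inverting the averaged data under a single assumed matrix blurs the source estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
n_bands, n_sources = 4, 2  # assumed dimensions, for illustration only

# Each pixel has its own heat-transfer matrix [A] and heat source S,
# consistent with the text's statement that both vary pixel to pixel.
A1 = rng.random((n_bands, n_sources))
A2 = rng.random((n_bands, n_sources))
s1 = np.array([1.0, 0.2])
s2 = np.array([0.3, 0.9])
x1, x2 = A1 @ s1, A2 @ s2            # observed spectral data vectors

# Space-variant inversion: each pixel solved with its own matrix is exact.
s1_hat = np.linalg.lstsq(A1, x1, rcond=None)[0]
print(np.allclose(s1_hat, s1))       # exact per-pixel recovery

# Space-invariant style: average neighbors and assume one shared matrix.
# The second pixel's different [A] leaks into the estimate and blurs it.
x_avg = 0.5 * (x1 + x2)
s_avg_hat = np.linalg.lstsq(A1, x_avg, rcond=None)[0]
print(np.allclose(s_avg_hat, 0.5 * (s1 + s2)))
```

The least-squares solve stands in for whatever inversion method is used; the blurring comes from the shared-matrix assumption itself, not from the solver.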

Similar to a pair of human eyes, a pair of cameras at different infrared wavelengths* transcribes this thermal diffusion process into two images, which are then filtered for shared signals while the noise on which they disagree is minimized. Through this process, last February Szu, Buss, and their team detected early-stage ductal carcinoma in situ (DCIS) in a test patient using a double-blind procedure. See the images in the Image Gallery section of this media site.
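One simple way to realize "keep shared signals, suppress disagreement" is polarity agreement between the two bands. The sketch below is a hypothetical stand-in for the team's actual filter (the `fuse_agreement` function and the toy images are assumptions): each band is normalized, and pixels where the two bands agree in sign are averaged while disagreeing pixels are zeroed.

```python
import numpy as np

def fuse_agreement(mwir: np.ndarray, lwir: np.ndarray) -> np.ndarray:
    """Fuse two infrared images: keep pixels where both bands agree in
    polarity after normalization, zero out pixels where they disagree."""
    a = (mwir - mwir.mean()) / mwir.std()
    b = (lwir - lwir.mean()) / lwir.std()
    agree = np.sign(a) == np.sign(b)              # shared signal
    return np.where(agree, 0.5 * (a + b), 0.0)    # disagreement -> 0

# Toy 2x2 example: the warm spot at [0, 0] appears in both bands and
# survives fusion; the [0, 1] pixel has opposite polarity in the two
# bands (single-band noise) and is suppressed.
mwir = np.array([[3.0, 1.0], [-1.0, -3.0]])
lwir = np.array([[2.5, -1.0], [-0.5, -1.0]])
print(fuse_agreement(mwir, lwir))
```

A production fusion filter would be more elaborate, but the design choice is the same one the article describes: the two wavelength bands act as independent witnesses, and only features both bands report are trusted.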

"This multispectral, sub-pixel super-resolution is potentially more accurate by an order of magnitude," states Dr. Szu. "It offers a passive, inexpensive, non-intrusive, convenient means of screening pre-cancer patients without radiation hazard, and may potentially detect in situ carcinomas long before a mammogram might detect them."

Thermal breast scanning has been employed for a number of years, especially in Europe and Asia, but its use has been limited to a single infrared band recorded with a single camera. The "unsupervised" classification algorithm may offer an unbiased, more sensitive, more accurate, and generally more effective way to track the development of breast cancer, without requiring a long wait in a cold room, which increases the variability and inaccuracy of thermal detection and causes patient discomfort.

The success of the initial double-blind experiment supports the promise of multispectral imaging for improving early detection of breast cancer and possibly other dermal carcinomas. A provisional patent application has been filed. Follow-on research and clinical studies are being planned through Cooperative Research and Development Agreements (CRADAs). A web-based database of medical images (MedATR) is being developed by Advanced Concepts Analysis Inc., of Falls Church, VA, hosted on the Air Force Virtual Distributed Laboratory (VDL) secure web site.

*Medium-wavelength IR (3–5 µm) camera and long-wavelength IR (8–12 µm) camera. Both have a minimum resolvable temperature difference (MRTD) of about 10 millikelvin.
