The study of microcirculation has shown potential diagnostic value in diseases such as sepsis, chronic ulcers, diabetes mellitus, and hypertension. A technology that can quantitatively detect and monitor changes in microcirculation can enable early detection of these pathological conditions and therefore a better chance of successful treatment. In trauma care, it is also highly desirable to monitor microcirculation automatically during resuscitation, so that decisions to start or stop resuscitation can be made from real-time quantitative analysis. Recently developed hardware systems have made it possible to capture video recordings of capillaries on the lingual surface. Despite these advances, however, there is still a lack of effective computational methods to analyze and interpret these images.
This invention improves on existing technologies in that it is fully automated and extracts more diagnostically accurate information, such as functional capillary density and red blood cell velocity.
The system consists of two main steps: stabilization and segmentation. The method first stabilizes the videos to eliminate motion artifacts, then applies preprocessing techniques to the stabilized video to improve contrast and reduce noise.
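The stabilization and preprocessing steps can be sketched in a minimal form. The example below is an illustration, not the patented method: it assumes stabilization is done by estimating a whole-frame integer translation against a reference frame via Fourier-domain cross-correlation, and that contrast is improved by a simple linear stretch. The function names (`estimate_shift`, `stabilize`, `stretch_contrast`) are hypothetical.

```python
import numpy as np

def estimate_shift(ref, frame):
    """Estimate the integer (dy, dx) translation of `frame` relative to
    `ref` using cross-correlation computed in the Fourier domain."""
    corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Shifts beyond half the image size wrap around; map them to
    # negative offsets.
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx

def stabilize(frames):
    """Align every frame to the first frame by undoing its estimated
    translation (circular shift for simplicity)."""
    ref = frames[0]
    out = [ref]
    for frame in frames[1:]:
        dy, dx = estimate_shift(ref, frame)
        out.append(np.roll(frame, (dy, dx), axis=(0, 1)))
    return out

def stretch_contrast(img):
    """Linear contrast stretch of a grayscale frame to the [0, 1] range."""
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo) if hi > lo else img
```

In practice, microcirculation video often needs sub-pixel registration and local (rather than global) motion compensation; this sketch only conveys the overall structure of the step.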
The second step, segmentation, is performed at two resolutions using a Gaussian pyramid. The algorithm detects active capillaries and determines functional capillary density. It is capable of extracting approximately 95% of the capillaries and blood vessels in each frame and successfully distinguishes healthy from hemorrhagic subjects based on the calculated index.
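To make the multi-resolution idea concrete, here is a minimal sketch of one Gaussian-pyramid level and of functional capillary density computed as the fraction of the field of view covered by perfused capillary pixels. Both the 5-tap binomial kernel and the area-fraction definition of FCD are illustrative assumptions, not the method claimed in the invention.

```python
import numpy as np

def gaussian_downsample(img):
    """One Gaussian-pyramid level: blur with a separable 5-tap binomial
    kernel (a Gaussian approximation), then keep every other pixel."""
    k = np.array([1.0, 4.0, 6.0, 4.0, 1.0])
    k /= k.sum()
    smooth = lambda r: np.convolve(np.pad(r, 2, mode="reflect"), k, mode="valid")
    blurred = np.apply_along_axis(smooth, 0, img)   # blur columns
    blurred = np.apply_along_axis(smooth, 1, blurred)  # blur rows
    return blurred[::2, ::2]

def functional_capillary_density(capillary_mask):
    """Fraction of the frame occupied by active-capillary pixels in a
    binary segmentation mask (one simple proxy for FCD)."""
    return capillary_mask.sum() / capillary_mask.size
```

Segmenting first at the coarse pyramid level and refining the detected regions at full resolution is a common way to keep such a search efficient; the published algorithm's exact entropic criterion is described in the paper cited below.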
Initial findings were published as "A Multi-Resolution Entropic-based Image Processing Technique for Diagnostic Analysis of Microcirculation Videos" at BIOTECHNO 2010, the International Conference on Advances in Biotechnologies. The algorithm has been tested on microcirculation video recordings. This technology is available for licensing to industry for further development and commercialization.