
Automated counting of cell bodies using Nissl stained cross-sectional images

Cell count is an important metric in neurological research. The loss of certain cell types, such as neurons, has been found to accompany not only the deterioration of important brain functions but also disorders such as clinical depression. Because manually counting cells is nearly impossible given the sizes and numbers involved, an automated approach is the obvious alternative for arriving at a cell count. This thesis describes a software application that automatically segments, counts, and helps visualize the cell bodies present in a sample mouse brain by analyzing images produced by the Knife-Edge Scanning Microscope (KESM) at the Brain Networks Laboratory.
The process comprises five stages: image acquisition, pre-processing, processing, analysis and refinement, and finally visualization. Nissl staining is applied to the mouse brain sample to highlight the cell bodies of interest, namely neurons, granule cells, and interneurons. The stained sample is embedded in solid plastic and imaged by the KESM one section at a time. The volume digitized by this process is the data used for segmentation.
While most sections of the mouse brain contain sparsely populated neurons and red blood cells, certain sections near the cerebellum exhibit a very high density of smaller granule cells, which are hard to segment using simpler image segmentation techniques. The sparsely populated regions are handled using a combination of connected component labeling and template matching, while the watershed algorithm is applied to the regions of very high density.
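The two segmentation strategies can be sketched with standard scientific-Python tools. This is a minimal illustration under assumed toy data, not the thesis code: the mask, shapes, and parameters are invented, and watershed seeding via the distance transform is one common recipe, not necessarily the exact variant used in the thesis.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# Toy binary mask standing in for a thresholded Nissl section with
# two well-separated "cell bodies"; the real input is KESM image data.
mask = np.zeros((64, 64), dtype=bool)
mask[10:20, 10:20] = True
mask[40:50, 40:50] = True

# Sparse regions: plain connected component labeling suffices.
labels, n = ndi.label(mask)  # n = number of separate components

# Dense regions: watershed on the distance transform splits touching
# cells, seeded at local maxima of the distance map.
distance = ndi.distance_transform_edt(mask)
peaks = peak_local_max(distance, labels=labels, min_distance=5)
markers = np.zeros(mask.shape, dtype=int)
markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
segmented = watershed(-distance, markers, mask=mask)
```

On real data with touching granule cells, the distance-transform maxima act as one seed per cell, so the watershed splits clumps that connected component labeling would merge into one region.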
Finally, the marching cubes algorithm is used to convert the volumetric data to a 3D
polygonal representation.
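The final conversion step can be sketched with scikit-image's marching cubes implementation. Again a minimal illustration: the toy sphere volume and iso-level are assumptions standing in for a segmented cell body in the digitized KESM stack.

```python
import numpy as np
from skimage import measure

# Toy voxel volume: a solid sphere as a stand-in for one segmented
# cell body (sizes are illustrative, not from the thesis data).
z, y, x = np.mgrid[-16:16, -16:16, -16:16]
volume = (x**2 + y**2 + z**2 < 10**2).astype(np.float32)

# Marching cubes extracts a triangle mesh at the 0.5 iso-surface,
# converting volumetric data into a 3D polygonal representation.
verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)
```

The resulting vertex and face arrays can be handed directly to a polygon renderer for the kind of interactive 3D viewing the thesis describes.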
Barring a few initializations, the process runs with minimal manual intervention. A graphical user interface lets the user view the processed data in 2D or 3D, offering the freedom to rotate and zoom the 3D model as well as to display only the cells of interest. The segmentation results achieved by our automated process are compared with those obtained by manual segmentation performed by an independent expert.

Identifier: oai:union.ndltd.org:tamu.edu/oai:repository.tamu.edu:1969.1/ETD-TAMU-2035
Date: 15 May 2009
Creator: D'Souza, Aswin Cletus
Contributor: Keyser, John
Source Sets: Texas A and M University
Language: en_US
Detected Language: English
Type: thesis, text
Format: electronic, application/pdf, born digital
