
A neural model of head direction calibration during spatial navigation: learned integration of visual, vestibular, and motor cues

Thesis (Ph.D.)--Boston University, 2012

PLEASE NOTE: Boston University Libraries did not receive an Authorization To Manage form for this thesis or dissertation. It is therefore not openly accessible, though it may be available by request. If you are the author or principal advisor of this work and would like to request open access for it, please contact us at open-help@bu.edu. Thank you.

Effective navigation depends upon reliable estimates of head direction (HD). Visual, vestibular, and outflow motor signals combine for this purpose in a brain system that includes the dorsal tegmental nucleus, lateral mammillary nuclei (LMN), anterior dorsal thalamic nucleus (ADN), and postsubiculum (PoS). Learning is needed to combine such different cues into a reliable HD estimate. A neural model is developed to explain how these three types of signals combine adaptively within these brain regions to generate a consistent and reliable HD estimate, in both light and darkness. The model first establishes HD cells, each tuned to a preferred head direction: a cell's firing rate is maximal at its preferred direction and decreases as the head turns away from it. In the brain, HD cells fire in anticipation of a head rotation. This anticipation is measured by the anticipated time interval (ATI), which is greater at early processing stages of the HD system than at later stages: the ATI is greatest in the LMN at -70 ms, is reduced in the ADN to -25 ms, and is absent at the final HD stage, the PoS. In the model, the HD estimates at the corresponding processing stages are controlled by combinations of vestibular and motor signals as these become adaptively calibrated to produce a correct HD estimate. The model also simulates how visual cues anchor HD estimates through adaptive learning when a cue is in the animal's field of view; such learning gains control over cell firing within minutes.
As in the data, distal visual cues are more effective than proximal cues for anchoring the preferred direction. A novel cue introduced in either a novel or a familiar environment is learned and gains control over a cell's preferred direction within minutes. Turning out the lights or removing all familiar cues does not change a cell's firing activity, but drift may accumulate in the cell's preferred direction.
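The abstract describes HD cells as tuned to a preferred direction, with firing anticipating head rotation by an anticipatory interval (ATI) that shrinks from LMN (-70 ms) to ADN (-25 ms) to PoS (0 ms). A minimal sketch of these two properties, assuming a von Mises-shaped tuning curve and a linear velocity-based anticipation rule — both illustrative conventions, not the thesis's actual model equations:

```python
import math

def hd_tuning(theta, pref, peak=40.0, kappa=4.0):
    """Illustrative HD-cell tuning curve (von Mises shape).

    Firing rate (Hz) is maximal at the preferred direction `pref`
    and falls off smoothly as the head turns away from it. The
    `peak` and `kappa` values are hypothetical placeholders.
    Angles are in radians.
    """
    return peak * math.exp(kappa * (math.cos(theta - pref) - 1.0))

def anticipated_direction(theta, ang_vel, lead_ms):
    """Anticipated HD: extrapolate the current direction forward
    by the stage's anticipatory lead (e.g. 70 ms for LMN, 25 ms
    for ADN, 0 ms for PoS) using the current angular velocity."""
    return theta + ang_vel * (lead_ms / 1000.0)

# Rate is highest at the preferred direction and symmetric around it.
rate_at_pref = hd_tuning(0.0, 0.0)       # peak firing, 40.0 Hz here
rate_off = hd_tuning(math.pi / 2, 0.0)   # lower than peak

# During a 1 rad/s head turn, an LMN-like cell (70 ms lead) signals
# a direction slightly ahead of the current one; PoS (0 ms) does not.
lmn_hd = anticipated_direction(0.0, 1.0, 70.0)
pos_hd = anticipated_direction(0.0, 1.0, 0.0)
```

The decreasing lead across stages means that, under this sketch, LMN-like responses shift further ahead of the instantaneous heading than ADN-like responses during the same turn.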

Identifier: oai:union.ndltd.org:bu.edu/oai:open.bu.edu:2144/32882
Date: January 2012
Creators: Fortenberry, Bret
Publisher: Boston University
Source Sets: Boston University
Language: en_US
Detected Language: English
Type: Thesis/Dissertation
