1

Neural circuits for solving the cocktail party problem in mouse auditory cortex

Nocon, Jian Carlo P. 17 January 2023 (has links)
Neural circuits that mediate complex behaviors contain several cell types, yet little is known about the role of each cell type within these circuits. An example problem in the auditory domain is how cortical circuits process complex natural sounds amidst competing stimuli from different spatial sources, also known as the "cocktail party effect". A previous study recorded cortical responses in songbirds and found that neurons are broadly tuned to sound location when only one sound is present; when a competing stimulus is introduced, neurons sharpen their spatial tuning. These results were visualized as "spatial grids" showing preferred sound-source locations in the presence of competing stimuli, and they motivated a computational model which proposed that lateral inhibition between spatially tuned channels within cortex is a key mechanism for spatial sound segregation. Cortical circuits are known to contain both excitatory cells and subpopulations of inhibitory interneurons, whose roles can be probed in vivo with optogenetic techniques. Motivated by these past results and by the optogenetic tools readily available in the mouse model, I present experimental and computational approaches to uncovering the cortical circuits that help solve the cocktail party problem in mouse auditory cortex (ACx). First, I probe the role of parvalbumin-expressing (PV) interneurons in solving the cocktail party problem using optogenetic and electrophysiological techniques. I found that mice exhibit cortical spatial grids similar to those in songbirds, and that optogenetic suppression of PV neurons reduces discriminability between dynamic sounds in both clean and masked presentations of spatially distributed stimuli. To explain these results mechanistically, I created a two-layer computational model of ACx with PV subpopulations that respond to distinct temporal stimulus features, and found that differentially weighting inhibition from these interneurons captures the range of neural discriminability performance observed in cortex as well as the effects of optogenetically suppressing PV cells. Next, I analyze the population coding of neurons during the cocktail party problem and find that a relatively compact and diverse population of cortical neurons is sufficient for encoding sounds from competing spatial locations. Finally, I determine how changes in behavioral state during tone-extinction tasks affect activity in ACx and medial prefrontal cortex (mPFC). Alpha and beta oscillations (8-18 Hz) in response to unrewarded tones showed immediate and robust increases in both regions prior to behavioral changes; once subjects learned to suppress behavioral responses, coherence at 8-18 Hz between ACx and mPFC was enhanced and spiking in ACx in response to the unrewarded tone decreased. Taken together, this work advances our knowledge of both the bottom-up and top-down circuit mechanisms underlying the cocktail party problem.
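The lateral-inhibition mechanism described in this abstract can be illustrated with a small toy simulation. The sketch below is not the dissertation's model: the Gaussian tuning curves, channel layout, and mean-of-others inhibition rule are assumptions chosen only to show how inhibition pooled across spatially tuned channels reshapes responses when a competing source is added. It assumes only numpy.

# Illustrative sketch only: toy two-layer model with lateral inhibition
# between spatially tuned channels. All parameters and the inhibition rule
# are assumptions for illustration, not the dissertation's actual model.
import numpy as np

def spatial_tuning(preferred_deg, location_deg, width_deg=45.0):
    # Broad Gaussian tuning of an input channel to sound-source azimuth.
    return np.exp(-0.5 * ((location_deg - preferred_deg) / width_deg) ** 2)

def cortical_response(stim_locs_deg, preferred_deg, inhibition_weight=0.8):
    # Layer 1: each channel sums feedforward drive from all sound sources.
    preferred = np.asarray(preferred_deg, dtype=float)
    drive = np.zeros_like(preferred)
    for loc in stim_locs_deg:
        drive += spatial_tuning(preferred, loc)
    # Layer 2: subtract lateral inhibition pooled from the other channels,
    # then rectify. Inhibition grows when a competing source adds drive.
    lateral = inhibition_weight * (drive.sum() - drive) / max(len(preferred) - 1, 1)
    return np.maximum(drive - lateral, 0.0)

channels = np.array([-90.0, -45.0, 0.0, 45.0, 90.0])   # preferred azimuths (deg)
print("single source at 0 deg:     ", cortical_response([0.0], channels).round(2))
print("sources at 0 and 90 deg:    ", cortical_response([0.0, 90.0], channels).round(2))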
2

Decoding spatial location of attended audio-visual stimulus with EEG and fNIRS

Ning, Matthew H. 17 January 2023 (has links)
When analyzing complex scenes, humans often focus their attention on an object at a particular spatial location in the presence of background noise and irrelevant visual objects. The ability to decode the attended spatial location would facilitate brain-computer interfaces (BCIs) for complex scene analysis. Here, we tested two neuroimaging technologies and investigated their capability to decode audio-visual spatial attention in the presence of competing stimuli from multiple locations. For functional near-infrared spectroscopy (fNIRS), we targeted the dorsal frontoparietal network, including the frontal eye field (FEF) and intraparietal sulcus (IPS), as well as the superior temporal gyrus/planum temporale (STG/PT), all of which have been shown in previous functional magnetic resonance imaging (fMRI) studies to be activated by auditory, visual, or audio-visual spatial tasks. We found that fNIRS provides robust decoding of attended spatial locations for most participants and that decoding performance correlates with behavioral performance. Moreover, we found that the FEF makes a large contribution to decoding performance. Surprisingly, performance was significantly above chance level 1 s after cue onset, which is well before the peak of the fNIRS response. For electroencephalography (EEG), several successful decoding algorithms exist, but to date all of them have focused exclusively on the auditory modality in settings where eye-related artifacts are minimized or controlled. Successful integration into more ecologically typical usage requires careful handling of eye-related artifacts, which are inevitable. We showed that fast and reliable decoding can be achieved with or without an ocular-artifact removal algorithm. Our results show that EEG and fNIRS are promising platforms for compact, wearable technologies that could be applied to decode the attended spatial location and to reveal the contributions of specific brain regions during complex scene analysis.
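As a rough illustration of the decoding approach described in this abstract, the sketch below fits a cross-validated linear classifier to synthetic multichannel features and reports accuracy against chance. The feature definition (e.g., fNIRS HbO amplitudes or EEG band power), the classifier, and the data dimensions are assumptions for illustration; the dissertation's actual pipeline is not reproduced here. It assumes numpy and scikit-learn.

# Illustrative sketch only: cross-validated decoding of attended spatial
# location from multichannel features. Shapes and signal structure are
# synthetic assumptions, not the dissertation's data or pipeline.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels = 120, 32            # assumed: trials x recording channels
labels = rng.integers(0, 2, n_trials)     # 0 = attend left, 1 = attend right
features = rng.normal(size=(n_trials, n_channels))
features[labels == 1, :8] += 0.7          # toy attention-dependent signal in a channel subset

# Linear discriminant classifier with 5-fold cross-validation.
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, features, labels, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f} (chance = 0.50)")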
