
Statistical Machine Learning Methods for the Large Scale Analysis of Neural Data

Mena, Gonzalo Esteban. January 2018.
Modern neurotechnologies enable the recording of neural activity at the scale of entire brains and with single-cell resolution. However, the lack of principled approaches to extract structure from these massive data streams prevents us from fully exploiting the potential of these technologies. This thesis, divided into three parts, introduces new statistical machine learning methods to enable the large-scale analysis of some of these complex neural datasets. In the first part, I present a method that leverages Gaussian quadrature to accelerate inference of neural encoding models from a certain type of observed neural point process, the spike train, resulting in substantial improvements over existing methods. The second part focuses on the simultaneous electrical stimulation and recording of neurons using large electrode arrays. There, identification of neural activity is hindered by stimulation artifacts that are much larger than spikes and overlap with them temporally. To surmount this challenge, I develop an algorithm to infer and cancel this artifact, enabling inference of the neural signal of interest. The algorithm is based on a Bayesian generative model for the recordings, in which a structured Gaussian process represents prior knowledge of the artifact. It achieves near-perfect accuracy and enables the analysis of data hundreds of times faster than previous approaches. The third part is motivated by the problem of inferring neural dynamics in the worm C. elegans: when taking a data-driven approach to this question, e.g., using whole-brain calcium imaging data, one must match neural recordings to canonical neural identities, a task resolved in practice by tedious human labor. Alternatively, in a Bayesian setup this problem may be cast as posterior inference of a latent permutation. I introduce methods that enable gradient-based approximate posterior inference of permutations, overcoming the difficulties imposed by the combinatorial and discrete nature of this object. The results suggest the feasibility of automating neural identification, and demonstrate that variational inference over permutations is a sensible alternative to MCMC.
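To make the quadrature idea in the first part concrete, the short Python sketch below (not the thesis's implementation; the function name and parameters are illustrative) uses Gauss-Hermite quadrature to approximate a Gaussian expectation of the kind that appears in Poisson point-process likelihoods for spike trains.

    import numpy as np

    def gauss_hermite_expectation(f, mu, sigma, n_points=20):
        # Approximate E[f(x)] for x ~ N(mu, sigma^2) with Gauss-Hermite quadrature.
        nodes, weights = np.polynomial.hermite.hermgauss(n_points)
        x = mu + np.sqrt(2.0) * sigma * nodes      # change of variables to the standard weight
        return np.sum(weights * f(x)) / np.sqrt(np.pi)

    # Example: E[exp(x)] for x ~ N(0.5, 0.3^2), a term typical of Poisson-GLM rates;
    # the exact value is exp(mu + sigma^2 / 2), so the two numbers should agree closely.
    print(gauss_hermite_expectation(np.exp, 0.5, 0.3), np.exp(0.5 + 0.3**2 / 2))

With a handful of quadrature nodes this replaces an otherwise costly numerical integral, which is the basic source of the speedups the abstract refers to.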
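The second part's artifact cancellation can be caricatured as Gaussian-process regression: model the slow stimulation artifact with a smooth GP prior, take its posterior mean given the raw trace, and subtract it so spikes survive. The sketch below is a deliberately simplified, single-channel stand-in for the structured GP model described in the thesis (which shares structure across electrodes and stimulation amplitudes); all names and constants are illustrative.

    import numpy as np

    def rbf_kernel(t1, t2, lengthscale=5.0, variance=100.0):
        # Squared-exponential kernel: assumes the artifact varies smoothly in time,
        # unlike the much faster spike waveforms.
        d = t1[:, None] - t2[None, :]
        return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

    def estimate_artifact(y, t, noise_var=1.0):
        # GP posterior mean of the artifact given the raw recording y(t).
        K = rbf_kernel(t, t)
        return K @ np.linalg.solve(K + noise_var * np.eye(len(t)), y)

    t = np.arange(200, dtype=float)
    artifact = 50.0 * np.exp(-t / 40.0)                # slow stimulation transient
    spikes = np.zeros_like(t)
    spikes[[80, 81, 82]] = [-8.0, 14.0, -6.0]          # a brief spike waveform
    y = artifact + spikes + 0.5 * np.random.randn(len(t))
    cleaned = y - estimate_artifact(y, t)              # the spike remains after subtraction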
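For the third part, the standard differentiable relaxation of a latent permutation is the Sinkhorn operator, which maps a matrix of matching scores to a doubly stochastic matrix that approaches a permutation as a temperature parameter goes to zero. The sketch below illustrates that operator on a toy neuron-to-identity matching problem; it is an assumed, simplified example, not the exact algorithm or data from the thesis.

    import numpy as np
    from scipy.special import logsumexp

    def sinkhorn(log_alpha, n_iters=50, temperature=0.1):
        # Alternately normalize rows and columns in log space. The result is
        # doubly stochastic and nears a permutation matrix at low temperature,
        # giving a differentiable surrogate for posterior inference over permutations.
        log_p = log_alpha / temperature
        for _ in range(n_iters):
            log_p -= logsumexp(log_p, axis=1, keepdims=True)   # rows sum to 1
            log_p -= logsumexp(log_p, axis=0, keepdims=True)   # columns sum to 1
        return np.exp(log_p)

    # Toy matching: similarity scores between 4 recorded neurons and 4 canonical identities.
    scores = np.array([[2.0, 0.1, 0.3, 0.0],
                       [0.2, 0.1, 1.8, 0.4],
                       [0.0, 2.2, 0.1, 0.3],
                       [0.3, 0.2, 0.1, 1.9]])
    P = sinkhorn(np.log(scores + 1e-6))
    print(P.round(2))   # close to the permutation pairing each neuron with one identity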
