291

Addressing Challenges in Graphical Models: MAP estimation, Evidence, Non-Normality, and Subject-Specific Inference

Sagar K N Ksheera (15295831) 17 April 2023 (has links)
Graphs are a natural choice for understanding the associations between variables, and assuming a probabilistic embedding for the graph structure leads to a variety of graphical models that enable us to understand these associations even further. In the realm of high-dimensional data, where the number of associations between interacting variables is far greater than the available number of data points, the goal is to infer a sparse graph. In this thesis, we make contributions in the domain of Bayesian graphical models, where our prior belief on the graph structure, encoded via uncertainty on the model parameters, enables the estimation of sparse graphs.

We begin with the Gaussian Graphical Model (GGM) in Chapter 2, one of the simplest and most famous graphical models, where the joint distribution of the interacting variables is assumed to be Gaussian. In GGMs, the conditional independence among variables is encoded in the inverse of the covariance matrix, also known as the precision matrix. Under a Bayesian framework, we propose a novel prior-penalty dual, the 'graphical horseshoe-like' prior and penalty, to estimate the precision matrix. We also establish the posterior convergence of the precision matrix estimate and the frequentist consistency of the maximum a posteriori (MAP) estimator.

In Chapter 3, we develop a general framework based on local linear approximation for MAP estimation of the precision matrix in GGMs. This framework applies to any graphical prior whose element-wise priors can be written as Laplace scale mixtures. As an application of the framework, we perform MAP estimation of the precision matrix under the graphical horseshoe penalty.

In Chapter 4, we turn to graphical models in which the joint distribution of the interacting variables cannot be assumed to be Gaussian. Motivated by quantile graphical models, which relax the Gaussian likelihood assumption, we draw inspiration from precision medicine, where personalized inference is crucial for tailoring individual-specific treatment plans. With the aim of inferring Directed Acyclic Graphs (DAGs), we propose a novel quantile DAG learning framework in which the DAGs depend on individual-specific covariates, making personalized inference possible. We demonstrate the potential of this framework in precision medicine by applying it to infer protein-protein interaction networks in lung adenocarcinoma and lung squamous cell carcinoma.

Finally, we conclude the thesis in Chapter 5 by developing a novel framework to compute the marginal likelihood in a GGM, addressing a longstanding open problem. Under this framework, we can compute the marginal likelihood for a broad class of priors on the precision matrix, where the element-wise priors on the diagonal entries can be written as gamma or scale mixtures of gamma random variables and those on the off-diagonal entries can be written as normal or scale mixtures of normal random variables. This result opens new avenues for model selection using Bayes factors and for tuning prior hyper-parameters.
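For readers less familiar with the notation, the two standard GGM facts the abstract relies on can be written out explicitly. This is a generic sketch of the model and of a penalized MAP objective under an arbitrary element-wise penalty; it does not reproduce the thesis's graphical horseshoe-like prior or its local linear approximation scheme.

    % Generic Gaussian graphical model facts (not the thesis's specific prior).
    % For X ~ N_p(0, Sigma) with precision matrix Omega = Sigma^{-1}:
    \[
      X_i \perp\!\!\!\perp X_j \mid X_{-\{i,j\}}
      \quad\Longleftrightarrow\quad \Omega_{ij} = 0 ,
    \]
    % so zeros in Omega are exactly the missing edges of the graph. Given n samples
    % with sample covariance S, a penalized (MAP) estimate of Omega has the generic form
    \[
      \widehat{\Omega}_{\mathrm{MAP}}
      = \arg\max_{\Omega \succ 0}
        \Bigl\{ \tfrac{n}{2}\log\det\Omega - \tfrac{n}{2}\operatorname{tr}(S\Omega)
                - \sum_{i \le j} \operatorname{pen}(\omega_{ij}) \Bigr\},
    \]
    % where pen(.) is the negative log of the element-wise prior: a graphical
    % horseshoe-like penalty in Chapter 2, and any Laplace scale mixture in Chapter 3.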
292

Towards structured planning and learning at the state fisheries agency scale

Aldridge, Caleb A 09 December 2022 (has links)
The field of inland recreational fisheries has grown philosophically and scientifically to consider economic and sociopolitical (non-biological) aspects in addition to biological ones. However, integrating the biological and non-biological aspects of inland fisheries has been challenging. Thus, an opportunity exists to develop approaches and tools that operationalize planning and decision-making processes encompassing both biological and non-biological aspects of a fishery. This dissertation expands the idea that a core set of goals and objectives is shared among and within inland fisheries agencies; that many routine operations of inland fisheries managers can be regimented or standardized; and the novel concept that current information and operations can be used to improve decision making through structured decision making and adaptive management approaches at the agency scale. In CHAPTER II, my results show that the goals of inland fisheries agencies tend to be more similar than different but have expanded and diversified since the 1970s. I suggest that changes in perspectives and communication technology, as well as provisions within nationwide funding mechanisms, have led to goals becoming more homogeneous across the USA and more diverse within each bureau. In CHAPTER III, I found that standardized collection and careful curation of data have allowed one inland fisheries bureau to assemble a large fish and fisheries database, and that managers use this database to summarize common fish population parameters and indices, craft objectives, and set targets. This regimentation of data management and analysis has helped managers within the bureau assess fish populations and fisheries efficiently and effectively across waterbodies within their districts and state. In CHAPTER IV, I extend CHAPTERS II and III to show that biological and non-biological management objectives, together with their associated measurable attributes and management actions, can be synthesized into a common set of decision elements. I demonstrate how these common decision elements enable managers to structure decisions easily and help address common problems at the agency scale. Using a subset of the common decision elements, I demonstrate how existing agency operations (e.g., monitoring) can be used to expedite learning and improve decision making for a common problem faced by managers in multiple, similar systems.
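As a purely illustrative aside (not taken from the dissertation), the toy sketch below shows one simple way "common decision elements" can be combined: objectives with weights, candidate management actions, and predicted performance rolled up into a weighted-sum consequence table. All names, weights, and scores here are hypothetical.

    # Illustrative sketch only: a toy weighted-sum consequence table in the
    # spirit of structured decision making. Objectives, weights, actions, and
    # predicted performance values are hypothetical, not from the dissertation.
    objectives = {"catch_rate": 0.5, "angler_satisfaction": 0.3, "cost": 0.2}

    # Predicted performance of each candidate action on each objective,
    # rescaled to 0-1 with 1 = best (so "cost" is already inverted).
    actions = {
        "increase_stocking":   {"catch_rate": 0.8, "angler_satisfaction": 0.7, "cost": 0.3},
        "harvest_regulation":  {"catch_rate": 0.6, "angler_satisfaction": 0.5, "cost": 0.9},
        "habitat_improvement": {"catch_rate": 0.7, "angler_satisfaction": 0.8, "cost": 0.4},
    }

    # Weighted-sum score per action; the highest score is preferred under these weights.
    scores = {
        name: sum(objectives[obj] * value for obj, value in perf.items())
        for name, perf in actions.items()
    }
    print(scores, "->", max(scores, key=scores.get))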
293

Software Internationalization: A Framework Validated Against Industry Requirements for Computer Science and Software Engineering Programs

Vũ, John Huân 01 March 2010 (has links)
View John Huân Vũ's thesis presentation at http://youtu.be/y3bzNmkTr-c. In 2001, the ACM and IEEE Computing Curriculum stated that it was necessary to address "the need to develop implementation models that are international in scope and could be practiced in universities around the world." With increasing connectivity through the internet, the move towards a global economy, and the growing use of technology, software internationalization has become an increasingly important concern for developers. However, there has been a "clear shortage in terms of numbers of trained persons applying for entry-level positions" in this area. Eric Brechner, Director of Microsoft Development Training, suggested five new courses to add to the computer science curriculum due to the growing "gap between what college graduates in any field are taught and what they need to know to work in industry." He concludes that "globalization and accessibility should be part of any course of introductory programming," stating: "A course on globalization and accessibility is long overdue on college campuses. It is embarrassing to take graduates from a college with a diverse student population and have to teach them how to write software for a diverse set of customers. This should be part of introductory software development. Anything less is insulting to students, their family, and the peoples of the world." There is very little research into how software internationalization should be taught to meet the major requirements of industry. The research question of this thesis is thus: "Is there a framework for software internationalization that has been validated against industry requirements?" The answer is no. Such a framework "would promote communication between academia and industry ... that could serve as a common reference point in discussions." Since no such framework for software internationalization currently exists, one is developed here. The contribution of this thesis is a provisional framework to prepare graduates to internationalize software, together with a validation of that framework against industry requirements. The framework is intended to provide a portable and standardized set of requirements that computer science and software engineering programs can use to prepare future graduates.
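As a concrete, purely illustrative example of the kind of practice such a framework would cover, the sketch below externalizes user-facing strings with Python's standard gettext module. The text domain, catalog directory, and target language are hypothetical and are not taken from the thesis.

    # Illustrative sketch only: minimal string externalization with Python's
    # standard gettext module. The domain "app", the "locale" directory, and
    # the Vietnamese target language are hypothetical examples.
    import gettext

    translation = gettext.translation(
        "app",                # hypothetical text domain
        localedir="locale",   # hypothetical catalogs under locale/<lang>/LC_MESSAGES/app.mo
        languages=["vi"],     # e.g., Vietnamese
        fallback=True,        # fall back to the source strings if no catalog is installed
    )
    _ = translation.gettext

    # Wrapping user-facing strings in _() is what makes them translatable;
    # hard-coding English text is the habit internationalization training corrects.
    print(_("Welcome"))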
