151

Interactive Computer Graphical Approaches to some Maximin and Minimax Location Problems

Buchanan, David John 03 1900 (has links)
This study describes algorithms for the solution of several single facility location problems with maximin or minimax objective functions. Interactive computer graphical algorithms are presented for maximizing the minimum rectilinear travel distance and for minimizing the maximum rectilinear travel distance to a number of point demands when there exist several right-angled polygonal barriers to travel. For the special case of unweighted rectilinear distances with barriers, a purely numerical algorithm for the maximin location problem is described. An interactive computer graphical algorithm for maximizing the minimum Euclidean, rectilinear, or general lp distance to a number of polygonal areas is described. A modified version of this algorithm for location problems with the objective of minimizing the maximum cost when the costs are non-linear monotonically decreasing functions of distance is presented. Extension of this algorithm to problems involving the minimization of the maximum cost when the costs are functions of both distance and direction is discussed using asymmetric distances. / Thesis / Doctor of Philosophy (PhD)
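As a point of reference for the two objectives described in this abstract, the sketch below evaluates the unweighted rectilinear maximin and minimax objectives for candidate sites against a set of demand points. It is a minimal illustration only: barriers, weights, and the interactive graphical algorithms of the thesis are omitted, and every name in it is hypothetical.

```python
# Minimal sketch (assumptions: unweighted rectilinear distances, no barriers).
# Maximin: the best site maximizes the distance to the nearest demand point;
# minimax: it minimizes the distance to the farthest demand point.

def rectilinear(p, q):
    """Rectilinear (L1 / Manhattan) travel distance between two points."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def maximin_value(site, demands):
    """Objective to maximize: distance to the closest demand point."""
    return min(rectilinear(site, d) for d in demands)

def minimax_value(site, demands):
    """Objective to minimize: distance to the farthest demand point."""
    return max(rectilinear(site, d) for d in demands)

if __name__ == "__main__":
    demands = [(0, 0), (4, 1), (2, 5)]
    # Crude grid search over candidate sites in a bounding box; the thesis
    # replaces this with interactive graphical and numerical algorithms.
    candidates = [(x, y) for x in range(6) for y in range(6)]
    best_maximin = max(candidates, key=lambda s: maximin_value(s, demands))
    best_minimax = min(candidates, key=lambda s: minimax_value(s, demands))
    print(best_maximin, best_minimax)
```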
152

Definition, Analysis, And An Approach For Discrete-Event Simulation Model Interoperability

Wu, Tai-Chi 10 December 2005 (has links)
Even though simulation technology provides great benefits to industry, it is largely underutilized. One of the biggest barriers to utilizing simulation is the lack of interoperability between simulation models. This is especially true when simulation models that need to interact with each other span an enterprise or supply chain. These models are likely to be distributed and developed in disparate simulation application software. In order to analyze the dynamic behavior of the systems they represent, the models must interoperate. However, currently this interoperability is nearly impossible. The interaction of models also refers to the understanding of them among stakeholders in the different stages of the models' lifecycles. The lack of interoperability also makes it difficult to share the knowledge within disparate models. This research first investigates this problem by identifying, defining, and analyzing the types of simulation model interactions. It then identifies and defines possible approaches to allow models to interact. Finally, a framework that adopts the strengths of Structured Modeling (SM) and the Object-Oriented (OO) concept is proposed for representing discrete-event simulation models. The framework captures the most common simulation elements and will serve as an intermediate language between disparate simulation models. Because of the structured nature of the framework, the resulting model representation is concise and easily understandable. Tools are developed to implement the framework. A Common User Interface (CUI) with software-specified controllers is developed for using the proposed framework with various commercial simulation software packages. The CUI is also used to edit simulation models in a neutral environment. A graphical modeling tool is also developed to facilitate conceptual modeling. The resulting graphic can be translated into the common model representation automatically. This not only increases the understanding of models for all stakeholders, but also shifts model interactions to the "formulating" stage, which can prevent problems later in a model's lifecycle. An illustration of the proposed framework and the tools is given, as well as a discussion of future work.
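To make the idea of an intermediate language between disparate simulation models a little more concrete, here is a minimal sketch of how common discrete-event elements might be captured in a neutral, package-independent form. The element types and fields are illustrative assumptions, not the actual SM/OO framework proposed in the thesis.

```python
# Minimal sketch of a neutral, package-independent model representation.
# The element types and fields below are illustrative assumptions only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Entity:
    name: str                      # e.g. "Part", "Customer"
    attributes: List[str] = field(default_factory=list)

@dataclass
class Resource:
    name: str                      # e.g. "Drill"
    capacity: int = 1

@dataclass
class Activity:
    name: str                      # e.g. "Drilling"
    entity: str                    # entity type being processed
    resource: str                  # resource seized during the activity
    duration_expr: str = "expo(5)" # distribution kept as text so any target
                                   # simulation package can interpret it

@dataclass
class Model:
    name: str
    entities: List[Entity]
    resources: List[Resource]
    activities: List[Activity]

# A tiny model that a translator could map onto a specific simulation package.
shop = Model(
    name="JobShop",
    entities=[Entity("Part", ["priority"])],
    resources=[Resource("Drill", capacity=2)],
    activities=[Activity("Drilling", entity="Part", resource="Drill")],
)
print(shop)
```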
153

PEDIGREE QUERY, VISUALIZATION, AND GENETIC CALCULATIONS TOOL

Kurtcephe, Murat 27 August 2012 (has links)
No description available.
154

Learning for Spoken Dialog Systems with Discriminative Graphical Models

Ma, Yi January 2015 (has links)
No description available.
155

VISUALIZATION OF 3D OPTICAL LATTICES AND GRAPHICAL USER INTERFACE SOFTWARE DEVELOPMENT

Lee, Hoseong Asher 25 October 2016 (has links)
No description available.
156

The Implementation of a Standard Computer Graphics Package - Graphical Kernel System

Chen, Deh-Chang 03 1900 (has links)
Computer graphics is a field whose time has come. In the past, it was an esoteric specialty involving expensive display hardware and idiosyncratic software. Recently, hardware has become more readily available, and efforts have been made to develop graphics software standards, which help make graphics programming rational and straightforward. The Graphical Kernel System (GKS) is rapidly gaining acceptance as a worldwide standard for computer graphics. The International Standards Organization (ISO) is in the final stages of converting GKS from its current status as a Draft International Standard (DIS) to an International Standard. This report presents an overview of GKS and also discusses a subroutine library that has been developed for use at McMaster University and is equivalent to "0a" GKS (the lowest level of GKS). This library, called GKSLIB, is written in FORTRAN 77 and could be used by a programmer to support a wide range of two-dimensional, passive graphics applications. / Thesis / Master of Science (MS)
157

SCALABLE BAYESIAN METHODS FOR PROBABILISTIC GRAPHICAL MODELS

Chuan Zuo (18429759) 25 April 2024 (has links)
In recent years, probabilistic graphical models have emerged as a powerful framework for understanding complex dependencies in multivariate data, offering a structured approach to tackle uncertainty and model complexity. These models have revolutionized the way we interpret the interplay between variables in various domains, from genetics to social network analysis. Inspired by the potential of probabilistic graphical models to provide insightful data analysis while addressing the challenges of high dimensionality and computational efficiency, this dissertation introduces two novel methodologies that leverage the strengths of graphical models in high-dimensional settings. By integrating advanced inference techniques and exploiting the structural advantages of graphical models, we demonstrate how these approaches can efficiently decode complex data patterns, offering significant improvements over traditional methods. This work not only contributes to the theoretical advancements in the field of statistical data analysis but also provides practical solutions to real-world problems characterized by large-scale, complex datasets.

Firstly, we introduce a novel Bayesian hybrid method for learning the structure of Gaussian Bayesian Networks (GBNs), addressing the critical challenge of order determination in constraint-based and score-based methodologies. By integrating a permutation matrix within the likelihood function, we propose a technique that remains invariant to data shuffling, thereby overcoming the limitations of traditional approaches. Utilizing Cholesky decomposition, we reparameterize the log-likelihood function to facilitate the identification of the parent-child relationship among nodes without relying on the faithfulness assumption. This method efficiently manages the permutation matrix to optimize for the sparsest Cholesky factor, leveraging the Bayesian Information Criterion (BIC) for model selection. Theoretical analysis and extensive simulations demonstrate the superiority of our method in terms of precision, recall, and F1-score across various network complexities and sample sizes. Specifically, our approach shows significant advantages in small-n-large-p scenarios, outperforming existing methods in detecting complex network structures with limited data. Real-world applications on datasets such as ECOLI70, ARTH150, MAGIC-IRRI, and MAGIC-NIAB further validate the effectiveness and robustness of our proposed method. Our findings contribute to the field of Bayesian network structure learning by providing a scalable, efficient, and reliable tool for modeling high-dimensional data structures.

Secondly, we introduce a Bayesian methodology tailored for Gaussian Graphical Models (GGMs) that bridges the gap between GBNs and GGMs. Utilizing the Cholesky decomposition, we establish a novel connection that leverages estimated GBN structures to accurately recover and estimate GGMs. This innovative approach benefits from a theoretical foundation provided by a theorem that connects sparse priors on Cholesky factors with the sparsity of the precision matrix, facilitating effective structure recovery in GGMs. To assess the efficacy of our proposed method, we conduct comprehensive simulations on AR2 and circle graph models, comparing its performance with renowned algorithms such as GLASSO, CLIME, and SPACE across various dimensions. Our evaluation, based on metrics like estimation accuracy and selection correctness, unequivocally demonstrates the superiority of our approach in accurately identifying the intrinsic graph structure. The empirical results underscore the robustness and scalability of our method, highlighting its potential as an indispensable tool for statistical data analysis, especially in the context of complex datasets.
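As a sketch of the Cholesky reparameterization and BIC-based selection mentioned in this abstract (written here under the simplifying assumptions of zero-mean data, a fixed variable ordering, and a sample covariance S from n observations; this is the standard identity, not necessarily the dissertation's exact formulation), writing the precision matrix as Omega = L L^T with L lower triangular gives

```latex
% Gaussian log-likelihood reparameterized through the Cholesky factor L of the
% precision matrix (zero-mean data, fixed ordering, sample covariance S, n samples):
\ell(L) = \frac{n}{2}\Bigl(\log\det\bigl(LL^{\top}\bigr) - \operatorname{tr}\bigl(S\,LL^{\top}\bigr)\Bigr) + \mathrm{const}
        = \frac{n}{2}\Bigl(2\sum_{j}\log L_{jj} - \operatorname{tr}\bigl(S\,LL^{\top}\bigr)\Bigr) + \mathrm{const},
\qquad
\mathrm{BIC} = -2\,\ell(\hat{L}) + k\log n .
```

Here k is the number of free (nonzero) entries in the estimated factor, so the BIC trades goodness of fit against the sparsity of the Cholesky factor, and the permutation matrix described in the abstract determines which ordering yields the sparsest such factor.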
158

A Graphical Representation Framework for Enhanced Visualization of Construction Control Processes

Hays, Benjamin James 21 October 2002 (has links)
Graphical representation for construction control information--processes such as scheduling, budgeting and RFIs--follows no formalized method. Many graphics neglect relevant information necessary to highlight trends in or relationships between processes. The principles of data graphics offer visual capabilities beyond those currently employed by the construction industry to display appropriate information in a manner that enhances comprehension of control processes. This paper describes a method that incorporates four tasks--structuring and filtering data, editing for density, and communicating efficiently--as necessary to creating effective data graphics. In addition to an evaluation technique, these tasks are outlined in a coherent framework. Several construction control processes are then described with respect to these four tasks. Focused application of the framework to the budgeting process produces four graphics that are subsequently evaluated by industry professionals. Conclusions detailed at the end of this document draw together lessons learned from the process of creating data graphics as well as from quantitative and qualitative evaluations of the visual cost report. / Master of Science
159

Automated Analysis of Astrocyte Activities from Large-scale Time-lapse Microscopic Imaging Data

Wang, Yizhi 13 December 2019 (has links)
The advent of multi-photon microscopes and highly sensitive protein sensors enables the recording of astrocyte activities on a large population of cells over a long time period in vivo. Existing tools cannot fully characterize these activities, both within single cells and at the population level, because of the insufficiency of current region-of-interest-based approaches to describe the activity that is often spatially unfixed, size-varying, and propagative. Here, we present Astrocyte Quantitative Analysis (AQuA), an analytical framework that releases astrocyte biologists from the ROI-based paradigm. The framework takes an event-based perspective to model and accurately quantify the complex activity in astrocyte imaging datasets, with an event defined jointly by its spatial occupancy and temporal dynamics. To model the signal propagation in astrocytes, we developed graphical time warping (GTW) to align curves with graph-structured constraints and integrated it into AQuA. To make AQuA easy to use, we designed a comprehensive software package. The software implements the detection pipeline in an intuitive step-by-step GUI with visual feedback. The software also supports proof-reading and the incorporation of morphology information. With synthetic data, we showed AQuA performed much better in accuracy compared with existing methods developed for astrocytic data and neuronal data. We applied AQuA to a range of ex vivo and in vivo imaging datasets. Since AQuA is data-driven and based on machine learning principles, it can be applied across model organisms, fluorescent indicators, experimental modes, and imaging resolutions and speeds, enabling researchers to elucidate fundamental astrocyte physiology. / Doctor of Philosophy / Astrocytes are an important type of glial cell in the brain. Unlike neurons, astrocytes cannot be electrically excited. However, the concentrations of many different molecules inside and near astrocytes change over space and time and show complex patterns. Recording, analyzing, and deciphering these activity patterns enables the understanding of various roles astrocytes may play in the nervous system. Many of these important roles, such as sensory-motor integration and brain state modulation, were traditionally considered the territory of neurons, but were recently found to be related to astrocytes. These activities can be monitored in the intracellular and extracellular spaces in either brain slices or living animals, thanks to the advancement of microscopes and genetically encoded fluorescent sensors. However, sophisticated analytical tools lag far behind the impressive capability of generating the data. The major reason is that existing tools are all based on the region-of-interest (ROI) approach. This approach assumes the field of view can be segmented into many regions, and all pixels in the region should be active together. In neuronal activity analysis, all pixels in an ROI correspond to a neuron and are assumed to share a common activity pattern (curve). This is not true for astrocyte activity data because astrocyte activities are spatially unfixed, size-varying, and propagative. In this dissertation, we developed a framework called AQuA to detect the activities directly. We designed an accurate and flexible detection pipeline that works with different types of astrocyte activity data sets. We designed a machine learning model to characterize the signal propagation for the pipeline. We also implemented a comprehensive and user-friendly software package. The advantage of AQuA is confirmed in both simulation studies and three different types of real data sets.
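For orientation only: graphical time warping (GTW) builds on dynamic time warping (DTW). The sketch below is plain pairwise DTW, the building block, with hypothetical names; it does not include the graph-structured coupling across curves that GTW adds.

```python
# Minimal sketch of classic pairwise dynamic time warping (DTW).
# GTW, as described in the abstract, couples the warping paths of many curves
# through graph-structured constraints; that coupling is not implemented here.
import numpy as np

def dtw_distance(x, y):
    """Return the DTW alignment cost between two 1-D curves."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # Allowed steps: match, insertion, deletion.
            D[i, j] = cost + min(D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
    return D[n, m]

if __name__ == "__main__":
    t = np.linspace(0, 2 * np.pi, 50)
    a = np.sin(t)
    b = np.sin(t - 0.5)          # same waveform, shifted in time
    print(dtw_distance(a, b))    # small cost despite the temporal shift
```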
160

Comparison of Various Display Representation Formats for Older Adults Using Inlab and Remote Usability Testing

Narayan, Sajitha 19 July 2005 (has links)
The population of seniors is growing and will continue to increase in the next decade. Computer technology holds the promise of enhancing the quality of life and independence of older people, as it may increase their ability to perform a variety of tasks. This is especially true for the elderly. By the year 2030, people age 65 or older will comprise 22% of the population in the United States. As the population shifts so that a greater percentage are middle-aged and older adults, and as dependence on computer technology increases, it becomes more crucial to understand how to design computer displays for these older age groups. This research compared various display representation formats to determine which best presents information to seniors in any form of display, and why those formats are preferred. The formats compared include high- and low-density screens for abstract icon, concrete icon, tabular, and graphical representations. This research also studied the effectiveness of remote usability testing as compared to in-lab testing for seniors. Results indicated that screen density is an important factor affecting the performance of older adults: the density effect was statistically significant, F(1,112) = 8.934, p < .05, in the post-hoc analysis that was conducted. Although significant results were not obtained for the display formats themselves, different formats of display representation may still be an area worth pursuing. It was also noted that remote usability testing is not as effective as in-lab testing for seniors in terms of the time taken to conduct the study and the number of user comments collected. Implications of the study, as well as recommendations and conclusions, are presented. / Master of Science
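As a small sanity check on the reported statistic, the F value and degrees of freedom can be converted back to a p-value. A minimal sketch with SciPy, assuming F(1, 112) = 8.934 as stated:

```python
# Recompute the p-value for the reported density effect, F(1, 112) = 8.934.
# This only checks the arithmetic of the reported statistic; it does not
# reproduce the study's ANOVA.
from scipy.stats import f

p_value = f.sf(8.934, dfn=1, dfd=112)   # survival function = 1 - CDF
print(f"p = {p_value:.4f}")             # well below .05, consistent with the abstract
```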
