  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Visual Sensitivity of Dynamic Graphical Displays

Jessa, Munira January 2005 (has links)
Advanced display design, such as Ecological Interface Design (EID), makes extensive use of complex graphical objects. Research has shown that by following EID methodologies, supervisory operators perform better with EID displays (Pawlak and Vicente, 1996). However, past research does not consider the visual aspects of the graphical objects used in EID. Of particular interest is how different design decisions affect the performance of the graphical objects used within a design. This thesis examines the visual sensitivity of dynamic graphical objects by examining the features that make certain graphical objects visually superior for certain monitoring tasks. Previous research into the visual aspects of supervisory control with respect to emergent features, psychophysics, and attention was considered in the investigation of the visual sensitivities of the dynamic graphical objects used. Research into static graphical objects was combined with prior work on emergent features to find the emergent features that best show changes in dynamic graphical objects for the monitoring tasks investigated. It was found that for simple dynamic objects such as bars and polygon objects, a line changing in angle was the most noticeable emergent feature for showing a departure from a 'normal' state. For complex graphical objects, target-indicator displays that mimic a 'bull's eye' at the target value should be used to show observers when a target value has been reached. Abrupt changes in shape should be used in trend meters to show when variables or processes have changed direction. Finally, 'solid objects' that make use of vertical lines and shading should be used for comparison meters that compare two values and keep them in a particular ratio. 
These findings provide guidance for designers of advanced dynamic graphical displays by encouraging the consideration of the visual aspects of graphical objects, as well as prescribing the graphical objects that should be used in the types of tasks investigated.
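The angle-based emergent feature described above can be sketched numerically. The following is a minimal illustration only; the bar values, spacing, and tolerance are invented for this example, not taken from the thesis:

```python
import math

def line_angle(bar_a: float, bar_b: float, spacing: float = 1.0) -> float:
    """Angle (degrees) of the line connecting the tops of two adjacent bars.

    When both bars sit at their normal values the line is horizontal (0 deg);
    any deviation tilts the line, producing a salient emergent feature.
    """
    return math.degrees(math.atan2(bar_b - bar_a, spacing))

def departs_from_normal(bar_a: float, bar_b: float,
                        tolerance_deg: float = 5.0) -> bool:
    """Flag a departure from the 'normal' state when the connecting line tilts."""
    return abs(line_angle(bar_a, bar_b)) > tolerance_deg

# Equal bar heights -> horizontal line -> normal state.
print(departs_from_normal(10.0, 10.0))  # False
# One variable drifts -> the line tilts -> the departure is visually salient.
print(departs_from_normal(10.0, 12.0))  # True
```

The tilt of a single line summarizes the relationship between two variables at a glance, which is why it is such an effective emergent feature for monitoring.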
2

E-Intelligence Form Design and Data Preprocessing in Health Care

Pedarla, Padmaja January 2004 (has links)
Clinical data systems continue to grow as a result of the proliferation of features that are collected and stored. Demands for accurate and well-organized clinical data have intensified due to the increased focus on cost-effectiveness and on continuous quality improvement for better clinical diagnosis and prognosis. Clinical organizations have opportunities to use the information they collect, and their oversight role, to enhance health safety. Due to the continuous growth in the number of parameters accumulated in large databases, the capability to interactively mine patient clinical information is an increasingly urgent need in the clinical domain for providing accurate and efficient health care. Simple database queries fail to address this concern for several reasons, such as their inability to use the knowledge contained in these extremely complex databases. Data mining addresses this problem by analyzing the databases and making decisions based on hidden patterns. The collection of data from multiple locations in clinical organizations leads to loss of data in data warehouses. Data preprocessing is the part of knowledge discovery in which the data is cleaned and transformed to produce accurate and efficient data mining results. Missing values in the databases result in the loss of useful data. Handling missing values and reducing noise in the data is necessary to acquire better-quality mining results. This thesis explores the idea of either rejecting inappropriate values at the data entry level or suggesting various methods of handling missing values in the databases. An E-Intelligence form is designed to perform the data preprocessing tasks at different levels of the knowledge discovery process. The minimum data set for mental health and a breast cancer data set are used as case studies. 
Once the missing values are handled, decision trees are used as the data mining tool to classify the diagnoses in the databases and analyze the results. Due to the ever-increasing use of mobile devices and the internet in health care, the analysis also addresses issues relevant to hand-held computers, communication devices, and web-based applications for quicker and better access.
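As one simple example of the missing-value handling discussed above, here is mean imputation, one of several strategies a preprocessing step might apply (the records and field names below are invented for illustration and are not from the thesis's data sets):

```python
def impute_mean(rows, col):
    """Replace missing values (None) in one column with the column mean."""
    observed = [r[col] for r in rows if r[col] is not None]
    mean = sum(observed) / len(observed)
    for r in rows:
        if r[col] is None:
            r[col] = mean
    return rows

# Toy clinical records with one missing measurement (values are invented).
records = [
    {"age": 34, "marker": 1.25},
    {"age": 51, "marker": None},   # missing value to be imputed
    {"age": 47, "marker": 1.75},
]
impute_mean(records, "marker")
print(records[1]["marker"])  # 1.5, the mean of the observed values
```

Imputing rather than discarding incomplete records preserves the remaining fields of those records for the subsequent decision-tree mining step.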
3

Image Models for Wavelet Domain Statistics

Azimifar, Seyedeh-Zohreh January 2005 (has links)
Statistical models for the joint statistics of image pixels are of central importance in many image processing applications. However, the high dimensionality stemming from large problem sizes and the long-range spatial interactions make statistical image modeling particularly challenging. Commonly this modeling is simplified by a change of basis, most often using a wavelet transform. Indeed, the wavelet transform has widely been used as an approximate whitener of statistical time series. It has, however, long been recognized that wavelet coefficients are neither Gaussian, in terms of their marginal statistics, nor white, in terms of their joint statistics. The question of joint wavelet models is complicated and admits many possibilities, with statistical structures within subbands, across orientations, and across scales. Although a variety of joint models have been proposed and tested, few appear to be directly based on empirical studies of wavelet coefficient cross-statistics. Rather, they are based on intuitive or heuristic notions of wavelet neighborhood structures. Without an examination of the underlying statistics, such heuristic approaches necessarily leave unanswered questions of neighborhood sufficiency and necessity. This thesis presents an empirical study of joint wavelet statistics for textures and other imagery, including dependencies across scale, space, and orientation. There is a growing realization that modeling wavelet coefficients as independent, or at best correlated only across scales, may be a poor assumption. While recent developments in wavelet-domain hidden Markov models (notably HMT-3S) account for within-scale dependencies, we find that wavelet spatial statistics are strongly orientation dependent, a structure which is surprisingly not considered by state-of-the-art wavelet modeling techniques. 
To demonstrate the effectiveness of the studied wavelet correlation models, a novel nonlinear correlated empirical Bayesian shrinkage algorithm based on the joint wavelet statistics is proposed. In comparison with popular nonlinear shrinkage algorithms, it improves the denoising results.
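The general wavelet shrinkage idea can be sketched in a few lines. The following uses a single-level 1-D Haar transform with coefficient-wise soft thresholding, which is a much simpler, independent-coefficient stand-in for the correlated empirical Bayesian shrinkage proposed in the thesis; the signal and threshold value are invented:

```python
def haar_1d(signal):
    """One level of the orthonormal 1-D Haar wavelet transform."""
    s = 2 ** 0.5
    approx = [(a + b) / s for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) / s for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

def inverse_haar_1d(approx, detail):
    """Invert one level of the orthonormal 1-D Haar transform."""
    s = 2 ** 0.5
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s, (a - d) / s])
    return out

def soft_threshold(coeffs, t):
    """Shrink each detail coefficient toward zero by t (noise suppression)."""
    return [max(abs(c) - t, 0.0) * (1 if c >= 0 else -1) for c in coeffs]

noisy = [1.0, 1.1, 0.9, 1.0, 5.0, 5.1, 4.9, 5.0]  # step signal + small noise
approx, detail = haar_1d(noisy)
denoised = inverse_haar_1d(approx, soft_threshold(detail, 0.2))
print([round(x, 2) for x in denoised])
# [1.05, 1.05, 0.95, 0.95, 5.05, 5.05, 4.95, 4.95]
```

Small detail coefficients (mostly noise) are suppressed while the large-scale structure of the signal survives; the thesis's contribution is to exploit the correlations between such coefficients rather than shrinking each one independently.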
7

Characterization of the surface electrokinetic properties of solid-liquid colloidal dispersions by electrophoretic topography

Shiau, Shaw-Ji 01 January 1989 (has links)
The Electrophoretic Topography (ET) technique is developed to study the surface behavior (single acid site, amphoteric, and zwitterionic surfaces) and electrophoretic properties (mobility-pH profile and mobility-conductivity profile) of colloidal dispersions without the influence of specific ion adsorption and double-layer compression. A graphical correlation of the electrophoretic mobility data is displayed as a three-dimensional Electrophoretic Template and a two-dimensional isomobility contour plot, the Electrophoretic Fingerprint. A systematic analysis of the Electrophoretic Fingerprint provides information such as the isoelectric line, maximum mobility, isomobility line density, and the mobility-pH and mobility-conductivity profiles. Three colloidal systems were examined in this work: (1) polystyrene latexes with different surface functional groups, including amidine, carboxyl, sulfate, carboxyl-sulfate mixed, and carboxyl-amidine zwitterionic latexes; (2) titanium dioxide-ionic surfactant (SDS and CTAB) dispersions; and (3) carbonaceous particulates, including GM coal powders, coal-derived liquid fines, oxidized carbon black, and nonoxidized carbon black. Each colloidal system showed characteristic surface behavior with its characteristic Electrophoretic Template and Fingerprint. Mobility-pH profiles of the various colloidal systems were used extensively to explore the electrokinetic properties under different experimental conditions. The Electrophoretic Topography technique can be used successfully to interpret, qualitatively, the surface behavior of colloidal dispersions.
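One quantity the fingerprint analysis extracts, the isoelectric point, lies where the mobility changes sign along the pH axis. A minimal sketch of locating it from a measured mobility-pH profile by linear interpolation (the data points below are invented, not measurements from this work):

```python
def isoelectric_ph(profile):
    """Estimate the isoelectric point (pH of zero mobility) by linear
    interpolation between the two samples that bracket the sign change.

    `profile` is a list of (pH, mobility) pairs sorted by pH.
    """
    for (ph1, m1), (ph2, m2) in zip(profile, profile[1:]):
        if m1 == 0:
            return ph1
        if m1 * m2 < 0:  # sign change brackets the zero crossing
            return ph1 + (ph2 - ph1) * (-m1) / (m2 - m1)
    return None  # no zero crossing within the measured pH range

# Invented mobility-pH profile (mobility in units of 1e-8 m^2/Vs).
profile = [(3.0, 2.0), (5.0, 1.0), (7.0, -1.0), (9.0, -2.0)]
print(isoelectric_ph(profile))  # 6.0
```

Repeating this zero-crossing search across the conductivity axis would trace out the isoelectric line of the two-dimensional fingerprint.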
8

Theory and practical considerations in reset control design

Zheng, Yuan 01 January 1998 (has links)
In the past three decades, linear time-invariant (LTI) control design techniques have been developed that can achieve stringent performance specifications in the presence of process uncertainties. However, all such LTI techniques are limited by the Bode gain-phase relation. Specifically, an LTI controller's high-frequency magnitude cannot be designed arbitrarily but depends on the low-frequency specifications and stability constraints. Qualitatively speaking, "large" low-frequency magnitudes have to go with "large" high-frequency magnitudes, which make the closed-loop system more sensitive to sensor noise and high-frequency modeling errors. This phenomenon is called the "cost of feedback", for which LTI control theory cannot offer a remedy. In this dissertation, a reset control design method is studied, based on ideas originating in the 1950s. The nonlinear controller is composed of a reset network, whose states are reset to zero when its input crosses zero, cascaded with a linear network. By describing function analysis, the low-frequency performance of this nonlinear system is similar to that of a linear system in which the resetting mechanism is not used. However, the control bandwidth is reduced, thereby reducing the "cost of feedback" beyond the limitation imposed by the LTI gain-phase relation. In this research, a theoretical framework for reset control systems is developed, which connects with the recently developed framework for systems with impulse effects. In this framework, uniform exponential stability and uniform bounded-input-bounded-output stability of reset control systems are studied, and a small-gain stability condition is derived. An optimal matrix-norm search algorithm is developed to sharpen this small-gain condition. Based on this theoretical study, a set of engineering design guidelines for reset control is developed and applied to a tape-drive servo control system. 
Simulations and experiments with reset control on this tape-drive servo system show the potential of the reset control design.
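The reset element at the core of such a controller (states reset to zero when the input crosses zero) can be sketched with a forward-Euler simulation. This toy Clegg-integrator simulation uses invented inputs and step size and omits the cascaded linear network:

```python
def simulate_reset_integrator(inputs, dt=0.01):
    """Forward-Euler simulation of a reset integrator (Clegg integrator):
    it integrates its input like an ordinary integrator, but its state is
    reset to zero whenever the input changes sign (crosses zero).
    """
    state, states, prev_u = 0.0, [], None
    for u in inputs:
        if prev_u is not None and prev_u * u < 0:  # input sign change
            state = 0.0                            # reset action
        state += u * dt                            # linear integration step
        states.append(state)
        prev_u = u
    return states

# Constant positive input, then a sign change triggers the reset.
trace = simulate_reset_integrator([1.0, 1.0, 1.0, -1.0, -1.0])
print([round(s, 3) for s in trace])  # [0.01, 0.02, 0.03, -0.01, -0.02]
```

A linear integrator would have decayed from 0.03 gradually; the reset discards the accumulated state at the crossing, which is what gives the element less phase lag than its linear counterpart for a comparable gain characteristic.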
9

Continuum design sensitivity analysis based force calculation in EM devices

Li, Min January 2007 (has links)
Continuum design sensitivity analysis (CDSA) is applied to magnetostatic and electrostatic force calculation. The method allows the computation of both the net loading force on a body and the force distribution on its surface. An algorithm for force calculation, combined with a standard field-analysis software package, is presented. The efficiency and accuracy of the method are demonstrated through numerical implementation on a set of test examples. In addition, the new approach has several advantages over traditional methods based on the Maxwell stress tensor; for example, it requires no air gap and no artificial interference with the original model. In particular, a performance analysis of a MEMS micro-mirror using CDSA torque calculation is conducted for the first time.
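Sensitivity-based force calculation obtains forces as derivatives of the system energy with respect to position or shape. The following is a drastically simplified lumped-parameter analogue (a parallel-plate capacitor with a finite-difference derivative), not the continuum formulation of the thesis; all dimensions and voltages are invented:

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def coenergy(gap, area=1e-4, voltage=10.0):
    """Electrostatic coenergy of an ideal parallel-plate capacitor held
    at fixed voltage: W' = (1/2) C V^2 with C = eps0 * A / gap."""
    capacitance = EPS0 * area / gap
    return 0.5 * capacitance * voltage ** 2

def force_by_sensitivity(gap, h=1e-9):
    """Force along the gap direction as the sensitivity (derivative) of
    coenergy to the gap, approximated by a central finite difference.
    A negative value means the plates attract (force toward smaller gap)."""
    return (coenergy(gap + h) - coenergy(gap - h)) / (2 * h)

gap = 1e-6  # 1 micrometre gap
numeric = force_by_sensitivity(gap)
analytic = -0.5 * EPS0 * 1e-4 * 10.0 ** 2 / gap ** 2  # closed-form derivative
print(abs(numeric - analytic) / abs(analytic) < 1e-3)  # True
```

The continuum method generalizes this idea: instead of perturbing a single lumped gap, it computes the sensitivity of the field solution to perturbations of the body's surface, which is what yields the force distribution as well as the net force.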
10

The design of accounting systems : a general theory with an empirical study of the Church of England

Laughlin, Richard C. January 1984 (has links)
The primary focus of this study is the design of accounting systems in specific enterprise contexts: more specifically, the sequential processes of describing the nature of such systems, prescribing how they should look in the future, and bringing such changes into being. These concerns are explored both at a general theoretical level and in terms of the detailed design problems of the accounting systems of the Church of England. The contents of this study can be divided into three major parts. The first takes a critical look at the nature of accounting knowledge, particularly financial and management accounting, paying particular attention to its methodological underpinnings. The conclusion from this part is that this knowledge stock does not adequately deal with the sequential processes of interest to this study, primarily because of the dominant scientific and functionalist assumptions on which such knowledge is based, which are argued to be an inappropriate foundation on which to build to satisfy this problem focus. The second part presents a case for, and describes the nature of, a methodological approach based on Critical Theory as the basis for satisfying the sequential concerns of this study. The third part applies this methodological approach in the process of trying both to understand and to change the accounting systems of the Church of England. The conclusion forthcoming from this study is that the approach based on Critical Theory is a general 'theory' for the sequential concerns of this study, but not the only approach which could fulfil such a claim. However, what does become apparent is that if the problem focus of this study is seen as important for the accounting mission, then major shifts in the dominant methodology of accounting thought are necessary.
