201

Stochastické modelování datových souborů / Stochastic Modeling of Data Sets

Orgoník, Svetoslav January 2011
This master's thesis focuses on implementing modern statistical methods for fitting probability distributions using kernel estimates, with regard to the possibilities of their implementation on a PC and their application to specific data sets. The thesis is part of project no. 1M06047, Center for Quality and Reliability of Production, funded by the MSMT of the Czech Republic.
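As a minimal illustration of the kernel-estimate approach the abstract describes (a sketch with synthetic data and default bandwidth selection, not the thesis's actual implementation), a Gaussian kernel density estimate can be fitted to a one-dimensional sample as follows:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical one-dimensional data set; the thesis's actual data is not shown here.
rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=500)

# Gaussian kernel density estimate; bandwidth chosen by Scott's rule by default.
kde = gaussian_kde(data)

# Evaluate the estimated density on a grid spanning the sample.
grid = np.linspace(data.min(), data.max(), 200)
density = kde(grid)

print(f"Estimated density at the sample mean: {kde(data.mean())[0]:.4f}")
```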
202

Cometabolic Modeling of Chlorinated Aliphatic Hydrocarbons using SEAM3D Cometabolism Package

Brewster, Ryan Jude Stephen 21 May 2003
Bioremediation of chlorinated aliphatic hydrocarbon (CAH) compounds commonly found at contaminated sites has been an area of focus in recent years. The cometabolic transformation of CAH compounds is important at sites where the redox condition does not favor natural attenuation or populations of indigenous microorganisms are relatively low. At sites where the ground-water system is aerobic, where monitored natural attenuation strategies will not meet remediation objectives, or both, enhanced bioremediation via cometabolism is an option. Models are needed to simulate cometabolism in an effort to improve performance and design, and the SEAM3D Cometabolism Package was designed to address this need. The objective of this report is to model field data to determine the ability of SEAM3D to simulate the performance of cometabolism. A ground-water flow and transport model was designed based on reported parameters used in the field experiments at Moffett Field. Electron donor and acceptor breakthrough curves were also simulated in an effort to calibrate the model. Several data sets describing the cometabolism of CAHs were used in the cometabolism modeling for calibration to field data. The cometabolism modeling showed areas of best-fit calibration with modification to the model parameters reported for the pilot tests at Moffett Field. The overall performance of the SEAM3D Cometabolism Package described in this report establishes validation of the model using field experiment results from the literature. Additional model validation is recommended for other contaminants. / Master of Science
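As a rough illustration of the kind of kinetics a cometabolism package simulates, here is a hedged sketch using generic Monod-type rate laws and invented parameter values, not the actual SEAM3D formulation, for a batch system where CAH transformation is driven by growth on a primary substrate:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (hypothetical, not from the Moffett Field experiments).
mu_max, Ks = 1.2, 2.0      # max growth rate (1/d), substrate half-saturation (mg/L)
kc, Kc = 0.5, 1.0          # max CAH transformation rate (mg/mg/d), half-saturation (mg/L)
Y, b = 0.5, 0.1            # biomass yield (mg/mg), decay rate (1/d)

def rhs(t, state):
    S, C, X = state  # primary substrate, CAH, biomass (mg/L)
    growth = mu_max * S / (Ks + S) * X
    cometab = kc * C / (Kc + C) * X       # cometabolic CAH transformation
    return [-growth / Y,                  # substrate consumed for growth
            -cometab,                     # CAH degraded cometabolically
            growth - b * X]               # net biomass change

sol = solve_ivp(rhs, (0, 20), [10.0, 1.0, 0.1])
print(f"CAH remaining after 20 d: {sol.y[1, -1]:.3f} mg/L")
```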
203

Spring Hollow Reservoir: Application of a two-dimensional water quality model

Dorsel, Daniel S. 09 July 1998
The BETTER water quality model, created by TVA, was used to model the temperature and dissolved oxygen (DO) in Spring Hollow Reservoir. The water balance consisted of pump discharge from the Roanoke River, runoff, releases at the dam, leakage, and storage. The geometry of the reservoir was represented by four columns and a variable number of five-foot layers. Through a sensitivity analysis, the parameters that influenced temperature and DO the most were determined. Temperature was then calibrated to a subset of the 19-month simulation period by systematically varying the most sensitive parameters. DO was calibrated to the entire simulation period due to the young age of the reservoir and the inconsistent inflow rates and timing. The verification process showed that the model reasonably reproduced the seasonal temperature patterns. By varying the sediment oxygen demand temporally and spatially, the model depicted the gradual hypolimnetic oxygen depletion in the reservoir. The model results suggest that the inflow organics and subsequent settling and accumulation are key factors in the DO depletion rate. Therefore, to enhance water quality conditions in the reservoir, a monitoring system in the Roanoke River should be installed, with filling carried out when water quality in the river is optimal. For future modeling purposes, this research indicated that the model was very sensitive to meteorological data, especially in determining temperature. Thus, a weather station located at the reservoir would permit collection of more accurate meteorological data, leading to greater confidence in the interpretation of the model predictions. / Master of Science
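As a simple illustration of the water-balance bookkeeping described above (a sketch with hypothetical flow values, not the BETTER model itself), the change in reservoir storage over a time step can be computed from the listed components:

```python
# Daily water balance for a reservoir (flows in acre-ft/day; values are hypothetical).
def storage_change(pump_inflow, runoff, dam_release, leakage):
    """Change in storage = inflows (pumping from the river, runoff)
    minus outflows (releases at the dam, leakage)."""
    return (pump_inflow + runoff) - (dam_release + leakage)

storage = 10_000.0  # initial storage (hypothetical)
for day in range(3):
    storage += storage_change(pump_inflow=120.0, runoff=15.0,
                              dam_release=80.0, leakage=5.0)
    print(f"Day {day + 1}: storage = {storage:.0f} acre-ft")
```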
204

Contributions to the semi-classical signal analysis method: The arterial stiffness assessment case study

Piliouras, Evangelos 04 1900
Semi-classical signal analysis (SCSA) is a signal representation framework based on quantum mechanics principles and the inverse scattering transform. The signal of interest is decomposed into a linear combination of the squared eigenfunctions of the Schrödinger operator, influenced by the semi-classical parameter. The framework has been utilized in several applications by virtue of the adaptivity and localization of its components. In this thesis, we expand in two directions. From the theoretical perspective, to date the semi-classical parameter has been selected in an error-minimization context or under a representation-sparsity requirement. The framework is reinforced by providing the interval of this parameter within which a proper representation can be obtained. The lower bound is inspired by the semi-classical approximation and the sampling theorem, while the upper bound is based on quantum perturbation theory. Such an interval defines the sampling theorem of the framework. Based on existing properties, we propose a non-uniform sampling of the semi-classical parameter, which can significantly increase the speed of convergence with minimal accuracy error. An immediate representation is also investigated by providing an alternative convergence criterion drawn from signal features. Such a criterion paves the way to a calculus-based parameter definition and extension to a filtering scenario. The semi-classical parameter exerts a strong influence on the SCSA components. Each component can be viewed as a soliton, a wave whose amplitude determines its width and velocity. In parallel, there exist arterial dynamics models in which solitons are solutions of the describing equations. We therefore propose that the soliton propagation velocity extracted from the algorithm is correlated with the pulse wave velocity, the propagation velocity of the blood pressure wave in the systolic phase. The velocity in the carotid-femoral segment is considered the gold standard for indicating cardiovascular risk. We therefore turn our attention to validating such a model and utilizing it for arterial stiffness assessment. The model was validated on an in-silico database comprising more than 3000 subjects. This SCSA-based model is proposed to be integrated into existing methods, where its calibration can yield single-point continuous velocity measurements.
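For context, the SCSA decomposition the abstract refers to is commonly written as follows; this is a standard formulation from the SCSA literature, and the thesis's exact notation may differ:

```latex
% Semi-classical Schrodinger operator associated with a positive signal y(x):
%   H_h(y) = -h^2 d^2/dx^2 - y(x).
% Let -\kappa_{n,h}^2 denote its negative eigenvalues and \psi_{n,h} the
% associated L^2-normalized eigenfunctions. The SCSA representation of y is
\[
  y_h(x) \;=\; 4h \sum_{n=1}^{N_h} \kappa_{n,h}\, \psi_{n,h}^2(x),
\]
% where h > 0 is the semi-classical parameter and N_h is the number of
% negative eigenvalues, which grows as h decreases.
```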
205

Electrostatic Modeling of Protein Aggregation

Vanam, Ram December 2004
Submitted to the faculty of Indiana University in partial fulfillment of the requirements for the degree Master of Science in the Department of Bioinformatics in the School of Informatics of Indiana University, December 2004 / Electrostatic modeling was done with DelPhi of Insight II to explain and predict protein aggregation, measured here for β-lactoglobulin and insulin using turbidimetry and stopped-flow spectrophotometry. The initial rate of aggregation of β-lactoglobulin was studied between pH 3.8 and 5.2 in 4.5 mM NaCl, and for ionic strengths from 4.5 to 500 mM NaCl at pH 5.0. The initial slope of the turbidity vs. time curve was used to define the initial rate of aggregation. The highest initial rate was observed at a pH slightly below the pI, i.e., at pH 4.6 (pI 5.2). The decrease in aggregation rate when the pH was increased from 4.8 to 5.0 was large compared to its decrease when the pH was reduced from 4.4 to 4.2; i.e., the dependence of the initial rate on pH was highly asymmetric. The initial rate of aggregation at pH 5.0 increased linearly with the reciprocal of ionic strength in the range I = 0.5 to 0.0045 M. Protein electrostatic potential distributions are used to understand the pH and ionic strength dependence of the initial rate of aggregation. Similar studies were done with insulin. In contrast to BLG, the highest initial aggregation rate for insulin was observed at pH = pI. Electrostatic computer modeling shows that these differences arise from the distinctly different surface charge distributions of insulin and BLG.
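As an illustration of how an initial aggregation rate is typically extracted from a turbidity-versus-time trace (a sketch with synthetic data, not the study's actual analysis), one fits a line to the earliest, quasi-linear portion of the curve:

```python
import numpy as np

# Synthetic turbidity trace (arbitrary units); the real data came from
# turbidimetry and stopped-flow spectrophotometry in the study.
t = np.linspace(0, 60, 121)                     # time (s)
turbidity = 0.8 * (1 - np.exp(-0.05 * t))       # saturating aggregation curve
turbidity += np.random.default_rng(1).normal(0, 0.005, t.size)

# Initial rate = slope of a linear fit over the early, quasi-linear region.
early = t < 5.0
slope, intercept = np.polyfit(t[early], turbidity[early], deg=1)
print(f"Initial aggregation rate: {slope:.4f} turbidity units / s")
```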
206

Linear Dynamic Model for Continuous Speech Recognition

Ma, Tao 30 April 2011
In the past decades, statistics-based hidden Markov models (HMMs) have become the predominant approach to speech recognition. Under this framework, the speech signal is modeled as a piecewise stationary signal (typically over an interval of 10 milliseconds), and speech features are assumed to be temporally uncorrelated. While these simplifications have enabled tremendous advances in speech processing systems, progress on the core statistical models has stagnated for the past several years. Since machine performance still significantly lags human performance, especially in noisy environments, researchers have been looking beyond the traditional HMM approach. Recent theoretical and experimental studies suggest that exploiting frame-to-frame correlations in a speech signal further improves the performance of automatic speech recognition (ASR) systems. This is typically accomplished by developing an acoustic model that includes higher-order statistics or trajectories. Linear dynamic models (LDMs) have generated significant interest in recent years due to their ability to model higher-order statistics. LDMs use a state-space formulation that explicitly models the evolution of hidden states using an autoregressive process. This smoothed trajectory model allows the system to better track the speech dynamics in noisy environments. In this dissertation, we develop a hybrid HMM/LDM speech recognizer that effectively integrates these two powerful technologies. This hybrid system is capable of handling large recognition tasks, is robust to noise-corrupted speech data, and mitigates the ill effects of mismatched training and evaluation conditions. This two-pass system leverages the temporal modeling and N-best list generation capabilities of the traditional HMM architecture in a first-pass analysis. In the second pass, candidate sentence hypotheses are re-ranked using a phone-based LDM model. The Wall Street Journal (WSJ0) derived Aurora-4 large vocabulary corpus was chosen as the training and evaluation dataset. This corpus is a well-established LVCSR benchmark with six different noisy conditions. The implementation and evaluation of the proposed hybrid HMM/LDM speech recognizer is the major contribution of this dissertation.
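For reference, the linear dynamic model described above is a standard state-space formulation, sketched here in generic notation that may differ from the dissertation's:

```latex
% Linear dynamic model with hidden state x_t and observed feature vector y_t:
\begin{align*}
  x_t &= F\, x_{t-1} + \eta_t,    & \eta_t     &\sim \mathcal{N}(0, Q), \\
  y_t &= H\, x_t + \epsilon_t,    & \epsilon_t &\sim \mathcal{N}(0, R).
\end{align*}
% F is the autoregressive state-evolution matrix and H the observation matrix;
% inference over the hidden states proceeds by Kalman filtering and smoothing.
```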
207

Creating a More Natural Multidimensional Category

Zivot, Matthew 01 January 2009
Three experiments examined category creation with no feedback and minimal feedback, using modeling to determine the number of dimensions subjects attended to. In the first experiment, subjects were shown a series of two-dimensional objects with no training and no feedback and were asked to categorize the stimuli. Subjects in Experiment 1 mostly attended to one dimension. In the second experiment, subjects were shown similar two-dimensional stimuli but were given minimal feedback. Significantly more subjects in Experiment 2 attended to both dimensions. In the third experiment, subjects were trained on three related two-dimensional categories and then asked to categorize a fourth. Performance in Experiment 3 was similar to that of Experiment 1, with subjects mainly attending to one dimension. These findings indicate that a more natural feedback structure would help subjects create categories that resemble those used in everyday life.
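A sketch of the kind of model comparison the abstract alludes to (entirely hypothetical code, not the experiments' actual analysis) fits single-dimension decision bounds to a subject's categorization responses; if no one-dimensional bound fits well, the subject likely attended to both dimensions:

```python
import numpy as np

def fit_bound_1d(stim, resp, dim):
    """Best accuracy achievable by a decision bound on dimension `dim` alone."""
    best = 0.0
    for c in np.unique(stim[:, dim]):
        acc = np.mean((stim[:, dim] > c) == resp)
        best = max(best, acc, 1 - acc)  # allow either polarity of the bound
    return best

# Hypothetical two-dimensional stimuli and binary category responses.
rng = np.random.default_rng(2)
stim = rng.uniform(0, 10, size=(100, 2))
resp = stim[:, 0] + stim[:, 1] > 10  # a truly two-dimensional category

acc_dim0 = fit_bound_1d(stim, resp, 0)
acc_dim1 = fit_bound_1d(stim, resp, 1)
print(f"Best 1-D bound accuracy: dim 0 = {acc_dim0:.2f}, dim 1 = {acc_dim1:.2f}")
# Low accuracy for both 1-D bounds suggests attention to both dimensions.
```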
208

DEVELOPMENT OF DATA-DRIVEN APPROACHES FOR WASTEWATER MODELING

Zhou, Pengxiao January 2023
To effectively operate and manage the complex wastewater treatment system, simplified representations, known as wastewater modeling, are critical. Wastewater modeling allows for the understanding, monitoring, and prediction of wastewater treatment processes by capturing intricate relationships within the system. Process-driven models (PDMs), which rely on a set of interconnected hypotheses and assumptions, are commonly used to capture the physical, chemical, and biological mechanisms of wastewater treatment. More recently, with the development of advanced algorithms and sensor techniques, data-driven models (DDMs), which analyze data about a system to find relationships between the system state variables without relying on explicit knowledge of the system, have emerged as a complementary alternative. However, both PDMs and DDMs have limitations. For example, uncertainties in PDMs can arise from imprecise calibration of empirical parameters and natural process variability. Applications of DDMs are limited to certain objectives because of a lack of high-quality datasets and their difficulty capturing changing relationships. Therefore, this dissertation aims to enhance the stable operation and effective management of wastewater treatment plants (WWTPs) by addressing these limitations through the pursuit of three objectives: (1) investigating an efficient data-driven approach for uncertainty analysis of process-driven secondary settling tank models; (2) developing data-driven models that can leverage sparse and imbalanced data for the prediction of emerging contaminant removal; and (3) exploring an advanced data-driven model for influent flow rate predictions during the COVID-19 emergency. / Thesis / Doctor of Philosophy (PhD) / Ensuring appropriate treatment and recycling of wastewater is vital to sustain life. Wastewater treatment plants (WWTPs), whose complicated processes include several intricate physical, chemical, and biological procedures, play a significant role in water recycling. Due to stricter regulations and complex wastewater composition, the wastewater treatment system has become increasingly complex. Therefore, it is crucial to use simplified versions of the system, known as wastewater modeling, to effectively operate and manage the complex system. The aim of this thesis is to develop data-driven approaches for wastewater modeling.
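As a minimal illustration of the data-driven flavor of model the thesis develops (a generic sketch with synthetic data, not the dissertation's actual models), a lagged-feature regressor can predict influent flow rate from recent observations:

```python
import numpy as np
from sklearn.linear_model import Ridge

# Synthetic hourly influent flow rate with a daily cycle (m^3/h; hypothetical).
rng = np.random.default_rng(3)
hours = np.arange(24 * 60)
flow = 1000 + 200 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 30, hours.size)

# Build lagged features: predict flow at hour t from the previous 6 hours.
lags = 6
X = np.column_stack([flow[i:len(flow) - lags + i] for i in range(lags)])
y = flow[lags:]

model = Ridge(alpha=1.0).fit(X[:-100], y[:-100])  # train on all but the last 100 h
print(f"Test R^2 on held-out hours: {model.score(X[-100:], y[-100:]):.3f}")
```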
209

Videotaped Modeling with and without Verbal Cues

Rowland, Amy Lee 19 August 2004
The purpose of this study was to investigate the use of videotaped modeling of a tennis skill with and without verbal cues. Eighteen female players from two NCAA Division III colleges served as the subjects for the study. The players were randomly assigned to one of two groups. Both groups viewed a modeling videotape containing a 56-second clip of a female professional hitting forehand groundstrokes, looped seven times. Group One's tape included verbal cues on balance, posture, and contact point; Group Two's tape did not contain verbal cues. Both groups were pre-tested on power, performance, trait confidence, and state confidence before viewing the modeling tape six times. They were then post-tested on the same measures and given a qualitative questionnaire, and they were also asked a follow-up question in interview format. The qualitative analyses revealed that Group Two subjects were unable to articulate the concepts of balance, posture, and contact point as well as Group One. Group One was better able to articulate these concepts, with a higher percentage of participants answering the qualitative questionnaire consistently with the relevant verbal cues for balance, posture, and contact point. The results of this study indicate that tennis coaches should consider adding verbal cues when using videotaped modeling to enhance its effectiveness. / Ph. D.
210

Development of a Framework for Enterprise Modeling

Venugopalan, Thiyagarajan 13 December 2003
Enterprises are growing in complexity due to numerous interactions within and outside the enterprise. Enterprise modeling addresses this issue of complexity by helping to structure it. A review of the literature indicates that several issues in the field of enterprise modeling need to be addressed. First, the terms related to enterprise modeling have numerous definitions, each one focusing on different aspects; these definitions are analyzed and a comprehensive definition is provided. Next, the enterprise modeling methodologies and frameworks in the literature focus on different views when modeling an enterprise, making it difficult for an enterprise to choose the framework that best fits its needs. To resolve this, an enterprise modeling framework is designed that attempts to incorporate all of the views of an enterprise. This framework is then extended by taking into account various models and functionalities provided in enterprise modeling software packages.
