  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
211

Semiactive Cab Suspension Control for Semitruck Applications

Marcu, Florin M. 29 April 2009 (has links)
Truck drivers are exposed to vibrations all day as a part of their work. In addition to repetitive motion injuries, the constant vibrations add to the fatigue of the driver, which in turn can have safety implications. The goal of this research is to lower the vibrations an occupant of a class 8 semitruck cab sleeper is exposed to by improving the ride quality. Unlike prior research in the area of ride comfort that targets the chassis or seat suspension, this work focuses on the cab suspension. The current standard in cab suspensions consists of some type of spring and passive damper mechanism. Ride improvements can most easily be accomplished by replacing the stock passive dampers with some type of controllable damper; in this case Magneto-Rheological (MR) dampers. MR dampers can change damping characteristics in real time, while behaving like a passive damper in their OFF state. This means that in the event of a power-supply failure, the dampers still retain their functionality and can provide some level of damping. Additionally, MR dampers can be packaged such that they do not require any redesign of mounting bracketry on the cab or the frame, enabling their use as a retrofittable device. The damper controller is based on the skyhook control policy pioneered by Karnopp et al. in the 1970s. A variation on skyhook control is chosen called no-jerk skyhook control. A controller called Hierarchical SemiActive Control (HSAC) is designed and implemented to allow the no-jerk skyhook controller to adapt to the road conditions. It also incorporates an endstop controller to better handle the limited rattle space of the cab suspension. The development and initial testing of the controller prototype is done in simulation using a model of the cab and its suspension. The model is derived from first principles using bond graph modeling. The controller is implemented in Simulink to ease the transition to hardware testing. 
The real-time prototype controller is tested on a class 8 semitruck in a lab environment using dSPACE and road input at the rear axles. The laboratory results are verified on the road in a series of road tests on a test truck. The road tests showed a need for the HSAC controller. The HSAC is implemented on the test truck in a final prototype system. The test results with this system show significant improvements over the stock passive suspension, especially when dealing with transient excitations. The overall research results presented show that significant ride improvements can be achieved from a semiactive cab suspension. / Ph. D.
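The skyhook policy referenced in this abstract admits a compact sketch. The following is an illustrative stand-in, not the dissertation's controller: the gains `c_sky`, `g_sky`, and `c_min` are invented, and the no-jerk variant follows the commonly published form in which the commanded force is scaled by the relative velocity so it passes through zero at the switching condition instead of jumping.

```python
def skyhook_force(v_body, v_rel, c_sky=1000.0, c_min=50.0):
    """Classical on/off skyhook: command high damping when the body
    velocity and the relative (body-minus-frame) velocity share a sign."""
    if v_body * v_rel > 0:
        return c_sky * v_body  # desired force opposing absolute body motion
    return c_min * v_rel       # damper OFF state: residual passive damping

def no_jerk_skyhook_force(v_body, v_rel, g_sky=1000.0, c_min=50.0):
    """No-jerk variant: scaling by the relative velocity drives the
    commanded force to zero continuously at switching, avoiding jerk."""
    if v_body * v_rel > 0:
        return g_sky * v_body * v_rel
    return c_min * v_rel
```

Note how the no-jerk command vanishes as `v_rel` approaches zero, which is exactly where the classical policy would otherwise produce a discontinuous force step.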
212

Constructing an Estimate of Academic Capitalism and Explaining Faculty Differences through Multilevel Analysis

Kniola, David J. 24 November 2009 (has links)
Two broad influences have converged to shape a new environment in which universities must now compete and operate. Shrinking financial resources and a global economy have arguably compelled universities to adapt. The concept of academic capitalism helps explain the new realities and places universities in the context of a global, knowledge-based economy (Slaughter & Leslie, 1997). Prior to this theory, the role of universities in the knowledge economy was largely undocumented. Academic capitalism is a measurable concept defined by the mechanisms and behaviors of universities that seek to generate new sources of revenue and are best revealed through faculty work. This study was designed to create empirical evidence of academic capitalism through the behaviors of faculty members at research universities. Using a large-scale, national database, the researcher created a new measure—an estimate of academic capitalism—at the individual faculty member level and then used multi-level analysis to explain variation among these individual faculty members. This study will increase our understanding of the changing nature of faculty work, will lead to future studies on academic capitalism that involve longitudinal analysis and important sub-populations, and will likely influence institutional and public policy. / Ph. D.
213

Bayesian hierarchical modelling of dual response surfaces

Chen, Younan 08 December 2005 (has links)
Dual response surface methodology (Vining and Myers (1990)) has been successfully used as a cost-effective approach to improve the quality of products and processes since Taguchi (Taguchi (1985)) introduced the idea of robust parameter design for quality improvement in the United States in the mid-1980s. The original procedure is to use the mean and the standard deviation of the characteristic to form a dual response system in a linear model structure, and to estimate the model coefficients using least squares methods. In this dissertation, a Bayesian hierarchical approach is proposed to model the dual response system so that the inherent hierarchical variance structure of the response can be modeled naturally. The Bayesian model is developed for both univariate and multivariate dual response surfaces, and for both fully replicated and partially replicated dual response surface designs. To evaluate its performance, the Bayesian method has been compared with the original method under a wide range of scenarios, and it shows higher efficiency and more robustness. In applications, the Bayesian approach retains all the advantages provided by the original dual response surface modelling method. Moreover, the Bayesian analysis allows inference on the uncertainty of the model parameters, and thus can give practitioners complete information on the distribution of the characteristic of interest. / Ph. D.
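The original least-squares procedure the abstract contrasts against fits separate surfaces to the per-point sample mean and standard deviation of a replicated design. A minimal sketch, using an invented replicated two-factor design (all data values are hypothetical, and only a first-order model is fit):

```python
import numpy as np

# Hypothetical replicated 2-factor design: rows are design points,
# y holds three replicate observations at each point.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0]], float)
y = np.array([[9.8, 10.2, 10.0], [12.1, 11.9, 12.3], [8.0, 8.4, 8.2],
              [10.5, 10.3, 10.7], [10.0, 9.9, 10.1]])

ybar = y.mean(axis=1)       # per-point sample means
s = y.std(axis=1, ddof=1)   # per-point sample standard deviations

# First-order model matrix with intercept; the dual response system is
# two least-squares fits, one to the mean and one to the std deviation.
M = np.column_stack([np.ones(len(X)), X])
beta_mean, *_ = np.linalg.lstsq(M, ybar, rcond=None)
beta_sd, *_ = np.linalg.lstsq(M, s, rcond=None)

def predict(x, beta):
    """Evaluate the fitted first-order surface at a design location."""
    return beta[0] + beta[1] * x[0] + beta[2] * x[1]
```

The Bayesian hierarchical approach proposed in the dissertation replaces these two separate point fits with a joint model of the mean and variance structure.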
214

Bayesian Methodology for Missing Data, Model Selection and Hierarchical Spatial Models with Application to Ecological Data

Boone, Edward L. 14 February 2003 (has links)
Ecological data is often fraught with problems such as missing data and spatial correlation. In this dissertation we use a data set collected by the Ohio EPA as motivation for studying techniques to address these problems. The data set is concerned with the benthic health of Ohio's waterways. A new method for incorporating covariate structure and missing data mechanisms into missing data analysis is considered. This method allows us to detect relationships other popular methods do not allow. We then further extend this method into model selection. In the special case where the unobserved covariates are assumed normally distributed, we use the Bayesian Model Averaging method to average the models, select the highest probability model, and do variable assessment. Accuracy in calculating the posterior model probabilities using the Laplace approximation and an approximation based on the Bayesian Information Criterion (BIC) is explored. It is shown via simulation that the Laplace approximation is superior to the BIC-based approximation. Finally, Hierarchical Spatial Linear Models are considered for the data, and we show how to combine analyses that have spatial correlation within and between clusters. / Ph. D.
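The BIC-based approximation to posterior model probabilities mentioned above weights each candidate model by exp(-BIC/2), assuming equal prior model probabilities. A minimal sketch of that computation:

```python
import math

def bic_model_probs(bics):
    """Approximate posterior model probabilities from BIC values:
    p(M_k | data) is proportional to exp(-BIC_k / 2) under equal
    prior model weights."""
    b0 = min(bics)  # subtract the minimum BIC for numerical stability
    w = [math.exp(-(b - b0) / 2.0) for b in bics]
    total = sum(w)
    return [x / total for x in w]
```

A model whose BIC is 2 units lower than a competitor's receives about e (roughly 2.7) times the posterior weight; the Laplace approximation the study favors refines this by keeping the terms BIC discards.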
215

Phylogenetic Niche Modeling

McHugh, Sean W. 01 September 2021 (has links)
Projecting environmental niche models through time is a common goal when studying species' response to climatic change. Species distribution models (SDMs) are commonly used to estimate a species' niche from observed patterns of occurrence and environmental predictors. However, a species' niche is also shaped by non-environmental factors--including biotic interactions and dispersal barriers--truncating SDM estimates. Though truncated SDMs may accurately predict a species' present-day niche, projections through time are often biased as environmental conditions change. Modeling niche in a phylogenetic framework leverages a clade's shared evolutionary history to pull species estimates closer towards phylogenetically conserved values and farther away from species-specific biases. We propose a new Bayesian model of phylogenetic niche implemented in R. Under our model, species SDM parameters are transformed into biologically interpretable continuous parameters of environmental niche optimum, breadth, and tolerance evolving under a multivariate Brownian-motion random walk. Through simulation analyses, we demonstrated model accuracy and precision that improved as phylogeny size increased. We also demonstrated our model on a clade of eastern United States Plethodontid salamanders by accurately estimating species niche, even when no occurrence data are present. Our model demonstrates a novel framework where niche changes can be studied forwards and backwards through time to understand ancestral ranges, patterns of environmental specialization, and niche in data-deficient species. / Master of Science / As many species face increasing pressure in a changing climate, it is crucial to understand the set of environmental conditions that shape species' ranges--known as the environmental niche--to guide conservation and land management practices. Species distribution models (SDMs) are common tools that are used to model species' environmental niche. 
These models treat a species' probability of occurrence as a function of environmental conditions. SDM niche estimates can predict a species' range given climate data, paleoclimate, or projections of future climate change to estimate species range shifts from the past to the future. However, SDM estimates are often biased by non-environmental factors shaping a species' range, including competitive divergence or dispersal barriers. Biased SDM estimates can result in range predictions that get worse as we extrapolate beyond the observed climatic conditions. One way to overcome these biases is by leveraging the shared evolutionary history amongst related species to "fill in the gaps". Species that are more closely phylogenetically related often have more similar or "conserved" environmental niches. By estimating environmental niche over all species in a clade jointly, we can leverage niche conservatism to produce more biologically realistic estimates of niche. However, a methodological gap currently exists between SDM estimates and macroevolutionary models, preventing them from being estimated jointly. We propose a novel model of evolutionary niche called PhyNE (Phylogenetic Niche Evolution), where biologically realistic environmental niches are fit across a set of species with occurrence data, while simultaneously fitting and leveraging a model of evolution across a portion of the tree of life. We evaluated model accuracy, bias, and precision through simulation analyses. Accuracy and precision increased with larger phylogeny size, and model parameters were effectively estimated. We then applied PhyNE to Plethodontid salamanders from Eastern North America. This ecologically important and diverse group of lungless salamanders requires cold and wet conditions and has distributions that are strongly affected by climatic conditions. 
Species within the family vary greatly in distribution, with some species being wide-ranging generalists, while others are hyper-endemics that inhabit specific mountains in the Southern Appalachians with restricted thermal and hydric conditions. We fit PhyNE to occurrence data for these species and their associated average annual precipitation and temperature data. We identified no correlations between species' environmental preference and specialization. Patterns of preference and specialization varied among Plethodontid species groups, with more aquatic species possessing a broader environmental niche, likely due to the aquatic microclimate facilitating occurrence in a wider range of conditions. We demonstrated the effectiveness of PhyNE's evolutionarily informed estimates of environmental niche, even when species' occurrence data are limited or even absent. PhyNE establishes a proof-of-concept framework for a new class of approaches for studying niche evolution, including improved methods for estimating niche for data-deficient species, historical reconstructions, future predictions under climate change, and evaluation of niche evolutionary processes across the tree of life. Our approach establishes a framework for leveraging the rapidly growing availability of biodiversity data and molecular phylogenies to make robust eco-evolutionary predictions and assessments of species' niche and distributions in a rapidly changing world.
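Under Brownian-motion trait evolution, the covariance between two tips equals the rate parameter times the branch length they share from the root, which is what lets related species "fill in the gaps" for one another. A sketch of that standard phylogenetic variance-covariance construction on an invented three-taxon ultrametric tree (this illustrates the evolutionary model class, not the PhyNE implementation):

```python
import numpy as np

# Hypothetical 3-taxon ultrametric tree of total depth 2.0: A and B
# diverge halfway down, so they share 1.0 units of history; C shares none.
taxa = ['A', 'B', 'C']
shared = {('A', 'A'): 2.0, ('B', 'B'): 2.0, ('C', 'C'): 2.0,
          ('A', 'B'): 1.0, ('A', 'C'): 0.0, ('B', 'C'): 0.0}

def bm_vcv(sigma2):
    """Brownian-motion trait covariance matrix: sigma^2 times shared
    path length from the root, so closer relatives covary more."""
    n = len(taxa)
    V = np.zeros((n, n))
    for i, a in enumerate(taxa):
        for j, b in enumerate(taxa):
            key = (a, b) if (a, b) in shared else (b, a)
            V[i, j] = sigma2 * shared[key]
    return V

def simulate_traits(sigma2, mu=0.0, rng=None):
    """Draw one multivariate-normal realization of tip trait values."""
    rng = rng or np.random.default_rng(0)
    return rng.multivariate_normal(mu * np.ones(len(taxa)), bm_vcv(sigma2))
```

In a multivariate extension like the one described above, each niche parameter (optimum, breadth, tolerance) contributes its own dimension to the random walk.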
216

An investigation of assignment rules for fitting new subjects into clusters established by hierarchical pattern analysis

Frary, Jewel McDow 02 March 2010 (has links)
Cluster analysis has been used fairly extensively as a means of grouping objects or subjects on the basis of their similarity over a number of variables. Almost all of the work to this point has been for the purpose of classifying an extant collection of similar objects into clusters or types. However, there often arises a need for methods of identifying additional objects as members of clusters that have already been established. Discriminant function analysis has been used for this purpose even though its underlying assumptions often cannot be met. This study explored a different approach to the problem, namely, the use of distance functions as a means of identifying subjects as members of types which had been established by hierarchical pattern analysis. A sample of subjects was drawn randomly from a population; these subjects were assigned to the types that appeared in other samples that were drawn from the same population. Each type was defined by the vector of mean scores on selected variables for the subjects in that cluster. A new subject was identified as a member of a type if the distance function described by the assignment rule was a minimum for that type. Various criteria were established for judging the adequacy of the assignments. Five distance functions were identified as being potential ways of assigning new subjects to types. Recommendations were not made for immediate practical application. However, the results were generally positive, and successful applications should be possible with the suggested methodological refinement. / Ph. D.
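The assignment rules studied in this dissertation reduce to nearest-centroid classification: a new subject joins the type whose mean-score vector minimizes the chosen distance function. A minimal sketch, where the centroids and the two metric choices are illustrative rather than the five functions actually evaluated in the study:

```python
import numpy as np

def assign_to_type(x, centroids, inv_cov=None):
    """Assign a new subject to the type whose mean-score vector is
    closest; centroids maps type label -> mean vector. If inv_cov is
    given, squared Mahalanobis distance is used instead of Euclidean."""
    best, best_d = None, float("inf")
    for label, c in centroids.items():
        d = np.asarray(x, float) - np.asarray(c, float)
        if inv_cov is not None:
            dist = float(d @ inv_cov @ d)  # squared Mahalanobis distance
        else:
            dist = float(d @ d)            # squared Euclidean distance
        if dist < best_d:
            best, best_d = label, dist
    return best
```

Unlike discriminant function analysis, this rule requires no distributional assumptions about the clusters, which is the study's motivation for exploring it.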
217

Sequential learning, large-scale calibration, and uncertainty quantification

Huang, Jiangeng 23 July 2019 (has links)
With remarkable advances in computing power, computer experiments continue to expand the boundaries and drive down the cost of various scientific discoveries. New challenges keep arising from designing, analyzing, modeling, calibrating, optimizing, and predicting in computer experiments. This dissertation consists of six chapters, exploring statistical methodologies in sequential learning, model calibration, and uncertainty quantification for heteroskedastic computer experiments and large-scale computer experiments. For heteroskedastic computer experiments, an optimal lookahead based sequential learning strategy is presented, balancing replication and exploration to facilitate separating signal from input-dependent noise. Motivated by challenges in both large data size and model fidelity arising from ever larger modern computer experiments, highly accurate and computationally efficient divide-and-conquer calibration methods based on on-site experimental design and surrogate modeling for large-scale computer models are developed in this dissertation. The proposed methodology is applied to calibrate a real computer experiment from the gas and oil industry. This on-site surrogate calibration method is further extended to multiple output calibration problems. / Doctor of Philosophy / With remarkable advances in computing power, complex physical systems today can be simulated comparatively cheaply and to high accuracy through computer experiments. Computer experiments continue to expand the boundaries and drive down the cost of various scientific investigations, including biological, business, engineering, industrial, management, health-related, physical, and social sciences. This dissertation consists of six chapters, exploring statistical methodologies in sequential learning, model calibration, and uncertainty quantification for heteroskedastic computer experiments and large-scale computer experiments. 
For computer experiments with changing signal-to-noise ratio, an optimal lookahead-based sequential learning strategy is presented, balancing replication and exploration to facilitate separating signal from a complex noise structure. In order to effectively extract key information from massive amounts of simulation output and make better predictions for the real world, highly accurate and computationally efficient divide-and-conquer calibration methods for large-scale computer models are developed in this dissertation, addressing challenges in both large data size and model fidelity arising from ever larger modern computer experiments. The proposed methodology is applied to calibrate a real computer experiment from the gas and oil industry. This large-scale calibration method is further extended to solve multiple output calibration problems.
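Computer model calibration, in miniature, means choosing the calibration parameter that minimizes the discrepancy between surrogate predictions and field observations. The toy grid search below only gestures at that idea; the surrogate, the field data, and the "true" parameter value of 1.5 are all invented, and the dissertation's on-site surrogate method is far more elaborate:

```python
import numpy as np

def surrogate(x, theta):
    """Invented stand-in for an emulator of an expensive simulator."""
    return np.sin(x) + theta * x

# Hypothetical noise-free field observations generated at theta = 1.5.
x_field = np.linspace(0.0, 1.0, 20)
y_field = surrogate(x_field, 1.5)

def calibrate(grid):
    """Grid-search calibration: pick the theta minimizing the sum of
    squared discrepancies between surrogate output and field data."""
    sse = [np.sum((surrogate(x_field, t) - y_field) ** 2) for t in grid]
    return grid[int(np.argmin(sse))]
```

Real calibration problems add observation noise, model discrepancy terms, and uncertainty quantification on top of this basic discrepancy-minimization idea.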
218

Deep Learning for Taxonomy Prediction

Ramesh, Shreyas 04 June 2019 (has links)
The last decade has seen great advances in Next-Generation Sequencing technologies, and, as a result, there has been a rise in the number of genomes sequenced each year. In 2017, there were as many as 10,000 new organisms sequenced and added into the RefSeq Database. Taxonomy prediction is a science involving the hierarchical classification of DNA fragments up to the rank of species. In this research, we introduce Predicting Linked Organisms, or Plinko for short. Plinko is a fully-functioning, state-of-the-art predictive system that accurately captures DNA-taxonomy relationships where other state-of-the-art algorithms falter. Plinko leverages multi-view convolutional neural networks and the pre-defined taxonomy tree structure to improve multi-level taxonomy prediction. In the Plinko strategy, each network takes advantage of different word usage patterns corresponding to different levels of evolutionary divergence. Plinko has the advantages of relatively low storage and GPGPU-parallel training and inference, making the solution portable and scalable with anticipated genome database growth. To the best of our knowledge, Plinko is the first to use multi-view convolutional neural networks as the core algorithm in a compositional, alignment-free approach to taxonomy prediction. / Master of Science / Taxonomy prediction is a science involving the hierarchical classification of DNA fragments up to the rank of species. Given species diversity on Earth, taxonomy prediction gets challenging with (i) increasing number of species (labels) to classify and (ii) decreasing input (DNA) size. In this research, we introduce Predicting Linked Organisms, or Plinko for short. Plinko is a fully-functioning, state-of-the-art predictive system that accurately captures DNA-taxonomy relationships where other state-of-the-art algorithms falter. 
Three major challenges in taxonomy prediction are (i) large dataset sizes (order of 10^9 sequences), (ii) large label spaces (order of 10^3 labels), and (iii) low-resolution inputs (100 base pairs or less). Plinko leverages multi-view convolutional neural networks and the pre-defined taxonomy tree structure to improve multi-level taxonomy prediction for hard-to-classify sequences under the three conditions stated above. Plinko has the advantage of a relatively low storage footprint, making the solution portable and scalable with anticipated genome database growth. To the best of our knowledge, Plinko is the first to use multi-view convolutional neural networks as the core algorithm in a compositional, alignment-free approach to taxonomy prediction.
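A compositional, alignment-free approach represents each DNA fragment by its k-mer frequency vector before any classifier sees it. A minimal sketch of that featurization step (an illustration of the representation, not Plinko's actual pipeline):

```python
from itertools import product

def kmer_composition(seq, k=3):
    """Normalized k-mer frequency vector over the 4^k possible k-mers:
    the compositional, alignment-free representation of a DNA fragment."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {km: i for i, km in enumerate(kmers)}
    counts = [0] * len(kmers)
    for i in range(len(seq) - k + 1):
        km = seq[i:i + k]
        if km in index:          # skip windows containing ambiguous bases
            counts[index[km]] += 1
    total = sum(counts) or 1     # avoid division by zero on short input
    return [c / total for c in counts]
```

Because the vector depends only on word usage, not on position, fragments can be compared without any sequence alignment; different k values capture different levels of evolutionary divergence.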
219

Suspended Micro/Nanofiber Hierarchical Scaffolds for Studying Cell Mechanobiology

Wang, Ji 27 March 2015 (has links)
Extracellular matrix (ECM) is a fibrous natural cell environment, possessing complicated micro- and nano-architectures, which provides signaling cues and influences cell behavior. Mimicking this three-dimensional environment in vitro is a challenge in developmental and disease biology. Here, suspended multilayer hierarchical nanofiber assemblies fabricated using the non-electrospinning STEP (Spinneret based Tunable Engineered Parameter) fiber manufacturing technique with controlled fiber diameter (microns to less than 100 nm), orientation, and spacing in single and multiple layers are demonstrated as biological scaffolds. Hierarchical nanofiber assemblies were developed to control single-cell shape (shape index from 0.15 to 0.57), nuclei shape (shape index 0.75 to 0.99), and focal adhesion cluster length (8-15 micrometers). To further investigate single cell-ECM biophysical interactions, nanofiber nets fused in crisscross patterns were manufactured to measure the "inside out" contractile forces of single mesenchymal stem cells (MSCs). The contractile forces (18-320 nanonewtons) were found to scale with fiber structural stiffness (2-100 nanonewtons/micrometer). Cells were observed to shed debris on fibers, which was found to exert forces (15-20 nanonewtons). Upon CO2 deprivation, cells were observed to monotonically reduce cell spread area and contractile forces. During the apoptotic process, cells exerted both expansive and contractile forces. The platform developed in this study allows a wide parametric investigation of biophysical cues which influence cell behavior, with implications in tissue engineering, developmental biology, and disease biology. / Master of Science
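The force measurement described above is essentially Hooke's law at the fiber scale: the force a cell exerts is inferred as the fiber's structural stiffness times its measured deflection. A minimal sketch with invented numbers in the abstract's units (nanonewtons and micrometers):

```python
def contractile_force(stiffness_nN_per_um, deflection_um):
    """Infer the force a cell exerts on a suspended fiber as structural
    stiffness (nN/um) times the measured fiber deflection (um)."""
    return stiffness_nN_per_um * deflection_um

def total_cell_force(fibers):
    """Sum force contributions over the fibers a cell engages;
    fibers is a list of (stiffness nN/um, deflection um) pairs."""
    return sum(contractile_force(k, d) for k, d in fibers)
```

With the abstract's reported stiffness range of 2-100 nN/um, deflections of a few micrometers recover forces of the same order as the measured 18-320 nN.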
220

Hierarchical Fuzzy Control of the UPFC and SVC located in AEP's Inez Area

Maram, Satish 09 June 2003 (has links)
To reinforce its Inez network, which was operated close to its stability limits, American Electric Power (AEP) undertook two major developments, one being the installation of a Static Var Compensator (SVC) in November 1980 and the other being the installation of the world's first Unified Power Flow Controller (UPFC) in 1998. The controllers in the system include the Automatic Voltage Regulators (AVRs) of the generators and the controllers of the SVC and UPFC. To coordinate the control actions of these controllers and prevent voltage instability resulting from their fighting against each other, a two-level hierarchical control scheme using fuzzy logic has been developed and its performance was assessed via simulations. The second level of the hierarchy determines the set points of the local controllers of the AVRs, SVC, and UPFC and defines the switching sequences of the capacitor banks, the goal being to maximize the reactive reserve margins of the Inez subsystem. Numerous simulations were carried out on this system to determine the actions of the fuzzy controller required to prevent the occurrence of voltage collapse under double contingency. Simulations have revealed the occurrence of nonlinear interactions between the machines resulting in stable limit cycles, and nonlinear oscillations undergoing period doubling leading to chaos and possible voltage collapse. The proposed fuzzy scheme provides a fast, simple, and effective way to stretch the stability limit of the system for double contingency conditions, up to 175 MW in some cases. This is a significant increase in the system capacity. / Master of Science
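A fuzzy controller of the kind described maps crisp measurements through membership functions to rule activations, then defuzzifies to a crisp command. A toy two-rule sketch for a voltage-error-to-reactive-power loop, where the membership breakpoints and singleton outputs are invented for illustration and do not represent AEP's scheme:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling back to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_var_command(v_err):
    """Toy two-rule Mamdani-style controller: low voltage -> inject
    vars, high voltage -> absorb vars; weighted-average defuzzification
    of singleton consequents (per-unit reactive power command)."""
    mu_low = tri(v_err, -0.10, -0.05, 0.0)   # voltage below setpoint
    mu_high = tri(v_err, 0.0, 0.05, 0.10)    # voltage above setpoint
    q_inject, q_absorb = 1.0, -1.0           # rule consequents
    total = mu_low + mu_high
    if total == 0.0:
        return 0.0                           # in the deadband: no action
    return (mu_low * q_inject + mu_high * q_absorb) / total
```

A hierarchical scheme like the one in this thesis would sit above such local rules, adjusting set points and capacitor switching to maximize reactive reserve margins.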
