  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Spatiotemporally Periodic Driven System with Long-Range Interactions

Myers, Owen Dale 01 January 2015 (has links)
It is well known that some driven systems undergo transitions when a system parameter is changed adiabatically around a critical value. This transition can be the result of a fundamental change in the structure of the phase space, called a bifurcation. Most of these transitions are well classified in the theory of bifurcations. Among the driven systems, spatiotemporally periodic (STP) potentials are noteworthy due to the intimate coupling between their time and spatial components. A paradigmatic example of such a system is the Kapitza pendulum, which is a pendulum with an oscillating suspension point. The Kapitza pendulum has the strange property that it will stand stably in the inverted position for certain driving frequencies and amplitudes. A particularly interesting and useful STP system is an array of parallel electrodes driven with an AC electrical potential such that adjacent electrodes are 180 degrees out of phase. Such an electrode array embedded in a surface is called an Electric Curtain (EC). As we will show, by using two ECs and a quadrupole trap it is possible to produce an electric potential similar in form to that of the Kapitza pendulum. Here I will present the results of four related pieces of work, each focused on understanding the behaviors of STP systems, long-range interacting particles, and long-range interacting particles in STP systems. I will begin with a discussion of the experimental results of the EC as applied to the cleaning of solar panels in extraterrestrial environments, and as a way to produce a novel one-dimensional multiparticle STP potential. Then I will present a numerical investigation and dynamical systems analysis of the dynamics that may be possible in an EC.
Moving to a simpler model in order to explore the rudimentary physics of Coulomb interactions in an STP potential, I will show that the tools of statistical mechanics may be important to the study of such systems, in order to understand transitions that fall outside of bifurcation theory. Though the Coulomb and, similarly, gravitational interactions of particles are prevalent in nature, these long-range interactions are not well understood from a statistical mechanics perspective because they are not extensive or additive. Finally, I will present a simple model for understanding long-range interacting pendula, finding interesting non-equilibrium behavior of the pendula angles: namely, a quasistationary clustered state can exist when the angles are initially ordered by their index.
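The inverted-pendulum stabilization described above is easy to check numerically. The sketch below (with illustrative parameter values, not ones taken from the thesis) integrates the Kapitza pendulum equation for a vertically oscillating pivot and confirms that a small perturbation away from the inverted position stays bounded when the drive is fast and strong:

```python
import numpy as np

# Illustrative Kapitza pendulum: theta measured from the downward vertical,
# pivot oscillating as A*cos(OMEGA*t). Parameters satisfy the standard
# stabilization condition A^2*OMEGA^2 > 2*G*L.
G, L, A, OMEGA = 9.81, 1.0, 0.05, 200.0

def rhs(t, y):
    theta, dtheta = y
    accel = -((G + A * OMEGA**2 * np.cos(OMEGA * t)) / L) * np.sin(theta)
    return np.array([dtheta, accel])

# RK4 integration starting from a small perturbation of the inverted position.
y = np.array([np.pi - 0.1, 0.0])
t, dt = 0.0, 1e-4
trajectory = []
while t < 5.0:
    k1 = rhs(t, y)
    k2 = rhs(t + dt / 2, y + dt / 2 * k1)
    k3 = rhs(t + dt / 2, y + dt / 2 * k2)
    k4 = rhs(t + dt, y + dt * k3)
    y = y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    t += dt
    trajectory.append(y[0])

max_dev = max(abs(th - np.pi) for th in trajectory)
print(f"max deviation from the inverted position: {max_dev:.3f} rad")
```

With the drive switched off (A = 0) the same perturbation grows and the pendulum falls, which is the kind of driving-induced qualitative change in behavior discussed above.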

Reviewing Power Outage Trends, Electric Reliability Indices and Smart Grid Funding

Adderly, Shawn 01 January 2016 (has links)
As our electric power distribution infrastructure has aged, considerable investment has been applied to modernizing the electrical power grid through weatherization and the deployment of real-time monitoring systems. A key question is whether or not these investments are reducing the number and duration of power outages, leading to improved reliability. Statistical methods are applied to analyze electrical disturbance data (from the Department of Energy, DOE) and reliability index data (from state utility public service commission regulators) to detect signs of improvement. The number of installed smart meters provided by several utilities is used to determine whether the number of smart meters correlates with a reduction in outage frequency. There are indications that the number of power outages may be decreasing over time. The magnitude of power loss decreased from 2003 to 2007 and behaved cyclically from 2008 to 2014, with a few outlier points in both groups. Outage duration also appears to be decreasing between 2003 and 2014. Large blackout events exceeding 5 GW continue to be rare, and certain power outage events are seasonally dependent. There was a linear relationship between the number of customers and the magnitude of a power outage event. However, no relationship was found between the magnitude of power outages and the time to restore power. The frequency of outages may be decreasing as the number of installed smart meters has increased. Recommendations are made for including additional metrics and for changing the formatting and semantics of the datasets currently provided by federal and state regulators, to help researchers perform more effective analysis. Confounding variables and the lack of information that made the analysis difficult are also discussed.
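The simplest form of the trend question above is a least-squares fit to yearly outage counts. The sketch below uses synthetic numbers chosen only for illustration (the study itself uses DOE disturbance data); a negative slope is consistent with outages decreasing over time:

```python
import numpy as np

# Hypothetical yearly outage counts for 2003-2014 (synthetic, illustrative).
years = np.arange(2003, 2015)
counts = np.array([142, 138, 151, 129, 120, 133, 118, 125, 110, 117, 104, 99])

# Ordinary least-squares trend line on count vs. year.
slope, intercept = np.polyfit(years, counts, 1)
print(f"estimated trend: {slope:.2f} outages per year")
```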

The Role of Uncertainty in Categorical Perception Utilizing Statistical Learning in Robots

Powell, Nathaniel V. 01 January 2016 (has links)
At the heart of statistical learning lies the concept of uncertainty. Embodied agents such as robots and animals must likewise address uncertainty, as sensation is always only a partial reflection of reality. This thesis addresses the role that uncertainty can play in a central building block of intelligence: categorization. Cognitive agents are able to perform tasks like categorical perception through physical interaction (active categorical perception; ACP), or passively at a distance (distal categorical perception; DCP). It is possible that the former scaffolds the learning of the latter. However, it is unclear whether ACP indeed scaffolds DCP in humans and animals, or how a robot could be trained to likewise learn DCP from ACP. Here we demonstrate a method for doing so that involves uncertainty: robots perform ACP when uncertain and DCP when certain. Furthermore, we demonstrate that robots trained in such a manner are more competent at categorizing novel objects than robots trained to categorize in other ways. This suggests that such a mechanism would also be useful for humans and animals, and that they may be employing some version of it.
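The uncertainty-gated strategy described above can be sketched as a simple decision rule (all names and the threshold here are hypothetical, not from the thesis): categorize at a distance when the category distribution is low-entropy, and fall back on physical interaction when it is not.

```python
import math

def entropy(probs):
    """Shannon entropy (in nats) of a discrete category distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def categorize(distal_probs, threshold=0.5):
    """Return ('DCP', label) when certain; ('ACP', None) triggers interaction."""
    if entropy(distal_probs) < threshold:
        label = max(range(len(distal_probs)), key=lambda i: distal_probs[i])
        return "DCP", label
    return "ACP", None

print(categorize([0.95, 0.05]))  # confident -> distal categorization
print(categorize([0.55, 0.45]))  # uncertain -> active (physical) categorization
```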

Evolving Spatially Aggregated Features for Regional Modeling and Its Application to Satellite Imagery

Kriegman, Sam 01 January 2016 (has links)
Satellite imagery and remote sensing provide explanatory variables at relatively high resolutions for modeling geospatial phenomena, yet regional summaries are often desirable for analysis and actionable insight. In this paper, we propose a novel method of inducing spatial aggregations as a component of the statistical learning process, yielding regional model features whose construction is driven by model prediction performance rather than prior assumptions. Our results demonstrate that Genetic Programming is particularly well suited to this type of feature construction because it can automatically synthesize appropriate aggregations, as well as better incorporate them into predictive models compared to other regression methods we tested. In our experiments we consider a specific problem instance and real-world dataset relevant to predicting snow properties in high-mountain Asia.
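The core idea above — letting prediction performance, rather than prior assumptions, choose the spatial aggregation — can be shown in miniature. This toy sketch is not genetic programming itself and uses entirely synthetic data: each "region" is a small pixel grid whose target depends on the pixels' maximum, and the search keeps whichever candidate aggregation minimizes the error of a simple regression on the aggregated feature.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regions (8x8 pixel grids) and a regional target that depends
# on the pixels' maximum.
regions = [rng.random((8, 8)) for _ in range(200)]
y = np.array([r.max() * 2.0 + 0.3 for r in regions])

# Candidate spatial aggregations, selected by model prediction performance.
aggregations = {"mean": np.mean, "max": np.max, "min": np.min, "std": np.std}

def fit_error(agg):
    """MSE of a 1-D least-squares fit on the aggregated regional feature."""
    x = np.array([agg(r) for r in regions])
    a, b = np.polyfit(x, y, 1)
    return float(np.mean((a * x + b - y) ** 2))

best = min(aggregations, key=lambda name: fit_error(aggregations[name]))
print("selected aggregation:", best)
```

Because the target is exactly linear in the regional maximum, the performance-driven search recovers "max" without that assumption being built in.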

Weighted Networks: Applications from Power Grid Construction to Crowd Control

McAndrew, Thomas Charles 01 January 2017 (has links)
Since the pioneering work of Erdős and Rényi in the 1950s, network theory (the study of objects and their associations) has blossomed into a full-fledged branch of mathematics. Because networks are so flexible, diverse scientific problems can be reformulated as networks and studied using a common set of tools. I define a network G = (V,E) composed of two parts: (i) the set of objects V, called nodes, and (ii) the set of relationships (associations) E, called links, that connect objects in V. We can extend the classic network of nodes and links by describing the intensity of these associations with weights. More formally, weighted networks augment the classic network with a function f(e) from links to the real line, uncovering powerful ways to model real-world applications. This thesis studies new ways to construct robust micro powergrids and to mine people's perceptions of causality on a social network, and proposes a new way to analyze crowdsourcing, all in the context of the weighted network model. The current state of Earth's ecosystem and intensifying climate change call on scientists to find new ways to harvest clean, affordable energy. A microgrid, a neighborhood-scale powergrid built using renewable energy sources attached to personal homes, suggests one way to ameliorate this energy crisis. We can study the stability (robustness) of such a small-scale system with weighted networks. A novel use of weighted networks and percolation theory guides the safe and efficient construction of power lines (links, E) connecting a small set of houses (nodes, V) to one another, weighting each power line by the distance between houses. This new look at the robustness of microgrid structures calls into question the efficacy of the traditional utility. The next study uses the Twitter social network to compare and contrast causal language with everyday conversation.
Collecting a set of 1 million tweets, we find that a set of words (unigrams), parts of speech, named entities, and sentiment signal the use of informal causal language. Breaking a problem that is difficult for a computer to solve into many parts and distributing these tasks to a group of humans is called crowdsourcing. My final project asks volunteers to 'reply' to questions asked of them and 'supply' novel questions for others to answer. I model this 'reply and supply' framework as a dynamic weighted network, proposing new theories about the network's behavior and how to steer it toward worthy goals. This thesis demonstrates novel uses of, enhances the current scientific literature on, and presents novel methodology for weighted networks.
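The percolation-style microgrid construction described above can be sketched with a distance-weighted network: houses at synthetic planar coordinates are connected only by lines shorter than a length threshold, and the size of the largest connected cluster measures how close the layout is to a single connected grid. All coordinates and the threshold here are illustrative.

```python
import math
import random
from collections import Counter

random.seed(1)

# Hypothetical neighborhood: houses are nodes V at random planar positions;
# candidate power lines are links E weighted by Euclidean distance.
houses = [(random.random(), random.random()) for _ in range(50)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Union-find over houses; keep only lines shorter than the threshold.
parent = list(range(len(houses)))

def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]  # path halving
        i = parent[i]
    return i

THRESHOLD = 0.25
for i in range(len(houses)):
    for j in range(i + 1, len(houses)):
        if dist(houses[i], houses[j]) < THRESHOLD:
            parent[find(i)] = find(j)

largest = max(Counter(find(i) for i in range(len(houses))).values())
print(f"largest connected cluster: {largest} of {len(houses)} houses")
```

Sweeping the threshold downward traces out the percolation transition: below a critical line length the neighborhood fragments into small disconnected clusters.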

An Examination Of College Persistence Factors For Students From Different Rural Communities: A Multilevel Analysis

Hudacs, Andrew 01 January 2017 (has links)
Students transitioning into college from public school require more than just academic readiness; they also need the personal attributes that allow them to successfully transition into a new community (Braxton, Doyle, Hartley III, Hirschy, Jones, & McLendon, 2014; Nora, 2002; Nora, 2004; Tinto, 1975). Rural students have a different educational experience than their peers at schools in suburban and urban locations (DeYoung & Howley, 1990; Gjelten, 1982). Additionally, the resources, culture, and educational opportunities at rural schools also vary among different types of rural communities. Although some studies have examined the influence of rural students' academic achievement on college access and success, little research has analyzed the relationship between students of different types of rural communities and their persistence in post-secondary education. This study examined the likelihood for college-going students from three different types of rural communities to successfully transition into and persist at a four-year residential college. Multilevel logistic modeling was used to analyze the likelihood for students to persist in college for up to two academic years based on whether they were from rural tourist communities, college communities, or other rural communities. The analysis controlled for a variety of student and high school factors. Findings revealed that student factors related to poverty and academic readiness have the greatest effects, while the type of rural community has no significant influence on college persistence.
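A single-level sketch of the kind of model described above (the study itself fits multilevel logistic models to real student data; everything here is synthetic and illustrative): persistence is predicted from academic readiness and dummy variables for community type, and the generating model mirrors the study's finding that readiness matters while community type does not.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic students: academic readiness and a three-level community type.
n = 2000
readiness = rng.normal(0.0, 1.0, n)
community = rng.integers(0, 3, n)  # 0: tourist, 1: college, 2: other rural

X = np.column_stack([np.ones(n), readiness,
                     (community == 1).astype(float),
                     (community == 2).astype(float)])

# True model: readiness predicts persistence; community type does not.
true_logits = -0.2 + 1.0 * readiness
persisted = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logits))).astype(float)

# Fit logistic regression by plain gradient ascent on the log-likelihood.
beta = np.zeros(4)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.5 * X.T @ (persisted - p) / n

print("readiness coefficient:", round(float(beta[1]), 2))
print("community-type coefficients:", np.round(beta[2:], 2))
```

The fitted readiness coefficient recovers the true effect while the community-type coefficients stay near zero; the multilevel version additionally nests students within high schools with random effects.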

Evaluating Bilateral Phenomena: The Case of Pain in Sickle Cell Disease

Dahman, Bassam 01 January 2007 (has links)
Symmetry in biological systems is the occurrence of an event on both sides of the system. The term bilateralism was introduced to represent this phenomenon, and it was defined as the conditional co-occurrence of two events given that at least one of them has occurred. This phenomenon is highly associated with the prevalence of each of the events. Two parameters were developed to evaluate the presence of this phenomenon, testing whether events co-occur with higher probability than would be expected by chance. Nonparametric confidence intervals were constructed using the bootstrap percentile method, and these nonparametric confidence intervals were used in testing the null hypothesis of no bilateralism. A simulation study was performed to examine the properties of the two bilateralism parameters' estimates. The size and power of the tests of bilateralism were examined under a variety of sample sizes and prevalences of the two events. The simulation study showed that both parameter estimates have similar properties, and the tests have similar size and power. The power of the test was affected by the prevalence of either event, the difference in the prevalences, the sample size, and the number of events that occur simultaneously. The methodology of testing for bilateralism was applied to data from the Pain in Sickle Cell Epidemiology Study (PiSCES). This study collected up to 6 months' worth of daily diaries about pain and medical utilization from patients with sickle cell disease. Each diary recorded the body site and side where pain was experienced over the past 24 hours. The sample consists of 119 subjects who completed at least 50 daily pain diaries (reference). Information about the subjects' age, gender, and sickle cell genotype was also available. Nine body sites (5 upper peripheral and 4 lower peripheral sites) were analyzed to test for bilateralism. Bilateralism was tested for each subject and each site separately.
The associations of the prevalence of bilateralism at each site, and of the percentage of sites that hurt bilaterally, with age, gender, and genotype were studied. The results show a high prevalence of bilateral pain among sickle cell patients at all sites. Age, gender, and genotype were associated with higher prevalence of bilateral pain at some, but not all, sites. The percentage of sites that have bilateral pain is also associated with the number of sites that have pain.
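The bootstrap percentile test described above can be sketched on synthetic diary data. The statistic below (one plausible reading of "co-occurrence beyond chance", not necessarily the parameter defined in the dissertation) is P(both sides | at least one side) minus its value under independence; an interval excluding zero rejects the no-bilateralism null.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic diary days: left/right pain indicators with genuine
# co-occurrence built in through a shared latent pain episode.
n_days = 300
latent = rng.random(n_days) < 0.3
left = latent | (rng.random(n_days) < 0.1)
right = latent | (rng.random(n_days) < 0.1)

def excess_cooccurrence(l, r):
    """P(both | at least one), minus its value under independence."""
    observed = (l & r).sum() / (l | r).sum()
    pl, pr = l.mean(), r.mean()
    expected = (pl * pr) / (pl + pr - pl * pr)
    return observed - expected

# Bootstrap percentile interval: resample days with replacement and take
# the 2.5th and 97.5th percentiles of the resampled statistic.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n_days, n_days)
    boot.append(excess_cooccurrence(left[idx], right[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% CI for excess co-occurrence: ({lo:.3f}, {hi:.3f})")
```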

Accounting for Model Uncertainty in Linear Mixed-Effects Models

Sima, Adam 01 February 2013 (has links)
Standard statistical decision-making tools, such as inference, confidence intervals and forecasting, are contingent on the assumption that the statistical model used in the analysis is the true model. In linear mixed-effects models, ignoring model uncertainty results in an underestimation of the residual variance, contributing to hypothesis tests that demonstrate larger than nominal Type-I errors and confidence intervals with smaller than nominal coverage probabilities. A novel utilization of the generalized degrees of freedom developed by Zhang et al. (2012) is used to adjust the estimate of the residual variance for model uncertainty. Additionally, the general global linear approximation is extended to linear mixed-effects models to adjust the standard errors of the parameter estimates for model uncertainty. Both of these methods use a perturbation method for estimation, where random noise is added to the response variable and, conditional on the observed responses, the corresponding estimate is calculated. A simulation study demonstrates that when the proposed methodologies are utilized, both the variance and standard errors are inflated for model uncertainty. However, when a data-driven strategy is employed, the proposed methodologies show limited usefulness. These methods are evaluated using data from a trial assessing the performance of cervical traction in the treatment of cervical radiculopathy.
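The perturbation idea above — add random noise to the response, refit, and measure how sensitively the fitted values track the injected noise — can be sketched for ordinary regression with a model-selection step (a simplification of the mixed-model setting; the data and the selection rule here are illustrative). The sensitivity sum is a Monte Carlo estimate of generalized degrees of freedom.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic data: a linear signal plus noise.
n = 100
x = np.linspace(-1.0, 1.0, n)
y = 1.5 * x + rng.normal(0.0, 1.0, n)

def fit_with_selection(y_obs):
    """Choose linear vs. cubic by penalized RSS, then return fitted values."""
    best = None
    for degree in (1, 3):
        X = np.vander(x, degree + 1)
        coef, *_ = np.linalg.lstsq(X, y_obs, rcond=None)
        fitted = X @ coef
        score = ((y_obs - fitted) ** 2).sum() + 2.0 * (degree + 1)
        if best is None or score < best[0]:
            best = (score, fitted)
    return best[1]

# Perturb the response, refit (selection step included), and accumulate
# cov(fitted_i, noise_i) / tau^2 across observations.
tau, reps = 0.3, 200
deltas = rng.normal(0.0, tau, (reps, n))
fits = np.array([fit_with_selection(y + d) for d in deltas])
gdf = sum(np.cov(fits[:, i], deltas[:, i])[0, 1] for i in range(n)) / tau**2
print(f"estimated generalized degrees of freedom: {gdf:.1f}")
```

Because the selection step itself reacts to the injected noise, the estimate will typically exceed the parameter count of the selected model alone; that excess is the model uncertainty the adjusted residual variance is meant to absorb.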

Choosing the Cut Point for a Restricted Mean in Survival Analysis, a Data Driven Method

Sheldon, Emily H 25 April 2013 (has links)
Survival analysis generally uses the median survival time as a common summary statistic. While the median possesses the desirable characteristic of being unbiased, there are times when it is not the best statistic to describe the data at hand. Royston and Parmar (2011) argue that the restricted mean survival time should be the summary statistic used when the proportional hazards assumption is in doubt. Work on restricted means dates back to 1949, when J.O. Irwin developed a calculation for the standard error of the restricted mean using Greenwood's formula. Since then the development of the restricted mean has been thorough in the literature, but its use in practical analyses is still limited. One area that is not well developed in the literature is the choice of the time point to which the mean is restricted. The aim of this dissertation is to develop a data-driven method that allows the user to find a cut point at which to restrict the mean. Three methods are developed. The first is a simple method that locates the time at which the maximum distance between the two survival curves occurs. The second is adapted from a Renyi-type test, typically used when proportional hazards assumptions are not met: the Renyi statistics are plotted, a piecewise regression model is fit, and the join point of the two pieces is the time at which the mean is restricted. The third applies a nonlinear model to the hazard estimates at each event time; the model allows the hazards of the two groups to differ up until a certain time, after which they are the same. The time point where the two groups' hazards become the same is the time to which the mean is restricted. The methods are evaluated using MSE and bias calculations, and bootstrap techniques are used to estimate the variance.
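The first cut-point method above can be illustrated on synthetic, uncensored exponential survival data (hazards 0.2 vs. 0.1); with no censoring the Kaplan-Meier curve reduces to the empirical survival curve, which keeps the sketch self-contained. The cut point is the time of maximum distance between the two curves, and the restricted mean is the area under each curve up to that point.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic uncensored survival times for two groups.
group_a = rng.exponential(5.0, 150)   # hazard 0.2
group_b = rng.exponential(10.0, 150)  # hazard 0.1

grid = np.linspace(0.0, 30.0, 301)

def survival(times, grid):
    """Empirical survival curve (Kaplan-Meier without censoring)."""
    return np.array([np.mean(times > t) for t in grid])

s_a, s_b = survival(group_a, grid), survival(group_b, grid)

# Method 1 cut point: time at which the two curves are farthest apart.
tstar = grid[np.argmax(np.abs(s_a - s_b))]

def restricted_mean(s, t, cut):
    """Area under the survival curve up to the cut point (trapezoid rule)."""
    m = t <= cut
    return float(np.sum((s[m][1:] + s[m][:-1]) / 2.0 * np.diff(t[m])))

rmst_a = restricted_mean(s_a, grid, tstar)
rmst_b = restricted_mean(s_b, grid, tstar)
print(f"cut point t* = {tstar:.1f}; restricted means: {rmst_a:.2f} vs {rmst_b:.2f}")
```

For these two exponentials the curves are theoretically farthest apart at t = ln(2)/0.1 ≈ 6.9, so the data-driven cut point should land nearby.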

Detecting and Correcting Batch Effects in High-Throughput Genomic Experiments

Reese, Sarah 19 April 2013 (has links)
Batch effects are due to probe-specific systematic variation between groups of samples (batches) resulting from experimental features that are not of biological interest. Principal components analysis (PCA) is commonly used as a visual tool to determine whether batch effects exist after applying a global normalization method. However, PCA yields linear combinations of the variables that contribute maximum variance and thus will not necessarily detect batch effects if they are not the largest source of variability in the data. We present an extension of principal components analysis to quantify the existence of batch effects, called guided PCA (gPCA). We describe a test statistic that uses gPCA to test whether a batch effect exists. We apply our proposed test statistic derived using gPCA to simulated data and to two copy number variation case studies: the first study consisted of 614 samples from a breast cancer family study using Illumina Human 660 bead-chip arrays, whereas the second consisted of 703 samples from a family blood pressure study that used the Affymetrix SNP Array 6.0. We demonstrate that our statistic has good statistical properties and is able to identify significant batch effects in both case studies. We further compare existing batch effect correction methods and apply gPCA to test their effectiveness. We conclude that our novel statistic, which utilizes guided principal components analysis to identify whether batch effects exist in high-throughput genomic data, is effective. Although our examples pertain to copy number data, gPCA is general and can be used on other data types as well.
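The guided-PCA idea above can be sketched in simplified form (an illustration of the idea, not the published estimator): compare the variance captured along the batch-guided direction with the variance along the leading ordinary principal component, and calibrate the ratio by permuting batch labels. All data here are synthetic, with a probe-specific batch shift planted in a subset of probes.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic expression-like matrix: 60 samples x 200 probes, two batches,
# with a probe-specific shift in 20 of the 200 probes for batch 1.
n, p = 60, 200
batch = np.repeat([0, 1], n // 2)
X = rng.normal(0.0, 1.0, (n, p))
X[batch == 1, :20] += 1.5
X = X - X.mean(axis=0)

# Batch indicator matrix Y guides the projection.
Y = np.column_stack([(batch == 0).astype(float), (batch == 1).astype(float)])

def delta(X, Y):
    """Variance along the batch-guided direction relative to ordinary PC1."""
    vg = np.linalg.svd(Y.T @ X, full_matrices=False)[2][0]  # guided direction
    vu = np.linalg.svd(X, full_matrices=False)[2][0]        # ordinary PC1
    return float(np.var(X @ vg) / np.var(X @ vu))

obs = delta(X, Y)
perms = [delta(X, Y[rng.permutation(n)]) for _ in range(200)]
pval = float(np.mean([d >= obs for d in perms]))
print(f"delta = {obs:.3f}, permutation p-value = {pval:.3f}")
```

A small permutation p-value indicates that the batch labels line up with a dominant direction of variation in the data, i.e., a batch effect worth correcting before downstream analysis.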
