241

Statistical Analysis of the Environmental Geochemistry of an Unmined Uranium Ore Deposit

Levitan, Denise Madeline 13 August 2014 (has links)
An evaluation of the geochemistry of the environment prior to large-scale changes enables scientists and other stakeholders to assess both baseline conditions and the potential impact of those changes on the environment. One area in which documentation of pre-development geochemistry is particularly important is the exploitation of ore deposits. Ore deposits consist of concentrations of elements or minerals that are enriched enough to be of potential economic value. Their unusual geochemistry often leaves a signature on the environment that can both aid in locating an economic resource and present environmental management challenges during its lifecycle. Coles Hill, Virginia, represents one such site. The Coles Hill property is the location of uranium-enriched rock, commonly referred to as the Coles Hill uranium deposit. This dissertation outlines study design, sampling, and statistical analysis methods that can be used in the geochemical characterization of a potential resource extraction site. It presents three studies on geoenvironmental media at Coles Hill. The first study discusses sampling strategies and statistical analyses that address variability in geology, hydrology, and climate for baseline assessment and presents an example of such an assessment at Coles Hill. Results suggest a localized environmental impact of the deposit, but differences in bedrock geology within the area surrounding the deposit could also be responsible for some of the variation. This study also emphasizes the importance of considering data below analytical detection limits and describes methods for doing so. The second study compares the geochemistry of soil samples collected at Coles Hill with reference data collected by the U.S. Geological Survey using multivariate statistical techniques. Differences are used to suggest potential pathfinder elements, such as light rare earth elements, to aid in exploration for similar deposits. The third study uses multivariate statistical analysis to examine differences among rocks, soils, and stream sediments to infer important geochemical processes involved in weathering of the deposit. Overall, the results of these studies can aid in the development of future environmental site studies at Coles Hill and elsewhere. / Ph. D.
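The first study's handling of values below analytical detection limits invites a brief illustration. The following is a minimal Python sketch, not drawn from the dissertation: the concentrations, the detection limit, and the DL/2 substitution rule are invented for illustration, and the rank-based comparison is one standard way to avoid sensitivity to the substituted value.

```python
# Hypothetical sketch: summarizing a geochemical analyte with nondetects.
# Values below the analytical detection limit (DL) are left-censored; a common
# simple approach substitutes DL/2, while rank-based tests avoid substitution.
import numpy as np
from scipy import stats

dl = 0.05  # illustrative detection limit, mg/L
upstream   = np.array([0.02, np.nan, 0.11, 0.08, np.nan, 0.04])  # nan = "<DL"
downstream = np.array([0.21, 0.35, np.nan, 0.44, 0.18, 0.27])

def substitute(x, dl, frac=0.5):
    """Replace nondetects (recorded here as nan) with frac * detection limit."""
    return np.where(np.isnan(x), frac * dl, x)

# Substitution-based summary (biased when censoring is heavy)
print("downstream mean, DL/2 substitution:", substitute(downstream, dl).mean())

# Rank-based comparison: nondetects tie at a value below every detect, so the
# Mann-Whitney U test is insensitive to the particular substituted number.
u, p = stats.mannwhitneyu(substitute(upstream, dl, 1e-6),
                          substitute(downstream, dl, 1e-6),
                          alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p:.3f}")
```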
242

Predictive reliabilities for electronic components

Nagarur, Nagendra N. January 1988 (has links)
A reliability model to study the behavior of an electronic component subject to several failure mechanisms is developed. The mechanisms considered for the analysis are of degradation type, where the number of defects for a mechanism increases with time, eventually causing the failure of the component. The failure pattern of the component subject to a single mechanism with given initial and final numbers of defects is modelled as a pure birth process. Failure time for this mechanism is expressed as the first passage time of the birth process to state k from initial state l. The first passage time distribution is derived for different forms of transition rates. When the initial and final states of the process are considered as random, the failure time is expressed as the mixture distribution obtained from the conditional first passage time distributions. The mixture distributions are well represented by a Weibull distribution. A computer program is developed to compute the parameters of the Weibull distribution iteratively by the method of matching moments. The approximation results are statistically validated. The results for a single mechanism are extended to the case of multiple mechanisms. Extreme-value theory and competing risk theory are applied to analyze the simultaneous effects of multiple mechanisms. It is shown that the aggregate failure time distribution has a Weibull form under both theories. The model explains the influence of physical and chemical properties of the component and the operating conditions on the failure times. It can be used for accelerated testing and for incorporating reliability at the product design stage. / Ph. D.
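The Weibull approximation by the method of matching moments can be sketched concretely. This is a hedged illustration rather than the dissertation's program: for a Weibull distribution the squared coefficient of variation depends on the shape parameter alone, so the shape can be found by an iterative root-find and the scale recovered from the mean. The simulated sample is a stand-in for first passage times from the birth process.

```python
# Sketch: fit Weibull(k, lam) to a failure-time sample by matching moments.
# For a Weibull, CV^2 = Gamma(1 + 2/k) / Gamma(1 + 1/k)^2 - 1 depends on the
# shape k alone, so k is found by root-finding and lam from the mean.
import numpy as np
from math import gamma
from scipy.optimize import brentq

def weibull_from_moments(mean, var):
    cv2 = var / mean**2
    f = lambda k: gamma(1 + 2 / k) / gamma(1 + 1 / k) ** 2 - 1 - cv2
    k = brentq(f, 0.1, 50.0)         # iterative solve for the shape
    lam = mean / gamma(1 + 1 / k)    # scale from the first moment
    return k, lam

# Stand-in "first passage time" sample (true shape 1.8, scale 3.0)
rng = np.random.default_rng(0)
t = 3.0 * rng.weibull(1.8, size=10_000)

k_hat, lam_hat = weibull_from_moments(t.mean(), t.var())
print(f"shape k = {k_hat:.2f}, scale lambda = {lam_hat:.2f}")
```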
243

Quantification of Effect of Solar Storms on TEC over U.S. sector Using Machine Learning

Sardana, Disha 26 June 2018 (has links)
A study of large solar storms in the equinox periods of solar cycles 23 and 24 is presented to quantify their effects on the total electron content (TEC) in the ionosphere. We study the dependence of TEC over the contiguous US on various storm parameters, including the onset time of the storm, the duration of the storm, its intensity, and the rate of change of the ring current response. These parameters are inferred autonomously and compared to TEC values obtained from the CORS network of GPS stations. To quantify the effects, we examine the difference between the storm-time TEC value and an average from 5 quiet days during the same month. These values are studied over a grid with 1° × 1° spatial resolution in latitude and longitude over the US sector. Correlations between storm parameters and the quantified delta TEC values are studied using machine learning techniques to identify the most important controlling variables. The weights inferred by the algorithm for each input variable show its importance to the resultant TEC change. The results of this work are compared to recent TEC studies to investigate the effects of large storms on the distribution of ionospheric density over large spatial and temporal scales. / MS / This study analyzes the impact of geomagnetic storms on the electrical properties of the upper atmosphere at altitudes where satellites routinely fly. The storms are caused by bursts of charged particles from the sun entering the Earth's atmosphere at high latitudes, leading to phenomena like the aurora. These fluctuations in the atmospheric electrical properties can potentially have serious consequences for the electrical power grid, the communications infrastructure, and various technological systems. Given the risks solar storms can pose, it is important to predict how strong the impact of a given storm is likely to be. The current study applies machine learning techniques to model one particular parameter that relates to the electrified atmosphere over the contiguous US sector. We quantify the strength of the fluctuations as a function of various storm parameters, including onset time and duration. This enables us to autonomously infer which storm parameters have the most significant influence on the resultant atmospheric changes, and to compare our results to other recent studies.
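The quantification step described above, storm-time TEC minus a five-quiet-day average on a 1° × 1° grid, reduces to a per-cell difference. A minimal sketch under assumed array shapes; the grid dimensions, quiet-day indices, and values are synthetic stand-ins, not CORS data.

```python
# Sketch: per-cell delta TEC = storm-day TEC minus the mean of 5 quiet days.
# Shapes assume a 1 x 1 degree grid over the contiguous US (~25 x 58 cells).
import numpy as np

rng = np.random.default_rng(1)
tec = rng.uniform(5.0, 50.0, size=(31, 25, 58))  # [day, lat, lon] in TECU
quiet_days = [2, 7, 13, 21, 28]                  # assumed quiet-day indices
storm_day = 17

quiet_baseline = tec[quiet_days].mean(axis=0)    # per-cell quiet-time average
delta_tec = tec[storm_day] - quiet_baseline      # storm-time departure (TECU)

print("max enhancement:", round(delta_tec.max(), 1), "TECU")
print("max depletion:  ", round(delta_tec.min(), 1), "TECU")
```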
244

Tapping the Vast Potential of the Data Deluge in Small-scale Food-Animal Production Businesses: Challenges to Near Real-time Data Analysis and Interpretation

Vial, F., Tedder, Andrew 13 September 2019 (has links)
Yes / Food-animal production businesses are part of a data-driven ecosystem shaped by stringent requirements for traceability along the value chain and the expanding capabilities of connected products. Within this sector, the generation of animal health intelligence, particularly on antimicrobial usage, is hindered by the lack of a centralized framework for data storage and usage. In this Perspective, we delimit the 11 processes required for evidence-based decisions and explore processes 3 (digital data acquisition) to 10 (communication to decision-makers) in more depth. We argue that small agribusinesses disproportionately face challenges related to economies of scale, given the high price of equipment and services. There are two main areas of concern regarding the collection and usage of digital farm data. First, recording platforms must be developed with the needs and constraints of small businesses in mind and move away from local data storage, which hinders data accessibility and interoperability. Second, such data are unstructured and exhibit properties that make near real-time preprocessing and analysis challenging in a sector that largely lags behind others in computing infrastructure and adoption of digital technologies. To complete the digital transformation of this sector, investment in rural digital infrastructure is required, alongside the development of new business models that empower small businesses to commit to near real-time data capture. This approach will deliver critical information to fill gaps in our understanding of emerging diseases and antimicrobial resistance in production animals, eventually leading to effective evidence-based policies. / This article is part of the research topic "Digital transformation of animal health data: Proceedings of the AHEAD 2017 workshop" (https://www.frontiersin.org/research-topics/5834#articles)
245

Towards a framework for engineering big data: An automotive systems perspective

Byrne, Thomas J., Campean, Felician, Neagu, Daniel 05 1900 (has links)
Yes / Demand for more sophisticated models to meet big data expectations requires significant data repository obligations, operating concurrently in higher-level applications. Current models provide only disjointed modelling paradigms. The proposed framework addresses the need for higher-level abstraction, using low-level logic in the form of axioms from which higher-level functionality is logically derived. The framework facilitates the definition and usage of subjective structures across the cyber-physical system domain, and is intended to bring together the range of heterogeneous data-driven objects.
246

A scoping review to identify the techniques frequently used when analysing qualitative visual data

Smith, S.K., Mountain, Gail, Hawkins, R.J. 30 September 2015 (has links)
No / Challenges were encountered when attempting to analyse video-based data during a project exploring touch screen computer technology with people living with dementia. To inform the analytic process, a scoping review of published evidence was undertaken. Results of the review illustrated the use of various techniques for analysing visual data, the most common of which was transcription of video into text followed by conversation analysis. Three additional issues emerged in the course of the review. First, there is an absence of detail in describing the ethical implications of utilising visual methods in research. Second, limited priority is given to providing a clear rationale for utilising visual methods when audio recordings or field notes may have been a viable alternative. Third, only 40% of reviewed articles clearly stated a chosen methodology. The conclusions of the review illustrate a lack of consistency across studies in the overall reporting of research methods and recommend that authors be explicit in their reporting of methodological issues across the research process. / The PhD is funded by the ESRC as part of the White Rose University Consortium
247

New Teacher PBL Planning and Implementation

Harris, Marlon Karel 08 1900 (has links)
This study investigated novice science and mathematics teachers' beliefs about the planning and implementation of project-based instruction. Data for this qualitative study included two focus groups and a questionnaire. Items in the questionnaire were designed using preliminary findings from the analysis of the two focus groups and predefined items from the National Survey of Project Based Learning and High School Reform. The questionnaire was administered to 138 novice secondary mathematics and science teachers certified in their respective content areas. The response rate was 70% (n = 96). Of the 96 respondents, only 28 participants utilized project-based instruction. Data analysis revealed that the 28 participants held two specific beliefs about project-based instruction. First, participants believed that the implementation and enactment of PBL: (a) made student learning more personalized by specifically meeting the individual interests or needs of students; (b) promoted students' international or cross-cultural understanding; (c) promoted students' civic engagement and contributions to the community or world; and (d) impacted high-achieving students' ability levels. Second, participants believed that the implementation and enactment of PBL: (a) made teaching and learning more varied, challenging, or fun; (b) taught skills beyond academic content; and (c) taught academic content knowledge and skills more effectively. These two beliefs were situated within contexts where the instructional strategy most often used to deliver content was direct instruction, and were shaped by external factors that impacted the participants' utilization and implementation of PBL. External factors included testing and accountability requirements; a lack of time for planning and implementation of PBL lessons; students' lack of experience or skills associated with enactment of PBL; large class sizes; limited classroom space; and poor attendance.
248

A Heuristic Approach to Selection of Analytical Methods: Three Empirical Healthcare Studies

Tarakci, Yasemin 08 1900 (has links)
Managers rely on analytics to make decisions, and the choice of analytical method can influence their decision-making. This dissertation considers three cases and examines how the choice of analytical methods influences interpretations and implications. The areas are communication of health-related information in social media, health information technology investment by hospitals as it relates to patient satisfaction, and health-related expenditure policies of countries. These studies develop theoretical models and empirically test them on primary or secondary data, comparing the performance of popular analytical methods. The conduct of these three studies contributes to a better understanding of the choice of analytical methods and allows the development of a heuristic approach by offering guidelines for selecting an appropriate methodology. They demonstrate the value of heuristic approaches for use with non-traditional and traditional statistical methods, as the information gained from non-traditional methods (NNs) provides insights into traditional statistical methods, similar to insights gained from exploratory data analysis. The studies also show the value of examining any dataset with multiple methods, because the methods either confirm one another or fail to, and each outcome provides insight.
249

Using Neural Networks to Classify Discrete Circular Probability Distributions

Gaumer, Madelyn 01 January 2019 (has links)
Given the rise in the application of neural networks to all sorts of interesting problems, it seems natural to apply them to statistical tests. This senior thesis studies whether neural networks built to classify discrete circular probability distributions can outperform a class of well-known statistical tests for uniformity for discrete circular data that includes the Rayleigh test [1], the Watson test [2], and the Ajne test [3]. Each neural network used is relatively small, with no more than 3 layers: an input layer taking in discrete data sets on a circle, a hidden layer, and an output layer outputting probability values between 0 and 1, with 0 mapping to uniform and 1 mapping to nonuniform. In evaluating performance, I compare the accuracy, type I error, and type II error of this class of statistical tests and of the neural networks built to compete with them. [1] Jammalamadaka, S. Rao and SenGupta, A. Topics in Circular Statistics. Series on Multivariate Analysis, 5. World Scientific Publishing Co., River Edge, NJ, 2001. [2] Watson, G. S. Goodness-of-fit tests on a circle. II. Biometrika 49 (1962), 57–63. [3] Ajne, B. A simple test for uniformity of a circular distribution. Biometrika 55 (1968), 343–354.
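The Rayleigh test [1], one baseline the networks are compared against, is compact enough to sketch. A hedged Python illustration under the usual asymptotics; the binned counts are invented, and expanding counts to angles is one simple way to apply the test to discrete circular data.

```python
# Sketch: the Rayleigh test for uniformity on the circle.  The mean resultant
# length R = |sum exp(i*theta)| / n is near 0 under uniformity, and 2*n*R^2 is
# asymptotically chi-squared with 2 degrees of freedom.
import numpy as np
from scipy.stats import chi2

def rayleigh_test(theta):
    n = len(theta)
    R = np.abs(np.exp(1j * theta).sum()) / n
    stat = 2 * n * R**2
    return stat, chi2.sf(stat, df=2)

# Discrete circular data: counts on 12 equally spaced bins, expanded to angles
counts = np.array([9, 11, 8, 14, 30, 25, 10, 7, 9, 8, 12, 7])  # invented
theta = np.repeat(2 * np.pi * np.arange(12) / 12, counts)

stat, p = rayleigh_test(theta)
print(f"Rayleigh statistic = {stat:.2f}, p = {p:.4f}")  # small p -> nonuniform
```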
250

Multiple Calibrations in Integrative Data Analysis: A Simulation Study and Application to Multidimensional Family Therapy

Hall, Kristin Wynn 01 January 2013 (has links)
A recent advancement in statistical methodology, integrative data analysis (IDA; Curran & Hussong, 2009), has led researchers to employ a calibration technique so as not to violate an independence assumption. This technique uses a randomly selected subset with a simplified correlational structure, or calibration, of a whole data set in a preliminary stage of analysis. However, a single calibration estimator suffers from instability, low precision, and loss of power. To overcome this limitation, a multiple calibration (MC; Greenbaum et al., 2013; Wang et al., 2013) approach has been developed to produce better estimators while still removing a level of dependency in the data so that the independence assumption holds. The MC method is conceptually similar to multiple imputation (MI; Rubin, 1987; Schafer, 1997), so MI estimators were borrowed for comparison. A simulation study was conducted to compare the MC and MI estimators, as well as to evaluate the operating characteristics of the methods in a cross-classified data characteristic design. The estimators were tested in the context of assessing change over time in a longitudinal data set. Multiple calibrations consisting of a single measurement occasion per subject were drawn from a repeated measures data set, analyzed separately, and then combined by the rules set forth by each method to produce the final results. The data characteristics investigated were effect size, sample size, and the number of repeated measures per subject. Additionally, a real data application of an MC approach in an IDA framework was conducted on data from three completed randomized controlled trials studying the treatment effects of Multidimensional Family Therapy (MDFT; Liddle et al., 2002) on substance use trajectories for adolescents at a one-year follow-up. The simulation study provided empirical evidence of how the MC method performs, as well as how it compares to the MI method, in a total of 27 hypothetical scenarios. Strong asymptotic tendencies were observed: the bias, standard error, mean square error, and relative efficiency of an MC estimator approached those of the whole-data-set estimators as the number of calibrations approached 100. The MI combination rules proved inappropriate to borrow for the MC case because the standard error formulas were too conservative and performance with respect to power was not robust. As a general suggestion, 5 calibrations are sufficient to produce an estimator with about half the bias of a single calibration estimator and at least some indication of significance, while 20 calibrations are ideal; after 20 calibrations, the contribution of an additional calibration to the combined estimator greatly diminished. The MDFT application demonstrated a successful implementation of a 5-calibration approach in an IDA on real data, as well as the risk of missing treatment effects when analysis is limited to a single calibration's results. Additionally, results from the application provided evidence that MDFT interventions reduced trajectories of substance use involvement at a one-year follow-up to a greater extent than any of the active control treatment groups, overall and across all gender and ethnicity subgroups. This paper will aid researchers interested in employing an MC approach in an IDA framework, or whenever a level of dependency in a data set needs to be removed for an independence assumption to hold.
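The core of the MC procedure — keep one randomly chosen measurement occasion per subject so rows are independent, estimate on the subset, repeat, and combine — can be shown in a few lines. This is a hedged sketch, not the dissertation's code: the data-generating model, the OLS slope estimator, and simple averaging as the combination rule are all illustrative assumptions.

```python
# Sketch of the multiple calibration (MC) idea on synthetic longitudinal data.
import numpy as np

rng = np.random.default_rng(2)
n_subjects, n_occasions = 200, 4

# Synthetic repeated measures: y grows with slope 0.5 plus subject effects,
# so rows within a subject are correlated (the dependency MC removes).
subject_effect = rng.normal(0.0, 1.0, n_subjects)[:, None]
time = np.arange(n_occasions, dtype=float)[None, :]
y = 0.5 * time + subject_effect + rng.normal(0.0, 1.0, (n_subjects, n_occasions))

def one_calibration(y, rng):
    """Slope of y on time using one random occasion per subject."""
    occ = rng.integers(0, n_occasions, n_subjects)
    yi = y[np.arange(n_subjects), occ]
    return np.polyfit(occ.astype(float), yi, 1)[0]  # OLS on independent rows

M = 20  # the simulation results above suggest ~20 calibrations is ample
estimates = [one_calibration(y, rng) for _ in range(M)]
print(f"combined slope = {np.mean(estimates):.3f} (true value 0.5)")
```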
