  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
311

Guidelines for the Partial Area under the Summary Receiver Operating Characteristic (SROC) Curve

Fill, Roxanne 12 1900 (has links)
<p> The accuracy of a diagnostic test is often evaluated with the measures of sensitivity and specificity, and the joint dependence between these two measures is captured by the receiver operating characteristic (ROC) curve. To combine multiple testing results from studies that are assumed to follow the same underlying probability law, a smooth summary receiver operating characteristic (SROC) curve can be fitted. Moses et al. (1993) proposed a least squares approach to fit the smooth SROC curve. </p> <p> In this thesis we review the summary measures for the ROC curve in single-study data as well as the summary statistics for SROC curves in meta-analysis. These summary statistics include the area under the curve (AUC), the Q* statistic, the area swept under the curve (ASC), and the partial area under the curve (pAUC). </p> <p> Our focus, however, is mainly on the partial area under the SROC curve, as it is used frequently in meta-analysis of diagnostic testing. The appeal of using the pAUC instead of the full AUC is that the partial area can focus on a clinically relevant region of the SROC curve where the false positive rate (FPR) is small. Simulations and considerations for the use of the summary indices of the ROC and SROC curves are presented here. </p> / Thesis / Master of Science (MSc)
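As an illustration of the index this abstract describes, here is a minimal numerical sketch of the pAUC under a Moses-style SROC curve, assuming the Moses et al. (1993) linear model D = a + b*S on the logit scale; the parameter values are invented, and this is not the thesis's own code.

```python
import numpy as np

def sroc_tpr(fpr, a, b):
    """TPR on the Moses SROC curve.

    From D = a + b*S with D = logit(TPR) - logit(FPR) and
    S = logit(TPR) + logit(FPR), solving for logit(TPR) gives
    logit(TPR) = (a + (1 + b) * logit(FPR)) / (1 - b).
    """
    logit_fpr = np.log(fpr / (1.0 - fpr))
    logit_tpr = (a + (1.0 + b) * logit_fpr) / (1.0 - b)
    return 1.0 / (1.0 + np.exp(-logit_tpr))

def partial_auc(a, b, fpr_max=0.2, n=2000):
    """Trapezoidal pAUC over the clinically relevant region (0, fpr_max]."""
    fpr = np.linspace(1e-6, fpr_max, n)
    return np.trapz(sroc_tpr(fpr, a, b), fpr)
```

Restricting the integral to small FPR is exactly what makes the pAUC focus on the clinically relevant left-hand region of the curve.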
312

DATA FITTING AND LEAST-SQUARE ESTIMATION OF NONLINEAR PARAMETERS FOR MODELS OF DIELECTRIC RELAXATION DATA

Zou, Hai 06 1900 (has links)
<p> This thesis develops a tool that calculates the parameters of a chosen theoretical model of dielectric relaxation and then performs the curve fitting, requiring only experimental data from the user. To the best of our knowledge, this is the first such tool to calculate the parameters of a theoretical dielectric relaxation model while the user only needs to provide the experimental data. The parameters are calculated using a nonlinear least-squares algorithm implemented in Matlab and a nonlinear function minimizer available in Matlab. The curve fitting is done not in the traditional way, such as with cubic splines, but by computing simulated data from the chosen model and the estimated parameters. </p> <p> The available mathematical models include the popular theoretical models: the Cole-Davidson (CD), the Kohlrausch-Williams-Watts (KWW), the Havriliak-Negami (HN), and the model proposed by R. Hilfer (FD). </p> <p> There are two ways to calculate the parameters for each model, as mentioned before. The result returned by this system may not be unique; in particular, if the frequency range of the data is not wide enough, the result will most likely be non-unique. Since an iterative method is used, it is suggested that the user provide initial values based on their best knowledge of the data and of the tested sample's dielectric relaxation process. </p> <p> It is normal for some regions to fit worse than others; one possible reason is a deficiency of the mathematical model, which may not describe that region. For further information, please contact me by email at zouhaijun at yahoo.com. </p> / Thesis / Master of Science (MSc)
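The fitting approach described here can be sketched outside Matlab as well; the following illustration fits synthetic Havriliak-Negami data with SciPy's nonlinear least squares. All parameter values and the starting point are invented, and this is an assumption-laden sketch rather than the thesis's implementation.

```python
import numpy as np
from scipy.optimize import least_squares

def hn_eps(omega, d_eps, tau, alpha, beta, eps_inf):
    """Havriliak-Negami complex permittivity:
    eps*(w) = eps_inf + d_eps / (1 + (i*w*tau)^alpha)^beta."""
    return eps_inf + d_eps / (1.0 + (1j * omega * tau) ** alpha) ** beta

def residuals(p, omega, data):
    # stack real and imaginary parts so the solver sees real residuals
    diff = hn_eps(omega, *p) - data
    return np.concatenate([diff.real, diff.imag])

omega = np.logspace(-2, 4, 60)           # angular frequencies (illustrative)
true = np.array([5.0, 1.0, 0.8, 0.6, 2.0])
data = hn_eps(omega, *true)              # noiseless synthetic "measurements"

# iterative solver: initial values matter, as the abstract notes
fit = least_squares(residuals, x0=[3.0, 0.5, 0.9, 0.9, 1.0],
                    args=(omega, data),
                    bounds=([0.0, 1e-6, 0.01, 0.01, 0.0],
                            [np.inf, np.inf, 1.0, 1.0, np.inf]))
```

With a narrow frequency window the same fit can become non-unique, which matches the abstract's warning about providing informed initial values.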
313

ADAPTIVE VERTICAL SEISMIC ISOLATION FOR EQUIPMENT

Najafijozani, Mohammadreza January 2019 (has links)
Seismic isolation systems are widely recognized as beneficial for protecting both acceleration- and displacement-sensitive nonstructural systems and components. Furthermore, adaptive isolation systems have been shown to enable engineers to achieve various performance goals under multiple hazard levels. These systems have been implemented for horizontal excitation, but there has been very limited research on isolation for vertical excitation. Thus, this paper seeks to evaluate the benefit of adaptive vertical isolation systems for component isolation, specifically for nuclear plants. To do this, three vertical isolation systems are designed to achieve multiple goals: a linear spring and a linear damper (LSLD), a linear spring and a nonlinear damper (LSND), and a nonlinear spring and a linear damper (NSLD). To investigate the effectiveness of the proposed systems, a stiff piece of equipment is considered at an elevated floor within a power plant. A set of 30 triaxial ground motions is used to investigate the seismic response of the equipment. The maximum isolation displacement and equipment acceleration are used to assess the effectiveness of the three isolation systems. While all systems significantly reduce the seismic accelerations on the equipment compared to the fixed-base case, an LSND system is shown to exhibit superior seismic performance across multiple hazard levels. / Thesis / Master of Applied Science (MASc)
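At a far smaller scale than the plant model in this thesis, the LSLD-style configuration (linear spring, linear damper) can be sketched as a single-degree-of-freedom isolated mass under base excitation; the mass, stiffness, damping, and excitation below are all invented for illustration.

```python
import numpy as np

def sdof_response(ag, dt, m=1.0, k=400.0, c=4.0):
    """Central-difference integration of m*u'' + c*u' + k*u = -m*ag(t):
    a linear spring (k) and linear damper (c) isolating mass m from
    base acceleration ag. Returns the relative displacement history."""
    n = len(ag)
    u = np.zeros(n)
    # start-up value u(-dt) from u(0)=0, u'(0)=0, u''(0) = -ag[0]
    u_prev = 0.5 * dt**2 * (-ag[0])
    a_coef = m / dt**2 + c / (2.0 * dt)
    b_coef = 2.0 * m / dt**2 - k
    c_coef = m / dt**2 - c / (2.0 * dt)
    for i in range(n - 1):
        u[i + 1] = (-m * ag[i] + b_coef * u[i] - c_coef * u_prev) / a_coef
        u_prev = u[i]
    return u

dt = 0.005                 # well below the stability limit 2/omega_n = 0.1 s
ag = np.ones(4000)         # a 20 s step in base acceleration (illustrative)
u = sdof_response(ag, dt)  # oscillates, then settles at -m*ag/k
```

The peak of `u` corresponds to the maximum isolation displacement used as an assessment metric above; nonlinear spring or damper laws (NSLD, LSND) would replace the constant `k` or `c` terms inside the loop.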
314

Methodological and analytical considerations on ranking probabilities in network meta-analysis: Evaluating comparative effectiveness and safety of interventions

Daly, Caitlin Helen January 2020 (has links)
Network meta-analysis (NMA) synthesizes all available direct (head-to-head) and indirect evidence on the comparative effectiveness of at least three treatments and provides coherent estimates of their relative effects. Ranking probabilities are commonly used to summarize these estimates and provide comparative rankings of treatments. However, the reliability of ranking probabilities as summary measures has not been formally established, and treatments are often ranked for each outcome separately. This thesis aims to address methodological gaps and limitations in the current literature by providing alternative methods for evaluating the robustness of treatment ranks, establishing comparative rankings, and integrating ranking probabilities across multiple outcomes. These novel tools, addressing three specific objectives, are developed in three papers. The first paper presents a conceptual framework for quantifying the robustness of treatment ranks and for elucidating potential sources of lack of robustness. Cohen’s kappa is proposed for quantifying the agreement between two sets of ranks based on NMAs of the full data and a subset of the data. A leave-one-study-out strategy was used to illustrate the framework with empirical data from published NMAs, where ranks based on the surface under the cumulative ranking curve (SUCRA) were considered. Recommendations for using this strategy to evaluate sensitivity or robustness to concerning evidence are given. When two or more cumulative ranking curves cross, treatments with large probabilities of ranking the best, second best, third best, etc. may rank worse than treatments with smaller corresponding probabilities based on SUCRA. This limitation of SUCRA is addressed in the second paper through the proposal of partial SUCRA (pSUCRA) as an alternative measure for ranking treatments. 
pSUCRA is adopted from the partial area under the receiver operating characteristic curve in diagnostic medicine and is derived to summarize relevant regions of the cumulative ranking curve. Knowledge users are often faced with the challenge of making sense of large volumes of NMA results presented across multiple outcomes. This may be further complicated if the comparative rankings on each outcome contradict each other, leading to subjective final decisions. The third paper addresses this limitation through a comprehensive methodological framework for integrating treatments’ ranking probabilities across multiple outcomes. The framework relies on the area inside spie charts representing treatments’ performances on all outcomes, while also incorporating the outcomes’ relative importance. This approach not only provides an objective measure of the comparative ranking of treatments across multiple outcomes, but also allows graphical presentation of the results, thereby facilitating straightforward interpretation. All contributions in this thesis provide objective means to improve the use of comparative treatment rankings in NMA. Further extensive evaluations of these tools are required to assess their validity in empirical and simulated networks of different size and sparseness. / Thesis / Doctor of Philosophy (PhD) / Decisions on how to best treat a patient should be informed by all relevant evidence comparing the benefits and harms of available options. Network meta-analysis (NMA) is a statistical method for combining evidence on at least three treatments and produces a coherent set of results. Nevertheless, NMA results are typically presented separately for each health outcome (e.g., length of hospital stay, mortality) and the volume of results can be overwhelming to a knowledge user. Moreover, the results can be contradictory across multiple outcomes. 
Statistics that facilitate the ranking of treatments may aid in easing this interpretative burden while limiting subjectivity. This thesis aims to address methodological gaps and limitations in current ranking approaches by providing alternative methods for evaluating the robustness of treatment ranks, establishing comparative rankings, and integrating ranking probabilities across multiple outcomes. These contributions provide objective means to improve the use of comparative treatment rankings in NMA.
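As a sketch of the ranking summaries discussed in this entry, SUCRA can be computed directly from a matrix of ranking probabilities; the top-m partial variant shown alongside it is one plausible reading in the spirit of pSUCRA, not necessarily the thesis's exact definition, and the probabilities are invented.

```python
import numpy as np

def sucra(rank_probs):
    """SUCRA from a (treatments x ranks) matrix;
    rank_probs[t, r] = P(treatment t is ranked (r+1)-th), rows sum to 1."""
    k = rank_probs.shape[1]
    cum = np.cumsum(rank_probs, axis=1)[:, :-1]  # cumulative ranking curve, ranks 1..k-1
    return cum.sum(axis=1) / (k - 1)

def partial_sucra(rank_probs, m):
    """Area under the cumulative ranking curve restricted to the top m
    ranks, normalized to [0, 1] (an illustrative pSUCRA-style index)."""
    cum = np.cumsum(rank_probs, axis=1)
    return cum[:, :m].sum(axis=1) / m

probs = np.array([[0.6, 0.3, 0.1],   # treatment A
                  [0.3, 0.4, 0.3],   # treatment B
                  [0.1, 0.3, 0.6]])  # treatment C
```

Here `sucra(probs)` gives A > B > C; when cumulative ranking curves cross, the full-SUCRA and top-m orderings can disagree, which is the limitation the second paper targets.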
315

A Retrospective View of the Phillips Curve and Its Empirical Validity since the 1950s

Do, Hoang-Phuong 07 May 2021 (has links)
Since the 1960s, the Phillips curve has survived various significant changes (Kuhnian paradigm shifts) in macroeconomic theory and generated endless controversies. This dissertation revisits several important, representative papers throughout the curve's four historical, formative periods: Phillips' foundational paper in 1958, the wage determination literature in the 1960s, the expectations-augmented Phillips curve in the 1970s, and the latest New Keynesian iteration. The purpose is to provide a retrospective evaluation of the curve's empirical evidence. In each period, the preeminent role of theoretical considerations over statistical learning from the data is first explored. To further appraise the trustworthiness of the empirical evidence, a few key empirical models are then selected and evaluated for their statistical adequacy, which refers to the validity of the probabilistic assumptions comprising the statistical models. The evaluation results, using the historical (vintage) data in the first three periods and the modern data in the final one, show that nearly all of the models in the appraisal are misspecified: at least one probabilistic assumption is not valid. The statistically adequate models produced from respecification with the same data suggest new understandings of the main variables' behaviors. The dissertation's findings cast doubt on the traditional narrative of the Phillips curve, a narrative that these representative papers played a crucial role in establishing. / Doctor of Philosophy / The empirical regularity of the Phillips curve, which captures the inverse relationship between the inflation and unemployment rates, has been widely debated in academic economic research and among policymakers over the last 60 years. To shed light on the debate, this dissertation examines a selected list of influential, representative studies from the Phillips curve's empirical history through its four formative periods. 
The examination of these papers blends a discussion of the methodology of econometrics (the primary quantitative method in economics), the role of theory versus statistical learning from observed data, and evaluations of the validity of the probabilistic assumptions behind the empirical models. The main contention is that any departure from the probabilistic assumptions produces unreliable statistical inference, rendering the empirical analysis untrustworthy. The evaluation results show that nearly all of the models in the appraisal are untrustworthy: at least one assumption is not valid. An attempt is then made to produce improved empirical models and new understandings. Overall, the dissertation's findings cast doubt on the traditional narrative of the Phillips curve, a narrative that these representative papers played a crucial role in establishing.
316

Multidimensional Warnings: Determining an Appropriate Stimulus for a Curve-Warning Device

Neurauter, Michael L. 15 October 2004 (has links)
An average of 42,000 fatalities occur on the United States of America's roads each year as a result of motor-vehicle crashes (National Highway Traffic Safety Administration, 2003). The dangers of curves stem from late notification of direction and speed, varying methods for determining advisory speeds, and driver unfamiliarity and/or overconfidence. A curve-warning device, a device that notifies the driver of an upcoming curve and possibly conveys its vehicle-specific advisory speed and even direction, has the potential to drastically reduce the dangers of curve navigation. This study was performed as a proof of concept with regard to appropriate modalities and respective stimuli for a curve-warning application. For this study, objective and subjective measurements were collected in a simulator environment to compare conditions comprised of multiple stimuli from the auditory (icon, tone, and speech), visual (head-down display (HDD) and head-up display (HUD)), and haptic (throttle push-back) modalities. The results of the study show that the speech stimulus was the most appropriate of the auditory stimuli for both objective and subjective measurements. Objectively, the HDD and HUD were comparable with respect to performance, although the participants tended to favor the HDD in their subjective ratings. The throttle push-back did little to positively impact the performance measurements, and based on participant comments and ratings, it is not recommended for a curve-warning application. Of the stimulus conditions (combinations of two and three modalities), the Speech and HDD condition provided performance gains and subjective acceptability above the rest of the conditions. / Master of Science
317

The Influence of Parental and Parent-Adolescent Relationship Characteristics on Sexual Trajectories from Adolescence through Young Adulthood

Cheshire, Emily Jade 28 May 2011 (has links)
Using the perspective of sexual script theory (Gagnon & Simon, 1973) and growth curve modeling, this study examined whether characteristics of parents and parent-adolescent connectedness influence change in lifetime number of sexual partners from adolescence through young adulthood. Living in a blended family, having at least one college-educated parent and on-time parent-adolescent sexual communication positively predicted later lifetime number of sexual partners. Parent religiosity and parent-adolescent connectedness negatively predicted later lifetime number of sexual partners. Parent-adolescent sexual communication that focused on negative consequences of sex and parent disapproval of adolescent sexual activity were not significant in the overall model. Control variables included adolescent race/ethnicity, gender, physical maturity, marriage history, virginity pledge history, and expectations of positive consequences of sex. Physical maturity and gender were not significant in the overall model. In conclusion, parents have significant and far-reaching influence on their children's later sexual behavior. This study extended research in the field by examining lifetime number of sexual partners across four time points, which allowed observation of change in this outcome variable with age and accounted for the nested nature of the data. / Master of Science
318

Longitudinal Associations among Adolescent Socioeconomic Status, Delay Discounting, and Substance Use

Peviani, Kristin M. 01 February 2018 (has links)
Adolescence is a period of heightened risk for substance use and heightened vulnerability to substance exposure. Yet, little is known about how socioeconomic status (SES) influences adolescent decision making and behavior across development to add to these risks. This prospective longitudinal study used latent growth curve modeling (GCM) to examine the contributions of SES on adolescent delay discounting and substance use in a sample of 167 adolescents (52% male). Confirmatory factor analysis (CFA) was used to compute SES factor scores across three waves using a composite of parent and spouse education years and combined annual household income. Adolescent delay discounting and substance use were measured annually across three waves. The main goal of this study is to examine how SES may explain individual differences in growth trajectories of delay discounting and substance use. We used parallel process growth curve modeling with SES as a time-varying and time-invariant covariate to examine the associations between adolescent SES, delay discounting, and substance use onset as well as frequency. These results reveal that delay discounting exhibits a declining linear trend across adolescent development whereas cigarette, alcohol, marijuana, and polysubstance use exhibit increasing linear trends across adolescent development. Furthermore, low SES (as a time-invariant covariate) may lead to earlier onset adolescent alcohol and polysubstance use by way of heightened levels of delay discounting. These findings suggest that delay discounting interventions may be a promising avenue for reducing socioeconomic disparities in early onset alcohol and polysubstance use, while delay discounting development is still underway. / Master of Science / Adolescence is a period of heightened risk for substance use and heightened vulnerability to the effects of substances. 
Yet, little is known about how socioeconomic status (SES) influences adolescent decision making and behavior to add to these risks. This study used latent growth curve modeling (GCM) to examine the role of SES on adolescent decision making and substance use in a sample of 167 adolescents (52% male). Confirmatory factor analysis (CFA) was used to compute SES factor scores across three time points using an average of parent and spouse education years and income. Adolescent delay discounting and substance use were measured annually across three time points. The main goal of this study is to examine how SES may explain individual differences in delay discounting and substance use across adolescence. We used parallel process growth curve modeling with SES as a time-varying and time-invariant covariate to examine the links between adolescent SES, delay discounting, and substance use age of onset and frequency. These results reveal that delay discounting shows linear decreases in growth across adolescence whereas cigarette, alcohol, marijuana, and polysubstance use show increasing linear growth across adolescence. Additionally, low SES may lead to earlier onset adolescent alcohol and polysubstance use by way of heightened levels of delay discounting. These findings suggest that delay discounting interventions may help reduce socioeconomic differences in early onset alcohol and polysubstance use, while delay discounting development is still in progress.
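The growth-trajectory idea in this entry can be sketched in a simplified two-stage form: per-subject least-squares slopes, then their average as the mean linear trend. A real latent growth curve model would be fit with SEM or mixed-model software; all numbers below are synthetic and illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_subj, waves = 150, 3
time = np.arange(waves, dtype=float)   # three annual waves

# synthetic declining trajectories, e.g. delay discounting across adolescence
true_slopes = rng.normal(-0.3, 0.1, n_subj)
true_icpts = rng.normal(2.0, 0.5, n_subj)
y = (true_icpts[:, None] + true_slopes[:, None] * time
     + rng.normal(0.0, 0.2, (n_subj, waves)))

# stage 1: per-subject least-squares slope/intercept
# (np.polyfit accepts a 2-D y of shape (waves, n_subj))
slope_hat, icpt_hat = np.polyfit(time, y.T, 1)

# stage 2: the mean slope estimates the average linear trend
mean_slope = slope_hat.mean()
```

The spread of `slope_hat` plays the role of individual differences in growth, which is what covariates like SES are used to explain in the parallel-process model.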
319

Numerical Modeling of Room-and-Pillar Coal Mine Ground Response

Fahrman, Benjamin Paul 28 March 2016 (has links)
Underground coal mine ground control persists as a unique challenge in rock mass engineering. Falls of roof and rib continue to present a hazard to underground personnel. Stability of underground openings is a prerequisite for successful underground coal mine workings. An adaptation of a civil engineering design standard for analyzing the stability of underground excavations for mining geometries is given here. The ground response curve, developed over seventy years ago for assessing tunnel stability, has significant implications for the design of underground excavations but has seen little use in complex mining applications. The interaction between the small scale (pillar stress-strain) and the large scale (ground response curve) is studied. Further analysis between these two length scales is conducted to estimate the stress on pillars in a room-and-pillar coal mine. These studies are performed in FLAC3D by implementing a two-scale, two-step approach. This two-scale approach allows the interaction between the small, pillar scale and the large, panel scale to be studied in a computationally efficient manner. / Ph. D.
320

Development of Enhanced Pavement Deterioration Curves

Ercisli, Safak 17 September 2015 (has links)
Modeling pavement deterioration and predicting the pavement performance is crucial for optimum pavement network management. Currently only a few models exist that incorporate the structural capacity of the pavements into deterioration modeling. This thesis develops pavement deterioration models that take into account, along with the age of the pavement, the pavement structural condition expressed in terms of the Modified Structural Index (MSI). The research found MSI to be a significant input parameter that affects the rate of deterioration of a pavement section by using the Akaike Information Criterion (AIC). The AIC method suggests that a model that includes the MSI is at least 10^21 times more likely to be closer to the true model than a model that does not include the MSI. The developed models display the average deterioration of pavement sections for specific ages and MSI values. Virginia Department of Transportation (VDOT) annually collects pavement condition data on road sections with various lengths. Due to the nature of data collection practices, many biased measurements or influential outliers exist in this data. Upon the investigation of data quality and characteristics, the models were built based on filtered and cleansed data. Following the regression models, an empirical Bayesian approach was employed to reduce the variance between observed and predicted conditions and to deliver a more accurate prediction model. / Master of Science
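The "10^21 times more likely" statement above follows from the standard AIC relative-likelihood formula exp((AIC_min - AIC_i)/2); here is a minimal sketch of that calculation. The least-squares AIC form and the numbers are illustrative, not VDOT's data.

```python
import numpy as np

def aic_ls(n, rss, k):
    """Gaussian AIC for a least-squares fit: n*ln(RSS/n) + 2k."""
    return n * np.log(rss / n) + 2 * k

def relative_likelihood(aic_i, aic_min):
    """exp((aic_min - aic_i)/2): probability, relative to the best model,
    that model i minimizes the expected information loss."""
    return np.exp((aic_min - aic_i) / 2.0)

# a delta-AIC of 2*21*ln(10) (about 96.7) corresponds to one model being
# 10^21 times more likely than the other, the scale reported above
delta = 2.0 * 21.0 * np.log(10.0)
```

So including MSI only needs to improve the AIC by roughly 97 points for the reported factor of 10^21 to hold.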
