  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
111

Motivation som leder till höga betyg i matematik : En studie om vad skolan kan göra för att öka motivationen hos elever / Motivation that leads to high grades in mathematics : A study of what schools can do to increase students' motivation

Fredby, Åsa January 2016 (has links)
A quantitative study based on 248 questionnaire responses was carried out. The aim was to compare and analyse the motivation of students with different grades in mathematics, and of students who do or do not follow a mathematics specialisation, and to describe how schools can help students increase their motivation in mathematics. Particular attention was paid to students with special educational needs in mathematics (SUM students). The motivation theories Achievement goal theory, Self-determination theory and Attribution theory were studied to identify the characteristics that lead to increased motivation. The results indicated a relationship between students' grades and their motivation in mathematics: students with higher grades were more motivated than students with lower grades. The study also revealed what students consider important for learning mathematics, which largely coincides with what the motivation theories say schools should work on to increase motivation. In summary, the results suggest that schools should set clear goals, offer choices, and provide challenging but appropriately adapted tasks to help students increase their motivation. In addition, schools should help students notice their own progress, realise that mistakes contribute to learning, and see that effort leads to a sense of competence.
112

An Accelerometer-based Gesture Recognition System for a Tactical Communications Application

Tidwell, Robert S., Jr. 12 1900 (has links)
In modern society, computers are primarily interacted with via keyboards, touch screens, voice recognition, video analysis, and many other methods. For certain applications, these may be the most efficient interface. However, we can conceive of applications where a more natural interface would be convenient and would connect humans and computers in a more intuitive way. Such applications are gesture recognition systems, and they range from the interpretation of sign language by a computer to virtual reality control. This thesis proposes a gesture recognition system that primarily uses accelerometers to capture gestures for a tactical communications application. A segmentation algorithm based on accelerometer energy is developed to segment gestures from an input sequence. Using signal processing and machine learning techniques, the segments are reduced to mathematical features and classified with support vector machines. Experimental results show that the system achieves an overall gesture recognition accuracy of 98.9%. Additional methods, such as non-gesture recognition/suppression, are also proposed and tested.
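The pipeline described above (energy-based segmentation, feature extraction, SVM classification) can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the window length, energy threshold, feature set, and synthetic training data are all assumptions.

```python
import numpy as np
from sklearn.svm import SVC

def segment_by_energy(accel, win=25, threshold=1.5):
    """Return (start, end) index pairs where windowed accelerometer
    energy exceeds a threshold (assumed segmentation rule)."""
    energy = np.convolve(np.sum(accel**2, axis=1), np.ones(win) / win, mode="same")
    active = energy > threshold
    edges = np.flatnonzero(np.diff(active.astype(int)))
    # handle segments that touch the start or end of the sequence
    if active[0]:
        edges = np.r_[0, edges]
    if active[-1]:
        edges = np.r_[edges, len(active) - 1]
    # pair rising/falling edges into segments
    return list(zip(edges[::2], edges[1::2]))

def extract_features(segment):
    """Reduce a (n_samples, 3) accelerometer segment to simple
    per-axis statistical features (mean, std, mean energy)."""
    return np.r_[segment.mean(axis=0), segment.std(axis=0), (segment**2).mean(axis=0)]

# Toy training data: two synthetic "gesture" classes with different offsets.
rng = np.random.default_rng(0)
X = np.vstack([extract_features(rng.normal(loc=c, size=(50, 3)))
               for c in (0.0, 2.0) for _ in range(20)])
y = np.array([0] * 20 + [1] * 20)
clf = SVC(kernel="rbf").fit(X, y)
```

In the actual system the features would come from real accelerometer segments and the classes from the tactical gesture vocabulary; the structure of the pipeline is the same.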
113

WEIGHTED QUANTILE SUM REGRESSION FOR ANALYZING CORRELATED PREDICTORS ACTING THROUGH A MEDIATION PATHWAY ON A BIOLOGICAL OUTCOME

Evani, Bhanu M 01 January 2017 (has links)
A thesis submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy at Virginia Commonwealth University, 2017. Major Director: Robert A. Perera, Asst. Professor, Department of Biostatistics. This work examines mediated effects of a set of correlated predictors using the recently developed Weighted Quantile Sum (WQS) regression method. Traditionally, mediation analysis has been conducted with the multiple regression approach first proposed by Baron and Kenny (1986) and since advanced by several authors, notably MacKinnon (2008). Mediation analysis of a highly correlated predictor set is challenging because of multicollinearity. WQS regression can be used as an alternative method to analyze mediated effects when predictor correlations are high. As part of the WQS method, a weighted quantile sum index (WQS index) is computed to represent the predictor set as a single entity. The predictor variables in classic mediation are then replaced with the WQS index, allowing estimation of the total indirect effect between all the predictors and the outcome. Predictors with high relative importance in their association with the outcome can be identified by examining the empirical weights for the individual predictors estimated by WQS regression. Other constrained optimization methods (e.g. LASSO) focus instead on reducing the dimensionality of the correlated predictors to reduce multicollinearity. WQS regression in the context of mediation is studied using Monte Carlo simulation for mediation models with two and three correlated predictors, and its performance is compared to classic OLS multiple regression and regularized LASSO regression.
An application of these three methods to the National Health and Nutrition Examination Survey (NHANES) dataset examines the effect of serum concentrations of polychlorinated biphenyls (independent variables) on the liver enzyme alanine aminotransferase, ALT (outcome), with chromosomal telomere length as a potential mediator. Keywords: multicollinearity, weighted quantile sum regression, mediation analysis
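The WQS index described above can be sketched as follows: each predictor is scored into quantiles, and a weighted sum is formed with non-negative weights that sum to one. The weights here are fixed placeholders; in the actual method they are estimated empirically (typically across bootstrap samples), and the quantile count is an assumption.

```python
import numpy as np

def quantile_score(x, n_quantiles=4):
    """Map a continuous predictor to quantile scores 0..n_quantiles-1."""
    cuts = np.quantile(x, np.linspace(0, 1, n_quantiles + 1)[1:-1])
    return np.searchsorted(cuts, x, side="right")

def wqs_index(X, weights):
    """Weighted quantile sum index: non-negative weights summing to 1
    (fixed here for illustration; WQS regression estimates them)."""
    weights = np.asarray(weights, dtype=float)
    assert np.all(weights >= 0) and np.isclose(weights.sum(), 1.0)
    scored = np.column_stack([quantile_score(col) for col in X.T])
    return scored @ weights
```

In the mediation setting, this single index then replaces the correlated predictor set in the mediator and outcome models.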
114

Linear programming to determine molecular orientation at surfaces through vibrational spectroscopy

Chen, Fei 03 May 2017 (has links)
Applying linear programming (LP) to spectroscopy techniques such as IR, Raman and SFG is a new approach to extracting molecular orientation information at surfaces. Hung's previous research showed that applying LP reduces the computational cost from O(n!) to O(n). However, the LP approach does not always return the known molecular orientation distribution when mock spectral information is used to build the model instance. The first goal of our study is to identify the cause of the failed LP instances. We also want to determine, for different cases, which spectral information allows the correct molecular orientation to be recovered with LP. To achieve these goals, a simplified molecular model is designed to study the nature of our LP model. With the insight gained, we apply the LP approach to various test cases to verify whether it can be applied systematically in different circumstances. We reach the following conclusions. With the help of the simplified molecular model, we find that the LP solver fails to return the target composition when a sufficient data set cannot be extracted from the given spectral information to build the LP instance. When the candidates come from one and the same molecule, even combining the spectral information of IR, Raman and SFG does not yield a sufficient data set to obtain the target composition in most cases. When the candidates come from different molecules, Raman or SFG spectral information alone contains a sufficient data set to obtain the target composition when the candidates of each molecule span [0°, 90°) in θ. When the candidates of each molecule span [0°, 180°] in θ, excluding 90°, SFG spectral information must be combined with IR or Raman to obtain a sufficient data set.
When a slack variable is introduced for each spectral technique, again with candidates from different molecules, Raman spectral information alone carries a sufficient data set when the candidates span [0°, 90°) in θ; when they span [0°, 180°] in θ, excluding 90°, SFG and Raman spectral information together carry a sufficient data set to obtain the target composition.
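The flavor of such an LP instance can be illustrated with a toy feasibility problem: recover a non-negative composition over candidate orientations from mock spectral intensities. The response matrix and composition below are fabricated for illustration and are not taken from Hung's model or the thesis.

```python
import numpy as np
from scipy.optimize import linprog

# Mock "spectral response" matrix: rows are measured channels (e.g. bands),
# columns are candidate orientations; values fabricated for illustration.
A = np.array([[1.0, 0.2, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.4, 1.0]])
x_true = np.array([0.5, 0.3, 0.2])   # target composition (sums to 1)
b = A @ x_true                        # mock observed spectrum

# Feasibility LP: find x >= 0 with A x = b and sum(x) = 1 (zero objective).
A_eq = np.vstack([A, np.ones(3)])
b_eq = np.r_[b, 1.0]
res = linprog(c=np.zeros(3), A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 3)
```

With enough independent spectral channels the instance has a unique feasible point, the target composition; the failures analyzed in the thesis correspond to data sets that leave the system underdetermined.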
115

Partial sum process of orthogonal series as rough process

Yang, Danyu January 2012 (has links)
In this thesis, we investigate the pathwise regularity of the partial sum process of a general orthogonal series, and prove that the partial sum process is a geometric 2-rough process under the same condition as in the Menshov–Rademacher theorem. For Fourier series, the condition can be improved, and an equivalent condition on the limit function is identified.
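For reference, the Menshov–Rademacher condition mentioned above: for an orthonormal system $(\varphi_n)$ in $L^2$ and coefficients $(c_n)$,

```latex
\sum_{n=2}^{\infty} c_n^2 \,(\log n)^2 < \infty
\quad\Longrightarrow\quad
\sum_{n=1}^{\infty} c_n \varphi_n(x) \ \text{converges almost everywhere.}
```

The thesis's contribution is that this same coefficient condition already guarantees the stronger pathwise statement, namely that the partial sum process is a geometric 2-rough process.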
116

A Cross-Sectional Analysis of Health Impacts of Inorganic Arsenic in Chemical Mixtures

Hargarten, Paul 01 January 2015 (has links)
Drinking groundwater is the primary way humans accumulate arsenic. Chronic exposure to inorganic arsenic (iAs) over decades has been shown to be associated with multiple health effects even at low levels (5-10 ppb), including cancer, elevated blood pressure and cardiovascular disease, skin lesions, renal failure, and peripheral neuropathy. Using hypertension (high blood pressure) as a surrogate marker for cardiovascular disease, we examined the effect of iAs alone and in a mixture with other metals using a cross-sectional study of adults in the United States (National Health and Nutrition Examination Survey, NHANES, 2005-2010), adjusting for covariates: urinary creatinine level (mg/dL), poverty index ratio (PIR, a measure of socioeconomic status, 1 to 5), age, smoking (yes/no), alcohol usage, gender, non-Hispanic Black race, and overweight (BMI >= 25). A logistic regression model suggests that a one-unit increase in the log of inorganic arsenic increases the odds of hypertension by a factor of 1.093 (95% Confidence Interval = 0.935, 1.277) adjusted for these covariates, which indicates there was not significant evidence to claim that inorganic arsenic is a risk factor for hypertension. Biomonitoring data provide evidence that humans are exposed not only to inorganic arsenic but also to mixtures of chemicals including inorganic arsenic, total mercury, cadmium, and lead. We tested for a mixture effect of these four environmental chemicals using weighted quantile sum (WQS) regression, which takes into account the correlation among the chemicals and with the outcome. For a one-unit increase in the weighted sum, the adjusted odds of developing hypertension increase by a factor of 1.027 (95% CI = 0.882, 1.196), which is also not significant after taking into account the same covariates. The nonsignificant finding may be due to the low inorganic arsenic concentration (8-620 μg/L) in US drinking water, compared to countries like Bangladesh where concentrations are much higher.
The literature provides conflicting evidence on the association of inorganic arsenic and hypertension in low/moderate exposure regions; future studies, especially a large cohort study, are needed to confirm whether inorganic arsenic alone or with other metals is associated with hypertension in the United States.
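The odds ratios and confidence intervals quoted above come from exponentiating a logistic regression coefficient and its Wald interval. A minimal sketch, where the coefficient and standard error are back-solved from the reported OR = 1.093 (95% CI 0.935, 1.277) purely for illustration, not taken from the thesis:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and Wald 95% confidence interval from a logistic
    regression coefficient and its standard error."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative values back-solved from the reported estimate.
or_, lo, hi = odds_ratio_ci(beta=0.0889, se=0.0795)
```

An interval that straddles 1.0, as here, is what makes the finding nonsignificant at the 5% level.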
117

Vznik práva na výplatu pojistného plnění / The creation of a right to the payment of insurance claim

Brejchová, Ivana January 2013 (has links)
The aim of this thesis is to clarify the issue of the creation of a right to the payment of an insurance claim in the field of personal insurance. The thesis is divided into six chapters by topic. To introduce the subject, the historical roots of insurance worldwide, as well as in Bohemia and Moravia, are described. The second chapter deals with the current legal regulation of personal insurance, focusing on the dual legal regime of the insurance contract. The crucial parts of this thesis are chapters three, four and five, which thoroughly treat the conditions, presumptions and factors that affect the creation of a right to the payment of an insurance claim. The third chapter analyzes the basic conditions for the creation of this right, such as the existence of an insurance contract, the occurrence of an insured event, and the absence of exclusions. The fourth chapter discusses factors that influence the creation of the right itself and are common to all kinds of insurance, such as the inception of insurance, the inception of insurance protection, the insurance payment, exclusions, the assertion of the right to an insurance claim by the authorised person, reasons for rejection of...
118

An analysis of the impact of data errors on backorder rates in the F404 engine system

Burson, Patrick A. R. 03 1900 (has links)
Approved for public release; distribution is unlimited. / In the management of the U.S. Naval inventory, data quality is of critical importance. Errors in major inventory databases contribute to increased operational costs, reduced revenue, and loss of confidence in the reliability of the supply system. Maintaining error-free databases is not a realistic objective. Data-quality efforts must be prioritized to ensure that limited resources are allocated to achieve the maximum benefit. This thesis proposes a methodology to assist the Naval Inventory Control Point in the prioritization of its data-quality efforts. By linking data errors to Naval inventory performance metrics, statistical testing is used to identify errors that have the greatest adverse impact on inventory operations. By focusing remediation efforts on errors identified in this manner, the Navy can best use its limited resources devoted to improvement of data quality. Two inventory performance metrics are considered: Supply Material Availability (SMA), an established metric in Naval inventory management; and the Backorder Persistence Metric (BPM), which is developed in the thesis. Backorder persistence measures the duration of time that the ratio of backorders to quarterly demand exceeds a threshold value. Both metrics can be used together to target remediation on reducing shortage costs and improving inventory system performance. / Lieutenant Commander, Supply Corps, United States Navy
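A backorder-persistence calculation of the kind described might be sketched as below. This is one plausible reading of the metric (fraction of quarters in which the backorder-to-demand ratio exceeds a threshold); the thesis's exact definition and threshold may differ.

```python
def backorder_persistence(backorders, demand, threshold=0.10):
    """Fraction of quarters in which the backorder-to-demand ratio
    exceeds a threshold (illustrative reading of the BPM; the
    threshold value is an assumption)."""
    ratios = [b / d for b, d in zip(backorders, demand)]
    exceed = sum(r > threshold for r in ratios)
    return exceed / len(ratios)
```

An item whose persistence stays high across quarters would then be a priority candidate for data-error remediation.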
119

Predictability of International Stock Returns with Sum of the Parts and Equity Premiums under Regime Shifts

Athari, Mahtab 18 December 2015 (has links)
This research consists of two essays. The first essay, entitled "Stock Return Forecasting with Sum-of-the-Parts Methodology: Evidence from Around the World", examines the forecastability of stock returns by employing the sum-of-the-parts (SOP) modeling technique introduced by Ferreira and Santa-Clara (2011). This approach decomposes return into three components: growth in the price-earnings ratio, earnings growth, and the dividend-price ratio. Each component is forecasted separately, and the fitted values are used in the forecast model to predict the stock return. We conduct a series of one-step-ahead recursive forecasts for a wide range of developed and emerging markets over the period February 1995 through November 2014. Decomposed return components are forecasted separately using a list of financial variables, and the fitted values from the best estimators are used according to out-of-sample performance. Our findings show that the SOP method with financial variables outperforms the historical sample mean for the majority of countries. The second essay, entitled "Equity Premium Predictability under Regime Shifts: International Evidence", utilizes a modified version of the dividend-price ratio that alleviates some econometric concerns in the literature regarding non-stationary and persistent predictors when forecasting the international equity premium across different regimes. We employ a Markov switching technique to address the issue of non-linearity between the equity premium and the predictor. The results show different patterns of equity premium predictability over the regimes across countries when the modified ratio is used as the predictor. In addition, transition probability analysis shows the adverse effect of the financial crisis on regime transition probabilities: the probability of switching between regimes increased after the 2007 crisis, implying higher risk perceived by investors as a result of the uncertainty inherent in regime transitions.
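The SOP decomposition can be made concrete with a small sketch: in the log-linear approximation of Ferreira and Santa-Clara (2011), the expected return is roughly the sum of the growth in the price-earnings multiple, earnings growth, and the dividend-price ratio. The component values below are illustrative numbers only.

```python
import math

def sop_expected_return(pe_growth, earnings_growth, dp):
    """Sum-of-the-parts expected log return: growth in the
    price-earnings multiple plus earnings growth plus the
    dividend-price ratio (log-linear approximation)."""
    return math.log1p(pe_growth) + math.log1p(earnings_growth) + dp

# Example: flat multiple, 6% earnings growth, 3% dividend yield.
r = sop_expected_return(pe_growth=0.00, earnings_growth=0.06, dp=0.03)
```

In the essay, each of the three components is forecasted separately from financial variables before being recombined this way.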
120

Vérification de la validité du concept de surface somme par une approche statistique du contact élastique entre deux surfaces rugueuses / Validity study of the sum surface concept using a statistical approach of elastic contact between two rough surfaces

Tran, Ich tach 26 January 2015 (has links)
Surface properties, particularly surface microgeometry, play a key role in all tribological systems. Analysis of the distribution of contact forces at the interface between rough surfaces is essential for predicting friction, wear, adhesion, and electrical and thermal contact resistance. Many models have been proposed over recent decades to predict the forces between the asperities of rough surfaces. Among these models, statistical models are mainly based on the contact between a smooth plane and an equivalent rough surface, the sum surface, which combines the microgeometry of the two contacting surfaces and their materials. However, the validity of this model has not been clearly demonstrated.
The aim of our study was to develop a statistical model of the contact between two random isotropic rough surfaces and then to compare the results with those obtained by considering the classical sum surface. The differences between the results have led us to propose a new definition of the sum surface.
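The classical sum-surface construction alluded to above combines the two surfaces' roughnesses in quadrature and the two materials into a reduced elastic modulus. A minimal sketch of these two standard relations (the thesis's proposed redefinition is not reproduced here):

```python
import math

def sum_surface_roughness(sigma1, sigma2):
    """Classical sum-surface height standard deviation: the roughnesses
    of the two contacting surfaces combine in quadrature, assuming
    independent height distributions."""
    return math.hypot(sigma1, sigma2)

def effective_modulus(E1, nu1, E2, nu2):
    """Reduced elastic modulus E* of the equivalent contact:
    1/E* = (1 - nu1^2)/E1 + (1 - nu2^2)/E2."""
    return 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)
```

The contact between the two rough surfaces is then modeled as this single equivalent rough surface pressed against a smooth rigid plane; it is exactly this reduction whose validity the thesis examines.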
