71

The Effect of Psychometric Parallelism among Predictors on the Efficiency of Equal Weights and Least Squares Weights in Multiple Regression

Zhang, Desheng 05 1900 (has links)
There are several conditions for applying equal weights as an alternative to least squares weights. Psychometric parallelism, one of these conditions, has been suggested as a necessary and sufficient condition for equal-weights aggregation. The purpose of this study is to investigate the effect of psychometric parallelism among predictors on the efficiency of equal weights and least squares weights. Target correlation matrices with 10,000 cases were simulated so that the matrices had varying degrees of psychometric parallelism. Five hundred samples were drawn from each population at each of six observation-to-predictor ratios: 5/1, 10/1, 20/1, 30/1, 40/1, and 50/1. Efficiency is interpreted as the accuracy and the predictive power of the weighting methods. Accuracy is defined by the deviation between the population R² and the sample R². Predictive power is measured by the population cross-validated R² and the population mean square error of prediction. The findings indicate there is no statistically significant relationship between the level of psychometric parallelism and the accuracy of least squares weights. In contrast, the correlation between the level of psychometric parallelism and the accuracy of equal weights is significantly negative. The minimum p value of the χ² test for psychometric parallelism among predictors that is required for equal weights to outperform least squares weights differs across conditions: the larger the number of predictors, the higher the minimum p value; the higher the observation-to-predictor ratio, the higher the minimum p value; and the higher the intercorrelations among predictors, the lower the minimum p value. This study demonstrates that the most frequently used significance levels, 0.05 and 0.01, are not the only p values appropriate for testing the null hypothesis of psychometric parallelism among predictors when replacing least squares weights with equal weights.
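
As a hedged illustration of the comparison this abstract describes (not the thesis's actual simulation design; the population size, predictor count, intercorrelation, and error variance below are assumed), the following Python sketch draws one small sample, estimates least squares weights and rescaled equal weights, and checks how each generalizes to the simulated population:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_population(n, rho, k=4):
    """Draw predictors with a common intercorrelation rho and a criterion built
    from (roughly) parallel predictors, standing in for the simulated target
    correlation matrices described in the abstract."""
    cov = np.full((k, k), rho)
    np.fill_diagonal(cov, 1.0)
    X = rng.multivariate_normal(np.zeros(k), cov, size=n)
    beta = np.ones(k)                      # psychometrically parallel case
    y = X @ beta + rng.normal(scale=2.0, size=n)
    return X, y

def r2(y, yhat):
    return 1 - np.sum((y - yhat) ** 2) / np.sum((y - np.mean(y)) ** 2)

# "Population" plus a small calibration sample (observation/predictor ratio 10/1)
Xpop, ypop = simulate_population(10_000, rho=0.5)
idx = rng.choice(len(ypop), size=40, replace=False)
Xs, ys = Xpop[idx], ypop[idx]

# Least squares weights estimated in the sample
b_ols, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(ys)), Xs]), ys, rcond=None)

# Equal (unit) weights: regress y on the unit-weighted sum score to get a scale
sum_score = Xs.sum(axis=1)
slope, intercept = np.polyfit(sum_score, ys, 1)

# Cross-validation in the population: how well do the sample weights generalize?
print("OLS weights,   population R²:",
      round(r2(ypop, np.column_stack([np.ones(len(ypop)), Xpop]) @ b_ols), 3))
print("equal weights, population R²:",
      round(r2(ypop, intercept + slope * Xpop.sum(axis=1)), 3))
```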
72

Incorporating survey weights into logistic regression models

Wang, Jie 24 April 2013 (has links)
Incorporating survey weights into likelihood-based analysis is a controversial issue because sampling weights are not simply the reciprocals of selection probabilities; they are also adjusted for characteristics such as age and race, and some adjustments account for nonresponse. These adjustments are accomplished through a combination of probability calculations. When we build a logistic regression model to predict categorical outcomes from survey data, the sampling weights should be taken into account whenever the sampling design does not give each individual an equal chance of being selected. We rescale the weights to sum to an equivalent sample size because variance estimates are too small when the original weights are used; these rescaled weights are called the adjusted weights. The existing method applies quasi-likelihood maximization to estimate the model with the adjusted weights. We develop a new method based on the correct likelihood for logistic regression that incorporates the adjusted weights; in the new method, the adjusted weights are further used to adjust both the covariates and the intercepts. We explore the differences and similarities between the quasi-likelihood and correct-likelihood methods, estimate parameters with both binary and multinomial logistic regression models, and apply the methods to body mass index data from the Third National Health and Nutrition Examination Survey. The results show some similarities and differences between the old and new methods in parameter estimates, standard errors, and statistical p-values.
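
The thesis's correct-likelihood formulation is not reproduced here, but as a hedged sketch of the quasi-likelihood side of the comparison (all data and weight values below are made up, not NHANES III), the following Python code rescales raw expansion weights to sum to the sample size and fits a weighted logistic regression by Newton-Raphson:

```python
import numpy as np

def rescale_weights(w):
    """Rescale survey weights to sum to the sample size (the 'adjusted weights'
    in the abstract), so variance estimates are not driven by weights that sum
    to the population size."""
    return w * len(w) / w.sum()

def weighted_logit(X, y, w, n_iter=25):
    """Quasi-likelihood weighted logistic regression fitted by Newton-Raphson;
    maximizes sum_i w_i * [y_i log p_i + (1 - y_i) log(1 - p_i)]."""
    Xd = np.column_stack([np.ones(len(y)), X])          # add intercept
    beta = np.zeros(Xd.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xd @ beta))
        grad = Xd.T @ (w * (y - p))                      # weighted score
        hess = (Xd * (w * p * (1 - p))[:, None]).T @ Xd  # weighted information
        beta += np.linalg.solve(hess, grad)
    return beta

# Toy data standing in for the survey sample (made-up, not NHANES III)
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + X @ np.array([1.0, -0.7])))))
raw_w = rng.uniform(50, 500, size=500)                   # raw expansion weights

beta_hat = weighted_logit(X, y, rescale_weights(raw_w))
print("weighted estimates (intercept, b1, b2):", np.round(beta_hat, 3))
```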
73

Forecast Combination with Multiple Models and Expert Correlations

Soule, David P 01 January 2019 (has links)
Combining multiple forecasts in order to generate a single, more accurate one is a well-known approach. A simple average of forecasts has been found to be robust despite theoretically better approaches, the increasing availability of expert forecasts, and improved computational capabilities. The dominance of the simple average is related to small sample sizes and to the estimation errors associated with more complex methods. We study the role that expert correlation, the number of experts, and their relative forecasting accuracy have on the weight estimation error distribution. The distributions we find are used to identify the conditions under which a decision maker can confidently estimate weights rather than fall back on a simple average. We also propose an improved expert weighting approach that is less sensitive to covariance estimation error while providing much of the benefit of covariance-optimal weights. Together, these two improvements create a new, simple-to-use heuristic for forecast aggregation. This heuristic appears new to the literature and is shown to perform better than a simple average in a simulation study and in an application to economic forecast data.
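
As a hedged sketch of the baseline being compared (the three-expert error covariance below is assumed, and this is the textbook variance-minimizing combination, not the thesis's new heuristic), the following Python code contrasts covariance-optimal weights with a simple average when the error covariance is known exactly:

```python
import numpy as np

def optimal_weights(sigma):
    """Variance-minimizing combination weights for unbiased forecasts with
    error covariance matrix sigma: w proportional to sigma^{-1} 1, normalized
    to sum to one."""
    ones = np.ones(sigma.shape[0])
    w = np.linalg.solve(sigma, ones)
    return w / w.sum()

def combined_var(w, sigma):
    """Variance of the combined forecast error under weights w."""
    return float(w @ sigma @ w)

# Assumed error covariance for three experts (made-up numbers):
# error variances 1.0, 1.5, 2.0 with pairwise correlation 0.6
sd = np.sqrt([1.0, 1.5, 2.0])
corr = np.full((3, 3), 0.6)
np.fill_diagonal(corr, 1.0)
sigma = np.outer(sd, sd) * corr

w_opt = optimal_weights(sigma)
w_avg = np.full(3, 1 / 3)

print("optimal weights:", np.round(w_opt, 3),
      " variance:", round(combined_var(w_opt, sigma), 3))
print("simple average variance:", round(combined_var(w_avg, sigma), 3))
```

In practice sigma must be estimated from few observations, and the estimation error in those weights is exactly what lets the simple average stay competitive, which is the situation the thesis studies.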
74

The use of credit scorecard design, predictive modelling and text mining to detect fraud in the insurance industry / Terisa Roberts

Roberts, Terisa January 2011 (has links)
The use of analytical techniques for fraud detection and the design of fraud detection systems have been the topics of several research projects in the past and have seen varying degrees of success in practical implementation. In particular, several authors regard the use of credit risk scorecards for fraud detection as a useful analytical detection tool. However, research on analytical fraud detection for the South African insurance industry is limited. Furthermore, real-world restrictions such as the availability and quality of data elements, highly unbalanced datasets, interpretability challenges with complex analytical techniques and the evolving nature of insurance fraud contribute to the ongoing challenge of detecting fraud successfully. Insurance organisations face financial instability from a global recession, tighter regulatory requirements and consolidation of the industry, all of which underscore the need for a practical and effective fraud strategy. Given the volumes of structured and unstructured data available in the data warehouses of insurance organisations, it would be sensible for an effective fraud strategy to take data-driven methods into account and incorporate analytical techniques into an overall fraud risk assessment system. Having said that, the complexity of the analytical techniques, coupled with the effort required to prepare the data to support them, should be carefully considered, as some studies found that less complex algorithms produce equal or better results. Furthermore, an over-reliance on analytical models can underestimate the underlying risk, as observed with credit risk at financial institutions during the financial crisis. An attractive property of the probabilistic weight-of-evidence (WOE) formulation for risk scorecard construction is its ability to handle data issues such as missing values, outliers and rare cases. It is also transparent and flexible, allowing bins to be re-adjusted based on expert knowledge or other business considerations. The approach proposed in the study is to construct fraud risk scorecards at the entity level that incorporate sets of intrinsic and relational risk factors to support a robust fraud risk assessment. The study investigates the application of an integrated Suspicious Activity Assessment System (SAAS) empirically using real-world South African insurance data: the first case study uses a sample of short-term insurance claims data and the second a sample of life insurance claims data, and both show promising results. The contributions of the study are summarised as follows. The study identified several challenges with the use of an analytical approach to fraud detection within the context of the South African insurance industry. It proposes the development of fraud risk scorecards based on WOE measures for diagnostic fraud detection within the South African insurance industry, and the consideration of alternative algorithms to determine split points. To improve the discriminatory performance of the fraud risk scorecards, the study evaluated the use of analytical techniques, such as text mining, to identify risk factors. In order to identify risk factors from large sets of data, the study suggests the careful consideration of both the types of information and the types of statistical techniques in a fraud detection system.
The types of information refer to the categories of input data available for analysis, translated into risk factors, and the types of statistical techniques refer to the constraints and assumptions of the underlying statistical techniques. In addition, the study advocates an entity-focused approach to fraud detection, given that fraudulent activity typically occurs at the level of an entity or a group of entities. / PhD, Operational Research, North-West University, Vaal Triangle Campus, 2011
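
As a hedged sketch of the WOE measure mentioned above (the claims data, bin labels and smoothing constant are illustrative, not from the study), the following Python code computes weight-of-evidence values for a binned risk factor against a fraud indicator:

```python
import numpy as np
import pandas as pd

def weight_of_evidence(df, feature_bin, target):
    """Weight of evidence per bin: WOE = ln(% of non-events / % of events).
    In a fraud scorecard the 'event' would be a confirmed fraudulent claim."""
    grouped = df.groupby(feature_bin)[target].agg(events="sum", total="count")
    grouped["non_events"] = grouped["total"] - grouped["events"]
    eps = 0.5  # small smoothing constant so empty bins do not give infinite WOE
    dist_event = (grouped["events"] + eps) / (grouped["events"].sum() + eps)
    dist_non = (grouped["non_events"] + eps) / (grouped["non_events"].sum() + eps)
    grouped["woe"] = np.log(dist_non / dist_event)
    return grouped

# Toy claims data with one binned risk factor (illustrative only)
claims = pd.DataFrame({
    "claim_delay_bin": ["0-7d"] * 400 + ["8-30d"] * 350 + [">30d"] * 250,
    "fraud": [0] * 392 + [1] * 8 + [0] * 336 + [1] * 14 + [0] * 225 + [1] * 25,
})
print(weight_of_evidence(claims, "claim_delay_bin", "fraud"))
```

Bins with strongly negative WOE concentrate fraud cases, so their scorecard points flag an entity for further investigation; the bin boundaries themselves can then be re-adjusted by expert judgement, as the abstract notes.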
75

Neuro-Mechanical Analysis of Eccentric Overload of Elbow Flexors

2013 January 1900 (has links)
Eccentric overload in training settings utilizes loads higher than the concentric one repetition maximum (1RM). There is no clear definition of eccentric "failure" or 1RM using conventional weights, so the eccentric 1RM is estimated to lie between 145% and 190% of concentric 1RM. Historically, the highest intensity used for eccentric overload is typically 120% of concentric 1RM, despite little research using conventional weights at higher eccentric intensities. The purpose of this study was to conduct an exploratory neuro-mechanical analysis of different intensities of elbow flexor eccentric overload using free weights by examining angular kinematics during contraction. Twenty male participants with weight training experience had unilateral concentration curl isometric peak torque assessed on a Humac Norm Dynamometer and concentric 1RM assessed with dumbbells while biceps brachii electromyography (EMG) and elbow joint angle were recorded. Angles were recorded using a custom-made electrogoniometer and elbow joint torque was estimated using inverse dynamics. Participants were randomly assigned in counterbalanced order to perform eccentric actions at 120%, 140%, 150%, 160% and 170% of concentric 1RM with 4 minutes rest between conditions. Variables included peak torque, angular velocity at peak torque, impulse, power, mean EMG, and EMG normalized to peak. Data were analyzed using repeated measures ANOVA or a Friedman test. Angular velocity at peak torque was significantly lower for 120% (65.3 ± 40.8°/s) compared to all other conditions (range: 65.3 ± 40.8 to 162.1 ± 75.2°/s; p < 0.01). Peak torque for all conditions (range: 98.2 ± 16.2 to 108.2 ± 21.6 Nm) was significantly higher than isometric peak torque (77.4 ± 16.8 Nm; p < 0.05). Peak torque at 160% (108.2 ± 21.6 Nm) was significantly higher than at 120% (98.2 ± 16.2 Nm; p < 0.05). Power for 140-170% (range: 166.2 ± 85.7 W to 265.8 ± 111.3 W) was significantly higher than power at 120% (79.9 ± 66.8 W; p < 0.05). Impulse was highest at 120% (56.1 ± 54.6 Nms) compared to all other conditions (range: 56.2 ± 54.6 to 9.6 ± 3.8 Nms; p ≤ 0.05). Impulse at 140% (20.6 ± 11.8 Nms) was significantly higher than at 170% (9.6 ± 3.8 Nms; p < 0.05). Isometric mean EMG (0.792 ± 0.285 mV) was significantly higher than in all eccentric conditions (range: 0.654 ± 0.313 to 0.533 ± 0.259 mV; p < 0.05), with no difference between eccentric conditions for mean EMG or EMG normalized to peak. It was concluded that, compared to 120%, eccentric overload with intensity ranging from 140-170% of concentric 1RM involves minimal increases in peak torque and no change in EMG activation. Intensities above 120% enhance power and decrease impulse. This research has implications for the future training prescription of eccentric exercise.
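
As a hedged sketch of the inverse-dynamics step mentioned above (a simplified planar single-joint model; the segment mass, moment arms, inertia and kinematics below are invented, not the study's participant data), an elbow torque estimate could look like:

```python
import numpy as np

def elbow_torque(theta_deg, alpha_rad_s2, m_forearm, m_load,
                 r_forearm, r_load, I_total):
    """Planar single-joint inverse dynamics for the elbow: net joint torque =
    inertial term + gravitational moments of the forearm+hand segment and the
    dumbbell, with theta measured from the horizontal. A simplification of
    whatever full model the study actually used."""
    g = 9.81
    theta = np.radians(theta_deg)
    gravity_moment = (m_forearm * r_forearm + m_load * r_load) * g * np.cos(theta)
    return I_total * alpha_rad_s2 + gravity_moment

# Illustrative numbers only (not the study's loads or anthropometrics)
print(round(elbow_torque(theta_deg=45, alpha_rad_s2=2.0,
                         m_forearm=1.8, m_load=20.0,
                         r_forearm=0.15, r_load=0.33,
                         I_total=0.35), 1), "Nm")
```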
76

A Survey of NCAA Division 1 Strength and Conditioning Coaches - Characteristics and Opinions

Powers, Jeremy 14 July 2008 (has links)
The role of the strength and conditioning coach (SCC) has increased dramatically in collegiate athletics over the past 30 years. Because of NCAA rules, the SCC now spends more time with the athletes than even the individual sport coaches do. Despite the importance of the SCC, little is known about what makes a good SCC or what a typical SCC is like today, and limited research has been conducted to determine the characteristics and opinions of this specific population. The main role of an SCC is to enhance the athletic performance of the athletes at a university, which is achieved by developing strength, power, speed, agility, conditioning and flexibility, among other qualities. In addition, a good SCC will also help "toughen up" a team mentally, advise athletes on nutrition, and serve a variety of roles during team practices. The purpose of this study was to survey NCAA Division I (bowl subdivision) SCCs to assess what characteristics they possess as well as what characteristics they deem important for other SCCs to possess. The questions asked ranged from education level to current activity level. The results of the study supported the hypotheses: SCCs come from a variety of backgrounds in regard to their education, certifications, past experiences, physical activity level, and physical size, and the coaches tended to favor other coaches similar to themselves. With the findings from this study, prospective SCCs will have a better understanding of the hiring practices of prospective employers, and current SCCs will gain a better knowledge of their peers and the field in general. Future research is needed regarding race and gender, two topics only briefly discussed in the current investigation.
77

An Asymptotically Optimal On-Line Algorithm for Parallel Machine Scheduling

Chou, Mabel, Queyranne, Maurice, Simchi-Levi, David 01 1900 (has links)
Jobs arriving over time must be non-preemptively processed on one of m parallel machines, each running at its own speed, so as to minimize a weighted sum of the job completion times. In this on-line environment, the processing requirement and weight of a job are not known before the job arrives. The Weighted Shortest Processing Requirement (WSPR) on-line heuristic is a simple extension of the well-known WSPT heuristic, which is optimal for the single machine problem without release dates. We prove that the WSPR heuristic is asymptotically optimal for all instances with bounded job processing requirements and weights. This implies that the WSPR algorithm generates a solution whose relative error approaches zero as the number of jobs increases. Our proof does not require any probabilistic assumption on the job parameters and relies extensively on properties of optimal solutions to a single machine relaxation of the problem. / Singapore-MIT Alliance (SMA)
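
As a hedged sketch of the dispatch rule the abstract names (the toy instance and the tie-breaking details are assumptions, not taken from the paper), the following Python code runs a WSPR-style rule on machines with different speeds: whenever a machine becomes free, it starts the released, unscheduled job with the largest weight-to-processing-requirement ratio:

```python
import heapq

def wspr_schedule(jobs, speeds):
    """WSPR heuristic sketch for on-line parallel machine scheduling.
    jobs: list of (release_time, processing_requirement, weight).
    speeds: one speed per machine; processing time = requirement / speed.
    Returns the weighted sum of completion times."""
    jobs = sorted(jobs)                          # by release time
    machines = [(0.0, s) for s in speeds]        # (time machine becomes free, speed)
    heapq.heapify(machines)
    pending, i, total = [], 0, 0.0
    while i < len(jobs) or pending:
        free_at, speed = heapq.heappop(machines)
        # if nothing is pending, idle this machine until the next release
        if not pending and i < len(jobs) and jobs[i][0] > free_at:
            free_at = jobs[i][0]
        # release every job that has arrived by the time this machine is free
        while i < len(jobs) and jobs[i][0] <= free_at:
            r, p, w = jobs[i]
            heapq.heappush(pending, (-w / p, p, w))   # largest weight/requirement first
            i += 1
        _, p, w = heapq.heappop(pending)
        completion = free_at + p / speed
        total += w * completion
        heapq.heappush(machines, (completion, speed))
    return total

# Toy instance: (release, requirement, weight) on two machines of speeds 1 and 2
print(wspr_schedule([(0, 4, 2), (1, 2, 5), (2, 6, 1), (3, 3, 3)], [1.0, 2.0]))
```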
78

The colloidal differentiation of starches

Houtz, Harold H. (Harold Howard) 01 January 1940 (has links)
No description available.
79

Geomechanical Wellbore Stability Assessment For Sayindere, Karabogaz, Karababa Formations In X Field

Uyar, Tevhide Tugba 01 July 2011 (has links) (PDF)
Wellbore stability problems account for huge excess costs worldwide. In recent years, declining resource volumes and favorable oil prices have encouraged operators to drill deeper wells with more complex trajectories, so drilling for hydrocarbons has become a much more challenging task. Furthermore, the complexity and variety of these wells have added weight to planning and problem anticipation at both the drilling and production stages. This thesis describes the geomechanical wellbore stability analysis of the Sayindere, Karabogaz and Karababa formations drilled in the X field, Adiyaman. The analysis assumes the validity of linear elastic theory for porous media and draws on drilling reports, well logs, laboratory tests and core analyses. It was observed that, through this geomechanical wellbore stability assessment, a mud weight window, bounded by the minimum and maximum safe mud weights, can be determined for the studied formations.
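
As a hedged illustration of how a mud weight window can be expressed (the pressures and depth below are invented, not results for the Adiyaman wells, and the study's elastic-theory collapse calculation is not reproduced), equivalent mud weights can be derived from downhole pressures with the standard oilfield relation P = 0.052 × MW × TVD:

```python
def equivalent_mud_weight(pressure_psi, tvd_ft):
    """Convert a downhole pressure to equivalent mud weight in ppg using the
    standard oilfield relation P (psi) = 0.052 * MW (ppg) * TVD (ft)."""
    return pressure_psi / (0.052 * tvd_ft)

def mud_weight_window(pore_psi, collapse_psi, frac_psi, tvd_ft):
    """Safe mud weight window: the lower bound is the larger of the pore
    pressure and the borehole-collapse pressure, the upper bound is the
    fracture (breakdown) pressure, all expressed as equivalent mud weights."""
    low = equivalent_mud_weight(max(pore_psi, collapse_psi), tvd_ft)
    high = equivalent_mud_weight(frac_psi, tvd_ft)
    return low, high

# Illustrative numbers only (not values from the studied formations)
lo, hi = mud_weight_window(pore_psi=4500, collapse_psi=5200,
                           frac_psi=7400, tvd_ft=9800)
print(f"mud weight window: {lo:.1f} - {hi:.1f} ppg")
```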
80

A microscale molecular weight analysis method for characterizing polymer solutions of unknown concentrations

Li, Melissa 25 August 2008 (has links)
Molecular weight and concentration are two of the most important characteristics of polymers synthesized through chemical or microbial processes. However, current methods for characterizing polymer molecular weight, such as Multi-Angle Laser Light Scattering (MALLS) or Gel Permeation Chromatography (GPC), require precise information on concentration as well as extensive sample preparation. These methods are also generally expensive, low throughput, and require large sample volumes. Such limitations prevent dynamic time-point studies of changes in molecular weight, which would be very useful for monitoring synthesis progress in microbes or in chemical synthesis. In this thesis, we designed, fabricated, and tested a rapid, low-cost, high-throughput, modular microfluidic system for determining polymer molecular weight in samples of unknown concentration. To assess the accuracy of this system, we first constructed theoretical predictions of its accuracy and then compared these to the experimental results from our microfluidic system. The system evaluates molecular weight by correlating the behavior of polymers under various solvent conditions with their molecular weights, and it consists of two modules: one for measuring fluid viscosity and one for controlling solvent conditions. The results of this study show that the system can evaluate differences in polymer viscosity across molecular weights and solvent conditions. For the solvent control module, we show that salt can be rapidly added to or removed from small volumes of polymer solution and the result evaluated, in comparison with current methods. For the viscosity module, we show that fluid viscosity can be assessed rapidly and accurately over a wide range of molecular weights. Finally, we show the effects of solvent changes on viscosity, and thus the efficacy of the system in determining molecular weight from fluid viscosity. The system is applied to the evaluation of both the biologically produced polymer hyaluronic acid (HA) and the synthetically produced polymer poly(ethylene oxide) (PEO).
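
As a hedged illustration of one standard way to correlate dilute-solution viscosity with molecular weight (the Mark-Houwink relation; the constants below are rough literature values for PEO in water, not the calibration used in this thesis), the estimate could look like:

```python
def molecular_weight_from_intrinsic_viscosity(eta_intrinsic_dl_g, K, a):
    """Invert the Mark-Houwink relation [eta] = K * M**a to estimate the
    viscosity-average molecular weight from an intrinsic viscosity measurement."""
    return (eta_intrinsic_dl_g / K) ** (1.0 / a)

# Rough literature constants for PEO in water at 25 °C (order of magnitude only;
# the thesis's calibration for HA/PEO in its own solvent conditions would differ)
M_v = molecular_weight_from_intrinsic_viscosity(2.0, K=1.25e-4, a=0.78)
print(f"estimated viscosity-average molecular weight: {M_v:.2e} g/mol")
```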
