161

Model Updating Of A Helicopter Structure Using A Newly Developed Correlation Improvement Technique

Altunel, Fatih 01 December 2009 (has links) (PDF)
Numerical model usage has increased substantially in many industries, and it is in the aerospace industry that numerical models arguably play the most important role in developing an optimum design. However, numerical models need experimental verification, which is used not only for validation but also for updating numerical model parameters. Verified and updated models are used to analyze the vast number of cases that a structure is anticipated to face in real life. In this thesis, structural finite element model updating of a utility helicopter fuselage was performed as a case study. Initially, experimental modal analyses were performed using modal shakers, and the test results were processed with LMS Test.Lab software. In parallel, finite element analysis of the helicopter fuselage was performed with MSC.Patran & Nastran software. Updating was first carried out for the whole helicopter fuselage, and then the tail of the helicopter was updated on its own. Furthermore, a new method was proposed for selecting the optimum node removal location to obtain a better Modal Assurance Criterion (MAC) matrix. This routine was applied to the helicopter case study, where it performed better than the Coordinate Modal Assurance Criterion (coMAC) often used in such analyses.
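The Modal Assurance Criterion used to compare analytical and experimental mode shapes can be computed as below (a minimal NumPy sketch for real-valued mode shapes; the function name and matrix layout are illustrative, not from the thesis):

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two sets of mode shapes.

    phi_a: (n_dof, n_modes_a) analytical mode shapes
    phi_b: (n_dof, n_modes_b) experimental mode shapes
    Returns an (n_modes_a, n_modes_b) matrix of values in [0, 1];
    values near 1 indicate well-correlated mode pairs.
    """
    num = np.abs(phi_a.T @ phi_b) ** 2
    den = np.outer((phi_a * phi_a).sum(axis=0),
                   (phi_b * phi_b).sum(axis=0))
    return num / den
```

A well-updated model shows a MAC matrix with values near 1 on the diagonal and near 0 elsewhere.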
162

Proposal of New Stability-instability Criterion for Crack Extension Based on Crack Energy Density and Physical Systematization of Other Criteria

WATANABE, Katsuhiko, AZEGAMI, Hideyuki 12 1900 (has links)
No description available.
163

Optimal designs for multivariate calibrations in multiresponse regression models

Guo, Jia-Ming 21 July 2008 (has links)
Consider a linear regression model with a two-dimensional control vector (x_1, x_2) and an m-dimensional response vector y = (y_1, . . . , y_m). The components of y are correlated with a known covariance matrix. Based on the assumed regression model, there are two problems of interest. The first is to estimate the unknown control vector x_c corresponding to an observed y, where x_c is estimated by the classical estimator. The second is to obtain a suitable estimate of the control vector x_T corresponding to a given target T = (T_1, . . . , T_m) on the expected responses. This work considers the deviation of each expected response E(y_i) from its corresponding target value T_i and defines the optimal control vector, denoted x_T, as the one that minimizes the weighted sum of squares of standardized deviations within the range of x. The objective of this study is to find c-optimal designs for estimating x_c and x_T, which minimize the mean squared errors of the estimators of x_c and x_T respectively. A comparison of the optimal calibration design and the optimal design for estimating x_T is provided. The efficiencies of the optimal calibration design relative to the uniform design are also presented, as are the efficiencies of the optimal design for a given target vector relative to the uniform design.
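For intuition, the classical estimator in the simplest univariate case inverts a fitted calibration line to recover the control value behind a new observation (a sketch of the idea only; the thesis treats the multivariate, correlated-response setting, and the function name is illustrative):

```python
import numpy as np

def classical_calibration_estimate(x, y, y_new):
    """Classical estimator in simple linear calibration.

    Fit y = a + b*x by least squares on the calibration data (x, y),
    then invert the fitted line to estimate the unknown control
    value that produced a new observation y_new.
    """
    b, a = np.polyfit(x, y, 1)   # slope b, intercept a
    return (y_new - a) / b
```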
164

Notion de rentabilité financière et logique de choix dans les services publics : le cas des choix d’investissement dans quatre services publics municipaux

Defline, Pascale 24 March 2011 (has links)
Cette thèse a pour objet de répondre à la question suivante : un choix de service public est-il compatible avec un choix d’ordre financier, car faisant intervenir des critères financiers, parmi lesquels celui de rentabilité financière ? Elle se situe dans un contexte de profondes mutations du secteur public. Prenant comme cadre théorique le New Public Management, constatant une appropriation par le droit administratif des notions d’intérêt financier et de rentabilité, cette recherche exploratoire se poursuit par des entretiens auprès d’élus et d’administratifs de 26 communes. Elle montre un poids des critères financiers proche de celui des critères de service public et un net intérêt porté à un outil de calcul de rentabilité financière, répondant là positivement à la question. Elle démontre également que les administratifs jouent un véritable rôle d’experts financiers. Enfin elle laisse entrevoir qu’élus et administratifs pourraient exercer un micro-pouvoir sur les spécialistes français du management public et les personnalités politiques, adhérant plutôt à l’idée d’incompatibilité d’un choix de service public et de la notion de rentabilité financière. / This thesis aims to answer the following question: is a public-service choice compatible with a choice of a financial nature, one that brings in financial criteria, among them financial profitability? It is set in a context of deep change in the public sector. Taking New Public Management as its theoretical frame, and noting that administrative law has appropriated the notions of financial interest and profitability, this exploratory research proceeds through interviews with elected representatives and administrative staff of 26 municipalities. It shows that financial criteria carry a weight close to that of public-service criteria, and that there is clear interest in a tool for calculating financial profitability, thereby answering the question positively. It also demonstrates that administrative staff act as genuine financial experts. Finally, it suggests that elected representatives and administrative staff could exert a micro-power over French public-management specialists and political figures, who tend instead to hold that a public-service choice is incompatible with the notion of financial profitability.
165

Accelerated Fuzzy Clustering

Parker, Jonathon Karl 01 January 2013 (has links)
Clustering algorithms are a primary tool in data analysis, facilitating the discovery of groups and structure in unlabeled data. They are used in a wide variety of industries and applications. Despite their ubiquity, clustering algorithms have a flaw: they take an unacceptable amount of time to run as the number of data objects increases. The need to compensate for this flaw has led to the development of a large number of techniques intended to accelerate their performance. This need grows greater every day, as collections of unlabeled data grow larger and larger. How does one increase the speed of a clustering algorithm as the number of data objects increases and at the same time preserve the quality of the results? This question was studied using the Fuzzy c-means clustering algorithm as a baseline. Its performance was compared to the performance of four of its accelerated variants. Four key design principles of accelerated clustering algorithms were identified. Further study and exploration of these principles led to four new and unique contributions to the field of accelerated fuzzy clustering. The first was the identification of a statistical technique that can estimate the minimum amount of data needed to ensure a multinomial, proportional sample. This technique was adapted to work with accelerated clustering algorithms. The second was the development of a stopping criterion for incremental algorithms that minimizes the amount of data required, while maximizing quality. The third and fourth techniques were new ways of combining representative data objects. Five new accelerated algorithms were created to demonstrate the value of these contributions. One additional discovery made during the research was that the key design principles most often improve performance when applied in tandem. This discovery was applied during the creation of the new accelerated algorithms. 
Experiments show that the new algorithms improve speedup with minimal quality loss, are demonstrably better than related methods and occasionally are an improvement in both speedup and quality over the base algorithm.
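The baseline Fuzzy c-means algorithm against which the accelerated variants were compared alternates membership and centroid updates until convergence; a minimal sketch (the parameter names and fixed iteration count are illustrative assumptions, not from the dissertation):

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=100, seed=0):
    """Baseline Fuzzy c-means: alternate centroid and membership updates.

    X: (n, d) data; c: number of clusters; m > 1: fuzzifier.
    Returns (centroids V, memberships U) with each row of U summing to 1.
    """
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        W = U ** m
        V = (W.T @ X) / W.sum(axis=0)[:, None]        # fuzzy-weighted centroids
        d2 = ((X[:, None, :] - V[None, :, :]) ** 2).sum(-1) + 1e-12
        U = 1.0 / (d2 ** (1.0 / (m - 1)))             # u_ik proportional to d_ik^(-2/(m-1))
        U /= U.sum(axis=1, keepdims=True)
    return V, U
```

The accelerated variants studied in the dissertation attack the cost of the distance computation `d2`, which grows with the number of data objects on every iteration.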
166

Development of an Investigator-designed Questionnaire Concerning Childbirth Delivery Options based on the Theory of Planned Behavior

Tai, Chun-Yi 01 January 2013 (has links)
This study responds to the globally increasing rate of caesarean section, and specifically to the very high rate of elective caesarean section among Taiwanese mothers as evidence suggests that such elective caesareans pose potential health risks for mothers and babies. The purpose of this study was to develop and evaluate a multi-component instrument based on the theory of planned behavior (TPB) to better understand Taiwanese pregnant women's decisions regarding their childbirth delivery options (spontaneous vaginal delivery or elective caesarean section). The study was a four-phased mixed method design. First, the TPB guided item development and instrument drafting. Second, pretesting and instrument refinement used cognitive interviewing with a small sample of Taiwanese pregnant women. Third, the instrument was administered to 310 such women to examine psychometric properties of the component scales. Fourth, the phase 3 instrument was re-administered to 30 women to estimate item stability. Confirmatory factor analyses (CFA) were used to assess construct validity of the multi-item, multi-component measurement model with LISREL 9.1. Based on the TPB, the 52-item self-administered Childbirth Delivery Options Questionnaire (CDOQ) was developed to measure three components: intention regarding delivery options, attitudes toward delivery options, and perceptions of significant others' (partner, mother, and mother-in-law) feelings about delivery options. Respondents from phase two thought that the items on the CDOQ were easy to read and comprehend; they reported favorably on the wording and formatting. Preliminary item analysis revealed that the items referring to dangerousness of delivery options did not function as intended and were dropped because they did not differentiate between the two delivery options, leaving 36 items. 
Test-retest reliability indicated that responses to each item were positively correlated and those referring to spontaneous vaginal delivery were more stable than those referring to elective caesarean section. Corrected item-to-total correlations and expected change in Cronbach's alpha if item deleted revealed that four items might form a measure of general social norms associated with the Taiwanese culture. The Cronbach's alphas for the components of the CDOQ ranged from .55 to .89. The measurement model incorporating the design features of the CDOQ fitted the data well using the CFA. Because serious problems with multicollinearity and suppression were revealed, Beckstead's (2012) criterion-irrelevant-variance-omitted (CIVO) regression method was used to untangle the suppressor effects when predicting intention from the other components of the CDOQ. The results indicated that attitude and partner's feelings were significant and explained the bulk of the variance in intentions. The TPB-based instrument developed here will be of considerable use to maternal-child health researchers. The findings of this study suggest that decisions regarding delivery options may be modified by interventions geared toward pregnant women's attitudes within family- and cultural-centered prenatal programs.
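Cronbach's alpha, reported above for the CDOQ components, can be computed from a respondents-by-items score matrix with the standard formula (a minimal NumPy sketch; the function name is illustrative):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)
```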
167

Variations of Li's criterion for an extension of the Selberg class

Droll, ANDREW 09 August 2012 (has links)
In 1997, Xian-Jin Li gave an equivalence to the classical Riemann hypothesis, now referred to as Li's criterion, in terms of the non-negativity of a particular infinite sequence of real numbers. We formulate the analogue of Li's criterion as an equivalence for the generalized quasi-Riemann hypothesis for functions in an extension of the Selberg class, and give arithmetic formulae for the corresponding Li coefficients in terms of parameters of the function in question. Moreover, we give explicit non-negative bounds for certain sums of special values of polygamma functions, involved in the arithmetic formulae for these Li coefficients, for a wide class of functions. Finally, we discuss an existing result on correspondences between zero-free regions and the non-negativity of the real parts of finitely many Li coefficients. This discussion involves identifying some errors in the original source work which seem to render one of its theorems conjectural. Under an appropriate conjecture, we give a generalization of the result in question to the case of Li coefficients corresponding to the generalized quasi-Riemann hypothesis. We also give a substantial discussion of research on Li's criterion since its inception, and some additional new supplementary results, in the first chapter. / Thesis (Ph.D, Mathematics & Statistics) -- Queen's University, 2012-07-31 13:14:03.414
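For context, the classical form of Li's criterion for the Riemann zeta function (the case the thesis generalizes to an extension of the Selberg class) states that the Riemann hypothesis holds if and only if

```latex
\lambda_n \;=\; \sum_{\rho}\left[\,1-\Bigl(1-\frac{1}{\rho}\Bigr)^{\!n}\,\right] \;\ge\; 0
\qquad \text{for all } n \ge 1,
```

where the sum runs over the non-trivial zeros $\rho$ of $\zeta(s)$, taken together with their complex conjugates.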
168

Response Adaptive Designs in the Presence of Mismeasurement

LI, XUAN January 2012 (has links)
Response adaptive randomization represents a major advance in clinical trial methodology that helps balance the benefits of the collective and the benefits of the individual and improves efficiency without undermining the validity and integrity of the clinical research. Response adaptive designs use information so far accumulated from the trial to modify the randomization procedure and deliberately bias treatment allocation in order to assign more patients to the potentially better treatment. No attention has been paid to incorporating the problem of errors-in-variables in adaptive clinical trials. In this work, some important issues and methods of response adaptive design of clinical trials in the presence of mismeasurement are examined. We formulate response adaptive designs when the dichotomous response may be misclassified. We consider the optimal allocations under various objectives, investigate the asymptotically best response adaptive randomization procedure, and discuss effects of misclassification on the optimal allocation. We derive explicit expressions for the variance-penalized criterion with misclassified binary responses and propose a new target proportion of treatment allocation under the criterion. A real-life clinical trial and some related simulation results are also presented.
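As a concrete illustration of response-adaptive randomization in general (this is the classic randomized play-the-winner urn, not the thesis's variance-penalized design), allocation can be skewed toward the apparently better arm as responses accrue:

```python
import random

def randomized_play_the_winner(p_success, n_patients, alpha=1, beta=1, seed=0):
    """Randomized play-the-winner urn, a classic response-adaptive rule.

    Start with alpha balls per treatment. Draw a ball to assign each patient;
    on success add beta balls of the same treatment, on failure add beta
    balls of the other, so allocation drifts toward the better arm.
    """
    rng = random.Random(seed)
    urn = [alpha, alpha]                 # balls for treatments 0 and 1
    assignments = []
    for _ in range(n_patients):
        t = 0 if rng.random() < urn[0] / sum(urn) else 1
        assignments.append(t)
        success = rng.random() < p_success[t]
        urn[t if success else 1 - t] += beta
    return assignments
```

Mismeasurement of the kind studied in the thesis would enter such a scheme wherever the observed success is a misclassified version of the true response.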
169

MULTISCALE MODELING AND ANALYSIS OF FAILURE AND STABILITY DURING SUPERPLASTIC DEFORMATION -- UNDER DIFFERENT LOADING CONDITIONS

Thuramalla, Naveen 01 January 2004 (has links)
Superplastic forming (SPF) is a valuable near-net-shape fabrication method, used to produce very complex, contoured, and monolithic structures that are often lighter, stronger, and safer than the assemblies they replace. However, the widespread industrial use of superplastic (SP) alloys is hindered by a number of issues, including low production rates and limited capabilities for predicting stability during deformation and failure. Failure during superplastic deformation (SPD) may result from geometrical macroscopic instabilities and/or microstructural aspects; however, the available failure criteria are based either on geometrical instabilities or on microstructural features, and do not account for both failure modes. The present study presents a generalized multiscale stability criterion for SP materials, accounting for both aspects of failure under various loading conditions. A combined model accounting for cavity nucleation and plasticity-controlled cavity growth, along with a grain-growth model and a modified microstructure-based constitutive equation for SP materials, is incorporated into Hart's stability analysis to develop the proposed stability criterion for different loading conditions. The effects of initial grain size, initial levels of cavitation, nucleation strain, strain-rate sensitivity, and grain-growth exponent on the optimum forming curves of different SP alloys are investigated for different loading conditions.
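Hart's stability analysis, into which the combined model is incorporated, rests on the classical condition for stable uniaxial tensile deformation (stated here in its standard textbook form as background; the thesis's generalized criterion extends it):

```latex
\gamma + m \;\ge\; 1,
\qquad
\gamma = \frac{1}{\sigma}\left(\frac{\partial \sigma}{\partial \varepsilon}\right)_{\dot{\varepsilon}},
\qquad
m = \left(\frac{\partial \ln \sigma}{\partial \ln \dot{\varepsilon}}\right)_{\varepsilon},
```

where $\gamma$ is the normalized strain-hardening coefficient and $m$ the strain-rate sensitivity; superplastic alloys owe much of their stability to large $m$.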
170

The validation of a performance-based assessment battery

Wilson, Irene Rose 01 January 2002 (has links)
Legislative pressures are being brought to bear on South African employers to demonstrate that occupational assessment is scientifically valid and culture-fair. The development of valid and reliable performance-based assessment tools will enable employers to meet these requirements. The general aim of this research was to validate a performance-based assessment battery for the placement of sales representatives. A literature survey examined alternative assessment measures and methods of performance measurement, leading to the conclusion that the combination of the work sample as a predictor measure and the managerial rating of performance as a criterion measure offers a practical and cost-effective assessment process to the sales manager. The empirical study involved 54 salespeople working for the Commercial division of an oil marketing company, selling products and services to the commercial and industrial market. The empirical study found a significant correlation between the performance of sales representatives on the performance-based assessment battery for the entry level of the career ladder and their behaviour in the field as measured by the managerial performance rating instrument. The limitations of the sample, however, prevent the results from being generalised to other organisations.
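The validation coefficient in a study of this kind is typically the Pearson correlation between predictor and criterion scores (a generic sketch, not the study's actual computation; the function name is illustrative):

```python
import numpy as np

def predictive_validity(predictor, criterion):
    """Criterion-related validity coefficient: the Pearson correlation
    between a predictor measure (e.g. work-sample scores) and a
    criterion measure (e.g. managerial performance ratings)."""
    return float(np.corrcoef(predictor, criterion)[0, 1])
```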
