  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
221

Experimental Design With Short-tailed And Long-tailed Symmetric Error Distributions

Yilmaz, Yildiz Elif 01 September 2004 (has links) (PDF)
One-way and two-way classification models in experimental design, for both balanced and unbalanced cases, are considered when the errors have a Generalized Secant Hyperbolic distribution. Efficient and robust estimators for main and interaction effects are obtained by using the modified maximum likelihood (MML) estimation technique. Test statistics analogous to the normal-theory F statistics are defined to test main and interaction effects, and a test statistic for testing linear contrasts is defined. It is shown that the test statistics based on MML estimators are efficient and robust. The methodology is also generalized to situations where the error distributions are non-identical from block to block.
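The normal-theory F statistic to which the MML-based tests are analogous can be sketched as follows. This is the generic one-way ANOVA baseline, not the thesis's robust statistic, and the function name is our own.

```python
import numpy as np

def one_way_f_statistic(groups):
    """Classical one-way ANOVA F statistic (the normal-theory baseline).

    `groups` is a list of 1-D arrays, one per treatment level.
    """
    k = len(groups)                                   # number of levels
    n = sum(len(g) for g in groups)                   # total sample size
    grand_mean = np.concatenate(groups).mean()
    # Between-group and within-group sums of squares.
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

Under GSH errors, the thesis replaces the sample means and variances used here with MML estimators to regain efficiency and robustness.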
222

Bayesian Learning Under Nonnormality

Yilmaz, Yildiz Elif 01 December 2004 (has links) (PDF)
The naive Bayes classifier and maximum likelihood hypotheses in Bayesian learning are considered when the errors have a non-normal distribution. For location and scale parameters, efficient and robust estimators obtained by using the modified maximum likelihood (MML) estimation technique are used. In the naive Bayes classifier, the error distributions from class to class and from feature to feature are assumed to be non-identical, and the Generalized Secant Hyperbolic (GSH) and Generalized Logistic (GL) distribution families are used instead of the normal distribution. It is shown that the non-normal naive Bayes classifier obtained in this way classifies the data more accurately than the one based on the normality assumption. Furthermore, the maximum likelihood (ML) hypotheses are obtained under the assumption of non-normality, which also produces better results compared to the conventional ML approach.
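As a rough illustration of the idea (not the thesis's method), the sketch below builds a naive Bayes classifier whose per-class, per-feature likelihood is the heavier-tailed logistic density, standing in for the GSH/GL families, with simple moment estimates standing in for the MML estimators. All names are our own.

```python
import numpy as np

def logistic_logpdf(x, loc, scale):
    # Log-density of the logistic distribution: heavier-tailed than the
    # normal; a stand-in here for the GSH/GL families of the thesis.
    z = (x - loc) / scale
    return -z - 2.0 * np.log1p(np.exp(-z)) - np.log(scale)

class NonNormalNaiveBayes:
    """Naive Bayes with a logistic likelihood per class and feature."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.params_, self.logprior_ = {}, {}
        for c in self.classes_:
            Xc = X[y == c]
            # Moment estimates stand in for the MML estimators; the logistic
            # scale s satisfies std = s * pi / sqrt(3).
            scale = np.maximum(Xc.std(axis=0, ddof=1) * np.sqrt(3) / np.pi, 1e-9)
            self.params_[c] = (Xc.mean(axis=0), scale)
            self.logprior_[c] = np.log(len(Xc) / len(X))
        return self

    def predict(self, X):
        scores = np.column_stack([
            self.logprior_[c] + logistic_logpdf(X, *self.params_[c]).sum(axis=1)
            for c in self.classes_
        ])
        return self.classes_[np.argmax(scores, axis=1)]
```

The normal-theory classifier is recovered by swapping `logistic_logpdf` for a Gaussian log-density; the thesis's comparison is between that baseline and the non-normal version.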
223

Fabrication of surface micro- and nanostructures for superhydrophobic surfaces in electric and electronic applications

Xiu, Yonghao 10 November 2008 (has links)
In our study, superhydrophobic surfaces inspired by the biomimetic lotus leaf are explored to obtain the properties desired for self-cleaning. The contact angle and contact angle hysteresis are the key quantities controlling the bead-up and roll-off characteristics of water droplets, and we investigated the determining conditions on different model surfaces with micro- and nanostructures. Two governing equations were proposed: one for the contact angle, based on the Laplace pressure, and one for contact angle hysteresis, based on the Young-Dupré equation. Based on this understanding of superhydrophobicity, possible applications for self-cleaning and water repellency were explored and application-related technical issues were addressed. Drawing on our understanding of the roughness effect on superhydrophobicity (both contact angle and hysteresis), structured surfaces of polybutadiene, polyurethane, silica, and Si were successfully prepared. For engineering applications of superhydrophobic surfaces, stability issues regarding UV exposure, mechanical robustness, and humid environments need to be investigated. Among these factors, UV stability was the first to be studied. Silica surfaces with excellent UV stability were prepared: the UV stability of the surface currently stands at 5,500 h according to the standard test method ASTM D 4329, with no degradation of surface superhydrophobicity observed. New methods for preparing superhydrophobic and transparent silica surfaces were investigated using a urea-choline chloride eutectic liquid to generate fine roughness and reduce the cost of preparing surface structures. Another possible application, self-cleaning photovoltaic panels, was investigated on Si surfaces by constructing two-scale rough structures followed by fluoroalkyl silane treatment.
Regarding mechanical robustness, epoxy-silica superhydrophobic surfaces were prepared by O2 plasma etching to generate sufficient surface roughness of silica spheres, followed by fluoroalkyl silane treatment. A robustness test method was proposed, and the test results showed that this surface is among the most robust of the superhydrophobic surfaces we prepared and of those currently reported in the literature.
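The thesis derives its own governing equations from the Laplace pressure and the Young-Dupré equation; as a hedged stand-in, the sketch below evaluates only the classic Cassie-Baxter relation for the apparent contact angle on a composite (air-trapping) rough surface.

```python
import math

def cassie_baxter_angle(theta_flat_deg, solid_fraction):
    """Apparent contact angle (degrees) from the Cassie-Baxter relation:

        cos(theta*) = f * (cos(theta) + 1) - 1

    where f is the solid area fraction in contact with the drop and
    theta is the contact angle on the flat material.
    """
    cos_star = solid_fraction * (math.cos(math.radians(theta_flat_deg)) + 1.0) - 1.0
    return math.degrees(math.acos(cos_star))
```

For a flat fluoroalkyl-silane-treated surface (around 110 degrees), reducing the solid fraction via micro- and nanostructuring drives the apparent angle well above 150 degrees, the usual superhydrophobicity threshold.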
224

Efficient Semiparametric Estimators for Nonlinear Regressions and Models under Sample Selection Bias

Kim, Mi Jeong 2012 August 1900 (has links)
We study the consistency, robustness, and efficiency of parameter estimation in different but related models via a semiparametric approach. First, we revisit the second-order least squares estimator proposed in Wang and Leblanc (2008) and show that the estimator reaches the semiparametric efficiency bound. We further extend the method to heteroscedastic error models and propose a semiparametric efficient estimator in this more general setting. Second, we study a class of semiparametric skewed distributions arising when the sample selection process causes sampling bias in the observations. We begin by assuming an anti-symmetric skewing function. Taking into account the symmetric nature of the population distribution, we propose consistent estimators for the center of the symmetric population. These estimators are robust to model misspecification and reach the minimum possible estimation variance. Next, we extend the model to permit a more flexible skewing structure. Without assuming a particular form of the skewing function, we propose consistent and efficient estimators for the center of the symmetric population using a semiparametric method. We also analyze the asymptotic properties and derive the corresponding inference procedures. Numerical results are provided to support the theory and illustrate the finite-sample performance of the proposed estimators.
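The second-order least squares idea can be sketched as follows: fit the first two conditional moments jointly. This is a simplified identity-weighted version under our own naming, not the efficient weighting analyzed in the thesis.

```python
import numpy as np
from scipy.optimize import minimize

def second_order_ls(x, y, f, theta0):
    """Identity-weighted second-order least squares sketch.

    Jointly minimizes, over (theta, sigma^2), the sum of squared
    first- and second-moment residuals:
        r1_i = y_i - f(x_i, theta)
        r2_i = y_i^2 - f(x_i, theta)^2 - sigma^2
    """
    def loss(params):
        *theta, s2 = params
        mu = f(x, np.asarray(theta))
        r1 = y - mu
        r2 = y ** 2 - mu ** 2 - s2
        return np.sum(r1 ** 2 + r2 ** 2)

    # Initialize sigma^2 at the sample variance of y.
    res = minimize(loss, np.append(theta0, np.var(y)), method="Nelder-Mead")
    return res.x
```

The efficient estimator of Wang and Leblanc replaces the identity weight with an optimal weight matrix; this sketch only conveys the two-moment structure.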
225

Étude des M-estimateurs et leurs versions pondérées pour des données clusterisées / A study of M-estimators and weighted M-estimators in the case of clustered data

El Asri, Mohamed 15 December 2014 (has links)
M-estimators were first introduced by Huber (1964) as robust estimators of location and gave rise to a substantial literature. They include classical estimators of a multidimensional location parameter such as the maximum likelihood estimator, the empirical mean, and the spatial median. For results on their asymptotic behaviour and robustness (via the influence function and the breakdown point), we refer in particular to the books of Huber (1981) and Hampel et al. (1986); see Ruiz-Gazen (2012) for a nice introductory presentation of robust statistics, and Van der Vaart (2000) for results on convergence and asymptotic normality, in the independent and identically distributed multivariate setting considered throughout this work. Most references address the case where the data are independent and identically distributed; however, clustered and hierarchical data frequently arise in applications.
The facility location problem, for instance, is an important research topic in spatial data analysis for the geographic placement of some economic activity, and recent studies in this field perform spatial modelling with clustered data (see e.g. Liao and Guo, 2008; Javadi and Shahrabi, 2014, and references therein). Concerning robust estimation, Nevalainen et al. (2006) study the spatial median for the multivariate one-sample location problem with clustered data and show that the intra-cluster correlation has an impact on the asymptotic covariance matrix. The weighted spatial median, introduced in their pioneering 2007 paper, has superior efficiency with respect to its unweighted version, especially when cluster sizes are heterogeneous or in the presence of strong intra-cluster correlation. The class of weighted M-estimators (introduced in El Asri, 2013) may be viewed as a generalization of this work to a broad class of estimators: weights are assigned to the objective function that defines the M-estimators. The aim is, for example, to adapt M-estimators to clustered structures, to cluster sizes, or to clusters containing extreme values, in order to increase their efficiency or robustness. In this thesis, we study the almost sure convergence of unweighted and weighted M-estimators and establish their asymptotic normality. We then provide consistent estimators of the asymptotic variance and derive, numerically, optimal weights that improve the efficiency relative to the unweighted versions. Finally, from a weight-based formulation of the breakdown point, we illustrate how these optimal weights lead to an altered breakdown point.
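The weighted spatial median discussed above can be sketched with the standard Weiszfeld iteration; this is a generic illustration under our own naming, not the thesis's own algorithm, and the weights here could be, for example, cluster-dependent.

```python
import numpy as np

def weighted_spatial_median(X, w=None, n_iter=200, eps=1e-8):
    """Weighted spatial median via Weiszfeld iterations.

    Minimizes sum_i w_i * ||X_i - m|| over m. With w_i = 1 this is the
    (unweighted) spatial median; non-uniform weights give the weighted
    version, e.g. weights chosen per cluster.
    """
    X = np.asarray(X, dtype=float)
    w = np.ones(len(X)) if w is None else np.asarray(w, dtype=float)
    m = np.average(X, axis=0, weights=w)              # start at weighted mean
    for _ in range(n_iter):
        d = np.maximum(np.linalg.norm(X - m, axis=1), eps)  # guard /0
        coef = w / d
        m_new = (coef[:, None] * X).sum(axis=0) / coef.sum()
        if np.linalg.norm(m_new - m) < eps:
            break
        m = m_new
    return m
```

For collinear points the spatial median reduces to the ordinary univariate median, which gives a simple sanity check.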
226

Robustness analysis of VEGA launcher model based on effective sampling strategy

Dong, Siyi January 2016 (has links)
An efficient robustness analysis for the VEGA launch vehicle is essential to minimize the potential for system failure during the ascent phase. The Monte Carlo sampling method is usually considered a reliable strategy in industry if the sample size is large enough. However, due to the large number of uncertainties and the long response time of a single simulation, exploring the entire uncertainty space sufficiently through Monte Carlo sampling is impractical for the VEGA launch vehicle. In order to make the robustness analysis more efficient when the number of simulations is limited, quasi-Monte Carlo methods (Sobol, Faure, and Halton sequences) and a heuristic algorithm (Differential Evolution) are proposed. Nevertheless, the feasible number of samples for simulation is still much smaller than the minimal number required for sufficient exploration. To further improve the efficiency of the robustness analysis, redundant uncertainties are sorted out by sensitivity analysis, and only the dominant uncertainties are retained. As all samples for simulation are discrete, many regions of the uncertainty space are left unexplored with respect to the objective function by sampling or optimization methods. To exploit this latent information, a meta-model trained by Gaussian process regression is introduced. Based on the meta-model, the expected maximum objective value and the expected sensitivity of each uncertainty can be analyzed with much higher efficiency and without much loss of accuracy.
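A quasi-Monte Carlo screening step of the kind described above can be sketched with SciPy's Sobol sampler. The `simulate` callable is a hypothetical placeholder for the expensive launcher simulation, and the function name is our own.

```python
import numpy as np
from scipy.stats import qmc

def sobol_worst_case(simulate, lower, upper, m=8, seed=0):
    """Evaluate `simulate` on 2**m scrambled Sobol points spread over the
    uncertainty box [lower, upper] and return the worst case found."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    sampler = qmc.Sobol(d=len(lower), scramble=True, seed=seed)
    unit = sampler.random_base2(m=m)              # 2**m points in [0, 1)^d
    points = qmc.scale(unit, lower, upper)        # map to the uncertainty box
    values = np.array([simulate(p) for p in points])
    worst = int(np.argmax(values))
    return points[worst], values[worst]
```

Low-discrepancy sequences cover the box far more evenly than the same number of pseudo-random draws, which is why they help when the simulation budget is small.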
227

Robustesse du modèle de Rasch unidimensionnel à la violation de l’hypothèse d’unidimensionnalité / Robustness of the unidimensional Rasch model to violation of the unidimensionality assumption

Boade, Georges 06 1900 (has links)
No description available.
228

Robustness of composite framed structures in fire

Beshir, Moustafa January 2016 (has links)
This thesis presents the results of a research study investigating the behaviour of axially restrained composite beams at ambient and elevated temperatures, and how composite beams and their connections contribute to the robustness of composite framed structures in fire. The commercial finite element analysis package ABAQUS (2010) was used to develop the numerical simulation models. This research includes four main parts: (1) validation of the simulation model; (2) behaviour of axially restrained composite beams with partial shear interaction at ambient and elevated temperatures; (3) behaviour of composite beams with realistic connections at elevated temperatures, and methods of increasing composite beam survival temperatures; and (4) response and robustness of composite framed structures with different extents of damage at elevated temperatures. Based on the results for composite beams, it was found that the survival of axially restrained beams is dominated by the development of catenary action. By utilising catenary action, composite beams can develop load-carrying capacity significantly above that based on bending resistance. During the development of catenary action, the compression force in the concrete flange of the composite beam decreases, reducing the forces in the shear connectors; as a result, shear connector failure ceases to be an issue during the catenary action stage. The results further show that the load-carrying capacities and survival temperatures of composite beams increase with the level of axial restraint up to a certain limit and then decrease at higher levels. Typical realistic composite structures can provide composite beams with sufficient axial restraint to develop catenary action.
For composite beams with composite connections, three different beam sizes were investigated using flush and extended end plate connections with different amounts of slab reinforcement, different load ratios, and different bolt sizes. It was found that the most effective way to increase the survival time of composite beams is to use extended end plate connections with sufficient top and bottom reinforcement meshes in the concrete slab; that is, increasing the amount of slab reinforcement is more beneficial than increasing the bolt size or the number of bolts. Based on the results of modelling a four-bay, two-storey composite frame (9 m bays, 4 m storey height) with different extents of fire damage to different members, it was found that whenever any of the columns failed, progressive collapse of the frame would occur. Therefore, damage to columns should be prevented, or the columns should be designed and constructed to allow for possible damage. If the beams are damaged, it is still possible for the damaged frame to achieve the reference fire resistance time of the undamaged structure (used as the criterion for accepting that the damaged frame has sufficient robustness) by developing catenary action in the damaged beam. For this to happen, the columns should be designed to resist the catenary tensile force (tying force) in the beams, in addition to the compressive force.
229

Robust and stable optimization for parallel machine scheduling problems / Optimisation robuste et analyse de stabilité pour les problèmes d'ordonnancement sur machines parallèles

Naji, Widad 02 May 2018 (has links)
Scheduling on unrelated parallel machines is a common problem in many systems (such as semiconductor manufacturing, multiprocessor computer applications, and the textile industry). In this thesis, we consider two variants of this problem under uncertain processing times. In the first case, each job can be split into continuous sub-jobs and processed independently on the machines, with overlapping allowed. In the second case, termed preemption, overlapping is prohibited. From a mathematical viewpoint, the splitting problem is a relaxed version of the preemptive problem. The objective is to minimize the makespan.
The deterministic linear formulations provided by the literature allow these problems to be solved in polynomial time under the hypothesis of certainty. But when we consider uncertain processing times, these algorithms suffer from some limitations. Indeed, solutions computed from a nominal instance, supposed to be certain, usually turn out to be suboptimal when applied to the actual realization of the processing times. We incorporate uncertain processing times into these problems without making any assumption on their distribution. Hence, we use discrete scenarios to represent the uncertain processing times and adopt a proactive approach to provide robust solutions. We use special-case policies that are commonly used in industry to compute robust solutions. We show that the solutions based on some of these policies are potentially good in terms of robustness according to the worst-case makespan, especially the scenario smax solution, under which all the processing times are set to their maximal values. However, the robustness costs of these solutions are not satisfying. Thus, we propose to compute optimal robust solutions. For this purpose, we use a mathematical trick that allows us to formulate and solve, in polynomial time, the robust versions of the considered scheduling problems. Moreover, the computational results affirm that the robustness cost of the optimal solution is usually not very high.
We also evaluate the stability of the robust solutions under a new scenario induced by variations. In fact, the decision-maker is only responsible for the consequences of decisions when the processing-time realizations are within the represented uncertainty set. Thus, we define the stability of a robust solution as its ability to cover a new scenario with minor deviations in its structure and its performance. The global motivation of this thesis is then to provide decision support to help the decision-maker compute robust solutions and choose, among these robust solutions, those with the most stable structure and the most stable performance.
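The worst-case makespan evaluation over discrete scenarios can be sketched as follows for the splitting variant. The `smax_assignment` policy here (give each job to its fastest machine under the element-wise maximal processing times) is our own simplified stand-in for the thesis's smax schedule, not its actual construction.

```python
import numpy as np

def worst_case_makespan(assign, scenarios):
    """Worst-case makespan of a fixed fractional assignment.

    assign[j, i] is the fraction of job j given to machine i (rows sum
    to 1); scenarios[s, j, i] is job j's processing time on machine i
    under scenario s. With splitting and overlap allowed, machine i's
    load is sum_j assign[j, i] * p[j, i].
    """
    loads = np.einsum('ji,sji->si', assign, np.asarray(scenarios, float))
    return loads.max(axis=1).max()                # worst scenario's makespan

def smax_assignment(scenarios):
    """Plan against the element-wise maximal processing times (the smax
    scenario) by giving each job entirely to its then-fastest machine.
    A simple stand-in heuristic, not the optimal smax schedule."""
    p_max = np.asarray(scenarios, float).max(axis=0)
    assign = np.zeros_like(p_max)
    assign[np.arange(p_max.shape[0]), p_max.argmin(axis=1)] = 1.0
    return assign
```

Comparing this policy against a naive plan (e.g. everything on one machine) illustrates the worst-case robustness criterion used throughout the thesis.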
230

Contribution à la conception de coupleurs magnétiques robustes pour convertisseurs multicellulaires parallèles / Pre-design methodology of robust intercell transformers (ICT) for parallel multicell converters

Sanchez, Sébastien 24 March 2015 (has links)
Power electronics is a first-rank research field for today's energy challenges. New converters, called multilevel or multicell converters, are used in many applications requiring high current and high power density, owing to their high-frequency performance and good waveform quality; their main advantages are high efficiency and good system integration. However, the relative fragility of the components in a conversion chain means that failure management must be taken into account from the design phase: the failure of a component is highly critical, both for the surrounding safety and for the resulting unavailability of the system. Most embedded systems are required to maintain operation even if failures or faults occur. New magnetic devices called InterCell Transformers (ICTs) in multicell converters help to improve efficiency and compactness, offering an excellent trade-off between power density and conversion efficiency, but such magnetic components are inherently sensitive to any imperfection coming from the converter. The goal of this PhD thesis is to bring solutions that make ICTs more fault-tolerant. Therefore, a pre-design methodology for a robust ICT is presented in order to maintain the operation of the conversion chain after the occurrence of one or several faults. Several solutions are presented and detailed.
