About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations (NDLTD).
Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
61

Méthodes et applications industrielles en optimisation multi-critère de paramètres de processus et de forme en emboutissage / Methods and industrial applications in multicriteria optimization of process parameters in sheet metal forming

Oujebbour, Fatima Zahra 12 March 2014 (has links)
Given the current competitive and economic demands of the automotive sector, stamping, as a large-deformation forming process, has the advantage of producing, at high rates, parts of better geometric quality than other mechanical manufacturing processes. However, it is difficult to set up: in industry this is generally done by the classical trial-and-error method, which is slow and very costly. In research, simulating the process with the finite element method is an alternative; it is currently one of the technological innovations that aims to reduce production and tooling costs and facilitates the analysis and resolution of process-related problems. In this thesis, the objective is to predict and prevent, in particular, springback and failure. These two problems are the most widespread in stamping and are difficult to optimize because they are conflicting. The study focuses on a part stamped with a cross-shaped punch. We first analyze the sensitivity of the two phenomena to two characteristic parameters of the stamping process (initial blank thickness and punch velocity), then to four (initial blank thickness, punch velocity, blank-holder force and friction coefficient), and finally to the shape of the blank contour. The use of metamodels was necessary to optimize the two criteria. / Sheet metal forming is of vital importance to a wide range of industries, such as the production of car bodies, cans and appliances, and generates complex and precise parts. However, it is an involved technology combining elastic-plastic bending and stretch deformation of the workpiece, and these deformations can lead to undesirable deviations in the shape and performance of the stamped part. To perform a successful stamping process and avoid shape deviations such as springback and failure defects, the process variables should be optimized. In the present work, the objective is the prediction and prevention of springback and failure in particular. These two phenomena are the most common problems in the stamping process and present considerable difficulty in optimization, since they are two conflicting objectives. The forming test studied in this thesis concerns the stamping of an industrial workpiece formed with a cross punch. To solve this optimization problem, the chosen approach is based on the hybridization of a heuristic and a direct descent method. This hybridization is designed to take advantage of both disciplines, stochastic and deterministic, in order to improve the robustness and efficiency of the hybrid algorithm. For the multi-objective problem, we adopt methods based on identification of the Pareto front. To obtain a compromise between convergence towards the front and the way the solutions are distributed along it, we choose two appropriate methods. These methods are able to capture the Pareto front and have the advantage of generating a set of uniformly spaced Pareto-optimal solutions, a property of practical importance.
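As a rough illustration of the Pareto-front idea underlying both abstracts, the sketch below filters a set of sampled candidate designs down to their non-dominated subset. It assumes two conflicting objectives that are both minimized (for instance a springback measure and a failure indicator); the sampled values are synthetic and purely illustrative, not results from the thesis.

```python
import numpy as np

def pareto_front(points):
    """Return the non-dominated subset of a set of objective vectors,
    assuming every objective is to be minimized."""
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        # p is dominated if some other point is no worse in every objective
        # and strictly better in at least one.
        dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        if not dominated:
            keep.append(i)
    return pts[keep]

# Hypothetical (springback, failure) evaluations of candidate process settings.
rng = np.random.default_rng(1)
designs = rng.uniform(size=(200, 2))
front = pareto_front(designs)
print(len(front), "non-dominated designs out of", len(designs))
```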
62

De l’efficacité à la concurrence : Histoire d’une synthèse entre économie néoclassique et néolibéralisme / From efficiency to competition : A history of a synthesis between neoclassical economics and neoliberalism

Berthonnet, Irène 08 December 2014 (has links)
The thesis examines the close link that standard economic theory establishes between the concepts of efficiency and competition, two notions that are systematically conceived and used together. During the 20th century, the link between efficiency and competition first took the specific form of a statement asserting the efficiency of competition, before turning into a genuine identification of the two notions. Having been built up through successive shifts, displacements, approximations and terminological changes, this identification of efficiency with competition appears more as a doctrine than as a result. This doctrinal aspect is the outcome of a process of co-construction involving neoclassical theory and neoliberalism, which are understood as two discourses operating in different but compatible registers. The interactions between neoclassical theory and neoliberalism lead to the definition of a standard economics that presents itself as a synthesis of these two approaches. The demonstration proceeds through a history of the Pareto criterion from 1894 to the contemporary period. This criterion, initially introduced as the « maximum of ophelimity for the collectivity », underwent several semantic, epistemological and theoretical transformations before establishing itself as a criterion of efficiency. Tied from its origins to the model of competitive general equilibrium, it is intrinsically linked to the competitive operation of markets. The latter, however, can be defined and operationalized in different ways, and the theoretical indeterminacy of the notion of competition makes the meaning and scope of the statement of the efficiency of competition uncertain. / Efficiency and competition are systematically connected in mainstream economics. During the 20th century, this connection evolved from the assertion of the efficiency of competitive markets to the belief that efficiency is the competitive design itself. This identification results from various changes of perspective, approximations, and semantic shifts in the various streams of mainstream economics, which contribute to the doctrinal character of the identification of efficiency with competition. This doctrinal characteristic is the result of a co-construction by neoclassical economics and neoliberalism, apprehended as two specific approaches, different but compatible. Interactions between neoclassical economics and neoliberalism lead to the definition of mainstream economics as a synthesis of these two approaches. The demonstration is based on a history of the Pareto criterion, from 1894 to contemporary economics. The criterion, initially named « maximum of ophelimity for the community », has undergone several theoretical, epistemological and semantic transformations before becoming widely used as an efficiency criterion. Elaborated upon the properties of the competitive general equilibrium, it is intrinsically linked to the competitive design. But competition can be described and operationalized in many different ways, and the theoretical indetermination of the economic concept of competition conveys a strong uncertainty regarding the scope and meaning of the assertion of competition's efficiency.
63

Simulation based exploration of a loading strategy for a LHD-vehicle / Simuleringsbaserad utforskning av styrstrategier för frontlastare

Lindmark, Daniel January 2016 (has links)
Optimizing the loading process of a front loader vehicle is a challenging task. The design space is large and depends on the design of the vehicle, the strategy of the loading process, the nature of the material to load, etc. Finding an optimal loading strategy with respect to production and equipment damage would greatly improve productivity and reduce the environmental impact of mining and construction. In this thesis, a method for exploring the design space of a loading strategy is presented. The loading strategy depends on four design variables that control the shape of the trajectory relative to the shape of the pile. The responses investigated are production, vehicle damage and work interruptions due to rock spill. Using multi-body dynamics simulations, many different strategies can be tested at little cost. The results of these simulations are then used to build surrogate models of the original unknown function. The surrogate models are used to visualize and explore the design space and to construct Pareto fronts for the competing responses. The surrogate models were able to predict the production response from the simulations well. The damage and rock-spill surrogate models were only moderately accurate, but still good enough to explore how the design variables affect the response. The produced Pareto fronts make it easy for the decision maker to compare sets of design variables and choose an optimal design for the loading strategy.
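A minimal sketch of the surrogate-modelling step described above, assuming a quadratic response surface fitted by least squares to sampled simulation outputs; the four design variables and the response values are synthetic stand-ins rather than data from the thesis.

```python
import numpy as np

# Hypothetical data: sampled loading-strategy designs (four trajectory
# variables) with a simulated production response for each.
rng = np.random.default_rng(2)
X = rng.uniform(size=(60, 4))
y = (3.0 + X @ np.array([1.0, -0.5, 0.8, 0.2])
     + 0.5 * X[:, 0] * X[:, 1] + 0.05 * rng.normal(size=60))

def quadratic_features(X):
    """Full quadratic basis: 1, x_i and x_i * x_j for i <= j."""
    n, d = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

# Fit the surrogate once, then evaluate it cheaply anywhere in the design
# space instead of re-running the multibody simulation for every candidate.
beta, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)
X_new = rng.uniform(size=(5, 4))
print(quadratic_features(X_new) @ beta)
```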
64

The Application of Process Improvement Techniques at a Clothing Manufacturing Company in the Western Cape.

Ayeah, Ebenezer Nkwain January 2003 (has links)
Magister Commercii - MCom / This research project focuses on the application of process improvement techniques at a clothing manufacturer to address workflow delays in the factory. The objective of the research is threefold: to investigate delays at the beginning of production and make suggestions, to show the usefulness of continuous improvement techniques in improving activities at a clothing manufacturer, and to demonstrate how action research can be used for research in production and operations management. Using tools such as flow charts, check sheets, Pareto analysis, fishbone diagrams, interviews and the "ask why five times" technique, an investigation into delays led to a second investigation into sewing defects. This established that the sewing defects are caused mainly by time constraints, malfunctioning machines, wrong handling of garments, and previous operations. A further investigation using the above tools established that these defects could be addressed by setting realistic targets, carrying out regular maintenance on machines, cautioning operators to be more careful during their operations, and encouraging regular checks on garments before the next operation. The action learning methodology led to the following lessons: selecting a correct measuring tool is important, not all tools need to be used, and a research project using this method takes time.
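For reference, a Pareto analysis of the kind mentioned above amounts to ranking defect categories by frequency and reading off the cumulative share; the sketch below uses hypothetical check-sheet counts, not the figures from the factory study.

```python
# Hypothetical defect counts from a check sheet (illustrative only).
defects = {"broken stitch": 48, "skipped stitch": 31, "open seam": 22,
           "wrong handling": 15, "oil stain": 9, "other": 6}

total = sum(defects.values())
cumulative = 0
print(f"{'defect':<16}{'count':>7}{'cum %':>8}")
for name, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    print(f"{name:<16}{count:>7}{100 * cumulative / total:>7.1f}%")
# The 'vital few' categories at the top of the table account for most of the
# defects, so improvement effort is concentrated on them first.
```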
65

Relevance of Multi-Objective Optimization in the Chemical Engineering Field

Cáceres Sepúlveda, Geraldine 28 October 2019 (has links)
The first objective of this research project is to carry out multi-objective optimization (MOO) for four simple chemical engineering processes to clearly demonstrate the wealth of information on a given process that can be obtained from MOO instead of a single aggregate objective function. The four optimization case studies are the design of a PI controller, an SO2-to-SO3 reactor, a distillation column and an acrolein reactor. The results obtained from these case studies show the benefit of generating and using the Pareto domain to gain a deeper understanding of the underlying relationships between the various process variables and the different performance objectives. In addition, an acrylic acid production plant model is developed in order to propose a methodology for solving the multi-objective optimization of the two-reactor system model using artificial neural networks (ANNs) as metamodels, in an effort to reduce the computational time, which is usually very high when first-principles models are employed to approximate the Pareto domain. Once the metamodel was trained, the Pareto domain was circumscribed using a genetic algorithm and ranked with the Net Flow Method (NFM). After the MOO was carried out with the ANN surrogate model, the optimization time was reduced by a factor of 15.5.
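A minimal sketch of the metamodelling idea, assuming a scikit-learn MLPRegressor as the ANN and a cheap analytic function standing in for the expensive plant model; none of this reproduces the thesis' acrylic acid model or its Net Flow ranking.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Stand-in for the expensive first-principles simulation (illustrative only).
def expensive_simulation(X):
    return np.sin(3 * X[:, 0]) + X[:, 1] ** 2

rng = np.random.default_rng(3)
X = rng.uniform(size=(500, 2))   # two hypothetical reactor decision variables
y = expensive_simulation(X)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
ann.fit(X_tr, y_tr)
print("surrogate R^2 on held-out samples:", round(ann.score(X_te, y_te), 3))
# A genetic algorithm can now query ann.predict thousands of times at
# negligible cost while approximating the Pareto domain.
```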
66

Parameter Estimation for Generalized Pareto Distribution

Lin, Der-Chen 01 May 1988 (has links)
The generalized Pareto distribution was introduced by Pickands (1975). Three methods of estimating the parameters of the generalized Pareto distribution were compared by Hosking and Wallis (1987): maximum likelihood, the method of moments and probability-weighted moments. An alternative method of estimation for the generalized Pareto distribution, based on least-squares regression of expected order statistics (REOS), is developed and evaluated in this thesis. A Monte Carlo comparison is made between this method and the estimation methods considered by Hosking and Wallis (1987). The REOS method is shown to be generally superior to maximum likelihood, the method of moments and probability-weighted moments.
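As an aside, the method-of-moments estimator compared in the thesis has a simple closed form, here assuming the Hosking-Wallis parameterization F(x) = 1 - (1 - k*x/sigma)^(1/k): k_hat = (mean^2/variance - 1)/2 and sigma_hat = mean * (1 + k_hat). The sketch below checks it on synthetic data; the REOS method itself is not reproduced here.

```python
import numpy as np

def gpd_method_of_moments(x):
    """Method-of-moments estimates (k_hat, sigma_hat) for the generalized
    Pareto distribution, F(x) = 1 - (1 - k*x/sigma)**(1/k), assuming the
    variance exists (k > -1/2)."""
    xbar = np.mean(x)
    s2 = np.var(x, ddof=1)
    k_hat = 0.5 * (xbar ** 2 / s2 - 1.0)
    sigma_hat = xbar * (1.0 + k_hat)
    return k_hat, sigma_hat

# Quick check on synthetic data drawn by inverse-transform sampling
# (hypothetical parameter values k = 0.2, sigma = 1.0).
rng = np.random.default_rng(0)
k, sigma = 0.2, 1.0
u = rng.uniform(size=5000)
samples = sigma / k * (1.0 - (1.0 - u) ** k)   # inverse CDF for k != 0
print(gpd_method_of_moments(samples))
```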
67

Multi-criteria decision making using reinforcement learning and its application to food, energy, and water systems (FEWS) problem

Deshpande, Aishwarya 12 1900 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Multi-criteria decision making (MCDM) methods have evolved over the past several decades. In today's world of rapidly growing industries, MCDM has proven to be significant in many application areas. In this study, a decision-making model is devised using reinforcement learning to solve multi-criteria optimization problems. A learning automata algorithm is used to identify an optimal solution in the presence of single and multiple environments (criteria) using Pareto optimality. The application of this model is also discussed, where the model provides an optimal solution to the food, energy, and water systems (FEWS) problem.
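A minimal sketch of a learning automaton of the kind referred to above, assuming the linear reward-inaction (L_R-I) update rule and a single stochastic environment with fixed reward probabilities; the action set and probabilities are hypothetical.

```python
import random

def linear_reward_inaction(reward_prob, steps=5000, a=0.05, seed=0):
    """L_R-I automaton: keep a probability vector over actions, reinforce the
    chosen action only when the environment returns a reward, and leave the
    probabilities unchanged on a penalty."""
    rng = random.Random(seed)
    n = len(reward_prob)
    p = [1.0 / n] * n
    for _ in range(steps):
        i = rng.choices(range(n), weights=p)[0]
        if rng.random() < reward_prob[i]:       # environment rewards action i
            p = [(1 - a) * pj for pj in p]
            p[i] += a                           # p_i <- p_i + a * (1 - p_i)
    return p

# Hypothetical single-environment example: action 2 is rewarded most often,
# so its selection probability should approach 1.
print(linear_reward_inaction([0.3, 0.5, 0.8]))
```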
68

A comprehensive analysis of extreme rainfall

Kagoda, Paulo Abuneeri 13 August 2008 (has links)
No description available.
69

The Vital Few Foster Mothers

Cherry, Donna J., Orme, John G. 01 September 2013 (has links)
Many foster parents serve only briefly, and foster and adopt few children. Anecdotal reports suggest that a small percentage of foster parents provide a disproportionate amount of care; however, we know virtually nothing about these parents. This study applied the Pareto Principle, also known as the 80-20 rule or the Vital Few, as a framework to conceptualize these foster parents. Using latent class analysis, two classes of mothers were identified: one accounted for 21% of mothers and the other for 79%. We refer to the former as the Vital Few and the latter as the Useful Many. Vital Few mothers fostered 73% of foster children, 10 times more than Useful Many mothers, while fostering only three times longer. They adopted twice as many foster children while experiencing half the yearly rate of placement disruptions. Vital Few mothers were less likely to work outside the home, had better parenting attitudes, more stable home environments, more time to foster, and more professional support for fostering, but less support from kin. Further, they were as competent as the Useful Many on numerous other psychosocial measures. Understanding the characteristics of these resilient Vital Few can inform recruitment and retention efforts and offer realistic expectations of foster parents.
70

Characterizing Material Property Tradeoffs of Polycrystalline Diamond for Design Evaluation and Selection

Haddock, Neil David 13 July 2009 (has links) (PDF)
Polycrystalline diamond (PCD) is used as a cutting tool in many industries because of its superior wear resistance compared to single crystal diamond. Engineers who design new PCD materials must have an understanding of the tradeoffs between material properties in order to tailor a product for different applications. Two competing material properties that are often encountered in PCD are transverse rupture strength and thermal-resistance. Thermal-resistance is directly related to the cobalt content of PCD, and is the ability of the material to withstand thermally induced degradation. In this thesis, we characterize the tradeoff boundary between transverse rupture strength and cobalt content of PCD. We also characterize the tradeoff boundary between cost and cobalt content, and show how both of these tradeoff boundaries can be used to manage product development, which adds value for managers in both engineering and business. In order to characterize these tradeoffs, empirical models are developed for each material property in terms of the design variables of sintering pressure and diamond grain size, where the pressure ranges from 55 kbar to 77 kbar and the grain size ranges from 12 μm to 70 μm in diameter. Then the models are used as optimization objectives in the normal constraint method to generate the tradeoff boundary. Finally, the tradeoff boundary is validated through additional experiments. The tradeoff boundary shows that the relationship between transverse rupture strength and cobalt content is not linear. It also shows that the optimal PCD designs can occur over a wide range of pressures and grain sizes, but pressures above 66 kbar and grain sizes between 20 and 30 μm appear to offer the best compromise between these material properties. These results are compared to the wear rates of PCD compacts in rock cutting tests. The rock cutting test results confirm that the designs with the best compromise between transverse rupture strength and cobalt content also have the highest wear resistance. In general, the designs that offer the best compromise between the properties are also the most expensive to manufacture.
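A minimal sketch of the normal constraint step on a toy convex bi-objective problem: compute the two anchor points, distribute points along the utopia line, and minimize one objective under the normal constraint at each point. The two quadratic objectives below are hypothetical stand-ins, not the fitted strength and cobalt-content models from the thesis.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical objectives standing in for the two empirical property models.
def f1(x):
    return (x[0] - 1.0) ** 2 + 0.5 * x[1] ** 2

def f2(x):
    return (x[0] + 1.0) ** 2 + 0.5 * (x[1] - 1.0) ** 2

x0 = np.zeros(2)
# 1. Anchor points: minimize each objective separately.
xa = minimize(f1, x0).x                        # minimizer of f1
xb = minimize(f2, x0).x                        # minimizer of f2
p1 = np.array([f1(xa), f2(xa)])                # anchor point of f1
p2 = np.array([f1(xb), f2(xb)])                # anchor point of f2
n = p2 - p1                                    # utopia-line direction

# 2. For evenly spaced points on the utopia line, minimize f2 subject to the
#    normal constraint n . (F(x) - xp) <= 0.
for w in np.linspace(0.0, 1.0, 11):
    xp = w * p1 + (1.0 - w) * p2
    cons = {"type": "ineq",
            "fun": lambda x, xp=xp: -np.dot(n, np.array([f1(x), f2(x)]) - xp)}
    res = minimize(f2, x0, constraints=[cons])
    print(f"f1 = {f1(res.x):6.3f}   f2 = {f2(res.x):6.3f}")
```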
