  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

Applications de techniques avancées de contrôle des procédés en industrie du semi-conducteur.

Jedidi, Nader 05 October 2009 (has links) (PDF)
This thesis concerns the development of advanced process control tools applied to the microelectronics industry. Statistical analyses identified the polysilicon gate length as the main contributor to the lot-to-lot temporal and spatial variability of the electrical performance of short-channel transistors (saturation current, leakage current, and threshold voltage). A new, better-suited regulation strategy was studied: cooperative control, which relies on a recursive identification algorithm. The performance of several online estimators was simulated and compared. A compensating regulation loop was also developed between gate etch and pocket implantation, which corrects gate-length deviation by adjusting the pocket implant dose. Its deployment in production reduced lot-to-lot dispersion by 40%.
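The lot-to-lot compensation described above is typically built on EWMA-style run-to-run control. The sketch below simulates that generic scheme; the gains, target, and disturbance values are illustrative assumptions, not the thesis's actual process parameters:

```python
def run_ewma_control(target, b_model, b_true, d_true, lam, n_runs):
    """Simulate n_runs lots under a generic EWMA run-to-run controller.

    target : desired output (e.g. gate CD)
    b_model: process gain assumed by the controller
    b_true : actual process gain
    d_true : constant process disturbance
    lam    : EWMA filter weight (0 < lam <= 1)
    """
    d_hat = 0.0                                   # disturbance estimate
    outputs = []
    for _ in range(n_runs):
        u = (target - d_hat) / b_model            # recipe for this lot
        y = b_true * u + d_true                   # measured lot output
        # EWMA update of the disturbance estimate from the model residual
        d_hat = lam * (y - b_model * u) + (1 - lam) * d_hat
        outputs.append(y)
    return outputs, d_hat
```

With a matched gain, the disturbance estimate converges geometrically (error shrinks by a factor of 1 - lam per lot), which is why the scheme suppresses lot-to-lot drift.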
42

Run-time optimization of adaptive irregular applications

Yu, Hao 15 November 2004 (has links)
Compared to traditional compile-time optimization, run-time optimization can offer significant performance improvements when parallelizing and optimizing adaptive irregular applications, because it performs program analysis and adaptive optimizations during program execution. Run-time techniques can succeed where static techniques fail because they exploit the characteristics of the input data, the program's dynamic behavior, and the underlying execution environment. When optimizing adaptive irregular applications for parallel execution, a common observation is that the effectiveness of the optimizing transformations depends on the program's input data and its dynamic phases. This dissertation presents a set of run-time optimization techniques that match the characteristics of a program's dynamic memory access patterns with the appropriate optimization (parallelization) transformations. First, we present a general adaptive algorithm selection framework that automatically and adaptively selects, at run-time, the best-performing, functionally equivalent algorithm for each of its execution instances. The selection process is based on automatically generated off-line prediction models and on characteristics (collected and analyzed dynamically) of the algorithm's input data. In this dissertation, we specialize this framework for the automatic selection of reduction algorithms. In this research, we identified a small set of machine-independent, high-level characterization parameters and then deployed an off-line, systematic experimental process to generate prediction models. These models, in turn, match the parameters to the best optimization transformations for a given machine. The technique has been evaluated thoroughly in terms of applications, platforms, and programs' dynamic behaviors. 
Specifically, for reduction algorithm selection, the selected performance is within 2% of the optimal performance and is on average 60% better than "Replicated Buffer," the default parallel reduction algorithm specified by the OpenMP standard. To reduce the overhead of speculative run-time parallelization, we developed an adaptive run-time parallelization technique that dynamically chooses efficient shadow structures to record a program's dynamic memory access patterns for parallelization. This technique complements the original speculative run-time parallelization technique, the LRPD test, in parallelizing loops with sparse memory accesses. The techniques presented in this dissertation have been implemented in an optimizing research compiler and can be viewed as effective building blocks for comprehensive run-time optimization systems, e.g., feedback-directed optimization systems and dynamic compilation systems.
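The adaptive selection idea can be illustrated roughly as follows: characterize the reduction's access pattern at run time, then dispatch to one of two functionally equivalent variants. The density threshold below is a made-up stand-in for the offline-trained prediction models, and the variant names are illustrative, not the dissertation's:

```python
def characterize(indices, size):
    """A machine-independent input characteristic: the fraction of the
    reduction array actually touched by this instance's index set."""
    return len(set(indices)) / size

def select_reduction(indices, size, density_threshold=0.5):
    """Dispatch to a reduction variant based on the measured density.
    A dense pattern favors fully replicated private buffers; a sparse one
    favors a compact shadow structure (hypothetical decision rule)."""
    density = characterize(indices, size)
    return "replicated_buffer" if density > density_threshold else "sparse_shadow"
```

In the real framework the decision rule is learned offline per machine, so the threshold (and the shape of the rule itself) would differ by platform.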
43

Labour market effects of immigration : evidence from Canada

Islam, Asadul 15 August 2003 (has links)
Immigration, the subject of repeated policy debates throughout the last two decades, has once again assumed a central position on the policy agenda. This debate has become more intense in recent years in Canada; the fear is over the potential job displacement and unemployment of Canadian-born workers, and the consequences for the Canadian economy. The fact that recent immigrants' incomes have been falling relative to those of earlier immigrants helped to trigger the current policy debate. This thesis attempts to address this debate by providing an objective assessment of the displacement of Canadian-born workers due to immigration, and of the unemployment-immigration dynamics, over the past 40 years of immigration to Canada. The thesis has two objectives.

Objective I: Job Displacement Effects of Immigration on the Canadian-born

First, I address the job displacement effects on the Canadian-born due to exogenous shifts in immigration flows. It is therefore necessary to consider the substitutability or complementarity between Canadian-born and immigrant workers. This is examined by estimating the set of wage earnings equations derived from the Generalized Leontief production function. The model specification abstracts from the role of capital by assuming that labor and capital are separable in production. I then derive the iterated Zellner-efficient estimator (IZEF), which is numerically equivalent to the maximum likelihood estimator, from the set of wage earnings equations. The degree of substitutability or complementarity is then calculated using the Hicks (as opposed to Allen's) elasticity of complementarity. The estimated Hicksian elasticities suggest that, in the aggregate, there is no displacement of Canadian-born workers by immigration, although there is some displacement by industry.

Objective II: Unemployment and Immigration Dynamics

Next, I consider immigrants not only as additions to the existing labor force but also as a source of job creation through their demand for goods and services. Here immigration is treated as endogenous, and I model the joint dynamics of unemployment and immigration. As a first step, statistical causality between immigration and unemployment is investigated. But causality methods can suffer from an omitted-variable problem, so I construct a theoretical labor market model and use cointegration analysis to determine the long-run relationship among the unemployment rate, the immigration level, the real wage, and real GDP. I then estimate the short-run dynamics with a specification in difference form, where the parameters of the cointegrating vectors from the first step are fixed and entered as an error-correction mechanism. The causality test finds no evidence of a significant effect of Canadian unemployment on immigration. The estimation of the long-run and short-run parameters indicates that no statistically significant relationship exists between unemployment and immigration.
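The two-step procedure used in Objective II — a long-run cointegrating regression in levels, then short-run dynamics in differences with the lagged residual entering as the error-correction term — can be sketched with plain least squares. The data-generating process below is synthetic and purely illustrative:

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares coefficients."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def engle_granger_ecm(y, x):
    """Two-step error-correction sketch:
    (1) level regression  y_t = a + b*x_t + u_t
    (2) difference regression  dy_t = c + g*dx_t + rho*u_{t-1} + e_t,
    where u_{t-1} is the fixed error-correction term from step (1)."""
    n = len(y)
    X1 = np.column_stack([np.ones(n), x])
    b_longrun = ols(X1, y)
    resid = y - X1 @ b_longrun                 # error-correction term
    dy, dx = np.diff(y), np.diff(x)
    X2 = np.column_stack([np.ones(n - 1), dx, resid[:-1]])
    b_shortrun = ols(X2, dy)
    return b_longrun, b_shortrun
```

A negative coefficient on the lagged residual (rho) is what signals that deviations from the long-run relationship are corrected over time.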
45

The effects of exchange rate volatility on export : Empirical study between Sweden and Germany

Mai Thi Van, Anh January 2011 (has links)
The relationship between exchange rate volatility and trade flows has been examined in a number of previous studies. This paper focuses on the impact of exchange rate volatility on export values from Sweden to Germany between 2000:01 and 2011:06. The Autoregressive Distributed Lag (ARDL) model is employed to obtain estimates of the long-run equilibrium and the short-run dynamics simultaneously. The results indicate that exchange rate volatility has significant short-run effects on export values in the majority of the estimated industries, while no meaningful long-run impact appears in any case. However, applying the "bounds test" approach, cointegration is also found in more than half of the cases, owing to the long-run impact of other factors, such as foreign income, on export earnings.
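Studies of this kind require an operational measure of exchange-rate volatility. One common proxy — among several, and not necessarily the one used in this paper — is the moving standard deviation of log exchange-rate changes:

```python
import math

def log_returns(rates):
    """Period-over-period log changes of the exchange rate series."""
    return [math.log(b / a) for a, b in zip(rates, rates[1:])]

def rolling_volatility(rates, window):
    """Moving (population) standard deviation of log exchange-rate changes,
    a common empirical proxy for exchange-rate volatility."""
    r = log_returns(rates)
    out = []
    for i in range(window - 1, len(r)):
        chunk = r[i - window + 1 : i + 1]
        m = sum(chunk) / window
        out.append(math.sqrt(sum((v - m) ** 2 for v in chunk) / window))
    return out
```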
46

An estimation of U.S. gasoline demand in the short and long run

Rayska, Tetyana January 2011 (has links)
The rapid growth of gasoline consumption in the USA over the last few decades is of great concern to scientists and policymakers, and many researchers have investigated the main factors influencing gasoline demand. In this study we estimate gasoline demand in the USA using national time-series data for the period 1984-2010. The gasoline demand function considered in this paper includes price, income, fuel efficiency, and gasoline consumption in the previous year as the main explanatory variables. The model is estimated using both simultaneous equations (2SLS) and a cointegration and error-correction model (ECM). The results of both methods show significant price and income effects on gasoline demand. Price is found to be inelastic and its impact on gasoline demand is very small; however, when we correct for the endogeneity of the price variable, we obtain a higher price elasticity. The results on income elasticity are ambiguous, since the two methods give different answers. Overall, a rise in income leads to an increase in consumption: according to the 2SLS method, gasoline demand is inelastic with respect to income in the short run but elastic in the long run, while the cointegration results indicate that the response of gasoline demand to income changes is higher in the short run than in the long run. The coefficient on the lagged error term suggests that around 57% of the adjustment between the short run and the long run occurs during the first year.
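In a specification with last year's consumption as a regressor, the long-run elasticity follows from the short-run one by a one-line formula: divide by one minus the coefficient on the lagged dependent variable. The numbers below are illustrative, not the paper's estimates (a 57% first-year adjustment corresponds to a lag coefficient of roughly 0.43):

```python
def long_run_elasticity(short_run, lag_coeff):
    """Long-run elasticity implied by a lagged-endogenous-variable model:
    beta_LR = beta_SR / (1 - theta), where theta is the coefficient on
    the previous period's consumption (so 1 - theta is the fraction of
    adjustment completed in the first period)."""
    return short_run / (1 - lag_coeff)
```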
48

Control performance assessment of run-to-run control system used in high-mix semiconductor manufacturing

Jiang, Xiaojing 04 October 2012 (has links)
Control performance assessment (CPA) is an important tool for realizing high-performance control systems in manufacturing plants. CPA of both continuous and batch processes has attracted much attention from researchers, but few results for semiconductor processes have been reported previously. This work provides methods for performance assessment and diagnosis of the run-to-run control systems used in high-mix semiconductor manufacturing processes. First, the sources of output error in processes with a run-to-run EWMA controller are analyzed, and a CPA method (CPA I) is proposed based on closed-loop parameter estimation. In CPA I, ARMAX regression is applied directly to the process output error, and the performance index is defined based on the variance of the regression results. The influence of plant-model mismatch in the process gain and the disturbance-model parameter on control performance is studied for cases with and without a set-point change, and the CPA I method is applied to diagnose plant-model mismatch in the case with a set-point change. Second, an advanced CPA method (CPA II) is developed to assess control performance degradation in the case without a set-point change. An estimated disturbance is generated by a filter, and ARMAX regression is applied to this estimated disturbance to assess control performance. The influence of plant-model mismatch, improper controller tuning, metrology delay, and high-mix process parameters is studied, and the results show that the CPA II method can quickly identify, diagnose, and correct control performance degradation. The CPA II method is applied to industrial data from a high-mix photolithography process at Texas Instruments, and the influence of metrology delay and plant-model mismatch is discussed. A control performance optimization (CPO) method based on analysis of the estimated disturbance is proposed, and an optimal EWMA controller tuning factor is suggested. 
Finally, the CPA II method is applied to a non-threaded run-to-run controller developed on the basis of state estimation and Kalman filtering. Overall process control performance and state estimation behavior are assessed, and the influence of plant-model mismatch and improper selection of controller variables is studied.
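A toy version of the variance-based performance index idea: fit a low-order model to the closed-loop error and compare the innovation (unpredictable) variance with the total error variance. This is a generic minimum-variance-style sketch, using an AR(1) fit rather than the thesis's actual ARMAX formulation:

```python
import numpy as np

def cpa_index(error):
    """Rough CPA sketch: fit an AR(1) to the closed-loop output error and
    return innovation variance / total variance. An index near 1 suggests
    the error is already close to unpredictable noise; a low index means
    predictable (hence removable) variation remains in the loop."""
    e = np.asarray(error, dtype=float)
    e = e - e.mean()
    phi = (e[1:] @ e[:-1]) / (e[:-1] @ e[:-1])   # AR(1) coefficient
    innov = e[1:] - phi * e[:-1]                 # one-step prediction errors
    return np.var(innov) / np.var(e)
```

For well-tuned control the output error should look like white noise (index near 1); plant-model mismatch or poor tuning leaves autocorrelated structure and drives the index down.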
49

Control-friendly scheduling algorithms for multi-tool, multi-product manufacturing systems

Bregenzer, Brent Constant 27 January 2012 (has links)
The fabrication of semiconductor devices is a highly competitive and capital-intensive industry. Due to the high cost of building wafer fabrication facilities (fabs), products must be made efficiently with respect to both time and material, and expensive unit operations (tools) must be utilized as much as possible. The process flow is characterized by frequent machine failures, drifting tool states, parallel processing, and reentrant flows. In addition, the competitive nature of the industry requires products to be made quickly and within tight tolerances. All of these factors conspire to make both the scheduling of product flow through the system and the control of product quality metrics extremely difficult. Much research has been done on the two problems separately, but until recently the interactions between the two systems, which can sometimes be detrimental to one another, have mostly been ignored. The research presented here tackles the scheduling problem by using objectives based on control system parameters, so that the two systems behave in a mutually beneficial manner. A non-threaded control system is used that models the multi-tool, multi-product process in state-space form and estimates the states with a Kalman filter. The process flow is modeled by a discrete-event simulation, and the two systems are merged to give a representation of the overall system. Two control system matrices, the estimate error covariance matrix from the Kalman filter and a square form of the system observability matrix called the information matrix, are used to generate several control-based scheduling algorithms. These methods are then tested against more traditional approaches from the scheduling literature to determine their effectiveness, both in how well they maintain the outputs near their targets and in how well they minimize the cycle time of products in the system. 
The two metrics are viewed simultaneously through Pareto plots, and the merits of the various scheduling methods are judged on the basis of Pareto optimality for several test cases.
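The use of the estimate error covariance for scheduling can be illustrated with a standard Kalman measurement update: each candidate dispatch decision is scored by how much its measurement would shrink the trace of the covariance, and the scheduler prefers the candidate with the largest reduction. The matrices below are illustrative, not taken from the dissertation:

```python
import numpy as np

def information_gain(P, H, R):
    """Reduction in trace(P) from one measurement with matrix H and noise
    covariance R, via the standard Kalman filter update."""
    S = H @ P @ H.T + R                         # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    P_new = (np.eye(P.shape[0]) - K @ H) @ P    # updated covariance
    return np.trace(P) - np.trace(P_new)

def pick_next(P, candidate_H, R):
    """Control-friendly dispatch sketch: choose the (tool, product)
    candidate whose measurement most shrinks estimation uncertainty."""
    gains = [information_gain(P, H, R) for H in candidate_H]
    return int(np.argmax(gains))
```

With a diagonal covariance, the rule simply routes work toward whichever tool/product state is currently most uncertain.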
50

A Prototype Software To Select And Construct Control Charts For Short Runs

Doganci, Hakan 01 October 2004 (has links) (PDF)
Small and Medium Sized Enterprises (SMEs) were founded to improve the activity and effectiveness of small industries, to meet the economic and social needs of the country, to increase the country's competitiveness, and to foster integration in industry. Under today's competitive conditions, SMEs must continuously improve themselves; otherwise, they may lose their market share. One of the major problems encountered in Turkish SMEs is poor quality practice, in particular the failure to exploit Statistical Process Control (SPC) techniques. Production runs are becoming shorter and shorter, and product variety seems to be ever increasing, which leads to short production runs. Using traditional control charts for short production runs can yield wrong and costly results. Instead of traditional control charts, short-run charts such as Difference Charts (DNOM), Zed Charts, and Zed-Star Charts should be preferred. For this purpose, software was developed that not only constructs short-run control charts but also applies run tests to them, to address the problems of SMEs. A Control Chart Selection Wizard, which emulates human expertise in finding a suitable control chart based on the user's responses for different cases, was developed and added as a subprogram. The software was tested at the Arçelik Dishwasher Plant in Ankara. The overall evaluation of the developed software by the users was satisfactory, and the software can meet some requirements of SMEs.
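A Difference Chart (DNOM) reduces each measurement to its deviation from that product's nominal, so short runs of different products can share a single chart. A minimal sketch (with a simplified overall-sigma estimate; a real DNOM chart would typically estimate sigma from moving ranges or subgroups):

```python
def dnom_points(measurements, nominals):
    """Difference-from-Nominal chart values: each (value, product) pair is
    plotted as value minus that product's nominal dimension."""
    return [x - nominals[p] for x, p in measurements]

def control_limits(points, k=3):
    """Shewhart-style k-sigma limits around the mean deviation (sketch)."""
    n = len(points)
    mean = sum(points) / n
    sd = (sum((v - mean) ** 2 for v in points) / (n - 1)) ** 0.5
    return mean - k * sd, mean + k * sd
```

Because the plotted statistic is a deviation rather than a raw value, a run of product A followed by a short run of product B contributes points to the same chart, which is what makes the method usable when no single product runs long enough to chart on its own.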
