361 |
Synchrotron electron beam control
Gayadeen, Sandira January 2014 (has links)
This thesis develops techniques for the design and analysis of controllers to achieve sub-micron accuracy in the position of electron beams for the optimal performance of synchrotrons. The techniques have been applied to Diamond Light Source, the UK's national synchrotron facility. Electron beam motion in synchrotrons is treated as a large-scale, two-dimensional process; by using basis functions, independent controllable modes of the process are identified, which allow the design to be approached in terms of a family of single-input, single-output transfer functions. In this thesis, loop shaping concepts for dynamical systems are applied in the two-dimensional frequency domain to meet closed-loop specifications. Spatial uncertainties are modelled by complex Fourier matrices, and closed-loop robust stability in the presence of spatial uncertainties is analysed within an Integral Quadratic Constraint framework. Two extensions to the unconstrained, single-actuator-array controller design are considered: the first is anti-windup augmentation to give satisfactory performance when rate limit constraints are imposed on the actuators, and the second is a strategy to account for two arrays of actuators with different dynamics. The resulting control schemes offer both stability and performance guarantees within structures that are feasible for online computation in real time.
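As a rough illustration of the modal-decoupling idea described above, the sketch below decomposes a beam-position response matrix into independent modes and closes a simple per-mode loop. It is not taken from the thesis: the matrix, dimensions and gain are invented, an SVD basis stands in for the basis functions (the thesis works with complex Fourier matrices), and a plain integral law replaces the loop-shaped SISO designs.

```python
import numpy as np

# All sizes, gains and the random "response matrix" below are illustrative only.
n_bpm, n_corr = 172, 172
rng = np.random.default_rng(0)
R = rng.standard_normal((n_bpm, n_corr))      # stand-in for a measured orbit response matrix

# A modal decomposition yields independent controllable modes: the multivariable
# problem reduces to a family of SISO loops, one per singular value.
U, s, Vt = np.linalg.svd(R, full_matrices=False)

def modal_integral_step(y_error, x_modal, gain=0.05):
    """One discrete integral-action update per mode (stand-in for loop-shaped SISO designs)."""
    e_modal = U.T @ y_error                    # project BPM errors onto the modes
    x_modal = x_modal - gain * e_modal / np.maximum(s, 1e-9)
    u = Vt.T @ x_modal                         # map modal commands back to corrector space
    return u, x_modal

# Closed-loop example: reject a static orbit disturbance.
x = np.zeros(n_corr)
d = rng.standard_normal(n_bpm)
y = d.copy()
for _ in range(200):
    u, x = modal_integral_step(y, x)
    y = d + R @ u                              # plant treated as static for simplicity
print("residual rms:", np.linalg.norm(y) / np.sqrt(n_bpm))
```

Because the modes are decoupled, each loop can be tuned and analysed as an ordinary SISO problem, which is the property the two-dimensional loop-shaping and robustness analysis in the abstract relies on.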
|
362 |
Statistical methods for quantifying uncertainty in climate projections from ensembles of climate models
Sansom, Philip George January 2014 (has links)
Appropriate and defensible statistical frameworks are required in order to make credible inferences about future climate based on projections derived from multiple climate models. It is shown that a two-way analysis of variance framework can be used to estimate the response of the actual climate, if all the climate models in an ensemble simulate the same response. The maximum likelihood estimate of the expected response provides a set of weights for combining projections from multiple climate models. Statistical F tests are used to show that the differences between the climate response of the North Atlantic storm track simulated by a large ensemble of climate models cannot be distinguished from internal variability. When climate models simulate different responses, the differences between the responses represent an additional source of uncertainty. Projections simulated by climate models that share common components cannot be considered independent. Ensemble thinning is advocated in order to obtain a subset of climate models whose outputs are judged to be exchangeable and can be modelled as a random sample. It is shown that the agreement between models on the climate response in the North Atlantic storm track is overestimated due to model dependence. Correlations between the climate responses and historical climates simulated by climate models can be used to constrain projections of future climate. It is shown that the estimate of any such emergent relationship will be biased, if internal variability is large compared to the model uncertainty about the historical climate. A Bayesian hierarchical framework is proposed that is able to separate model uncertainty from internal variability, and to estimate emergent constraints without bias. Conditional cross-validation is used to show that an apparent emergent relationship in the North Atlantic storm track is not robust. The uncertain relationship between an ensemble of climate models and the actual climate can be represented by a random discrepancy. It is shown that identical inferences are obtained whether the climate models are treated as predictors for the actual climate or vice versa, provided that the discrepancy is assumed to be symmetric. Emergent relationships are reinterpreted as constraints on the discrepancy between the expected response of the ensemble and the actual climate response, conditional on observations of the recent climate. A simple method is proposed for estimating observation uncertainty from reanalysis data. It is estimated that natural variability accounts for 30-45% of the spread in projections of the climate response in the North Atlantic storm track.
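The ANOVA-based idea of testing whether between-model differences in the response exceed internal variability can be sketched numerically. The example below is a simplified one-way version on synthetic data, not the thesis's two-way framework or its storm-track diagnostics; all numbers are invented.

```python
import numpy as np
from scipy import stats

# Illustrative only: synthetic "climate responses" (future minus historical mean)
# for an ensemble of models, several initial-condition runs per model.
rng = np.random.default_rng(1)
n_models, n_runs = 8, 5
model_means = np.full(n_models, 1.0)          # identical true responses (the null situation)
internal_sd = 0.4                             # internal variability
responses = model_means[:, None] + rng.normal(0.0, internal_sd, size=(n_models, n_runs))

# One-way ANOVA across models: can between-model differences in the response
# be distinguished from internal (within-model) variability?
F, p = stats.f_oneway(*responses)
print(f"F = {F:.2f}, p = {p:.3f}")            # large p: differences indistinguishable from noise

# Under the common-response assumption and with equal runs per model, the maximum
# likelihood estimate of the expected response is simply the grand mean.
weights = np.full(n_models, 1.0 / n_models)
expected_response = np.sum(weights * responses.mean(axis=1))
print(f"estimated common response: {expected_response:.2f}")
```

In this balanced toy case the maximum likelihood combination reduces to equal weights; with unequal numbers of runs per model the weights would no longer be equal.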
|
363 |
Minimering av slöserier och kapacitetsbegränsningar för att öka produktionskapaciteten : En fallstudie på företaget Svensson & Linnér
Karlsson, Therese, Eriksson, Rebecca January 2016 (has links)
Background: The shaping process at Svensson & Linnér is a production process that aims to change the form of the product through pressing and bending. A production process usually contains a number of wastes and capacity constraints that cannot be identified until a process mapping has been carried out. Mapping processes therefore enables companies to become aware of the wastes and capacity constraints that exist, which in turn allows them to improve their processes by eliminating lead times, waiting times and unnecessary movements. Purpose: The purpose of this thesis is to identify wastes and capacity constraints in the shaping process at Svensson & Linnér and to explain their underlying causes, and further to propose improvement measures that should be taken to increase the capacity of the process. Method: The study is a case study at the company Svensson & Linnér in which data were collected through participant and structured observations as well as unstructured and semi-structured interviews. A process mapping and calculations of capacity utilization were carried out to identify wastes and capacity constraints. Cause-and-effect diagrams were then drawn up to identify underlying causes and to propose improvement measures for increasing the capacity of the process. Conclusions: The study finds that the shaping process contains wastes in the form of unnecessary inventory, motion, transport, waiting, incorrect processing, defective products and untapped employee creativity. Calculations of capacity utilization identified the cutting and grinding robot as the capacity constraint of the process. Based on the underlying causes, five improvement proposals were identified that can increase capacity in the shaping process. Svensson & Linnér should keep only one stock of punched veneer, managed according to the FIFO principle and with rack storage of palletized goods. Faults and defects in the process should be corrected permanently rather than provisionally, and working methods should be standardized. Waste bins should be placed in the robot cells so that waste can be discarded immediately rather than at the later shift change. To increase the capacity of the constraint, the cutting and grinding robot, it is proposed that the case company adapt its capacity to product demand by using overtime in the process. / Background: The shaping process at Svensson & Linnér is a process that aims to change the shape of the product by pressing and bending. In a production process there usually exist a number of wastes or capacity constraints which cannot be identified until a process mapping is made. Mapping a process therefore enables companies to become aware of the wastes and capacity constraints that exist, and to improve their processes by eliminating lead times, waiting times and unnecessary movements. Purpose: The aim of this thesis is to identify waste and capacity constraints in the shaping process at Svensson & Linnér and to explain their underlying causes. The aim is then to propose improvements that should be made in order to increase the capacity of the shaping process. Method: The study is conducted as a case study at the company Svensson & Linnér, where data were collected through participant and structured observations and through unstructured and semi-structured interviews.
A process mapping and calculations of capacity utilization were made to identify waste and capacity constraints (a small illustrative calculation follows this abstract). Cause-and-effect diagrams were then established to identify the underlying causes and to suggest improvements that could lead to increased capacity. Conclusions: The study concludes that the shaping process contains wastes in terms of unnecessary inventory, motion, transportation, waiting, incorrect procedures, defective products and untapped creativity of the operatives. After calculations of capacity utilization in the process, the cutting and grinding robot was identified as the capacity constraint of the process. Based on the underlying causes, five suggestions for improvement have been identified that could lead to increased capacity in the shaping process. Svensson & Linnér should have only one stock for punched veneer, where the FIFO principle and rack storage of palletized goods should be used. Errors and flaws in the process should be fixed in the long term instead of provisionally, and working methods should be standardized. Dustbins should be placed in the robot cells so that waste can be disposed of immediately and not at the later shift change. To increase the capacity of the capacity constraint, the cutting and grinding robot, it is suggested that the company adjust its capacity to the demand for the product by using overtime in the process.
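The capacity-utilization logic referred to above can be illustrated with a small calculation. The sketch below is hypothetical: the stations, cycle times, demand and working hours are invented rather than taken from the study; it only shows how utilization figures point to the constraining station and how an overtime need could be estimated.

```python
# Illustrative only: the stations, times and demand below are invented, not the thesis's data.
weekly_demand = 950                      # units per week (assumed)
available_minutes = 5 * 8 * 60           # one shift, five days a week (assumed)

# Assumed cycle time per unit, in minutes, for each function in the shaping process.
cycle_time = {
    "press": 1.8,
    "bending": 2.1,
    "cutting_and_grinding_robot": 3.4,   # the constraint identified in the study
    "manual_finishing": 2.6,
}

utilization = {
    station: weekly_demand * t / available_minutes
    for station, t in cycle_time.items()
}
bottleneck = max(utilization, key=utilization.get)

for station, u in sorted(utilization.items(), key=lambda kv: -kv[1]):
    print(f"{station:28s} utilization = {u:5.1%}")

# If the bottleneck exceeds 100 %, estimate the overtime needed to meet demand,
# matching the study's suggestion of using overtime at the constraint.
excess = max(0.0, utilization[bottleneck] - 1.0)
print(f"overtime needed at {bottleneck}: about {excess * available_minutes / 60:.1f} h/week")
```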
|
364 |
USING THE AMD TAXI CHIPS IN A PARALLEL TO FIBER INTERFACE
Broffel, Robert W. 10 1900 (has links)
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada / To meet the security constraints for mission control rooms, the rooms must interface to other systems via fiber optic cable. Analog data from DAC (Digital to Analog Converter) outputs were initially brought into the rooms on copper wire. This paper outlines the conversion to fiber optic cable using the AMD TAXI chips in our Optical Digital Interface (ODI).
|
365 |
The application of the theory of constraints at Middelburg ferrochrome to determine the expansion capacity of the inbound raw material logistics system
Gardner, Jeffrey Morgan 12 1900 (has links)
Thesis (MBA)--Stellenbosch University, 2004. / ENGLISH ABSTRACT: With the demand for ferrochrome increasing, Middelburg Ferrochrome is considering various capacity expansion opportunities. One of the proposed expansions would entail the construction of a pelletising and sintering facility on the site. As part of the feasibility study, it was necessary to verify whether the inbound raw material logistics system at Middelburg Ferrochrome would be able to accommodate the additional raw material required for the pelletising and sintering facility.
The objective of this study was to verify whether the existing raw materials system is indeed capable of accommodating the additional demand placed on it by the addition of a pelletising and sintering plant at Middelburg Ferrochrome. This is analysed using the Theory of Constraints.
In the study project, an overview of the principles of the Theory of Constraints is presented. An integral part of the Theory of Constraints is the Five Focussing Step process. This process was applied to the existing inbound raw materials logistics system at MFC, in an effort to maximise the raw material processing capability of the system.
The methodology that was used to verify whether the inbound raw materials logistics system was capable of handling the additional raw materials involved the construction of a simulation model that was representative of the operation of the existing system. Measurements that were required to assess the performance of the system relative to its goal were defined. The performance of the existing raw materials system was assessed by subjecting the results of the simulation to the defined measures. The simulation model was then adapted to include the pelletising and sintering facility and the increased raw material requirements were introduced to the adapted model. The performance of the future system was then assessed in the same way as the existing system.
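As a rough illustration of the kind of simulation assessment described above, the sketch below runs a toy day-by-day model of inbound material flowing through an offloading constraint into a stockpile that feeds the plant. It is not the study's model: all tonnages, capacities, buffer sizes and variability are invented, and the real system would involve far more detail.

```python
import random

# Illustrative only: tonnages, capacities and variability are invented, not MFC data.
def simulate(days, offload_capacity_tpd, plant_demand_tpd, mean_arrivals_tpd, seed=42):
    rng = random.Random(seed)
    rail_queue = 0.0      # material waiting to be offloaded (tonnes)
    stockpile = 5_000.0   # buffer between offloading and the furnaces (tonnes)
    starved_days = 0
    for _ in range(days):
        rail_queue += rng.gauss(mean_arrivals_tpd, 0.15 * mean_arrivals_tpd)
        offloaded = min(rail_queue, offload_capacity_tpd)   # the constraint step
        rail_queue -= offloaded
        stockpile += offloaded
        consumed = min(stockpile, plant_demand_tpd)
        stockpile -= consumed
        if consumed < plant_demand_tpd:
            starved_days += 1
    return starved_days, stockpile, rail_queue

# Existing plant vs. plant plus pelletising/sintering (higher demand), and the
# effect of "elevating" the constraint by raising the offloading capacity.
for label, cap, demand in [("existing", 6_000, 5_500),
                           ("expanded", 6_000, 6_800),
                           ("expanded + elevated constraint", 7_500, 6_800)]:
    starved, _, queue = simulate(365, cap, demand, mean_arrivals_tpd=demand)
    print(f"{label:32s} starved days: {starved:3d}  backlog at siding: {queue:9.0f} t")
```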
The benefits that were derived from the application of the Five Focussing Steps were introduced to the adapted simulation model and the simulation results were then reassessed according to the defined measures. Key findings of the study were that, through the application of the Five Focussing Steps, the processing capability of the inbound raw materials logistics system could be increased. Only with the inclusion of the benefits derived from the application of the Five Focussing Steps would the inbound raw materials logistics system at Middelburg Ferrochrome be capable of meeting the additional demand placed on it by the introduction of a pelletising and sintering facility at the site. / AFRIKAANSE OPSOMMING: An increase in the demand for ferrochrome has led Middelburg Ferrochrome (MFC) to consider various capacity expansion opportunities. One of the proposed expansions involves the construction of a pelletising and sintering plant. The feasibility study for this proposal required, among other things, an analysis of the existing logistics system that handles MFC's inbound raw materials. The purpose of the analysis was to determine whether the existing logistics system would be able to handle the additional raw materials required for the pelletising and sintering plant.
The objective of this study is to determine whether the current raw materials system is able to handle the additional load of the proposed pelletising and sintering plant at Middelburg Ferrochrome. This is investigated with the aid of the Theory of Constraints.
An overview of the principles of the Theory of Constraints is given in this study. An integral part of the Theory of Constraints is the Five Focussing Steps process. In this study, the Five Focussing Steps process was applied to MFC's inbound raw material logistics system in order to maximise the raw material processing capacity of the system.
The methodology followed to determine whether the inbound raw material logistics system is able to handle the additional raw material load involved the construction of a simulation model that mimics the operating conditions of the existing system. The input values of the current logistics system were identified and fed into the simulation model in order to verify the model's accuracy. The simulation model was then adapted to include the proposed pelletising and sintering plant. The performance of the proposed system was then evaluated in the same way as the existing system.
The benefits derived from the application of the Five Focussing Steps were added to the adapted simulation model, after which the simulation results were re-evaluated. The main findings of the study are that the capacity of the inbound raw material logistics system can indeed be increased through the application of the Five Focussing Steps, and that the existing inbound raw material logistics system can handle the additional load of a pelletising and sintering facility, but only if the benefits obtained from applying the Five Focussing Steps are included.
|
366 |
Financial Credibility, Financial Constraints and Rule of Law : A quantitative study on international firms
Andersson, Daniel, Kostet, Jakob January 2016 (has links)
Reducing firms’ financial constraints can be an important element of economic growth. Previous scholars have documented various factors that affect firms’ ability to access finance (e.g. Lambert et al., 2007, p. 385). In this study, we investigate the impact of financial reporting credibility in reducing firms’ financial constraints. In addition, we study the effect that country-level rule of law has on this association. We hypothesize that financial reporting credibility decreases firms’ financial constraints, and we further propose that the ability of financial reporting credibility to reduce financial constraints weakens as country-level rule of law decreases. To the authors’ knowledge, this is the first study to investigate how the association between financial reporting credibility and financial constraints is affected by rule of law at the country level. The study uses 52,381 firms operating in 98 countries that responded to the World Bank’s Enterprise Surveys between 2006 and 2015. Financial constraints are measured through a variable that captures the perceived extent of the obstacles firms face in their current operations, and the proxy for financial credibility is whether or not a firm has been audited. Our moderating term is the World Bank’s rule of law index. Using both regression and matching analysis, we find a significant negative association between financial credibility and financial constraints, indicating that increased financial reporting credibility leads to fewer financial constraints for firms. For the moderating effect of rule of law, the results are insignificant. However, we observe that when the level of rule of law is high, increased financial credibility leads to minor improvements in access to external finance.
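A moderated regression of the kind described can be sketched as follows. The data here are simulated stand-ins for the Enterprise Surveys variables, and the variable names, coefficients and specification are illustrative assumptions, not the study's actual model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative only: synthetic firm-level data standing in for the survey variables.
rng = np.random.default_rng(7)
n = 5_000
df = pd.DataFrame({
    "audited": rng.integers(0, 2, n),        # proxy for financial reporting credibility
    "rule_of_law": rng.normal(0.0, 1.0, n),  # country-level index (standardised)
    "size": rng.normal(0.0, 1.0, n),         # a firm-level control
})
# Assumed data-generating process: credibility lowers constraints, and the effect
# is assumed weaker where rule of law is low (the hypothesised moderation).
df["fin_constraints"] = (2.0
                         - 0.6 * df["audited"]
                         - 0.2 * df["rule_of_law"]
                         - 0.3 * df["audited"] * df["rule_of_law"]
                         - 0.1 * df["size"]
                         + rng.normal(0.0, 1.0, n))

# Moderated regression: the interaction term captures how rule of law conditions
# the credibility-constraints association.
model = smf.ols("fin_constraints ~ audited * rule_of_law + size", data=df).fit(cov_type="HC1")
print(model.summary().tables[1])
```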
|
367 |
Three essays on hypotheses testing involving inequality constraints
Hsu, Yu-Chin, 1978- 21 September 2010
The focus of this research is on hypotheses testing involving inequality constraints. In the first chapter of this dissertation, we propose Kolmogorov-Smirnov type tests for stochastic dominance relations between the potential outcomes of a binary treatment under the unconfoundedness assumption. Our stochastic dominance tests compare every point of the cumulative distribution functions (CDF), so they can fully utilize all information in the distributions. For first order stochastic dominance, the test statistic is defined as the supremum of the difference of two inverse-probability-weighting estimators for the CDFs of the potential outcomes. The critical values are approximated based on a simulation method. We show that our test has good size properties and is consistent in the sense that it can detect any violation of the null hypothesis asymptotically. First order stochastic dominance tests in the treated subpopulation, and higher order stochastic dominance tests in the whole population and among the treated, are shown to share the same properties. The tests are applied to evaluate the effect of a job training program on incomes, and we find that job training has a positive effect on real earnings. Finally, we extend our tests to cases in which the unconfoundedness assumption does not hold. On the other hand, there has been a considerable amount of attention paid to testing inequality restrictions using Wald type tests. As noted by Wolak (1991), there are certain situations where it is difficult to obtain tests with correct size even asymptotically. These situations occur when the variance-covariance matrix of the functions in the constraints depends on the unknown parameters, as would be the case in nonlinear models. This dependence on the unknown parameters makes it computationally difficult to find the least favorable configuration (LFC) which can be used to bound the size of the test. In the second chapter of this dissertation, we extend Hansen's (2005) superior predictive ability (SPA) test to testing hypotheses involving general inequality constraints in which the variance-covariance matrix can be dependent on the unknown parameters. For our test we are able to obtain correct asymptotic size and test consistency without requiring knowledge of the LFC. Also, the test can be applied to a wider class of problems than considered in Wolak (1991). In the last chapter, we construct new Kolmogorov-Smirnov tests for stochastic dominance of any pre-specified order without resorting to the LFC, in order to improve the power of Barrett and Donald's (2003) tests. To do this, we first show that under the null hypothesis, if the objects being compared at a given income level are not equal, then the objects at this given income level will have no effect on the null distribution. Second, we extend Hansen's (2005) recentering method to a continuum of inequality constraints and construct a recentering function that will converge to the underlying parameter function uniformly asymptotically under the null hypothesis. We treat the recentering function as a true underlying parameter function and add it to the simulated Brownian bridge processes to simulate the critical values. We show that our tests can control the size asymptotically and are consistent. We also show that by avoiding the LFC, our tests are less conservative and more powerful than Barrett and Donald's (2003). Monte Carlo simulations support our results. We also examine the performance of our tests in an empirical example. / text
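As a rough numerical companion to the first-order stochastic dominance test described in the first essay, the sketch below builds inverse-probability-weighted CDF estimators, forms a supremum-type statistic, and simulates a critical value with a simple multiplier bootstrap. Everything here is a simplified stand-in: the data and propensity score are synthetic, the propensity score is treated as known rather than estimated, and the recentering refinements of the third essay are omitted.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2_000
x = rng.normal(size=n)                                  # a covariate
p = 1.0 / (1.0 + np.exp(-0.5 * x))                      # propensity score (assumed known here)
d = rng.binomial(1, p)                                  # treatment indicator
y = 0.3 * d + x + rng.normal(size=n)                    # outcome; treatment shifts it upward

grid = np.linspace(np.quantile(y, 0.02), np.quantile(y, 0.98), 200)

def ipw_cdf(y, d, p, treat, grid):
    """IPW estimator of the CDF of the potential outcome under treatment status `treat`."""
    w = d / p if treat else (1 - d) / (1 - p)
    return np.array([np.mean(w * (y <= t)) for t in grid])

F1, F0 = ipw_cdf(y, d, p, 1, grid), ipw_cdf(y, d, p, 0, grid)
stat = np.sqrt(n) * np.max(F1 - F0)    # H0: Y(1) first-order dominates Y(0), i.e. F1 <= F0

# Crude simulated critical value via a multiplier bootstrap of the weighted moments.
g1 = (d / p)[:, None] * (y[:, None] <= grid) - F1       # centred influence-type terms, treated
g0 = ((1 - d) / (1 - p))[:, None] * (y[:, None] <= grid) - F0

def bootstrap_stat():
    xi = rng.normal(size=n)                              # multiplier weights
    return np.max(xi @ (g1 - g0) / np.sqrt(n))

crit = np.quantile([bootstrap_stat() for _ in range(500)], 0.95)
print(f"statistic = {stat:.2f}, 95% critical value = {crit:.2f}, reject H0: {stat > crit}")
```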
|
368 |
Into the known
Inge, Courtney Lynn 16 October 2014
This report details a design process that generates new forms from mundane materials and tools. By utilizing a structure of limitations to establish artificial constraints, making becomes a sort of game in which the designer must negotiate the rules and objects in order to achieve a solution. The best results come from setting up explicit limitations on the type of manipulation permitted, establishing design objectives, and specifying the material or tool to be explored. Throughout the design process, a structure of limitations affords the designer a critical distance from the assumed uses of common materials and familiar tools, resulting in new forms and often unexpected results. Self-assigned parameters help the designer gain control over rules and constraints established by clients. / text
|
369 |
放空限制,價格發現與市場效率公正性-以台灣50ETF為例 / Short Sales Constraint, Price Discovery, Market Efficiency and Fairness - Taiwan Top50 Tracker Fund
顏珮儒, Yen, Pei Ru Unknown Date (has links)
Securities markets around the world place short-sale restrictions on short selling, but ETFs are exempt from them, because ETFs are more transparent and liquid than ordinary securities and can be managed, for trading purposes, in the same way as index futures. ETFs matter for the short-selling side of the market chiefly because their exemption from short-sale restrictions completes the market's long-short structure: investors have a channel through which to fully express the information they hold about the underlying stocks, and once the price deviates from the fund's net asset value, investors can also arbitrage through creation and redemption, so that the ETF price ultimately stays close to its net asset value and the share price remains at a reasonable level. When market prices can fully reflect both bullish and bearish information, market efficiency improves, which is a major step toward a sounder Taiwan securities market and financial liberalization; the key lies in a well-designed short-selling mechanism.
This study starts from the special advantage that the Taiwan Top50 Tracker Fund (TW50 ETF) is exempt from the rule prohibiting short sales below the previous closing price. Using actual daily and intraday trading data on the Taiwan 50 Index and the TW50 ETF, and applying the event study method to two events, the listing of the TW50 ETF and the exemption of the Taiwan 50 constituent stocks from short-sale restrictions, it examines how price overvaluation and price efficiency change before and after the short-sale restriction applies and is lifted. It also examines the effect of the period in which the restriction was lifted on the TW50 ETF, and estimates how the restriction changed the short-selling behavior of TW50 ETF short sellers before and after. Finally, since the TW50 ETF's exemption from the below-close short-sale restriction rests mainly on its sufficient liquidity and transparency, which make supervision by the regulator easy, the study further discusses how to increase the information transparency of individual stocks in the Taiwan securities market to the point where they, too, can be sold short, thereby promoting more rational share prices and a balanced development of fairness and efficiency in the Taiwan securities market. / The main research results are summarized as follows:
1. When the Taiwan 50 constituent stocks were exempted from short-sale restrictions in 2005, institutional short sellers of the TW50 ETF actively engaged in in-kind creation and redemption arbitrage, while the value and volume of their credit (margin) trading fell and became relatively less active. The main reason is that previously only the ETF could be sold short; once the 50 constituent stocks were exempted, investors had more channels through which to sell short.
2. The exemption of the constituent stocks from short-sale restrictions had a more pronounced effect on TW50 ETF return variation over short horizons.
3. After each of the two stages of relaxation of the short-sale restriction, the likelihood of overvaluation of the TW50 ETF declined; the existence of a short-selling mechanism improved the price efficiency of the ETF market, with the effect more pronounced over the short term (a three-month horizon).
4. After the further relaxation of the short-sale restriction in 2005, the proportion of TW50 ETF short-sale contracts (trading volume) did not increase following the event, indicating that short sellers did not increase their demand for such trades. In addition, we find no strong evidence that removing the restriction produces selling pressure near the market close. / To some extent, almost all stock markets place constraints on short selling; however, they do not do so for ETFs, thanks to their higher transparency and liquidity, which make them easily supervised securities. Furthermore, an ETF benefits the stock market precisely because its exemption from short-sale constraints produces a strong demand and supply structure. In other words, if the ETF price is not equal to the fund's NAV (net asset value), investors can profit by creating or redeeming ETF units; in the long run, the price reverts to its NAV and stays at a reasonable level. We can therefore infer that if prices fully reflect investors' diverse opinions, the market will become more efficient, and the Taiwan stock market can be expected to forge ahead. Hence, the key issue lies in a healthy short-selling mechanism.
This paper examines whether short-sale constraints in the Taiwan stock market have a significant influence on stock prices and price efficiency. We use Taiwan 50 Index and TW50 ETF daily and intraday data from the TEJ database, employ the event study method, and select two event dates, the listing date of the TW50 ETF and the exemption of the Taiwan 50 constituent stocks from short-selling regulations, examining the differences before and after each date. Additionally, by summarizing and analyzing the changes in trading frequency around these periods, we estimate how short sellers' trading behavior varied with the implementation of the regulation (a small event-study sketch follows the summary below). / The conclusions of this paper are summarized as follows:
1. Consistent with our assumptions, institutional investors increased their TW50 ETF purchase and buy-back (creation and redemption) transactions rather than credit transactions after the appointed 50 stocks were exempted from short-selling regulations; before the exemption, investors could only short the ETF to express a negative opinion about those 50 stocks, so their views were reflected only partially.
2. The exemption policy has a significant effect on the TW50 ETF in the short term.
3. The two stages of the exemption policy decreased the possibility of price overvaluation and increased the price efficiency of the TW50 ETF in the short term.
4. We find that the short-sale transaction ratio (and volume) decreased after the second stage of exemption, which means short sellers did not increase their transaction needs.
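The event-study sketch referred to above is given here. It is illustrative only: the return series are simulated, the event date, windows and market-model specification are assumptions, and in practice the daily TW50 ETF and Taiwan 50 Index series from TEJ would take their place.

```python
import numpy as np
import pandas as pd

# Illustrative only: a bare-bones market-model event study around a single event date.
rng = np.random.default_rng(11)
n = 320
market_ret = rng.normal(0.0003, 0.01, n)               # stand-in for Taiwan 50 Index returns
etf_ret = 0.0001 + 0.95 * market_ret + rng.normal(0, 0.004, n)
event_day = 250                                        # index of the (hypothetical) event date
etf_ret[event_day:event_day + 10] += 0.002             # injected post-event drift, for illustration

est = slice(event_day - 200, event_day - 20)           # estimation window
evt = slice(event_day - 5, event_day + 11)             # event window

# Market model fitted over the estimation window.
beta, alpha = np.polyfit(market_ret[est], etf_ret[est], 1)
abnormal = etf_ret[evt] - (alpha + beta * market_ret[evt])
car = np.cumsum(abnormal)

resid_sd = np.std(etf_ret[est] - (alpha + beta * market_ret[est]), ddof=2)
t_stat = car[-1] / (resid_sd * np.sqrt(len(abnormal)))
print(pd.DataFrame({"AR": abnormal, "CAR": car}).tail(3))
print(f"CAR over event window = {car[-1]:.4f}, t = {t_stat:.2f}")
```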
|
370 |
Symbolic analysis of scenario based timed models for component based systems : Compositionality results for testing
Bannour, Boutheina 14 June 2012 (has links) (PDF)
In this thesis, we describe how to use UML sequence diagrams with MARTE timing constraints to specify the behavior of component-based systems entirely, while abstracting as much as possible from the functional roles of the components composing them. We show how to conduct compositional analysis of such specifications. For this, we define an operational semantics for sequence diagrams by translating them into TIOSTS, which are symbolic automata with timing constraints. We use symbolic execution techniques to compute the possible executions of the system in the form of a symbolic tree. We define projection mechanisms to extract the execution tree associated with any distinguished component. The resulting projected tree characterizes the possible behaviors of the component with respect to the context of the whole system specification. As such, it represents a constraint to be satisfied by the component, and it can be used as a correctness reference to validate the system in a compositional manner. For this purpose, we ground our validation framework on testing techniques. We present compositional results relating the correctness of the system to the correctness of its components. Based on these results, we define an incremental approach for testing from sequence diagrams.
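To give a concrete flavour of the symbolic-execution and projection ideas, the sketch below builds a toy automaton with symbolic guards, unfolds it into a bounded symbolic tree, and projects the resulting paths onto one component. It is only a schematic stand-in: the data structures, guard strings and component names are invented, and a real TIOSTS treatment would use a constraint solver over data and clock variables rather than plain strings.

```python
from dataclasses import dataclass, field

# A toy stand-in for a TIOSTS: states, symbolic data/clock guards kept as strings,
# and actions tagged with the component that performs them.
@dataclass
class Transition:
    src: str
    action: str          # e.g. "Sensor!measure(x1)"
    component: str       # owning component, used later for projection
    guard: str           # symbolic constraint over data and clock variables
    dst: str

@dataclass
class Node:
    path_constraint: list
    trace: list
    children: list = field(default_factory=list)

TRANSITIONS = [
    Transition("q0", "Sensor!measure(x1)", "Sensor", "x1 >= 0", "q1"),
    Transition("q1", "Ctrl?measure(x1)", "Ctrl", "t1 <= 5", "q2"),     # MARTE-style deadline
    Transition("q2", "Ctrl!cmd(y1)", "Ctrl", "y1 == f(x1)", "q3"),
    Transition("q2", "Ctrl!alarm()", "Ctrl", "x1 > threshold", "q3"),
]

def symbolic_tree(state, depth, pc=(), trace=()):
    """Bounded symbolic execution: each branch accumulates guards as a path constraint."""
    node = Node(list(pc), list(trace))
    if depth:
        for tr in (t for t in TRANSITIONS if t.src == state):
            node.children.append(
                symbolic_tree(tr.dst, depth - 1, (*pc, tr.guard), (*trace, tr.action)))
    return node

def project(node, component):
    """Project the symbolic tree onto one component's actions (leaves only, for brevity)."""
    if not node.children:
        kept = [a for a in node.trace if a.startswith(component)]
        return [(kept, node.path_constraint)]
    return [p for c in node.children for p in project(c, component)]

tree = symbolic_tree("q0", depth=3)
for trace, pc in project(tree, "Ctrl"):
    print(trace, "under", " and ".join(pc) or "true")
```

Each projected trace, together with its accumulated path constraint, plays the role of a behaviour the component must be able to exhibit in the context of the whole specification, which is the sense in which the projected tree serves as a correctness reference for testing.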
|