521 |
Developing a Discrete Event Simulation Methodology to support a Six Sigma Approach for Manufacturing Organization - Case study. Hussain, Anees; Munive-Hernandez, J. Eduardo; Campean, Felician. 17 March 2019 (has links)
Yes / Competition in the manufacturing industry is growing at an accelerated rate due to the trend of globalization. This global competition urges manufacturing organizations to review and improve their processes in order to enhance and maintain their competitive advantage. One such initiative is the implementation of the Six Sigma methodology to analyze and reduce variation, and hence improve the processes of manufacturing organizations. This paper presents a Discrete Event Simulation methodology to support a Six Sigma approach for manufacturing organizations. Several approaches to implementing Six Sigma focus on improving time management and reducing cycle time. However, these efforts may fail in their effective and practical implementation to achieve the desired results. Following the proposed methodology, a Discrete Event Simulation model was built to assist decision makers in understanding the behavior of the current manufacturing process. This approach helps to systematically define, measure and analyze the current-state process and to test different scenarios to improve performance. The paper is amongst the first to offer a simulation methodology to support a process improvement approach. It applies an action research strategy to develop and validate the proposed modelling methodology in a British manufacturing organization competing in global markets.
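As a concrete illustration of the kind of model such a methodology produces, the following is a minimal sketch of a discrete event simulation of a single manufacturing station, assuming exponential arrival and processing times and using the SimPy library; the station, rates and shift length are illustrative placeholders, not data from the case study.

```python
import random
import simpy

RANDOM_SEED = 42
ARRIVAL_MEAN = 5.0   # assumed mean time between part arrivals (minutes)
PROCESS_MEAN = 4.0   # assumed mean processing time at the station (minutes)
SIM_TIME = 8 * 60    # simulate one 8-hour shift

cycle_times = []

def part(env, station):
    """A part arrives, queues for the station, is processed, and leaves."""
    arrive = env.now
    with station.request() as req:
        yield req                                    # wait for the machine
        yield env.timeout(random.expovariate(1.0 / PROCESS_MEAN))
    cycle_times.append(env.now - arrive)             # record total cycle time

def source(env, station):
    """Generate parts with exponential inter-arrival times."""
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
        env.process(part(env, station))

random.seed(RANDOM_SEED)
env = simpy.Environment()
station = simpy.Resource(env, capacity=1)            # one machine at the station
env.process(source(env, station))
env.run(until=SIM_TIME)

print(f"parts completed: {len(cycle_times)}")
print(f"mean cycle time: {sum(cycle_times) / len(cycle_times):.1f} min")
```

Rerunning the same model under alternative scenarios (extra capacity, reduced processing variation) and comparing cycle times is the kind of what-if analysis the current-state model supports in the define-measure-analyze cycle described above.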
|
522 |
Adaptive Sampling Line Search for Simulation Optimization. Ragavan, Prasanna Kumar. 08 March 2017 (has links)
This thesis is concerned with the development of algorithms for simulation optimization (SO), a special case of stochastic optimization where the objective function can only be evaluated through noisy observations from a simulation. Deterministic techniques, when directly applied to simulation optimization problems, fail to converge because of their inability to handle randomness, thus requiring sophisticated algorithms. However, many existing algorithms dedicated to simulation optimization often show poor performance in implementation, as they require extensive parameter tuning.
To overcome these shortfalls of existing SO algorithms, we develop ADALINE, a line search based algorithm that eliminates the need for any user-defined parameters. ADALINE is designed to identify a local minimum on continuous and integer-ordered feasible sets. On a continuous feasible set ADALINE mimics deterministic line search algorithms, while on integer-ordered feasible sets it iterates between a line search and an enumeration procedure in its quest to identify a local minimum. ADALINE improves upon many of the existing SO algorithms by determining the sample size adaptively as a trade-off between the error due to estimation and the optimization error; that is, the algorithm expends simulation effort proportional to the quality of the incumbent solution. We also show that ADALINE converges "almost surely" to the set of local minima. Finally, our numerical results suggest that ADALINE converges to a local minimum faster, outperforming other advanced SO algorithms that utilize variable sampling strategies.
To demonstrate the performance of our algorithm on a practical problem, we apply ADALINE to a surgery rescheduling problem. In the rescheduling problem, the objective is to minimize the cost of disruptions to an existing schedule shared between multiple surgical specialties while accommodating semi-urgent surgeries that require expedited intervention. The disruptions to the schedule are determined using a threshold-based heuristic, and ADALINE identifies the threshold levels for the various surgical specialties that minimize the expected total cost of disruption. A comparison of the solutions obtained using a Sample Average Approximation (SAA) approach and ADALINE is provided. We find that the adaptive sampling strategy in ADALINE identifies a better solution more quickly than SAA. / Ph. D. / This thesis is concerned with the development of algorithms for simulation optimization (SO), where the objective function does not have an analytical form and can only be estimated through noisy observations from a simulation. Deterministic techniques, when directly applied to simulation optimization problems, fail to converge because of their inability to handle randomness, thus requiring sophisticated algorithms. However, many existing algorithms dedicated to simulation optimization often show poor performance in implementation, as they require extensive parameter tuning.
To overcome these shortfalls of existing SO algorithms, we develop ADALINE, a line search based algorithm that minimizes the need for user-defined parameters. ADALINE is designed to identify a local minimum on continuous and integer-ordered feasible sets. On continuous feasible sets ADALINE mimics deterministic line search algorithms, while on integer-ordered feasible sets it iterates between a line search and an enumeration procedure in its quest to identify a local minimum. ADALINE improves upon many of the existing SO algorithms by determining the sample size adaptively as a trade-off between the error due to estimation and the optimization error; that is, the algorithm expends simulation effort proportional to the quality of the incumbent solution. Finally, our numerical results suggest that ADALINE converges to a local minimum faster than the best available SO algorithm for the purpose.
To demonstrate the performance of our algorithm on a practical problem, we apply ADALINE to a surgery rescheduling problem. In the rescheduling problem, the objective is to minimize the cost of disruptions to an existing schedule shared between multiple surgical specialties while accommodating semi-urgent surgeries that require expedited intervention. The disruptions to the schedule are determined using a threshold-based heuristic, and ADALINE identifies the threshold levels for the various surgical specialties that minimize the expected total cost of disruption. A comparison of the solutions obtained using traditional optimization techniques and ADALINE is provided. We find that the adaptive sampling strategy in ADALINE identifies a better solution more quickly than traditional optimization.
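As a rough illustration of the adaptive sampling idea (not the actual ADALINE algorithm), the following sketch minimizes a one-dimensional noisy objective with a simple line search in which the number of simulation replications at each candidate point grows until the standard error of the estimate is small; the objective, tolerances and step rule are invented for the example.

```python
import math
import random

def noisy_obj(x, rng):
    """One simulation replication of the objective: (x - 2)^2 plus noise."""
    return (x - 2.0) ** 2 + rng.gauss(0.0, 0.5)

def adaptive_estimate(x, rng, tol, n0=10, n_max=2000):
    """Average replications at x, growing the sample until the standard
    error of the mean falls below tol (or the budget n_max is hit)."""
    samples = [noisy_obj(x, rng) for _ in range(n0)]
    while len(samples) < n_max:
        mean = sum(samples) / len(samples)
        var = sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)
        if math.sqrt(var / len(samples)) <= tol:
            break
        samples.append(noisy_obj(x, rng))
    return sum(samples) / len(samples)

def noisy_line_search(x0, step0=1.0, iters=30, seed=1):
    rng = random.Random(seed)
    x, step = x0, step0
    fx = adaptive_estimate(x, rng, tol=0.05)
    for _ in range(iters):
        improved = False
        for cand in (x - step, x + step):
            # Demand tighter estimates as the step (and expected gain) shrinks.
            f_cand = adaptive_estimate(cand, rng, tol=max(0.01, 0.1 * step))
            if f_cand < fx:
                x, fx, improved = cand, f_cand, True
                break
        if not improved:
            step *= 0.5        # no improvement: shrink the step and retry
    return x, fx

x_star, f_star = noisy_line_search(x0=10.0)
print(f"estimated minimizer: {x_star:.3f}, estimated value: {f_star:.3f}")
```

The point of the toy rule is the same trade-off the abstract describes: little simulation effort is spent while the incumbent is poor, and the estimates are tightened only as the search closes in on a minimum.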
|
523 |
Alzheimer’s Detection With The Discrete Wavelet Transform And Convolutional Neural Networks. Nardone, Melissa N. 01 December 2022 (has links) (PDF)
Alzheimer’s disease slowly destroys an individual’s memory, and it is estimated to impact more than 5.5 million Americans. Over time, Alzheimer’s disease can cause behavior and personality changes. Current diagnosis techniques are challenging because individuals may show no clinical signs of the disease in the initial stages. As of today, there is no cure for Alzheimer’s. Therefore, symptom management is key, and it is critical that Alzheimer’s is detected early, before major cognitive damage occurs.
The approach implemented in this thesis explores the idea of using the Discrete Wavelet Transform (DWT) and Convolutional Neural Networks (CNN) for Alzheimer’s detection. The neural network is trained and tested using Magnetic Resonance Imaging (MRI) brain scans from the ADNI1 (Alzheimer’s Disease Neuroimaging Initiative) dataset, and various mother wavelets and network hyperparameters are explored to identify the optimal model. The resulting model can distinguish patients with mild Alzheimer’s disease (AD) from those who are cognitively normal (NL) with an average accuracy of 77.53±2.37%, an F1-score of 77.03±3.24%, a precision of 80.63±11.03%, a recall (sensitivity) of 77.90±11.52%, and a specificity of 77.53±2.37%.
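A minimal sketch of the DWT-plus-CNN pipeline described above, assuming an MRI slice is available as a 2D NumPy array; the Haar mother wavelet, the tiny network architecture, and the PyWavelets/PyTorch combination are illustrative choices rather than the configuration tuned in the thesis.

```python
import numpy as np
import pywt
import torch
import torch.nn as nn

def dwt_features(slice_2d, wavelet="haar"):
    """Single-level 2D DWT; stack approximation and detail sub-bands
    as channels so the CNN sees a 4-channel half-resolution image."""
    cA, (cH, cV, cD) = pywt.dwt2(slice_2d, wavelet)
    return np.stack([cA, cH, cV, cD]).astype(np.float32)

class SmallCNN(nn.Module):
    """Toy classifier: mild AD vs. cognitively normal (2 classes)."""
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(4, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)

# Example with a synthetic 128x128 "slice" standing in for a real ADNI image.
slice_2d = np.random.rand(128, 128)
x = torch.from_numpy(dwt_features(slice_2d)).unsqueeze(0)  # shape (1, 4, 64, 64)
model = SmallCNN()
logits = model(x)
print(logits.shape)  # torch.Size([1, 2])
```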
|
524 |
The Treatment Effect of the City Connects Intervention on Exiting Limited English Proficiency Status. Akbayin, Bercem. January 2017 (has links)
Thesis advisor: Henry I. Braun / The City Connects intervention is motivated by the belief that out-of-school factors act as barriers to student thriving in cognitive and non-cognitive domains. It seeks to address these barriers first by identifying each student’s strengths and needs and then by providing a tailored set of prevention, intervention, and enrichment programs. Underlying the program is the assumption that provision of high-quality resources and individualized services will enable children to be cognitively, socio-emotionally, and physically prepared to thrive in school. This study’s purpose was to estimate the effects of the City Connects intervention on English learners’ (EL) likelihood of exiting Limited English Proficiency (LEP) status. ELs comprise one of the student subpopulations most at risk of failing academically, and exposure to the program was hypothesized to improve their likelihood of exiting LEP status earlier than otherwise. A series of one- and two-level discrete-time event history analyses were conducted on the main analytic sample as well as two sub-samples. As participation in City Connects is at the school level, school-level matching was used for sub-samples 1 and 2, and propensity score weights were applied at the student level for all three samples. Additionally, hazard probabilities, survival probabilities, cumulative hazard rates, and median lifetimes were estimated. Lastly, a sensitivity analysis was conducted to examine whether effects were robust to unobserved selection bias. The results indicated that ELs participating in the City Connects intervention were significantly more likely to exit LEP status earlier than their peers in comparison schools. The median time in LEP status in City Connects schools was shorter and translated into a gain of at least one half of a year in grade in mainstream classes. Also, all the fitted models indicated that approximately 10 percent more City Connects students exited LEP status by the end of fifth grade than comparison students. Findings highlight the impact of the City Connects intervention, as ELs entering mainstream classes earlier could translate into important academic and non-academic gains, such as improved academic achievement and increased self-confidence. / Thesis (PhD) — Boston College, 2017. / Submitted to: Boston College. Lynch School of Education. / Discipline: Educational Research, Measurement and Evaluation.
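A minimal sketch of a discrete-time event history (hazard) model of the kind used in the study, fitted on synthetic person-period data; the column names (`grade`, `treated`, `psw`, `exited`), the weighting via `freq_weights`, and the single treatment indicator are illustrative assumptions, not the study's actual specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Illustrative person-period data: one row per student per grade at risk.
rng = np.random.default_rng(0)
records = []
for student in range(500):
    treated = int(rng.integers(0, 2))
    psw = rng.uniform(0.5, 2.0)              # stand-in propensity score weight
    for grade in range(1, 6):                # grades 1 through 5
        p_exit = 0.10 + 0.05 * grade + 0.08 * treated
        exited = int(rng.random() < p_exit)
        records.append(dict(student=student, grade=grade,
                            treated=treated, psw=psw, exited=exited))
        if exited:
            break                            # student leaves the risk set
df = pd.DataFrame(records)

# Discrete-time hazard model: logit of exiting in a grade, given survival so far,
# with grade as a categorical time indicator and a stand-in weighting scheme.
model = smf.glm("exited ~ C(grade) + treated", data=df,
                family=sm.families.Binomial(),
                freq_weights=df["psw"]).fit()
print(model.summary().tables[1])

# Hazard and survival probabilities by grade for the treated group.
grid = pd.DataFrame({"grade": range(1, 6), "treated": 1})
hazard = model.predict(grid)
survival = (1 - hazard).cumprod()
print(pd.DataFrame({"grade": grid["grade"], "hazard": hazard, "survival": survival}))
```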
|
525 |
A l'intersection de la combinatoire des mots et de la géométrie discrète : palindromes, symétries et pavages / At the intersection of combinatorics on words and discrete geometry: palindromes, symmetries and tilings. Blondin Massé, Alexandre. 02 December 2011 (has links)
Dans cette thèse, différents problèmes de la combinatoire des mots et de géométrie discrète sont considérés. Nous étudions d'abord l'occurrence des palindromes dans les codages de rotations, une famille de mots incluant entre autres les mots sturmiens et les suites de Rote. En particulier, nous démontrons que ces mots sont pleins, c'est-à-dire qu'ils réalisent la complexité palindromique maximale. Ensuite, nous étudions une nouvelle famille de mots, appelés mots pseudostandards généralisés, qui sont générés à l'aide d'un opérateur appelé clôture pseudopalindromique itérée. Nous présentons entre autres une généralisation d'une formule décrite par Justin qui permet de générer de façon linéaire et optimale un mot pseudostandard généralisé. L'objet central, le f-palindrome ou pseudopalindrome est un indicateur des symétries présentes dans les objets géométriques. Dans les derniers chapitres, nous nous concentrons davantage sur des problèmes de nature géométrique. Plus précisément, nous donnons la solution à deux conjectures de Provençal concernant les pavages par translation, en exploitant la présence de palindromes et de périodicité locale dans les mots de contour. À la fin de plusieurs chapitres, différents problèmes ouverts et conjectures sont brièvement présentés. / In this thesis, we explore different problems at the intersection of combinatorics on words and discrete geometry. First, we study the occurrences of palindromes in codings of rotations, a family of words including the famous Sturmian words and Rote sequences. In particular, we show that these words are full, i.e. they realize the maximal palindromic complexity. Next, we consider a new family of words called generalized pseudostandard words, which are generated by an operator called iterated pseudopalindromic closure. We present a generalization of a formula described by Justin which allows one to generate in linear (thus optimal) time a generalized pseudostandard word. The central object, the f-palindrome or pseudopalindrome, is an indicator of the symmetries in geometric objects. In the last chapters, we focus on geometric problems. More precisely, we solve two conjectures of Provençal about tilings by translation, by exploiting the presence of palindromes and local periodicity in boundary words. At the end of many chapters, different open problems and conjectures are briefly presented.
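A small sketch of the ordinary iterated palindromic closure underlying these constructions, assuming the classical definition in which Pal(wa) is the shortest palindrome having Pal(w)a as a prefix; the generalized pseudostandard words of the thesis additionally involve an antimorphic involution, which this toy code does not implement.

```python
def palindromic_closure(w):
    """Shortest palindrome having w as a prefix: find the longest
    palindromic suffix of w and mirror whatever precedes it."""
    for i in range(len(w)):
        if w[i:] == w[i:][::-1]:
            return w + w[:i][::-1]
    return w

def iterated_palindromic_closure(directive):
    """Pal(epsilon) = epsilon; Pal(wa) is the palindromic closure of Pal(w)a."""
    result = ""
    for letter in directive:
        result = palindromic_closure(result + letter)
    return result

# The directive word (ab)^n yields palindromic prefixes of the Fibonacci word.
print(iterated_palindromic_closure("ababa"))
# -> abaababaabaababaaba (a palindromic prefix of the Fibonacci word)
```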
|
526 |
Reconstruction Tomographique Mojette. Servieres, Myriam. 07 December 2005 (has links) (PDF)
One of the research topics of the Image and Video-Communication team is discrete tomographic reconstruction using the Mojette transform. My thesis falls within the scope of medical tomographic reconstruction. The Mojette transform is an exact discrete version of the Radon transform, which is the mathematical tool underlying tomographic reconstruction. To assess the quality of the reconstructions, we used simple 2D numerical phantoms (square and round objects), first in the absence and then in the presence of noise. The core of my thesis work is the reconstruction of an object with an exact Mojette filtered back-projection algorithm in the absence of noise, based on discrete geometry. For a finite number of projections, which depends on the size of the object to be reconstructed, the reconstruction is exact. Most industrial tomographs use the Filtered Back Projection (FBP) algorithm to reconstruct the region of interest. This algorithm has two theoretical flaws, one in the filter used and the other in the back-projection itself. We were able to develop a Mojette FBP algorithm. This algorithm belongs to the family of direct reconstruction methods. It has also been tested successfully in the presence of noise. It provides a continuous-discrete equivalence during reconstruction. The Mojette projection/back-projection step has the interesting property of being describable by a Toeplitz block Toeplitz matrix. To exploit this property we implemented a conjugate gradient algorithm.
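A small sketch of the Dirac Mojette projection on which the transform is built, assuming the usual convention that for a coprime direction (p, q) the pixel (k, l) contributes to bin b = -qk + pl; the Mojette FBP and conjugate gradient reconstructions discussed above are not reproduced here.

```python
import numpy as np
from math import gcd

def mojette_projection(image, p, q):
    """Dirac Mojette projection of a 2D array along direction (p, q)."""
    assert gcd(abs(p), abs(q)) == 1, "(p, q) must be coprime"
    rows, cols = image.shape            # k indexes columns, l indexes rows
    bins = {}
    for l in range(rows):
        for k in range(cols):
            b = -q * k + p * l          # bin index receiving pixel (k, l)
            bins[b] = bins.get(b, 0.0) + image[l, k]
    # Return the projection as a dense vector ordered by bin index.
    b_min, b_max = min(bins), max(bins)
    proj = np.zeros(b_max - b_min + 1)
    for b, v in bins.items():
        proj[b - b_min] = v
    return proj

# Example: a 4x4 "square phantom" and two projection directions.
phantom = np.ones((4, 4))
print(mojette_projection(phantom, 1, 0))  # (1, 0): one bin per row, each summing 4
print(mojette_projection(phantom, 1, 1))  # (1, 1): diagonal sums 1, 2, 3, 4, 3, 2, 1
```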
|
527 |
Combinatorial Considerations on Two Models from Statistical Mechanics. Thapper, Johan. January 2007 (has links)
Interactions between combinatorics and statistical mechanics have provided many fruitful insights in both fields. A compelling example is Kuperberg’s solution to the alternating sign matrix conjecture, and its subsequent generalisations. In this thesis we investigate two models from statistical mechanics which have received attention in recent years. The first is the fully packed loop model. A conjecture from 2001 by Razumov and Stroganov opened the field for a large ongoing investigation of the O(1) loop model and its connections to a refinement of the fully packed loop model. We apply a combinatorial bijection originally found by de Gier to an older conjecture made by Propp. The second model is the hard particle model. Recent discoveries by Fendley et al. and results by Jonsson suggest that the hard square model with cylindrical boundary conditions possesses some beautiful combinatorial properties. We apply both topological and purely combinatorial methods to related independence complexes to try to gain a better understanding of this model.
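As a small illustration of the hard square (hard particle) model on a cylinder mentioned above, the following transfer-matrix-style sketch counts admissible configurations, assuming the standard constraint that no two particles occupy orthogonally adjacent sites and that the lattice wraps around its circumference; the combinatorial structure studied in the thesis goes well beyond this counting.

```python
from itertools import product

def row_states(width):
    """Occupation patterns of one ring of the cylinder (circumference `width`)
    with no two adjacent occupied sites, including the wrap-around pair."""
    states = []
    for bits in product((0, 1), repeat=width):
        if all(not (bits[i] and bits[(i + 1) % width]) for i in range(width)):
            states.append(bits)
    return states

def count_hard_squares(width, length):
    """Number of hard-square configurations on a width x length cylinder
    (periodic around the width, free along the length)."""
    states = row_states(width)
    counts = {s: 1 for s in states}            # configurations of one ring
    for _ in range(length - 1):
        new_counts = {s: 0 for s in states}
        for s, c in counts.items():
            for t in states:
                # Consecutive rings may not both occupy the same site.
                if all(a * b == 0 for a, b in zip(s, t)):
                    new_counts[t] += c
        counts = new_counts
    return sum(counts.values())

for n in range(1, 6):
    print(n, count_hard_squares(width=4, length=n))
```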
|
528 |
A simulation framework for the analysis of reusable launch vehicle operations and maintenance. Dees, Patrick Daniel. 26 July 2012 (has links)
During development of a complex system, feasibility initially overshadows other concerns, in some cases leading to a design which may not be viable long-term. In particular, for Reusable Launch Vehicles (RLVs), Operations & Maintenance comprises the majority of the vehicle's life cycle cost (LCC), and its stochastic nature precludes direct analysis. Through the use of simulation, however, probabilistic methods can provide estimates of the economic behavior of such a system as it evolves over time. Here the problem of operations optimization is examined through the use of discrete event simulation. The resulting tool, built from the lessons learned in the literature review, simulates an RLV or a fleet of vehicles undergoing maintenance, together with the maintenance sites they visit, as the campaign evolves over a period of time. The goal of this work is to develop a method for uncovering an optimal operations scheme by investigating the effect of maintenance technician skillset distributions on important metrics such as the achievable annual flight rate and the maintenance man-hours spent on each vehicle per flight. Using these metrics, the availability of technicians for each subsystem is optimized to levels which produce the greatest revenue from flights and the minimum expenditure on maintenance.
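A deliberately simplified sketch of the kind of trade study described above, assuming turnaround is driven by per-subsystem work content divided among the technicians assigned to that subsystem; the subsystem names, man-hour figures, crew limit and noise model are illustrative placeholders, not values from the framework.

```python
import random
from itertools import product

SUBSYSTEMS = {"TPS": 400.0, "engines": 300.0, "avionics": 120.0}  # assumed man-hours/flight
TOTAL_TECHS = 12
HOURS_PER_YEAR = 2000.0

def simulate_flight_rate(allocation, n_flights=200, seed=0):
    """Average annual flight rate for one vehicle under a given allocation
    of technicians, with lognormal noise on each subsystem's work content."""
    rng = random.Random(seed)                 # common random numbers across allocations
    total_hours = 0.0
    for _ in range(n_flights):
        turnaround = 0.0
        for name, base in SUBSYSTEMS.items():
            work = base * rng.lognormvariate(0.0, 0.25)            # noisy man-hours
            turnaround = max(turnaround, work / allocation[name])  # crews work in parallel
        total_hours += turnaround
    mean_turnaround = total_hours / n_flights
    return HOURS_PER_YEAR / mean_turnaround   # flights per year

best = None
for alloc in product(range(1, TOTAL_TECHS), repeat=len(SUBSYSTEMS)):
    if sum(alloc) != TOTAL_TECHS:
        continue
    allocation = dict(zip(SUBSYSTEMS, alloc))
    rate = simulate_flight_rate(allocation)
    if best is None or rate > best[1]:
        best = (allocation, rate)

print("best allocation:", best[0], f"-> {best[1]:.1f} flights/year")
```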
|
529 |
A Study On Bandpassed Speech From The Point Of Intelligibility. Ganesh, Murthy C N S. 10 1900 (has links)
Speech has been a subject of interest for a very long time. Even with so much advancement in processing techniques and in the understanding of the source of speech, it is, even today, rather difficult to generate speech in the laboratory in all its aspects. A simple aspect, such as how speech retains its intelligibility even when it is distorted or bandpassed, is not really understood. This thesis deals with one small feature of speech, namely that the intelligibility of speech is retained even when it is bandpassed with a minimum bandwidth of around 1 kHz located anywhere in the speech spectrum of 0-4 kHz.
Several experiments have been conducted by earlier workers by passing speech through various distorting operations such as differentiators, integrators and infinite peak clippers, and it is found that intelligibility is retained to a very large extent in the distorted speech. The integrator and the differentiator essentially remove a certain portion of the spectrum. It is therefore thought that the intelligibility of speech is spread over the entire speech spectrum, and that it may not be impaired even when the speech is bandpassed with a minimum bandwidth, with the band located anywhere in the speech spectrum. To test this idea and establish this feature, if it exists, preliminary experiments were conducted by passing speech through different filters, and the results suggest that the conjecture is on the right lines.
To carry out systematic experiments, an experimental set-up has been designed and fabricated which consists of a microprocessor-controlled speech recording, storage and playback system. A personal computer is coupled to the microprocessor system to enable storage and processing of the data. Thirty persons drawn from different walks of life, such as teachers, mechanics and students, were involved in providing the speech samples and in recognizing the content of the processed speech. Although sentences like 'This is devices lab' are used to ascertain the effect of bandwidth on intelligibility, vowels are used as the speech samples for the purpose of analysis.
The experiments essentially consist of recording words and sentences spoken by the 30 participants; these recorded speech samples are passed through filters with different bandwidths and central frequencies. The filtered output is played back to the various listeners, and observations regarding the intelligibility of the speech are noted. The listeners have no prior information about the content of the speech. It has been found that in almost all (95%) cases the messages or words are intelligible to most of the listeners when the bandwidth of the filter is about 1 kHz, and that this is independent of the location of the pass band in the spectrum of 0-4 kHz. To understand how this feature of speech arises, the spectra of vowels spoken by the 30 participants have been computed using FFT algorithms on the digitized speech samples.
A cyclic behaviour is observed in the spectra of all the samples. To confirm that this periodicity is present, and to estimate it, a moving-average procedure is employed to smooth the spectrum. The smoothed spectra of all the vowels indeed show a periodicity of about 1 kHz; when the periodicities are analysed, their average value is found to be 1038 Hz with a standard deviation of 19 Hz. In view of this, it is thought that the acoustic source responsible for speech must have generated this periodic spectrum, which might then have been modified periodically to imprint the intelligibility. If this is true, one can perhaps easily understand the feature of speech described above, namely that intelligibility is retained in bandpassed speech of bandwidth 1 kHz with the pass band located anywhere in the speech spectrum of 0-4 kHz.
This thesis, describing the experiments and the analysis of the speech, is presented in five chapters. Chapter 1 deals with the basics of speech and the processing tools used to analyse the speech signal. Chapter 2 presents the literature survey from which the present problem is derived. Chapter 3 describes the structure and fabrication of the experimental set-up that has been used. Chapter 4 gives a detailed account of the way in which the experiments are conducted and the speech is analysed. In conclusion, Chapter 5 summarises the work and suggests the future work needed to establish the mechanism of speech responsible for the feature described in this thesis.
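A short sketch of the processing chain described above, assuming a vowel recording is available as a NumPy array sampled at 8 kHz; the synthetic vowel, band edges, filter order and smoothing window are illustrative stand-ins for the recorded samples and settings used in the thesis.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 8000                                   # sampling rate (Hz), assumed
t = np.arange(0, 0.5, 1 / fs)
# Stand-in "vowel": harmonics of a 125 Hz pitch instead of a real recording.
vowel = sum(np.sin(2 * np.pi * 125 * k * t) / k for k in range(1, 25))

# 1) Band-pass the speech to a 1 kHz wide band placed inside 0-4 kHz.
low, high = 1500.0, 2500.0                  # pass band (Hz), illustrative
b, a = butter(4, [low, high], btype="bandpass", fs=fs)
bandpassed = filtfilt(b, a, vowel)

# 2) Magnitude spectrum of the (unfiltered) vowel via the FFT.
spectrum = np.abs(np.fft.rfft(vowel))
freqs = np.fft.rfftfreq(len(vowel), 1 / fs)

# 3) Moving-average smoothing of the spectrum to expose slow periodic structure.
win = 50                                    # smoothing window in FFT bins
smoothed = np.convolve(spectrum, np.ones(win) / win, mode="same")

print("band-passed RMS:", np.sqrt(np.mean(bandpassed ** 2)))
print("spectral resolution:", freqs[1], "Hz per bin")
```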
|