31 |
Petroleum refinery scheduling with consideration for uncertainty
Hamisu, Aminu Alhaji 07 1900 (has links)
Scheduling refinery operations promises a big cut in logistics cost, maximizes efficiency, organizes the allocation of material and resources, and ensures that production meets the targets set by the planning team. Obtaining accurate and reliable schedules for execution in refinery plants under different scenarios has been a serious challenge. This research was undertaken with the aim of developing robust methodologies and solution procedures to address refinery scheduling problems with uncertainties in process parameters.
The research goal was achieved by first developing a methodology for short-term crude oil unloading and transfer as an extension to the scheduling model reported by Lee et al. (1996). The extended model considers real-life technical issues not captured in the original model and was shown through case studies to be more reliable. Uncertainties due to disruptive events and low inventory at the end of the scheduling horizon were addressed. With the extended model, the crude oil scheduling problem was formulated under a receding horizon control framework to address demand uncertainty. This work proposed a strategy called fixed end horizon, whose performance was investigated and found to be better than an existing approach.
In the main refinery production area, a novel scheduling model was developed. A large-scale refinery problem was used as a case study to test the model, with the scheduling horizon discretized into a number of time periods of variable length. An equivalent formulation with equal interval lengths was also presented and compared with the variable-length formulation. The results obtained clearly show the advantage of using variable timing. A methodology under a self-optimizing control (SOC) framework was then developed to address uncertainty in problems involving mixed integer formulations. Through a case study and scenarios, the approach proved efficient in dealing with uncertainty in crude oil composition.
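As an illustration of the receding horizon idea described above, the following minimal sketch implements a fixed end horizon loop in Python. It is only a sketch under assumed interfaces: solve_schedule stands in for the crude oil scheduling model, and get_forecast and apply_decision stand in for demand updates and plant execution; none of these names come from the thesis.

    # Minimal fixed end horizon scheduling loop (illustrative sketch only).
    # solve_schedule is a placeholder for the MILP crude oil scheduling model.
    def solve_schedule(start, end, inventory, forecast):
        # Return one decision per remaining period; a real model would solve an MILP here.
        return [{"period": t, "transfer": forecast.get(t, 0.0)} for t in range(start, end)]

    def fixed_end_horizon_loop(horizon_end, inventory, get_forecast, apply_decision):
        t = 0
        while t < horizon_end:
            forecast = get_forecast(t, horizon_end)         # updated demand information
            plan = solve_schedule(t, horizon_end, inventory, forecast)
            inventory = apply_decision(plan[0], inventory)  # implement only the first period
            t += 1                                          # window start advances; the end stays fixed
        return inventory

In contrast, a moving end horizon would shift both the start and the end of the window forward at each step; the sketch above assumes the fixed end variant keeps the terminal time anchored while the window shrinks.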
|
32 |
Autonomic Programming Paradigm for High Performance Computing
Jararweh, Yaser January 2010 (has links)
The advances in computing and communication technologies and software tools have resulted in an explosive growth in networked applications and information services that cover all aspects of our life. These services and applications are inherently complex, dynamic and heterogeneous. In a similar way, the underlying information infrastructure, e.g. the Internet, is large, complex, heterogeneous and dynamic, globally aggregating large numbers of independent computing and communication resources. The combination of the two results in application development and management complexities that break current computing paradigms, which are based on static behaviors. As a result, applications, programming environments and information infrastructures are rapidly becoming fragile, unmanageable and insecure. This has led researchers to consider alternative programming paradigms and management techniques based on strategies used by biological systems. The autonomic programming paradigm is inspired by the human autonomic nervous system, which handles complexity, uncertainty and abnormality. The overarching goal of the autonomic programming paradigm is to help build systems and applications capable of self-management.
First, we investigated large-scale scientific computing applications, which generally experience different execution phases at run time, each with different computational, communication and storage requirements as well as different physical characteristics. In this dissertation, we present the Physics Aware Optimization (PAO) paradigm, which enables programmers to identify the appropriate solution methods to exploit the heterogeneity and dynamism of the application execution states, and we implement a Physics Aware Optimization Manager to exploit the PAO paradigm. We also present a self-configuration paradigm based on the principles of autonomic computing that can efficiently handle complexity, dynamism and uncertainty in configuring server and networked systems and their applications. Our approach is based on making any resource or application operate as an Autonomic Component (i.e., a self-managed component) by using our autonomic programming paradigm. Our PAO technique for a medical application yielded about a 3X performance improvement with 98.3% simulation accuracy compared to traditional performance optimization techniques. In addition, our self-configuration management for power and performance in a GPU cluster demonstrated 53.7% power savings for a CUDA workload while maintaining the cluster performance within given acceptable thresholds.
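The self-managed Autonomic Component described above can be pictured as a monitor-analyze-plan-execute loop. The sketch below is only an illustration under assumed names and thresholds (AutonomicComponent, read_power, read_throughput and apply are hypothetical, not the dissertation's API).

    # Illustrative autonomic component skeleton (monitor-analyze-plan-execute loop).
    # All names and thresholds are assumptions for the sketch, not the dissertation's API.
    class AutonomicComponent:
        def __init__(self, resource, power_cap, perf_floor):
            self.resource = resource      # managed resource, e.g. a GPU node
            self.power_cap = power_cap    # acceptable power threshold
            self.perf_floor = perf_floor  # acceptable performance threshold

        def monitor(self):
            return {"power": self.resource.read_power(),
                    "throughput": self.resource.read_throughput()}

        def analyze(self, state):
            return state["power"] > self.power_cap or state["throughput"] < self.perf_floor

        def plan(self, state):
            # Lower the operating point when power is high, raise it when performance lags.
            return "scale_down" if state["power"] > self.power_cap else "scale_up"

        def execute(self, action):
            self.resource.apply(action)

        def control_loop(self):
            state = self.monitor()
            if self.analyze(state):
                self.execute(self.plan(state))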
|
33 |
Savanorių veiklos optimizavimas nevyriausybinėse socialinės pagalbos vaikams organizacijose / Optimizing volunteer activities in non-governmental organizations which provide social assistance to children
Čemeškienė, Rita 08 June 2004 (has links)
SUMMARY
New volunteering traditions are currently emerging in Lithuania. Volunteer work complements the governmental network of social security and satisfies relevant demands of society. Volunteers devote their own time to satisfying the needs of other people without any pay. They establish non-governmental organizations intended to render social assistance to different social groups of the population. This paper aims at optimizing volunteer activities in non-governmental organizations (NGOs) which provide social assistance to children.
The objective of the research is to examine, theoretically and empirically, the motivation of volunteers and to reveal the importance of their motivation and preparation for working with children in optimizing the activities of NGOs. The research includes an analytical survey of NGO volunteer activities: the concept of NGO volunteer activities in Lithuania; the target group of volunteering, defined as children at social risk; and the motivation of volunteers and their preparation to work with children. Motivation theories have also been reviewed, as well as motivation factors inside the organization and the training of volunteers. Results of the research on volunteer activities in NGOs are also presented. The outcome of the research is the following: altruistic motives and the need to socialize stimulate volunteers to engage in volunteer activities. The substance of volunteer needs is generally formed... [to full text]
|
34 |
Toepassing van hidrodinamiese modelle om kenmerkende randwaardes, geldig vir vloedbesproeiing in Suid-Afrika, af te lei (Application of hydrodynamic models to derive characteristic boundary values valid for flood irrigation in South Africa) / G.H.J. Kruger
Kruger, Gert Hendrik Jacobus January 2007 (has links)
Thesis (M.Ing. (Development and Management))--North-West University, Potchefstroom Campus, 2008.
|
36 |
Knowledge Management as a tool in Health Care Systems optimization : The case of Närsjukvården Österlen AB
Lassen Nielsen, Anders January 2006 (has links)
Background: Närsjukvården Österlen AB (=NÖAB) won a five-year contract, late in 2000, to operate the local health care services in Simrishamn on behalf of Region Skåne. The economic forecast for 2002 was a loss of 18 million SEK. A turnaround was urgent.
Aim: Primarily, to evaluate Knowledge Management (=KM) techniques as a tool in the process of turning a health care organization around; secondarily, to describe the means by which NÖAB became a more efficient health care organization. In order to evaluate the use of KM in the turnaround process it is necessary to answer three fundamental research questions. Did a turnaround take place? Did the individual projects contribute to increased efficiency? And finally, can the approach used in the projects be characterized as KM?
Method: The study was an ongoing case study using action research combined with evaluation. The evaluation uses public data (both quantitative and qualitative) and evaluations done by third parties, which allows for a profound validation of the conclusions. Three central processes were singled out for the evaluation: 1) the makeover of the acute patients’ way into the system, 2) the disease management program (=DMP) for patients suffering from COPD, and 3) the introduction of an error-management system.
Results: The operating results improved from minus 15 million SEK in 2002 to plus 10 million SEK in 2005. Man-hours were reduced by 20.6%. The average cost per consultation was reduced by 24.6%. The introduction of the COPD DMP resulted in a saving of approximately 1 million SEK a year. A total of 312 adverse event reports were filed during the first 10 months, an average of 31 a month. The introduction of KM turned the organization into a patient-centered, lean health care organization, changed decision making, and resulted in a significant shift towards an acceptance culture.
Conclusion: From the nature of the described projects, the description of the landmarks used and the discussion of how the projects fit into a Knowledge Management way of thinking, it is concluded that a Knowledge Management approach was applied. The success of the turnaround described in the case makes a strong argument for the use of Knowledge Management when faced with the need to optimize health care systems. / ISBN 91-7997-162-8
|
37 |
Practical Optimal Experimental Design in Drug Development and Drug Treatment using Nonlinear Mixed Effects Models
Nyberg, Joakim January 2011 (has links)
The cost of bringing a new drug to market has increased rapidly in the last decade. The reasons for this increase vary with the drug, but the need to make correct decisions earlier in the drug development process and to maximize the information gained throughout the process is evident. Optimal experimental design (OD) describes the procedure of maximizing relevant information in drug development and drug treatment processes. While various optimization criteria can be considered in OD, the most common is to optimize the precision with which the unknown model parameters can be estimated in an upcoming study. To date, OD has mainly been used to optimize the independent variables, e.g. sample times, but it can be used for any design variable in a study. This thesis addresses the OD of multiple continuous or discrete design variables for nonlinear mixed effects models. The methodology for optimizing different types of models with either continuous or discrete data is presented, and the benefits of OD for such models are shown. A software tool for optimizing these models in parallel is developed, and three OD examples are demonstrated: 1) optimization of an intravenous glucose tolerance test, resulting in a reduction in the number of samples by a third; 2) optimization of drug compound screening experiments, resulting in the estimation of nonlinear kinetics; and 3) an individual dose-finding study for the treatment of children with ciclosporin before kidney transplantation, resulting in a reduction in the number of blood samples to ~27% of the original number and an 83% reduction in the study duration. This thesis uses examples and methodology to show that studies in drug development and drug treatment can be optimized using nonlinear mixed effects OD. This provides a tool that can lower the cost and increase the overall efficiency of drug development and drug treatment.
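As a toy illustration of the optimal design idea (not the nonlinear mixed effects machinery of the thesis), the sketch below selects sampling times for a one-parameter exponential model by maximizing the Fisher information; the model, noise level and candidate time grid are assumptions chosen only for the example.

    # Toy D-optimal design sketch: choose 3 sampling times for y = exp(-k*t) + noise
    # by maximizing the (here scalar) Fisher information over a candidate grid.
    # Purely illustrative; not the nonlinear mixed effects methodology of the thesis.
    import itertools
    import math

    def fisher_information(times, k=0.1, sigma=0.1):
        # dy/dk = -t * exp(-k*t); with independent errors, FIM = sum (dy/dk)^2 / sigma^2
        return sum((t * math.exp(-k * t)) ** 2 for t in times) / sigma ** 2

    candidates = [1, 2, 4, 6, 8, 12, 16, 24]  # candidate sampling times (h), assumed
    best = max(itertools.combinations(candidates, 3), key=fisher_information)
    print("D-optimal 3-point design:", best, "FIM =", round(fisher_information(best), 1))

With more than one parameter the scalar information becomes a matrix and the D-criterion maximizes its determinant; the thesis extends such ideas to population (mixed effects) models and to discrete design variables.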
|
38 |
Integrated compiler optimizations for tensor contractions
Gao, Xiaoyang, January 2008 (has links)
Thesis (Ph. D.)--Ohio State University, 2008. / Title from first page of PDF file. Includes bibliographical references (p. 140-144).
|
39 |
Investigando estratégias otimizadas para monitoramento eficiente do universo BitTorrent / Investigating optimized strategies for efficiently monitoring the BitTorrent universe
Mansilha, Rodrigo Brandão January 2012 (links)
Recent studies in the literature indicate that BitTorrent is the most popular file sharing protocol, being responsible for more than half of the P2P traffic in some geographical locations. Despite the several studies about the dynamics of the “BitTorrent universe”, until recently there has been no methodology to systematically observe it. A preliminary study identified multiple monitoring strategies, which differ in terms of observed objects, sets of parameters and associated costs, and overlap in terms of extracted information. A subsequent work combined those strategies into a flexible, extensible BitTorrent monitoring architecture. In this context, the goal of the present dissertation is to investigate how to optimize the set of strategies and their parameters for efficiently monitoring the universe of BitTorrent networks, considering a given set of monitoring objectives and the available computational resources. As a solution, an adaptive monitoring control is proposed, which employs a programming model to optimize the monitoring at each round, considering the perceived network state. Although the focus of this work is BitTorrent networks, we propose a generic model that can be applied to other P2P networks, thereby increasing the contribution of this dissertation. The results of an analytical evaluation indicate that the proposed programming model generates optimal solutions. Furthermore, experiments carried out with randomly generated instances of this model show that it has the potential to be applied to networks more complex than BitTorrent.
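The per-round decision can be pictured as a small resource-constrained selection problem. The sketch below is a deliberately simplified greedy heuristic with made-up strategy names, costs and information values; it is not the dissertation's actual programming model, which is reported to yield optimal solutions.

    # Simplified per-round strategy selection (greedy knapsack-style heuristic).
    # Strategy names, costs and information values are invented for illustration;
    # this is not the dissertation's programming model.
    def select_strategies(strategies, budget):
        # strategies: list of (name, cost, information_value) tuples
        chosen, spent = [], 0
        for name, cost, value in sorted(strategies, key=lambda s: s[2] / s[1], reverse=True):
            if spent + cost <= budget:
                chosen.append(name)
                spent += cost
        return chosen

    round_strategies = [("tracker_scrape", 2, 8), ("peer_crawl", 5, 12), ("dht_probe", 3, 6)]
    print(select_strategies(round_strategies, budget=7))  # -> ['tracker_scrape', 'peer_crawl']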
|
40 |
A importância do ponto de operação nas técnicas de self-optimizing control / The importance of the operating point in self-optimizing control techniques
Schultz, Eduardo dos Santos January 2015 (has links)
Process optimization has become a fundamental tool for increasing the profit of chemical plants. Several optimization methods have been proposed over the years; real-time optimization (RTO) is the most industrially consolidated solution, while self-optimizing control (SOC) appears as a simplified alternative with a lower implementation cost. In this work, several aspects of the SOC methodology are studied, starting with an analysis of the impact of the operating point on the development of self-optimizing control structures. Improvements are proposed to the SOC optimization problem formulation so that the controlled variables are determined in the same optimization problem as the operating point, significantly reducing process loss. In order to analyze the influence of dynamics on the results, a comparative study is carried out of the loss generated in the process throughout operation for optimization structures based on RTO and on SOC. Based on the results obtained for a didactic unit, it is shown that the dynamic behavior of the disturbance has a great influence on the choice of optimization technique, breaking the idea that RTO is an upper performance limit for SOC. The industrial application of classical SOC techniques is validated on a propylene separation unit based on a real unit currently in operation. From a model of the process built in a commercial simulator, the set of controlled variables that allows an acceptable loss for the unit was generated, demonstrating the feasibility of applying the methodology to real units.
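For context, the loss notion used in self-optimizing control can be stated with the standard local results from the SOC literature (general background, not equations taken from this thesis); here u are the inputs, d the disturbances, y the measurements and c = Hy the candidate controlled variables:

    % Standard local self-optimizing control background (literature results, not from this thesis).
    % Loss of a control structure relative to reoptimized operation, expanded around the optimum:
    \begin{align*}
      L &= J(u,d) - J_{\mathrm{opt}}(d)
         \approx \tfrac{1}{2}\,\bigl(u - u_{\mathrm{opt}}(d)\bigr)^{\!\top} J_{uu}\,\bigl(u - u_{\mathrm{opt}}(d)\bigr), \\
      H F &= 0, \qquad F = \frac{\partial y_{\mathrm{opt}}}{\partial d}
         \quad \text{(null-space method: zero local loss for disturbances, noise-free case)}.
    \end{align*}

The thesis's point that the operating point matters can be read from this expansion: both J_uu and F are evaluated at the chosen nominal point, so selecting the controlled variables and the operating point within the same optimization problem changes which H minimizes the loss.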
|