421 |
Modeling of Carbon Dioxide Sequestration in a Deep Saline Aquifer. Basbug, Basar, 01 July 2005
ABSTRACT
MODELING OF CARBON DIOXIDE SEQUESTRATION
IN A DEEP SALINE AQUIFER
Başbuğ, Başar
M.S., Department of Petroleum and Natural Gas Engineering
Supervisor: Prof. Dr. Fevzi Gümrah
July 2005, 245 pages
CO2 is one of the hazardous greenhouse gases causing significant changes in the
environment. Sequestering CO2 in a suitable geological medium can be a feasible
method to avoid the negative effects of CO2 emissions to the atmosphere. CO2
sequestration is the capture, separation, and long-term storage of CO2 in
underground geological environments.
A case study of CO2 sequestration in a deep saline aquifer was simulated.
The compositional numerical simulator GEM from the CMG software suite was used to study
the ability of the selected aquifer to accept and retain large quantities of injected
CO2 in the supercritical state over long periods of time (200 years). Two cases were
studied: a field-scale model with two injectors and six water producers, and a
single-well aquifer model.
In the single-well aquifer model, the effects of parameters such as the vertical-to-horizontal
permeability ratio, aquifer pressure, injection rate, and salinity on the sequestration
process were examined, and sensitivity analyses were performed after simulating
the field-scale model.
Supercritical CO2, a single-phase fluid exhibiting both gas-like and liquid-like
properties, and gaseous CO2 were sequestered in the deep saline aquifer as free CO2,
as CO2 dissolved in brine, and as CO2 precipitated with the mineral calcite.
Isothermal conditions were assumed during the injection and sequestration
processes. Changes in porosity and permeability that might occur due to
mineralization and CO2 adsorption on the rock were not considered in this study.
The vertical-to-horizontal permeability ratio and the initial pressure were the most
dominant parameters affecting CO2 saturation in each layer of the aquifer,
whereas the CO2 injection rate influenced CO2 saturation in the middle and bottom layers,
since CO2 was injected through the bottom layer.
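A parametric sensitivity study of this kind can be organised as a simple one-at-a-time sweep. The sketch below is illustrative only: run_aquifer_model stands in for a run of the compositional simulator, and the base-case values and parameter ranges are assumed placeholders, not the values used in the thesis.

```python
# Minimal one-at-a-time sensitivity sweep for a single-well aquifer model.
# run_aquifer_model is a placeholder for building a simulator input deck,
# running it, and extracting a response such as dissolved-CO2 fraction
# after 200 years of sequestration.

base_case = {
    "kv_kh_ratio": 0.1,              # vertical/horizontal permeability ratio (assumed)
    "aquifer_pressure_kpa": 25000.0,
    "injection_rate_m3_day": 5.0e5,
    "salinity_ppm": 100000.0,
}

sweeps = {
    "kv_kh_ratio": [0.01, 0.1, 0.5, 1.0],
    "aquifer_pressure_kpa": [20000.0, 25000.0, 30000.0],
    "injection_rate_m3_day": [2.5e5, 5.0e5, 1.0e6],
    "salinity_ppm": [50000.0, 100000.0, 200000.0],
}

def run_aquifer_model(params):
    """Placeholder: run the reservoir simulator with the given parameters and
    return a scalar response (e.g., fraction of injected CO2 dissolved in brine)."""
    raise NotImplementedError

def sensitivity_sweep():
    results = {}
    for name, values in sweeps.items():
        for value in values:
            params = dict(base_case)     # vary one parameter at a time
            params[name] = value
            results[(name, value)] = run_aquifer_model(params)
    return results
```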
|
422 |
A New Reactive Method for Processing Web Usage Data. Bayir, Murat Ali, 01 July 2007
In this thesis, a new reactive session reconstruction method, 'Smart-SRA', is introduced. Web usage mining is a type of web mining which exploits data mining techniques to discover valuable information from the navigations of Web users. As in classical data mining, data processing and pattern discovery are the main issues in web usage mining. The first phase of web usage mining is the data processing phase, which includes session reconstruction. Session reconstruction is the most important task of web usage mining, since it directly and significantly affects the quality of the frequent patterns extracted at the final step. Session reconstruction methods can be classified into two categories, namely 'reactive' and 'proactive', with respect to the data source and the data processing time. If the user requests are processed after the server handles them, the technique is called 'reactive', while in 'proactive' strategies this processing occurs during the interactive browsing of the web site. Smart-SRA is a reactive session reconstruction technique which uses web log data and the site topology. In order to compare Smart-SRA with previous reactive methods, a web agent simulator has been developed. Our agent simulator models the behavior of web users and generates web user navigations as well as the log data kept by the web server. In this way, the actual user sessions are known and the success of different techniques can be compared. In this thesis, it is shown that the sessions generated by Smart-SRA are more accurate than the sessions constructed by previous heuristics.
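The session reconstruction step can be illustrated with a small sketch. The code below implements only a simplified reactive heuristic that combines a session-duration limit with a site-topology check (each new page must be linked from a page already in the session); it is a rough approximation of the general idea, not the published Smart-SRA algorithm, and the record format and threshold are assumptions.

```python
from collections import defaultdict

MAX_SESSION_SECONDS = 30 * 60   # assumed session-duration threshold

def reconstruct_sessions(log_records, site_topology):
    """log_records: iterable of (user_id, timestamp_seconds, page), sorted by time.
    site_topology: dict mapping a page to the set of pages it links to.
    Returns a list of sessions, each a list of pages."""
    per_user = defaultdict(list)
    for user, ts, page in log_records:
        per_user[user].append((ts, page))

    sessions = []
    for user, visits in per_user.items():
        current, start_ts = [], None
        for ts, page in visits:
            linked = any(page in site_topology.get(p, set()) for p in current)
            expired = start_ts is not None and ts - start_ts > MAX_SESSION_SECONDS
            if not current:
                current, start_ts = [page], ts
            elif expired or not linked:
                sessions.append(current)            # close the previous session
                current, start_ts = [page], ts
            else:
                current.append(page)
        if current:
            sessions.append(current)
    return sessions
```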
|
423 |
Modeling the Effect of Hurricanes on Power Distribution Systems. Chanda, Suraj, August 2011
There are many calamitous events, such as earthquakes, hurricanes and tsunamis, that occur suddenly and cause great loss of life, damage, or hardship. Hurricanes cause significant damage to power distribution systems, resulting in prolonged customer outages and excessive delays in the reconstruction efforts. Accordingly, predicting the effects of power outages on the performance of power distribution systems is of major importance to government agencies, utilities, and customers. Unfortunately, the current tools to predict the performance of power distribution systems during catastrophic events are limited both in the performance measures considered and in their ability to model real systems.
The main goal of this research is to develop a methodology for simulating hurricanes of different intensity on the power distribution systems of small and medium-sized cities. Our study includes a detailed comparison between engineering-based and connectivity-based models of power distribution systems, as well as the impact of power re-routing algorithms on system reliability. Our approach is based on fragility curves that capture the ability of the system to withstand a range of wind speeds. In addition, we develop a multiscale approach that facilitates efficient computation of fragility curves for large cities. With this approach, hurricanes are simulated over small zones of a city and fragility curves are obtained; these are used to estimate the damage for identical zones throughout the city. To validate our techniques, two testbeds, Micropolis and Mesopolis, were used: Micropolis is a synthetic model of a small city and Mesopolis of a large city. The obtained results validated our approach and showed that it can be used to predict hurricane damage effectively.
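A fragility curve in this sense maps wind speed to a probability of component failure, and zone-level damage can then be estimated by Monte Carlo sampling. The sketch below assumes a lognormal fragility for distribution poles; the median capacity and dispersion are illustrative placeholders rather than parameters from this study.

```python
import math
import random

def lognormal_fragility(wind_speed, median_capacity=50.0, beta=0.3):
    """P(component fails | wind_speed): lognormal CDF in wind speed (m/s).
    median_capacity and beta are assumed, illustrative values."""
    if wind_speed <= 0:
        return 0.0
    z = math.log(wind_speed / median_capacity) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def simulate_zone_damage(wind_speed, n_poles=200, n_trials=1000):
    """Monte Carlo estimate of the expected fraction of failed poles in one zone."""
    p_fail = lognormal_fragility(wind_speed)
    total = 0
    for _ in range(n_trials):
        total += sum(1 for _ in range(n_poles) if random.random() < p_fail)
    return total / (n_trials * n_poles)

# Example: expected damage fraction for one zone at several hurricane wind speeds.
for v in (30.0, 45.0, 60.0):
    print(v, round(simulate_zone_damage(v), 3))
```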
|
424 |
Application of Genetic Algorithm to a Forced Landing Manoeuvre on Transfer of Training Analysis. Tong, Peter, mail@petertong.com, January 2007
This study raises some issues for training pilots to fly forced landings and examines the impact that these issues may have on the design of simulators for such training. It focuses on the flight trajectories that a pilot of a single-engine general aviation aircraft should fly after engine failure and on how pilots can be better simulator-trained for this forced landing manoeuvre. A sensitivity study was carried out on the effects of errors, together with an investigation of the effect that tolerances in the aerodynamic parameters, as prescribed in the Manual of Criteria for the Qualification of Flight Simulators, have on the performance of flight simulators used for pilot training. It uses a simplified analytical model of the Beech Bonanza model E33A aircraft and vertical atmospheric turbulence based on the MIL-F-8785C specifications. It was found that the effect of the tolerances is strongly dependent on the nature of the manoeuvre flown and that, in some cases, negative transfer of training may be induced by the tolerances.

A forced landing trajectory optimisation was carried out using a genetic algorithm (see the sketch after this abstract). The forced landing manoeuvre analyses with pre-selected touchdown locations and pre-selected final headings were carried out for an engine failure at 650 ft AGL, for bank angles varying from 45° left to 45° right, and for aircraft speeds varying from 75.6 mph to 208 mph, corresponding to 5% above the aircraft's stall speed and to its maximum speed, respectively. The results show that certain pre-selected touchdown locations are more susceptible to horizontal wind. The results for the forced landing manoeuvre with a pre-selected location show minimal distance error, while the quality of the results for the forced landing manoeuvre with a pre-selected location and a final heading depends on the end constraints. For certain pre-selected touchdown locations and final headings, the airplane may either touch down very close to the pre-selected touchdown location but with a greater final heading error, or touch down with minimal final heading error but further away from the pre-selected touchdown location.

Analyses of an obstacle avoidance forced landing manoeuvre were also carried out, where an obstacle was intentionally placed in the flight path found by the GA program developed for the case without an obstacle. The methodology developed successfully found flight paths that avoid the obstacle and touch down near the pre-selected location. In some cases, there exists more than one ensemble grouping of flight paths. The distance error depends on both the pre-selected touchdown location and where the obstacle was placed. The distance error tends to increase with the addition of a specific final heading requirement for an obstacle avoidance forced landing manoeuvre. As with the case without a specific final heading requirement, there is a trade-off between touching down nearer to the pre-selected location and touching down with a smaller final heading error.
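The structure of such a trajectory optimisation can be sketched with a minimal genetic algorithm that evolves a sequence of bank-angle commands and scores each candidate by its distance from a pre-selected touchdown point. The glide model, parameter values and GA settings below are illustrative assumptions and are unrelated to the Beech Bonanza model or the turbulence model used in the study.

```python
import math
import random

N_STEPS, DT = 20, 2.0                  # bank-angle segments and time step (assumed)
GLIDE_SPEED, SINK_RATE = 40.0, 5.0     # m/s, crude constant-glide assumptions
START_ALT = 200.0                      # roughly 650 ft AGL
TARGET = (600.0, 300.0)                # pre-selected touchdown point (illustrative)

def touchdown_error(bank_angles_deg):
    """Integrate a crude point-mass glide and return distance to the target (m)."""
    x = y = 0.0
    heading, alt = 0.0, START_ALT
    for bank in bank_angles_deg:
        if alt <= 0.0:
            break
        turn_rate = 9.81 * math.tan(math.radians(bank)) / GLIDE_SPEED
        heading += turn_rate * DT
        x += GLIDE_SPEED * math.cos(heading) * DT
        y += GLIDE_SPEED * math.sin(heading) * DT
        alt -= SINK_RATE * DT
    return math.hypot(x - TARGET[0], y - TARGET[1])

def genetic_algorithm(pop_size=60, generations=80, mutation=0.1):
    rand_gene = lambda: random.uniform(-45.0, 45.0)    # bank limited to +/-45 degrees
    pop = [[rand_gene() for _ in range(N_STEPS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=touchdown_error)                  # lower error is fitter
        parents = pop[: pop_size // 4]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_STEPS)
            child = a[:cut] + b[cut:]                  # one-point crossover
            child = [rand_gene() if random.random() < mutation else g for g in child]
            children.append(child)
        pop = parents + children
    return min(pop, key=touchdown_error)

best = genetic_algorithm()
print("best touchdown error (m):", round(touchdown_error(best), 1))
```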
|
425 |
Concise analysis and testing of a software model of a satellite remote sensing system used for image generation. Van der Westhuizen, Lynette, December 2014
Thesis (MEng), Stellenbosch University, 2014.
ENGLISH ABSTRACT: The capability of simulating the output image of earth observation satellite sensors is
of great value, as it reduces the dependency on extensive field tests when developing,
testing and calibrating satellite sensors. The aim of this study was to develop a software
model to simulate the data acquisition process used by passive remote sensing satellites
for the purpose of image generation. To design the software model, a comprehensive
study was done of a physical real world satellite remote sensing system in order to
identify and analyse the different elements of the data acquisition process. The different
elements were identified as being the target, the atmosphere, the sensor and satellite,
and radiation. These elements and a signature rendering equation are used to model
the target-atmosphere-sensor relationship of the data acquisition process. The signature
rendering equation is a mathematical model of the different solar and self-emitted thermal
radiance paths that contribute to the radiance reaching the sensor. It is proposed that the
software model be implemented as an additional space remote sensing application in the
Optronics Sensor Simulator (OSSIM) simulation environment. The OSSIM environment
provides the infrastructure and key capabilities upon which this specialist work builds.
OSSIM includes a staring array sensor model, which was adapted and expanded in this
study to operate as a generic satellite sensor. The OSSIM signature rendering equation
was found to include all the necessary terms required to model the at-sensor radiance
for a satellite sensor with the exception of an adjacency effect term. The equation was
expanded in this study to include a term to describe the in-field-of-view adjacency effect
due to aerosol scattering. This effect was modelled as a constant value over the sensor
field of view. Models were designed to simulate across-track scanning mirrors, the satellite
orbit trajectory and basic image processing for geometric discontinuities. Testing of the
software model showed that all functions operated correctly within the set operating
conditions and that the in-field-of-view adjacency effect can be modelled effectively by a
constant value over the sensor field of view. It was concluded that the satellite remote
sensing software model designed in this study accurately simulates the key features of the real world system and provides a concise and sound framework on which future
functionality can be expanded.

AFRIKAANS ABSTRACT: It is useful to have a software program that can reproduce the images generated by an earth-observation satellite sensor. Such a program reduces the dependence on extensive field tests during the design, testing and calibration phases of satellite sensor development. The aim of this study was to design a software model that simulates the data acquisition process of a passive satellite remote sensing system for the purpose of image generation. Designing the software model required a comprehensive study of a physical real-world satellite remote sensing system in order to identify and analyse the different elements of the data acquisition process. The different elements were identified as the target, the atmosphere, the sensor and satellite, and radiation. These elements, together with a signature rendering equation, are used to model the target-atmosphere-sensor relationship of the data acquisition process. The signature rendering equation is a mathematical model of the different propagation paths of reflected solar radiance and self-emitted thermal radiance that contribute to the total radiance reaching the sensor. It is proposed that the software model be implemented as an additional space remote sensing application in the Optronics Sensor Simulator (OSSIM) simulation environment. The OSSIM simulation environment provides the necessary infrastructure and key capabilities on which this specialist work can build. OSSIM includes a staring-array sensor model, which was adapted and extended in this study to operate as a generic satellite sensor. The OSSIM signature rendering equation contains all the radiometric terms needed to model the radiance at the satellite sensor, with the exception of an adjacency-scattering term. The equation was expanded in this study to include a term describing the scattering of radiance from nearby objects due to aerosol scattering. This adjacency effect was modelled as a constant value over the sensor field of view. Models were designed to compute the motion of across-track scanning mirrors and the satellite orbit trajectory, and a basic image-processing model was designed to correct geometric discontinuities in the sensor images. Testing of the software model showed that all functions operated correctly within the specified operating conditions, and the test results showed that the in-field-of-view adjacency effect can be modelled accurately as a constant value over the sensor field of view. It was concluded that the satellite remote sensing software model designed in this study simulates the key features of the real-world system and forms a concise and sound framework on which future work can be expanded.
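The role of a constant in-field-of-view adjacency term in an at-sensor radiance budget can be illustrated with a minimal sketch. The term names and magnitudes below are assumptions for illustration and do not reproduce the OSSIM signature rendering equation.

```python
def at_sensor_radiance(surface_reflected, surface_emitted, path_radiance,
                       transmittance, adjacency_constant):
    """Per-pixel at-sensor radiance, illustrative only: ground-leaving terms are
    attenuated by the atmosphere, then the atmospheric path term and a constant
    adjacency term are added."""
    return (transmittance * (surface_reflected + surface_emitted)
            + path_radiance
            + adjacency_constant)

# Example image: the adjacency contribution is the same for every pixel,
# i.e. constant over the sensor field of view.
ADJACENCY = 0.8          # assumed constant adjacency radiance
pixels = [
    {"surface_reflected": 40.0, "surface_emitted": 5.0, "path_radiance": 12.0},
    {"surface_reflected": 25.0, "surface_emitted": 6.5, "path_radiance": 12.0},
]
image = [at_sensor_radiance(transmittance=0.75, adjacency_constant=ADJACENCY, **p)
         for p in pixels]
print(image)
```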
|
426 |
Requirements specification for the optimisation function of an electric utility's energy flow simulator. Hatton, Marc, March 2015
Thesis (MEng), Stellenbosch University, 2015.
ENGLISH ABSTRACT: Efficient and reliable energy generation capability is vital to any country's economic growth. Many strategic, tactical and operational decisions take place along the energy supply chain. Shortcomings in South Africa's electricity production industry have led to the development of an energy flow simulator. The energy flow simulator is claimed to incorporate all significant factors involved in the energy flow process from primary energy to end-use consumption. The energy flow simulator thus provides a decision support system for electric utility planners.

The original aim of this study was to develop a global optimisation model and integrate it into the existing energy flow simulator. After gaining an understanding of the architecture of the energy flow simulator and scrutinising a large number of variables, it was concluded that global optimisation was infeasible. The energy flow simulator is made up of four modules and is operated on a module-by-module basis, with inputs and outputs flowing between modules. One of the modules, namely the primary energy module, lends itself well to optimisation.

The primary energy module simulates coal stockpile levels through Monte Carlo simulation. Classic inventory management policies were adapted to fit the structure of the primary energy module, which is treated as a black box. The coal stockpile management policies that are introduced provide a prescriptive means to deal with the stochastic nature of the coal stockpiles.

As the planning horizon continuously changes and the entire energy flow simulator has to be re-run, an efficient algorithm is required to optimise stockpile management policies. Optimisation is achieved through the rapidly converging cross-entropy method. By integrating the simulation and optimisation models, a prescriptive capability is added to the primary energy module. Furthermore, this study shows that coal stockpile management policies can be improved. An integrated solution is developed by nesting the primary energy module within the optimisation model. Scalability is incorporated into the optimisation model through a coding approach that automatically adjusts to an ever-changing planning horizon as well as the commissioning and decommissioning of power stations.

As this study is the first of several research projects to come, it paves the way for future research on the energy flow simulator by proposing
future areas of investigation.

AFRIKAANS ABSTRACT: Efficient and reliable energy generation capability is of cardinal importance to any country's economic growth. Many strategic, tactical and operational decisions are taken throughout the energy supply chain. Shortcomings in South Africa's electricity generation industry have led to the development of an energy flow simulator. The energy flow simulator reportedly incorporates all the significant factors affecting the energy flow process, from primary energy use to end use. The energy flow simulator thus provides a decision support system for electric utility planners. The original aim of this study was to develop a global optimisation model and integrate it into the existing energy flow simulator. After an understanding of the architecture of the energy flow simulator was formed and a large number of variables were examined, it was concluded that global optimisation is not feasible. The energy flow simulator consists of four modules and operates on a module-by-module basis, with inputs and outputs flowing between modules. One of the modules, namely the primary energy module, lends itself well to optimisation. The primary energy module simulates coal stockpile levels through Monte Carlo simulation. Traditional inventory management policies were adapted to fit the structure of the primary energy module, which is treated as a black box. The coal stockpile management policies that are introduced provide a prescriptive means of dealing with the stochastic nature of the coal stockpiles. Since the planning horizon changes continuously and the entire energy flow simulation has to be re-run with the energy flow simulator, an efficient algorithm is required to optimise the stockpile management policies. Optimisation is achieved through the rapidly converging cross-entropy method. An integrated solution was developed by combining the primary energy module and the optimisation function. Scalability is incorporated into the optimisation model through a coding approach that automatically adjusts to an ever-changing planning horizon as well as the commissioning and decommissioning of power stations. Since this study is the first of several research projects, it paves the way for future research on the energy flow simulator by proposing areas for future investigation.
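The cross-entropy loop that tunes a stockpile policy against a black-box simulation can be sketched as follows. The policy shape (a reorder point and an order-up-to level), the cost terms and the demand model are assumptions standing in for the primary energy module; only the structure of the cross-entropy iteration reflects the approach described above.

```python
import random
import statistics

def simulate_stockpile_cost(reorder_point, order_up_to, horizon=52, trials=50):
    """Stand-in for the black-box primary energy module: Monte Carlo cost of a
    (reorder_point, order_up_to) coal stockpile policy. Demand and cost
    coefficients are illustrative assumptions."""
    total = 0.0
    for _ in range(trials):
        stock, cost = order_up_to, 0.0
        for _ in range(horizon):
            burn = random.gauss(100.0, 20.0)                 # weekly coal burn (assumed)
            stock -= max(burn, 0.0)
            if stock < reorder_point:
                cost += 1000.0 + 2.0 * (order_up_to - stock)  # delivery cost
                stock = order_up_to
            if stock < 0.0:
                cost += 50000.0                               # penalty for running dry
                stock = 0.0
            cost += 0.5 * stock                               # holding cost
        total += cost
    return total / trials

def cross_entropy_optimise(iterations=30, samples=40, elite_frac=0.2):
    """Cross-entropy method over the two policy parameters."""
    mu, sigma = [300.0, 800.0], [150.0, 300.0]                # initial sampling distribution
    for _ in range(iterations):
        cands = [[random.gauss(mu[i], sigma[i]) for i in range(2)] for _ in range(samples)]
        cands.sort(key=lambda c: simulate_stockpile_cost(c[0], c[1]))
        elite = cands[: max(1, int(elite_frac * samples))]
        for i in range(2):                                    # refit distribution to the elite
            vals = [c[i] for c in elite]
            mu[i] = statistics.mean(vals)
            sigma[i] = statistics.pstdev(vals) + 1e-3
    return mu

print("suggested (reorder point, order-up-to level):", cross_entropy_optimise())
```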
|
427 |
Simulador de redes Profibus / Profibus network simulator. Renato da Veiga Torres, 19 September 2013
This work proposes the development of a Profibus DP network simulator that reproduces the behaviour of a real network. Simulating networks under fault conditions allows users to experience situations that can occur in real installations, facilitating maintenance training and anticipating corrective actions. The simulator supports the training of users of diagnostic tools, assists the study of the Profibus DP protocol, and also contributes to the development of the Profidoctor tool, an expert system intended to analyse network problems.
This work aims at the development of a Profibus DP network simulator that reproduces real network behavior. Network simulation under fault conditions allows users to experience situations that may occur in real network installations, helping with maintenance training and corrective actions. This simulator tool helps in training users of diagnostic tools and in learning the Profibus DP protocol, and also assists the development of the new tool Profidoctor: an expert system that will report a complete analysis of network problems.
|
428 |
Design and real-time process optimisation of steam assisted gravity drainage for improved heavy oil recovery. Bali, Amol Bhagwan, January 2013
“Introduction to the Canadian Oil Sands”, “Canada’s Oil Sand Industry: An Overview”, “Heavy Oil Technologies”, and many other topics about heavy oil have become hot topics in the oil industry. A number of new projects are in the Execute phase for the development of heavy oil assets, which clearly shows the increasing demand for heavy oil. The oil industry is working hard to meet world oil demand by developing deep water, HPHT, heavy oil, shale sands and other non-conventional reservoirs, but the main challenge is to develop and operate them in a risk-free environment. Understanding reservoir and fluid properties and developing new technologies help the industry to reduce the risk in developing non-conventional fields. A major problem in heavy oil field development is understanding the behaviour of heavy oil. The viscous oil flows sluggishly in the formations; hence it is difficult to transport through unconsolidated formations and very difficult to produce by conventional methods. Viscous oil recovery entails carefully designed enhanced oil recovery processes such as Steam Assisted Gravity Drainage (SAGD), and the success of such technologies is critically dependent on accurate knowledge of the reservoir, well and fluid properties of the oil under a variety of pressure and temperature conditions. This research project has provided some solutions to the challenges in heavy oil field development and can help the oil industry to optimise heavy oil production. Detailed experimental understanding of PVT properties has allowed this project to contribute to the knowledge. Reservoir, well and fluid properties were studied thoroughly, and the criticality of each parameter to the efficiency of Steam Assisted Gravity Drainage was demonstrated. A user-friendly SAGD simulator is a major output of this research: it allows the user to optimise heavy oil recovery and to perform risk assessments quickly during the design phase of SAGD.
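The temperature sensitivity of heavy-oil viscosity, which SAGD exploits, can be illustrated with a generic two-parameter viscosity-temperature correlation; the coefficients below are illustrative assumptions, not values from the PVT experiments in this thesis.

```python
import math

def heavy_oil_viscosity_cp(temp_celsius, a=1.0e-7, b=8000.0):
    """Generic Andrade-type correlation mu = a * exp(b / T), with T in kelvin.
    Coefficients a and b are illustrative assumptions."""
    t_kelvin = temp_celsius + 273.15
    return a * math.exp(b / t_kelvin)

# Viscosity falls by orders of magnitude between reservoir and steam-chamber
# temperatures, which is why steam injection mobilises the oil.
for t in (10.0, 50.0, 100.0, 200.0):
    print(f"{t:6.1f} C  ->  {heavy_oil_viscosity_cp(t):12.1f} cP")
```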
|
429 |
Avaliação de misturas mornas com emprego de simulador de tráfego linear / Evaluation of warm asphalt mixtures with the use of a linear traffic simulator. Rivoire Junior, Larry, January 2014
This research reports a study of the production and performance evaluation of warm asphalt mixtures using natural zeolites. It was divided into three stages. The first consisted of laboratory tests aimed at understanding some of the properties and the behaviour of warm asphalt mix with natural zeolites, through the determination of the mix design and its preparation at different temperatures. The second stage consisted of preparing a hot-mix asphalt plant to manufacture mixtures at lower temperatures, adjusting the temperature settings and verifying the procedures needed to adapt a standard plant to produce warm mixtures at the design temperatures. In the last stage, the transport, spreading and compaction of the warm mixtures were monitored, and the performance of the pavements was subsequently evaluated with a linear traffic simulator in three tests conducted on two experimental tracks located in the Pavement Testing Area at the Vale Campus of the Federal University of Rio Grande do Sul.
This research reports the study of the production, implementation and performance evaluation of warm asphalt mixtures prepared by adding natural zeolites. It was divided into three stages. The first consisted of performing laboratory tests to characterise some properties and the behaviour of the warm asphalt mix, determining the mix designs that were used in the accelerated tests after analysing specimens produced at different manufacturing and compaction temperatures. The second stage consisted of preparing a hot-mix asphalt plant for the manufacture of mixtures at lower temperatures, adjusting the temperature settings, and checking the procedures necessary to adapt a standard plant to produce warm mixtures at the design temperatures. In the last stage, the transport, spreading and compaction of the warm mixtures were monitored, and the performance of the pavements was subsequently evaluated with the use of a linear traffic simulator in three tests conducted on two experimental tracks located in the Pavement Testing Area, on the Vale Campus of the Federal University of Rio Grande do Sul.
|
430 |
Hardware Root of Trust for Linux Based Edge Gateway. Al-Galby, Mohamed; Arezou, Madani, January 2018
Linux-based edge gateways that connect hundreds or maybe thousands of IoT devices are exposed to various threats and cyber-attacks from the internet. These attacks form a considerable risk to the privacy and confidentiality of IoT devices through their gateways. Much research has been conducted to alleviate this problem. One of the solutions is to build a root of trust based on a hardware module such as a Trusted Platform Module (TPM) or on software such as a Trusted Execution Environment (TEE). In this work, we provide a solution to the problem by enabling a Hardware Root of Trust (HRoT) using a TPM on a product from HMS Industrial Networks AB known as the GWen board, a Linux-based embedded system used as a gateway to connect IoT devices. We describe a method that uses the processor of the GWen (i.e. a Zynq-7020 FPGA SoC) to enable secure boot. In addition, we provide a method to enable the TPM chip mounted on the GWen (i.e. an SLB 9670 TPM 2.0) using the TPM Software Stack TSS 2.0. We demonstrate, in detail, various use cases of the TPM on the GWen, including cryptographic key generation, secure key storage and key usage for different cryptographic operations. Furthermore, we analyse the adopted solution by inspecting the latency of TPM commands on the GWen gateway. Given the strict requirements of the TPM 2.0 specification and based on our results, adding the TPM 2.0 to the IoT gateway GWen enhances the security of its Linux distribution and makes it possible to securely identify and authenticate the gateway on the network based on secret keys that are stored securely inside its TPM.
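The latency inspection mentioned above can be reproduced in outline by timing individual TPM commands from user space. The sketch below assumes a Linux host with the tpm2-tools utilities and an accessible TPM (or a simulator such as swtpm); command names and flags vary between tpm2-tools versions, so the invocations are illustrative rather than the exact command set benchmarked in the thesis.

```python
import statistics
import subprocess
import time

# Commands to time; the flags follow common tpm2-tools usage but may need
# adjusting for the installed version. These are assumptions, not the exact
# commands measured in the thesis.
COMMANDS = {
    "get 16 random bytes": ["tpm2_getrandom", "16"],
    "create primary key":  ["tpm2_createprimary", "-C", "o", "-c", "primary.ctx"],
}

def time_command(cmd, repetitions=10):
    """Run a TPM command several times and return per-run wall-clock latency (ms)."""
    latencies = []
    for _ in range(repetitions):
        start = time.perf_counter()
        subprocess.run(cmd, check=True, capture_output=True)
        latencies.append((time.perf_counter() - start) * 1000.0)
    return latencies

if __name__ == "__main__":
    for label, cmd in COMMANDS.items():
        runs = time_command(cmd)
        print(f"{label}: mean {statistics.mean(runs):.1f} ms, "
              f"max {max(runs):.1f} ms over {len(runs)} runs")
```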
|