101

Torkklimat under byggproduktion : En kvantitativ studie baserad på klimatdata och litteraturstudier / Drying climate during construction : A quantitative study based on climate data and literature studies

Sidenqvist, Daniel, Ternstedt, Viktor January 2014 (has links)
A challenge during construction is the drying out of construction moisture in the building's concrete slabs. If drying is delayed, floor coverings cannot be applied on schedule, yet forcing the work is not an option, because of the risk of moisture damage during the operating phase. For drying to proceed according to plan, the climate in the building must be favourable to the drying process. To control the drying climate, the company carries out extensive measurement work: data loggers at different locations in the building record air temperature and relative humidity. Together with trend measurements of moisture levels in the concrete, the collected data series are intended to give a picture of how drying is progressing. The difficulty with drying climate is that the subject is in many respects abstract and theoretical, precisely because the parameters that describe it vary continuously. For the collected data to add value on the construction site, the data sets must be analysed and visualised clearly, so that construction management can act on the information without necessarily possessing specialist expertise in the subject area.
This bachelor degree project aims to provide the company with better knowledge of the drying climate during construction, linked to the drying out of construction moisture. The conditions for effective drying were studied with respect to variations in the drying climate. A further investigation was carried out to determine when, in a general case, the company should supplement the drying climate with increased ventilation and/or dehumidification in order to keep the drying climate in the building steady year round. As background to the subject, the report describes the factors governing the drying climate, both in theory and for the specific project. The results show that the conditions for effective drying were present at the studied locations in the building, in the sense that a majority of the climate trends meet the criteria established in the project's moisture safety plan. The study also shows that supplementary ventilation and/or dehumidification should start during March to April, and that during October to November the drying strategy can revert to heating the indoor air only.
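To make the climate-logging idea concrete, a minimal sketch in Python is given below. The Magnus formula for saturation vapour pressure is standard psychrometrics, but the logger reading and the threshold values are hypothetical stand-ins: the abstract does not state the project's actual moisture safety plan criteria.

```python
import math

def sat_vapour_pressure(t_c):
    """Magnus formula: saturation vapour pressure (Pa) over water at t_c in deg C."""
    return 611.2 * math.exp(17.62 * t_c / (243.12 + t_c))

def vapour_content(t_c, rh):
    """Vapour content (g/m^3) from temperature (deg C) and relative humidity (0-1)."""
    e = rh * sat_vapour_pressure(t_c)             # actual vapour pressure, Pa
    return 1000.0 * e / (461.5 * (t_c + 273.15))  # ideal gas law, R_v = 461.5 J/(kg K)

# hypothetical logger reading and drying-climate criteria (assumed, not from the thesis)
t_indoor, rh_indoor = 18.0, 0.55
favourable = t_indoor >= 15.0 and rh_indoor <= 0.60
print(f"vapour content: {vapour_content(t_indoor, rh_indoor):.1f} g/m^3, "
      f"favourable drying climate: {favourable}")
```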
102

Anomaly Detection for Root Cause Analysis in System Logs using Long Short-Term Memory / Anomalidetektion för Grundorsaksanalys i Loggar från Mjukvara med hjälp av Long Short-Term Memory

von Hacht, Johan January 2021 (has links)
Many software systems are under test to ensure that they function as expected. Sometimes a test fails, and in that case it is essential to understand the cause of the failure. However, as systems grow larger and more complex, this task becomes non-trivial and can take considerable time. Automating the process of root cause analysis, even partially, can therefore save time for the developers involved. This thesis investigates the use of a Long Short-Term Memory (LSTM) anomaly detector on system logs for root cause analysis. The implementation is evaluated in one quantitative and one qualitative experiment. The quantitative experiment evaluates the performance of the anomaly detector in terms of precision, recall, and F1 measure. Since the data are unlabelled, anomaly injection is used to measure these metrics, and the LSTM is compared with a baseline model. The qualitative experiment evaluates how effective the anomaly detector could be for root cause analysis of the test failures; this was assessed in interviews with an expert on the software system that produced the log data used in the thesis. The results show that the LSTM anomaly detector achieved a higher F1 measure than the proposed baseline implementation, thanks to its ability to detect unusual events and events happening out of order. The qualitative results indicate that the anomaly detector could be used for root cause analysis: in many of the evaluated test failures, the interviewed expert could deduce the cause of the failure. Even when the detector did not find the exact issue, it could highlight a particular part of the software as producing many anomalous log messages, and with this information the expert could ask the people responsible for that part of the application for help. In conclusion, the anomaly detector automatically collects the information the expert needs to perform root cause analysis and could therefore save the expert time. With further improvements, non-experts could also use the anomaly detector, reducing the need for an expert.
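The abstract leaves the detector's architecture unspecified; a common realisation of LSTM-based log anomaly detection is a next-log-key predictor in the style of DeepLog, sketched below in PyTorch. The log stream, window size and top-k rule are invented for illustration and are not taken from the thesis.

```python
import torch
import torch.nn as nn

class LogKeyLSTM(nn.Module):
    """Predicts the next log-key ID from a window of preceding ones."""
    def __init__(self, num_keys, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(num_keys, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_keys)

    def forward(self, x):                  # x: (batch, window) of key IDs
        out, _ = self.lstm(self.embed(x))
        return self.head(out[:, -1, :])    # logits over the next key

seq = [0, 1, 2, 3] * 100                   # hypothetical repetitive log stream
W = 4
X = torch.tensor([seq[i:i + W] for i in range(len(seq) - W)])
y = torch.tensor([seq[i + W] for i in range(len(seq) - W)])

model = LogKeyLSTM(num_keys=4)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()
for _ in range(30):                        # toy training loop
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# flag a window as anomalous if the observed next key is outside the top-k predictions
with torch.no_grad():
    logits = model(torch.tensor([[0, 1, 2, 3]]))
    top2 = logits.topk(2).indices[0].tolist()
observed_next = 0                          # after 0,1,2,3 the stream repeats with 0
print("anomaly" if observed_next not in top2 else "normal")
```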
103

A Real-time Log Correlation System for Security Information and Event Management

Dubuc, Clémence January 2021 (has links)
Correlating several events over a period of time is a necessity for a threat detection platform. In the case of multistep attacks (attacks characterised by a sequence of executed commands), correlation makes it possible to detect the individual steps one by one and link them to raise an alert. It also makes it possible to detect abnormal behaviour in the IT system, for example multiple suspicious actions performed by the same account. Correlating security events increases the security of the system and reduces the number of false positives. The correlation is performed according to pre-existing correlation rules. The goal of this thesis is to evaluate the feasibility of a correlation engine based on Apache Spark. The current correlation system needs replacing because it is not scalable, cannot handle all the incoming data, and cannot perform some types of correlation, such as aggregating events by attributes or counting cardinality. The novelty lies in improving the performance and the correlation capabilities of the system. Two systems for correlating events are proposed in this project. The first is based on Apache Spark Structured Streaming and analyses the flow of security logs in real time. As its results are not satisfactory, a second system is implemented; it takes a more traditional approach, storing the logs in an Elastic Search cluster and running correlation queries against it. In the end, both systems are able to correlate the logs of the platform. Nevertheless, the system based on Apache Spark uses too many resources per correlation rule, and launching hundreds of correlation queries at the same time is too expensive. For these reasons, the system based on Elastic Search is preferred and is implemented in the workflow.
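For readers unfamiliar with correlation rules, the stateful aggregation described above (e.g. counting suspicious actions by the same account within a time window) can be sketched in a few lines of plain Python; the event format and the threshold are invented for illustration and say nothing about the platform's actual rules.

```python
from collections import defaultdict, deque

# hypothetical event stream: (timestamp in seconds, account, action)
events = [(0, "alice", "login_failed"), (10, "alice", "login_failed"),
          (20, "alice", "login_failed"), (25, "bob", "login_ok"),
          (30, "alice", "login_failed"), (400, "alice", "login_failed")]

WINDOW, THRESHOLD = 300, 4      # assumed rule: 4 failures per account within 5 minutes
recent = defaultdict(deque)     # per-account timestamps of recent failures

for ts, account, action in events:
    if action != "login_failed":
        continue
    q = recent[account]
    q.append(ts)
    while q and ts - q[0] > WINDOW:   # evict events that fell out of the window
        q.popleft()
    if len(q) >= THRESHOLD:
        print(f"ALERT: {len(q)} failed logins for {account} within {WINDOW}s at t={ts}")
```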
104

Détection dynamique des intrusions dans les systèmes informatiques / Dynamic intrusion detection in computer systems

Pierrot, David 21 September 2018 (has links)
The democratisation of the Internet, coupled with the effect of globalisation, has interconnected individuals, states and companies. The unpleasant side of this worldwide interconnection of information systems is a phenomenon called cybercrime: malicious individuals and groups aim to undermine the integrity of information systems, for financial gain or to serve a cause. The consequences of an intrusion can threaten the very existence of a company or organisation; the impacts amount to financial loss, damage to brand image and a perceived lack of seriousness. Detecting an intrusion is not an end in itself, and reducing the gap between detection and reaction has become a priority. The existing solutions prove relatively cumbersome to put in place, in terms of both required expertise and upkeep. Previous research has identified the best-performing data mining methods, but integrating them into an information system remains difficult. Capturing and converting the data requires significant computing resources and does not necessarily allow detection within acceptable time frames. Our contribution makes it possible to detect intrusions from a comparatively small amount of data. We use firewall events, which reduces the computing power required while limiting how much knowledge of the information system the people in charge of intrusion detection need. We propose an approach that takes into account the technical aspects, through a hybrid data mining method, as well as the functional aspects. The combination of the two is organised into four phases. The first phase consists of visualising and identifying network activities. The second phase concerns the detection of abnormal activities, applying data mining methods to the sources emitting the flows as well as to the targeted assets. The third and fourth phases use the results of a risk analysis and a technical security audit to prioritise the actions to be taken. Together, these points give an overall view of the hygiene of the information system as well as guidance on the monitoring and corrections required. The approach developed gave rise to a prototype named D113. This prototype, tested on an experimental platform on two architectures of different sizes, made it possible to validate our orientations and approaches. The results obtained are positive but leave room for improvement, and perspectives have been defined in that direction.
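The hybrid data mining method itself is not detailed in the abstract, so as a hedged illustration of the second phase only (detecting abnormal activity from firewall events), here is a toy clustering of per-source flow features with scikit-learn; the features, data and choice of k-means are assumptions, not the thesis's method.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# toy per-source-IP features: (denied connections/hour, distinct destination ports/hour)
normal = rng.normal(loc=[5.0, 3.0], scale=[2.0, 1.0], size=(200, 2))
scans = rng.normal(loc=[80.0, 120.0], scale=[10.0, 15.0], size=(5, 2))  # port-scan-like
X = StandardScaler().fit_transform(np.vstack([normal, scans]))

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
labels, counts = np.unique(km.labels_, return_counts=True)
suspect = labels[counts.argmin()]        # the small outlying cluster
print("suspected abnormal sources:", np.where(km.labels_ == suspect)[0])
```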
105

Technology and L2 writing : EFL student perspectives on electronic feedback using online learning logs

Zareekbatani, Alireza January 2015 (has links)
The use of instructional technology has opened up new avenues in education, with broad implications for the foreign or additional language (L2) learning context. One research priority is to explore student perceptions of the use of such modern means in their education, which might otherwise not be anticipated. The present study aimed to determine (a) the perceived affordances as well as limitations of the pedagogical application of information and communication technology (ICT) in providing coded corrective feedback (e-feedback) on L2 writing, (b) English as a foreign language (EFL) learners' perspectives on using e-feedback to reduce their local and global mistakes, and (c) the types of self-regulated learning (SRL) behaviour that, according to EFL students' self-reports, the electronic feedback and learning logs called forth in the cognitive, affective, and metacognitive domains. The participants (n=48) were high-intermediate to advanced EFL learners from four cohorts enrolled on an International English Language Testing System (IELTS) preparation course in a branch of the Institute of Science and Technology in Tehran. Each cohort went through 84 face-to-face tutorial sessions in four months. During this period, they also wrote essays and received e-feedback on 12 IELTS Writing Task 2 prompts, drafting each at least three times on an e-learning platform (www.ekbatani.ir) specially designed for this study. The data from all four cohorts were collected over the course of 11 months, using semi-structured interviews, online structured and unstructured learning logs, and an open-ended questionnaire, to provide an in-depth picture of student perceptions of this technology mediation. Through a purely qualitative research design, the log, interview, and open-ended questionnaire data were analysed, categorised and coded. The findings represented students' perceptions of the benefits of the e-feedback and learning logs as (i) offering a motivating and empowering means of providing EFL writing support, (ii) enhancing thinking and problem-solving processes, (iii) providing a flexible and fast scaffolding approach for L2 writing improvement, and (iv) encouraging student writers' active knowledge construction by helping them notice mistakes, focus on writing specifics, overcome the fear of writing, and grow confidence in L2 learning. The self-reported data indicated perceived limitations including (i) the time-consuming nature of the e-feedback processes, (ii) the occasional need for face-to-face discussions, additional peer feedback, and supplements to e-feedback such as on-demand e-tutorials, and (iii) an increased workload for the teacher in proportion to the number of students. Specific writing improvement was perceived locally in the use of punctuation and grammar, in spelling skills and the scope of vocabulary, and globally in organising ideas, finding ideas in the form of blueprints, and developing ideas into full-length essays.
The student perceptions demonstrated that the learner-centred e-feedback environment created different affordances for students’ cognitive, affective, and metacognitive behaviours: (i) cognitively, it assisted the use and development of various learning strategies, enhanced student EFL writing experience, and increased awareness of error patterns in their essays; (ii) affectively, it supported students’ motivational processes, ability to appraise their progress, restore, and sustain positivity, and greater perceived self-efficacy beliefs in their own L2 writing skills; finally, (iii) metacognitive affordances included the ability to rethink and amend their plans as well as seek out support, ability to reflect on the writing processes holistically, ability to self-monitor to remain on course, and ability to devise and implement a plan of action mostly by finding a strategy to deal with mistakes and by taking greater caution in writing their future drafts. Despite arising from a particular contextual framework with the experience of particular cohorts of students, the findings can hopefully be of value to researchers and practitioners in the fields of online language pedagogy, second language acquisition (SLA), EFL writing, and computer-assisted language learning (CALL) with communication uses. The findings can assist language courseware designers, e-feedback platform developers, and L2 writing course administrators to support and enhance their practices and decisions, especially in providing and implementing ICT and SRL initiatives in EFL writing.
106

Lithology and provenance of late Eocene - Oligocene sediments in eastern Taranaki Basin margin and implications for paleogeography

Hopcroft, Bradley Scott January 2009 (has links)
The latest Eocene and Oligocene was a time of marked paleoenvironmental change in Taranaki Basin, involving a transition from the accumulation of coal measures and inner shelf deposits to the development of upper bathyal environments. Up until the end of the Early Oligocene (Lower Whaingaroan Stage) Taranaki Basin had an extensional tectonic setting. Marine transgression culminated in the accumulation of condensed facies of the Matapo Sandstone Member of the lower part of the Ngatoro Group. During the Late Oligocene (Upper Whaingaroan Stage) Taranaki Basin's tectonic setting changed to one of crustal shortening, with basement overthrusting westward into the basin on the Taranaki Fault. The major part of the Ngatoro Group by thickness, including the Tariki Sandstone Member, Otaraoa Formation, Tikorangi Formation and Taimana Formation, accumulated in response to this change in tectonic setting. Various methods of stratigraphic and sedimentological characterisation have been applied to evaluate the stratigraphy of the Ngatoro Group. Wireline log records have been calibrated through particle sizing and carbonate digestion of well cuttings, and a suite of wireline motifs has been defined for formations and members of the Ngatoro Group. The integration with other lithological and paleoenvironmental data sources has helped to better define the Late Eocene - Oligocene stratigraphy and sedimentary facies for the eastern Taranaki Basin margin. U-Pb geochronology by laser ablation inductively coupled plasma-mass spectrometry (LA-ICP-MS) has been used to determine detrital ages for over 350 zircons from 13 samples of Late Eocene - Oligocene sandstone in eastern Taranaki Basin and correlative onshore North Island units. The spread of ages (1554 - 102 Ma) and the proportion of ages in particular age bands, integrated with modal petrography data, have aided provenance evaluation. A range of source rocks contributed to the Late Eocene - Oligocene sedimentary units analysed, mainly the Waipapa Terrane (Early Permian to Late Jurassic), as shown by 206Pb/238U zircon ages and the abundance of fine-grained sedimentary rock fragments observed in samples. The Median Batholith (i.e. Darran/Median Suite and Separation Point Suite) is also identified as a significant source, indicated by Early Triassic to Early Jurassic and Early Cretaceous 206Pb/238U zircon ages and an abundance of quartz in samples. Other minor sources identified include the Murihiku and Caples Terranes, the Rakaia Sub-terrane and possibly the Karamea Batholith. The Tariki Sandstone and the Hauturu Sandstone share the same source, with the main 206Pb/238U zircon ages of aggregated samples (124 - 116 Ma and 121 Ma, respectively) consistent with a Separation Point Suite/Median Batholith (124 - 116 Ma) source. Derivation of sediments from a landmass that existed to the east and southeast of the Wellington area has been inferred for the Late Eocene - Oligocene units, with subsequent migration of sediments northward into Taranaki Basin and the Waikato Region (i.e. the Te Kuiti Group depocentre) via longshore drift. The new provenance data have been used to revise understanding of the development of the eastern Taranaki Basin margin through the Late Eocene to earliest Miocene. Three new paleogeography maps are presented, for the Runangan (Late Eocene), Lower Whaingaroan (Early Oligocene) and Upper Whaingaroan (early-mid-Oligocene).
New paleogeography interpretations illustrate a dramatic change in the basin development between Matapo Sandstone (Lower Whaingaroan) and Tariki Sandstone (Upper Whaingaroan) deposition, consistent with an Upper Whaingaroan age for the start of reverse movement on Taranaki Fault.
107

Development and application of a 3D equation-of-state compositional fluid-flow simulator in cylindrical coordinates for near-wellbore phenomena

Abdollah Pour, Roohollah 06 February 2012 (has links)
Well logs and formation testers are routinely used for detection and quantification of hydrocarbon reserves. Overbalanced drilling causes invasion of mud filtrate into permeable rocks, hence radial displacement of in-situ saturating fluids away from the wellbore. The spatial distribution of fluids in the near-wellbore region remains affected by a multitude of petrophysical and fluid factors originating from the process of mud-filtrate invasion. Consequently, depending on the type of drilling mud (e.g. water- and oil-base muds) and the influence of mud filtrate, well logs and formation-tester measurements are sensitive to a combination of in-situ (original) fluids and mud filtrate in addition to petrophysical properties of the invaded formations. This behavior can often impair the reliable assessment of hydrocarbon saturation and formation storage/mobility. The effect of mud-filtrate invasion on well logs and formation-tester measurements acquired in vertical wells has been extensively documented in the past. Much work is still needed to understand and quantify the influence of mud-filtrate invasion on well logs acquired in horizontal and deviated wells, where the spatial distribution of fluids in the near-wellbore region is not axial-symmetric in general, and can be appreciably affected by gravity segregation, permeability anisotropy, capillary pressure, and flow barriers. This dissertation develops a general algorithm to simulate the process of mud-filtrate invasion in vertical and deviated wells for drilling conditions that involve water- and oil-base mud. The algorithm is formulated in cylindrical coordinates to take advantage of the geometrical embedding imposed by the wellbore in the spatial distribution of fluids within invaded formations. In addition, the algorithm reproduces the formation of mudcake due to invasion in permeable formations and allows the simulation of pressure and fractional flow-rate measurements acquired with dual-packer and point-probe formation testers after the onset of invasion. An equation-of-state (EOS) formulation is invoked to simulate invasion with both water- and oil-base muds into rock formations saturated with water, oil, gas, or stable combinations of the three fluids. The algorithm also allows the simulation of physical dispersion, fluid miscibility, and wettability alteration. Discretized fluid flow equations are solved with an implicit pressure and explicit concentration (IMPEC) scheme. Thermodynamic equilibrium and mass balance, together with volume constraint equations govern the time-space evolution of molar and fluid-phase concentrations. Calculations of pressure-volume-temperature (PVT) properties of the hydrocarbon phase are performed with Peng-Robinson's equation of state. A full-tensor permeability formulation is implemented with mass balance equations to accurately model fluid flow behavior in horizontal and deviated wells. The simulator is rigorously and successfully verified with both analytical solutions and commercial simulators. Numerical simulations performed over a wide range of fluid and petrophysical conditions confirm the strong influence that well deviation angle can have on the spatial distribution of fluid saturation resulting from invasion, especially in the vicinity of flow barriers. Analysis on the effect of physical dispersion on the radial distribution of salt concentration shows that electrical resistivity logs could be greatly affected by salt dispersivity when the invading fluid has lower salinity than in-situ water. 
The effect of emulsifiers and oil-wetting agents present in oil-base mud was studied to quantify wettability alteration and changes in residual water saturation. It was found that wettability alteration releases a fraction of otherwise irreducible water during invasion and this causes electrical resistivity logs to exhibit an abnormal trend from shallow- to deep-sensing apparent resistivity. Simulation of formation-tester measurements acquired in deviated wells indicates that (i) invasion increases the pressure drop during both drawdown and buildup regimes, (ii) bed-boundary effects increase as the wellbore deviation angle increases, and (iii) a probe facing upward around the perimeter of the wellbore achieves the fastest fluid clean-up when the density of invading fluid is larger than that of in-situ fluid.
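For reference, the Peng-Robinson equation of state invoked above has the standard published form (quoted from the literature, not extracted from the dissertation):

```latex
P = \frac{RT}{V_m - b} - \frac{a\,\alpha(T)}{V_m^2 + 2bV_m - b^2},
\qquad
a = 0.45724\,\frac{R^2 T_c^2}{P_c},
\qquad
b = 0.07780\,\frac{R T_c}{P_c},

\alpha(T) = \left[1 + \kappa\left(1 - \sqrt{T/T_c}\right)\right]^2,
\qquad
\kappa = 0.37464 + 1.54226\,\omega - 0.26992\,\omega^2,
```

where T_c and P_c are the component's critical temperature and pressure and ω its acentric factor; in a compositional simulator, mixture values of a and b are obtained from mixing rules over the components.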
108

Reservoir quality, structural architecture, fluid evolution and their controls on reservoir performance in block 9, F-O gas field, Bredasdorp Basin, offshore South Africa

Fadipe, Oluwaseun Adejuwon January 2012 (has links)
Philosophiae Doctor - PhD / The use of an integrated approach to evaluate the quality of reservoir rocks is becoming increasingly vital in petroleum geoscience. Such an approach was employed here to unravel the reason for the erratic reservoir quality of the sandstones of the F-O gas field, with the aim of predicting reservoir quality and evaluating the samples for the presence, distribution and character of hydrocarbon inclusions, so as to gain a better understanding of the fluid history. Information on the chemical conditions of diagenetic processes is commonly preserved in aqueous and oil fluid inclusions occurring in petroleum reservoir cements. Diagenesis plays a vital role in preserving, creating or destroying porosity and permeability, while awareness of the type of trap(s) prior to drilling serves as input for appropriate drilling designs. An in-depth understanding of the diagenetic histories and trap mechanisms of potential reservoirs is therefore of paramount interest during the exploration stage. This research focused on the F-O tract located in the eastern part of Block 9 on the north-eastern flank of the Bredasdorp Basin, a sub-basin of the Outeniqua Basin on the southern continental shelf, offshore South Africa. The Bredasdorp Basin experienced the onset of rifting during the Middle-Late Jurassic as a result of dextral trans-tensional stress produced by the breakup of Gondwanaland east of the Falkland Plateau and the Mozambique Ridge. This phenomenon initiated normal faulting north of the Agulhas-Falkland fracture zone, followed by widespread uplift of major bounding arches within the horst blocks in the region, which enhanced erosion of the lower Valanginian drift up to the second-order drift-onset unconformity. This study considered 52 selected reservoir core samples from six wells (F-O1, F-O2, F-O3, F-O4, F-R1 and F-S1) in the F-O field of the Bredasdorp Basin, with attention on the Valanginian-age sandstone. An integrated approach was deployed, incorporating detailed core descriptions, wireline log analysis (using Interactive Petrophysics), structural interpretation of 2D seismic lines (using SMT software) cutting across all six wells, multi-mineral analyses (thin section, SEM, XRD), geochemical analyses (immobile fluid and XRF) and fluid inclusion analyses (fluid inclusion petrography and bulk volatiles). Core description revealed six facies across the six wells, grading from pure shale (Facies 1), through progressively coarsening interbedded sand-shale heterolithic facies (Facies 2 - 4), to cross-bedded and minor massive sandstone (Facies 5 - 6). Sedimentary structures and mineral patches vary from well to well, with bioturbation, synaeresis cracks, echinoid fragments, fossil burrows, foreset mudrapes, glauconite and siderite as the main observed features. All of these indicate that the Valanginian reservoir section in the studied wells was deposited in an upper shallow marine setting. A combination of wireline logs was used to delineate the reservoir zone prior to core description. The principal reservoirs are tight, highly faulted Valanginian shallow-marine sandstones beneath the drift-onset unconformity 1At1, deposited as an extensive sandstone "sheet" within a tidal setting. The top and base of the reservoir are defined by the 13At1 and 1At1 seismic events, respectively. These heterogeneous reservoir sandstones present low to fair porosity of 2 to 18 % and low to fair permeability from greater than 0.1 up to 10 mD.
The evolution of the F-O field was found to be controlled by extensional events, owing to a series of interpreted listric normal faults and rifting or graben formation possibly generated by the opening of the Atlantic. The field sits on a well-defined structural high at the level of the regional drift-onset unconformity, 1At1. Multi-mineral analysis reveals the presence of quartz and kaolinite as the major porosity and permeability constraints, respectively, along with micaceous phases. The distribution of quartz and feldspar overgrowths and crystals varies from formation to formation and from bed to bed within the same structure. The increase in temperature that led to kaolinite formation could have triggered the low porosity observed. Three types of kaolinite were recognised in the sandstone: (1) kaolinite growing between expanded mica flakes; (2) vermiform kaolinite; and (3) euhedral kaolinite crystals forming matrix. Compositional study of the upper shallow marine sandstones of Valanginian age indicates that the sandstones are geochemically classified mainly as litharenite, with a few F-O2 samples as subarkose and all F-O1 samples as sublitharenite. Most of the studied wells contain mainly wet gas, characterised by a strong response of C2 - C5, while the F-O1 well shows more gas condensate with oil shows (C7 - C11), based on the number of carbon atoms present. In some cases, sulphur species (characterised by the presence of H2S, S2, CS2 and SO2), probably of thermal origin, were identified, while some log signatures revealed aromatic-enriched sandstones, possibly detecting nearby gas charges. Based on fluid inclusion bulk volatile analysis, the studied wells in the F-O field are classified as gas discoveries, except for F-O1 with gas condensate and oil shows. The integration of the multi-mineral results and fluid inclusion studies shows dead oil stain with no visible liquid petroleum inclusions in the samples, indicating the presence of quartz, kaolinite and stylolite as the major poro-perm constraints.
109

Modelo de evaluación de métricas de control para procesos de negocio utilizando Process Mining / Control Metrics Evaluation Model for Business Processes using Process Mining

García Oliva, Rodrigo Alfonso, Santos Barrenechea, Jesús Javier 24 October 2020 (has links)
This project aims to analyse the complexity of business processes in retail companies at a depth that other techniques make very difficult or even impossible to reach. With Process Mining it is possible to close this gap, and that is what we want to demonstrate through the implementation of a model. The project proposes a Process Mining model that accounts for the presence of multiple information sources for a logistics process in a retail company and applies the three phases of Process Mining (Discovery, Conformance and Enhancement). In addition, a diagnostic phase is proposed, which details a set of control metrics for evaluating the logistics process, so that an improvement plan can be generated with guidelines for optimising the process based on what this technique reveals. The model was implemented in a Peruvian retail company (TopiTop S.A.) to analyse the logistics process, specifically the management of purchase orders. Applying the model and evaluating the proposed metrics identified anomalies in the process across each phase of the proposed model, ensured the quality of the analysis in the pre-processing phase, generated the process model, and derived control metrics using the open-source tool ProM Tools.
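ProM Tools is a Java GUI application rather than a scripting library, so as a language-neutral illustration of the discovery phase only, here is a minimal directly-follows-graph miner in plain Python over an invented purchase-order event log; discovery algorithms such as the inductive miner build on exactly this relation, and both the log and the "paid without goods receipt" check are hypothetical.

```python
from collections import defaultdict

# hypothetical purchase-order event log: (case_id, activity), time-ordered per case
log = [(1, "create PO"), (1, "approve PO"), (1, "receive goods"), (1, "pay invoice"),
       (2, "create PO"), (2, "approve PO"), (2, "pay invoice"),   # deviating case
       (3, "create PO"), (3, "approve PO"), (3, "receive goods"), (3, "pay invoice")]

traces = defaultdict(list)
for case, act in log:
    traces[case].append(act)

dfg = defaultdict(int)                    # directly-follows counts between activities
for acts in traces.values():
    for a, b in zip(acts, acts[1:]):
        dfg[(a, b)] += 1

for (a, b), n in sorted(dfg.items(), key=lambda kv: -kv[1]):
    print(f"{a} -> {b}: {n}")

# a conformance-style control metric: cases that pay without a goods receipt
for case, acts in traces.items():
    if "pay invoice" in acts and "receive goods" not in acts:
        print(f"case {case}: paid without goods receipt (anomaly)")
```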
110

Štípací stroj na dřevo / Wood-splitting machine

Henzl, Lukáš January 2014 (has links)
The thesis describes the design of a wood-splitting machine, including stress and check calculations and the design of the hydraulic circuit. For the structural design it is necessary to choose the maximum dimensions of the wooden logs, here a length of 1.1 m and a diameter of 0.5 m and above. The machine has a vertical structure. Further tasks of the thesis are a survey of similar machines and the justification of the chosen design concept of the splitting machine. The thesis also presents an experiment on different types of wood splitters, a stress analysis calculated using the finite element method, and a summary of the technical parameters of wood.
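A back-of-the-envelope sizing check of the kind such a design involves: the splitting force of a hydraulic cylinder is simply pressure times piston area. The working pressure and bore below are assumed round numbers, not values from the thesis:

```latex
F = p \cdot A = p\,\frac{\pi d^2}{4}
  = 25\ \mathrm{MPa} \cdot \frac{\pi\,(0.10\ \mathrm{m})^2}{4}
  \approx 196\ \mathrm{kN} \approx 20\ \mathrm{t},
```

a magnitude consistent with heavy vertical splitters intended for half-metre-diameter logs.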
