151. Analysis and management of wood room - Isokangas, A. (Ari), 10 August 2010
Abstract
The objective of this work was to study the effect of adjustable process parameters on wood loss and bark removal in tumble drum debarking. The effect of capacity on the size distribution of the chips was studied in order to determine the optimal capacity for both the debarking and chipping sub-processes. The final aim was to propose a control strategy to optimise the processes by adapting their parameters according to the quality of the raw material.
When the research started, earlier automation systems had focused on keeping the process running, and economic factors such as wood loss were not considered important. The process is usually controlled manually, and different shifts manage it in different ways based on trial and error. Bark removal in chemical pulp mills is usually higher than the values recommended in the literature, which in turn causes log breakage in the drum and increases wood loss. Even a small reduction in wood loss could have a substantial financial impact. An earlier shortage of raw materials and the recent recession have both highlighted the importance of more efficient log use.
Data survey techniques were employed to reveal the interactions between drum variables from noisy measurements. Wood room data were analysed by modelling and deriving conclusions from the resulting parameters. In addition, log breaking and the size distribution of the chips were analysed under different process conditions. A pilot-scale drum was used to study residence time and the mechanical abrasion of logs.
The results of this work indicated that the ratio of the volume of logs in the drum to capacity determines the residence time of the logs in the drum. Other process variables influence the volume of logs in the drum; together with capacity, this volume determines the residence time, which in turn affects wood loss and bark removal. The effect of capacity on the size distribution of the chips was not unambiguous, however, and it was therefore recommended to operate the wood room at high capacity, because this reduces wood loss and increases annual production. The proposed control strategy adapts the residence time of the logs to the quality of the debarked raw material by controlling the position of the closing gate. In addition, the strategy adjusts the rotation speed of the drum using open-loop control.
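The core relationship reported here, residence time determined by the ratio of in-drum log volume to capacity, together with the gate-based control idea, can be sketched as follows (a minimal illustration with invented numbers and a simple proportional correction, not the thesis's actual control law):

```python
def residence_time(drum_volume_m3, capacity_m3_per_h):
    """Estimate mean residence time (h) of logs in the drum.

    Illustrative plug-flow approximation: residence time equals the
    volume of logs held in the drum divided by throughput capacity.
    """
    if capacity_m3_per_h <= 0:
        raise ValueError("capacity must be positive")
    return drum_volume_m3 / capacity_m3_per_h


def gate_adjustment(current_time_h, target_time_h, gain=0.5):
    """Proportional correction for the closing-gate position.

    Positive output -> open the gate further (shorter residence time);
    negative output -> close it (longer residence time). The gain is
    an arbitrary illustrative value.
    """
    return gain * (current_time_h - target_time_h)


# A drum holding 120 m3 of logs at 80 m3/h throughput:
t = residence_time(120, 80)        # 1.5 h in the drum
adjust = gate_adjustment(t, 1.0)   # positive -> open the gate to shorten residence
```

In this sketch a positive adjustment would reduce log damage and wood loss, matching the strategy described for excessive debarking.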
The results can be used to optimise the wood room process parameters. If the problem in the wood room is excessive debarking, the residence time of the logs can be reduced and the rotation speed of the drum lowered. In this way the logs are damaged less and wood loss is reduced. Bark removal requirements in mechanical pulp mills are high, and the process parameters can be adapted to avoid problems in subsequent processes caused by excessive bark.
152. Numerical techniques for optimal investment consumption models - Mvondo, Bernardin Gael, January 2014
Magister Scientiae - MSc. The problem of optimal investment has been extensively studied by numerous researchers seeking to generalize the original framework. These generalizations have been made in different directions and with different techniques. For example, Perera [Optimal consumption, investment and insurance with insurable risk for an investor in a Levy market, Insurance: Mathematics and Economics, 46 (3) (2010) 479-484] applied the martingale approach to obtain a closed-form solution for the optimal investment, consumption and insurance strategies of an individual facing an insurable risk, when the insurable risk and risky asset returns are described by Lévy processes and the utility function exhibits constant absolute risk aversion. In another work, Sattinger [The Markov consumption problem, Journal of Mathematical Economics, 47 (4-5) (2011) 409-416] gave a model of consumption behavior under uncertainty as the solution to a continuous-time dynamic control problem in which an individual moves between employment and unemployment according to a Markov process. In this thesis, we review the consumption models in the above framework and simulate some of them using an infinite series expansion method, a key focus of this research. Several numerical results obtained using MATLAB are presented with detailed explanations.
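As a baseline for the optimal investment framework these generalizations build on, the classical Merton solution for constant relative risk aversion gives a closed-form optimal risky-asset fraction; the sketch below computes it (this is the textbook benchmark only, not the Lévy-market or Markov-switching models discussed above):

```python
def merton_fraction(mu, r, sigma, gamma):
    """Optimal constant fraction of wealth in the risky asset for CRRA
    utility in the classical Merton model:

        pi* = (mu - r) / (gamma * sigma**2)

    mu: expected risky return, r: risk-free rate,
    sigma: volatility, gamma: relative risk aversion coefficient.
    """
    return (mu - r) / (gamma * sigma ** 2)


# Example: 8% expected return, 2% risk-free rate, 20% volatility, gamma = 3
pi_star = merton_fraction(0.08, 0.02, 0.20, 3.0)  # half of wealth in the risky asset
```

Note the fraction is constant in time and wealth, which is exactly the property the generalized models above relax.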
153. Event Mining for System and Service Management - Tang, Liang, 18 April 2014
Modern IT infrastructures are built from large-scale computing systems and administered by IT service providers. Manually maintaining such large computing systems is costly and inefficient, so service providers seek automatic or semi-automatic methodologies for detecting and resolving system issues in order to improve their service quality and efficiency. This dissertation investigates several data-driven approaches for assisting service providers in achieving this goal. The problems studied fall into three aspects of the service workflow: 1) preprocessing raw textual system logs into structured events; 2) refining monitoring configurations to eliminate false positives and false negatives; and 3) improving the efficiency of system diagnosis on detected alerts. Solving these problems usually requires a great deal of domain knowledge about the particular computing systems. The approaches investigated in this dissertation are built on event mining algorithms, which can automatically derive part of that knowledge from historical system logs, events and tickets.
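A minimal illustration of the first step, turning raw textual logs into structured events, is to mask variable tokens and group lines by the resulting template (the dissertation's clustering algorithms are more sophisticated; the regexes and sample logs below are assumptions for illustration):

```python
import re
from collections import defaultdict


def to_template(log_line):
    """Mask IP-like tokens, hex IDs and numbers so lines produced by the
    same print statement collapse to one event template."""
    line = re.sub(r"\b\d+\.\d+\.\d+\.\d+\b", "<IP>", log_line)
    line = re.sub(r"\b0x[0-9a-fA-F]+\b", "<HEX>", line)
    line = re.sub(r"\b\d+\b", "<NUM>", line)
    return line


def group_events(log_lines):
    """Group raw log lines under their shared template (event type)."""
    events = defaultdict(list)
    for line in log_lines:
        events[to_template(line)].append(line)
    return events


logs = [
    "connection from 10.0.0.1 port 5001",
    "connection from 10.0.0.7 port 5002",
    "disk usage at 91 percent",
]
groups = group_events(logs)
# Two templates emerge: "connection from <IP> port <NUM>" (2 lines)
# and "disk usage at <NUM> percent" (1 line)
```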
In particular, two textual clustering algorithms are developed for converting raw textual logs into system events. For refining the monitoring configuration, a rule-based alert prediction algorithm is proposed for eliminating false alerts (false positives) without missing any real alerts, and a textual classification method is applied to identify missing alerts (false negatives) from manual incident tickets. For system diagnosis, this dissertation presents an efficient algorithm for discovering temporal dependencies between system events and their corresponding time lags, which can help administrators determine the redundancy of deployed monitoring situations and the dependencies between system components. To improve the efficiency of incident ticket resolution, several KNN-based algorithms that recommend relevant historical tickets, along with their resolutions, for incoming tickets are investigated. Finally, this dissertation offers a novel algorithm for searching similar textual event segments over large system logs, helping administrators locate similar system behaviors. Extensive empirical evaluation on system logs, events and tickets from real IT infrastructures demonstrates the effectiveness and efficiency of the proposed approaches.
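The KNN-based ticket recommendation step can be sketched with bag-of-words cosine similarity (the toy tickets and resolutions are invented; real systems would use richer features):

```python
import math
from collections import Counter


def cosine(a, b):
    """Cosine similarity of two word-count vectors (Counter objects)."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def recommend(history, incoming, k=2):
    """Return the k historical (ticket, resolution) pairs whose ticket
    text is most similar to the incoming ticket."""
    q = Counter(incoming.lower().split())
    scored = [(cosine(q, Counter(t.lower().split())), t, res)
              for t, res in history]
    scored.sort(key=lambda x: x[0], reverse=True)
    return [(t, res) for _, t, res in scored[:k]]


history = [
    ("database connection timeout on server db01", "restart connection pool"),
    ("high cpu usage on web frontend", "scale out web tier"),
    ("database deadlock detected on db02", "kill blocking session"),
]
hits = recommend(history, "database connection timeout errors", k=2)
# The most similar historical ticket's resolution is recommended first
```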
154. An approach to outlier detection in categorical data (Uma abordagem para detecção de outliers em dados categóricos) - Silva, Flávio Roberto, 27 February 2004
Advisor: Geovane Cayres Magalhães. Dissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Computação.
Abstract: Outliers are elements that do not conform to the pattern of the dataset they belong to. Outlier detection can reveal unexpected and important information for applications such as fraud discovery in telephone and credit-card systems and intrusion detection. This Master's thesis presents a new approach for outlier detection in databases with categorical attributes. The proposed approach uses log-linear models as a pattern for the dataset, which makes it easier for the user to interpret the results. It also presents FOCaD (Finding Outliers in Categorical Data), a prototype categorical data analysis system that fits and selects models, performs statistical tests, and detects outliers.
Mestrado: Mestre em Ciência da Computação
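The idea of using a log-linear model as the pattern for a categorical dataset can be sketched with the simplest such model, the independence model, flagging cells whose observed counts deviate strongly from expectation (an illustrative simplification; FOCaD fits and selects richer models):

```python
import math
from collections import Counter


def outlier_cells(records, threshold=2.0):
    """Flag attribute-value pairs whose observed count deviates from the
    count expected under the independence model (the simplest log-linear
    model) by more than `threshold` standardized residuals.

    records: list of (attribute_a, attribute_b) tuples.
    Returns [(cell, observed, expected), ...] for flagged cells.
    """
    n = len(records)
    a_counts = Counter(a for a, _ in records)
    b_counts = Counter(b for _, b in records)
    ab_counts = Counter(records)
    flagged = []
    for (a, b), observed in ab_counts.items():
        expected = a_counts[a] * b_counts[b] / n
        residual = (observed - expected) / math.sqrt(expected)
        if abs(residual) > threshold:
            flagged.append(((a, b), observed, round(expected, 2)))
    return flagged


# Strongly associated toy data: every cell deviates from independence
records = (40 * [("x", "u")] + 10 * [("x", "v")] +
           10 * [("y", "u")] + 40 * [("y", "v")])
flagged = outlier_cells(records)
```

Here the interpretability benefit is visible: each flagged cell comes with its observed and model-expected count, which is exactly what makes log-linear patterns easy for a user to read.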
155. Unobserved heterogeneity in event history analysis (Heterogeneidade não-observada na análise da história de eventos) - Matuda, Nivea da Silva, 15 June 1998
Advisor: Aida C. G. Verdugo Lazo. Dissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica.
Abstract: Not informed.
Mestrado: Mestre em Estatística
156. A Descriptive Analysis of the Use and Effect of a Self-management Project in an Undergraduate Course in Behavior Analysis - Lamancusa, Michelle, 05 1900
Undergraduate male and female students enrolled in an introductory behavior analysis course with minimal instruction on self-management were given modified exploratory logs to use in a self-management project. Students self-monitored behavior via the log, constructed their own interventions, and reported changes in behavior and the extent of their success in a write-up at the end of the course. Changes in self-reported descriptions in the logs, as well as the written results of pre- and post-surveys of emotional responses, were counted. Most students reported successful self-management interventions. Correspondence between planned and actual events increased. Negative reinforcement procedures characterized most students' interventions. Correspondence between survey reports and actual log entries was highest at the post-survey.
157. Petrophysical Analysis of Wells in the Arikaree Creek Field, Colorado, to Develop a Predictive Model for High Production - DePriest, Keegan, 01 December 2019
All the oil and gas wells producing in the Arikaree Creek Field, Colorado, targeted the Spergen Formation along similar structures within a wrench fault system; however, the wells have vastly different production values. This thesis develops a predictive model for high production in the field while also accounting for a failed waterflood initiated in 2016. Petrophysical analysis of thirteen wells shows that the high-producing wells share common characteristics of pay zone location, lithology, porosity and permeability, and that the Spergen Formation is not homogeneous. Highly productive wells have pay zones in the lower part of the formation, in sections that are dolomitized and have anomalously high water saturation. This is likely related to the paragenesis of the formation, which dolomitized its lower parts, increasing porosity and permeability but leaving the pay zones with high water saturation values. This heterogeneity likely accounts for the failed waterflood. The results show that the important petrophysical components for highly productive wells are the location of the pay zone within the reservoir, porosity, permeability and water saturation. Additionally, homogeneity is crucial for successful waterflooding, and it was not present here.
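Water saturation in analyses of this kind is conventionally computed from resistivity logs with Archie's equation; a sketch (the parameter values are generic textbook defaults, not the field-calibrated ones used in the thesis):

```python
def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Archie's equation for water saturation:

        Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n)

    rw: formation water resistivity (ohm-m), rt: true formation
    resistivity (ohm-m), phi: porosity (fraction). a (tortuosity),
    m (cementation) and n (saturation exponent) default to common
    generic values; carbonate work typically recalibrates them.
    """
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)


# A 12%-porosity dolomitized interval, Rw = 0.05 ohm-m, Rt = 20 ohm-m:
sw = archie_sw(rw=0.05, rt=20.0, phi=0.12)  # fraction of pore space holding water
```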
158. Algebraic Methods for the Estimation of Statistical Distributions - Grosdos Koutsoumpelias, Alexandros, 15 July 2021
This thesis deals with the problem of estimating statistical distributions from data. In the first part, the method of moments is used in combination with computational algebraic techniques in order to estimate parameters coming from local Dirac mixtures and their convolutions. The second part focuses on the nonparametric setting, in particular on combinatorial and algebraic aspects of the estimation of log-concave distributions.
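The method-of-moments idea for Dirac mixtures can be illustrated in the simplest two-point case, where Prony's method recovers locations and weights from the first four moments (a sketch of the principle only; the thesis treats local Dirac mixtures, their convolutions, and the attendant algebraic machinery):

```python
import math


def two_point_dirac_from_moments(m0, m1, m2, m3):
    """Recover a two-point Dirac mixture w1*d(x1) + w2*d(x2) from its
    moments m_k = w1*x1**k + w2*x2**k via Prony's method."""
    # Solve the Hankel recurrence m_{k+2} = c1*m_{k+1} + c0*m_k, k = 0, 1
    det = m1 * m1 - m0 * m2
    c1 = (m1 * m2 - m0 * m3) / det
    c0 = (m1 * m3 - m2 * m2) / det
    # The locations are the roots of t**2 - c1*t - c0
    disc = math.sqrt(c1 * c1 + 4 * c0)
    x1, x2 = (c1 - disc) / 2, (c1 + disc) / 2
    # Weights follow from m0 = w1 + w2 and m1 = w1*x1 + w2*x2
    w2 = (m1 - m0 * x1) / (x2 - x1)
    w1 = m0 - w2
    return (w1, x1), (w2, x2)


# Moments of 0.3*d(1) + 0.7*d(4): m0 = 1, m1 = 3.1, m2 = 11.5, m3 = 45.1
params = two_point_dirac_from_moments(1.0, 3.1, 11.5, 45.1)
```

The algebraic step, factoring a polynomial whose coefficients come from a linear system in the moments, is the elementary prototype of the computational algebraic techniques used in the first part of the thesis.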
159. A Novel Authentication and Validation Mechanism for Analyzing Syslogs Forensically - Monteiro, Steena D.S., 01 December 2008
This research proposes a novel technique for authenticating and validating syslogs for forensic analysis. The technique uses a modification of the Needham-Schroeder protocol, which employs nonces (numbers used only once) and public keys. Syslogs, which were developed from an event-logging perspective rather than an evidence-sustaining one, are system treasure maps that chart out and pinpoint attacks and attack attempts. Over the past few years, research on securing syslogs has yielded enhanced syslog protocols that focus on tamper prevention and detection. However, many of these protocols, though efficient from a security perspective, are inadequate when forensics comes into play. From a legal perspective, any kind of evidence found at a crime scene needs to be validated, and any digital forensic evidence presented in court needs to be admissible, authentic, believable, and reliable. Currently, a patchy log on the server and client sides cannot be considered formal authentication of a wrongdoer. This work presents a method that ties together, authenticates, and validates all the entities involved in the crime scene: the user using the application, the system being used, and the application being used on the system by the user. This means that instead of merely transmitting the header and the message, which is the standard syslog protocol format, the syslog entry is transmitted to the logging server along with a user fingerprint, an application fingerprint, and a system fingerprint. The assignment of digital fingerprints and the addition of a challenge-response mechanism to the underlying syslogging mechanism aim to validate generated syslogs forensically.
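The fingerprinting idea can be illustrated with keyed hashes; the sketch below uses HMACs as a stand-in for the thesis's public-key and nonce-based scheme, so every detail here (key handling, entry format, entity names) is a simplifying assumption:

```python
import hashlib
import hmac
import json


def fingerprint(secret, data):
    """Keyed fingerprint of an entity (user, application, or system)."""
    return hmac.new(secret, data.encode(), hashlib.sha256).hexdigest()


def make_syslog_entry(secret, header, message, user, app, system):
    """Attach user, application and system fingerprints to a syslog
    entry, instead of transmitting only the header and message."""
    entry = {
        "header": header,
        "message": message,
        "user_fp": fingerprint(secret, user),
        "app_fp": fingerprint(secret, app),
        "system_fp": fingerprint(secret, system),
    }
    return json.dumps(entry)


def verify_entry(secret, entry_json, user, app, system):
    """Check that the fingerprints tie the entry to the claimed entities."""
    entry = json.loads(entry_json)
    return (hmac.compare_digest(entry["user_fp"], fingerprint(secret, user))
            and hmac.compare_digest(entry["app_fp"], fingerprint(secret, app))
            and hmac.compare_digest(entry["system_fp"],
                                    fingerprint(secret, system)))


key = b"logging-server-shared-key"  # stand-in for the public-key machinery
e = make_syslog_entry(key, "<34>Oct 11 22:14:15", "failed login",
                      "alice", "sshd", "host01")
```

A mismatched entity (say, a different user) fails verification, which is the property that lets the log tie a specific user, application, and system to an event.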
160. Measuring and Modeling of Plant Root Uptake of Organic Chemicals - Dettenmaier, Erik, 01 December 2008
Determining the root uptake of xenobiotic organic chemicals into plants is critical for assessing the human and ecological health risks associated with the consumption of plants grown in contaminated environments. Root uptake of xenobiotic organics occurs passively in conjunction with transpiration, and transport from root to shoot is ultimately controlled by passage through one or more lipid root membranes. The transpiration stream concentration factor (TSCF), the ratio of a chemical's concentration in the xylem to its concentration in the solution taken up by the roots, describes the relative ability of an organic chemical to be passively transported from root to shoot. However, relatively few experimental TSCF values exist, owing to the cost of generating such data and the lack of regulatory requirements for doing so. Where literature data exist for chemicals with more than one measured TSCF, the variability is often large due to the lack of standardized methods and the difficulty of accounting for metabolism and volatilization losses during uptake experiments. Because of this scarcity of experimental values, estimated TSCFs are often used. Widely cited estimation approaches relating TSCF to the logarithm of the octanol/water partition coefficient (log Kow) suggest that only compounds of intermediate lipophilicity (log Kow near 2) will be taken up and translocated by plants. However, recent data for highly water-soluble compounds such as 1,4-dioxane, MTBE, and sulfolane suggest that these estimation techniques should be critically reviewed. To re-evaluate the relationship between TSCF and log Kow, TSCFs were measured for 25 organic chemicals ranging in log Kow from -0.8 to 5 using an improved pressure chamber technique. The technique provides an approach for efficiently generating consistent plant uptake data.
Using these data, a new mass transfer model relating TSCF and log Kow was developed, indicating that neutral, polar organic compounds are the most likely to be taken up by plant roots and translocated to shoot tissue. An extensive review of literature TSCF studies supports the updated model.
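For context, the bell-shaped and sigmoidal forms of the TSCF-log Kow relationship can be compared numerically; the coefficients below are the commonly quoted literature values (a Briggs et al.-style bell curve and a Dettenmaier et al.-style sigmoid) and should be checked against the original papers before use:

```python
import math


def tscf_bell(log_kow):
    """Bell-shaped relationship of the Briggs et al. form: uptake peaks
    for intermediate lipophilicity, near log Kow = 1.78."""
    return 0.784 * math.exp(-((log_kow - 1.78) ** 2) / 2.44)


def tscf_sigmoidal(log_kow):
    """Sigmoidal model of the Dettenmaier et al. form: TSCF stays high
    for polar (low log Kow) compounds and declines with lipophilicity."""
    return 11.0 / (11.0 + 2.6 ** log_kow)


# A highly water-soluble compound (log Kow = -0.3, in the 1,4-dioxane range):
polar = tscf_sigmoidal(-0.3)  # near 1: readily translocated under this model
bell = tscf_bell(-0.3)        # much lower under the bell-shaped curve
```

The disagreement at low log Kow is exactly the discrepancy the pressure-chamber measurements above were designed to resolve.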