71
Survival Instantaneous Log-Odds Ratio From Empirical Functions. Jung, Jung Ah; Drane, J. Wanzer. 01 January 2007
The objective of this work is to introduce a new method called the Survivorship Instantaneous Log-odds Ratios (SILOR); to illustrate the creation of SILOR from empirical bivariate survival functions; to derive standard errors of estimation; and to compare results with those derived from logistic regression. Hip fracture, AGE, and BMI from the Third National Health and Nutrition Examination Survey (NHANES III) were used to calculate empirical survival functions for the adverse health outcome (AHO) and non-AHO groups. A stable copula was used to create a parametric bivariate survival function, which was fitted to the empirical bivariate survival function. The bivariate survival function had SILOR contours that were not constant. The proposed method has two advantages over logistic regression, concerning (i) the shapes of the survival surfaces, S(X1, X2), and (ii) the isobols of the log-odds ratios. With logistic regression the survival surface is either a hyperplane or at most a conic section. Our approach preserves the shape of the survival surface in two dimensions, and the isobols are observed in every detail instead of being overly smoothed by a regression with no more than a second-degree polynomial. The present method is straightforward, and it captures all but the random variability of the data.
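To make the construction concrete, the following is a minimal numpy sketch of empirical bivariate survivor functions for an AHO and a non-AHO group, and of a log-odds surface computed from them. It is one plausible reading of the abstract rather than the thesis's exact SILOR definition, and the synthetic arrays merely stand in for the NHANES III AGE and BMI variables.

```python
# Hedged sketch: empirical bivariate survivor functions and a log-odds surface.
# Synthetic data stand in for NHANES III AGE/BMI; the thesis's exact SILOR definition may differ.
import numpy as np

rng = np.random.default_rng(0)
age_aho, bmi_aho = rng.normal(70, 8, 200), rng.normal(24, 3, 200)      # AHO (hip fracture) group
age_non, bmi_non = rng.normal(55, 12, 2000), rng.normal(27, 4, 2000)   # non-AHO group

def empirical_bivariate_survival(x1, x2, grid1, grid2):
    """S(a, b) = proportion of subjects with X1 > a and X2 > b."""
    return np.array([[np.mean((x1 > a) & (x2 > b)) for b in grid2] for a in grid1])

ages = np.linspace(40, 85, 46)
bmis = np.linspace(18, 35, 35)
s_aho = empirical_bivariate_survival(age_aho, bmi_aho, ages, bmis)
s_non = empirical_bivariate_survival(age_non, bmi_non, ages, bmis)

# One way to form a log-odds surface from the two survivor functions: the log odds
# of belonging to the AHO group among subjects beyond each (AGE, BMI) point.
n_aho, n_non = len(age_aho), len(age_non)
eps = 1e-9
log_odds = np.log((n_aho * s_aho + eps) / (n_non * s_non + eps))
print(log_odds.shape)  # a (46, 35) surface whose contours need not be straight lines
```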
72
The Rhetoric Revision Log: A Second Study on a Feedback Tool for ESL Student Writing. Cole, Natalie Marie. 01 December 2017
A common pattern in teacher feedback on ESL writing is to give students feedback primarily on grammar, often sidelining content (Ferris, 2003). This research is a second study of the rhetoric revision log originally examined by Yi (2010). The Rhetoric Revision Log (RRL) helped teachers and students track content errors in writing. This research further studies the success of the RRL, with some minor changes made based on the previous results. Data consist of RRLs completed by 42 students in three different ESL writing classes at the same level with four different teachers. All students' pretests, posttests, survey responses regarding the use of the log, interview responses regarding the log, and data on needed content-based revisions were analyzed. Teachers' interview responses were examined as well to draw conclusions about the efficacy of the log. Results show that the use of the RRL helped students reduce content errors in writing. Findings from student surveys and interviews indicate that a majority of students found the RRL beneficial, and teacher interviews provided positive feedback about the implementation of the log in ESL writing classes.
73
Root Cause Analysis and Classification for Firewall Log Events Using NLP Methods. Wang, Tongxin. January 2022
Network log records are robust evidence for enterprises when diagnosing errors. Ericsson's Networks team currently troubleshoots mainly by manual observation. However, as the system grows larger and more complex, the volume of log messages keeps increasing, and it becomes vital to discern the root cause of error logs accurately and quickly. This thesis applies Natural Language Processing (NLP) methods in models that address two main problems: moving manual root-cause classification of logs to automated classification, and building a Question Answering (QA) system that returns the root cause directly. The models are validated on Ericsson's firewall traffic data. Different feature extraction methods and classification models are compared: the more effective Term Frequency-Inverse Document Frequency (TF-IDF) features combined with a Random Forest classifier obtain an F1 score of 0.87, and a fine-tuned Bidirectional Encoder Representations from Transformers (BERT) classifier obtains an F1 score of 0.90. The validated QA model also performs well in quality assessment. The final results demonstrate that the proposed models can improve on manual analysis. When choosing algorithms, deep learning models such as BERT can produce similar or even better results than Random Forest and Naive Bayes classifiers. However, BERT is more complex to implement, since it requires more resources and more caution than simpler solutions.
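As an illustration of the TF-IDF plus Random Forest pipeline described above, a minimal scikit-learn sketch might look like the following; the file name, column names, and hyperparameters are assumptions for illustration, not the thesis's actual setup.

```python
# Hedged sketch: TF-IDF features + Random Forest for log root-cause classification.
# The CSV file and column names ("message", "root_cause") are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

logs = pd.read_csv("firewall_logs.csv")  # assumed: one log message and its labelled root cause per row
X_train, X_test, y_train, y_test = train_test_split(
    logs["message"], logs["root_cause"],
    test_size=0.2, random_state=42, stratify=logs["root_cause"])

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2),      # unigram and bigram TF-IDF features
    RandomForestClassifier(n_estimators=300, random_state=42))
model.fit(X_train, y_train)

print("weighted F1:", f1_score(y_test, model.predict(X_test), average="weighted"))
```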
74
Optimizing log truck payload through improved weight control. Overboe, Paul David. 24 July 2012
Trucking of forest products is a very important segment of the harvesting process, and it is monitored relatively closely by external sources. Load weight is the focal point of the attention received by log hauling. The optimization of load weights is therefore very important to a logging operation's success, and it can be achieved only through adequate gross vehicle weight control. Methods of load weight control are reviewed and possible applications discussed in this report. Studies were conducted to evaluate the adequacy of load weight control achieved using two quite different methods. A reporting technique that provided loader operators with information about trends in the delivery weights of the trucks they loaded was used to heighten their awareness of problem areas in load weight distributions. This study was conducted at two southern paper mills with substantially different truck weight regulation environments. Two separate case studies were conducted on Virginia loggers utilizing on-board electronic truck scales.
Results of the loading study indicated that the passive treatment had affected the behavior of some of the producers studied. The behavioral changes observed generally improved the economic optimization of load delivery weights. The on-board scale studies indicated that the scale systems did perform well in the applications observed. However, the economic benefits associated with use of the scales were negligible for the two producers studied due to a reduction in delivery weights after installation of the scales. / Master of Science
75
A Computer Simulation Model for Predicting the Impacts of Log Truck Turn-Time on Timber Harvesting System Productivity. Barrett, Scott M. 09 February 2001
A computer simulation model was developed to represent a logging contractor's harvesting and trucking system of wood delivery from the contractor's in-woods landing to the receiving mill. The Log Trucking System Simulation model (LTSS) focuses on the impacts to logging contractors as changes in truck turn times cause an imbalance between harvesting and trucking systems. The model was designed to serve as a practical tool that can illustrate the magnitude of cost and productivity changes as the delivery capacity of the contractor's trucking system changes.
The model was used to perform incremental analyses using an example contractor's costs and production rates to illustrate the nature of impacts associated with changes in the contractor's trucking system. These analyses indicated that the primary impact of increased turn times occurs when increased delivery time decreases the number of loads per day the contractor's trucking system can deliver. When increased delivery times cause the trucking system to limit harvesting production, total costs per delivered ton increase. In cases where trucking significantly limits system production, total costs per delivered ton would decrease if additional trucks were added.
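The interaction described above can be illustrated with a deliberately simplified, deterministic sketch of the harvesting/trucking balance; the LTSS model itself simulates this behavior in much more detail, and all rates and costs below are illustrative placeholders rather than the example contractor's figures.

```python
# Hedged sketch: a deliberately simplified, deterministic view of the balance between
# harvesting and trucking capacity.  All rates and costs are illustrative placeholders.
def delivered_tons_per_day(turn_time_hr, trucks, tons_per_load, workday_hr, harvest_rate_tph):
    loads_per_truck = int(workday_hr // turn_time_hr)
    trucking_capacity = trucks * loads_per_truck * tons_per_load
    harvesting_capacity = harvest_rate_tph * workday_hr
    return min(trucking_capacity, harvesting_capacity)   # the slower system limits production

def cost_per_delivered_ton(turn_time_hr, trucks, tons_per_load=25, workday_hr=10,
                           harvest_rate_tph=30, harvest_cost_per_day=3000,
                           truck_cost_per_day=600):
    tons = delivered_tons_per_day(turn_time_hr, trucks, tons_per_load, workday_hr, harvest_rate_tph)
    return (harvest_cost_per_day + trucks * truck_cost_per_day) / tons

# Longer turn times reduce loads per day; once trucking limits harvesting, cost per ton rises,
# and adding a truck can lower it again.
for turn_time in (2.0, 2.5, 3.0):
    print(turn_time, round(cost_per_delivered_ton(turn_time, trucks=3), 2),
          round(cost_per_delivered_ton(turn_time, trucks=4), 2))
```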
The model allows the user to simulate a harvest with up to eight products trucked to different receiving mills. The LTSS model can be utilized without extensive data input requirements and serves as a user-friendly tool for predicting cost and productivity changes in a logging contractor's harvesting and trucking system based on changes in truck delivery times. / Master of Science
76
A Log-Linear Analysis of a Set of Medical Data. Ko, Barbara Mook-Pik. 02 1900
Methotrexate had been suspected to be harmful to the liver in psoriatic patients. Data from a prospective study were analysed to find out whether the drug affected the acquisition and worsening of various liver pathologies. Personal characteristics that could have an adverse effect on the liver were also investigated. Log-linear models were fitted to this set of categorical data in the form of multidimensional contingency tables. It was found that methotrexate would be hepatotoxic if the drug was taken over a prolonged period and/or if the cumulative dose taken was large. Otherwise, methotrexate could be administered to psoriatic patients without causing much harm to the liver. / Thesis / Master of Science (MSc)
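A log-linear analysis of this kind can be sketched as a Poisson GLM fitted to the cell counts of a contingency table; the factor names and counts below are invented for illustration and do not come from the study's data.

```python
# Hedged sketch: log-linear (Poisson GLM) model fitted to a three-way contingency table.
# Factor names and cell counts are invented for illustration, not the study's data.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

table = pd.DataFrame({
    "methotrexate": ["low", "low", "low", "low", "high", "high", "high", "high"],
    "duration":     ["short", "short", "long", "long", "short", "short", "long", "long"],
    "pathology":    ["no", "yes", "no", "yes", "no", "yes", "no", "yes"],
    "count":        [40, 5, 30, 10, 35, 8, 15, 25],
})

# Hierarchical log-linear model with all main effects and two-way interactions.
model = smf.glm("count ~ (methotrexate + duration + pathology) ** 2",
                data=table, family=sm.families.Poisson()).fit()
print(model.summary())
```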
77
BAYESIAN ANALYSIS OF LOG-BINOMIAL MODELS. ZHOU, RONG. 13 July 2005
No description available.
78
Automatic interpretation of computed tomography (CT) images for hardwood log defect detection. Li, Pei. 18 November 2008
This thesis describes the design of an image interpretation system for the automatic detection of internal hardwood log defects. The goal of the research is a system that not only identifies and locates internal defects of hardwood logs using computed tomography (CT) imagery, but also accommodates more than one type of wood and shows potential for real-time industrial implementation. The thesis describes a new image classification system that utilizes a feedforward artificial neural network as the image classifier. The classifier was trained with back-propagation, using training samples collected from two different types of hardwood logs, red oak and water oak. Pre-processing and post-processing are performed to increase classification performance and to enable the system to accommodate more than one wood type. It is shown that such a neural-net-based approach can yield high classification accuracy and that it offers high potential for parallelism. Several possible design alternatives and comparisons are also addressed in the thesis. The final image interpretation system has been successfully tested, exhibiting a classification accuracy of 95% with test images from four hardwood logs. / Master of Science
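For readers unfamiliar with this kind of classifier, the following is a minimal scikit-learn sketch of a feedforward network trained by backpropagation on per-pixel window features; the window size, the three defect classes, and the random data are assumptions, and the thesis's actual architecture and features are not reproduced here.

```python
# Hedged sketch: feedforward neural network (trained by backpropagation) classifying
# CT pixel windows into defect classes.  Window size, class labels, and data are assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n_samples, window = 5000, 5                       # 5x5 neighborhood of CT values per pixel
X = rng.random((n_samples, window * window))      # stand-in for normalized CT intensities
y = rng.integers(0, 3, n_samples)                 # e.g., clear wood / knot / split (assumed classes)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(30,), max_iter=300, random_state=0)
clf.fit(X_train, y_train)                         # gradient-based training via backpropagation
print("test accuracy:", clf.score(X_test, y_test))
```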
79
Measuring and evaluating log truck performance in a variety of operating conditions. McCormack, Robert James. 19 October 2005
Studies of log truck speeds and fuel consumption were made at four locations in the southeastern United States. Execution of the study necessitated the development and testing of a microprocessor-based data logger capable of withstanding the harsh operating environment found in forest harvesting and transport equipment.
The first study investigated the normal operating pattern for a truck in a logging contractor's fleet. The truck was found to be highly utilized and to incur considerable distances of unloaded running to service the contractor's widely separated operations. A second study highlighted the fuel and speed penalties associated with operations on sand and gravel roads.
The third study documented significant performance differences between routes delivering to one location even where road surface differences were minimal. A fourth, detailed study illustrated speed and fuel consumption differences between urban and rural operations. Tests on a group of five experienced drivers demonstrated considerable differences in speed and fuel usage. Some drivers appeared to have a driving style which delivered higher speed with low fuel consumption. A detailed analysis of individual speed profiles indicated that as much as 1/3 to 1/2 of the recorded fuel consumption on one section was associated with air resistance.
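The scale of that air-resistance share is plausible from a back-of-the-envelope drag calculation; the sketch below uses generic assumed values for air density, drag coefficient, frontal area, and speed, not measurements from these studies.

```python
# Hedged sketch: back-of-the-envelope aerodynamic drag power for a loaded log truck.
# Every value below is a generic assumption, not a measurement from these studies.
rho = 1.2      # air density, kg/m^3
cd = 0.8       # drag coefficient assumed for a conventional tractor-trailer
area = 9.0     # frontal area, m^2 (assumed)
v = 25.0       # road speed, m/s (about 90 km/h)

drag_force = 0.5 * rho * cd * area * v ** 2        # F = 1/2 * rho * Cd * A * v^2
drag_power_kw = drag_force * v / 1000.0            # P = F * v
print(f"drag force ~{drag_force:.0f} N, drag power ~{drag_power_kw:.0f} kW")
# Because drag power grows with the cube of speed, air resistance can plausibly account
# for a large share of fuel use at highway speeds, consistent with the figure above.
```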
In conclusion, the studies noted that for the trucks and conditions evaluated:
(1) There are significant performance losses and increased costs associated with operations on low-standard road sections. Road roughness was a significant factor determining speed.
(2) Performance and cost differences between routes were demonstrated even for roads of comparable surface type. This indicated that inter-route cost differences may be pervasive. These differences would require acknowledgement and evaluation if equitable route payment schedules were to be constructed.
(3) All the trucks studied operated for at least part of the time at high speeds and may be incurring unnecessary fuel and maintenance expenses. Application of aerodynamic deflectors might be beneficial and their applicability should be tested.
(4) Some driving styles appear more efficient and deserve further investigation and documentation. Changing driver behavior might present the most cost-effective means of improving fleet performance. / Ph. D.
80
Frequent Inventory of Network Devices for Incident Response: A Data-driven Approach to Cybersecurity and Network Operations. Kobezak, Philip D. 22 May 2018
Challenges exist in higher education networks with host inventory and identification. Any student, staff member, faculty member, or dedicated IT administrator can be the person primarily responsible for a device on the network. Confounding the problem is that there is also a large mix of personally owned devices. These network environments are a hybrid of corporate enterprise, federated network, and Internet service provider. This management model has survived for decades based on the ability to identify responsible personnel when a host, system, or user account is suspected to have been compromised or is disrupting network availability for others. Mobile devices, roaming wireless access, and users accessing services from multiple devices have made the task of identification onerous. With increasing numbers of hosts on the networks of higher education institutions, strategies such as dynamic addressing and address translation become necessary. The proliferation of the Internet of Things (IoT) makes this identification task even more difficult. Loss of intellectual property, extortion, theft, and reputational damage are all significant risks to research institution networks. Quickly responding to and remediating incidents reduces exposure and risk.
This research evaluates what universities are doing for host inventory and creates a working prototype of a system for associating relevant log events to one or more responsible people. The prototype reduces the need for human-driven updates while enriching the dynamic host inventory with additional information. It also shows the value of associating application and service authentications to hosts. The prototype uses live network data which is de-identified to protect privacy. / Master of Science / Keeping track of computers or hosts on a network has become increasingly difficult. In the past, most of the hosts were owned by the institution, but now more hosts are owned by the end users. The management of institution networks has become a mix of corporate enterprise, federated network, and Internet service provider. This model has survived for decades based on the ability to identify someone responsible when a host or system is suspected to be infected with malware or is disrupting network availability for others. Mobile devices, roaming wireless access, and users accessing services from multiple devices have made the task of identification more difficult. With increasing numbers of hosts on the networks of higher education institutions, strategies such as dynamic addressing and address translation become necessary. The proliferation of the Internet of Things (IoT) makes identification even more difficult. Loss of intellectual property, theft, and reputational damage are all significant risks to institution networks. Quickly responding to and remediating cybersecurity incidents reduces exposure and risk.
This research considers what universities are doing for host inventory and creates a working prototype of a system for associating relevant log events to one or more responsible people. The prototype reduces the need for human-driven updates while incorporating additional information for the dynamic host inventory. It also shows the value of associating application and service authentications to hosts. The prototype uses real network data which is de-identified to protect privacy.
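As an illustration of the kind of correlation such a prototype performs, the sketch below joins DHCP lease events with authentication events to attribute a device, and the user behind it, to an IP address at a point in time; the event structures and field names are invented for illustration and are not the prototype's actual data sources.

```python
# Hedged sketch: joining DHCP lease events with authentication events to attribute a
# device (and the user behind it) to an IP address at a point in time.
# The event structures and field names are invented for illustration.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DhcpLease:
    start: datetime
    end: datetime
    ip: str
    mac: str

@dataclass
class AuthEvent:
    time: datetime
    ip: str
    user: str

def attribute(leases, auth_events):
    """Map each MAC address to the users who authenticated from its leased IP."""
    inventory = {}
    for lease in leases:
        users = {e.user for e in auth_events
                 if e.ip == lease.ip and lease.start <= e.time <= lease.end}
        inventory.setdefault(lease.mac, set()).update(users)
    return inventory

leases = [DhcpLease(datetime(2018, 5, 1, 9), datetime(2018, 5, 1, 17), "10.0.0.5", "aa:bb:cc:dd:ee:ff")]
events = [AuthEvent(datetime(2018, 5, 1, 10, 30), "10.0.0.5", "jdoe")]
print(attribute(leases, events))    # {'aa:bb:cc:dd:ee:ff': {'jdoe'}}
```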