  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
401

Analysis and Modeling of World Wide Web Traffic

Abdulla, Ghaleb 30 April 1998 (has links)
This dissertation deals with monitoring, collecting, analyzing, and modeling World Wide Web (WWW) traffic and client interactions. The rapid growth of WWW usage has not been accompanied by an overall understanding of models of information resources and their deployment strategies. Consequently, the current Web architecture often faces performance and reliability problems. Scalability, latency, bandwidth, and disconnected operations are some of the important issues that should be considered when attempting to adjust for the growth in Web usage. The WWW Consortium launched an effort to design a new protocol that will be able to support future demands. Before doing that, however, we need to characterize current users' interactions with the WWW and understand how it is being used. We focus on proxies since they provide a good medium for caching, filtering information, payment methods, and copyright management. We collected proxy data from our environment over a period of more than two years. We also collected data from other sources such as schools, information service providers, and commercial sites. Sampling times range from days to years. We analyzed the collected data looking for important characteristics that can help in designing a better HTTP protocol. We developed a modeling approach that considers Web traffic characteristics such as self-similarity and long-range dependency. We developed an algorithm to characterize users' sessions. Finally, we developed a high-level Web traffic model suitable for sensitivity analysis. As a result of this work we developed statistical models of parameters such as arrival times, file sizes, file types, and locality of reference. We describe an approach to model long-range dependent Web traffic, and we characterize the activities of users accessing a digital library courseware server or Web search tools. 
Temporal and spatial locality of reference within the examined user communities is high, so caching can be an effective tool to help reduce network traffic and to help solve the scalability problem. We recommend utilizing our findings to promote a smart distribution or push model to cache documents when there is a likelihood of repeat accesses. / Ph. D.
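The self-similarity this abstract refers to is commonly quantified by the Hurst exponent H. Below is a minimal, illustrative sketch (not code from the dissertation) of the aggregated-variance estimator, which exploits the scaling Var(X^(m)) ∝ m^(2H-2) of block-averaged traffic counts:

```python
import numpy as np

def hurst_aggregated_variance(x, block_sizes):
    """Estimate the Hurst exponent H by the aggregated-variance method:
    for a self-similar series, Var(X^(m)) ~ m^(2H - 2), so the slope of
    log Var against log m is 2H - 2."""
    log_m, log_var = [], []
    for m in block_sizes:
        n_blocks = len(x) // m
        if n_blocks < 2:
            continue
        # Block-average the series at aggregation level m.
        agg = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_var.append(np.log(agg.var()))
    slope, _ = np.polyfit(log_m, log_var, 1)
    return 1.0 + slope / 2.0

# Sanity check: white noise has no long-range dependence, so H should
# come out near 0.5; long-range dependent traffic gives H well above 0.5.
rng = np.random.default_rng(0)
noise = rng.standard_normal(100_000)
h = hurst_aggregated_variance(noise, [10, 20, 50, 100, 200, 500])
```

Applied to real request-arrival counts rather than white noise, an estimate of H substantially above 0.5 is the usual signature of the long-range dependence described above.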
402

Android Application Install-time Permission Validation and Run-time Malicious Pattern Detection

Ma, Zhongmin 31 January 2014 (has links)
The open source structure of Android applications introduces security vulnerabilities that can be readily exploited by third-party applications. We address certain vulnerabilities at both installation and runtime using machine learning. Effective classification techniques with neural networks can be used to verify application categories on installation. We devise a novel application category verification methodology that applies machine learning to application permissions and estimates the likelihoods of different categories. To detect malicious patterns at runtime, we present a Hidden Markov Model (HMM) method that analyzes activity usage by tracking Intent log information. After applying our technique to nearly 1,700 popular third-party Android applications and malware samples, we report that a major portion of the category declarations were judged correctly. This demonstrates the effectiveness of neural network decision engines in validating Android application categories. The approach of using an HMM to analyze the Intent log for the detection of malicious runtime behavior is new. The test results show promise with a limited input dataset (69.7% accuracy). To improve performance, further work will be carried out to increase the dataset size by adding game applications, optimize the Baum-Welch algorithm parameters, and balance the size of the Intent sequences. To better emulate a participant's usage, some popular applications can be selected in advance, and the remainder can be chosen randomly. / Master of Science
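To illustrate the kind of HMM scoring this abstract describes, here is a toy sketch that evaluates an observed Intent sequence with the forward algorithm. The states, Intent alphabet, and all probabilities are invented for illustration; they are not values from the thesis:

```python
import numpy as np

# Toy HMM over a coarse "Intent log" alphabet (all values assumed).
states = ["benign", "malicious"]          # hidden activity modes
obs_symbols = ["VIEW", "SEND", "CALL"]    # observed Intent actions

pi = np.array([0.9, 0.1])                 # initial state distribution
A = np.array([[0.95, 0.05],               # state transition matrix
              [0.10, 0.90]])
B = np.array([[0.70, 0.25, 0.05],         # emission matrix: P(obs | state)
              [0.10, 0.30, 0.60]])

def forward_log_likelihood(seq):
    """Forward algorithm: log P(observed Intent sequence | HMM)."""
    alpha = pi * B[:, seq[0]]
    for o in seq[1:]:
        alpha = (alpha @ A) * B[:, o]
    return float(np.log(alpha.sum()))

# A CALL-heavy burst scores lower under the benign-leaning model,
# which is the basis for flagging a suspicious runtime pattern.
benign_seq = [0, 0, 1, 0]   # VIEW VIEW SEND VIEW
odd_seq    = [2, 2, 2, 2]   # CALL CALL CALL CALL
ll_benign = forward_log_likelihood(benign_seq)
ll_odd = forward_log_likelihood(odd_seq)
```

In practice the transition and emission matrices would be learned from Intent logs with Baum-Welch, and a low likelihood under the trained model would trigger a malicious-behavior alert.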
403

The effect of machine and tire size on soil compaction due to skidding with rubber-tired skidders

Greene, Walter Dale January 1983 (has links)
M.S.
404

Corporate Default Predictions and Methods for Uncertainty Quantifications

Yuan, Miao 01 August 2016 (has links)
Regarding quantifying uncertainties in prediction, two projects with different perspectives and application backgrounds are presented in this dissertation. The goal of the first project is to predict corporate default risks based on large-scale time-to-event and covariate data in the context of controlling credit risks. Specifically, we propose a competing risks model to incorporate exits of companies due to default and other reasons. Because of the stochastic and dynamic nature of corporate risks, we incorporate both company-level and market-level covariate processes into the event intensities. We propose a parsimonious Markovian time series model and a dynamic factor model (DFM) to efficiently capture the mean and correlation structure of the high-dimensional covariate dynamics. For estimating parameters in the DFM, we derive an expectation-maximization (EM) algorithm in explicit form under the necessary constraints. For multi-period default risks, we consider both corporate-level and market-level predictions. We also develop prediction interval (PI) procedures that synthetically take into account uncertainties in the future observations, the parameter estimation, and the future covariate processes. In the second project, to quantify the uncertainties in the maximum likelihood (ML) estimators and compute exact tolerance interval (TI) factors at the nominal confidence level, we propose algorithms for two-sided control-the-center and control-both-tails TIs for complete or Type II censored data following the (log)-location-scale family of distributions. Our approaches are based on pivotal properties of the ML estimators for the (log)-location-scale family and utilize Monte Carlo simulation. For Type I censored data, only approximate pivotal quantities exist, so an adjusted procedure is developed to compute approximate factors. Our simulation study shows the observed coverage probability (CP) to be asymptotically accurate. 
Our proposed methods are illustrated using real-data examples. / Ph. D.
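The pivotal, simulation-based TI construction mentioned in the abstract can be sketched for the simplest location-scale case: a two-sided control-the-center interval for a complete normal sample. Everything below is an illustrative sketch under that assumption, not the dissertation's algorithm:

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def ti_factor_mc(n, beta=0.90, gamma=0.95, n_sim=5_000, seed=1):
    """Monte Carlo two-sided tolerance-interval factor k such that
    xbar +/- k*s contains at least a proportion beta of the normal
    population with confidence gamma. Because (xbar - mu)/sigma and
    s/sigma are pivotal, simulating from N(0, 1) suffices."""
    rng = np.random.default_rng(seed)
    ks = np.empty(n_sim)
    for i in range(n_sim):
        x = rng.standard_normal(n)
        m, s = x.mean(), x.std(ddof=1)
        # Bisection: smallest k with Phi(m + k s) - Phi(m - k s) >= beta.
        lo, hi = 0.0, 20.0
        for _ in range(40):
            mid = 0.5 * (lo + hi)
            if norm_cdf(m + mid * s) - norm_cdf(m - mid * s) >= beta:
                hi = mid
            else:
                lo = mid
        ks[i] = hi
    # The gamma-quantile of the per-sample factors is the TI factor.
    return float(np.quantile(ks, gamma))

k = ti_factor_mc(n=20)
```

For n = 20, beta = 0.90, gamma = 0.95 this lands near the tabulated factor of roughly 2.31. Type II censoring changes only how the location and scale are estimated; the pivotal property, and hence this simulation scheme, is retained.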
405

A Generalized Log-Law Formulation For a Wide Range of Boundary Roughness Conditions Encountered in Streams

Plott, James Read 27 September 2012 (has links)
It is demonstrated that the method for locating a velocity profile origin, or plane of zero velocity, by fitting log profiles to streamwise velocity measurements is applicable to a larger range of roughness scales than previously expected. Five different sets of detailed, experimental velocity measurements were analyzed encompassing sediment-scale roughness elements, roughness caused by rigid vegetation, and large-scale roughness elements comprised of mobile bedforms. The method resulted in similar values of normalized zero-plane displacement for all roughness types considered. The ratios of zero-plane displacement, dh, to roughness height, ks, were 0.20 and 0.26 for the sediment- and vegetation-scale experiments, respectively. The results for the two experiments with bedform dominated roughness were 0.34 and 0.41. An estimate of dh/ks ranging from 0.2 to 0.4 is therefore recommended for a range of roughness types with the higher end of the range being more appropriate for the larger, bedform-scale roughness elements, and the lower end for the sediment-scale roughness elements. In addition, it is demonstrated that the location of the plane of zero velocity is temporally constant even when the bed height is not. The effects of roughness element packing density were also examined with the identification of a possible threshold at 4%, above which zero-plane displacement is independent of packing density. The findings can be applied to field velocity measurements under mobile bed conditions, facilitating the calculation of turbulence parameters such as shear velocity, by using point measurements and providing guidelines for the estimation of an appropriate value for zero-plane displacement. / Master of Science
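The profile-origin fitting described above can be sketched as a grid search over candidate zero-plane displacements, choosing the one that makes velocity most nearly linear in log height. The synthetic profile and parameter values below are assumptions for illustration, not the study's measurements:

```python
import numpy as np

# Synthetic profile following the rough-wall log law
#   u(z) = (u_star / kappa) * ln((z - d_h) / z0)
# with assumed "true" parameters to recover.
kappa = 0.41                                    # von Karman constant
u_star_true, dh_true, z0 = 0.05, 0.02, 0.001    # m/s, m, m
z = np.linspace(0.05, 0.5, 20)                  # measurement heights (m)
u = (u_star_true / kappa) * np.log((z - dh_true) / z0)

def fit_zero_plane(z, u, dh_grid):
    """For each trial d_h, fit u = a*ln(z - d_h) + b and keep the d_h
    minimizing the residual sum of squares; slope a gives u_star."""
    best = (np.inf, None, None)
    for dh in dh_grid:
        if dh >= z.min():
            continue                    # keep the log argument positive
        X = np.log(z - dh)
        coeffs, res, *_ = np.polyfit(X, u, 1, full=True)
        ssr = res[0] if len(res) else 0.0
        if ssr < best[0]:
            best = (ssr, dh, coeffs[0] * kappa)   # u_star = slope * kappa
    return best[1], best[2]

dh_hat, u_star_hat = fit_zero_plane(z, u, np.linspace(0.0, 0.04, 81))
```

With the fitted d_h in hand, shear velocity follows directly from the slope, which is exactly why locating the plane of zero velocity enables turbulence-parameter estimates from point measurements.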
406

Birkett log house and addition

Williams, Warren L. January 1991 (has links)
The thrust of this project is twofold. The first is to create an addition to a reconstructed two-story log house. The second is to locate this structure upon the site in such a manner as to enhance the experience of its presence during approach. The first goal is addressed by connecting two appendage structures to the rear facade of the log house by means of a narrow transitional space. The intent is to maintain this transitional space as an architectural connection between the greater masses without allowing it to become a dominant element. The arrangement of the addition massing respects the prominence of the original log structure and complements its dog-trot configuration. The shapes of the addition masses, freestanding studio/utilities building, deck area and pool also respond to the vehicular path which culminates in a circle. The second goal, the positioning of this structure within the boundaries of the selected site, was greatly facilitated by the site's numerous attributes. These range from the historical precedent of a previous log dwelling built upon the grassy knoll of choice to the natural enclosure of the site. A U-shaped, deciduously forested valley of dramatic slope along the three enclosing sides, the site provides a seasonally changing backdrop within which the entire structure can be experienced. Further enclosing this valley along the fourth side is a small, treelined river whose woody screen is permeated only in one small spot by a ford, which had been the site's previous access. / Master of Architecture
407

Gristhorpe man: an early bronze age log-coffin burial scientifically defined

Melton, Nigel D., Montgomery, Janet, Knüsel, Christopher J., Batt, Catherine M., Needham, S., Parker Pearson, M., Sheridan, A., Heron, Carl P., Horsley, T., Schmidt, Armin R., Evans, Adrian A., Carter, E.A., Edwards, Howell G.M., Hargreaves, Michael D., Janaway, Robert C., Lynnerup, N., Northover, P., O'Connor, Sonia A., Ogden, Alan R., Taylor, Timothy F., Wastling, Vaughan, Wilson, Andrew S. January 2010 (has links)
A log-coffin excavated in the early nineteenth century proved to be well enough preserved in the early twenty-first century for the full armoury of modern scientific investigation to give its occupants and contents new identity, new origins and a new date. In many ways the interpretation is much the same as before: a local big man buried looking out to sea. Modern analytical techniques can create a person more real, more human and more securely anchored in history. This research team shows how.
408

Gristhorpe Man: a Raman spectroscopic study of 'mistletoe berries' in a Bronze Age log coffin burial

Edwards, Howell G.M., Montgomery, Janet, Melton, Nigel D., Hargreaves, Michael D., Wilson, Andrew S., Carter, E.A. 10 February 2010 (has links)
In 1834 in a tumulus at Gristhorpe, North Yorkshire, UK, an intact coffin fashioned from the hollowed-out trunk of an oak tree was found to contain a well-preserved skeleton stained black from the oak tannins, wrapped in an animal skin and buried with a range of grave artefacts, including a bronze dagger, flints and a bark vessel. The remains were deposited in the Rotunda Museum at Scarborough, where closure due to refurbishment in 2005–2008 provided an opportunity for the scientific investigation of the skeletal remains and artefacts using a wide range of techniques. Dendrochronological and radiocarbon dating has established the age of the skeleton as 2140–1940 BC at 95% confidence, in the Early Bronze Age. As part of this project, Raman spectra of several mysterious small spherical objects discovered in the coffin underneath the skeleton and initially believed to be ‘mistletoe berries’ associated with ancient burial customs have been recorded non-destructively. The interpretation of the Raman spectral data, microscopic analysis and comparison with modern specimens has led to the conclusion that the small spheres are phosphatic urinary stones, which reflect the archaeological dietary evidence and stable isotope analysis of bone collagen of Gristhorpe Man.
409

Impact of Ellipticality on Lumber Grade and Volume Recovery For Red Oak Logs

Ese-Etame, Roncs 15 August 2006 (has links)
Hardwood sawmills must become more efficient to remain competitive. One way to increase efficiency and competitiveness is to increase the value or lumber volume produced from logs. While methods to maximize value and volume recovery exist for round logs, little information exists on how to maximize these outcomes for logs with ellipticality. The goal of this research was to determine the impact of low and high degrees of ellipticality on green lumber grade and volume recovery for red oak logs under current sawing methods. Logs of low and high ellipticality were selected and processed at four Appalachian area sawmills. Processing variables and lumber output were tracked for all logs. It was determined that there was no significant difference in overrun, lumber volume, lumber value, or lumber grade between low and high ellipticality logs when comparing the log output at all four sawmills. It was also determined that how an individual sawmill processes logs affects the outcome between logs with high and low ellipticality. None of the sawmills produced more value for highly elliptical logs than for low-ellipticality logs, and it was possible to produce more lumber volume and value with low-ellipticality logs. Highly elliptical logs required more processing time than low-ellipticality logs in terms of log turns, total elapsed time at the headrig, and number of sawlines at the headrig. The increased processing time results in increased processing costs, which were estimated to be $1.28 to $11.33 per log. These costs were not offset by an increase in lumber volume or lumber value; therefore, highly elliptical sawlogs are less desirable to process than low-ellipticality logs using current sawing methods. / Master of Science
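As a rough illustration of why the choice of scaling diameter matters for elliptical logs, the sketch below applies the standard Doyle log rule to a log's minor axis versus its mean diameter. The Doyle rule itself is a standard board-foot estimator; the example diameters and the major/minor definition of ellipticality used here are illustrative assumptions, not the thesis's measurements or methods:

```python
def doyle_bf(diameter_in, length_ft):
    """Doyle log rule: board feet = L * (D - 4)^2 / 16,
    with small-end diameter D in inches and length L in feet."""
    return length_ft * (diameter_in - 4) ** 2 / 16.0

def ellipticality(major_in, minor_in):
    """One common definition: ratio of major to minor small-end diameter."""
    return major_in / minor_in

major, minor, length = 18.0, 14.0, 12.0   # a distinctly elliptical sawlog
e = ellipticality(major, minor)

# Scaling on the minor axis versus the arithmetic-mean diameter changes
# the estimated recoverable volume noticeably for the same log.
bf_minor = doyle_bf(minor, length)
bf_mean = doyle_bf((major + minor) / 2.0, length)
```

The gap between the two estimates grows with ellipticality, which is one reason elliptical logs complicate value and volume predictions at the headrig.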
410

Assessing Anonymized System Logs Usefulness for Behavioral Analysis in RNN Models

Vargis, Tom Richard, Ghiasvand, Siavash 06 August 2024 (has links)
System logs are a common source of monitoring data for analyzing the behavior of computing systems. Due to the complexity of modern computing systems and the large size of collected monitoring data, automated analysis mechanisms are required. Numerous machine learning and deep learning methods have been proposed to address this challenge. However, due to the existence of sensitive data in system logs, their analysis and storage raise serious privacy concerns. Anonymization methods can be used to cleanse the monitoring data before analysis. However, anonymized system logs in general do not provide adequate usefulness for the majority of behavioral analyses. Content-aware anonymization mechanisms such as PαRS preserve the correlation of system logs even after anonymization. This work evaluates the usefulness of system logs from the Taurus HPC cluster, anonymized using PαRS, for behavioral analysis via recurrent neural network models. To facilitate the reproducibility and further development of this work, the implemented prototype and monitoring data are publicly available [12].
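As a minimal sketch of feeding anonymized log events to a recurrent model, here is a vanilla RNN encoder over one-hot event IDs. The vocabulary size, hidden size, weights, and event IDs are all invented for illustration; this is not the paper's prototype:

```python
import numpy as np

# Because a content-aware anonymization such as PαRS preserves event
# correlations, sequences of anonymized event IDs remain meaningful
# inputs to a sequence model.
rng = np.random.default_rng(42)
vocab, hidden = 8, 16                    # event types, hidden state size
Wxh = rng.normal(0.0, 0.1, (hidden, vocab))
Whh = rng.normal(0.0, 0.1, (hidden, hidden))
bh = np.zeros(hidden)

def run_rnn(event_ids):
    """Vanilla RNN: h_t = tanh(Wxh x_t + Whh h_{t-1} + b).
    Returns the final hidden state encoding the whole log sequence."""
    h = np.zeros(hidden)
    for e in event_ids:
        x = np.zeros(vocab)
        x[e] = 1.0                       # one-hot anonymized event ID
        h = np.tanh(Wxh @ x + Whh @ h + bh)
    return h

state = run_rnn([1, 3, 3, 7, 2])         # encode one anonymized sequence
```

In a real pipeline the final state (or the per-step states) would feed a classifier or next-event predictor, and the weights would be trained rather than sampled at random.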
