561

Confidence intervals in life-testing

Karch, Angela Irene 03 June 2011 (has links)
The purpose of the study was to develop a sequential test method for obtaining a confidence interval in life-testing. The problem of using a maximum likelihood estimator based upon grouped data was considered. The life-times investigated are described by the exponential distribution. The sequential test used the length of the confidence interval as a stopping rule. The test method and the necessary calculations are described. The results of using different length values as a stopping rule were compared using a computer simulation. Results are reported in two categories: the percent of time the estimate contained the true parameter value, and the average number of data collection times needed to obtain the estimate. It was concluded that the test method was accurate and efficient. The length value was a considerable factor in obtaining good results from the test method. It was recommended that research be continued to establish a method of choosing the best length value. / Ball State University, Muncie, IN 47306
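The sequential stopping rule described in this abstract can be sketched in Python. This is a hypothetical illustration, not the thesis's original program: the batch size, the 95% confidence level, and the normal-approximation interval for the exponential mean are all assumptions made here for the sketch.

```python
import math
import random

def exponential_ci(lifetimes, z=1.96):
    """Approximate CI for the exponential mean via the normal
    approximation to the MLE (the sample mean); its standard
    error for exponential data is mean / sqrt(n)."""
    n = len(lifetimes)
    mean = sum(lifetimes) / n
    half = z * mean / math.sqrt(n)
    return mean - half, mean + half

def sequential_ci(true_mean, max_len, batch=25, seed=0, cap=400):
    """Collect data in batches until the CI length drops below
    max_len (the stopping rule), with a hard cap as a safeguard."""
    rng = random.Random(seed)
    data = []
    while True:
        data.extend(rng.expovariate(1.0 / true_mean) for _ in range(batch))
        lo, hi = exponential_ci(data)
        if hi - lo <= max_len or len(data) >= cap * batch:
            return lo, hi, len(data)

lo, hi, n = sequential_ci(true_mean=10.0, max_len=2.0)
print(n, round(lo, 2), round(hi, 2))
```

Shorter target lengths force more collection rounds, which mirrors the thesis's trade-off between interval length and the number of data collection times.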
562

Art and Politics of Appropriation

Zeilinger, Martin 17 January 2012 (has links)
This thesis works towards a theory of creative appropriation as critical praxis. Defining ‘appropriation’ as the re-use of already-authored cultural matter, I investigate how the ubiquity of aesthetically and commercially motivated appropriative practices has impacted concepts of creativity, originality, authorship and ownership. Throughout this thesis, appropriation is understood as bridging the artistic, political, economic, and scientific realms. As such, it strongly affects cultural and socio-political landscapes, and has become an ideal vehicle for effectively criticizing and, perhaps, radically changing dominant aesthetic, legal and ethical discourses regarding the (re)production, ownership and circulation of knowledge, artifacts, skills, resources, and cultural matter in general. Critical appropriation is thus posited as a political strategy that can draw together the different causes motivating appropriative processes across the globe, and organize them for the benefit of a multitude which values concepts of reusing, sharing and collectivity over concepts of the individually authored and the privately owned. My arguments regarding this critical potentiality are based on concrete practices emanating from several media (textual – visual – sonic – digital). The corpus includes Berlin Dadaist collage, ‘found footage’ filmmaking, audio sampling, and digital media art. It is critically contextualized in the fields of philosophy, law, and aesthetics, and paired with relevant examples from extra-aesthetic arenas (economics, industrial production and science). Following a trajectory from the analog to the digital, my thesis traces the emergence and tactical employment of critical appropriative practices in the context of different historical, philosophical, technological and economic circumstances. 
Focussing on conceptual and practical shifts from the analog to the digital furthermore enables me to draw connections between analytic perspectives founded in dialectic materialism and contemporary theories foregrounding issues of immaterial labor. The important qualitative changes that practices and perceptions of appropriation have undergone are argued to significantly amplify the critical potential of all appropriative practices. Ultimately, my comparative analyses thus establish appropriation as an ideal site for effectively challenging – both in terms of form and content – the ingrained, restrictive notions of original genius and naturalized authorship-qua-ownership on which present cultures and technologies of global capitalism are so heavily based.
563

Fate of Current-use Pesticides in the Canadian Atmosphere

Hayward, Stephen 14 November 2011 (has links)
Across Canada, and around the world, very large amounts of pesticides are produced and applied to agricultural crops each year. Although pesticide usage is declining, pesticides are still a necessary part of industrial agriculture. Numerous pesticides have been quantified in the atmosphere, at high levels near regions of use and at lower, but still significant, levels in remote regions. Some of the most persistent pesticides have been banned, but others continue to be used despite their persistence and potential for long-range transport (LRT). We have applied and refined an XAD-2 resin-based passive air sampler (PAS) to study the concentrations of pesticides in the atmosphere. A set of laboratory experiments measured the equilibrium sorption coefficients for chemicals on XAD-2 resin, allowing the determination of a new predictive equation for equilibrium sorption coefficients, and thus interpretation of the range of applicability of both XAD-based PAS and active air samplers (AAS). A set of field experiments was performed to compare the data obtained by both PAS and AAS, and to study the temporal trends of a wide range of pesticides in an agricultural area of southern Ontario. Because it is now apparent that XAD-PAS sampling rates can vary between compounds and with temperature, we also determined new compound-specific sampling rates for pesticides in the XAD-PAS. The XAD-PAS were deployed in two transects across Canada, one from the Great Lakes region to the Canadian Arctic, and one across southern British Columbia in four different mountain regions and at different elevations. The air concentrations of current-use pesticides were correlated with regions of their use in both transects. The variation of air concentration with elevation was correlated with local, ground-level sources in British Columbia. The LRT of pesticides was determined from the north-south transect and correlated to their atmospheric half-lives.
Historic-use pesticides such as hexachlorobenzene and hexachlorocyclohexane were found to have relatively uniform distributions in the Canadian atmosphere, while further evidence of α-hexachlorocyclohexane evaporation from oceans was observed in both transects.
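A passive air sampler in its linear-uptake regime relates sequestered mass to air concentration by C = m / (R · t), where R is the sampling rate the abstract says must be calibrated per compound. A minimal sketch, with entirely hypothetical deployment numbers (not values from the thesis):

```python
def pas_air_concentration(mass_ng, rate_m3_per_day, days):
    """Air concentration (ng/m^3) implied by a passive sampler in the
    linear-uptake regime: C = m / (R * t)."""
    return mass_ng / (rate_m3_per_day * days)

# Hypothetical deployment: 120 ng sequestered over 90 days
# at an assumed compound-specific sampling rate of 0.5 m^3/day.
c = pas_air_concentration(120.0, 0.5, 90)
print(round(c, 4))
```

Getting R wrong scales the inferred concentration directly, which is why the thesis's compound- and temperature-specific sampling rates matter.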
564

Greater Galagos near Mt. Kasigau, Kenya: Population Density Estimates

Falcetto, Andrea 01 August 2012 (has links)
This study examined the population density and habitat use of a species of greater galago, genus Otolemur, around Mt. Kasigau, Kenya. Mt. Kasigau has a unique regional microclimate, a cloud forest, which provides a home for many different flora and fauna. Two different methods were used to estimate population density. The first used line transects, with results analyzed in Distance 6.0; these surveys were conducted during both the dry and wet seasons and the results were compared after the study. The second was to trap individuals and use mark-recapture to estimate population density. Individuals caught in traps were also weighed and measured. Accurate weights were obtained for 17 individuals; ear height, tail length, and hind foot length were recorded for 21 individuals. Mark-recapture data were analyzed using Krebs/WIN 3.0. Population density estimates from the two methods were similar: Distance 6.0 estimated 0.62 individuals per hectare and Krebs/WIN 3.0 estimated 0.51 individuals per hectare. Morphological measurements were not consistent with published data for Otolemur garnettii, the most common greater galago in East Africa. It is possible that the Mt. Kasigau population is reproductively isolated, and DNA analysis should be conducted in the future. Habitat use by greater and lesser galagos and diurnal primates was also examined to determine which tree species these individuals commonly use.
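The mark-recapture half of the study can be illustrated with the Chapman-corrected Lincoln-Petersen estimator, a standard two-sample approach. The capture counts and study area below are invented for the sketch and are not the thesis's data (the thesis used Krebs/WIN 3.0):

```python
def lincoln_petersen(marked_first, caught_second, recaptured):
    """Chapman-corrected Lincoln-Petersen population size estimate
    from a two-session mark-recapture survey."""
    return ((marked_first + 1) * (caught_second + 1)) / (recaptured + 1) - 1

# Hypothetical counts: 12 marked in session one, 10 caught in
# session two, 4 of those already marked; assumed 50 ha study area.
n_hat = lincoln_petersen(12, 10, 4)
density = n_hat / 50.0  # individuals per hectare
print(round(n_hat, 1), round(density, 2))
```

Dividing the abundance estimate by the surveyed area converts it to a per-hectare density comparable to the Distance-based transect estimate.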
565

Enhanced sampling and applications in protein folding

Zhang, Cheng 24 July 2013 (has links)
We show that a single-copy tempering method is useful in large-scale, high-accuracy protein-folding simulations (explicit solvent, atomic representation, and a physics-based potential). The method uses a runtime estimate of the average potential energy, obtained from an integral identity, to guide a random walk in the continuous temperature space. It was used to fold three mini-proteins, trpzip2 (PDB ID: 1LE1), trp-cage (1L2Y), and villin headpiece (1VII), to atomic accuracy. Further, using a modification of the method with a dihedral bias potential added at the roof temperature, we were able to fold four larger helical proteins: α3D (2A3D), α3W (1LQ7), Fap1-NRα (2KUB) and S-836 (2JUA). We also discuss how to make optimal use of simulation data through an integral identity. With the help of a general mean-force formula, the identity makes better use of data collected in a molecular dynamics simulation and is more accurate and precise than the common histogram approach.
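The idea of using a running average-energy estimate to steer a temperature random walk can be caricatured with simulated tempering on a toy potential. Everything here is an assumption of the sketch: the quadratic energy, the two-level temperature ladder, and the crude mean-energy weighting stand in for the thesis's integral-identity estimator and continuous temperature space.

```python
import math
import random

def simulated_tempering(steps=20000, seed=3):
    """Toy simulated tempering on E(x) = x^2 / 2 with two inverse
    temperatures. Each level's weight is adapted from its running
    mean energy so the walker keeps visiting both levels."""
    rng = random.Random(seed)
    betas = [1.0, 0.2]
    x, t = 0.0, 0
    visits = [0, 0]
    mean_e = [0.5, 0.5]  # running mean energy per temperature level
    counts = [1, 1]
    for _ in range(steps):
        # Metropolis move in x at the current temperature.
        xp = x + rng.uniform(-1, 1)
        if rng.random() < math.exp(-betas[t] * (xp * xp - x * x) / 2):
            x = xp
        e = x * x / 2
        counts[t] += 1
        mean_e[t] += (e - mean_e[t]) / counts[t]
        # Propose a temperature switch, weighted by mean energies.
        tp = 1 - t
        logw = (betas[t] - betas[tp]) * e \
               + (betas[tp] * mean_e[tp] - betas[t] * mean_e[t])
        if rng.random() < min(1.0, math.exp(min(logw, 0.0)) if logw < 0 else 1.0):
            t = tp
        visits[t] += 1
    return visits

visits = simulated_tempering()
print(visits)
```

Without the adaptive weights, the walker would pile up at whichever temperature has the lower weighted energy; the running estimate is what keeps the temperature walk roughly uniform.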
566

Accelerating microarchitectural simulation via statistical sampling principles

Bryan, Paul David 05 December 2012 (has links)
The design and evaluation of computer systems rely heavily upon simulation. Simulation is also a major bottleneck in the iterative design process. Applications that may be executed natively on physical systems in a matter of minutes may take weeks or months to simulate. As designs incorporate increasingly higher numbers of processor cores, it is expected that the times required to simulate future systems will become an even greater issue. Simulation exhibits a tradeoff between speed and accuracy. By basing experimental procedures upon known statistical methods, the simulation of systems may be dramatically accelerated while retaining reliable methods to estimate error. This thesis focuses on the acceleration of simulation through statistical processes. The first two techniques discussed in this thesis focus on accelerating single-threaded simulation via cluster sampling. Cluster sampling extracts multiple groups of contiguous population elements to form a sample. This thesis introduces techniques to reduce sampling and non-sampling bias components, which must be reduced for sample measurements to be reliable. Non-sampling bias is reduced through the Reverse State Reconstruction algorithm, which removes ineffectual instructions from the skipped instruction stream between simulated clusters. Sampling bias is reduced via the Single Pass Sampling Regimen Design Process, which guides the user towards selected representative sampling regimens. Unfortunately, the extension of cluster sampling to multi-threaded architectures is non-trivial and raises many interesting challenges; strategies for overcoming these challenges are discussed. This thesis also introduces thread skew, a useful metric that quantitatively measures the non-sampling bias associated with divergent thread progressions at the beginning of a sampling unit.
Finally, the Barrier Interval Simulation method is discussed as a technique to dramatically decrease the simulation times of certain classes of multi-threaded programs. It segments a program into discrete intervals, separated by barriers, which are leveraged to avoid many of the challenges that prevent multi-threaded sampling.
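Cluster sampling as described here — drawing groups of contiguous population elements and estimating error across them — can be sketched generically. The synthetic "CPI trace" and all parameters below are invented for the illustration; they are not from the thesis:

```python
import random
import statistics

def cluster_sample_mean(population, n_clusters, cluster_len, seed=1):
    """Estimate the population mean from randomly placed contiguous
    clusters, returning the estimate and its standard error computed
    across cluster means (the reliable error bound the text mentions)."""
    rng = random.Random(seed)
    starts = rng.sample(range(len(population) - cluster_len), n_clusters)
    cluster_means = [
        statistics.fmean(population[s:s + cluster_len]) for s in starts
    ]
    est = statistics.fmean(cluster_means)
    se = statistics.stdev(cluster_means) / (n_clusters ** 0.5)
    return est, se

# Synthetic per-instruction cost trace with slow phase drift
# (alternating program phases of 1000 elements each).
pop = [1.0 + 0.5 * ((i // 1000) % 2) for i in range(100_000)]
est, se = cluster_sample_mean(pop, n_clusters=30, cluster_len=200)
print(round(est, 3), round(se, 4))
```

Only 6,000 of the 100,000 elements are inspected, yet the estimate carries its own error bar — the speed/accuracy tradeoff the abstract describes.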
567

Load shedding in network monitoring applications

Barlet Ros, Pere 15 December 2008 (has links)
Monitoring and mining real-time network data streams are crucial operations for managing and operating data networks. The information that network operators desire to extract from the network traffic differs in size, granularity and accuracy depending on the measurement task (e.g., the relevant data for capacity planning and for intrusion detection are very different). To satisfy these different demands, a new class of monitoring systems is emerging to handle multiple and arbitrary monitoring applications. Such systems must inevitably cope with the effects of continuous overload situations due to the large volumes, high data rates and bursty nature of network traffic. These overload situations can severely compromise the accuracy and effectiveness of monitoring systems precisely when their results are most valuable to network operators. In this thesis, we propose a technique called load shedding as an effective and low-cost alternative to over-provisioning in network monitoring systems. It allows these systems to efficiently handle overload situations in the presence of multiple, arbitrary and competing monitoring applications. We present the design and evaluation of a predictive load shedding scheme that can shed excess load under extreme traffic conditions and maintain the accuracy of the monitoring applications within bounds defined by end users, while assuring a fair allocation of computing resources to non-cooperative applications. The main novelty of our scheme is that it treats monitoring applications as black boxes, with arbitrary (and highly variable) input traffic and processing cost. Without any explicit knowledge of the application internals, the proposed scheme extracts a set of features from the traffic streams to build an on-line prediction model of the resource requirements of each monitoring application, which is used to anticipate overload situations and control the overall resource usage by sampling the input packet streams.
This way, the monitoring system preserves a high degree of flexibility, increasing the range of applications and network scenarios where it can be used. Since not all monitoring applications are robust against sampling, we then extend our load shedding scheme to support custom load shedding methods defined by end users, in order to provide a generic solution for arbitrary monitoring applications. Our scheme allows the monitoring system to safely delegate the task of shedding excess load to the applications and still guarantee fairness of service with non-cooperative users. We implemented our load shedding scheme in an existing network monitoring system and deployed it in a research ISP network. We present experimental evidence of the performance and robustness of our system with several concurrent monitoring applications during long-lived executions and using real-world traffic traces.
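The core control loop — predict each interval's processing cost, then choose a packet sampling rate that keeps load within budget — can be sketched as follows. The EWMA predictor, the smoothing factor, and the cost numbers are all assumptions for the sketch; the thesis builds a richer feature-based prediction model.

```python
def shedding_rate(predicted_cost, budget):
    """Fraction of packets to keep so the predicted load fits the
    cycle budget; keep everything when there is headroom."""
    return min(1.0, budget / predicted_cost) if predicted_cost > 0 else 1.0

def ewma_predict(prev_pred, observed, alpha=0.3):
    """Simple EWMA predictor of per-interval processing cost."""
    return alpha * observed + (1 - alpha) * prev_pred

budget = 100.0
pred, costs = 80.0, [90.0, 150.0, 200.0, 120.0]  # hypothetical trace
rates = []
for c in costs:
    pred = ewma_predict(pred, c)
    rates.append(round(shedding_rate(pred, budget), 3))
print(rates)
```

When a traffic burst drives the predicted cost above the budget, the sampling rate drops below 1.0 and excess load is shed before the system saturates, rather than after.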
568

Sampling from the Hardcore Process

Dodds, William C 01 January 2013 (has links)
Partially Recursive Acceptance Rejection (PRAR) and bounding chains used in conjunction with coupling from the past (CFTP) are two perfect simulation protocols which can be used to sample from a variety of unnormalized target distributions. This paper first examines and then implements these two protocols to sample from the hardcore gas process. We empirically determine the subset of the hardcore process's parameters for which these two algorithms run in polynomial time. Comparing the efficiency of these two algorithms, we find that PRAR runs much faster for small values of the hardcore process's parameter whereas the bounding chain approach is vastly superior for large values of the process's parameter.
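A plain (non-recursive) acceptance-rejection sampler for the discrete hardcore model conveys the basic idea behind both protocols: draw configurations and keep only those satisfying the hardcore constraint. This is a deliberately naive baseline sketch, not the paper's PRAR or bounding-chain algorithm, and the grid size and parameter are chosen arbitrarily.

```python
import random

def sample_hardcore(n, lam, seed=0):
    """Naive acceptance-rejection sampler for the hardcore model on an
    n x n grid: propose each site occupied independently with
    probability lam / (1 + lam), accept only independent sets
    (no two horizontally or vertically adjacent occupied sites)."""
    rng = random.Random(seed)
    p = lam / (1.0 + lam)
    attempts = 0
    while True:
        attempts += 1
        occ = {(i, j) for i in range(n) for j in range(n)
               if rng.random() < p}
        ok = all((i + 1, j) not in occ and (i, j + 1) not in occ
                 for (i, j) in occ)
        if ok:
            return occ, attempts

occ, attempts = sample_hardcore(4, 0.5, seed=42)
# Verify the hardcore constraint holds on the returned sample.
clash = any((i + 1, j) in occ or (i, j + 1) in occ for (i, j) in occ)
print(len(occ), attempts, clash)
```

The acceptance probability collapses as the grid or the parameter grows, which is exactly why the paper studies the parameter regimes where PRAR and CFTP-with-bounding-chains stay polynomial.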
569

Using radial k-space sampling and temporal filters in MRI to improve temporal resolution

Brynolfsson, Patrik January 2010 (has links)
In this master thesis, methods for increasing temporal resolution when reconstructing radially sampled MRI data have been developed and evaluated. This was done in two steps: first, the order in which data are sampled in k-space was optimized; second, temporal filters were developed to exploit the high sampling density in the central regions of k-space, a result of the polar sampling geometry, to increase temporal resolution while maintaining image quality. By properly designing the temporal filters, the temporal resolution is increased by a factor of 3–20 depending on other variables such as image resolution and the size of the time-varying areas in the image. The results are obtained from simulated raw data and subsequent reconstruction. The next step should be to acquire and reconstruct measured raw data to confirm the results. / This Master thesis work was performed at Dept. Radiation Physics, Linköping University, but examined at Dept. Radiation Physics, Umeå University
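The geometric fact the thesis exploits — every radial spoke passes through the centre of k-space, so low spatial frequencies are re-measured on each spoke — can be shown with a minimal trajectory generator. The uniform angular scheme and the sample counts below are assumptions of this sketch, not the thesis's optimized sampling order:

```python
import math

def radial_spokes(n_spokes, n_samples, kmax=0.5):
    """Generate a uniform-angle radial k-space trajectory. Each spoke
    crosses the k-space centre, giving the central-region redundancy
    that temporal filters can trade for temporal resolution."""
    spokes = []
    for s in range(n_spokes):
        theta = math.pi * s / n_spokes
        spoke = [(r * math.cos(theta), r * math.sin(theta))
                 for i in range(n_samples)
                 for r in [kmax * (2 * i / (n_samples - 1) - 1)]]
        spokes.append(spoke)
    return spokes

spokes = radial_spokes(8, 33)
# Count how many spokes sample the exact centre of k-space.
centre_hits = sum(
    1 for sp in spokes
    if any(abs(kx) < 1e-12 and abs(ky) < 1e-12 for kx, ky in sp)
)
print(centre_hits)
```

Because all 8 spokes hit the centre, the low-frequency image content is updated 8 times more often than the periphery — the redundancy a temporal filter can turn into extra frames.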
570

Compressed Sampling for High Frequency Receivers Applications

bi, xiaofei January 2011 (has links)
In the field of digital signal processing, the Shannon sampling theorem must be satisfied to recover a signal without distortion. In some practical applications, however, this becomes an obstacle, because the costs of storage and transmission increase dramatically with the sampling frequency. How to reduce the number of samples taken in analog-to-digital conversion (ADC) for wideband signals, and how to compress the resulting large data volumes effectively, have therefore become major subjects of study. Recently, a novel technique called "compressed sampling" (abbreviated CS) has been proposed to solve this problem. It captures and represents compressible signals at a sampling rate significantly lower than the Nyquist rate.   This paper not only surveys the theory of compressed sampling but also simulates CS in Matlab. The error between the recovered signal and the original signal in simulation is around -200 dB. Attempts were also made to apply CS in practice; the error between the recovered and original signals in the experiment is around -40 dB, which shows that CS was realized to a certain extent. Furthermore, some related applications and suggestions for further work are discussed.
