481

Soil Sampling and Analysis

Walworth, J. L. 10 1900 (has links)
Revised; Originally Published: 2006 / 5 pp.
482

Sampling Volume Effect on Measuring Salt in the Soil Profile

Hassan, Hesham Mahmoud. January 1982 (has links)
No description available.
483

Application of Geostatistics to an Operating Iron Ore Mine

Nogueira Neto, Joao Antunes, 1952- January 1987 (has links)
The competition in the world market for iron ore has increased lately. Therefore, an improved method of estimating the ore quality in small working areas has become an attractive cost-cutting strategy in short-term mine plans. Estimated grades of different working areas of a mine form the basis of any short-term mine plan. The generally sparse exploration data obtained during the development phase is not enough to accurately estimate the grades of small working areas. Therefore, additional sample information is often required in any operating mine. The findings of this case study show that better utilization of all available exploration information at this mine would improve estimation of small working areas even without additional face samples. Through the use of kriging variance, this study also determined the optimum face sampling grid, whose spacing turned out to be approximately 100 meters as compared to 50 meters in use today. (Abstract shortened with permission of author.)
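As a point of reference for the kriging-variance approach mentioned in this abstract: the kriging variance depends only on the sample geometry and the variogram, which is what makes it usable for ranking candidate face-sampling grids before any new samples are taken. The following is a minimal ordinary-kriging sketch in Python (not from the thesis; the spherical variogram parameters and the 50 m / 100 m grids are illustrative assumptions):

    import numpy as np

    def spherical_gamma(h, sill=1.0, range_m=300.0, nugget=0.0):
        """Spherical semivariogram rising to `sill` at lag `range_m` (illustrative parameters)."""
        h = np.asarray(h, dtype=float)
        g = nugget + (sill - nugget) * (1.5 * h / range_m - 0.5 * (h / range_m) ** 3)
        return np.where(h >= range_m, sill, np.where(h == 0.0, 0.0, g))

    def ordinary_kriging_variance(samples, target):
        """Kriging variance at `target` given sample coordinates; depends only on geometry."""
        n = len(samples)
        d = np.linalg.norm(samples[:, None, :] - samples[None, :, :], axis=-1)
        d0 = np.linalg.norm(samples - target, axis=-1)
        # Ordinary kriging system with a Lagrange multiplier for the unbiasedness constraint
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = spherical_gamma(d)
        A[n, n] = 0.0
        b = np.append(spherical_gamma(d0), 1.0)
        w = np.linalg.solve(A, b)
        return float(w @ b)  # = sum(lambda_i * gamma_i0) + mu

    # Compare a 50 m and a 100 m face-sampling grid by the variance at a grid-cell centre
    for spacing in (50.0, 100.0):
        xs = np.arange(0.0, 400.0 + spacing, spacing)
        grid = np.array([(x, y) for x in xs for y in xs])
        centre = np.array([spacing / 2.0, spacing / 2.0])
        var = ordinary_kriging_variance(grid, centre)
        print(f"{spacing:.0f} m grid: kriging variance at cell centre = {var:.3f}")

Denser grids give a lower variance at the cell centre; balancing that gain against sampling cost is what fixes an optimum spacing of the kind reported above.
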
484

Advancing Bioaccumulation Modeling and Water Sampling of Ionogenic Organic Chemicals

Cao, Xiaoshu 24 June 2014 (has links)
Although many commercial chemicals can dissociate, the study of the biological and environmental fate of ionogenic organic chemicals (IOCs) is still in its infancy. Uptake of the veterinary drug diclofenac in vultures and cattle was successfully simulated with a newly developed physiologically based pharmacokinetic model for IOCs, lending credence to diclofenac's proposed role in South Asian vulture population declines. Proteins and phospholipids, rather than total lipids, control the tissue distribution of diclofenac. A method was developed to simultaneously extract neutral and acidic pesticides and benzotriazoles from water samples, with recoveries ranging from 70 to 100%. This method was applied to samples from a laboratory calibration experiment of the Polar Organic Chemical Integrative Sampler. The sampler had higher uptake for neutral and acidic pesticides when filled with a triphasic sorbent admixture and OASIS MAS sorbent, respectively. While either sorbent can also be applied to methylated benzotriazoles, neither is capable of quantitatively sampling all three compound groups.
485

Investigations of peptide structural stability in vacuo

Kalapothakis, Jason Michael Drosos January 2010 (has links)
Gas-phase analytical techniques provide valuable tools for tackling the structural complexity of macromolecular structures such as those encountered in biological systems. Conformational dynamics of polypeptides and polypeptide assemblies underlie most biological functionality, yet great difficulties arise when investigating such phenomena with the well-established techniques of X-ray crystallography and NMR. In areas such as these, ion mobility interfaced with mass spectrometry (IMMS) and molecular modelling can make a significant contribution. During an IMMS experiment, analyte ions drift in a chamber filled with an inert gas; measurement of the transport properties of the analyte ions under the influence of a weak electric field leads to the orientationally averaged collision cross-section of all resolved ionic species. Comparison with cross-sections estimated for model molecular geometries can then lead to structural assignments. Thus IMMS can be used effectively to separate gas-phase ions based on their conformation. The drift tube employed in the experiments described herein is thermally regulated, which also enables the determination of collision cross-sections over a range of temperatures and provides a view of temperature-dependent conformational dynamics over the experimental (low microsecond) timescale. The studies described herein employ IMMS together with a gamut of other MS-based techniques, solution spectroscopy and – importantly – molecular mechanics simulations to assess a) the conformational stability of isolated peptide ions, with a focus on small model peptides and proteins, especially the Trp cage miniprotein; and b) the structural characteristics of oligomeric aggregates of an amyloidogenic peptide. The results obtained serve to clarify the factors which dominate the intrinsic stability of non-covalent structure in isolated peptides and peptide assemblies. Strong electrostatic interactions are found to play a pivotal role in determining the conformations of isolated proteins. Secondary structures held together by hydrogen bonding, such as helices, are stable in the absence of solvent; however, gas-phase protein structures display loss of their hydrophobic cores. In the absence of a polar solvent, “self-solvation” is by far the most potent force influencing the gas-phase configuration of these systems. Geometries that are more compact than the folded state observed in solution are routinely detected, indicating the existence of intrinsically stable compact non-native states in globular proteins and illuminating the nature of proteins’ ‘unfolded’ states.
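For context on the drift-tube measurement described above, the measured mobility is usually converted to an orientationally averaged collision cross-section with the Mason-Schamp relation in the low-field limit. A minimal Python sketch (instrument settings and the ion mass are hypothetical example values, not values from the thesis; helium drift gas assumed):

    import math

    KB = 1.380649e-23            # Boltzmann constant, J/K
    E_CHARGE = 1.602176634e-19   # elementary charge, C
    AMU = 1.66053906660e-27      # atomic mass unit, kg

    def collision_cross_section(drift_time_s, tube_length_m, voltage_v,
                                pressure_pa, temp_k, ion_mass_amu, charge_z,
                                gas_mass_amu=4.002602):
        """Collision cross-section (square Angstrom) from a drift-tube measurement,
        via the Mason-Schamp equation in the low-field limit."""
        # Mobility K = drift velocity / field = L^2 / (t_d * V)
        K = tube_length_m ** 2 / (drift_time_s * voltage_v)
        # Number density of the drift gas from the ideal gas law
        N = pressure_pa / (KB * temp_k)
        # Reduced mass of the ion / drift-gas pair
        mu = (ion_mass_amu * gas_mass_amu) / (ion_mass_amu + gas_mass_amu) * AMU
        # Mason-Schamp: omega = (3 z e / 16 N) * sqrt(2 pi / (mu kB T)) / K
        omega_m2 = (3.0 * charge_z * E_CHARGE / (16.0 * N)) \
            * math.sqrt(2.0 * math.pi / (mu * KB * temp_k)) / K
        return omega_m2 * 1e20  # m^2 -> Angstrom^2

    # Hypothetical Trp-cage-like ion (2+, ~2169 amu) drifting 7 ms over 0.5 m at 300 Pa, 300 K
    print(collision_cross_section(7e-3, 0.5, 400.0, 300.0, 300.0, 2169.0, 2))
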
486

Validation of diffusive samplers for nitrogen oxides and applications in various environments

Hagenbjörk-Gustafsson, Annika January 2014 (has links)
The overall aim of this thesis was to validate diffusive samplers for measurements of nitrogen dioxide (NO2) and nitrogen oxides (NOx). The Willems badge was validated for NO2 measurements in both laboratory and field tests (Papers I-II). The sampling rate was 40.0 mL/min at ambient air concentrations and 46.0 mL/min at higher concentrations. No effects of the tested factors on sampling rate were found, except for a reduced sampling rate at low wind velocity. The results of the laboratory validation were confirmed in field tests in ambient air and with personal sampling, and the correlation between the diffusive samplers and the reference monitor was good for ambient measurements. In conclusion, the Willems badge performs well at wind velocities down to 0.3 m/s, which makes it suitable for personal sampling but less suitable for measurements in indoor air, where the wind velocity is lower. Paper III reports on the field validation of the Ogawa diffusive samplers. Absolute humidity and temperature were found to have the strongest effect on sampling rate, with lower uptake rates at low absolute humidity or temperature. The sampling rates above 0 °C were 8.6 mL/min for NO2 and 9.9 mL/min for NOx. NO2 and NOx concentrations determined using the manufacturer's protocol were either underestimated or overestimated; the agreement between concentrations measured by the Ogawa sampler and the reference monitor improved when field-determined sampling rates were used to calculate concentrations. Paper IV is based on a study assessing the exposure of the Swedish general population to NO2 and some carcinogenic substances. The surveys were performed in one of five Swedish cities each year, and in each survey personal measurements of NO2 and some carcinogenic substances were conducted on 40 randomly selected individuals. In this thesis the NO2 part of the study is in focus, and results were available for eight surveys conducted across the five cities. The estimated arithmetic mean concentration for the general Swedish population was 14.1 μg/m3. The NO2 exposure level was higher for smokers than for non-smokers, and higher for people who had gas stoves at home or who were exposed at their workplace. The exposure was lower for those who had oil heating in their houses.
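For context, a badge-type diffusive sampler converts the analyte mass collected over the exposure period into a time-weighted average concentration using the validated uptake (sampling) rate. A minimal Python sketch using the 40.0 mL/min ambient-air rate reported above (the collected mass and exposure time are illustrative values, not data from the papers):

    def diffusive_sampler_concentration(mass_ug, uptake_rate_ml_per_min, exposure_min):
        """Time-weighted average concentration (ug/m3) from a diffusive sampler:
        C = collected mass / (uptake rate * exposure time)."""
        sampled_volume_m3 = uptake_rate_ml_per_min * exposure_min * 1e-6  # mL -> m3
        return mass_ug / sampled_volume_m3

    # Illustrative: 5.7 ug NO2 collected over a 7-day exposure at the 40.0 mL/min rate
    week_minutes = 7 * 24 * 60
    print(diffusive_sampler_concentration(5.7, 40.0, week_minutes))
    # ~14.1 ug/m3, the same order as the population mean quoted above (the 5.7 ug mass is made up)
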
487

Analogue to information system based on PLL-based frequency synthesizers with fast locking schemes

Lin, Ming-Lang January 2010 (has links)
Data conversion is the crucial interface between the real world and digital processing systems. Analogue-to-digital converters (ADCs) and digital-to-analogue converters are the two key devices used as this interface. Conventional ADCs based on the Nyquist sampling theorem now face a critical challenge: the resolution and the sampling rate must be radically increased for emerging applications such as radar detection and ultra-wideband communication. The offset of comparators and the setup time of sample-and-hold circuits, however, limit the resolution and clock rate of ADCs. In other applications, such as speech and temperature sensing, signals may remain unchanged for prolonged periods with only brief bursts of significant activity; if traditional ADCs are employed in such circumstances, an unnecessarily high bandwidth is required for transmitting the converted samples. On the other hand, extremely high sampling clock rates are required for converting signals that are sparse in the time domain. The level-crossing sampling scheme (LCSS) is a data conversion approach suited to signals with this sparsity and brief bursts of significant activity. Because the traditional LCSS with a fixed clock rate is limited in its applications, a novel irregular data conversion scheme called the analogue-to-information system (AIS) is proposed in this thesis. The AIS is based upon the LCSS, but an adjustable clock generator and a real-time data compression scheme are added to it. System-level simulations of the AIS show that a data transmission saving rate of nearly 30% is achieved for different signals. PLLs with fast pull-in and locking schemes are very important when applied in TDMA systems and frequency-hopping wireless systems, so a novel triple path nonlinear phase frequency detector (TPNPFD) is also proposed in this thesis. Compared to other PFDs, the pull-in and locking time of the TPNPFD is much shorter. A proper transmission data format can make the recreation of the skipped samples and the reconstruction of the original signal more efficient, i.e. they can be achieved from a minimum amount of received data without adding much hardware complexity. A preliminary data format for transmitting the converted data from the AIS is therefore given in the final chapter of this thesis for future work.
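As background to the AIS described above, the basic level-crossing sampling scheme produces output only when the input crosses one of a set of predefined amplitude levels, so a mostly idle signal with brief bursts of activity generates few samples. A minimal Python sketch of that fixed-level scheme (not of the thesis's adjustable-clock AIS; the levels and the test signal are made up):

    import numpy as np

    def level_crossing_sample(signal, levels):
        """Record (sample index, level) wherever the signal crosses a predefined level.
        Quiet stretches of the signal generate no output, which is the source of the
        data saving for sparse or bursty inputs."""
        out = []
        for i in range(1, len(signal)):
            a, b = signal[i - 1], signal[i]
            for lv in levels:
                if (a - lv) * (b - lv) < 0 or b == lv != a:  # sign change => crossing
                    out.append((i, lv))
        return out

    # Bursty test signal: flat except for a brief burst of activity
    t = np.linspace(0.0, 1.0, 1000)
    x = np.where((t > 0.4) & (t < 0.5), np.sin(2 * np.pi * 50 * t), 0.0)
    levels = np.linspace(-1.0, 1.0, 9)  # 8 uniform quantisation intervals
    samples = level_crossing_sample(x, levels)
    print(len(samples), "level-crossing samples vs", len(x), "uniformly clocked samples")
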
488

Data Science with Graphs: A Signal Processing Perspective

Chen, Siheng 01 December 2016 (has links)
A massive amount of data is being generated at an unprecedented level from a diversity of sources, including social media, internet services, biological studies, physical infrastructure monitoring and many others. The necessity of analyzing such complex data has led to the birth of an emerging framework, graph signal processing. This framework offers a unified and mathematically rigorous paradigm for the analysis of high-dimensional data with complex and irregular structure. It extends fundamental signal processing concepts such as signals, the Fourier transform, frequency response and filtering from signals residing on regular lattices, which are studied by classical signal processing theory, to data residing on general graphs, which are called graph signals. In this thesis, we consider five fundamental tasks on graphs from the perspective of graph signal processing: representation, sampling, recovery, detection and localization. Representation, aiming to concisely model shapes of graph signals, is at the heart of the proposed techniques. Sampling followed by recovery, aiming to reconstruct an original graph signal from a few selected samples, is applicable in semi-supervised learning and user profiling in online social networks. Detection followed by localization, aiming to identify and localize targeted patterns in noisy graph signals, is related to many real-world applications, such as localizing virus attacks in cyber-physical systems, localizing stimuli in brain connectivity networks, and mining traffic events in city street networks, to name just a few. We illustrate the power of the proposed tools on two real-world problems: fast resampling of 3D point clouds and mining of urban traffic data.
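To make the "sampling followed by recovery" task above concrete, one standard formulation is that a graph signal bandlimited to the first K Laplacian eigenvectors (the graph Fourier basis) can be recovered from samples on M >= K suitably chosen nodes by least squares. A minimal Python sketch of that formulation (not code from the thesis; the random graph, K and the sampled nodes are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)

    # Random undirected graph (Erdos-Renyi style) and its combinatorial Laplacian
    n = 60
    A = (rng.random((n, n)) < 0.1).astype(float)
    A = np.triu(A, 1)
    A = A + A.T
    L = np.diag(A.sum(axis=1)) - A

    # Graph Fourier basis: Laplacian eigenvectors ordered by eigenvalue (graph frequency)
    eigvals, V = np.linalg.eigh(L)

    # A K-bandlimited graph signal: combination of the K lowest-frequency eigenvectors
    K = 5
    x = V[:, :K] @ rng.standard_normal(K)

    # Sample M >= K nodes (uniformly at random here; choosing them well is part of the thesis topic)
    M = 12
    nodes = rng.choice(n, size=M, replace=False)
    y = x[nodes]

    # Recovery: least-squares fit of the bandlimited coefficients to the samples;
    # recovery is exact when the sampled rows of V[:, :K] have full column rank
    coeffs, *_ = np.linalg.lstsq(V[nodes, :K], y, rcond=None)
    x_hat = V[:, :K] @ coeffs
    print("relative recovery error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
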
489

Study of Users’ Data Volume as Function of Quality of Experience for Churn Prediction

Hemanth Kumar, Ravuri January 2016 (has links)
Customer churn has always been a problem to be addressed by telecommunication service providers. So far, work in this regard has been based on analyzing customers' historical data using different data mining techniques. Investigations based on individual user behavior, with churn prediction as the motive, are expected to give an idea of the user's point of view towards churn. Users' data volumes/data usage are seen as a parameter for assessing their satisfaction with the service. The subjective and objective behavior of mobile phone users was captured by collecting data volumes/data usage for both Wi-Fi and mobile services along with the users' ratings of Quality of Experience (QoE). The Experience Sampling Method was deployed to collect the user data: an Android tool was used to collect the users' weekly data volumes, and a questionnaire with questions regarding quality, annoyance and churn risk was used to collect the users' weekly opinions on the service. A total of 22 users participated in the study, of whom 3 churned to another service provider during the study. The collected data were analyzed using averages, correlations and decision trees, with comparisons made between Wi-Fi and mobile services and between churners and non-churners/active users. A 2-fold churn prediction model was proposed based on the conclusions of the study.
490

topicmodels: An R Package for Fitting Topic Models

Hornik, Kurt, Grün, Bettina January 2011 (has links) (PDF)
Topic models allow the probabilistic modeling of term frequency occurrences in documents. The fitted model can be used to estimate the similarity between documents as well as between a set of specified keywords using an additional layer of latent variables which are referred to as topics. The R package topicmodels provides basic infrastructure for fitting topic models based on data structures from the text mining package tm. The package includes interfaces to two algorithms for fitting topic models: the variational expectation-maximization algorithm provided by David M. Blei and co-authors and an algorithm using Gibbs sampling by Xuan-Hieu Phan and co-authors.
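The package's fitting interface is in R (roughly, LDA(dtm, k, method = "Gibbs") applied to a document-term matrix built with tm). As a language-agnostic illustration of the collapsed Gibbs sampling approach behind the second interface, here is a minimal toy sampler in Python; it is not the package's implementation, and the corpus and hyperparameters are made up:

    import numpy as np

    def lda_gibbs(docs, n_topics, n_vocab, alpha=0.1, beta=0.01, n_iter=200, seed=0):
        """Toy collapsed Gibbs sampler for LDA. `docs` is a list of lists of word ids."""
        rng = np.random.default_rng(seed)
        n_dk = np.zeros((len(docs), n_topics))  # document-topic counts
        n_kw = np.zeros((n_topics, n_vocab))    # topic-word counts
        n_k = np.zeros(n_topics)                # topic totals
        z = []                                  # topic assignment per token
        for d, doc in enumerate(docs):          # random initialisation
            zd = rng.integers(n_topics, size=len(doc))
            z.append(zd)
            for w, k in zip(doc, zd):
                n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1
        for _ in range(n_iter):
            for d, doc in enumerate(docs):
                for i, w in enumerate(doc):
                    k = z[d][i]                 # remove the current assignment
                    n_dk[d, k] -= 1; n_kw[k, w] -= 1; n_k[k] -= 1
                    # full conditional: p(k) proportional to (n_dk + alpha) * (n_kw + beta) / (n_k + V*beta)
                    p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + n_vocab * beta)
                    k = rng.choice(n_topics, p=p / p.sum())
                    z[d][i] = k                 # restore with the new assignment
                    n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1
        # point estimate of the topic-word distributions
        return (n_kw + beta) / (n_kw + beta).sum(axis=1, keepdims=True)

    # Tiny toy corpus over a 6-word vocabulary
    docs = [[0, 1, 0, 2], [1, 0, 1, 2], [3, 4, 5, 3], [4, 3, 5, 4]]
    phi = lda_gibbs(docs, n_topics=2, n_vocab=6)
    print(np.round(phi, 2))
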
