  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
471

Deltagande i fysiska videospelsträffar och dess sociala effekter för individen : En fallstudie av ett IRL-game event [Participation in physical video game gatherings and their social effects on the individual: A case study of an IRL game event]

Nyström, Kenn January 2016 (has links)
Video games and the social effects they have on society and the individual have long been a debated subject. While studies have examined several social issues and their connection to video games, little research existed on physical game gatherings such as LAN parties, or larger gatherings such as Dreamhack, and on the social effects of having physical contact with other people at these events. The goal of this study was to answer the question: “What are the social effects for the individual of participating in physical video game gatherings?”. This was done through a qualitative study consisting of five semi-structured interviews at the physical game gathering Umeå Game Night, held at the Umeå cultural centre Klossen at Ålidhems Centrum. Four of the participants were male and one was female. Snowball sampling through Game Night’s Facebook group was attempted but yielded no participants, so I instead made direct contact with people at the gathering. The interviews were all conducted during the gathering in its facility and were then transcribed for analysis using two methods: an inductive analysis, backed up by a deductive analysis based on Engeström’s model of Activity Theory. The results showed that physical game gatherings helped to overcome some of the negative social effects the participants associated with online gaming, as well as other social problems they brought up during the interviews, such as toxic behaviour, discrimination, and the feeling of not being welcome. The participants’ reaction to being at the gathering was overwhelmingly positive, and being able to socialize with other people proved to be the main motivation.
However, even though physical game gatherings mitigated negative social effects surrounding gaming, the learning process remained difficult for new players, and participating in these gatherings may even discourage new players from wanting to play or to attend further gatherings. This was mainly because of the skill disparity between the experienced players, who were the majority at the gathering, and the new players, who could feel frustrated at seeing the experienced players being much better than themselves. The less experienced female participant in the study indicated that this issue can still be overcome, but more research is needed to establish how big the issue surrounding the learning process when playing games at gatherings actually is, and whether there are ways to solve it.
472

Error in the invariant measure of numerical discretization schemes for canonical sampling of molecular dynamics

Matthews, Charles January 2013 (has links)
Molecular dynamics (MD) computations aim to simulate materials at the atomic level by approximating molecular interactions classically, relying on the Born-Oppenheimer approximation and semi-empirical potential energy functions as an alternative to solving the difficult time-dependent Schrödinger equation. An approximate solution is obtained by discretization in time, with an appropriate algorithm used to advance the state of the system between successive timesteps. Modern MD simulations treat complex systems with as many as a trillion individual atoms in three spatial dimensions. Many applications use MD to compute ensemble averages of molecular systems at constant temperature. Langevin dynamics approximates the effects of weakly coupling an external energy reservoir to a system of interest, by adding the stochastic Ornstein-Uhlenbeck process to the system momenta, where the resulting trajectories are ergodic with respect to the canonical (Boltzmann-Gibbs) distribution. By solving the resulting stochastic differential equations (SDEs), we can compute trajectories that sample the accessible states of a system at a constant temperature by evolving the dynamics in time. The complexity of the classical potential energy function requires the use of efficient discretization schemes to evolve the dynamics. In this thesis we provide a systematic evaluation of splitting-based methods for the integration of Langevin dynamics. We focus on the weak properties of methods for configurational sampling in MD, given as the accuracy of averages computed via numerical discretization. Our emphasis is on the application of discretization algorithms to high performance computing (HPC) simulations of a wide variety of phenomena, where configurational sampling is the goal.
Our first contribution is to give a framework for the analysis of stochastic splitting methods in the spirit of backward error analysis, which provides, in certain cases, explicit formulae required to correct the errors in observed averages. A second contribution of this thesis is the investigation of the performance of schemes in the overdamped limit of Langevin dynamics (Brownian or Smoluchowski dynamics), showing the inconsistency of some numerical schemes in this limit. A new method is given that is second-order accurate (in law) but requires only one force evaluation per timestep. Finally we compare the performance of our derived schemes against those in common use in MD codes, by comparing the observed errors introduced by each algorithm when sampling a solvated alanine dipeptide molecule, based on our implementation of the schemes in state-of-the-art molecular simulation software. One scheme is found to give exceptional results for the computed averages of functions purely of position.
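As an illustration of the splitting methods this thesis analyses, the sketch below implements one symmetric splitting of Langevin dynamics, often written B-A-O-A-B in the splitting notation (B: momentum kick, A: position drift, O: exact Ornstein-Uhlenbeck solve). The harmonic test potential, step size and parameter values are illustrative choices, not taken from the thesis.

```python
import math
import random

def baoab_step(q, p, force, dt, gamma=1.0, kT=1.0, mass=1.0):
    """One step of a symmetric 'BAOAB'-type splitting of Langevin dynamics:
    half kick (B), half drift (A), exact OU step (O), half drift (A),
    half kick (B)."""
    p += 0.5 * dt * force(q)                      # B: half momentum update
    q += 0.5 * dt * p / mass                      # A: half position update
    c = math.exp(-gamma * dt)                     # O: exact OU solve
    p = c * p + math.sqrt((1 - c * c) * kT * mass) * random.gauss(0, 1)
    q += 0.5 * dt * p / mass                      # A: half position update
    p += 0.5 * dt * force(q)                      # B: half momentum update
    return q, p

# Sample a 1D harmonic oscillator U(q) = q^2/2; the configurational
# average <q^2> should be close to kT = 1 for small dt.
random.seed(1)
force = lambda q: -q
q, p, acc, n = 0.0, 0.0, 0.0, 200_000
for _ in range(n):
    q, p = baoab_step(q, p, force, dt=0.2)
    acc += q * q
print(round(acc / n, 2))  # close to 1.0 (= kT)
```

The weak (sampling) error of such a scheme is exactly what the backward error analysis in the thesis quantifies: the computed average of q² differs from the exact canonical value by a term that vanishes with the step size.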
473

A Potential Supply System for Uranium Based upon a Crustal Abundance Model.

Chavez-Martinez, Mario Luis. January 1982 (has links)
The design of a computerized system for the estimation of uranium potential supply in the United States constitutes the primary objective of this dissertation. Once completed, the system estimates potential uranium supply for various levels of economic variables, such as price, without requiring appraisal by geologists, area by area, of undiscovered uranium endowment. The main components that form the system are explicit models of endowment, exploration, and production. These component models are derived from engineering and geological data, and together they comprise the system. The system is unique in that it links physical attributes of endowment to time series of price and production. This linkage is made by simulating the activities of the U.S. uranium industry, activities (exploration, mine development, and production) that are involved in the transformation of endowment to potential supply. Uranium endowment is first generated by employing a crustal abundance model; this model establishes a data file containing characteristics (tonnage, grade, depth, intra-deposit grade variation) of the discrete deposits that comprise the endowment. An exploration model relates discoveries to exploration effort and deposit characteristics. Discovery yield for a given effort is linked to the relative "discoverability" of the deposits of the endowment as well as to the total exploration effort. An economic evaluation is performed on each discovery to determine whether or not the deposit can be developed and produced, given the stated level of the economic variables. The system then determines the magnitude of potential supply that could be forthcoming from all discoverable and exploitable deposits for the stated economic circumstances. Initially, the parameters of the system must be estimated. The approach employed for this estimation makes use of the time series information on uranium exploration and production activities.
In essence, the system is used to simulate the past history of the U.S. uranium industry (period 1948-1978) and to generate industry statistics for these activities; the parameters selected are those values that cause the system to yield a time series that matches closely that which actually occurred.
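The endowment-exploration-production pipeline described in this abstract can be sketched as a small Monte Carlo simulation. Every distribution, parameter value, and cost/discoverability function below is a hypothetical placeholder for illustration; the dissertation's calibrated system is far more detailed.

```python
import random

random.seed(7)

N_DEPOSITS = 5000
PRICE = 40.0          # $/lb U3O8 (assumed)

# 1. Endowment: a crustal-abundance-style model generates discrete deposits
#    with lognormal tonnage and grade, plus a burial depth.
deposits = [(random.lognormvariate(10, 1.5),    # ore tonnage
             random.lognormvariate(-6, 0.8),    # grade (fraction U3O8)
             random.uniform(0, 600))            # depth, metres
            for _ in range(N_DEPOSITS)]

# 2. Exploration: larger, shallower deposits are more "discoverable".
#    Weighted sampling without replacement uses the Efraimidis-Spirakis
#    key random()**(1/w); a fixed effort caps the number of discoveries.
def discoverability(d):
    tons, grade, depth = d
    return tons * grade / (1.0 + depth / 100.0)

effort = 800
pool = sorted(deposits,
              key=lambda d: random.random() ** (1.0 / discoverability(d)),
              reverse=True)
discovered = pool[:effort]

# 3. Economic screen: develop a discovery only if contained metal value
#    at the stated price covers an assumed depth-dependent cost.
def exploitable(d, price):
    tons, grade, depth = d
    lbs = tons * grade * 2204.6                 # lb of U3O8 contained
    cost = tons * (8.0 + depth * 0.02)          # assumed mining cost, $
    return lbs * price > cost

supply = sum(t * g for (t, g, h) in discovered if exploitable((t, g, h), PRICE))
print(f"potential supply: {supply:,.0f} tonnes U3O8")
```

Re-running the screen at several prices traces out a potential-supply curve, which is the kind of output the dissertation's system produces after its parameters are fitted to the 1948-1978 industry time series.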
474

A study on three different sampling frames for telephone survey

Chan, Pik-heung, 陳碧響. January 1991 (has links)
published_or_final_version / Applied Statistics / Master / Master of Social Sciences
475

A comparative study of optimal stratification in business and agricultural surveys

Hayward, Michael Clifford January 2010 (has links)
This thesis is a comparative study of optimal design-based univariate stratification as applied to highly skewed populations such as those observed in business and agricultural surveys. Optimal stratification is a widely used method for reducing the variance or cost of estimates, and this work considers various optimal stratification algorithms, and in particular optimal boundary algorithms, to support this objective. We first provide a background to the theory of stratification and stratified random sampling, and extend this through the derivation of optimal allocation strategies. We then examine the effect of allocation strategies on the variance and design effect of estimators, and in particular find several issues in applying optimal or Neyman allocation when there is little correlation between the survey population and auxiliary information. We present a derivation of the intractable equations for the construction of optimal stratum boundaries, based on the work of Dalenius (1950), and derive the cumulative square root of frequency approximation of Dalenius & Hodges (1957). We then note a number of issues within the implementation of the cumulative square root of frequency rule surrounding the construction of initial intervals, and find that the placement of boundaries and the variance of estimates can be affected by the number of initial intervals. This then leads us to propose two new extensions to the cumulative square root of frequency algorithm, using linear and spline interpolation, and we find that these result in some improvements in the results for this algorithm. We also present a complete derivation of the Ekman algorithm, and consider the extended approach of Hedlin (2000). We derive several new results relating to the Ekman algorithm, and propose a new kernel density based algorithm. 
We find all three Ekman-based algorithms produce similar results for larger populations, and provide some recommendations on the use of these algorithms depending on the size of the population. We look at the derivation and implementation of the Lavallée-Hidiroglou algorithm, and find that it is often slow to converge, or does not converge, for Neyman allocation. We therefore adopt the random search model of Kozak (2004), and note that the Lavallée-Hidiroglou algorithm generally produces superior results across all populations used in this thesis. We briefly investigate the optimal number of strata by examining the work of Cochran (1977) and Kozak (2006), and find that there is a diminishing marginal effect from increasing the number of strata, with possibly some benefit from constructing more than six strata. However, we also acknowledge that the cost of constructing such strata may offset any potential gain in precision from constructing more than five or six strata. Finally we consider how many of these problems can be developed further, and ultimately find that such problems of deciding the number of strata, constructing stratum boundaries, and allocating sample units among the strata may require an approach that takes account of the relationship between the auxiliary variable and the survey information. We therefore suggest investigating these algorithms further within the context of a model-assisted environment in order to help account for the relationship between the auxiliary information and the survey population.
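The cumulative square root of frequency rule of Dalenius & Hodges referred to above admits a compact implementation: histogram the stratification variable, accumulate the square roots of the bin frequencies, and cut that cumulative scale into equal parts. This sketch is the basic textbook rule, not the interpolated extensions the thesis proposes, and the toy population is illustrative.

```python
import math
import random

def cum_sqrt_f_boundaries(values, n_bins, n_strata):
    """Dalenius-Hodges cum-sqrt(f) rule: return the n_strata - 1 stratum
    boundaries placed at equal intervals on the cumulative sqrt(frequency)
    scale of an n_bins histogram."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins
    freq = [0] * n_bins
    for v in values:
        freq[min(int((v - lo) / width), n_bins - 1)] += 1
    cum, total = [], 0.0
    for f in freq:
        total += math.sqrt(f)
        cum.append(total)
    step = total / n_strata          # equal intervals on the cum scale
    bounds, k = [], 1
    for i, c in enumerate(cum):
        if c >= k * step and k < n_strata:
            bounds.append(lo + (i + 1) * width)   # upper edge of bin i
            k += 1
    return bounds

# Skewed toy population: boundaries crowd together where the
# sqrt-frequency mass accumulates, i.e. near the origin.
random.seed(3)
pop = [random.lognormvariate(0, 1) for _ in range(10_000)]
bounds = cum_sqrt_f_boundaries(pop, n_bins=50, n_strata=4)
print(bounds)  # three increasing boundary values
```

The sensitivity to `n_bins` visible here (coarser bins shift the boundaries to bin edges) is exactly the initial-intervals issue the thesis addresses with its linear and spline interpolation extensions.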
476

Determination of Design Parameters and Investigation on Operation Performance for an Integrated Gas Cleaning System to Remove Tars from Biomass Gasification Producer Gas.

Mwandila, Gershom January 2010 (has links)
The determination of design parameters and an investigation of the operating performance of a tar removal system for cleaning biomass producer gas have been undertaken. The presence of tars in the producer gas has been the major hindrance to commercialising biomass gasification technology for power generation, hydrogen production, Fischer-Tropsch (FT) synthesis, chemical synthesis and synthetic natural gas (SNG) synthesis. The tendency of the tars to condense at reduced temperatures causes problems in downstream processing, as the tars can block and foul downstream process equipment such as gas engines, reactor channels, fuel cells, etc. Considerable effort has been directed at removing tars from the producer gas, either by chemically converting them into lighter, lower molecular weight molecules or by physically transferring them from the gas phase to a liquid or solid phase. In the latter approach, adopted here, the tars are removed in a scrubber by transferring them from the producer gas to a scrubbing liquid, then stripped from the liquid into air, and finally recycled with that air to a gasifier to recover their energy. A tar removal test system comprising a scrubber and a stripper has been designed based on the predicted tar solubility in canola methyl ester (CME), a methyl ester biodiesel used as the scrubbing liquid, and on its measured properties. The tar solubility is predicted to decrease with increasing temperature, and thus increases at lower temperatures. Designing the test system requires several design parameters, including the equilibrium coefficients of the gas-liquid system, the molar transfer coefficients, and the optimum liquid-to-gas flow rate ratio. The equilibrium coefficients have been predicted from thermodynamic theory, with the required data determined from the CME composition and the known properties of each CME component, as well as the properties of the model tar (naphthalene).
The molar transfer coefficients were then determined experimentally, and correlations as a function of liquid and gas flow rates, consistent with the literature, are proposed. The optimum liquid-to-gas flow rate ratios were found to be 21.4±0.1 for the scrubber and 5.7±0.1 for the stripper. At these optimum ratios, the tar removal efficiencies in the scrubber and the stripper are 77% and 74%, respectively. Analysing the system performance required first developing an innovative method of determining tar concentrations in both the liquid and gas phases, based on the density of liquid mixtures. These tar removal efficiencies are low because the targeted tar concentration in the scrubber's off-gas was large; as a result, the system was redesigned based on the determined design parameters and its operating performance retested. In the redesigned system, the tar removal efficiency in both the scrubber and the stripper is 99%. The redesigned system would be integrated with the UC gasifier for downstream gas cleaning. Since 1% of the tars are not removed, a make-up stream of tar-free CME of 0.0375 litres per hour (for the 100 kW UC gasifier) is introduced into the recycle stream between the scrubber and stripper to avoid tar accumulation in the system.
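The influence of the liquid-to-gas ratio on scrubber capture can be illustrated with the textbook Kremser relation for counter-current absorption. This is a standard shortcut estimate, not the thesis's design model, and the equilibrium coefficient `m` and stage count `N` below are assumed values chosen only to show the trend.

```python
def kremser_absorption_fraction(L, G, m, N):
    """Kremser equation: fraction of solute absorbed in an N-stage
    counter-current column with absorption factor A = L / (m * G),
    assuming solute-free entering liquid.  Textbook shortcut relation;
    parameter values used below are illustrative, not the thesis design."""
    A = L / (m * G)
    if abs(A - 1.0) < 1e-12:
        return N / (N + 1.0)
    return (A ** (N + 1) - A) / (A ** (N + 1) - 1.0)

# Raising the liquid-to-gas ratio raises the absorption factor and hence
# tar capture, at the cost of a larger stripping duty downstream.
for LG in (5.0, 10.0, 21.4):
    eff = kremser_absorption_fraction(L=LG, G=1.0, m=4.0, N=3)
    print(f"L/G = {LG:5.1f}  ->  absorbed fraction = {eff:.2f}")
```

This monotone improvement with L/G is why an optimum ratio exists at all: beyond some point the marginal capture gain no longer justifies the extra liquid circulation and stripping load.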
477

Improvements in ranked set sampling

Haq, Abdul January 2014 (has links)
The main focus of many agricultural, ecological and environmental studies is to develop well designed, cost-effective and efficient sampling designs. Ranked set sampling (RSS) is one sampling method that can help accomplish such objectives by incorporating prior information and expert knowledge into the design. In this thesis, new RSS schemes are suggested for efficiently estimating the population mean. These sampling schemes can be used as cost-effective alternatives to the traditional simple random sampling (SRS) and RSS schemes. It is shown that the mean estimators under the proposed sampling schemes are at least as efficient as the mean estimator with SRS. We consider the best linear unbiased estimators (BLUEs) and the best linear invariant estimators (BLIEs) for the unknown parameters (location and scale) of a location-scale family of distributions under the double RSS (DRSS) scheme. The BLUEs and BLIEs with DRSS are more precise than their counterparts based on SRS and RSS schemes. We also consider the BLUEs based on DRSS and ordered DRSS (ODRSS) schemes for the unknown parameters of a simple linear regression model using replicated observations. It turns out that, in terms of relative efficiency, the BLUEs under ODRSS are better than the BLUEs with SRS, RSS, ordered RSS (ORSS) and DRSS schemes. Quality control charts are widely recognized as powerful process monitoring tools in statistical process control. These control charts are frequently used in many industrial and service organizations to monitor in-control and out-of-control performance of a production or manufacturing process. RSS schemes have received considerable attention in the construction of quality control charts. We propose new exponentially weighted moving average (EWMA) control charts for monitoring the process mean and the process dispersion based on the BLUEs obtained under ORSS and ODRSS schemes.
We also suggest an improved maximum EWMA control chart for simultaneously monitoring the process mean and dispersion based on the BLUEs with ORSS scheme. The proposed EWMA control charts perform substantially better than their counterparts based on SRS and RSS schemes. Finally, some new EWMA charts are also suggested for monitoring the process dispersion using the best linear unbiased absolute estimators of the scale parameter under SRS and RSS schemes.
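The basic RSS scheme that these designs extend can be sketched as follows, assuming perfect ranking: draw m independent sets of m units, rank each set, and keep the i-th order statistic from the i-th set. The population, set size and replication counts below are illustrative, not from the thesis.

```python
import random
import statistics

def rss_sample(population, m, ranker=lambda x: x):
    """One ranked set sample of size m: m independent judgment sets of m
    units each; the i-th set contributes its i-th smallest unit
    (perfect ranking here, since we rank by the value itself)."""
    sample = []
    for i in range(m):
        judgment_set = sorted(random.sample(population, m), key=ranker)
        sample.append(judgment_set[i])
    return sample

# Compare the variance of the RSS mean estimator against SRS of the same
# final size m on a skewed toy population.
random.seed(11)
pop = [random.expovariate(1.0) for _ in range(50_000)]
m, reps = 5, 4000
rss_means = [statistics.fmean(rss_sample(pop, m)) for _ in range(reps)]
srs_means = [statistics.fmean(random.sample(pop, m)) for _ in range(reps)]
print(statistics.pvariance(rss_means) < statistics.pvariance(srs_means))  # True
```

The variance reduction visible here, obtained from ranking information that is cheap relative to full measurement, is what makes RSS and its double/ordered variants attractive both for mean estimation and as plotting statistics in the EWMA charts proposed above.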
478

Soil Sampling and Analysis

Walworth, James 07 1900 (has links)
5 pp. / Soil testing is comprised of four steps: Collection of a representative soil sample, laboratory analyses of the soil sample, interpretation of analytical results, and management recommendations based on interpreted analytical results.
479

Leaf Sampling Guide with Interpretation and Evaluation for Arizona Pecan Orchards

Walworth, James, Pond, Andrew, Kilby, Michael W. 07 1900 (has links)
4 pp. / Leaf analysis is an excellent tool for determining the nutritional status of pecan trees.
480

Leaf Sampling Guide with Interpretation and Evaluation for Arizona Pecan Orchards

Walworth, James L., Pond, Andrew P., Kilby, Michael W. 10 1900 (has links)
Revised; Originally Published: 2006 / 3 pp.
