351

A PARALLEL IMPLEMENTATION OF GIBBS SAMPLING ALGORITHM FOR 2PNO IRT MODELS

Rahimi, Mona 01 August 2011 (has links)
Item response theory (IRT) is a more recent framework that improves on classical measurement theory. The fully Bayesian approach shows promise for IRT models; however, it is computationally expensive, which limits its use in many applications. It is therefore important to reduce execution time, and a suitable solution is high performance computing (HPC), which offers considerable computational power and can handle applications with high computation and memory requirements. In this work, we applied two different parallelization methods to the existing fully Bayesian algorithm for 2PNO IRT models so that it can run on a high performance parallel machine with a reduced communication load. Empirical results show that our parallel version of the algorithm achieves a speedup and considerably reduces execution time.
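To make the computation concrete, here is a minimal single-chain sketch of the Gibbs sampler for the two-parameter normal ogive (2PNO) model, P(y_ij = 1) = Φ(a_j·θ_i − b_j), using the standard normal data-augmentation scheme. This is an illustration of the sampler being parallelized, not the thesis's actual HPC implementation; the priors and function names are assumptions.

```python
import numpy as np
from scipy.stats import truncnorm

def gibbs_2pno(y, n_iter=1000, seed=0):
    """Gibbs sampler sketch for the 2PNO model P(y=1) = Phi(a_j*theta_i - b_j),
    via normal data augmentation. y: (n_persons, n_items) 0/1 array.
    (Sign constraint on a_j omitted for brevity.)"""
    rng = np.random.default_rng(seed)
    n, m = y.shape
    theta = rng.standard_normal(n)      # person abilities
    a = np.ones(m)                      # item discriminations
    b = np.zeros(m)                     # item difficulties
    for _ in range(n_iter):
        # 1. Augmented responses Z_ij ~ N(a_j*theta_i - b_j, 1), truncated by y_ij.
        #    All n*m draws are conditionally independent, hence trivially parallel.
        eta = np.outer(theta, a) - b
        lo = np.where(y == 1, -eta, -np.inf)
        hi = np.where(y == 1, np.inf, -eta)
        z = eta + truncnorm.rvs(lo, hi, random_state=rng)
        # 2. theta_i | z, a, b, with a N(0,1) prior (persons independent given items).
        prec = 1.0 + a @ a
        mean = ((z + b) @ a) / prec
        theta = mean + rng.standard_normal(n) / np.sqrt(prec)
        # 3. (a_j, b_j) | z, theta: a bivariate normal regression per item,
        #    with a N(0, I) prior on the coefficients.
        X = np.column_stack([theta, -np.ones(n)])
        XtX = X.T @ X + np.eye(2)
        for j in range(m):
            mu = np.linalg.solve(XtX, X.T @ z[:, j])
            a[j], b[j] = rng.multivariate_normal(mu, np.linalg.inv(XtX))
    return theta, a, b
```

Steps 1 and 2 are conditionally independent across responses and persons, so they can be distributed across processors with only the item-level sums exchanged; that is the kind of structure that keeps the communication load low.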
352

Robust spectrum sensing techniques for cognitive radio networks

Huang, Qi January 2016 (has links)
Cognitive radio is a promising technology that improves spectral utilisation by allowing unlicensed secondary users to access underutilised frequency bands in an opportunistic manner. This task is carried out through spectrum sensing: the secondary user periodically monitors the radio spectrum for the presence of primary users in order to avoid harmful interference to the licensed service. Traditional energy-based sensing methods assume the noise power is known a priori; they suffer from the noise uncertainty problem, as even a mild noise level mismatch leads to significant performance loss. Hence, developing an efficient robust detection method is important.

In this thesis, a novel sensing technique using the F-test is proposed. Assuming a multiple-antenna receiver, this detector uses the F-statistic as the test statistic, which offers absolute robustness against noise variance uncertainty. Since channel state information (CSI) is required to be known, the impact of CSI uncertainty is also discussed. Results show that the F-test based sensing method performs better than the energy detector and has a constant false alarm probability, independent of the accuracy of the CSI estimate.

Another main topic of this thesis is the sensing problem under non-Gaussian noise. Most current sensing techniques assume Gaussian noise, as motivated by the central limit theorem (CLT) and for its mathematical tractability. However, this assumption sometimes fails to model the noise in practical wireless communication systems, which often exhibits non-Gaussian, heavy-tailed behaviour. In this thesis, several sensing algorithms are proposed for non-Gaussian noise. First, a non-parametric eigenvalue-based detector is developed by exploiting the eigenstructure of the sample covariance matrix; this detector is blind, as no information about the noise, signal or channel is required. In addition, the conventional energy detector and the aforementioned F-test based detector are generalised to non-Gaussian noise; these require the noise power and the CSI to be known, respectively.

A major concern with these detection methods is controlling the false alarm probability. Although the test statistics are easy to evaluate, the corresponding null distributions are difficult to obtain because they depend on the noise type, which may be unknown and non-Gaussian. This thesis applies the powerful bootstrap technique to overcome this difficulty. The key idea is to reuse the data through resampling instead of repeating the experiment a large number of times. By using the nonparametric bootstrap to estimate the null distribution of the test statistic, the assumptions on the data model are minimised and no large-sample assumption is invoked. In addition, for the F-statistic based method, a degrees-of-freedom modification approach for null-distribution approximation is proposed; it assumes a known noise kurtosis and yields closed-form solutions. Simulation results show that in non-Gaussian noise, all three detectors maintain the desired false alarm probability when the proposed algorithms are used. The F-statistic based detector performs best: for example, to obtain a 90% detection probability in Laplacian noise, it provides 2.5 dB and 4 dB signal-to-noise ratio (SNR) gains over the eigenvalue-based detector and the energy-based detector, respectively.
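The bootstrap thresholding idea generalises beyond any one detector. Below is a minimal sketch under assumptions of my own (a generic energy-type statistic and noise-only training data; the thesis's detectors, centring details and parameters differ) of how a nonparametric bootstrap can set a decision threshold for a target false alarm probability without knowing the noise distribution:

```python
import numpy as np

def bootstrap_threshold(x, statistic, pfa=0.05, n_boot=2000, seed=0):
    """Estimate the detection threshold for a target false-alarm probability:
    resample the observed noise-only data with replacement, recompute the
    statistic on each resample, and take the (1 - pfa) quantile of the
    resulting empirical null distribution."""
    rng = np.random.default_rng(seed)
    n = len(x)
    stats = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.choice(x, size=n, replace=True)
        stats[b] = statistic(resample)
    return np.quantile(stats, 1.0 - pfa)

# Example with heavy-tailed (Laplacian) noise, as in the thesis's simulations:
rng = np.random.default_rng(1)
noise = rng.laplace(scale=1.0, size=500)   # noise-only observation
energy = lambda v: np.mean(v**2)
thr = bootstrap_threshold(noise, energy, pfa=0.05)
print(f"declare 'signal present' when the energy statistic exceeds {thr:.3f}")
```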
353

Estudo das propriedades dos gráficos de controle bivariados com amostragem dupla / A study of the properties of bivariate control charts with double sampling

Machado, Marcela Aparecida Guerreiro [UNESP] 24 August 2006 (has links) (PDF)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) / Similarly to the X̄ chart, the Hotelling T² chart is slow to detect small or even moderate process disturbances. Earlier studies have shown that the double sampling procedure substantially improves the performance of the X̄ chart. Based on these results, this work studies the properties of the T² chart with double sampling for bivariate processes. A rotation of the Cartesian axes transforms the original variables, which are in general highly correlated, into independent variables. Working with these transformed variables in polar coordinates, it was possible to obtain the average run length (ARL), which measures how quickly the proposed chart detects a process change. Comparisons of ARLs show that the proposed control chart is, in most cases, more efficient than adaptive charts in which the sample size and/or the sampling interval are variable.
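As an illustration of the chart's decision logic, here is a minimal sketch of a double-sampling Hotelling T² rule. The limits w, cl1 and cl2 are hypothetical design parameters (in the thesis they are chosen to meet a target in-control ARL), and the ARL derivation via polar coordinates is not reproduced here:

```python
import numpy as np

def t2(x, mu0, Sigma_inv):
    """Hotelling T^2 for a sample x (n x p) against target mean mu0,
    with known process covariance Sigma."""
    n = x.shape[0]
    d = x.mean(axis=0) - mu0
    return n * d @ Sigma_inv @ d

def ds_t2_chart(sample1, sample2, mu0, Sigma, w, cl1, cl2):
    """Double-sampling T^2 decision: inspect a small first sample; only when
    its T^2 falls between the warning limit w and the first control limit cl1
    is the second sample inspected, with the combined statistic judged
    against cl2. Returns True if the chart signals out-of-control."""
    Sigma_inv = np.linalg.inv(Sigma)
    t1 = t2(sample1, mu0, Sigma_inv)
    if t1 <= w:
        return False                  # in control, stop after the cheap stage
    if t1 > cl1:
        return True                   # strong evidence, signal immediately
    combined = np.vstack([sample1, sample2])
    return t2(combined, mu0, Sigma_inv) > cl2
```

The appeal of double sampling is that most inspections stop after the small first stage, so the chart gains detection speed with little extra sampling effort.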
355

Reconciliação ilusória: compensação de erros por amostragem manual. / Illusory reconciliation: error compensation by manual sampling.

Thammiris Mohamad El Hajj 03 June 2013 (has links)
In the context of the mining industry, reconciliation can be defined as the practice of comparing the ore tonnage and mean grade predicted by the geological models with the tonnage and grade produced at the processing plant. This practice has become increasingly important since, when correctly executed, it improves the reliability of short-term planning and helps optimize the mining and ore processing operations. However, the usefulness of reconciliation depends on the quality and reliability of the input data, which are generated by different sampling methods. A good reconciliation can be illusory: in many cases, errors made at one point of the process are offset by errors made at other points, resulting in excellent reconciliations. This masks compensating biases in the system that may surface sooner or later. Sampling errors frequently lead to an erroneous analysis of the reconciliation system, with serious consequences for the operation, especially when mining reaches poorer or more heterogeneous regions of the deposit. Since good estimates are only possible with correct sampling practices, the reliability of reconciliation results depends on the representativeness of the samples that generated them. This work analyzes the manual sampling practices at a copper and gold mine in Goiás and proposes a more reliable sampling method for reconciliation purposes. The results show that the apparently excellent reconciliation between mine and plant is illusory, a consequence of the compensation of several errors introduced by the sample collection practices used for short-term planning.
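A toy numerical example (with hypothetical figures, not data from the thesis) makes the compensation effect concrete: an overestimated tonnage and an underestimated grade can cancel exactly, so the mine-to-plant comparison looks perfect while both inputs are biased.

```python
# Hypothetical illustration of compensating errors producing a deceptively
# perfect mine-to-plant reconciliation: predicted metal matches the plant's
# even though grade and tonnage are both wrong.
true_tonnage, true_grade = 100_000.0, 1.00   # t, % Cu
pred_tonnage = true_tonnage * 1.10           # model overestimates tonnage by 10%
pred_grade = true_grade / 1.10               # sampling underestimates grade ~9%

predicted_metal = pred_tonnage * pred_grade / 100
plant_metal = true_tonnage * true_grade / 100
mcf = plant_metal / predicted_metal
print(f"mine call factor = {mcf:.3f}")       # ~1.000: 'excellent' reconciliation
# Both errors stay hidden; when mining reaches lower-grade ore the biases
# stop cancelling and the discrepancy suddenly appears.
```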
356

Reconciliação pró-ativa em empreendimentos mineiros. / Proactive reconciliation in the mining industry.

Ana Carolina Chieregati 18 April 2007 (has links)
Reconciliation is the practice of comparing the ore tonnages and grades estimated by the deposit models with the tonnages and grades produced at the processing plant. The result of these comparisons is usually a set of factors that are applied to future estimates in an attempt to better predict how the operation will perform. Currently, common reconciliation practice is based on defining the mine call factor (MCF) and applying it to the resource and grade control model estimates. The MCF expresses the difference, as a ratio or percentage, between the production predicted by the models and the production reported by the plant; applying it therefore corrects the model estimates. This practice is called reactive reconciliation. However, the use of generic factors applied across differing time scales and material types often disguises the causes of the errors responsible for the observed discrepancies. The root causes of any given variance can only be identified by analysing the information behind each variance and then changing methodologies and processes. This is the concept of prognostication, or proactive reconciliation: an iterative process of constant recalibration of the data inputs and the calculations. Prognostication therefore corrects the data collection methodologies themselves, rather than simply correcting the model estimates, and allows personnel to adjust processes so that results fall within acceptable tolerance ranges. This study analyses the reconciliation practices performed at a gold mine in Brazil and suggests a new sampling protocol based on prognostication concepts.
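For contrast with the proactive approach, here is a minimal sketch of the reactive workflow the abstract describes, with hypothetical production figures: compute the MCF over past periods and apply it as a blanket correction to the next forecast.

```python
# Reactive reconciliation sketch (hypothetical figures): the mine call factor
# is the ratio of plant-reported to model-predicted production.
def mine_call_factor(plant_metal: float, predicted_metal: float) -> float:
    return plant_metal / predicted_metal

periods_pred = [4_200.0, 3_900.0, 4_500.0]   # predicted metal, t, per period
periods_plant = [3_800.0, 3_600.0, 4_000.0]  # plant-reported metal, t

mcf = mine_call_factor(sum(periods_plant), sum(periods_pred))
next_estimate = 4_300.0
print(f"MCF = {mcf:.3f}; corrected forecast = {next_estimate * mcf:.0f} t")
# Reactive: the factor papers over the discrepancy. Proactive reconciliation
# would instead trace each period's variance back to its sampling causes.
```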
357

A Literature Review of Wipe Sampling Methods for Pesticides in Published Exposure Measurement Studies in the United States

Low, Christopher Michael 19 October 2016 (has links)
Pesticides are widely used in the United States to control pests in settings ranging from residential homes to agricultural crops. Most pesticides, when used in accordance with the manufacturer's label, are relatively safe and degrade naturally once released into the environment; however, these degradative processes can be hindered when pesticides are introduced indoors. Furthermore, it has been shown that pesticides readily bind to surface dislodgeable residues (SDRs), commonly known as dust. Various methods can be used to characterize the presence of, and exposure to, pesticides indoors. Wipe sampling is commonly used to measure pesticides on surfaces because it is simple and inexpensive; however, several wipe sampling methods exist, each with varying steps involving the wiping material, pre-treatment of wipes, wetting solvent, surface type, collection pattern, and storage. The purpose of this literature review is to concisely summarize the methods of eighteen recent studies that used surface wipes to sample for pesticides in indoor environments. This report details the methods applied to perform the literature review, provides general wipe sampling information from government agencies, discusses other related surface sampling methods, briefly summarizes the wipe sampling method applied in each study, and compares the methods to provide considerations for those seeking to use surface wipes for sampling pesticides. Overall, there appear to be more variations than similarities among the wipe sampling methods reviewed. Similarities included the use of isopropyl alcohol (IPA) as the wetting solvent and the way wipe samples were stored after collection. The differences in wiping materials, pre-treatment of wipes, surface types, and collection patterns demonstrate the need for a standardized method. Until one is established, comparisons between study results will remain poor and knowledge gaps will persist.
358

Objectivity in stratification, sampling and classification of vegetation

Westfall, Robert Howard 03 September 2009 (has links)
The aims of this study are to increase objectivity in the stratification, sampling and classification of vegetation, thereby improving the repeatability, predictability and relevance of vegetation classifications. The aims are achieved by relating stratification, sampling and classification to scale; improving small-scale vegetation mapping using satellite imagery; improving plant cover estimation; and classifying vegetation by minimum entropy. A comprehensive computer program package was developed to support these aims and reduce the time spent on vegetation analyses. It is recommended that the vegetation resource be given the highest national priority, because correct vegetation management also ensures the conservation of soil and soil water. / Thesis (PHD)--University of Pretoria, 2009. / Plant Science / unrestricted
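The abstract does not spell out the minimum-entropy criterion, so the following is only a toy sketch of the general idea, under an assumption of my own: score each candidate classification by the Shannon entropy of its species-by-class frequency table, and prefer the grouping that concentrates each species in fewer classes (lower entropy).

```python
import numpy as np

def shannon_entropy(counts):
    """Shannon entropy (nats) of a frequency table, ignoring zero cells."""
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Hypothetical species-by-class tables for two candidate classifications of
# the same releves: lower entropy means each species is concentrated in
# fewer classes, i.e. the 'tighter' classification.
candidate_a = np.array([[30, 2], [1, 25], [28, 3]])
candidate_b = np.array([[16, 16], [13, 13], [15, 16]])
best = min([candidate_a, candidate_b], key=shannon_entropy)
print("candidate A wins" if best is candidate_a else "candidate B wins")
```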
359

Long-term use of fish and shellfish resources revealed through vibracore sampling at EjTa-13, Hecate Island, Central Coast, BC

Duffield, Seonaid Eileen Shute 03 January 2018 (has links)
This Master's research program was undertaken as part of the Hakai Ancient Landscapes Archaeology Project in Heiltsuk and Wuikinuxv Territories on the Central Pacific Coast of British Columbia (BC), Canada. The project tested the utility of vibracore technology for sampling a shell midden site on Hecate Island on the BC Central Coast. This revealed that the earliest archaeological occupation began approximately 6,000 years ago and continued into the 16th century AD. Analysis of 21 radiocarbon dates from six core samples shows that the site was repeatedly occupied, that deposits accumulated consistently throughout the tested area, and that they extend to a depth of 544 cm below surface. The sampled sediments were used to evaluate evidence of fisheries resource management through time, with reference to the nearby, intensively studied archaeological site of Namu (ElSx-1). Zooarchaeological results show that herring (Clupea pallasii), salmon (Oncorhynchus spp.), rockfish (Sebastes spp.) and greenling (Hexagrammos spp.) were fished persistently and in similar abundances throughout the occupation of the site. In total, 19,173 vertebrate specimens were recovered, of which 6,566 were identified. Results also show a consistent harvest of certain shellfish taxa (e.g., mussel and barnacle); however, shellfish weight per litre increases through time. Comparing the relative abundance of herring and salmon through time at Namu and EjTa-13 shows that salmon was more abundant at Namu than at EjTa-13, likely due to the productivity of the Namu River adjacent to the site. In contrast, herring remains were represented similarly at both sites, indicating that the resource was equally desirable at EjTa-13 and Namu. Surprisingly, a large number of very small artifacts of various materials was also recovered (an estimated 550 artifacts per cubic metre of cultural sediment), which indicates that the field and laboratory methods used are especially conducive to the recovery of small items. These results show a persistent and sustainable local fishery through six millennia, up to the contact period. / Graduate / 2018-12-15
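The herring-versus-salmon comparison rests on the standard zooarchaeological measure of relative abundance, %NISP: a taxon's number of identified specimens as a share of all identified specimens. A small sketch with hypothetical counts (not the thesis's data):

```python
# %NISP = taxon NISP / total identified NISP, per site or stratum.
def percent_nisp(nisp_by_taxon: dict[str, int]) -> dict[str, float]:
    total = sum(nisp_by_taxon.values())
    return {taxon: 100.0 * n / total for taxon, n in nisp_by_taxon.items()}

# Hypothetical per-layer counts for the two sites compared above.
ejta13_layer = {"herring": 1240, "salmon": 310, "rockfish": 95, "greenling": 60}
namu_layer = {"herring": 1180, "salmon": 905, "rockfish": 80, "greenling": 40}
for site, layer in [("EjTa-13", ejta13_layer), ("Namu", namu_layer)]:
    shares = percent_nisp(layer)
    print(site, {t: f"{v:.1f}%" for t, v in shares.items()})
```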
360

Randomization in a two armed clinical trial: an overview of different randomization techniques

Batidzirai, Jesca Mercy January 2011 (has links)
Randomization is the key element of any sensible clinical trial: it is the only way to be sure that patients have been allocated to the treatment groups without bias and that the groups are comparable before the start of the trial. The randomization scheme used to allocate patients plays a central role in achieving this goal. This study uses SAS simulations and categorical data analysis to compare the two main classes of randomization scheme in dental studies with small samples: unrestricted randomization, represented by simple randomization, and restricted randomization, represented by the minimization method. Results show that minimization produces almost equally sized treatment groups, whereas simple randomization is weak at balancing prognostic factors; nevertheless, simple randomization can, by chance, produce balanced groups even in small samples. Statistical power is also better under minimization than under simple randomization, although bigger samples might be needed to boost power.
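As a concrete illustration of the restricted scheme, here is a simplified (Taves-style) minimization sketch in Python rather than SAS; the prognostic factors, the biased-coin probability p_pref and all names are assumptions for illustration.

```python
import random

def minimization_assign(patient, allocated, factors, p_pref=0.8, rng=random):
    """Taves-style minimization sketch: count, for each arm, how many
    already-allocated patients match the new patient on each prognostic
    factor, then prefer (with probability p_pref) the arm with the smaller
    total imbalance. `patient` maps factor -> level; `allocated` is a list
    of (arm, patient) tuples; arms are 0 and 1."""
    scores = [0, 0]
    for arm, prev in allocated:
        scores[arm] += sum(prev[f] == patient[f] for f in factors)
    if scores[0] == scores[1]:
        return rng.randint(0, 1)                 # tie: fall back to a coin flip
    preferred = 0 if scores[0] < scores[1] else 1
    return preferred if rng.random() < p_pref else 1 - preferred

# Usage: allocate 20 patients with two prognostic factors.
factors = ["sex", "age_group"]
allocated = []
rng = random.Random(7)
for _ in range(20):
    pt = {"sex": rng.choice("MF"), "age_group": rng.choice(["<40", ">=40"])}
    arm = minimization_assign(pt, allocated, factors, rng=rng)
    allocated.append((arm, pt))
print([sum(1 for a, _ in allocated if a == g) for g in (0, 1)])  # group sizes
```

The biased coin (p_pref < 1) keeps the allocation unpredictable while still steering each assignment toward the arm that reduces prognostic-factor imbalance, which is how minimization achieves nearly equal, well-balanced groups even in small trials.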
