  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Biased AI : The hidden problem that needs an answer / Biased AI : A hidden problem that needs a solution

Fridensköld, Jonatan January 2019
No description available.
12

Bayesian Inference of a Finite Population under Selection Bias

Xu, Zhiqing 01 May 2014
Length-biased sampling yields samples from a weighted distribution. Given the underlying distribution of the population, the attributes of the population can be estimated by converting the weighted samples back to that distribution. In this thesis, the generalized gamma distribution is taken as the underlying population distribution, and inference is carried out on the weighted distribution. Models with both known and unknown finite population size are considered. In the model with known finite population size, maximum likelihood estimation and bootstrapping are applied to derive the distributions of the parameters and the population mean. For comparison, models with and without selection bias are built; computer simulation results show that the model accounting for selection bias gives better predictions of the population mean. In the model with unknown finite population size, the distributions of the population size and of the sample complement are derived. Bayesian analysis is performed using numerical methods: both a Gibbs sampler and direct random sampling are employed to generate the parameters from their joint posterior distribution. The fit of the size-biased samples is checked using the conditional predictive ordinate.
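The central mechanism here, that sampling units with probability proportional to their size over-represents large units and must be undone by re-weighting, can be illustrated with a small simulation. The sketch below is not from the thesis: it substitutes an ordinary gamma distribution for the generalized gamma and uses the classical 1/x re-weighting (harmonic-mean) correction for a length-biased sample.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: sizes drawn from a gamma distribution
# (a stand-in for the generalized gamma used in the thesis).
population = rng.gamma(shape=2.0, scale=3.0, size=100_000)

# Length-biased sampling: the chance of selecting a unit is
# proportional to its size, so large units are over-represented.
probs = population / population.sum()
sample = rng.choice(population, size=2_000, replace=True, p=probs)

# The naive sample mean is biased upward.
naive_mean = sample.mean()

# Re-weighting each observation by 1/x converts the weighted sample
# back to the underlying distribution; for the mean this gives the
# harmonic-mean estimator.
corrected_mean = 1.0 / np.mean(1.0 / sample)

print(f"true mean      {population.mean():.3f}")
print(f"naive mean     {naive_mean:.3f}  (biased upward)")
print(f"corrected mean {corrected_mean:.3f}")
```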
13

Köp billigt, laga dyrt! : Hyperboliska preferenser som förklaring till prissättningen på reservdelsmarknader / Buy cheap, repair dear! : Hyperbolic preferences as an explanation for pricing in spare-parts markets

Sävje, Fredrik January 2009
This paper analyses the pricing of spare parts. Empirical studies have shown that manufacturers of durable goods make a disproportionately large profit on spare parts relative to the revenue they generate. It is first shown that, under the standard economic model, the price of spare parts ought to be zero, since the producer includes an insurance in the price of the main good. It is then shown that moral hazard alone does not explain the pricing found in the studies. Finally, it is analysed whether consumers with present-biased preferences could provide an explanation; the analysis finds that this is possible but somewhat unlikely.
15

Competition in Visual Working Memory

Emrich, Stephen Michael 06 December 2012
The processing of information within the visual system is limited by several cognitive and neural bottlenecks. One critical bottleneck occurs in visual working memory (VWM), as the amount of information that can be maintained on-line is limited to three to four items. While numerous theories have addressed this limited capacity of VWM, it is unclear how processing bottlenecks in the initial selection and perception of visual information affect the number or precision of representations that can be maintained in VWM. The purpose of this dissertation was to examine whether early competition for resources within the visual system limits the number or precision of representations that can be maintained in VWM. To establish whether competitive interactions affect VWM, Chapters 1–4 tested whether performance on VWM tasks was related to the distance between memory items. The results of these experiments reveal that when objects are presented close together in space, VWM performance is impaired relative to when the same objects are presented further apart. Using a three-component model of continuous responses in a recall task, Chapters 3–4 demonstrated that the distance between objects primarily affects the precision of responses and increases the number of non-target errors. Chapter 5 extended these findings to distractors, demonstrating that multiple distractors affect the precision and accuracy of VWM responses. Chapters 6–7 tested how attentional selection can bias memory representations, revealing that objects given high attentional priority were reported with greater precision. Finally, Chapters 8 and 9 examined bias signals as a potential source of individual differences in VWM performance, revealing that high performers maintain more precise representations at sub-capacity set sizes than low performers. Together, these results reveal that VWM performance is limited by competition for representation within the visual system, and that attention plays a critical role in resolving this competition and, consequently, in determining the contents of VWM.
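The "three-component model of continuous responses" mentioned above treats each recall response as coming from one of three sources: a report of the target, a report of a non-target item, or a random guess. The sketch below is a generic illustration of that kind of mixture model, not code from the dissertation; the function name and the use of von Mises response noise are assumptions.

```python
import numpy as np
from scipy.stats import vonmises

def three_component_loglik(errors, nontarget_errors, p_t, p_nt, kappa):
    """Log-likelihood of recall errors under a three-component mixture:
    target report, non-target report, or uniform guessing.

    errors           : response error relative to the target (radians), shape (n_trials,)
    nontarget_errors : response error relative to each non-target, shape (n_trials, n_nontargets)
    p_t, p_nt        : mixture weights for target and non-target reports
    kappa            : von Mises concentration (higher = more precise memory)
    """
    p_guess = 1.0 - p_t - p_nt
    target = p_t * vonmises.pdf(errors, kappa)
    # Average the non-target density over the non-targets on each trial.
    nontarget = p_nt * vonmises.pdf(nontarget_errors, kappa).mean(axis=1)
    guess = p_guess / (2.0 * np.pi)
    return np.log(target + nontarget + guess).sum()
```

Maximizing this likelihood over p_t, p_nt, and kappa (for example with scipy.optimize.minimize) yields a precision estimate (kappa) and a non-target error rate (p_nt), the two quantities the experiments above report as functions of inter-item distance.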
17

Uso de informação secundária imprecisa e inacurada no planejamento de curto prazo / Use of imprecise and inaccurate secondary information in short-term planning

Araújo, Cristina da Paixão January 2015
In mining, sampling accompanies a project from exploration through production. To reduce uncertainty in grade prediction, short-term mine planning requires additional, denser sampling to ensure accurate and precise estimates; as more samples become available, grade estimates tend to become more reliable. In the exploration stage, sampling is usually carried out with diamond drill holes (DDH), which are expensive but yield accurate and precise samples, so at this stage there are few data of high quality. In the production stage, budget constraints and the high cost of diamond drilling mean that sampling is done with other techniques; these samples are generally of low quality (imprecise and biased) and are not subjected to QA/QC protocols, so there are many data of low quality. This dissertation evaluates the impact of using such imprecise data in short-term mine planning, based on two data sets. The first study uses the exhaustive Walker Lake data set, taken as the true grades of the deposit. Samples were drawn from the exhaustive data on regular 20 × 20 m and 5 × 5 m grids, and a ±25% relative error (imprecision) and a 10% bias were added to the 5 × 5 m data (the short-term geological data) in different scenarios; different methodologies for incorporating the imprecise information into the estimates were then compared. The second study is a gold mine with two types of data: diamond drill holes (primary, hard data) and reverse circulation (secondary, soft data). Two methodologies were investigated for combining these data, cokriging and ordinary kriging, and both were used to estimate a block model. Grade-tonnage curves, swath (drift) plots, and block misclassification against the true block grades at the same support were evaluated for each study. For the Walker Lake data, the results show that standardized ordinary cokriging is the better methodology when the data are imprecise and biased and the primary and secondary variables are well correlated: its estimates are closer to the true block grade distribution, reducing block misclassification. For the gold data set, the samples show moderate correlation and short spatial continuity at small separation distances; in this situation, correcting the imprecision of the secondary variable and estimating with ordinary kriging on the combined (standardized and rescaled) hard and soft data produced better results, with less biased estimates and better classification of blocks as ore and waste.
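Ordinary kriging, one of the two estimation methods compared above, amounts to solving a small linear system: covariances between samples, and between samples and the target location, are computed from a fitted covariance model, and the weights are constrained to sum to one through a Lagrange multiplier. The sketch below is a minimal, generic illustration with an assumed exponential covariance model and made-up sample values; it is not the implementation used in the thesis.

```python
import numpy as np

def exp_cov(h, sill=1.0, a=50.0):
    """Exponential covariance model: C(h) = sill * exp(-3h / a)."""
    return sill * np.exp(-3.0 * h / a)

def ordinary_kriging(coords, values, target):
    """Ordinary kriging estimate at `target` from samples at `coords`."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    d0 = np.linalg.norm(coords - target, axis=-1)

    # Kriging system: [C 1; 1' 0] [w; mu] = [c0; 1], with sum(w) = 1.
    lhs = np.ones((n + 1, n + 1))
    lhs[:n, :n] = exp_cov(d)
    lhs[n, n] = 0.0
    rhs = np.append(exp_cov(d0), 1.0)

    weights = np.linalg.solve(lhs, rhs)[:n]
    return weights @ values

# Hypothetical samples on a 5 x 5 m spacing (illustrative values only).
coords = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0]])
values = np.array([1.2, 0.8, 1.5, 1.1])
print(ordinary_kriging(coords, values, np.array([2.5, 2.5])))
```

In the same spirit, cokriging extends this system with cross-covariances between the hard (DDH) and soft (reverse-circulation) variables, so that biased secondary data can be down-weighted rather than discarded.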
20

Experimental Investigation of Snapover: The Sudden Increase of Plasma Current Drawn to a Positively Biased Conductor When Surrounded by a Dielectric

Thomson, Clint D. 01 May 2001
Snapover is particularly relevant to Earth-orbiting spacecraft powered by high-voltage solar arrays. During snapover, the current collected by a positively biased conductor immersed in a plasma suddenly increases when two conditions are met: (i) an insulator is immediately adjacent to the conductor, and (ii) the conductor exceeds a positive threshold voltage with respect to the plasma. The enhanced current develops as a consequence of the insulator, either through secondary electron (SE) emission or through ionization of the insulator material. Experiments were performed to examine the dependence of the snapover onset potential and the collected current on conductor and insulator materials, conductor size and shape, sample history, biasing rate, and the contamination and smoothness of the dielectric surface. Numerous current jumps were observed at applied voltages between 100 V and 1000 V. Both surface roughening and surface coatings were found to inhibit snapover. In general, the results did not support previous simple interpretations of the SE model.
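One way to make the secondary-electron explanation concrete is to ask at what conductor bias the SE yield of the adjacent insulator exceeds unity, since electrons attracted to the conductor strike the nearby dielectric with an energy that scales with the applied voltage. The sketch below uses the classical Sternglass universal yield curve purely as a stand-in; the choice of formula and the material parameters are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def sternglass_yield(E, delta_max, E_max):
    """Sternglass universal curve for secondary-electron yield:
    delta(E) = 7.4 * delta_max * (E / E_max) * exp(-2 * sqrt(E / E_max))
    """
    x = E / E_max
    return 7.4 * delta_max * x * np.exp(-2.0 * np.sqrt(x))

# Illustrative parameters for a generic dielectric (assumed, not measured).
delta_max, E_max = 3.0, 400.0  # peak yield and energy of the peak (eV)

# Electrons drawn to a conductor biased at +V strike the adjacent
# insulator with an energy of order e*V; where the yield exceeds unity
# the surface can charge positive and the effective collection area grows.
for bias in (20.0, 50.0, 200.0, 1000.0):
    delta = sternglass_yield(bias, delta_max, E_max)
    status = "above unity" if delta > 1.0 else "below unity"
    print(f"bias {bias:6.0f} V  ->  SE yield {delta:4.2f}  ({status})")
```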
