401
Probabilistic estimation of the degree of exceptionality of an arbitrary element in a finite data set: an application of variable precision rough set theory. Fernández Oliva, Alberto. 27 September 2010.
No description available.
402
Non-destructive method for predicting peanut (Arachis hypogaea L.) maturity using remote sensing. Santos, Adão Felipe dos. January 2019.
Advisor: Rouverson Pereira Silva
Abstract: The use of remote sensing techniques has increased markedly in agriculture in recent years for many crops; however, work on peanut is still scarce, particularly work aimed at solving one of the crop's main problems, the prediction of maturity. Studies were therefore conducted in Brazil and the USA to assess the potential of aerial and orbital remote sensing for predicting peanut maturity. The first chapter of this thesis is the literature review. In the second, a variability analysis using control charts was performed to identify differences in the spectral reflectance and vegetation indices obtained from drone images and from the PlanetScope satellite; the two platforms were found to behave similarly over time. In the third, two commercial areas in the USA, one irrigated and one non-irrigated, were imaged by drone; the vegetation indices that behaved similarly in both areas were those whose original equation was modified by replacing the red band with the red edge band (NLI and MNLI). In the fourth chapter, a commercial area in Brazil was studied, with reflectance extracted from PlanetScope satellite images; the vegetation indices with the lowest maturity-prediction errors were the NDVI and the SR. Finally, chapter five presents final considerations, some recommendations, and the next steps of the research.
Degree: Doutor (Doctorate)
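The vegetation indices named in this abstract follow widely used formulations. As a minimal sketch of how such indices are computed from band reflectances (the band names, the soil-adjustment constant L, and the red-edge substitution for NLI/MNLI are assumptions based on common definitions, not the thesis's exact equations):

```python
import numpy as np

def vegetation_indices(nir, red, red_edge, L=0.5):
    """Compute common vegetation indices from per-pixel reflectances.
    Standard formulations are assumed; the red-edge variants mirror the
    substitution described in the abstract (red band -> red edge band)."""
    ndvi = (nir - red) / (nir + red)                       # Normalized Difference Vegetation Index
    sr = nir / red                                         # Simple Ratio
    nli_re = (nir**2 - red_edge) / (nir**2 + red_edge)     # NLI with red edge in place of red
    mnli_re = (nir**2 - red_edge) * (1 + L) / (nir**2 + red_edge + L)  # modified NLI, red-edge form
    return {"NDVI": ndvi, "SR": sr, "NLI_re": nli_re, "MNLI_re": mnli_re}

# Example with hypothetical reflectance values (not PlanetScope or drone data from the thesis)
bands = dict(nir=np.array([0.45, 0.50]),
             red=np.array([0.08, 0.06]),
             red_edge=np.array([0.20, 0.18]))
print(vegetation_indices(**bands))
```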
403
ON-FARM UTILIZATION OF PRECISION DAIRY MONITORING: USEFULNESS, ACCURACY, AND AFFORDABILITY. Eckelkamp, Elizabeth A. 01 January 2018.
Precision dairy monitoring is used to supplement or replace human observation of dairy cattle. This study examined the value dairy producers placed on disease alerts generated from a precision dairy monitoring technology. A secondary objective was calculating the accuracy of technology-generated disease alerts compared against observed disease events. A final objective was determining the economic viability of investing in a precision dairy monitoring technology for detecting estrus and diseases.
A year-long observational study was conducted on four Kentucky dairy farms. All lactating dairy cows were equipped with neck and leg tri-axial accelerometers. The technologies measured eating time, lying time, standing time, walking time, and activity (steps) in 15-min intervals throughout the day. A decrease of ≥ 30% from a cow's 10-d moving behavioral mean generated an alert. Alerts were assessed by dairy producers for usefulness and by the author for accuracy. Finally, the raw data were analyzed with three machine-learning techniques: random forest, linear discriminant analysis, and principal component neural networks.
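The alert rule stated above (a drop of at least 30% relative to a 10-d moving mean of each behavior) can be sketched as follows; the data layout and column naming are hypothetical, not the monitoring vendor's actual output format:

```python
import pandas as pd

def daily_alerts(daily_behavior: pd.Series, window: int = 10, drop: float = 0.30) -> pd.Series:
    """Flag days on which a behavior (e.g., daily eating or lying minutes)
    falls >= `drop` below the cow's trailing moving mean.
    `daily_behavior` is a date-indexed series for one cow; the layout is assumed."""
    baseline = daily_behavior.shift(1).rolling(window, min_periods=window).mean()
    return daily_behavior <= (1 - drop) * baseline

# Hypothetical example: eating minutes per day for one cow
eating = pd.Series(
    [260, 255, 265, 250, 258, 262, 249, 255, 260, 257, 150],
    index=pd.date_range("2017-05-01", periods=11, freq="D"),
)
print(daily_alerts(eating).astype(int))  # only the final day (a ~41% drop) triggers an alert
```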
Through generalized linear mixed model analyses, dairy producers were found to utilize the alert list when ≤ 20 alerts occurred, when alerts occurred for cows ≤ 60 d in lactation, and when alerts occurred during the week. The longer the system was in place, the less likely producers were to utilize alerts, likely because the alerts did not indicate a specific disease but rather informed the dairy producer that an issue might have occurred. The longer dairy producers were exposed to a technology, the more easily they decided which alerts were worth their attention.
Sensitivity, specificity, accuracy, and balanced accuracy were calculated for the disease alerts that occurred and the disease events that were reported. Sensitivity ranged from 12 to 48%, specificity from 91 to 96%, accuracy from 90 to 96%, and balanced accuracy from 50 to 59%. The high number of false positives corresponded with the lack of usefulness producers reported. Machine-learning techniques improved sensitivity (66 to 86%) and balanced accuracy (48 to 85%). Specificity (24 to 89%) and accuracy (70 to 86%) decreased with the machine-learning techniques, but overall detection performance improved. Precision dairy monitoring technologies have the potential to detect behavior changes linked to disease events.
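The reported metrics follow their usual definitions, with balanced accuracy as the mean of sensitivity and specificity. A minimal sketch with hypothetical alert-versus-event counts (not the study's data):

```python
def detection_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Sensitivity, specificity, accuracy, and balanced accuracy from
    alert-vs-event counts; standard definitions are assumed."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    balanced_accuracy = (sensitivity + specificity) / 2
    return dict(sensitivity=sensitivity, specificity=specificity,
                accuracy=accuracy, balanced_accuracy=balanced_accuracy)

# Hypothetical counts: many healthy cow-days, few true disease events detected
print(detection_metrics(tp=12, fp=380, tn=9500, fn=38))
```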
A partial budget was created based on the reproduction, production, and early lactation removal rate of an average cow in a herd. The cow-level results were scaled to a 1,000-cow herd for sensitivity analyses. Four analyses were run: increased milk production from early disease detection, increased estrus detection rate, decreased early lactation removal from early disease detection, and all changes in combination. Economic profitability was determined through net present value, with a value ≥ $0 indicating a profitable investment. Each sensitivity analysis was conducted over 10,000 iterations, with key inputs randomly drawn from previously defined distributions. If either milk production or estrus detection improved, net present value was ≥ $0 in 76 and 85% of the iterations, respectively. However, reduced early lactation removal alone never resulted in a value ≥ $0. Investing in a precision dairy technology that improved estrus detection rate and early disease detection was a positive economic decision in most iterations.
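The sensitivity analysis described above can be sketched as a simple Monte Carlo loop over a partial budget. All cash-flow terms, distributions, horizon, discount rate, and per-cow technology cost below are illustrative placeholders, not the dissertation's budget inputs:

```python
import numpy as np

rng = np.random.default_rng(42)

def npv(cash_flows, rate):
    """Net present value of annual net cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def simulate(n_iter=10_000, herd_size=1_000, horizon_yr=7, rate=0.08, tech_cost_per_cow=125.0):
    """Share of iterations with NPV >= $0; every input here is an assumption."""
    results = np.empty(n_iter)
    for i in range(n_iter):
        # Randomly drawn annual benefit per cow (milk, reproduction, and culling effects combined)
        benefit_per_cow = rng.normal(loc=30.0, scale=15.0)
        initial_cost = -tech_cost_per_cow * herd_size
        annual = benefit_per_cow * herd_size
        results[i] = npv([initial_cost] + [annual] * horizon_yr, rate)
    return (results >= 0).mean()

print(f"P(NPV >= $0) ≈ {simulate():.2%}")
```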
404
STUDIES OF MAGNETICALLY INDUCED FARADAY ROTATION BY POLARIZED HELIUM-3 ATOMS. Abney, Joshua. 01 January 2018.
Gyromagnetic Faraday rotation offers a new method to probe limits on properties of simple spin systems, such as the possible magnetic moment of asymmetric dark matter, or to serve as a polarization monitor for polarized targets. Theoretical calculations predict that the expected rotations of linearly polarized light due to the magnetization of spin-1/2 particles are close to or beyond the limit of what can currently be measured experimentally (10⁻⁹ rad). So far, this effect has not been verified. Nuclear-spin-polarized 3He provides an ideal test system due to its simple structure and the ability to achieve high nuclear spin polarization via spin-exchange optical pumping (SEOP). To maximize the expected signal from 3He, a SEOP system was built with a modern narrowband pumping laser and a 3He target designed for use with a multipass cavity. Additionally, a sensitive triple-modulation polarimetry apparatus was utilized and further developed to detect Faraday rotations on the order of nanoradians. This work presents the results of the measurement of the magnetic Faraday effect.
405
AUTOMATED BODY CONDITION SCORING: PROGRESSION ACROSS LACTATION AND ITS ASSOCIATION WITH DISEASE AND REPRODUCTION IN DAIRY CATTLE. Truman, Carissa Marie. 01 January 2019.
Body condition scoring is a technique used to noninvasively assess fat reserves. It provides an objective estimate of the current and past nutritional status of the dairy cow and has been associated with disease risk and breeding success. Traditionally, body condition scores are assigned manually by visual appraisal on a 1-to-5 scale in one-quarter increments. However, recent studies have shown the potential of automating body condition scoring from images. The first objective was to estimate the likelihood of disease development and breeding success, using odds ratios, associated with automatically assigned body condition scores at various points in lactation. The second objective was to use a commercially available automated body condition scoring camera system to monitor body condition across the lactation period, to evaluate differences between stratified parameters, and to develop an equation to predict the dynamics of the body condition score. We found that poor body condition scores at different times during the transition period were associated with increased disease occurrence and lower reproductive success. The automated body condition score (ABCS) curve during lactation was influenced by many factors, including parity, ABCS at calving, disease occurrence, and milk production.
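Odds ratios of the kind used for the first objective are computed from a 2×2 table of score category by outcome. A minimal sketch with hypothetical counts (not the study's data), including a standard Wald confidence interval:

```python
import math

def odds_ratio(a: int, b: int, c: int, d: int):
    """Odds ratio and 95% Wald CI for a 2x2 table:
    rows = exposure (e.g., low BCS at calving: yes/no),
    columns = outcome (e.g., disease in early lactation: yes/no)."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = (math.exp(math.log(or_) + s * 1.96 * se_log) for s in (-1, 1))
    return or_, (lo, hi)

# Hypothetical counts: 30 of 120 low-BCS cows diseased vs 45 of 480 other cows
print(odds_ratio(a=30, b=90, c=45, d=435))
```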
406
Machine learning for determining sugarcane production environments. Almeida, Gabriela Mourão de. January 2019.
Advisor: Gener Tadeu Pereira
Abstract: Sugarcane is one of the most significant crops in the national agricultural market. Aiming to increase productivity and raw-material quality, techniques such as site-specific management have been adopted by the mills for many years, though still in a manual way. The objective of this work is to determine sugarcane management environments using a reduced set of low-cost variables, by means of a machine-learning technique. To achieve maximum prediction efficiency, the data were submitted to descriptive statistics and then to stepwise regression selection to determine which variables would be useful to the model. A multicollinearity test was then applied and, finally, a classification decision tree was fitted. A confusion matrix was prepared to evaluate the efficiency of the model. The variables linked to soil formation characteristics were the ones chosen to determine the production environments, with sand content standing out. The stepwise regression technique proved efficient for variable selection, and the decision tree was efficient in determining the environments, achieving a satisfactory accuracy of 75% and generating more continuous management environments within the cultivation area.
Degree: Mestre (Master's)
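As an illustration of the pipeline described (variable screening, a multicollinearity check, a classification tree, and a confusion matrix), here is a hedged sketch with synthetic data standing in for the soil and terrain variables; it is not the thesis's code, dataset, or variable list:

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, accuracy_score
from statsmodels.tools.tools import add_constant
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
n = 300
# Synthetic stand-ins for low-cost soil variables (sand, clay, organic matter, elevation)
X = pd.DataFrame({
    "sand": rng.normal(600, 120, n),
    "clay": rng.normal(300, 80, n),
    "om": rng.normal(15, 4, n),
    "elev": rng.normal(550, 30, n),
})
# Synthetic management environments driven mainly by sand content, mirroring the abstract's finding
y = pd.cut(X["sand"], bins=[-np.inf, 520, 680, np.inf], labels=["A", "B", "C"])

# Multicollinearity check: drop variables with a high VIF (the threshold of 10 is a common rule of thumb)
Xc = add_constant(X)
vif = pd.Series([variance_inflation_factor(Xc.values, i) for i in range(1, Xc.shape[1])],
                index=X.columns)
X = X.loc[:, vif < 10]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
pred = tree.predict(X_te)
print(confusion_matrix(y_te, pred))
print(f"accuracy = {accuracy_score(y_te, pred):.2f}")
```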
407
Site- and Location-Adjusted Approaches to Adaptive Allocation Clinical Trial Designs. Di Pace, Brian S. 01 January 2019.
Response-Adaptive (RA) designs are used to adaptively allocate patients in clinical trials. These methods have been generalized to include Covariate-Adjusted Response-Adaptive (CARA) designs, which adjust treatment assignments for a set of covariates while maintaining features of the RA designs. Challenges may arise in multi-center trials if differential treatment responses and/or effects among sites exist. We propose Site-Adjusted Response-Adaptive (SARA) approaches to account for inter-center variability in treatment response and/or effectiveness, including either a fixed site effect or both random site and treatment-by-site interaction effects to calculate conditional probabilities. These success probabilities are used to update assignment probabilities for allocating patients between treatment groups as subjects accrue. Both frequentist and Bayesian models are considered. Treatment differences could also be attributed to differences in social determinants of health (SDH) that often manifest, especially if unmeasured, as spatial heterogeneity amongst the patient population. In these cases, patient residential location can be used as a proxy for these difficult to measure SDH. We propose the Location-Adjusted Response-Adaptive (LARA) approach to account for location-based variability in both treatment response and/or effectiveness. A Bayesian low-rank kriging model will interpolate spatially-varying joint treatment random effects to calculate the conditional probabilities of success, utilizing patient outcomes, treatment assignments and residential information. We compare the proposed methods with several existing allocation strategies that ignore site for a variety of scenarios where treatment success probabilities vary.
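As a generic illustration of how response-adaptive allocation ties assignment probabilities to estimated success probabilities, here is a simplified beta-binomial scheme with a tempering exponent; it is a sketch of the general RA idea only, not the SARA or LARA models proposed here, which additionally adjust for site or spatially varying random effects:

```python
import numpy as np

rng = np.random.default_rng(1)

def allocation_probability(successes, failures, gamma=0.5):
    """Posterior-mean success rates under independent Beta(1, 1) priors,
    turned into an assignment probability for arm A. `gamma` tempers how
    aggressively allocation follows the estimated rates."""
    p = np.array([(s + 1) / (s + f + 2) for s, f in zip(successes, failures)])
    w = p ** gamma
    return w[0] / w.sum()

successes, failures = [0, 0], [0, 0]
true_rates = [0.65, 0.45]             # hypothetical true success probabilities
for _ in range(200):                   # patients accrue one at a time
    prob_a = allocation_probability(successes, failures)
    arm = 0 if rng.random() < prob_a else 1
    outcome = rng.random() < true_rates[arm]
    successes[arm] += outcome
    failures[arm] += 1 - outcome

print(f"final P(assign to A) = {allocation_probability(successes, failures):.2f}")
print("patients per arm:", [successes[i] + failures[i] for i in range(2)])
```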
408
Use of passive samplers to characterize the spatial heterogeneity of coarse particle mass concentration and composition in Cleveland, OH. Sawvel, Eric J. 01 December 2013.
The overall goals of this dissertation are: 1) to better quantify the spatial heterogeneity of coarse particulate matter (PM10-2.5) and its chemical composition; and 2) to evaluate the performance (accuracy and precision) of passive samplers analyzed by computer-controlled scanning electron microscopy with energy-dispersive X-ray spectroscopy (CCSEM-EDS) for PM10-2.5. For these goals, field studies were conducted over multiple seasons in Cleveland, OH and were the source of data for this dissertation.
To achieve the first goal, we characterized spatial variability in the mass and composition of PM10-2.5 in Cleveland, OH with the aid of inexpensive passive samplers. Passive samplers were deployed at 25 optimized sites for three week-long intervals in summer 2008 to characterize spatial variability in components of PM10-2.5. The size and composition of individual particles were determined using CCSEM-EDS. For each sample, this information was used to estimate PM10-2.5 mass and aerosol composition by particle class. The highest PM10-2.5 means were observed at three central industrial urban sites (35.4 μg m-3, 43.4 μg m-3, and 47.6 μg m-3), whereas lower means were observed to the west and east of this area, with the lowest means observed at outlying suburban background sites (12.9 μg m-3 and 14.7 μg m-3). Concentration maps for PM10-2.5 and some of its compositional components (Fe-oxide-rich and Ca-rich) show an elongated band of high values stretching from Lake Erie south through the central industrial area, whereas those for other compositional components (e.g., Si/Al-rich) are considerably less heterogeneous. The findings from the analysis of spatial variability of coarse particles by compositional class, presented in Chapter II of this dissertation, show that the concentrations of some particle classes were substantially more spatially heterogeneous than others. The data suggest that industrial sources located in The Flats district in particular may contribute to the observed concentration variability and heterogeneity. Lastly, percent relative spatial heterogeneity (SH%) is more consistent with the spatial heterogeneity visualized in the concentration surface maps than is the coefficient of divergence (COD).
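The coefficient of divergence mentioned above can be illustrated with a small sketch; the COD follows its usual pairwise definition, and the site values are hypothetical (SH% is defined in the dissertation and is not reproduced here):

```python
import numpy as np

def coefficient_of_divergence(x_j, x_k):
    """Pairwise coefficient of divergence between two sites, using the common
    definition sqrt(mean(((x_j - x_k) / (x_j + x_k))**2)).
    Values near 0 indicate homogeneity; values near 1 indicate strong divergence."""
    x_j, x_k = np.asarray(x_j, float), np.asarray(x_k, float)
    return np.sqrt(np.mean(((x_j - x_k) / (x_j + x_k)) ** 2))

# Hypothetical weekly PM10-2.5 concentrations (ug/m^3) at three sites
industrial = [47.6, 43.4, 35.4]
suburban = [14.7, 12.9, 13.5]
suburban_2 = [13.9, 12.1, 14.2]

print(f"industrial vs suburban COD = {coefficient_of_divergence(industrial, suburban):.2f}")
print(f"suburban vs suburban   COD = {coefficient_of_divergence(suburban, suburban_2):.2f}")
```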
The second goal was achieved by assessing the performance of passive samplers analyzed by CCSEM-EDS to measure PM10-2.5 (Chapter III) and by investigating potential sources of variability in the measurement of PM10-2.5 with passive samplers analyzed by CCSEM-EDS (Chapter IV). Data for these analyses were obtained in studies conducted in summer 2009 and winter 2010. The precision of PM10-2.5 measured with the passive samplers was highly variable, with coefficients of variation (CV) ranging from a low of 2.1% to a high of 90.8%. Eighty percent of the CVs were less than 40%. This assessment showed that the CV for passive samplers was greater than that recommended by the United States Environmental Protection Agency (EPA) guidelines for the Federal Reference Method (FRM). Several CV values exceeded 40%, indicating substantially dissimilar results between co-located passive samplers. The overall CV for the passive samplers was 41.2% in 2009 and 33.8% in 2010. When high CVs (> 40%; n = 5 of 25) were excluded from the analysis, the precision was 24.1% in 2009 and 18.2% in 2010.
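Precision for co-located samplers is summarized here with the coefficient of variation; a minimal sketch with hypothetical replicate masses (not the study's measurements):

```python
import statistics

def cv_percent(values):
    """Coefficient of variation (%) for replicate measurements from
    co-located samplers: 100 * sample standard deviation / mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical co-located triplicates (ug/m^3) at two sites
print(f"site A CV = {cv_percent([21.4, 23.0, 22.1]):.1f}%")   # tight agreement
print(f"site B CV = {cv_percent([9.8, 17.5, 30.2]):.1f}%")    # poor agreement
```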
Despite issues with precision, PM10-2.5 measured with passive samplers agreed well with that measured with FRM samplers, with accuracy approaching EPA Federal Equivalent Method (FEM) criteria. The intercept was 1.21 and not statistically significant (p = 3.88). The passive-to-FRM sampler comparison (1:1) line fell within the 95% confidence interval (CI) for the best-fit linear regression, which was statistically significant (p < 0.05). However, several data points had large standard deviations, resulting in high variability between co-located passive samplers (n = 3), which extended outside of the 95% CIs. The passive sampler limit of detection (LOD) for the CCSEM method was 2.8 μg m-3. This study also showed that certain samples had higher CVs and that further investigation was needed to better understand the sources of variability in the measurement of PM10-2.5 with passive samplers.
Sources of variability observed in the measurement of PM10-2.5 with passive samplers analyzed by CCSEM were explored in Chapter IV of this dissertation. This research suggests that mass concentrations greater than 20 μg m-3 for week-long samples are needed on the passive sampler substrate to obtain overall CVs by mass of less than 15%. It also suggests that more than 55 particle counts within a compositional class are needed to reduce analytical CVs to less than 15%. Another finding was that increasing the concentration from 6.2 to 10.6 μg m-3 improved the CCSEM analytical precision by 38% by mass and by 75% by number for random orientation. Certain compositional classes also appeared problematic for the precision of passive sampler measurements. For example, the presence of salt plus moisture introduces challenges for CCSEM analysis: crystalline salt particles wet and dissolve, creating a displaced dry deposition pattern upon subsequent evaporation. This process can falsely elevate or reduce the particle count and alter its distribution on the sampling media.
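One way to rationalize the particle-count threshold, assuming the class counts follow Poisson counting statistics (an assumption for illustration, not an analysis from the dissertation), is that the relative counting uncertainty scales as the inverse square root of the count:

```latex
\mathrm{CV}_{\text{count}} \approx \frac{\sqrt{N}}{N} = \frac{1}{\sqrt{N}},
\qquad
N = 55 \;\Rightarrow\; \mathrm{CV} \approx \frac{1}{\sqrt{55}} \approx 0.135 = 13.5\% < 15\%.
```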
409
Risky Predictions, Damn Strange Coincidences, and Theory Appraisal: A Multivariate Corroboration Index for Path Analytic Models. Hogarty, Kristine Y. 31 October 2003.
The empirical testing of theories is an important component of research in any field. Yet despite the long history of science, the extent to which theories are supported or contradicted by the results of empirical research remains ill defined. Quite commonly, support or contradiction is based solely on the "reject" or "fail to reject" decisions that result from tests of null hypotheses that are derived from aspects of theory. Decisions and recommendations based on this forced and often artificial dichotomy have been scrutinized in the past.
Such an overly simplified approach to theory testing has been vigorously challenged in recent years. Theories differ in the extent to which they provide precise predictions about observations. The precision of the predictions derived from a theory is proportional to the strength of support that may be provided by empirical evidence congruent with the prediction. However, the notion of precision linked to strength of support is surprisingly absent from many discussions of theory appraisal.
Meehl (1990a) has presented a logically sound index of corroboration to summarize the extent to which empirical tests of theories provide support or contradiction of theories. The purpose of this study was to evaluate the utility of this index of corroboration and its behavior when employing path analytic methods in the context of social science research.
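Meehl's index is commonly summarized as the product of a closeness term and an intolerance term, each scaled by the Spielraum (the a priori range of possible outcomes). The following sketch uses that univariate form as an assumption; it is not the multivariate extension evaluated in this dissertation:

```python
def corroboration_index(observed, interval, spielraum):
    """Univariate corroboration index in the spirit of Meehl (1990a):
    Ci = closeness * intolerance, where
      intolerance = 1 - (width of the predicted interval / Spielraum width)
      closeness   = 1 - (deviation of the observation from the interval / Spielraum width).
    The exact multivariate formulation studied in the dissertation differs."""
    lo, hi = interval
    s_lo, s_hi = spielraum
    spielraum_width = s_hi - s_lo
    intolerance = 1 - (hi - lo) / spielraum_width
    deviation = max(lo - observed, observed - hi, 0)   # 0 if the observation falls inside
    closeness = 1 - deviation / spielraum_width
    return closeness * intolerance

# A risky prediction (narrow interval) that succeeds earns high corroboration...
print(corroboration_index(observed=0.52, interval=(0.48, 0.55), spielraum=(-1, 1)))
# ...while a tolerant prediction earns little even when it succeeds.
print(corroboration_index(observed=0.52, interval=(-0.5, 0.9), spielraum=(-1, 1)))
```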
The performance of a multivariate extension of Meehl’s Corroboration Index (Ci) was evaluated using Monte Carlo methods. Correlational data were simulated to correspond to tests of theories via traditional path analysis. Five factors were included in the study: number of variables in the path model, level of intolerance of the theory, correspondence of the theory to the ‘true’ path model used for data generation, sample size and level of collinearity.
Results were evaluated in terms of the mean and standard error of the resulting multivariate Ci values. The level of intolerance was observed to be the strongest influence on mean Ci. Verisimilitude and model complexity were not observed to be strong determinants of the mean Ci. Sample size and collinearity evidenced small relationships with the mean value of Ci, but were more closely related to the sampling error.
Implications for theory and practice include alternatives and complements to tests of statistical significance, a shift from comparing findings against the null hypothesis to comparing alternative theories and models, and the inclusion of additional logical components besides the theory itself. Lastly, an alternative conceptualization of the multivariate corroboration index is advanced to guide future research efforts.
410
Classification of weeds in an image bank using convolutional neural networks. Marques Junior, Luiz Carlos. January 2019.
Advisor: José Alfredo Covolan Ulson
Committee: Adriano de Souza Marques; Fernando de Souza Campos
Abstract: Exotic invasive species, also known as weeds, compete with the planted crop for resources such as sunlight, water, and nutrients, imposing economic losses on the farmer. To minimize this problem, farmers currently use herbicides to eliminate and/or control weeds. The use of herbicides faces two problems: i) some weeds are resistant to herbicide application, and ii) excessive application can contaminate the planted crop, the water table, and water sources such as rivers and lakes. In this context, aiming at tools that minimize the use of herbicides, new approaches based on computer vision and artificial intelligence appear as promising solutions, adding new tools to precision agriculture. Among these solutions is deep learning, which uses convolutional neural networks to extract relevant features, mainly from images, allowing, for example, the identification and classification of weeds; this enables the farmer to opt either for mechanical removal of the weed or for localized application of herbicides in adequate quantities. Given this challenge of correctly classifying different weed species, especially plants resistant to commercial herbicides, the objective of this work was to apply and compare the performance of four convolutional neural network architectures for classifying five weed species contained in an image bank developed for this work. The training and classification of the species were... (complete abstract available via the electronic access link)
Degree: Mestre (Master's)
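As a generic illustration of a convolutional classifier for five weed classes, here is a small PyTorch sketch; the architecture, input size, and training setup are placeholders and are not among the four architectures compared in the dissertation:

```python
import torch
import torch.nn as nn

class SmallWeedCNN(nn.Module):
    """Toy convolutional classifier for 5 weed species; the architecture is illustrative only."""
    def __init__(self, n_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

model = SmallWeedCNN()
logits = model(torch.randn(8, 3, 128, 128))          # a batch of 8 RGB image crops
print(logits.shape)                                   # torch.Size([8, 5])
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 5, (8,)))
print(float(loss))
```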