811 |
Informovanost žáků druhého stupně základních škol v oblasti ochrany obyvatelstva v Jihočeském kraji / Knowledge of pupils of upper primary schools in the field of protection of population in South Bohemian region. SUKOVÁ, Denisa, January 2015
The life of each of us may bring unexpected situations, such as natural disasters, accidents involving the release of hazardous substances or extensive traffic accidents. Statutory regulations and organizational measures taken by the state serve to mitigate the consequences of such emergencies. One of the tasks of the state is to protect society, which also involves protection of the population. Citizens themselves can contribute to the mitigation of the consequences of emergencies. Therefore, it is important to educate citizens in this regard so that they are able to react adequately. As part of the training of the population it is important to inculcate in children from an early age the basic rules of protection and help. The issues of protecting people in emergencies are therefore part of the framework educational programmes for preschool, elementary as well as secondary education. It is these issues of protection of the population that the theoretical part of this thesis deals with. The introduction describes protection of the population from a historical perspective. It subsequently describes the current situation, not only in the Czech Republic but also briefly in the context of the European Union and the political-military organization NATO. The thesis also describes the system of education in the sphere of protection of the population at elementary schools. An integral component of the theoretical part is a chapter dealing with the integrated rescue system. The theoretical part is concluded with a chapter on statistical methods. The objective of the thesis was to find out the level of knowledge of pupils in the 6th and 9th grades in the sphere of protection of the population at selected schools in the South Bohemia region and, subsequently, to compare the knowledge of these pupils using the methods of descriptive and mathematical statistics. The following hypothesis was formulated for this thesis: 'Pupils in the 9th grade of elementary schools in the South Bohemia region have significantly better knowledge in the sphere of protection of the population than 6th-grade pupils.' To achieve the defined objectives and to test the hypothesis, it was necessary to create a questionnaire focusing on the issue and to carry out a survey. The research group consisted of 100 pupils from 6th-grade classes and 100 pupils from 9th-grade classes at eight elementary schools in the South Bohemia region. The questionnaire submitted to the pupils included 15 questions. The questions in which the pupils showed a lack of knowledge mainly included those focused on first aid. Evaluation of the questions shows that only 25 % of the pupils surveyed know the correct frequency of chest compressions during resuscitation of an adult and that 54 % of the pupils know how to behave when somebody is a victim of a high-voltage electric shock. Another problem area is the knowledge of warning signals. Only 51 % of the respondents know the 'General Warning' signal and know what to do in the event this signal is sounded; the 'Acoustic Siren Test' signal is known by only 21 % of them. In contrast, the issues in which pupils showed good knowledge include, for instance, emergency telephone numbers (85 % of correct answers), the integrated rescue system (87 % of correct answers) and evacuation (86 % of correct answers). Overall, the 6th-grade pupils answered the questions correctly in 52.9 % of cases, while the figure was 58 % for the 9th-grade pupils.
The stated hypothesis was tested and confirmed using the methods of descriptive and mathematical statistics. The processed data are presented in the 'Results' chapter and subsequently evaluated in the 'Discussion' chapter. The benefit of this thesis lies in the picture it provides of elementary school pupils' knowledge in the sphere of protection of the population. The outputs of the thesis have been provided to the participating schools.
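To illustrate the kind of comparison described above, the sketch below runs a two-proportion z-test on the reported overall correct-answer rates (52.9 % for the 6th grade, 58 % for the 9th grade). The answer counts of 1500 per grade (100 pupils × 15 questions) are an assumption made for the example; the thesis does not publish the raw counts and its own analysis may have used a different test.

```python
# Sketch of a two-proportion z-test comparing overall correct-answer rates,
# assuming (hypothetically) 100 pupils x 15 questions = 1500 answers per grade.
from math import sqrt
from scipy.stats import norm

n6, n9 = 1500, 1500            # assumed number of answers per grade
p6, p9 = 0.529, 0.580          # correct-answer rates reported in the abstract
x6, x9 = round(p6 * n6), round(p9 * n9)

p_pool = (x6 + x9) / (n6 + n9)                        # pooled proportion under H0
se = sqrt(p_pool * (1 - p_pool) * (1 / n6 + 1 / n9))  # standard error of the difference
z = (p9 - p6) / se
p_value = norm.sf(z)                                  # one-sided: 9th grade better than 6th

print(f"z = {z:.2f}, one-sided p-value = {p_value:.4f}")
```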
|
812 |
The Analysis of Big Data on Cities and Regions - Some Computational and Statistical Challenges. Schintler, Laurie A.; Fischer, Manfred M., 28 October 2018
Big Data on cities and regions bring new opportunities and challenges to data analysts and city planners. On the one hand, they hold great promise to combine increasingly detailed data for each citizen with data on critical infrastructures in order to plan, govern and manage cities and regions, improve their sustainability, optimize processes and maximize the provision of public and private services. On the other hand, the massive sample size and high dimensionality of Big Data and their geo-temporal character introduce unique computational and statistical challenges. This chapter provides an overview of the salient characteristics of Big Data and of how these features drive a paradigm change in data management and analysis, as well as in the computing environment. / Series: Working Papers in Regional Science
|
813 |
Prediction of Infectious Disease outbreaks based on limited information. Marmara, Vincent Anthony, January 2016
The last two decades have seen several large-scale epidemics of international impact, including human, animal and plant epidemics. Policy makers face health challenges that require epidemic predictions based on limited information. There is therefore a pressing need to construct models that allow us to frame all available information to predict an emerging outbreak and to control it in a timely manner. The aim of this thesis is to develop an early-warning modelling approach that can predict emerging disease outbreaks. Based on Bayesian techniques ideally suited to combine information from different sources into a single modelling and estimation framework, I developed a suite of approaches to epidemiological data that can deal with data from different sources and of varying quality. The SEIR model, particle filter algorithm and a number of influenza-related datasets were utilised to examine various models and methodologies to predict influenza outbreaks. The data included a combination of consultations and diagnosed influenza-like illness (ILI) cases for five influenza seasons. I showed that for the pandemic season, different proxies lead to similar behaviour of the effective reproduction number. For influenza datasets, there exists a strong relationship between consultations and diagnosed datasets, especially when considering time-dependent models. Individual parameters for different influenza seasons provided similar values, thereby offering an opportunity to utilise such information in future outbreaks. Moreover, my findings showed that when the temperature drops below 14°C, this triggers the first substantial rise in the number of ILI cases, highlighting that temperature data is an important signal to trigger the start of the influenza epidemic. Further probing was carried out among Maltese citizens and estimates on the under-reporting rate of the seasonal influenza were established. Based on these findings, a new epidemiological model and framework were developed, providing accurate real-time forecasts with a clear early warning signal to the influenza outbreak. This research utilised a combination of novel data sources to predict influenza outbreaks. Such information is beneficial for health authorities to plan health strategies and control epidemics.
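As a minimal illustration of the compartmental model named above, the sketch below integrates a basic SEIR system. The parameter values and initial conditions are illustrative assumptions, not the estimates obtained in the thesis, which additionally couples the model to data through a particle filter.

```python
# Minimal SEIR sketch; beta, sigma and gamma values are illustrative assumptions,
# not the parameters estimated in the thesis.
import numpy as np
from scipy.integrate import solve_ivp

def seir(t, y, beta, sigma, gamma):
    S, E, I, R = y
    N = S + E + I + R
    dS = -beta * S * I / N              # new infections leave S
    dE = beta * S * I / N - sigma * E   # exposed become infectious after 1/sigma days
    dI = sigma * E - gamma * I          # infectious recover after 1/gamma days
    dR = gamma * I
    return [dS, dE, dI, dR]

beta, sigma, gamma = 0.45, 1 / 2.0, 1 / 3.0   # assumed transmission, incubation, recovery rates
y0 = [99_990, 0, 10, 0]                       # assumed initial population split
sol = solve_ivp(seir, (0, 200), y0, args=(beta, sigma, gamma), dense_output=True)

t = np.linspace(0, 200, 201)
S, E, I, R = sol.sol(t)
print(f"Peak prevalence: {I.max():.0f} infectious on day {t[I.argmax()]:.0f}")
```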
|
814 |
Analyse probabiliste régionale des précipitations : prise en compte de la variabilité et du changement climatique / Regional frequency analysis of precipitation accounting for climate variability and change. Sun, Xun, 28 October 2013
Extreme precipitation and its consequences (floods) are among the most threatening natural disasters for human beings. In engineering design, Frequency Analysis (FA) techniques are an integral part of risk assessment and mitigation. FA uses statistical models to estimate the probability of extreme hydrological events, which provides information for designing hydraulic structures. However, standard FA methods commonly rely on the assumption that observations are identically distributed. There is now a substantial body of evidence that large-scale modes of climate variability (e.g. El Niño Southern Oscillation, ENSO; Indian Ocean Dipole, IOD) exert a significant influence on precipitation in various regions worldwide. Furthermore, climate change is likely to have an influence on hydrology, thus further challenging the "identically distributed" assumption; FA techniques therefore need to move beyond it. In order to provide a more accurate risk assessment, it is important to understand and predict the impact of climate variability and change on the severity and frequency of hydrological events, especially extremes. This thesis provides an important step towards this goal by developing a rigorous, general, climate-informed spatio-temporal regional frequency analysis (RFA) framework for incorporating the effects of climate variability on hydrological events. The framework brings together several components (in particular spatio-temporal regression models, copula-based modeling of spatial dependence, Bayesian inference and model comparison tools) to derive a general and flexible modeling platform. In this framework, data are assumed to follow a distribution whose parameters are linked to temporal and/or spatial covariates using regression models. Parameters are estimated with a Markov chain Monte Carlo (MCMC) method in a Bayesian framework, spatial dependence of the data is modelled with copulas, and model comparison tools are integrated. The development of this general modeling framework is complemented with various Monte Carlo experiments aimed at assessing its reliability, along with real-data case studies. Two case studies are performed to confirm the generality, flexibility and usefulness of the framework for understanding and predicting the impact of climate variability on hydrological events. These case studies are carried out at two distinct spatial scales: • Regional scale: summer rainfall in Southeast Queensland (Australia). This case study analyzes the impact of ENSO on the summer rainfall totals and summer rainfall maxima; a regional model highlights the asymmetric impact of ENSO: while La Niña episodes induce a significant increase in both the summer rainfall totals and maxima, the impact of El Niño episodes is found to be not significant. • Global scale: a new global dataset of extreme precipitation including 11588 rainfall stations worldwide is used to describe the impact of ENSO on extreme precipitation around the world. This is achieved by applying the regional modeling framework to 5×5 degree cells covering all continental areas. This analysis allows describing the pattern of ENSO impact at the global scale and quantifying its impact on extreme quantile estimates. Moreover, the asymmetry of the ENSO impact and its seasonal pattern are also evaluated.
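A minimal sketch of the central modelling idea, a distribution whose parameters are linked to a climate covariate, is given below: a GEV location parameter is made a linear function of an ENSO index and fitted by maximum likelihood. The GEV choice, the data and the index are assumptions made for the example; the full framework (regional model, copulas, Bayesian MCMC) is not reproduced here.

```python
# Sketch: maximum-likelihood fit of a GEV whose location parameter depends on an
# ENSO index, mu(t) = mu0 + mu1 * ENSO(t).  Data and index are synthetic.
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 60
enso = rng.normal(size=n)                                   # synthetic climate index
maxima = genextreme.rvs(c=-0.1, loc=50 + 8 * enso, scale=12, size=n, random_state=rng)

def neg_log_lik(params):
    mu0, mu1, log_scale, shape = params
    loc = mu0 + mu1 * enso
    # note: scipy's shape parameter c has the opposite sign to the usual GEV xi
    return -genextreme.logpdf(maxima, c=shape, loc=loc,
                              scale=np.exp(log_scale)).sum()

res = minimize(neg_log_lik, x0=[50.0, 0.0, np.log(10.0), -0.1], method="Nelder-Mead")
mu0, mu1, log_scale, shape = res.x
print(f"mu0={mu0:.1f}, ENSO effect mu1={mu1:.1f}, scale={np.exp(log_scale):.1f}, c={shape:.2f}")
```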
|
815 |
Recursos computacionais para auxiliar a análise da aptidão física relacionada à saúde de universitários / Computational resources to support the analysis of health-related physical fitness of university students. Sena, Rafael Veloso da, January 2013
Advisor: Carlos Norberto Fisher / Examination committee: Marcelo Tavella Navega; Alexandre Janotta Drigo / Abstract: Studying the characteristics related to physical fitness is an important step both for assessing individuals' health against accepted standards of normality and for identifying possible associations between these characteristics, which makes it possible to propose measures for improving physical fitness variables in order to maintain or improve people's health. This dissertation aimed to identify associations between body composition characteristics and performance in neuromotor and cardiorespiratory tests of university students. The collected data were analysed using computational resources and techniques, namely databases and data mining. Relevant associations were identified among the body composition characteristics themselves and between them and the test results; some associations showed high values of Confidence, the metric used in this dissertation. The results of the analyses made it possible to describe the profile of the analysed group. Based on this profile, a computational application was built that compares an individual's data with the profile variables and shows his or her situation with respect to that profile; the application also shows how that individual stands with respect to known reference tables. / Master's degree
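The Confidence metric mentioned above is the standard association-rule measure conf(A → B) = support(A ∪ B) / support(A). The sketch below computes it on invented, discretised records; the item labels are hypothetical and do not come from the dissertation.

```python
# Minimal sketch of association-rule Confidence on hypothetical discretised records,
# e.g. "high body fat" -> "low cardiorespiratory performance"; the data are invented.
records = [
    {"high_body_fat", "low_vo2"},
    {"high_body_fat", "low_vo2", "low_strength"},
    {"normal_body_fat", "normal_vo2"},
    {"high_body_fat", "normal_vo2"},
    {"normal_body_fat", "low_vo2"},
]

def support(itemset, data):
    return sum(itemset <= r for r in data) / len(data)

def confidence(antecedent, consequent, data):
    # conf(A -> B) = support(A u B) / support(A)
    return support(antecedent | consequent, data) / support(antecedent, data)

a, b = {"high_body_fat"}, {"low_vo2"}
print(f"support = {support(a | b, records):.2f}, confidence = {confidence(a, b, records):.2f}")
```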
|
816 |
Utilização de métodos estatísticos para avaliação da qualidade do leite cru refrigerado / Use of statistical methods to evaluate the quality of refrigerated raw milk. Roquette, Juliana Januzi, 25 April 2014
Product quality is critical to market acceptance. Monitoring milk quality goes beyond satisfying the consumer: a product of inadequate quality harms industrial processes and causes losses. Paying a bonus for product quality is a way of making producers aware of the importance of raw-material quality; bonuses are also used to reward products whose results are well above the minimum required quality standards, whereas deductions are used to warn farmers who do not understand the risk of producing a foodstuff without minimal control and monitoring. Several statistical analyses can be carried out to follow the quality of the milk received by dairies quickly, accurately and objectively; among them are the tools of statistical process control, such as capability indices and control charts for the mean and standard deviation. The objective of this work was to review the means of monitoring somatic cell count (SCC) and total bacterial count (TBC) as milk quality indices and to propose statistical forms of monitoring. It was possible to assess the farms accurately, demonstrating the applicability of, and the need for, statistical methods that provide precision, detail and speed in the analyses. / Master's degree in Veterinary Sciences
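As a sketch of the statistical process control tools cited above (control charts for the mean and capability indices), the example below computes 3-sigma x-bar limits and a one-sided capability index for hypothetical somatic cell count data; the values, subgroup size and specification limit are assumptions, not the standards discussed in the dissertation.

```python
# Sketch of Shewhart x-bar limits and a one-sided capability index for somatic cell
# count (SCC); the sample values and the specification limit are hypothetical.
import numpy as np

scc = np.array([310, 280, 350, 400, 295, 330, 370, 420, 360, 300],
               dtype=float)          # hypothetical monthly SCC, in 1000 cells/mL
usl = 500.0                          # assumed upper specification limit (1000 cells/mL)

mean, sd = scc.mean(), scc.std(ddof=1)
n = 4                                # assumed subgroup size for the x-bar chart
ucl = mean + 3 * sd / np.sqrt(n)     # 3-sigma control limits for subgroup means
lcl = mean - 3 * sd / np.sqrt(n)

cpk_upper = (usl - mean) / (3 * sd)  # one-sided capability: only an upper limit matters

print(f"x-bar chart: LCL={lcl:.0f}, centre={mean:.0f}, UCL={ucl:.0f}")
print(f"Cpk (upper) = {cpk_upper:.2f}")
```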
|
817 |
Recursos computacionais para auxiliar a análise da aptidão física relacionada à saúde de universitários / Computational resources to support the analysis of health-related physical fitness of university students. Sena, Rafael Veloso da [UNESP], 25 February 2013
Previous issue date: 2013-02-25 / Abstract: Studying the characteristics related to physical fitness is an important step both for assessing individuals' health against accepted standards of normality and for identifying possible associations between these characteristics, which makes it possible to propose measures for improving physical fitness variables in order to maintain or improve people's health. This dissertation aimed to identify associations between body composition characteristics and performance in neuromotor and cardiorespiratory tests of university students. The collected data were analysed using computational resources and techniques, namely databases and data mining. Relevant associations were identified among the body composition characteristics themselves and between them and the test results; some associations showed high values of Confidence, the metric used in this dissertation. The results of the analyses made it possible to describe the profile of the analysed group. Based on this profile, a computational application was built that compares an individual's data with the profile variables and shows his or her situation with respect to that profile; the application also shows how that individual stands with respect to known reference tables.
|
818 |
Determinação da biomassa de cana-de-açúcar considerando a variação espacial de dados espectrais do satélite Landsat 7 - ETM+ / Determination of sugarcane biomass considering ETM+/Landsat 7 spectral response variation. Machado, Hermogenes Moura, 28 February 2003
Advisors: Rubens Augusto Camargo Lamparelli, Jansle Vieira Rocha / Master's dissertation - Universidade Estadual de Campinas, Faculdade de Engenharia Agrícola
Previous issue date: 2003 / Abstract: Monitoring and estimating sugarcane biomass is of fundamental importance for planning crop management, harvest, transport, processing and marketing of the production. With technological advances, remote sensing techniques have proved useful for monitoring agricultural areas, and the Landsat system in particular has shown potential for providing information on the sugarcane crop. Among these techniques is the application of vegetation indices, obtained through linear transformations of the spectral values of the different sensor bands. The objective of this work was to evaluate the use of Landsat 7 ETM+ multispectral images and several vegetation indices for mapping the spatial variation of biomass, and their potential for yield estimation using a statistical model. The study was carried out in the municipalities of Araras and Leme (SP), at the São João Mill, over two years. Two image acquisitions per year were analysed: the first before the harvest (February to April) and the second a few days before the harvest. The raw digital images were converted to apparent reflectance values and, through atmospheric correction, to surface reflectance values. Samples were distributed over the study area according to the spatial variation of the spectral response; the samples were located, marked and harvested, and correlation analyses were performed between yield and spectral response. The atmospheric correction did not affect the correlation between yield data and spectral response. The best correlations were found with band 4 (near infrared) values and with the GVI vegetation index: in the first year of the study, band 4 and GVI explained 51.6% and 52.66% of the yield variation, respectively, while in the second year both explained 83% of the yield variation. A second-degree polynomial model best fitted the behaviour of the crop yield data. The yield estimated by the model showed a mean error of 4.04 t/ha and a standard deviation of 2.47 t/ha, confirming the efficiency of using the spectral response to estimate sugarcane yield. / Master's degree / Sustainable Rural Planning and Development / Master in Agricultural Engineering
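A minimal sketch of the second-degree polynomial yield model described above is given below; the vegetation index and yield values are synthetic stand-ins, not the study's measurements.

```python
# Sketch: fit a second-degree polynomial relating a vegetation index (e.g. GVI)
# to sugarcane yield (t/ha).  The index and yield values below are synthetic.
import numpy as np

gvi = np.array([0.18, 0.22, 0.25, 0.28, 0.31, 0.34, 0.37, 0.40, 0.43, 0.46])
yield_tha = np.array([62, 68, 74, 79, 85, 90, 94, 97, 99, 100], dtype=float)

coeffs = np.polyfit(gvi, yield_tha, deg=2)   # a, b, c of y = a*x^2 + b*x + c
model = np.poly1d(coeffs)

residuals = yield_tha - model(gvi)
print("coefficients (a, b, c):", np.round(coeffs, 1))
print(f"mean absolute error = {np.abs(residuals).mean():.2f} t/ha")
```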
|
819 |
Reconstrução de chuveiros atmosféricos extensos detectados pelo Observatório Pierre Auger utilizando métodos robustos / Reconstruction of extensive air showers seen by the Pierre Auger Observatory using robust methods. Peixoto, Carlos Jose Todero, 28 August 2008
Advisor: Carlos Ourivio Escobar / Doctoral thesis - Universidade Estadual de Campinas, Instituto de Física Gleb Wataghin
Previous issue date: 2008 / Abstract: Since the first ultra-high-energy cosmic rays detected by Pierre Auger in the 1930s, the cosmic-ray physics community has been searching for mathematical techniques and statistical methods better suited to analysing these events. Such analysis procedures are essential for estimating the energy of the primary particle and for computing the arrival angle θ. The estimation of this energy and of the angle θ is the end point of a whole chain of work and the starting point of a new line of research: the search for the possible sources that produced such events.
Throughout this work we revisited the reconstruction of the "Auger showers", the so-called Extensive Air Showers (EAS), using a relatively new branch of statistics known today as Robust Statistics.
The Least Squares (LS) method, presented by Gauss and Legendre, had limitations that they themselves recognised and tried, without success, to resolve. From the end of the eighteenth century and the beginning of the nineteenth century, strictly parametric methods, especially the Least Squares method and the arithmetic mean, were questioned when used to describe poorly behaved distributions or distributions with large fluctuations. Some of the main questions concerned how to treat points far from the main distribution (the so-called outliers) and how these points influence the distribution itself. The most common conventional way out was to reject the outliers and the points showing large deviations from the mean, but the loss of information about the distribution then became inevitable. The parametric model proved to be only an approximation of reality, since the fluctuations, although taken into account, are not "welcome": they are seen merely as an error inherent in the observation. At the end of the nineteenth century the first attempts appeared to extract information from the fluctuations, classifying them and treating them as an integral part of the description of the distribution.
If a statistical method is able to describe the observed data, including and classifying the inherent fluctuations, it becomes known as a "Robust Method" or "Robust Statistics", where "Robust" refers to the ability of the method or model to "resist" the fluctuations, providing a description of reality that is reasonably independent of those same fluctuations.
Based on two robust methods, Least Median of Squares (LMS) and Least Trimmed Squares (LTS), we fitted the Lateral Distribution Function (LDF) of the showers and extracted the value of S1000, the parameter needed to estimate the energy of the primary particle. The values of S1000 calculated with conventional statistics (Least Squares) and with robust statistics (LMS and LTS) are compared.
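As an illustration of the trimming idea, the sketch below performs a Least Trimmed Squares fit of a deliberately simplified lateral distribution function S(r) = S1000 · (r/1000 m)^(−β); the LDF form, the station signals and the trimming fraction are assumptions made for the example and do not reproduce the actual Auger reconstruction.

```python
# Sketch of a Least Trimmed Squares (LTS) fit of a simplified lateral distribution
# function S(r) = S1000 * (r / 1000 m)^(-beta); LDF form, station signals and the
# trimming fraction are illustrative assumptions, not the Auger reconstruction.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
r = np.array([600., 800., 1000., 1200., 1500., 1800., 2200.])   # core distances (m)
signal = 50.0 * (r / 1000.0) ** -2.0 * rng.lognormal(0.0, 0.15, r.size)
signal[3] *= 5.0                                                # inject one outlier station

def lts_objective(params, frac=0.75):
    log_s1000, beta = params
    model = np.exp(log_s1000) * (r / 1000.0) ** (-beta)
    sq_res = np.sort((np.log(signal) - np.log(model)) ** 2)     # work in log-signal space
    h = int(np.ceil(frac * sq_res.size))                        # keep the h smallest residuals
    return sq_res[:h].sum()

res = minimize(lts_objective, x0=[np.log(40.0), 1.5], method="Nelder-Mead")
s1000, beta = np.exp(res.x[0]), res.x[1]
print(f"LTS estimate: S1000 = {s1000:.1f}, beta = {beta:.2f}")
```

The trimming step is what gives the estimator its robustness: once the outlier station's residual ranks among the largest, it is simply excluded from the objective.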
For showers of the same energy, the value of S1000 depends on the zenith angle θ of the primary, since the EAS is attenuated in the atmosphere, and the larger θ, the larger the attenuation. To take this attenuation into account in the calculation of the energy spectrum, in which all arrival angles up to 60 degrees are considered, the parameter S38 is introduced, 38 degrees being the median zenith angle of the Auger data. The attenuation is calculated with the Constant Intensity Cut (CIC) method, which depends on the validity of several hypotheses; the three hypotheses assumed by the Auger Collaboration are presented in this work.
We then correlate all the new values of S38 with the values of the so-called "Hybrid Energy", obtained directly from the analysis software of the Auger Collaboration. This correlation allows us to re-correct the energy on the basis of hybrid detection and to establish the energy scale, i.e. the calibration of the surface detector, from the calorimetric determination of the energy by the fluorescence detector, which is the great advance that the Pierre Auger Observatory brings to the field.
With the new energy results we redid the minimisation calculations for the correlation of cosmic rays with extragalactic sources, obtaining correlations that are not in one-to-one correspondence with those obtained by the conventional method of analysis.
Finally, we analyse the outlier surface stations themselves, trying to extract information related to the performance of the surface detector.
The appendices included after the conclusions are provided only for didactic purposes, as a quick reference for readers unfamiliar with cosmic-ray detection methods. / Doctorate / Specific Theories and Interaction Models; Particle Systematics; Cosmic Rays / Doctor of Science
|
820 |
Otimização e análise do desempenho de sistemas frigoríficos utilizando o método de superfície de resposta, o planejamento de experimentos e ensaios de protótipos / Optimization and analysis of the performance of refrigeration systems using response surface methodology, experimental design and prototype experiments. Sidnei José de Oliveira, 20 June 2001
Response surface methodology and design of experiments were applied to the analysis and optimization of refrigeration systems. The dimensions of the capillary tube and the refrigerant charge that provided the best operating conditions for a prototype were determined. The behaviour of eight response variables was studied in detail: refrigeration capacity, coefficient of performance, discharge temperature, superheating, subcooling, refrigerant mass flow rate, evaporation temperature and condensation temperature. Response surfaces and contour plots were constructed for several situations of interest in order to reveal the behaviour and sensitivity of the system. Some factor levels produced a small variability in certain response variables, illustrating the concept of a robust system. The method proved quite adequate, contributing valuable results to the optimization and analysis of the behaviour of refrigeration systems, and its applicability can readily be extended to thermal systems in general.
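As a sketch of the response-surface approach described above, the example below fits a full second-order model for the coefficient of performance as a function of two coded factors (capillary-tube length and refrigerant charge) and locates the stationary point of the fitted surface; the design points and responses are hypothetical, not the prototype measurements reported in the thesis.

```python
# Sketch: full second-order response surface for the coefficient of performance (COP)
# as a function of two coded factors, capillary-tube length (x1) and refrigerant
# charge (x2).  Design points and responses are hypothetical, not the thesis data.
import numpy as np

x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1.41, 1.41, 0, 0], dtype=float)
x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, 0, -1.41, 1.41], dtype=float)
cop = np.array([2.9, 3.1, 3.0, 3.3, 3.6, 3.5, 3.6, 2.8, 3.2, 3.0, 3.2])

# model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, cop, rcond=None)

# stationary point of the fitted quadratic surface (candidate optimum, coded units)
b = beta[1:3]
B = np.array([[beta[4], beta[3] / 2],
              [beta[3] / 2, beta[5]]])
x_stat = -0.5 * np.linalg.solve(B, b)

print("quadratic coefficients:", np.round(beta, 3))
print("stationary point (coded units):", np.round(x_stat, 2))
```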
|