41

Statistický modul k pracovní databázi TIVOLE / Statistical module for the working database TIVOLE

Ježek, Jan January 2008 (has links)
The presented thesis was written for IBM, specifically for the IBM IDC Czech Republic, s.r.o. branch. It describes the development of a statistical module for the working database TIVOLE, built in the Lotus Notes environment. The basic aim of the thesis is to ease the creation of reports and graphical data outputs, speed up operators' work, and reduce the amount of human-resource time required to solve problems.
42

Severe Weather during the North American Monsoon and Its Response to Rapid Urbanization and a Changing Global Climate within the Context of High Resolution Regional Atmospheric Modeling

Luong, Thang Manh January 2015 (has links)
The North American monsoon (NAM) is the principal driver of summer severe weather in the Southwest U.S. With sufficient atmospheric instability and moisture, monsoon convection initiates during daytime in the mountains and may later organize, principally into mesoscale convective systems (MCSs). Most monsoon-related severe weather occurs in association with organized convection, including microbursts, dust storms, flash flooding and lightning. The overarching theme of this dissertation research is to investigate the simulation of monsoon severe weather due to organized convection through the use of regional atmospheric modeling. A commonly used cumulus parameterization scheme has been modified to better account for dynamic pressure effects, resulting in an improved representation of a simulated MCS during the North American Monsoon Experiment and of the climatology of warm-season precipitation in a long-term regional climate model simulation. The effect of urbanization on organized convection occurring in Phoenix is evaluated in model sensitivity experiments using an urban canopy model (UCM) and urban land cover, compared to pre-settlement natural desert land cover. The presence of vegetation and irrigation makes Phoenix a "heat sink" in comparison to its surrounding desert, and as a result the modeled precipitation response to urbanization decreases within the Phoenix urban area and increases on its periphery. Finally, an analysis of how monsoon severe weather is changing in association with observed global climate change is considered within the context of a series of retrospectively simulated severe weather events during the period 1948-2010 in a numerical weather prediction paradigm. The individual severe weather events are identified by favorable thermodynamic conditions of instability and atmospheric moisture (precipitable water). Changes in precipitation extremes are evaluated with extreme value statistics. During the last several decades, organized convective precipitation has intensified, but these events occur with less frequency. A more favorable thermodynamic environment for monsoon thunderstorms is the driver of these changes, which is consistent with the broader notion that anthropogenic climate change is presently intensifying weather extremes worldwide.
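To make the extreme value step above concrete, here is a minimal sketch (not from the dissertation; the data are synthetic placeholders) of fitting a generalized extreme value (GEV) distribution to annual maximum precipitation and comparing return levels between two sub-periods.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic annual maximum precipitation (mm) for 1948-2010 -- placeholder data,
# with a slight upward trend to mimic intensification of extremes.
years = np.arange(1948, 2011)
annual_max = stats.genextreme.rvs(c=-0.1, loc=25, scale=8, size=years.size, random_state=rng)
annual_max += 0.05 * (years - years[0])

def gev_return_level(block_maxima, return_period=50):
    """Fit a GEV by maximum likelihood and report the T-year return level."""
    shape, loc, scale = stats.genextreme.fit(block_maxima)
    return stats.genextreme.ppf(1 - 1 / return_period, shape, loc=loc, scale=scale)

early = annual_max[years < 1980]
late = annual_max[years >= 1980]
print(f"50-yr return level 1948-1979: {gev_return_level(early):.1f} mm")
print(f"50-yr return level 1980-2010: {gev_return_level(late):.1f} mm")
```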
43

EMPIRICAL PROCESSES FOR ESTIMATED PROJECTIONS OF MULTIVARIATE NORMAL VECTORS WITH APPLICATIONS TO E.D.F. AND CORRELATION TYPE GOODNESS OF FIT TESTS

Saunders, Christopher Paul 01 January 2006 (has links)
Goodness-of-fit and correlation tests are considered for the dependent univariate data that arise when multivariate data are projected to the real line with a data-suggested linear transformation. Specifically, tests for multivariate normality are investigated. Let {Y_i} be a sequence of independent k-variate normal random vectors, and let d_0 be a fixed linear transform from R^k to R. For a sequence of linear transforms {d(Y_1, ..., Y_n)} converging almost surely to d_0, the weak convergence of the empirical process of the standardized projections from d to a tight Gaussian process is established. This tight Gaussian process is identical to the one that arises in the univariate case where the mean and standard deviation are estimated by the sample mean and sample standard deviation (Wood, 1975). The tight Gaussian process determines the limiting null distribution of E.D.F. goodness-of-fit statistics applied to the process of the projections. A class of tests for multivariate normality based on the Shapiro-Wilk statistic and the related correlation statistics, applied to the dependent univariate data that arise with a data-suggested linear transformation, is also considered. The asymptotic properties of these statistics are established. In both cases, the statistics based on random linear transformations are shown to be asymptotically equivalent to the statistics using the fixed linear transformation. The statistics based on the fixed linear transformation have the same critical points as the corresponding tests of univariate normality; this allows an easy implementation of these tests for multivariate normality. Of particular interest are two classes of transforms that have been previously considered for testing multivariate normality and are special cases of the projections considered here. The first transformation, originally considered by Wood (1981), is based on a symmetric decomposition of the inverse sample covariance matrix. The asymptotic properties of these transformed empirical processes were fully developed using classical results. The second class of transforms comprises the principal components that arise in principal component analysis. Peterson and Stromberg (1998) suggested using these transforms with the univariate Shapiro-Wilk statistic. Using these suggested projections, the limiting distributions of the E.D.F. goodness-of-fit and correlation statistics are developed.
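A rough sketch (not the author's code) of the second class of projections described above: project the sample onto estimated principal components and apply the univariate Shapiro-Wilk statistic to each set of projections. The data and the use of nominal p-values are illustrative assumptions; the correct limiting distributions for such data-suggested projections are exactly what the thesis establishes.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic k-variate sample (placeholder for real data).
n, k = 200, 3
Y = rng.multivariate_normal(mean=np.zeros(k), cov=np.eye(k) + 0.3, size=n)

# Data-suggested linear transforms: eigenvectors of the sample covariance
# (the principal-component projections discussed in the abstract).
Y_centered = Y - Y.mean(axis=0)
cov = np.cov(Y_centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

for j in range(k):
    projections = Y_centered @ eigvecs[:, j]      # dependent univariate data
    W, p = stats.shapiro(projections)             # univariate Shapiro-Wilk statistic
    print(f"component {j + 1}: W = {W:.4f}, nominal p = {p:.3f}")
```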
44

Dammsäkerhetsutvärdering samt utformning av dammregister och felrapporteringssystem för svenska gruvdammar / Dam Safety Evaluation and Development of a Database for Swedish Tailings Dams

Isaksson, Nils, Lundström, Helena January 2005 (has links)
A lot of mine waste rock and tailings arise from all mining processes and have to be stored in an appropriate way. Tailings are deposited in impoundments retained by tailings dams. The objective of tailings dams is to retain the slurry from the mining process and in that way prevent spill into the surroundings that might be harmful for the environment. Tailings dams are often constructed as staged embankments so that construction costs and demand of materials are spread more evenly over the period of deposition.

The objective of this thesis has been to compile information about and evaluate events at Swedish tailings dams, and also to develop a collective database for all Swedish mining companies covering all tailings dams and all events that occur at tailings dams.

Information about 60 events at Swedish tailings dams has been gathered and evaluated. The evaluation has been performed by comparison between and analysis of individual parameters and also by use of a multivariate statistical method called PLS. The statistical analysis shows a decrease in the number of events during the last five years, which indicates improved dam safety within the mining industry. The analysis also shows that severe events and the human factor might be related when it comes to the initiating cause of the event. Further relations between the parameters and the severity of the events can be seen from the PLS analysis, for example that low and short tailings dams are to a greater extent subjected to severe events. To be able to draw more reliable conclusions, further studies with more complete basic data are needed.

This work has shown a need for a collective database within the Swedish mining industry for tailings dams and for events occurring at tailings dams, so that more complete basic data can be obtained for future studies. A structure for such a database has been developed in Microsoft Access 2000. The aim of the database is to facilitate feedback within the mining industry and to gather comprehensive data for future statistical evaluations.

/ Vid alla gruvprocesser skapas stora mängder restprodukter i form av gråberg och anrikningssand som måste tas om hand på lämpligt sätt. Anrikningssanden deponeras tillsammans med vatten från gruvprocessen i magasin omgärdade av dammvallar, s.k. gruvdammar. Gruvdammar har som syfte att hålla kvar anrikningssand och vatten och måste vara stabila så att de skyddar omgivningen från utsläpp av anrikningssand som skulle kunna vara skadligt för miljön. En gruvdamm byggs ofta upp i etapper eftersom byggkostnaderna och behovet av dammfyllnadsmaterial då sprids över tiden.

Syftet med arbetet har varit att sammanställa och utvärdera händelser vid svenska gruvdammar samt att utforma ett för gruvindustrin gemensamt dammregister och felrapporteringssystem.

60 händelser vid svenska gruvdammar har sammanställts och utvärderats. Utvärderingen har genomförts dels genom att enskilda parametrar jämförts och analyserats och dels med hjälp av den multivariata analysmetoden PLS. Den statistiska analysen visar på en minskning i antal händelser under de senaste fem åren, vilket tyder på ett förbättrat dammsäkerhetsarbete inom gruvindustrin. Analysen har kunnat uppvisa ett samband mellan allvarliga händelser och den mänskliga faktorn när det gäller vad det är som initierat händelserna. Genom PLS-analysen har ytterligare samband mellan de undersökta parametrarna och allvarlighetsgraden av händelserna kunnat utläsas, bl.a. visar analysen att låga och korta dammar i större utsträckning drabbas av allvarliga händelser jämfört med höga och långa dammar. För att säkra slutsatser ska kunna dras krävs dock vidare studier med ett mer komplett statistiskt underlag.

Examensarbetet har påvisat ett behov av ett branchgemensamt damm- och felrapporteringsregister för att ett mer komplett underlag ska kunna erhållas i framtiden. En färdig databasstruktur för ett sådant dammregister och felrapporteringsregister för svenska gruvdammar har utformats. Databasen är uppbyggd i Microsoft Access 2000 och är tänkt att underlätta erfarenhetsåterföring inom branschen samt att ge ett underlag för framtida statistiska undersökningar.
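As a schematic illustration of the PLS step mentioned above (not the authors' actual analysis; the variables and data below are invented), partial least squares regression can relate dam and event parameters to an event-severity score:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)

# Invented stand-ins for the dam/event parameters in the study:
# dam height (m), dam length (m), dam age (years), human-factor flag (0/1).
n_events = 60
X = np.column_stack([
    rng.uniform(5, 40, n_events),     # height
    rng.uniform(50, 2000, n_events),  # length
    rng.uniform(1, 50, n_events),     # age
    rng.integers(0, 2, n_events),     # human factor involved
])
# Invented severity score (higher = more severe incident).
y = 3 - 0.05 * X[:, 0] + 0.8 * X[:, 3] + rng.normal(0, 0.5, n_events)

pls = PLSRegression(n_components=2)
pls.fit(X, y)
print("X loadings (how parameters load on the latent components):")
print(np.round(pls.x_loadings_, 2))
print("R^2 on the training data:", round(pls.score(X, y), 2))
```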
45

Hodnocení a porovnání úspěšnosti herní činnosti libera se smečaři na ME ve volejbale mužů 2012 / Evaluation and comparison of the success in gaming activities of libero with hitters in the men's volleyball European Championship 2011

Buchalová, Karolína January 2012 (has links)
Title: Evaluation and comparison of the success in gaming activities of libero with hitters in the men's volleyball European Championship 2011. Objectives: The main objective of this thesis is to compare the libero and the hitters. The thesis focuses on reception of the opponent's serve and on field play. We determine the success rate of the libero and compare it with the results of the hitters in the same categories at the 2011 European Championship in men's volleyball. Methods: The thesis uses the method of indirect observation from video recordings. The success of reception and field play was recorded using a tally method, and a five-point scale was used to classify its quality. Results: The results confirmed all the research questions, so we are convinced of the quality and indispensability of the libero in volleyball. Keywords: volleyball, individual game activity, libero, European Championship, statistics
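A toy illustration (invented numbers, not the thesis data) of how receptions graded on a five-point scale can be turned into success rates per player role for this kind of comparison:

```python
import pandas as pd

# Invented observations: each row is one graded reception (scale 1-5, 5 = perfect).
obs = pd.DataFrame({
    "role":  ["libero", "libero", "hitter", "hitter", "libero", "hitter", "libero", "hitter"],
    "grade": [5, 4, 3, 5, 2, 1, 4, 3],
})

# Treat grades of 4 or 5 as a "successful" reception (an arbitrary threshold here).
obs["success"] = obs["grade"] >= 4

summary = obs.groupby("role")["success"].agg(["mean", "count"])
summary = summary.rename(columns={"mean": "success_rate", "count": "n_receptions"})
print(summary)
```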
46

Conception, développement et analyse de systèmes de fonction booléennes décrivant les algorithmes de chiffrement et de déchiffrement de l'Advanced Encryption Standard / Design, development and analysis of Boolean function systems describing the encryption and decryption algorithms of the Advanced Encryption Standard

Dubois, Michel 24 July 2017 (has links)
La cryptologie est une des disciplines des mathématiques, elle est composée de deux sous-ensembles: la cryptographie et la cryptanalyse. Tandis que la cryptographie s'intéresse aux algorithmes permettant de modifier une information afin de la rendre inintelligible sans la connaissance d'un secret, la seconde s'intéresse aux méthodes mathématiques permettant de recouvrer l'information originale à partir de la seule connaissance de l'élément chiffré. La cryptographie se subdivise elle-même en deux sous-ensembles: la cryptographie symétrique et la cryptographie asymétrique. La première utilise une clef identique pour les opérations de chiffrement et de déchiffrement, tandis que la deuxième utilise une clef pour le chiffrement et une autre clef, différente de la précédente, pour le déchiffrement. Enfin, la cryptographie symétrique travaille soit sur des blocs d'information soit sur des flux continus d'information. Ce sont les algorithmes de chiffrement par blocs qui nous intéressent ici. L'objectif de la cryptanalyse est de retrouver l'information initiale sans connaissance de la clef de chiffrement et ceci dans un temps plus court que l'attaque par force brute. Il existe de nombreuses méthodes de cryptanalyse comme la cryptanalyse fréquentielle, la cryptanalyse différentielle, la cryptanalyse intégrale, la cryptanalyse linéaire... Beaucoup de ces méthodes sont maintenues en échec par les algorithmes de chiffrement modernes. En effet, dans un jeu de la lance et du bouclier, les cryptographes développent des algorithmes de chiffrement de plus en plus efficaces pour protéger l'information chiffrée d'une attaque par cryptanalyse. C'est le cas notamment de l'Advanced Encryption Standard (AES). Cet algorithme de chiffrement par blocs a été conçu par Joan Daemen et Vincent Rijmen et transformé en standard par le National Institute of Standards and Technology (NIST) en 2001. Afin de contrer les méthodes de cryptanalyse usuelles les concepteurs de l'AES lui ont donné une forte structure algébrique. Ce choix élimine brillamment toute possibilité d'attaque statistique, cependant, de récents travaux tendent à montrer, que ce qui est censé faire la robustesse de l'AES, pourrait se révéler être son point faible. En effet, selon ces études, cryptanalyser l'AES se ``résume'' à résoudre un système d'équations quadratiques symbolisant la structure du chiffrement de l'AES. Malheureusement, la taille du système d'équations obtenu et le manque d'algorithmes de résolution efficaces font qu'il est impossible, à l'heure actuelle, de résoudre de tels systèmes dans un temps raisonnable. L'enjeu de cette thèse est, à partir de la structure algébrique de l'AES, de décrire son algorithme de chiffrement et de déchiffrement sous la forme d'un nouveau système d'équations booléennes. Puis, en s'appuyant sur une représentation spécifique de ces équations, d'en réaliser une analyse combinatoire afin d'y détecter d'éventuels biais statistiques. / Cryptology is one of the fields of mathematics; it is composed of two subsets: cryptography and cryptanalysis. While cryptography is concerned with algorithms that modify information so as to make it unintelligible without knowledge of a secret, cryptanalysis is concerned with the mathematical methods that recover the original information from the knowledge of the encrypted element alone. Cryptography itself is subdivided into two subsets: symmetric cryptography and asymmetric cryptography. The first uses the same key for the encryption and decryption operations, while the second uses one key for encryption and another key, different from the previous one, for decryption. Finally, symmetric cryptography works either on blocks of information or on continuous flows of information. It is the block cipher algorithms that interest us here. The aim of cryptanalysis is to recover the original information without knowing the encryption key, and to do so in a shorter time than a brute-force attack. There are many methods of cryptanalysis, such as frequency cryptanalysis, differential cryptanalysis, integral cryptanalysis, linear cryptanalysis... Many of these methods are defeated by modern encryption algorithms. Indeed, in a game of spear and shield, cryptographers develop ever more effective encryption algorithms to protect the encrypted information from attack by cryptanalysis. This is notably the case of the Advanced Encryption Standard (AES). This block cipher algorithm was designed by Joan Daemen and Vincent Rijmen and turned into a standard by the National Institute of Standards and Technology (NIST) in 2001. To counter the usual methods of cryptanalysis, the designers of the AES gave it a strong algebraic structure. This choice brilliantly eliminates any possibility of statistical attack; however, recent work suggests that what is supposed to be the strength of the AES could prove to be its weak point. According to these studies, cryptanalysing the AES ``comes down'' to solving a system of quadratic equations symbolizing the structure of the AES encryption. Unfortunately, the size of the system of equations obtained and the lack of efficient solving algorithms make it impossible, at this time, to solve such systems in a reasonable time. The challenge of this thesis is, starting from the algebraic structure of the AES, to describe its encryption and decryption algorithms in the form of a new system of Boolean equations. Then, relying on a specific representation of these equations, to carry out a combinatorial analysis of them in order to detect possible statistical biases.
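To give a flavour of what describing a cipher component as a system of Boolean functions means (an illustrative toy, not the AES system constructed in the thesis), the sketch below derives the algebraic normal form of each output bit of a small 3-bit S-box via the Möbius transform:

```python
# A toy 3-bit S-box (arbitrary permutation, NOT an AES component).
SBOX = [3, 6, 1, 7, 0, 5, 2, 4]
N = 3  # input/output width in bits

def output_bit_truth_table(bit):
    """Truth table of one output coordinate of the S-box as a list of 0/1."""
    return [(SBOX[x] >> bit) & 1 for x in range(2 ** N)]

def moebius_transform(tt):
    """Convert a truth table into algebraic normal form (ANF) coefficients."""
    anf = list(tt)
    for i in range(N):
        for x in range(2 ** N):
            if x & (1 << i):
                anf[x] ^= anf[x ^ (1 << i)]
    return anf

def anf_to_string(anf):
    """Pretty-print ANF coefficients as a Boolean polynomial over GF(2)."""
    terms = []
    for mask, coeff in enumerate(anf):
        if coeff:
            term = "*".join(f"x{i}" for i in range(N) if mask & (1 << i)) or "1"
            terms.append(term)
    return " + ".join(terms) if terms else "0"

for bit in range(N):
    anf = moebius_transform(output_bit_truth_table(bit))
    print(f"y{bit} = {anf_to_string(anf)}")
```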
47

Rodová diverzita rozsivek: vztah ke genetické variabilitě v rámci rodu Frustulia a význam geografie / The diversity of diatom genera: relationship to genetic variability within the genus Frustulia and the role of geography

Vrbová, Kateřina January 2018 (has links)
The occurrence of some diatoms depends on the degree of pollution and on water quality. Because of this attribute, diatoms are used as indicators in environmental bioassessment. Their full use for this purpose is, however, complicated by the high number of species, which are defined by ultrastructural morphological features that are indistinguishable without an electron microscope. The aims of this study were to determine the influence of environmental factors, habitat type and geography on the structure of diatom communities, and to find out whether the richness of higher taxonomic levels is correlated with species richness, in this case whether it corresponds to the genetic diversity within the diatom species complex Frustulia crassinervia-saxonica. In this study, 49 permanent slides from natural samples were analyzed. Samples were taken from the benthos of different types of freshwater habitat (lakes, dams, pools, peat bogs, streams, wet walls) at diverse localities in Europe, Canada, Greenland, Chile and New Zealand. In each slide, 300 cells were counted and determined to genus level based on morphological features. Altogether 43 benthic genera were identified. The results of this thesis showed that the number of genera correlated with the pH gradient but did not correlate with other environmental factors -...
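A minimal sketch (with made-up numbers, not the thesis data) of the kind of relationship tested above, correlating genus richness per sample with pH:

```python
import numpy as np
from scipy import stats

# Made-up per-sample values standing in for the 49 slides in the study.
pH             = np.array([4.2, 4.8, 5.1, 5.6, 6.0, 6.3, 6.9, 7.2, 7.8, 8.1])
genus_richness = np.array([5,   6,   8,   9,   11,  10,  14,  15,  17,  16])

# A rank-based test is a reasonable default here because richness counts
# need not be linearly related to pH.
rho, p_value = stats.spearmanr(pH, genus_richness)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
```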
48

O Biplot na análise fatorial multivariada / The Biplot in multivariate factor analysis

Klefens, Paula Cristina de Oliveira 11 January 2010 (has links)
A análise multivariada é um conjunto de técnicas que são adequadas para situações onde várias variáveis correlacionadas estão envolvidas. Dentre essas técnicas temos as componentes principais e a análise fatorial. A técnica dos componentes principais reduz a dimensão de uma matriz de dados originais através de combinações lineares, facilitando a interpretação desses dados, e a análise fatorial é o nome dado a uma classe de métodos estatísticos paramétricos (e não paramétricos) multivariados que correspondem a um grande número de métodos e técnicas que utilizam simultaneamente todas as variáveis do conjunto na interpretação do inter-relacionamento das variáveis observadas (COSTA, 2006). O objetivo da análise fatorial é descrever as relações de covariância entre algumas variáveis em algum termo subjacente, mas não observável, de quantidades aleatórias chamadas fatores (JOHNSON e WICHERN, 1998). Biplot é um gráfico estático, desenvolvido por Gabriel (1971), que representa no mesmo gráfico as variáveis e as observações com o intuito de demonstrar graficamente as relações existentes entre variáveis, entre observações e entre variáveis e observações. O presente trabalho tem como objetivo inserir a metodologia de análise biplot tridimensional na técnica de análise fatorial multivariada. Foi usado o software SAS para a realização da análise fatorial e a construção do gráfico biplot, e um conjunto de dados para a aplicação do mesmo. O estudo mostra que o gráfico Biplot é um método de análise multivariada de suma importância quando inserido na análise fatorial, facilitando e complementando a interpretação dos resultados. / Multivariate analysis is a set of techniques that are appropriate for situations where several correlated variables are involved. Among these techniques are principal components and factor analysis. The technique of principal components reduces the dimension of an original data matrix through linear combinations, facilitating the interpretation of the data, while factor analysis is the name given to a class of parametric (and non-parametric) multivariate statistical methods comprising a large number of methods and techniques that use all the variables of the set simultaneously in interpreting the interrelationship of the observed variables (COSTA, 2006). The goal of factor analysis is to describe the covariance relationships among variables in terms of underlying, but unobservable, random quantities called factors (JOHNSON and WICHERN, 1998). The biplot is a statistical graph, developed by Gabriel (1971), which represents the variables and the observations in the same plot in order to demonstrate graphically the relationships between variables, between observations, and between variables and observations. This work aims to introduce the three-dimensional biplot analysis methodology into the technique of multivariate factor analysis. SAS software was used to perform the factor analysis and to construct the biplot graph, and a data set was used for the application. The study shows that the biplot is a multivariate analysis method of great value when incorporated into factor analysis, facilitating and complementing the interpretation of the results.
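A small sketch (not the SAS code used in the thesis) of the basic biplot idea: overlaying observation scores and variable loadings from a principal component decomposition in a single plot. The data are placeholders.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)

# Placeholder data: 30 observations on 4 correlated variables.
X = rng.normal(size=(30, 4))
X[:, 1] += 0.8 * X[:, 0]
X[:, 3] -= 0.5 * X[:, 2]
labels = ["var1", "var2", "var3", "var4"]

# Centre and decompose (SVD form of PCA, as in Gabriel's construction).
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U[:, :2] * s[:2]                           # observation coordinates
loadings = Vt[:2].T * s[:2] / np.sqrt(len(X) - 1)   # variable coordinates

fig, ax = plt.subplots()
ax.scatter(scores[:, 0], scores[:, 1], s=15, alpha=0.6)
for name, (lx, ly) in zip(labels, loadings):
    ax.arrow(0, 0, lx, ly, head_width=0.05, color="red")
    ax.text(lx * 1.1, ly * 1.1, name, color="red")
ax.set_xlabel("Component 1")
ax.set_ylabel("Component 2")
plt.show()
```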
49

Construção de mapas genéticos em espécies de polinização aberta: uma abordagem Bayesiana com o uso de uma priori informativa. / Construction of genetic maps in outbreeding species: a Bayesian approach with the use of an informative prior.

Ragonha, Francine 03 March 2005 (has links)
A construção dos mapas Genéticos é importante para o melhoramento genético de plantas, pois são através desses mapas que pode se determinar em que pontos dos cromossomos as unidades hereditárias podem estar. Com o objetivo de verificar se o método Bayesiano incluindo a informação a priori pode ou não ser empregado nos estudos de construção de mapas Genéticos, estimativas Bayesianas e de máxima verossimilhança para a freqüência de recombinação foram obtidas, envolvendo espécies de polinização aberta. Para isso, foram considerados diferentes tipos de marcadores: marcadores completamente informativos e marcadores parcialmente informativos. Através de simulações de conjuntos de dados combinando dois marcadores de cada vez, as estimativas da freqüência de recombinação foram obtidas através de um algoritmo baseado na função de verossimilhança para os dois métodos de estimação usados. A caracterização das fases de ligação foi baseada na distribuição da probabilidade a posteriori dos arranjos de alelos alternativos em dados marcadores para dois cromossomos homólogos de cada genitor, condicional aos fenótipos observados dos marcadores. Os resultados obtidos permitem concluir que o método Bayesiano pode ser usado em estudos de ligação Genética com o uso da informação a priori. Quanto a estimação das fases de ligação, os dois métodos levam sempre à mesma conclusão. / The construction of genetic maps is important for the genetic improvement of plants, because it is through these maps that one can determine at which points of the chromosomes the hereditary units may lie. With the aim of checking whether or not the Bayesian method including prior information can be used in studies of genetic map construction, Bayesian and maximum likelihood estimates of the recombination frequency were obtained for outbreeding species. For that, different types of markers were considered: fully informative markers and partially informative markers. Through simulations of data sets combining two markers at a time, the estimates of the recombination frequency were obtained through an algorithm based on the likelihood function for the two estimation methods used. The characterization of linkage phases was based on the posterior probability distribution of the assignment of alternative alleles at given markers to the two homologous chromosomes of each parent, conditional on the observed marker phenotypes. The results obtained allow the conclusion that the Bayesian method can be used in studies of genetic linkage with the use of prior information. As for the estimation of the linkage phases, the two methods always lead to the same conclusion.
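An illustrative sketch, much simpler than the thesis setting (which involves partially informative markers and linkage phases), of combining an informative prior with a binomial likelihood for a recombination fraction r restricted to [0, 0.5], and comparing the posterior summaries with the maximum likelihood estimate. The counts and the Beta(2, 8) prior are assumptions for illustration.

```python
import numpy as np
from scipy import stats

# Toy data: out of n informative meioses, k are recombinant between two markers.
n, k = 100, 18

# Grid over the recombination fraction r, restricted to [0, 0.5].
r = np.linspace(1e-4, 0.5, 2000)

# Binomial likelihood for the observed recombinant count.
log_lik = stats.binom.logpmf(k, n, r)

# Informative prior: a Beta(2, 8) density truncated to [0, 0.5],
# expressing a belief that the markers are fairly tightly linked.
log_prior = stats.beta.logpdf(r, 2, 8)

log_post = log_lik + log_prior
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, r)          # normalise on the grid

r_mle = k / n
r_map = r[np.argmax(post)]
r_mean = np.trapz(r * post, r)
print(f"MLE r_hat = {r_mle:.3f}, posterior mode = {r_map:.3f}, posterior mean = {r_mean:.3f}")
```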
50

Um estudo sobre estimação e predição em modelos geoestatísticos bivariados / A study on estimation and prediction in bivariate geostatistical models

Fonseca, Bruno Henrique Fernandes 05 March 2009 (has links)
Os modelos geoestatísticos bivariados definem funções aleatórias para dois processos estocásticos com localizações espaciais conhecidas. Pode-se adotar a suposição da existência de um campo aleatório gaussiano latente para cada variável aleatória. A suposição de gaussianidade do processo latente é conveniente para inferências sobre parâmetros do modelo e para obtenção de predições espaciais, uma vez que a distribuição de probabilidade conjunta para um conjunto de pontos do processo latente é também gaussiana. A matriz de covariância dessa distribuição deve ser positiva definida e possuir a estrutura de variabilidade espacial entre e dentre os atributos. Gelfand et al. (2004) e Diggle e Ribeiro Jr. (2007) propuseram estratégias para estruturar essa matriz, porém não existem muitos relatos sobre o uso e avaliações comparativas entre essas abordagens. Neste trabalho foi conduzido um estudo de simulação de modelos geoestatísticos bivariados em conjunto com estimação por máxima verossimilhança e krigagem ordinária, sob diferentes configurações amostrais de localizações espaciais. Também foram utilizados dados provenientes da análise de solo de uma propriedade agrícola com 51,8 ha de área, onde foram amostradas 67 localizações georeferenciadas. Foram utilizados os valores mensurados de pH e da saturação por bases do solo, que foram submetidas à análise descritiva espacial, modelagens geoestatísticas univariadas, bivariadas e predições espaciais. Para verificar vantagens quanto à adoção de modelos univariados ou bivariados, a amostra da saturação por bases, que possui coleta mais dispendiosa, foi dividida em uma subamostra de modelagem e uma subamostra de controle. A primeira foi utilizada para fazer a modelagem geoestatística e a segunda foi utilizada para comparar as precisões das predições espaciais nas localizações omitidas no processo de modelagem. / Bivariate geostatistical models define random functions for two stochastic processes with known spatial locations. The existence of a latent Gaussian random field can be assumed for each random variable. This Gaussianity assumption for the latent process is convenient for inference on the model parameters and for spatial prediction, since the joint distribution for a set of points of the latent process is multivariate normal. The covariance matrix of this distribution should be positive definite and should carry the spatial variability structure within and between the attributes. Gelfand et al. (2004) and Diggle and Ribeiro Jr. (2007) suggested strategies for structuring this matrix; however, there are few reports comparing the approaches. This work reports a simulation study of bivariate models together with maximum likelihood estimation and spatial prediction under different configurations of sampling locations. Soil sample data from a field of 51.8 hectares are also analyzed, with two soil attributes observed at 67 spatial locations. Data on pH and base saturation were submitted to spatial descriptive analysis, univariate and bivariate modeling, and spatial prediction. To check for advantages of adopting univariate or bivariate models, the sample of the more expensive variable was divided into modeling and testing subsamples. The first was used to fit geostatistical models, and the second was used to compare the precision of the spatial predictions at the locations not used in the modeling process.
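A compact sketch (not the models fitted in the thesis) of univariate ordinary kriging with an exponential covariance, implemented directly with numpy; the bivariate case adds cross-covariances between the two attributes. Coordinates, values and covariance parameters below are invented placeholders.

```python
import numpy as np

rng = np.random.default_rng(4)

def exp_cov(h, sill=1.0, range_=200.0, nugget=0.1):
    """Exponential covariance for separation distance h (nugget added at h = 0)."""
    return np.where(h == 0, sill + nugget, sill * np.exp(-h / range_))

# Placeholder sample: 67 locations (metres) with one soil attribute (e.g. pH).
coords = rng.uniform(0, 700, size=(67, 2))
values = 5.5 + 0.002 * coords[:, 0] + rng.normal(0, 0.3, 67)

def ordinary_kriging(coords, values, target, cov=exp_cov):
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    # Ordinary kriging system: covariances plus a Lagrange-multiplier row/column.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = cov(np.linalg.norm(coords - target, axis=1))
    weights = np.linalg.solve(A, b)[:n]
    return weights @ values

pred = ordinary_kriging(coords, values, np.array([350.0, 350.0]))
print("predicted value at (350, 350):", round(pred, 3))
```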
