  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
211

Limit of detection for second-order calibration methods

Rodríguez Cuesta, Mª José 02 June 2006 (has links)
Analytical chemistry can be split into two main types, qualitative and quantitative. Most modern analytical chemistry is quantitative, and public sensitivity to health issues is reflected in the many government regulations that draw on science, for instance to provide public health information and prevent disease caused by harmful exposure to toxic substances. The concept of the minimum amount of an analyte or compound that can be detected or analysed appears in many of these regulations (for example, to rule out the presence of traces of toxic substances in foodstuffs), generally as part of method validation aimed at reliably evaluating the validity of the measurements. The lowest quantity of a substance that can be distinguished from the absence of that substance (a blank value) is called the detection limit or limit of detection (LOD). Traditionally, in the context of simple measurements where the instrumental signal depends only on the amount of analyte, a multiple of the blank value is taken to calculate the LOD (traditionally, the blank value plus three times the standard deviation of the measurement). However, the increasing complexity of the data that analytical instruments provide leads to situations in which the LOD cannot be calculated as reliably as before. Measurements, instruments and mathematical models can be classified according to the type of data they use. Tensorial theory provides a unified language that is useful for describing chemical measurements, analytical instruments and calibration methods. Instruments that generate two-dimensional arrays of data are second-order instruments. A typical example is a spectrofluorometer, which provides a set of emission spectra obtained at different excitation wavelengths. The calibration methods used with each type of data have different features and complexity.
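The traditional blank-based rule quoted above (blank value plus three times the standard deviation of the blank measurements) can be sketched in a few lines. This is an illustrative reading of the rule, not code from the thesis; the replicate blank readings are invented values.

```python
from statistics import mean, stdev

def lod_blank(blank_signals, k=3.0):
    """Classical zero-order LOD estimate: mean blank signal plus
    k times the sample standard deviation of the blank replicates
    (k = 3 is the traditional choice mentioned in the abstract)."""
    return mean(blank_signals) + k * stdev(blank_signals)

# Hypothetical replicate blank readings (illustrative values only)
blanks = [0.10, 0.12, 0.09, 0.11, 0.10, 0.13]
threshold = lod_blank(blanks)
# Any signal above `threshold` would be declared "analyte detected"
```

Note that this simple rule only controls the false-positive side of the detection decision; the abstract's later discussion of hypothesis testing is precisely about making this more rigorous.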
In this thesis, the most commonly used calibration methods are reviewed, from zero-order (univariate) through first-order (multivariate) to second-order (multi-linear) calibration models. Second-order calibration models are treated in detail since they are the ones applied in the thesis. Specifically, the following methods are described: PARAFAC (Parallel Factor Analysis), ITTFA (Iterative Target Transformation Factor Analysis), MCR-ALS (Multivariate Curve Resolution-Alternating Least Squares) and N-PLS (Multi-linear Partial Least Squares). Analytical methods should be validated. The validation process typically starts by defining the scope of the analytical procedure, which includes the matrix, target analyte(s), analytical technique and intended purpose. The next step is to identify the performance characteristics that must be validated, which may depend on the purpose of the procedure, and the experiments for determining them. Finally, validation results should be documented, reviewed and maintained (otherwise the procedure should be revalidated) for as long as the procedure is applied in routine work. The figures of merit of a chemical analytical process are 'those quantifiable terms which may indicate the extent of quality of the process. They include those terms that are closely related to the method and to the analyte (sensitivity, selectivity, limit of detection, limit of quantification, ...) and those which are concerned with the final results (traceability, uncertainty and representativity)' (Inczédy et al., 1998). The aim of this thesis is to develop theoretical and practical strategies for calculating the limit of detection in complex analytical situations. Specifically, I focus on second-order calibration methods, i.e. cases where a matrix of data is available for each sample. The methods most often used for making detection decisions are based on statistical hypothesis testing and involve a choice between two hypotheses about the sample.
The first hypothesis is the "null hypothesis": the sample is analyte-free. The second hypothesis is the "alternative hypothesis": the sample is not analyte-free. In the hypothesis test there are two possible types of decision error. An error of the first type occurs when the signal for an analyte-free sample exceeds the critical value, leading one to conclude incorrectly that the sample contains a positive amount of the analyte. This type of error is sometimes called a "false positive". An error of the second type occurs if one concludes that a sample does not contain the analyte when it actually does, and it is known as a "false negative". In zero-order calibration, this hypothesis test is applied to the confidence intervals of the calibration model to estimate the LOD, as proposed by Hubaux and Vos (A. Hubaux, G. Vos, Anal. Chem. 42: 849-855, 1970). One strategy for estimating multivariate limits of detection is to transform the multivariate model into a univariate one. This strategy has been applied in this thesis in three practical applications: (1) LOD for PARAFAC (Parallel Factor Analysis); (2) LOD for ITTFA (Iterative Target Transformation Factor Analysis); and (3) LOD for MCR-ALS (Multivariate Curve Resolution-Alternating Least Squares). In addition, the thesis includes a theoretical contribution with the proposal of a sample-dependent LOD in the context of multivariate (PLS) and multi-linear (N-PLS) Partial Least Squares.
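The false-positive/false-negative framework described in this abstract can be sketched numerically. The function below follows the common decision-theoretic scheme (a critical level controlling type I errors, and a detection limit adding protection against type II errors) under an assumed normal signal model with known blank standard deviation. It is a simplified stand-in for illustration, not the Hubaux-Vos calibration-interval formulas (which also account for calibration uncertainty); the default error levels α = β = 0.05 are my choice.

```python
from statistics import NormalDist

def decision_lod(sigma_blank, alpha=0.05, beta=0.05):
    """Critical level L_C controls false positives (type I errors);
    the detection limit L_D additionally controls false negatives
    (type II errors).  Assumes normally distributed net signals with
    a constant, known standard deviation sigma_blank."""
    z_alpha = NormalDist().inv_cdf(1.0 - alpha)
    z_beta = NormalDist().inv_cdf(1.0 - beta)
    L_C = z_alpha * sigma_blank               # decision threshold
    L_D = (z_alpha + z_beta) * sigma_blank    # smallest reliably detectable signal
    return L_C, L_D

L_C, L_D = decision_lod(sigma_blank=0.05)
# Signals above L_C are declared detections; an analyte present at
# level L_D is missed with probability at most beta.
```

With α = β the detection limit is exactly twice the critical level, which is why the familiar "3σ" rule roughly corresponds to α = β ≈ 0.05 (z ≈ 1.645 each).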
212

El límite como concepto plástico en la obra de Eduardo Chillida

Lomelí Ponce, Javier 29 June 2011 (has links)
The Limit as an Artistic Concept in the Work of Eduardo Chillida is an aesthetic investigation that looks at the work of the Basque sculptor from a phenomenological viewpoint, applying the ideas of the German philosopher Martin Heidegger and using the concept of the limit as a fundamental ontological category applied to his works of art. This notion of the limit follows the ideas of the Barcelona philosopher Eugenio Trías, who in his philosophical system, above all in relation to the aesthetic sphere, provides us with both the appropriate channels and the structure for our approach to Chillida's work. In this way, an aesthetic reflection based on the concept of the limit is made, in which its conceptual-artistic development concludes with the relationship established between art and religion.
213

[en] MARTINGALE CENTRAL LIMIT THEOREM / [pt] TEOREMA CENTRAL DO LIMITE PARA MARTINGAIS

RODRIGO BARRETO ALVES 13 December 2017 (has links)
[en] This dissertation is devoted to the study of the rates of convergence in the Martingale Central Limit Theorem. We begin the first part by presenting Martingale Theory, introducing the concept of conditional expectation and its properties. In this way we can describe what a martingale is, present examples of martingales, and state some of the principal theorems and results about them. In the second part we analyze the Central Limit Theorem for random variables, presenting the concepts of the characteristic function and convergence in distribution, which are used in the proofs of various versions of the Central Limit Theorem. We demonstrate three different forms of the Central Limit Theorem: for independent and identically distributed random variables, the Lindeberg-Feller version, and a version for Poisson random variables. After that we introduce the Martingale Central Limit Theorem, proving a more general form and then stating a more specific form on which we focus for the rest of the dissertation. Lastly, we discuss rates of convergence in the Central Limit Theorems, with a focus on the rates of convergence in the Martingale Central Limit Theorem. In particular, we state the result of [4], which determines, up to a multiplicative constant, the optimal dependence of the rate on a certain parameter of the martingale.
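For reference, one standard textbook form of the martingale CLT (my paraphrase of a common statement for martingale difference arrays, not necessarily the exact form proved in the dissertation):

```latex
% Let (X_{n,k})_{1 \le k \le n} be a martingale difference array with
% respect to filtrations (\mathcal{F}_{n,k}).  If the conditional
% variances stabilize and a conditional Lindeberg condition holds,
\[
\sum_{k=1}^{n} \mathbb{E}\!\left[X_{n,k}^{2}\mid\mathcal{F}_{n,k-1}\right]
\xrightarrow{\;\mathbb{P}\;} \sigma^{2},
\qquad
\sum_{k=1}^{n} \mathbb{E}\!\left[X_{n,k}^{2}\,
\mathbf{1}\{|X_{n,k}|>\varepsilon\}\mid\mathcal{F}_{n,k-1}\right]
\xrightarrow{\;\mathbb{P}\;} 0
\;\;\text{for all } \varepsilon>0,
\]
% then the row sums are asymptotically normal:
\[
\sum_{k=1}^{n} X_{n,k} \;\xrightarrow{\;d\;}\; \mathcal{N}\!\left(0,\sigma^{2}\right).
\]
```

The i.i.d. Central Limit Theorem is recovered as the special case X_{n,k} = (Y_k - \mathbb{E}Y_1)/\sqrt{n} for i.i.d. Y_k with finite variance.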
214

Veřejné zdravotní pojištění se zaměřením na regulační poplatky / Public health insurance with a view to a regulation charge

PEKÁRKOVÁ, Veronika January 2009 (has links)
The aim of my thesis was to examine laypeople's and professionals' opinions on the introduction of the regulatory health fees. Within the scope of the two sub-targets, I tried to find out what amounts of the regulatory health fees would be acceptable to citizens, and whether it would be possible to arrange private health insurance. In the theoretical part of the diploma thesis I briefly described the system of public health insurance and covered the system of regulatory health fees and the annual protection limit in more detail. To compile the thesis I used the method of questioning, which I carried out by collecting data with a questionnaire, and the method of secondary data analysis. Two statistical groups, the general public and professionals from the South Bohemian Region, were defined for the research part. Based on the available literature, I established four hypotheses, of which the first three were confirmed and the fourth was not confirmed by the research. The research revealed differences in opinions on, and perception of, the regulatory health fees between the general public and professionals. Citizens' awareness, which still has its deficiencies as the questionnaire survey revealed, definitely influences opinions on the regulatory health fees. The thesis could be used to increase awareness of the system of regulatory health fees and the annual protection limit, or for lectures; the respondents' replies and the research results could also be taken into consideration when amending the existing Act No. 48/1997 Coll., on Public Health Insurance.
215

Stroj na americký sen: Protisystémová fikce Coovera, Thompsona, Burroughse, a Acker / The American Dream Machine: Anti-Systemic Fictions of Coover, Thompson, Burroughs, and Acker

Novická, Tereza January 2017 (has links)
The thesis examines manifestations of transgression in Robert Coover's The Public Burning (1977), Hunter S. Thompson's Fear and Loathing in Las Vegas: A Savage Journey to the Heart of the American Dream (1971), William S. Burroughs' Naked Lunch (1959) and The Nova Trilogy (1961-1967), and Kathy Acker's Empire of the Senseless (1988) on a structural and thematic level. Georges Bataille's theory of escalated excess and Michel Foucault's theory of the transgression-limit power dynamics, outlined in Chapter One, provide the theoretical framework through which the texts are analyzed, as do the concepts of the spectacle, the carnival, taboo, and the Situationist practice of détournement. The nature of the American Dream Machine is explored with regard to its chief components of control: the American war on abstractions, American exceptionalism, and the American Dream, examined through their contradictory connotations and historical relevance. The thesis proposes that despite their anti-systemic drive, the selected texts are complicit with and dependent on the American Dream Machine in perpetuating its power play. In Chapter Two, the hyperbolization of American Cold War propaganda rhetoric is analyzed in Coover's The Public Burning. Chapter Three details Thompson's gonzo writing against the...
216

Lower Bound Limit Analysis Applications For Solving Planar Stability Problems In Geomechanics

Bhattacharya, Paramita 09 1900 (has links) (PDF)
Limit analysis based upon the theory of plasticity is a very useful numerical technique for determining the failure loads of different civil and mechanical engineering structures for a material following an associated flow rule. The limiting values of the collapse loads, namely the lower and upper bounds, can be bracketed quite accurately with the application of the lower and upper bound theorems of limit analysis. With the advancement of finite elements and various robust optimization techniques, numerical limit analysis in association with finite elements is becoming very popular for assessing the stability of complicated structures. Although two different optimization methods, namely linear programming and nonlinear programming, have both been successfully implemented by various researchers for solving different stability problems in geomechanics, the linear programming method is employed in the present thesis due to its inherent advantage in implementation and ease of achieving convergence. The objectives of the present thesis are (i) to improve upon the existing lower bound limit analysis method, in combination with finite elements and linear programming, with the intention of reducing the computational time and the associated memory requirement, and (ii) to apply the existing lower bound finite element limit analysis to various important planar stability problems in geotechnical engineering. With reference to the first objective of the thesis, two new methods have been introduced to improve upon the existing computational procedure for solving geomechanics stability problems with limit analysis, finite elements and linear programming. In the first method (method-I), the order of the yield polygon within the chosen domain is varied, based on the proximity of the stress state to yield, so that a higher order polygon need not be used everywhere in the problem domain.
In the second method (method-II), only a few selected sides, rather than all, of the higher order yield polygon used to linearize the Mohr-Coulomb yield function are employed. The two proposed methods have been applied to compute the ultimate bearing capacity of smooth as well as rough strip footings for various soil friction angles. It has been noticed that both proposed methods reduce the CPU time and the total number of inequality constraints required compared to the existing lower bound linear programming method used in the literature. With reference to the second objective, a few important planar stability problems in geomechanics associated with the interference of footings and vertical anchors have been solved in the present thesis. Footings are essentially used to transfer the compressive loads of superstructures to the underlying soil media. On the other hand, vertical anchors are used for generating passive support for retaining walls, sheet piles and bulkheads. A large number of research investigations have been reported in the literature on computing the collapse load for a single isolated strip footing and a single vertical anchor. It is common practice to estimate the bearing capacity of footings or the pullout capacity of anchors without considering the effect of interference. There is, however, clear evidence from the available literature that (i) the ultimate bearing capacity of footings, and (ii) the ultimate pullout capacity of anchors, are significantly affected by their interference. Based on different available methods, the interference of footings, in a group of two footings as well as an infinite number of multiple footings, has been examined by different researchers in order to compute the ultimate bearing capacity considering the group effect. However, there is no research study that finds the ultimate bearing capacity of interfering footings using lower bound limit analysis.
In the present thesis, the ultimate bearing capacity of two, and of an infinite number of multiple, strip footings placed on sandy soil with a horizontal ground surface has been determined. The analysis has been performed for smooth as well as rough footings. The failure loads for interfering footings are found to be always greater than that for a single isolated footing. The effect of the footings' interference is expressed in terms of an efficiency factor (ξγ), defined as the ratio of the magnitude of the failure load for a footing of width B in the presence of the other footing to the magnitude of the failure load of an isolated strip footing having the same width. The effect of the interference on the failure load (i) is always greater for rough footings than for smooth footings, (ii) increases with an increase in the soil friction angle φ, and (iii) becomes almost negligible beyond a spacing S > 3B. It is observed that the failure load for a footing in a group of an infinite number of multiple strip footings is always greater than that for two interfering footings. Attempts have been made in this thesis to investigate the group effect of two vertical anchors on their horizontal pullout resistance (PuT). The anchors are considered to be embedded at a certain clear spacing (S) along the same vertical plane. The group effect has been studied separately for anchors embedded in (i) sandy soil and (ii) undrained clay. For anchors embedded in clays, a linear increase of soil cohesion with depth has also been taken into consideration. The magnitude of PuT has been obtained in terms of a group efficiency factor, ηγ for sand and ηc for clay, with respect to the failure load for a single isolated vertical plate with the same H/B. The pullout capacity of a group of two anchors, either in sand or in undrained clay, is considerably greater than that of a single isolated anchor.
The magnitudes of ηγ and ηc become maximum at a certain critical value of S/B, which has been found to lie generally between 0.5 and 1. The value of ηγ for a given S/B has been found to become larger for greater values of H/B, φ and δ. For greater values of H/B, the group effect contributes more significantly to the pullout resistance. The horizontal pullout capacity of a single isolated vertical anchor embedded in sand in the presence of pseudo-static horizontal earthquake body forces has also been determined using the lower bound finite element limit analysis. The variation of the pullout factor Fγ with the embedment ratio of smooth and rough anchor plates for different values of the horizontal earthquake acceleration coefficient (αh) has been investigated. The analysis clearly reveals that the pullout resistance decreases quite significantly with an increase in the magnitude of the earthquake acceleration coefficient. For the various problems selected in the present thesis, the failure patterns have also been drawn in order to understand the development of the plastic zones within the domain chosen for solving a given problem. The results obtained from the analysis, for the various problems taken up in this thesis, have been thoroughly compared with those reported in the literature.
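The polygonal linearization of a yield surface that underpins the lower bound formulation in this abstract can be illustrated on a toy example. The sketch below inscribes a p-sided polygon inside a circular yield surface in a two-dimensional stress space, so every stress state satisfying the linear constraints is guaranteed to satisfy the original yield criterion, which is the essence of keeping the formulation a strict lower bound. The circular surface, variable names and the value of p are illustrative choices of mine, not the thesis's actual Mohr-Coulomb linearization.

```python
import math

def inscribed_polygon_constraints(R, p):
    """Linearize the circular yield surface X**2 + Y**2 <= R**2 with a
    p-sided polygon inscribed in the circle.  Side k is the chord
    between the circle points at angles 2*pi*k/p and 2*pi*(k+1)/p,
    giving the half-plane  X*cos(t) + Y*sin(t) <= R*cos(pi/p)
    with t the mid-angle of the chord.  Because the polygon lies
    inside the circle, the linearized feasible set is conservative
    (safe for a lower bound); more sides means less conservatism."""
    rhs = R * math.cos(math.pi / p)
    sides = []
    for k in range(p):
        t = 2.0 * math.pi * (k + 0.5) / p
        sides.append((math.cos(t), math.sin(t), rhs))
    return sides

def satisfies(sides, X, Y):
    """Check the full set of linear inequality constraints."""
    return all(a * X + b * Y <= c + 1e-12 for a, b, c in sides)

sides = inscribed_polygon_constraints(R=1.0, p=24)
# A stress point safely inside the circle passes the linearized check:
inside_ok = satisfies(sides, 0.5, 0.5)
# A point on the circle at a side's mid-angle lies outside the polygon,
# showing the approximation cuts slightly into the true yield surface:
on_circle_cut = satisfies(sides, math.cos(math.pi / 24), math.sin(math.pi / 24))
```

Each side contributes one inequality constraint to the linear program, which is why the thesis's method-II (retaining only the sides near the active stress state) directly reduces the constraint count.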
217

Casualty Reinsurance Exposure Rating

Těšínská, Anna January 2014 (has links)
The main aim of this thesis is the development of ILF curves that can be used in the insurance industry when pricing general third party liability on the Czech market. Based on the available data, size-of-loss distribution functions are first estimated and used in a subsequent generating process. From the generated data the increased limit factors are estimated, and ILF curves are derived using Riebesell's parameterization. A substantial part of the thesis is a compilation of the literature and an expansion of the statistical approach for estimating fair ILFs based on these data. In addition, the basis for the curve derivation is laid in chapters describing basic theoretical knowledge in the field of reinsurance, in particular the description of the basic types of reinsurance contracts, as well as the most common pricing methods. The whole mechanism of curve derivation is described; its use is then demonstrated with an example based on pseudo-real data.
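Riebesell's parameterization mentioned above is commonly stated as the rule that each doubling of the policy limit increases the premium by a fixed percentage z, which forces the ILF curve into a power form. The sketch below shows that standard form; the base limit and z value are illustrative choices of mine, not figures from the thesis.

```python
import math

def riebesell_ilf(limit, base_limit, z):
    """Riebesell's rule: ILF(2L) = (1 + z) * ILF(L) for every limit L.
    Solving this functional equation gives the power curve
        ILF(L) = (L / base_limit) ** log2(1 + z),
    normalized so that ILF(base_limit) = 1."""
    return (limit / base_limit) ** math.log2(1.0 + z)

# Illustrative parameters (not from the thesis): base limit 1m, z = 20%
base = 1_000_000
factors = [riebesell_ilf(L, base, z=0.2) for L in (base, 2 * base, 4 * base)]
# factors[0] is 1.0 by construction; each doubling multiplies the ILF by 1.2
```

A practical attraction of this family, noted in the reinsurance literature, is that the whole curve is pinned down by a single parameter z, which can be estimated from the kind of generated loss data the thesis describes.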
218

Návrh předpjaté stropní konstrukce / Design of a prestressed slab

Juříček, Lukáš January 2018 (has links)
The diploma thesis is focused on the design of a prestressed concrete membrane made of lightweight concrete. The concrete membrane interacts with the load-bearing steel structure by which it is supported. Only one variant of the solution was worked out, and it was then optimized. The goal of the design was to determine the initial state of the membrane and to ensure that at the end of the design working life the membrane remains in compression, so that cracks cannot appear. The structure is modelled in the main (longitudinal) direction as a spatial frame in the program ANSYS 17.2. This tool enables an analysis of the construction stages. The rheology is calculated manually and supplied to the model as relative strain. The whole structure is checked for the ultimate limit state and the serviceability limit state according to the valid Eurocodes. The transverse direction is modelled with shell elements in the program RFEM 5.12. Construction stages and the global behaviour of the structure were analysed. The checks were calculated manually and verified in the program IDEA StatiCa. The attachments of the thesis are drawings, a visualisation and schemes of the construction stages.
219

Nosná železobetonová konstrukce vícepodlažního objektu / Load-bearing RC structure of the multi-storey building

Petrovič, Jiří January 2019 (has links)
The Master’s thesis is focused on the analysis and design of selected members of the load-bearing structure of an apartment house according to the ultimate limit states (ULS) and serviceability limit states (SLS). The calculation and analysis were supported by the design software SCIA ENGINEER 2017. The structural analysis deals with the design of the reinforced concrete floor slab of 1.PP, beams in 1.PP, selected walls in 1.NP, and a column in 1.PP and 1.NP. Besides this, the work deals with the calculation and design of the foundations of the building.
220

Železobetonová konstrukce polyfunkčního domu / Reinforced concrete structure of a multifunction building

Pohl, Martin January 2019 (has links)
The diploma thesis deals with the structural solution of the reinforced concrete structures of a multifunctional building. The structure is designed and assessed for the ultimate limit state in accordance with ČSN EN 1992-1-1: Design of concrete structures – Part 1-1: General rules and rules for buildings.
