  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Application of SAR satellite imagery to studies of marine pollution and water dynamics in the Northwest Mediterranean

Platonov, Alexei 19 July 2002
This thesis presents theoretical and practical applications of ERS-1/2 and RADARSAT satellite images from the Synthetic Aperture Radar (SAR) sensor (nearly 330 images of the Northwest Mediterranean, NWM, and 980 of European coastal waters), together with other types of satellite imagery, to the study of marine pollution and water dynamics in the NWM and other European areas. The principal source of information is a collection of SAR images acquired periodically over 1996-1998, mainly within the European Clean Seas project, and also within the Oil Watch and ERS-1/2 SAR Exploitation Study in Catalonia projects. The geographical area of interest covers the NWM maritime zone bounded by the Balearic Islands, the Ebro Delta, Cap de Creus, the Gulf of Lion, and Marseilles. In the course of this work we compiled a thematic collection of SAR images (full scenes and detailed fragments) of almost 300 oil spills and 42 coastal plumes detected in the NWM during 1996-1998, together with thematic maps and a statistical analysis of their topological characteristics and their temporal and spatial occurrence. We also estimated the total extent of the accidental spills and plumes that took place in the NWM over 1996-1998 (equivalent diameter 146 km; oil mass 4,477 metric tons), carried out a comparative analysis of the Clean Seas results from the different study zones (NWM, North Sea, Baltic Sea), and performed a statistical analysis of the occurrence of disasters involving oil spills in European waters, based on the present results and on historical information from the past 34 years, relating them to Zipf's law.
The general conclusion is that small, habitual oil spills play a significant role in overall marine pollution because of their very frequent occurrence. Concerning the dynamics of the NWM, we present the topological and spatial analysis of the vortices detected by satellite sensors, thematic maps, a comparison with laboratory experiments, a quantitative analysis of the particularities of the tide at different points of the NWM, examples of the application of multifractal analysis, and a practical method proposed to distinguish sea-surface structures of different origins. The results obtained provide a general, statistically justified picture of the level of marine pollution in the NWM and in other European maritime zones, as well as quantitative information on the complex surface dynamics of the NWM, which can be useful for quantifying the surface diffusion capacity of the ocean.
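The rank-size statistics that the abstract relates to Zipf's law can be illustrated with a short sketch. The spill areas below are synthetic and purely hypothetical (the thesis worked with roughly 300 SAR-detected spills); the point is only the mechanics of a Zipf-type fit, where sizes falling as a power of rank appear as a straight line in log-log space.

```python
import numpy as np

# Hypothetical spill areas (km^2): a heavy-tailed synthetic stand-in for
# the ~300 SAR-detected spills, sorted from largest to smallest.
rng = np.random.default_rng(0)
areas = np.sort(rng.pareto(1.0, 300) + 1.0)[::-1]

# Zipf's law predicts size ~ C / rank^b, i.e. a straight line in log-log space.
ranks = np.arange(1, len(areas) + 1)
b, log_c = np.polyfit(np.log(ranks), np.log(areas), 1)

print(f"fitted rank-size exponent b = {b:.2f}")  # an exponent near -1 matches the classic Zipf form
```

The fitted slope summarizes how strongly the size distribution is dominated by a few large events, which is the sense in which frequent small spills can still account for a large share of the total.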
22

Regression models with an interval-censored covariate

Langohr, Klaus 16 June 2004
Survival analysis deals with the evaluation of variables which measure the elapsed time until an event of interest. One particularity survival analysis has to account for is censored data, which arise whenever the time of interest cannot be measured exactly but partial information is available. Four types of censoring are distinguished: right-censoring occurs when the survival time is known only to exceed an observed time; left-censoring when it is known to be less than an observed time; interval-censoring when the survival time is observed to lie within a time interval; and doubly-censored data arise when the time origin is censored as well. In Chapter 1 of the thesis, we first give a survey of statistical methods for interval-censored data, including both parametric and nonparametric approaches. In the second part of Chapter 1, we address the important issue of noninformative censoring, which is assumed in all the methods presented. Given the importance of optimization procedures in the later chapters of the thesis, the final section of Chapter 1 is devoted to optimization theory. This includes several optimization algorithms, as well as the presentation of the optimization tools that have played an important role in the elaboration of this work. We have used the mathematical programming language AMPL to solve the maximization problems that arose. One of its main features is that optimization problems written in AMPL code can be sent to the internet facility 'NEOS: Server for Optimization' and solved by its available solvers. In Chapter 2, we present the three data sets analyzed for this dissertation.
Two correspond to studies on HIV/AIDS: one is on the survival of tuberculosis patients co-infected with HIV in Barcelona, the other on injecting drug users from Badalona and surroundings, most of whom became infected with HIV as a result of their drug addiction. The complex censoring patterns in the variables of interest of the latter study motivated the development of estimation procedures for regression models with interval-censored covariates. The third data set comes from a study on the shelf life of yogurt; we present a new approach to estimating the shelf lives of food products that takes advantage of the existing methodology for interval-censored data. Chapter 3 deals with the theoretical background of an accelerated failure time model with an interval-censored covariate, placing emphasis on the development of the likelihood functions and on the estimation procedure by means of optimization techniques and tools. Their use in statistics can be an attractive alternative to established methods such as the EM algorithm. In Chapter 4 we present further regression models, such as linear and logistic regression with the same type of covariate, whose parameters are estimated with the same techniques as in Chapter 3. Other possible estimation procedures are described in Chapter 5. These comprise mainly imputation methods, which consist of two steps: first, each observed interval of the covariate is replaced by an imputed value, for example the interval midpoint; then, standard procedures are applied to estimate the parameters. The application of the proposed estimation procedure for the accelerated failure time model with an interval-censored covariate to the data set on injecting drug users is addressed in Chapter 6. Different distributions and covariates are considered, and the corresponding results are presented and discussed.
To compare the estimation procedure with the imputation-based methods of Chapter 5, a simulation study was carried out; its design and results are the contents of Chapter 7. Finally, in the closing Chapter 8, the main results are summarized and several aspects which remain unsolved or might be approached in a different way are addressed.
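The two-step imputation approach described for Chapter 5 can be sketched in a few lines. The data below are made up for illustration; midpoint imputation and ordinary least squares stand in for whichever standard procedure a statistical package would supply.

```python
import numpy as np

# Hypothetical interval-censored covariate: each subject's true value is
# known only to lie in [lo, hi] (e.g., a time origin known to an interval).
lo = np.array([0.0, 1.0, 2.0, 0.5, 3.0, 1.5])
hi = np.array([2.0, 3.0, 5.0, 1.5, 6.0, 4.0])
y  = np.array([1.1, 2.2, 3.9, 0.9, 4.8, 2.7])   # observed response

# Step 1: impute each interval by its midpoint.
x_imp = (lo + hi) / 2.0

# Step 2: apply a standard procedure, here ordinary least squares.
X = np.column_stack([np.ones_like(x_imp), x_imp])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"intercept = {beta[0]:.3f}, slope = {beta[1]:.3f}")
```

The simplicity of the second step is the appeal of imputation methods; the thesis's simulation study (Chapter 7) is precisely about how much this shortcut costs relative to maximizing the full likelihood.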
23

Research on scientific knowledge in classical Persian poetry: a study of trees and fruits in the Khamsa of Nezâmî Ganjavî

Van Ruymbeke, Christine January 1997
Doctorate in Philosophy and Letters / info:eu-repo/semantics/nonPublished
24

A Markov chain Monte Carlo method for inverse stochastic modeling and uncertainty assessment

Fu, Jianlin 07 May 2008
Unlike traditional two-stage methods, a conditional and inverse-conditional simulation approach may directly generate independent, identically distributed realizations that honor both static data and state data in one step. The Markov chain Monte Carlo (McMC) method has proved a powerful tool for performing this type of stochastic simulation. One of the main advantages of McMC over traditional sensitivity-based optimization methods for inverse problems is its power, flexibility and well-posedness in incorporating observation data from different sources. In this work, an improved version of the McMC method is presented to perform the stochastic simulation of reservoirs and aquifers in the framework of multi-Gaussian geostatistics. First, a blocking scheme is proposed to overcome the limitations of the classic single-component Metropolis-Hastings-type McMC. One of the main characteristics of the blocking McMC (BMcMC) scheme is that it can preserve the prior spatial structure and statistics as specified by the user, even when the prior model is inconsistent with reality. At the same time, it improves the mixing of the Markov chain and hence the computational efficiency of the McMC. Furthermore, the exploration ability and the mixing speed of the McMC are efficiently improved by coupling proposals at multiple scales, i.e., the coupled multiscale McMC method. To make the BMcMC method capable of dealing with high-dimensional cases, a multiscale scheme is introduced to accelerate the computation of the likelihood, which greatly improves the computational efficiency of the McMC, given that most of the computational effort is spent on the forward simulations.
To this end, a flexible-grid full-tensor finite-difference simulator, widely compatible with the outputs of various upscaling subroutines, is developed to solve the flow equations, together with a constant-displacement random-walk particle-tracking method, which enhances the com / Fu, J. (2008). A Markov chain Monte Carlo method for inverse stochastic modeling and uncertainty assessment [unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/1969
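The single-component Metropolis-Hastings sampler that the blocking scheme generalizes can be sketched on a toy one-dimensional target. This is a generic illustration under an assumed standard-normal target, not the reservoir-scale sampler of the thesis.

```python
import numpy as np

# Toy target: unnormalized log-density of a standard normal.
def log_target(x):
    return -0.5 * x * x

rng = np.random.default_rng(1)
n_steps, step = 20000, 1.0
x = 0.0
samples = np.empty(n_steps)
accepted = 0

for i in range(n_steps):
    # Random-walk proposal; a blocking scheme would update a group of
    # components jointly instead of one scalar at a time.
    x_new = x + step * rng.normal()
    # Metropolis acceptance: accept with probability min(1, pi(x_new)/pi(x)).
    if np.log(rng.uniform()) < log_target(x_new) - log_target(x):
        x, accepted = x_new, accepted + 1
    samples[i] = x

print(f"acceptance rate = {accepted / n_steps:.2f}")
print(f"sample mean = {samples.mean():.2f}, sample std = {samples.std():.2f}")
```

In an inverse-modeling setting, `log_target` would bundle the geostatistical prior with the data misfit from a forward flow simulation, which is why the thesis invests so heavily in making each likelihood evaluation cheap.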
25

Oceanic and atmospheric response to climate change over varying geologic timescales

Woodard, Stella C. May 2011
Global climate is controlled by two factors: the amount of heat energy received from the sun (solar insolation) and the way that heat is distributed across Earth's surface. Solar insolation varies on timescales of tens to hundreds of thousands of years owing to changes in the path of Earth's orbit about the sun (Milankovitch cycles). Earth's internal boundary conditions, such as paleogeography, the presence or absence of polar ice caps, atmospheric and oceanic chemistry, and sea level, provide distribution and feedback mechanisms for the incoming heat. Variations in these internal boundary conditions may happen abruptly or, as in the case of plate tectonics, take millions of years. We use geochemical and sedimentological techniques to investigate the response of ocean chemistry, regional aridity, and atmospheric and oceanic circulation patterns to climate change during both greenhouse and icehouse climates. To explore the connection between orbitally forced changes in solar insolation, continental aridity and wind, we generated a high-resolution dust record for ~58 Myr old deep-sea sediments from Shatsky Rise. Our data provide the first evidence of a correlation between dust flux to the deep sea and orbital cycles during the Early Paleogene, indicating that dust supply (regional aridity) responded to orbital forcing during the last major interval of greenhouse climate. The change in dust flux was comparable to that during icehouse climates, implying that subtle variations in solar insolation have a similar impact on climate during intervals of overall warmth as they do during glacial-interglacial states. The Carboniferous Period (359-299 Ma) marks a critical time in Earth's history when a series of tectonic and biological events caused a shift in the mean climate state from a global "greenhouse" to an "icehouse".
Geochemical records extracted from sedimentary rocks deposited in shallow epicontinental seaways are increasingly being used to infer relationships between tectonism, carbon cycling and climate and therefore are assumed to reflect global ocean processes. We analyzed radiogenic isotopes in biogenic apatite along a North American transect to constrain the degree of geochemical coupling between the epicontinental seas and the open ocean. Our results argue strongly for decoupling of North American seaways from the open ocean by latest Mississippian time.
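Detecting orbital periodicities in a sediment record is typically done spectrally. As a hedged illustration only, the sketch below recovers a 41-kyr obliquity-like cycle from a synthetic, evenly sampled dust-flux series; real records are unevenly sampled in time and would call for methods such as the Lomb-Scargle periodogram.

```python
import numpy as np

# Synthetic dust-flux series: a 41-kyr obliquity cycle plus noise, sampled
# every 1 kyr (a hypothetical stand-in for the Shatsky Rise record).
rng = np.random.default_rng(2)
dt = 1.0                       # kyr between samples
t = np.arange(0, 1000, dt)     # 1 Myr of record
flux = np.sin(2 * np.pi * t / 41.0) + 0.5 * rng.normal(size=t.size)

# FFT periodogram: the dominant peak should sit near 1/41 cycles per kyr.
power = np.abs(np.fft.rfft(flux - flux.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=dt)
peak_period = 1.0 / freqs[np.argmax(power[1:]) + 1]  # skip the zero-frequency bin

print(f"dominant period ~ {peak_period:.1f} kyr")
```

A statistically significant concentration of variance at Milankovitch periods (roughly 19-23, 41, and 100 kyr) is the kind of evidence behind the claimed correlation between dust flux and orbital cycles.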
26

Errors in the search for robust conditions, and methodologies to avoid them

Pozueta Fernández, Maria Lourdes 10 December 2001
The problem of finding conditions that are robust to the effect of uncontrolled factors is of great interest to companies, since robustness is a characteristic the market demands. There are essentially two methods for studying the problem: the one based on the approach proposed by G. Taguchi in the early 1980s, in which variability is estimated from product arrays and robust conditions are selected by minimizing the response, and the one that starts from a more economical design matrix that allows a model for the response Y to be fitted as a function of the control and noise factors, and studies robust conditions through the interactions between noise factors and control factors. Although one might at first expect very similar results when analyzing the same problem by the two routes, we have found examples where the conclusions differ widely, and this motivated the present research into the causes of those differences. We began by studying the nature of the surfaces associated with the variability caused by noise factors, proceeding sequentially with an increasing number of noise factors. We have shown that, regardless of whether the chosen metric is s2(Y), s(Y) or log(s(Y)), these surfaces can hardly be approximated by first-order polynomials in the control factors, and we conclude that some of the strategies experimenters commonly use in practice are unlikely to yield a good understanding of this surface.
For example, it is not adequate to place a 2^(k-p) Resolution III design in the control factors of a product array; Resolution IV designs with center points are recommended instead. Next, two sources of noise-induced variation in the response, unknown to the experimenter, were assumed, and the sensitivity of the two methods in detecting these opportunities for variability reduction was studied. We show that the model based on summary metrics is better prepared to capture all the sources of variation than the model based on non-summary metrics, which is very sensitive to the estimation of the model for Y. Finally, we investigated the most common errors made when selecting robust conditions from graphical displays.
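The summary-metric route compared here (compute a dispersion measure such as log s(Y) over the noise runs at each control setting, then minimize it) can be sketched as follows. The product array and responses are entirely hypothetical.

```python
import numpy as np

# Hypothetical product array: 4 control settings x 3 noise conditions.
# Rows are control runs; columns are responses under the noise levels.
y = np.array([
    [10.1, 10.3, 10.2],   # control setting A
    [12.0,  9.5, 14.4],   # control setting B
    [10.8, 11.2, 11.0],   # control setting C
    [ 8.0, 12.5, 10.1],   # control setting D
])

# Summary-metric route: compute log s(Y) per control run and pick the
# setting that minimizes dispersion across the noise conditions.
log_s = np.log(y.std(axis=1, ddof=1))
robust = int(np.argmin(log_s))
print(f"most robust control setting: {'ABCD'[robust]}")
```

The alternative, non-summary route would instead fit a model for Y with explicit control-by-noise interaction terms; the thesis's point is that the two routes can disagree, and that the summary-metric model is the more forgiving of the two.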
