301 |
Multiple imputation in the presence of a detection limit, with applications : an empirical approach / Shawn Carl Liebenberg
Liebenberg, Shawn Carl January 2014 (has links)
Scientists often encounter unobserved or missing measurements that are typically reported as less than a fixed detection limit. This occurs especially in the environmental sciences, where detection of low exposures is not possible due to limitations of the measuring instrument, and the resulting data are often referred to as type I and type II left-censored data. Observations lying below the detection limit are therefore often ignored or 'guessed', because they cannot be measured accurately. However, reliable estimates of the population parameters are nevertheless required to perform statistical analysis. The problem of dealing with values below a detection limit becomes increasingly complex when a large number of observations lie below this limit. Researchers are therefore interested in developing statistically robust estimation procedures for dealing with left- or right-censored data sets (Singh and Nocerino 2002). This study focuses on several main components of the problems mentioned above. The imputation of censored data below a fixed detection limit is studied, particularly using the maximum likelihood procedure of Cohen (1959) and several variants thereof, in combination with four new variations of the multiple imputation concept found in the literature. Furthermore, the focus also falls strongly on estimating the density of the resulting imputed, 'complete' data set by applying various kernel density estimators. It should be noted that bandwidth selection issues are not of importance in this study and are left for further research. The maximum likelihood estimation method of Cohen (1959) is compared with several variant methods to establish which of these maximum likelihood estimation procedures for censored data estimates the population parameters of three chosen lognormal distributions most reliably, in terms of well-known discrepancy measures. These methods are implemented in combination with the four new multiple imputation procedures, respectively, to assess which of these nonparametric methods is most effective at imputing the 12 censored values below the detection limit with regard to the global discrepancy measures mentioned above. Several variations of the Parzen-Rosenblatt kernel density estimate are fitted to the complete, filled-in data sets obtained from the previous methods, to establish which is the preferred data-driven method for estimating these densities. The primary focus of the study is therefore the performance of the four chosen multiple imputation methods, as well as the recommendation of methods and procedural combinations for dealing with data in the presence of a detection limit. An extensive Monte Carlo simulation study was performed to compare the various methods and procedural combinations, and conclusions and recommendations regarding the best of these are made based on the study's results. / MSc (Statistics), North-West University, Potchefstroom Campus, 2014
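As a rough illustration of the censored-likelihood step this abstract describes (a sketch, not the author's or Cohen's exact implementation), the following Python snippet fits a lognormal to left-censored data by maximum likelihood and then imputes the censored values from the fitted distribution truncated below the detection limit. The sample data, detection limit, and optimizer choice are all illustrative assumptions.

```python
import numpy as np
from scipy import optimize, stats

def fit_censored_lognormal(obs, n_cens, dl):
    """ML fit of a lognormal to left-censored data: `obs` are detected
    values (>= dl) and `n_cens` observations fell below the detection
    limit `dl`.  Works on the log scale, where the data are normal and
    the censoring point is log(dl)."""
    x, c = np.log(obs), np.log(dl)

    def neg_loglik(theta):
        mu, sigma = theta
        if sigma <= 0:
            return np.inf
        # censored contribution: n_cens * log P(X < dl)
        ll = n_cens * stats.norm.logcdf((c - mu) / sigma)
        ll += np.sum(stats.norm.logpdf(x, mu, sigma))
        return -ll

    res = optimize.minimize(neg_loglik, x0=[x.mean(), x.std()],
                            method="Nelder-Mead")
    return res.x  # (mu, sigma) on the log scale

def impute_below_dl(mu, sigma, n_cens, dl, rng):
    """Draw imputations from the fitted lognormal truncated to (0, dl)
    by inverse-CDF sampling of uniforms below F(dl)."""
    p_dl = stats.norm.cdf((np.log(dl) - mu) / sigma)
    u = rng.uniform(0, p_dl, size=n_cens)
    return np.exp(mu + sigma * stats.norm.ppf(u))

rng = np.random.default_rng(1)
true = rng.lognormal(mean=0.0, sigma=1.0, size=200)
dl = 0.5                                  # illustrative detection limit
obs, n_cens = true[true >= dl], int(np.sum(true < dl))
mu, sigma = fit_censored_lognormal(obs, n_cens, dl)
imputed = impute_below_dl(mu, sigma, n_cens, dl, rng)
complete = np.concatenate([obs, imputed])  # one "filled-in" data set
```

Repeating the draw several times yields the multiple imputations on which density estimation can then be performed.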
|
303 |
Methodology for converting geometric tolerance specifications into uncertainty zones
Yettou, Abdelhalim January 2011 (has links)
This study aims to convert geometric tolerance specifications into uncertainty zones. To this end, the tolerance zones were first represented, taking into account all the different types of tolerancing that can be imposed on them. These tolerance zones were then expressed in the form of small-displacement torsors. Next, the elements of these torsors were bounded so as to delimit the tolerance zone. At this stage, the equations governing the conversion of geometric specifications into uncertainty zones are established. This procedure was carried out for each type of tolerance zone, namely a disc, a hole, the offset of a line, the offset of a plane wall, a cylinder, a ring, a parallelepiped, and a slot. The development was also carried out for both research schools, namely the deterministic and the statistical. Finally, the maximum material principle was introduced for all the equations.
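As an illustration of the small-displacement-torsor bounding described above (a sketch under assumed conventions, not the thesis's formulation), the Python snippet below computes the displacement of feature points under a torsor (translation eps, rotation rho) and checks that a plane face stays inside a two-sided planar tolerance zone of width t. The face geometry and torsor values are hypothetical.

```python
import numpy as np

def point_displacement(eps, rho, a, m):
    """Small-displacement torsor reduced at point a: displacement of
    point m is eps + rho x (m - a), valid for small rotations."""
    return np.asarray(eps, float) + np.cross(rho, np.asarray(m, float) - np.asarray(a, float))

def within_planar_zone(eps, rho, corners, t, normal=(0, 0, 1)):
    """Check that every corner of a plane face stays inside a planar
    tolerance zone of width t (+/- t/2 along the face normal)."""
    n = np.asarray(normal, float)
    a = np.mean(corners, axis=0)          # reduce the torsor at the face centre
    gaps = [point_displacement(eps, rho, a, m) @ n for m in corners]
    return all(abs(g) <= t / 2 for g in gaps)

# 100 mm x 60 mm face, tolerance zone of width t = 0.2 mm
corners = [(0, 0, 0), (100, 0, 0), (100, 60, 0), (0, 60, 0)]
ok = within_planar_zone(eps=(0, 0, 0.05),          # small normal offset (mm)
                        rho=(0.0005, -0.0005, 0),  # small rotations (rad)
                        corners=corners, t=0.2)
print(ok)  # True: this torsor keeps the face inside the zone
```

Bounding each torsor component so that this condition holds for all points of the feature is exactly the kind of constraint from which the conversion equations are derived.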
|
304 |
Metrics and Test Procedures for Data Quality Estimation in the Aeronautical Telemetry Channel
Hill, Terry 10 1900 (has links)
ITC/USA 2015 Conference Proceedings / The Fifty-First Annual International Telemetering Conference and Technical Exhibition / October 26-29, 2015 / Bally's Hotel & Convention Center, Las Vegas, NV / There is great potential in using Best Source Selectors (BSS) to improve link availability in aeronautical telemetry applications. While the general notion that diverse data sources can be used to construct a consolidated stream of "better" data is well founded, there is no standardized means of determining the quality of the data streams being merged together. Absent this uniform quality data, the BSS has no analytically sound way of knowing which streams are better, or best. This problem is further exacerbated when one imagines that multiple vendors are developing data quality estimation schemes, with no standard definition of how to measure data quality. In this paper, we present measured performance for a specific Data Quality Metric (DQM) implementation, demonstrating that the signals present in the demodulator can be used to quickly and accurately measure the data quality, and we propose test methods for calibrating DQM over a wide variety of channel impairments. We also propose an efficient means of encapsulating this DQM information with the data, to simplify processing by the BSS. This work leads toward a potential standardization that would allow data quality estimators and best source selectors from multiple vendors to interoperate.
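To make the encapsulation idea concrete, here is a deliberately simplified Python sketch. It is not the standard proposed in the paper: the 16-bit quality-word format and the Eb/N0-based BER proxy are invented for illustration. Each data block carries a quantized estimate of its own bit error rate, and a best-source selector picks the block claiming the lowest BER.

```python
import math
import struct

def ber_from_ebn0_db(ebn0_db):
    """Rough BER proxy for a coherent binary channel,
    BER ~= Q(sqrt(2 Eb/N0)) = 0.5 * erfc(sqrt(Eb/N0)).
    A real DQM would be derived from demodulator-internal statistics."""
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebn0))

def encapsulate(payload: bytes, ebn0_db: float) -> bytes:
    """Prepend a hypothetical 16-bit quality word: -log10 of the
    estimated BER, quantized in steps of 1/256 decade."""
    ber = max(ber_from_ebn0_db(ebn0_db), 1e-12)
    q = min(int(-math.log10(ber) * 256), 0xFFFF)
    return struct.pack(">H", q) + payload

def select_best(blocks):
    """Toy BSS rule: choose the block whose header claims the lowest BER
    (i.e. the largest quality word)."""
    return max(blocks, key=lambda b: struct.unpack(">H", b[:2])[0])

streams = [encapsulate(b"source-A data", 8.0),
           encapsulate(b"source-B data", 12.7)]
print(select_best(streams)[2:])  # b'source-B data'
```

The point of a standardized quality word, as the paper argues, is precisely that `select_best` could then work across estimators from different vendors.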
|
305 |
NON-COHERENTLY DETECTED FQPSK: RAPID SYNCHRONIZATION AND COMPATIBILITY WITH PCM/FM RECEIVERS
Park, Hyung Chul, Lee, Kwyro, Feher, Kamilo 10 1900 (has links)
International Telemetering Conference Proceedings / October 22-25, 2001 / Riviera Hotel and Convention Center, Las Vegas, Nevada / A new class of non-coherent detection techniques for the recently standardized Feher-patented quadrature phase-shift keying (FQPSK) systems is proposed and studied by computer-aided design/simulation and verified by experimental hardware measurements. The theoretical concepts of the described non-coherent techniques are based on an interpretation of the instantaneous frequency deviation or phase-transition characteristics of the FQPSK-B modulated signal at the front end of the receiver. These are accomplished either by a Limiter-Discriminator (LD) or by a Limiter-Discriminator followed by Integrate-and-Dump (LD I&D). It is shown that significant BER performance improvements can be obtained by increasing the received signal's observation time over multiple symbols as well as by adopting trellis demodulation. For example, our simulation results show that a BER of 10^-4 can be obtained at an Eb/N0 of 12.7 dB.
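A textbook-style sketch of the LD I&D front end described in this abstract (not the authors' implementation, and omitting their multi-symbol observation and trellis demodulation): the limiter normalizes the complex baseband samples, the discriminator takes instantaneous frequency as successive phase differences, and the integrate-and-dump stage sums them over each symbol.

```python
import numpy as np

def ld_id_demod(iq, sps):
    """Limiter-discriminator + integrate-and-dump (LD I&D) sketch.
    iq : complex baseband samples of a CPM-like (e.g. FQPSK-B) signal
    sps: samples per symbol
    Returns the accumulated phase transition per symbol, which for
    FQPSK-type signaling carries the data (steps near +/- pi/2 or 0)."""
    u = iq / np.abs(iq)                              # hard limiter
    inst_freq = np.angle(u[1:] * np.conj(u[:-1]))    # discriminator
    n_sym = inst_freq.size // sps
    return inst_freq[:n_sym * sps].reshape(n_sym, sps).sum(axis=1)  # I&D

# usage on a synthetic constant-envelope signal (illustrative only)
sps, phase_steps = 8, np.array([np.pi/2, -np.pi/2, np.pi/2, np.pi/2])
phase = np.concatenate([[0.0], np.repeat(phase_steps / sps, sps)]).cumsum()
print(np.round(ld_id_demod(np.exp(1j * phase), sps) / (np.pi/2)))
```

Because this chain never needs a recovered carrier, it supports the rapid synchronization and PCM/FM-receiver compatibility the paper emphasizes.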
|
306 |
Asian monsoon over mainland Southeast Asia in the past 25 000 years
Chabangborn, Akkaneewut January 2014 (has links)
The objective of this research is to interpret high-resolution palaeo-proxy data sets in order to understand past Asian summer monsoon variability. This was done by synthesizing published palaeo-records from the Asian monsoon region, comparing them with model simulations, and analysing new lake sedimentary records from northeast Thailand. Palaeo-records and climate modelling indicate a strengthened summer monsoon over Mainland Southeast Asia during the Last Glacial Maximum (LGM), compared to dry conditions in other parts of the Asian monsoon region. This can be explained by the LGM sea-level lowstand, which exposed Sundaland and created a large land-sea thermal contrast. Sea-level rise ~19 600 years before present (BP) reorganized the atmospheric circulation in the Pacific Ocean and weakened the summer monsoon between 20 000 and 19 000 years BP. Both the Mainland Southeast Asia and the East Asian monsoon hydroclimatic records point to an earlier Holocene onset of a strengthened summer monsoon, compared to the Indian Ocean monsoon. The asynchronous evolution of the summer monsoon and a time lag of 1500 years between the East Asian and the Indian Ocean monsoon can be explained by the palaeogeography of Mainland Southeast Asia, which acted as a land bridge for the movement of the Intertropical Convergence Zone. The palaeo-proxy records from Lake Kumphawapi compare well to the other data sets and suggest a strengthened summer monsoon between 10 000 and 7000 years BP and a weakening of the summer monsoon thereafter. The data from Lake Pa Kho provide a picture of summer monsoon variability over the past 2000 years. A strengthened summer monsoon prevailed during 170 BC-AD 370 and AD 800-960 and since AD 1450, and the monsoon was weaker during AD 370-800 and AD 1300-1450. Movement of the mean position of the Intertropical Convergence Zone explains shifts in summer monsoon intensity, but the weakening of the summer monsoon between AD 960 and 1450 could also have been affected by changes in the Walker circulation. / At the time of the doctoral defense, the following paper was unpublished: Paper 4: Manuscript.
|
307 |
Space trajectory optimization. Flight of a launcher's final stage - Removal of space debris.
Cerf, Max 28 September 2012 (PDF)
This work addresses two space trajectory optimization problems: the flight of a launcher's final stage, and the removal of space debris. The objective is to develop, for these two problems, solution methods and software usable in an industrial context. The work comprises a theoretical part on problem formulation and an applied part on numerical solution. The fields covered are space mechanics, discrete optimization, finite-dimensional continuous optimization, and optimal control.
|
308 |
Numerical solution of optimal control problems with linear dynamics and a quadratic criterion
Durand, André 25 January 1968 (PDF)
|
309 |
Alien plants and their invasion of the forested landscape of the southeastern United States
Lemke, Dawn January 2012 (links)
In this thesis, I have assessed and modelled the invasion of alien plant species in the forests of the southeastern United States. There are over 380 recognized invasive plants in southeastern forests and grasslands, 53 of which are ranked as high-to-medium risk to natural communities. I have focused on ten of these: Chinese lespedeza, tall fescue, Japanese honeysuckle, Chinese privet, autumn olive, princesstree, silktree, chinaberry, tree of heaven, and tallowtree. I assessed them at differing scales, locally (Chapters 2 and 3), eco-regionally (Chapters 4 and 5), and regionally (Chapters 6 and 7), using field-based measurements integrated with remotely sensed and digital datasets, and applying both parametric and non-parametric modelling approaches. Data from field-based measurements as well as digitally available sources were evaluated, bringing together freely available data with time-consuming, intensively collected data. Once the models were developed, they were applied to assess long-term impacts by integrating potential climate-change scenarios.
At the local level, Chinese lespedeza and Japanese honeysuckle were the most prevalent, and the local-level models were dominated by remotely sensed variables. At the eco-regional level, Japanese honeysuckle was the most prevalent, with models primarily dominated by environmental variables. At the regional level, where only trees were assessed, potential distributions of the invasive species ranged from 12 to 33 percent of the southeastern forests under current conditions, increasing dramatically for chinaberry and tallowtree under most climate-change scenarios, to as much as 66 percent of southeastern forest sites.
In this thesis, information on anthropogenic factors added some value to the models; however, it was rarely dominant. Roads and land use (proportion of forest or distance to forest) were the most useful anthropogenic variables. Across all models evaluated, only six times did any one anthropogenic variable represent more than 25 percent of a model, and four of these were at the local scale. At the regional and eco-regional levels, roads contributed more than 25 percent to the silktree models; at the local level, distance to forest and distance to roads contributed more than 25 percent for three of the species evaluated: sawtooth oak, Japanese honeysuckle, and privet. Human activities have the most influence on invasion progression through dispersal (movement and introduction rate) and disturbance of the landscape (increased resource availability). Anthropogenic variables such as roads are likely to be a mechanism of spread; thus the more a model is driven by anthropogenic variables, the more likely the invasive plant is to be in the early stages of the invasion process. Our results therefore suggest that many of these species have moved through the first stages of invasion.
Environmental characteristics play an important role in determining a site's vulnerability to invasion. At the eco-regional and regional scales, environmental characteristics dominated (>50%) all but one model (silktree at the regional scale). At the eco-regional level elevation was the dominant variable, and at the regional level minimum temperature was the dominant variable. The two are somewhat correlated, with higher elevation often relating to lower temperatures, particularly at smaller scales. This confirms the validity of matching the climate ranges of native species with the range of potential invasion, and of the approach of integrating elevation, latitude, and longitude to estimate potential distribution. It also suggests that climate change will influence the distributions and that variation in climate should be integrated into the models.
Two different modelling approaches, logistic regression and maximum entropy (MaxEnt), were used throughout my thesis and applied to the same data; a sketch of the workflow follows below. Agreement between different modelling types adds strength to conclusions, while disagreement can prompt further questions. The inclusion in the models of similar variables with the same direction of relationship gives confidence to any inference about the importance of those variables, and geographical agreement between the models adds confidence to the predicted probability of occurrence in an area. Alternatively, using the same model with different datasets can provide similar information. Overall, across all models created by both approaches, logistic regression had slightly better omission rates and MaxEnt had better AUCs. The logistic regression models also often predicted larger geographical areas of occurrence when the threshold of maximum sensitivity plus specificity was used; the lower omission rates are thus related to a less stringent model that predicts a larger area. The selection of appropriate data to answer the question was shown to be fundamental in Chapter 7: when data from outside the area of interest were used, the models became more general and the predicted potential for invasion increased significantly. There was more value in the intensively surveyed data, but this effect was less dramatic than that of using the defined areas of interest to select the data for the models.
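As a concrete illustration of the logistic-regression half of that workflow and of the maximum sensitivity-plus-specificity threshold (a sketch on synthetic data; the predictors, sample sizes, and coefficients are invented, not the thesis's field data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
# Toy stand-in for plot data: columns = predictors (e.g. elevation,
# minimum temperature, distance to road); y = presence/absence.
X = rng.normal(size=(500, 3))
p_true = 1 / (1 + np.exp(-(1.5 * X[:, 0] - X[:, 2])))
y = (rng.random(500) < p_true).astype(int)

model = LogisticRegression().fit(X, y)
p = model.predict_proba(X)[:, 1]

auc = roc_auc_score(y, p)
fpr, tpr, thr = roc_curve(y, p)
t_star = thr[np.argmax(tpr - fpr)]       # maximum sensitivity + specificity
predicted_presence = p >= t_star         # mapped "invaded" area
omission = np.mean(p[y == 1] < t_star)   # omission rate at this threshold
print(f"AUC={auc:.2f}, threshold={t_star:.2f}, omission={omission:.2f}")
```

A lower threshold maps a larger presence area and hence a lower omission rate, which is the trade-off noted above between the two modelling approaches.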
|
310 |
ModPET: Novel Applications of Scintillation Cameras to Preclinical PET
Moore, Stephen K. January 2011 (links)
We have designed, developed, and assessed a novel preclinical positron emission tomography (PET) imaging system named ModPET. The system was developed using modular gamma cameras, originally developed for SPECT applications at the Center for Gamma Ray Imaging (CGRI), but configured for PET imaging by enabling coincidence timing. A pair of cameras is mounted on a flexible system gantry that also allows for acquisition of optical images, such that PET images can be registered to an anatomical reference. Data are acquired in a super-list-mode form in which raw PMT signals and event times are accumulated in event lists for each camera. Event parameter estimation of position and energy is carried out with maximum-likelihood methods using careful camera calibrations, accomplished with collimated beams of 511-keV photons and a new iterative mean-detector-response-function processing routine. Intrinsic lateral spatial resolution for 511-keV photons was found to be approximately 1.6 mm in each direction. Lists of coincidence pairs are found by comparing event times in the two independent camera lists; a timing window of 30 nanoseconds is used. By bringing the 4.5-inch-square cameras into close proximity, with a 32-mm separation for mouse imaging, a solid-angle coverage of ~75% partially compensates for the relatively low stopping power of the 5-mm-thick NaI crystals, giving a measured sensitivity of up to 0.7%. An NECR analysis yields 11,000 pairs per second with 84 μCi of activity. A list-mode MLEM reconstruction algorithm was developed to reconstruct objects in an 88 x 88 x 30 mm field of view. Tomographic resolution tests with a phantom suggest a lateral resolution of 1.5 mm and a slightly degraded resolution of 2.5 mm in the direction normal to the camera faces. The system can also be configured to provide (99m)Tc planar scintigraphy images. Selected biological studies of inflammation, apoptosis, tumor metabolism, and bone osteogenic activity are presented.
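As an illustration of the coincidence-pairing step described in this abstract (a sketch, not the ModPET code; the actual system first estimates event position and energy by maximum likelihood), the following Python snippet pairs time-sorted events from two camera lists using the 30-ns window:

```python
import numpy as np

def find_coincidences(t0, t1, window=30e-9):
    """Pair events from two independent camera event lists whose
    timestamps (seconds, each list sorted ascending) differ by less
    than `window`, using a two-pointer sweep; returns index pairs."""
    pairs, i, j = [], 0, 0
    while i < len(t0) and j < len(t1):
        dt = t0[i] - t1[j]
        if abs(dt) <= window:
            pairs.append((i, j))   # coincident pair found
            i += 1
            j += 1
        elif dt > 0:
            j += 1                 # camera-1 event too early; advance it
        else:
            i += 1                 # camera-0 event too early; advance it
    return pairs

# toy event lists: two true coincidences plus unmatched singles
t0 = np.array([1.000e-3, 2.000e-3, 3.000e-3])
t1 = np.array([1.000e-3 + 10e-9, 2.500e-3, 3.000e-3 - 20e-9])
print(find_coincidences(t0, t1))   # [(0, 0), (2, 2)]
```

The greedy two-pointer sweep is linear in the list lengths, which matters for the super-list-mode data rates quoted above.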
|