  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
501

Choice set formulation for discrete choice models

Pitschke, Steven B. January 1980
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Civil Engineering, 1980. / MICROFICHE COPY AVAILABLE IN ARCHIVES AND ENGINEERING. / Bibliography: leaves 100-102. / by Steven B. Pitschke. / M.S.
502

Distributions of some random volumes and their connection to multivariate analysis

Jairu, Desiderio N. January 1987
No description available.
503

Decision Theory Classification Of High-dimensional Vectors Based On Small Samples

Bradshaw, David 01 January 2005
In this paper, we review existing classification techniques and suggest an entirely new procedure for the classification of high-dimensional vectors on the basis of a few training samples. The proposed method is based on the Bayesian paradigm and provides posterior probabilities that a new vector belongs to each of the classes; it therefore adapts naturally to any number of classes. Our classification technique is based on a small vector which is related to the projection of the observation onto the space spanned by the training samples. This is achieved by employing matrix-variate distributions in classification, which is an entirely new idea. In addition, our method mimics time-tested classification techniques based on the assumption of normally distributed samples. By assuming that the samples have a matrix-variate normal distribution, we are able to replace classification on the basis of a large covariance matrix with classification on the basis of a smaller matrix that describes the relationship of sample vectors to each other.
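The normal-theory baseline that this abstract says its matrix-variate method mimics can be sketched as a small Bayesian posterior computation. This is an illustrative sketch only: the spherical-Gaussian class model, means, variance, and test point below are assumptions for demonstration, not the thesis's actual matrix-variate procedure.

```python
import numpy as np

def posterior_probs(x, class_means, sigma2=1.0, priors=None):
    """Posterior class probabilities for a new vector x under
    spherical Gaussian class-conditional densities -- a simplified
    stand-in for the matrix-variate model the abstract describes."""
    means = np.asarray(class_means, dtype=float)
    k = len(means)
    priors = np.full(k, 1.0 / k) if priors is None else np.asarray(priors)
    # Log-likelihood of x under each class, up to a shared constant.
    log_lik = -np.sum((x - means) ** 2, axis=1) / (2.0 * sigma2)
    log_post = np.log(priors) + log_lik
    log_post -= log_post.max()        # numerical stability before exp
    post = np.exp(log_post)
    return post / post.sum()          # normalized posterior over classes

# A point much closer to the first class mean gets nearly all the mass.
p = posterior_probs(np.array([0.1, 0.0]),
                    [[0.0, 0.0], [3.0, 3.0]])
```

Because the output is a full posterior rather than a hard label, extending it to any number of classes just means adding rows to `class_means`, which matches the adaptability the abstract claims for its approach.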
504

Möbius operators and non-additive quantum probabilities in the Birkhoff-von Neumann lattice.

Vourdas, Apostolos 08 December 2015
The properties of quantum probabilities are linked to the geometry of quantum mechanics, described by the Birkhoff-von Neumann lattice. Quantum probabilities violate the additivity property of Kolmogorov probabilities, and they are interpreted as Dempster-Shafer probabilities. Deviations from the additivity property are quantified with the Möbius (or non-additivity) operators, which are defined through Möbius transforms and which are shown to be intimately related to commutators. The lack of distributivity in the Birkhoff-von Neumann lattice Λd causes deviations from the law of total probability (which is central in Kolmogorov's probability theory). Projectors which quantify the lack of distributivity in Λd, and also deviations from the law of total probability, are introduced. All these operators are observables and can be measured experimentally. Constraints for the Möbius operators, which are based on the properties of the Birkhoff-von Neumann lattice (which in the case of finite quantum systems is a modular lattice), are derived. Application of this formalism in the context of coherent states generalizes coherence to multi-dimensional structures.
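The additivity violation this abstract describes can be illustrated with a minimal qubit example: for two non-commuting one-dimensional projectors, the inclusion-exclusion identity q(P1 ∨ P2) = q(P1) + q(P2) − q(P1 ∧ P2) fails. The state and projectors below are hypothetical choices for illustration, not taken from the paper.

```python
import numpy as np

# State: rho = |0><0| on a single qubit.
rho = np.array([[1.0, 0.0], [0.0, 0.0]])

def q(P):
    """Quantum probability of the projector P in state rho: Tr(rho P)."""
    return float(np.trace(rho @ P).real)

# Projectors onto two non-commuting one-dimensional subspaces.
P1 = np.array([[1.0, 0.0], [0.0, 0.0]])    # span{|0>}
v = np.array([1.0, 1.0]) / np.sqrt(2.0)
P2 = np.outer(v, v)                        # span{(|0> + |1>)/sqrt(2)}

# In the Birkhoff-von Neumann lattice, the meet of these subspaces is
# {0} (projector 0) and their join spans the whole plane (identity).
P_meet = np.zeros((2, 2))
P_join = np.eye(2)

# Kolmogorov additivity q(join) = q(P1) + q(P2) - q(meet) fails:
lhs = q(P_join)                    # 1.0
rhs = q(P1) + q(P2) - q(P_meet)    # 1.5
```

The nonzero difference between the two sides is exactly the kind of deviation that the Möbius (non-additivity) operators of the paper are built to quantify.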
505

An Analysis of Equally Weighted and Inverse Probability Weighted Observations in the Expanded Program on Immunization (EPI) Sampling Method

Reyes, Maria 11 1900
Performing health surveys in developing countries and humanitarian emergencies can be challenging work because the resources in these settings are often quite limited and information needs to be gathered quickly. The Expanded Program on Immunization (EPI) sampling method provides one way of selecting subjects for a survey. It involves having field workers proceed on a random walk guided by a path of nearest household neighbours until they have met their quota for interviews. Due to its simplicity, the EPI sampling method has been utilized by many surveys. However, some concerns have been raised over the quality of estimates resulting from such samples because of possible selection bias inherent to the sampling procedure. We present an algorithm for obtaining the probability of selecting a household from a cluster under several variations of the EPI sampling plan. These probabilities are used to assess the sampling plans and compute estimator properties. In addition to the typical estimator for a proportion, we also investigate the Horvitz-Thompson (HT) estimator, an estimator that assigns weights to individual responses. We conduct our study on computer-generated populations having different settlement types, different prevalence rates for the characteristic of interest and different spatial distributions of the characteristic of interest. Our results indicate that within a cluster, selection probabilities can vary greatly from household to household. The largest probability was over 10 times greater than the smallest probability in 78% of the scenarios that were tested. Despite this, the properties of the estimator with equally weighted observations (EQW) were similar to what would be expected from simple random sampling (SRS) given that cases of the characteristic of interest were evenly distributed throughout the cluster area. When this was not true, we found absolute biases as large as 0.20.
While the HT estimator was always unbiased, the trade-off was a substantial increase in the variability of the estimator, where the design effect relative to SRS reached a high of 92. Overall, the HT estimator did not perform better than the EQW estimator under EPI sampling, and it involves calculations that may be difficult to do for actual surveys. Although we recommend continuing to use the EQW estimator, caution should be taken when cases of the characteristic of interest are potentially concentrated in certain regions of the cluster. In these situations, alternative sampling methods should be sought. / Thesis / Master of Science (MSc)
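The two estimators this abstract compares can be sketched side by side: the Horvitz-Thompson estimator of a proportion weights each response by the inverse of its selection probability, while the EQW estimator is the plain sample mean. The cluster size, responses, and selection probabilities below are hypothetical numbers chosen to show the contrast, not values from the thesis.

```python
import numpy as np

def ht_proportion(y, pi, N):
    """Horvitz-Thompson estimate of a population proportion: weight
    each sampled response by 1/pi (its selection probability), then
    divide the weighted total by the population size N."""
    y = np.asarray(y, dtype=float)
    pi = np.asarray(pi, dtype=float)
    return float(np.sum(y / pi) / N)

def eqw_proportion(y):
    """Equally weighted (EQW) estimate: the plain sample mean."""
    return float(np.mean(np.asarray(y, dtype=float)))

# Hypothetical cluster of N = 10 households. Households with the
# characteristic (y = 1) were twice as likely to be selected, so the
# EQW estimate overstates prevalence while HT corrects for it.
y = [1, 1, 0, 0, 0]
pi = [0.8, 0.8, 0.4, 0.4, 0.4]
ht = ht_proportion(y, pi, N=10)    # 0.25
eqw = eqw_proportion(y)            # 0.4
```

The correction comes at the cost the abstract notes: the inverse-probability weights `1/pi` inflate the estimator's variance, and in real EPI surveys the probabilities themselves are hard to compute.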
506

Characterization of isomeric states in neutron-rich nuclei approaching N = 28

Ogunbeku, Timilehin Hezekiah 08 December 2023
The investigation of isomeric states in neutron-rich nuclei provides useful insights into the underlying nuclear configurations, and understanding their occurrence along an isotopic chain can inform about shell evolution. Recent studies on neutron-rich Si isotopes near the magic number N = 20 and approaching N = 28 have revealed the presence of low-lying states with intruder configurations, resulting from multiple-particle, multiple-hole excitations across closed shell gaps. The characterization of these states involves measuring their half-lives and transition probabilities. In this study, a new low-energy (7/2−1) isomer at 68 keV in 37Si was accessed via beta decay and characterized. To achieve this, radioactive 37Al and 38Al ions were produced through the projectile fragmentation reaction of a 48Ca beam and implanted into a CeBr3 detector, leading to the population of states in 37Si. The 68-keV isomer was directly populated in the beta-delayed one neutron emission decay of implanted 38Al ions. Ancillary detector arrays comprising HPGe and LaBr3(Ce) detectors were employed for the detection of beta-delayed gamma rays. The choice of detectors was driven by their excellent energy and timing resolutions, respectively. The beta-gamma timing method was utilized to measure the half-life of the new isomeric state in 37Si. This dissertation also discusses other timing techniques employed to search for and characterize isomeric states following beta decay of implanted ions. Notably, the half-life of the newly observed (7/2−1) isomeric state in 37Si was measured to be 9.1(7) ns. The half-life of the previously observed closely-lying (3/2−1) state at 156 keV was determined to be 3.20(4) ns, consistent with previously reported values. Reduced ground-state transition probabilities associated with the gamma-ray decay from these excited states were in agreement with results obtained from shell model calculations. 
In addition to the investigation of isomeric states in 37Si, isomeric 0+ states in 34Si and 32Mg nuclei belonging to the N = 20 “island of inversion” were characterized and searched for, respectively. The isomeric 0+ state in 34Si was populated following the beta decay of implanted 34Mg ions and its 34Al daughter nucleus. Similarly, the 0+ state in 32Mg was searched for via the beta-delayed one neutron emission decay of implanted 33Na ions.
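The half-life measurements described above rest on the exponential-decay law: for decay times drawn from an exponential distribution, the maximum-likelihood lifetime is the sample mean, and the half-life is ln(2) times that. The sketch below is a toy simulation of this estimate only, not the beta-gamma timing analysis the dissertation actually performs; the sample size and seed are arbitrary choices.

```python
import math
import random

def half_life_mle(times):
    """MLE half-life from exponential decay times: the mean decay
    time estimates the lifetime tau, and T_1/2 = ln(2) * tau."""
    tau = sum(times) / len(times)
    return math.log(2.0) * tau

# Simulate decay times for a state with true T_1/2 = 9.1 ns
# (the value reported for the new isomer in 37Si).
random.seed(0)
true_half_life = 9.1
tau = true_half_life / math.log(2.0)
times = [random.expovariate(1.0 / tau) for _ in range(20000)]
est = half_life_mle(times)
```

With tens of thousands of simulated decays the estimate converges on the input half-life; real measurements must additionally unfold detector timing resolution, which is why the excellent timing of the LaBr3(Ce) detectors matters.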
507

"It's Not Probabilities, It's Possibilities": Lay Views of Disclosure Regarding Emerging Health Issues

Moreau, Geneviève 08 1900
Products and technologies provide us with significant lifestyle benefits, but they can also evolve into hazards and bring about concern for human health. A history of poor regulatory performance has resulted in a public displeased with and skeptical of the actors responsible for protecting the public against the unintended effects of progress. It is within this historical and social context that the study explores the following objectives: to understand people's responses to emerging health issues, of which there is considerable knowledge uncertainty and little public awareness; to identify the information needs regarding these issues; and to explore the role of government disclosure for personal decision-making around these issues. Seven focus groups were conducted in Hamilton, Ontario with community members from a range of backgrounds: youth, faith, allophone immigrants, environmental, health, recreational, and mixed. Two scenarios about potential hazards, i.e. a persistent pollutant and extreme heatwaves from climate change, were used to generate discussion about people's experiences with risk and knowledge. Results indicate that emerging health issues are framed by lay individuals as a chronic societal phenomenon. Their concerns about health and well-being, resiliency, and issue comprehension point to an overarching preoccupation about social vulnerability, irrespective of the presence of confirmed hazards. The analysis further revealed several roles for disclosure, which would allow for greater capacity in personal decision-making and more transparent and accountable regulatory processes, and which could lead to more trustworthy relations between citizens and government. / Thesis / Master of Arts (MA)
508

Experimental Knowledge in Cognitive Neuroscience: Evidence, Errors, and Inference

Aktunc, Mahir Emrah 06 September 2011
This is a work in the epistemology of functional neuroimaging (fNI), and it applies the error-statistical (ES) philosophy to formulate and address inferential problems in fNI. This gives us a clear, accurate, and more complete understanding of what we can learn from fNI and how we can learn it. I review the works in the epistemology of fNI, which I group into two categories: the first category consists of discussions of the theoretical significance of fNI findings, and the second category discusses methodological difficulties of fNI. Both types of works have shortcomings; the first category has been too theory-centered in its approach, and the second category has implicitly or explicitly adopted the assumption that methodological difficulties of fNI cannot be satisfactorily addressed. In this dissertation, I address these shortcomings and show how and what kind of experimental knowledge fNI can reliably produce which would be theoretically significant. I take fMRI as a representative fNI procedure and discuss the history of its development. Two independent trajectories of research in physics and physiology eventually converged to give rise to fMRI. Thus, fMRI findings are laden with the theories of physics and physiology, and I propose how this creates a kind of useful theory-ladenness which allows for the representation of and intervention in the constructs of cognitive neuroscience. Duhemian challenges and problems of underdetermination are often raised to argue that fNI is of little, if any, epistemic value for psychology. I show how the ES notions of severe tests and error probabilities can be applied in epistemological analyses of fMRI. The result is that hemodynamic hypotheses can be severely tested in fMRI experiments, and I demonstrate how these hypotheses are theoretically significant and fuel the growth of experimental knowledge in cognitive neuroscience.
Throughout this dissertation, I put the emphasis on the experimental knowledge we obtain from fNI and argue that this is the fruitful approach that enables us to see how fNI can contribute to psychology. In doing so, I offer an error-statistical epistemology of fNI, which hopefully will be a significant contribution to the philosophy of psychology. / Ph. D.
509

Joint probability distribution of rainfall intensity and duration

Patron, Glenda G. 23 June 2009
Intensity-duration-frequency (IDF) curves are widely used for peak discharge estimation in designing hydraulic structures. The traditional Gumbel probability method entails selecting annual maximum rainfall depths (intensities) conditioned on a fixed time window width (which in general will not coincide with the rainfall event duration) from a continuous record to perform a frequency analysis in terms of the marginal distribution. The digitized database contains annual maximum intensities for selected discrete durations. This method presents problems when intensities are required for arbitrary durations which are not part of the selected durations. Accurate interpolated and especially extrapolated intensity values are hard to obtain. The present study offers two methods, both involving a joint probability approach, to overcome the deficiencies inherent in the traditional method of IDF analysis. The first joint probability approach employs Box-Cox and modulus transformations to transform original data to near bivariate normality. The second method does not require such a transformation. Instead, it uses the closed-form bivariate Burr III cumulative distribution to fit the data. Another advantage of the joint probability approach is that it allows one to gauge the rarity of certain extreme events, such as probable maximum precipitation, in terms of the joint occurrence of its extremely high intensity and a sufficiently long duration (e.g. 24 hours). The joint probability approach is applied to three data sets. The resulting conditional probability intensity estimates are quite close to those obtained by traditional Gumbel IDF analysis. In addition, reliable interpolated and extrapolated intensities are available because the approach essentially fits a flexible surface to the discrete data with the capability of providing a complete probabilistic structure. / Master of Science
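The traditional Gumbel marginal analysis that this thesis uses as its baseline can be sketched in a few lines: fit a Gumbel distribution to the annual maxima for one fixed duration (here by the method of moments) and read off the intensity for a given return period. The annual-maximum values below are illustrative numbers, not data from the study.

```python
import math

EULER_GAMMA = 0.5772156649

def gumbel_mom(maxima):
    """Method-of-moments Gumbel fit to annual maxima: scale beta from
    the sample variance, location mu from the sample mean."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - EULER_GAMMA * beta
    return mu, beta

def return_level(mu, beta, T):
    """Intensity with return period T years, i.e. F(x) = 1 - 1/T
    inverted for the Gumbel CDF F(x) = exp(-exp(-(x - mu)/beta))."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Hypothetical annual-maximum 1-hour intensities (mm/h) for one station.
maxima = [22.0, 31.0, 27.0, 35.0, 25.0, 40.0, 29.0, 33.0]
mu, beta = gumbel_mom(maxima)
x100 = return_level(mu, beta, 100)    # 100-year intensity
```

Repeating this per fixed duration is exactly what leaves gaps at arbitrary durations; the thesis's joint intensity-duration distributions supply the surface that this per-duration fitting cannot.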
510

Predicting climate change impacts on precipitation for western North America

McKechnie, Nicole R., University of Lethbridge. Faculty of Arts and Science January 2005
Global Circulation Models (GCMs) are used to create projections of possible future climate characteristics under global climate change scenarios. Future local and regional precipitation scenarios can be developed by downscaling synoptic GCM data. Daily 500-mb geopotential heights from the Canadian Centre for Climate Modelling and Analysis's CGCM2 are used to represent future (2020-2050) synoptics and are compared to daily historical (1960-1990) 500-mb geopotential height reanalysis data. The comparisons are made based on manually classified synoptic patterns identified by Changnon et al. (1993, Mon. Weather Rev. 121:633-647). Multiple linear regression models are used to link the historical synoptic pattern frequencies and precipitation amounts for 372 weather stations across western North America. The station-specific models are then used to forecast future precipitation amounts per weather station based on synoptic pattern frequencies forecast by the CGCM2 climate change forcing scenario. Spatial and temporal variations in precipitation are explored to determine monthly, seasonal and annual trends in climate change impacts on precipitation in western North America. The resulting precipitation scenarios demonstrate a decrease in precipitation from 10 to 30% on an annual basis for much of the south and western regions of the study area. Seasonal forecasts show variations of the same regions with decreases in precipitation and select regions with increases in future precipitation. A major advancement of this analysis was the application of synoptic pattern downscaling to summer precipitation scenarios for western North America. / ix, 209 leaves : col. maps ; 29 cm.
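The station-specific downscaling step described above — regress historical precipitation on synoptic pattern frequencies, then apply the fitted model to GCM-projected frequencies — can be sketched as an ordinary least-squares problem. The pattern counts and precipitation values below are made-up illustrations, not the thesis's 372-station dataset.

```python
import numpy as np

# Hypothetical monthly counts of three synoptic patterns (columns)
# and observed precipitation (mm) at one station, historical period.
freq_hist = np.array([[10, 5, 2],
                      [ 8, 7, 1],
                      [12, 3, 4],
                      [ 9, 6, 3],
                      [11, 4, 2]], dtype=float)
precip_hist = np.array([60.0, 55.0, 70.0, 58.0, 64.0])

# Fit the station-specific multiple linear regression (with intercept).
X = np.column_stack([np.ones(len(freq_hist)), freq_hist])
coef, *_ = np.linalg.lstsq(X, precip_hist, rcond=None)

# Apply the same model to GCM-projected future pattern frequencies
# to obtain the downscaled future precipitation for this station.
freq_future = np.array([[13, 2, 3]], dtype=float)
Xf = np.column_stack([np.ones(1), freq_future])
precip_future = Xf @ coef
```

In the study this fit is repeated independently for each of the 372 stations, which is what lets the downscaled scenarios vary spatially even though they are driven by one set of CGCM2 pattern frequencies.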
