  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

A simulation study on quality assessment of the Normalized Site Attenuation (NSA) measurements for Open-Area Test Site using statistical models

Liang, Kai-Jie 15 July 2005 (has links)
Open-site measurement of electromagnetic interference (EMI) is the most direct and universally accepted standard approach for measuring radiated emissions from equipment or the radiation susceptibility of a component or piece of equipment. In general, if the NSA measurements recorded at different frequencies do not deviate from the ideal values by more than ±4 dB, we regard the site as a normalized site; otherwise it is not, since a single measurement outside that range disqualifies it. A one-change-point model has been used to fit the observed measurements. For each set of observations, as well as for the corresponding ideal values, we obtain the estimated regression parameters of the one-change-point model. Our idea is to use the difference between the regression parameters fitted to the ideal values and those fitted to the observations to assess whether a site is qualified for measuring EMI. The assessment of whether a testing site is normalized is based on the confidence region for the regression model parameters. Finally, using the data collected in this experiment, the parameters estimated from the observations are used for further statistical analyses and to compare the quality of the four different testing sites.
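As a rough illustration of the ±4 dB qualification rule and the one-change-point regression fit described in this abstract, the following Python sketch (an illustration only, not the thesis's own code; the toy data and variable names are invented) checks the deviation criterion and searches for a single breakpoint by least squares:

    import numpy as np

    def site_is_normalized(measured, ideal, tol_db=4.0):
        # The +/-4 dB rule from the abstract: every frequency point must lie
        # within tol_db of the corresponding ideal NSA value.
        return bool(np.all(np.abs(np.asarray(measured) - np.asarray(ideal)) <= tol_db))

    def fit_one_change_point(freq, nsa):
        # Two-segment linear regression: try every interior breakpoint and
        # keep the one with the smallest total sum of squared errors.
        freq, nsa = np.asarray(freq, float), np.asarray(nsa, float)
        best = None
        for k in range(3, len(freq) - 3):
            sse, coeffs = 0.0, []
            for f, y in ((freq[:k], nsa[:k]), (freq[k:], nsa[k:])):
                X = np.vstack([np.ones_like(f), f]).T
                b, res, *_ = np.linalg.lstsq(X, y, rcond=None)
                sse += res.sum() if res.size else 0.0
                coeffs.append(b)
            if best is None or sse < best[0]:
                best = (sse, k, coeffs)
        return best  # (SSE, breakpoint index, [segment-1 coeffs, segment-2 coeffs])

    # Toy data for illustration (not real NSA measurements).
    freq = np.linspace(30, 1000, 50)
    ideal = 20.0 - 0.01 * freq
    measured = ideal + np.random.default_rng(0).normal(0.0, 1.2, freq.size)
    print(site_is_normalized(measured, ideal))
    print(fit_one_change_point(freq, measured)[1])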
2

Quantitative NDA Measurements of Advanced Reprocessing Product Materials Containing U, Np, Pu, and Am

Goddard, Braden 03 October 2013 (has links)
The ability of inspection agencies and facility operators to measure powders containing several actinides is increasingly necessary as new reprocessing techniques and fuel forms are being developed. These powders are difficult to measure with nondestructive assay (NDA) techniques because neutrons emitted from induced and spontaneous fission of different nuclides are very similar. A neutron multiplicity technique based on first-principles methods was developed to measure these powders by exploiting isotope-specific nuclear properties, such as the energy-dependent fission cross sections and the neutron multiplicity of neutron-induced fission. This technique was tested through extensive simulations using the Monte Carlo N-Particle eXtended (MCNPX) code, by one measurement campaign using the Active Well Coincidence Counter (AWCC), and by two measurement campaigns using the Epithermal Neutron Multiplicity Counter (ENMC) with various (α,n) sources and actinide materials. Four potential applications of this first-principles technique have been identified: (1) quantitative measurement of uranium, neptunium, plutonium, and americium materials; (2) quantitative measurement of mixed oxide (MOX) materials; (3) quantitative measurement of uranium materials; and (4) weapons verification in arms control agreements. This technique still has several challenges to overcome, the largest being the need for high-precision active and passive measurements that produce results with acceptably small uncertainties.
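The following Python sketch is a strong simplification of the isotope-specific idea described above (it is not the thesis's method; all per-gram count-rate coefficients and measured rates are hypothetical placeholders): if each isotope contributes isotope-specific amounts to the singles and doubles rates, the unknown masses can be recovered by solving a small linear system.

    import numpy as np

    # Hypothetical per-gram contributions of two isotopes to the measured
    # singles (S) and doubles (D) neutron count rates.  Real coefficients would
    # come from detector efficiency, spontaneous-fission yields, (alpha,n)
    # yields, and induced fission; these numbers are placeholders only.
    per_gram = np.array([
        # S (1/s/g)  D (1/s/g)
        [2.5,        0.40],   # isotope A
        [0.8,        0.05],   # isotope B
    ])

    measured_rates = np.array([125.0, 17.5])  # hypothetical measured S and D (1/s)

    # If the rates are linear in the masses, the unknown masses solve
    # per_gram.T @ masses = measured_rates.
    masses = np.linalg.solve(per_gram.T, measured_rates)
    print({"isotope A (g)": round(masses[0], 2), "isotope B (g)": round(masses[1], 2)})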
3

Multiple Change-Point Detection: A Selective Overview

Niu, Yue S., Hao, Ning, Zhang, Heping 11 1900 (has links)
Very long and noisy sequence data arise in fields ranging from the biological sciences to the social sciences, including high-throughput data in genomics and stock prices in econometrics. Often such data are collected in order to identify and understand shifts in trends, for example, from a bull market to a bear market in finance or from a normal number of chromosome copies to an excessive number of chromosome copies in genetics. Thus, identifying multiple change points in a long, possibly very long, sequence is an important problem. In this article, we review both classical and new multiple change-point detection strategies. Considering the long history and the extensive literature on change-point detection, we provide an in-depth discussion of a normal mean change-point model from the perspectives of regression analysis, hypothesis testing, consistency, and inference. In particular, we present a strategy for gathering and aggregating local information for change-point detection that has become the cornerstone of several emerging methods because of its attractive computational and theoretical properties.
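As a concrete illustration of the normal mean change-point model and of aggregating local information, here is a minimal Python sketch (mine, not the article's) of binary segmentation with a CUSUM statistic applied to a simulated sequence with two mean shifts:

    import numpy as np

    def cusum_stat(x):
        # Max absolute standardized CUSUM statistic and its split location.
        n = len(x)
        csum = np.cumsum(x)
        k = np.arange(1, n)
        stat = np.abs(csum[:-1] - k * csum[-1] / n) / np.sqrt(k * (n - k) / n)
        j = int(np.argmax(stat))
        return stat[j], j + 1

    def binary_segmentation(x, threshold, lo=0):
        # Recursively split the series wherever the CUSUM statistic exceeds threshold.
        if len(x) < 4:
            return []
        stat, j = cusum_stat(x)
        if stat < threshold:
            return []
        return (binary_segmentation(x[:j], threshold, lo)
                + [lo + j]
                + binary_segmentation(x[j:], threshold, lo + j))

    # Toy example: unit-variance noise with mean shifts at positions 100 and 200.
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100), rng.normal(1, 1, 100)])
    print(binary_segmentation(x, threshold=4.0))

The recursion keeps splitting each segment until no candidate split exceeds the threshold, which is the simplest version of the "gather local information, then aggregate" strategy the article surveys.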
4

Technological aspects of corrosion control of metals / Enjeux technologique de la protection contre la corrosion

Taylor, Matthew 06 November 2012 (has links)
Corrosion prevention is a decisive factor in the durability of materials. Historically, the development of advanced materials applications cannot be envisioned without a rigorous scientific approach to the fundamental mechanisms that lead to degradation in service. Human history has been punctuated by technological advances, all of which were enabled by advances in materials science, from the Iron Age to the silicon age. For example, it was the smelting of ore that moved humanity from the Stone Age to the first alloys (bronze) and the subsequent foundation of a metals-based society. These metals return to their natural state according to thermodynamic and kinetic laws. The aim of this thesis is to understand the behaviour of certain so-called passivable materials and to propose constitutive laws based on the point defect model. This approach relies on electrochemical and physico-chemical characterization of the metallic materials considered. / Corrosion control is an important facet of durable and responsible engineering. Historically, the development of advanced materials applications was stymied without sufficient scientific understanding of the fundamental mechanisms that dominate degradation in the system of application. Human history has been punctuated by advances in technology, all of which were enabled by advances in materials science, from the iron age to the silicon age. For instance, it was the invention of smelting ores that brought humanity out of the stone age, leading to the first alloys (bronze) and the subsequent foundation of a metals-based society. During the infancy of the planet Earth, around four billion years ago, the first photosynthesizers began converting carbon dioxide into oxygen. However, oxygen gas was not released into the atmosphere in great quantities because it was immediately bound up with dissolved metals in the ocean, mostly iron, forming a large fraction of the iron ores we rely upon. Producing such metals from oxides formed during the previous four billion years involves flying in the face of the thermodynamic drive to return to the oxide state.
5

Transactional pointcuts for aspect-oriented programming

Sadat Kooch Mohtasham, Seyed Hossein 06 1900 (has links)
In dynamic pointcut-advice join point models of Aspect-Oriented Programming (AOP), join points are typically selected and advised independently of each other. That is, the relationships between join points are not considered in join point selection and advice. But these inter-relationships are key to the designation and advice of arbitrary pieces of code when modularizing concerns such as exception handling and synchronization. Without a mechanism for associating join points, one must instead refactor (if possible) into one method the two or more related join points that are to be advised together. In practice, join points are often not independent. Instead, they form part of a higher-level operation that implements the intent of the developer (e.g. managing a resource). This relationship should be made more explicit. We extend the dynamic pointcut-advice join point model to make possible the designation, reification, and advice of interrelated join points. The Transactional Pointcut (transcut), which is a realization of this extended model, is a special join point designator that selects sets of interrelated join points. Each match of a transcut is a set of join points that are related through control flow, data flow, or both. This allows transcuts to define new types of join points (pieces of computation) by capturing the key points of a computation and to provide effective access for their manipulation (i.e. advice). Essentially, transcuts almost eliminate the need for refactoring to expose join points, which is shown by others to have a significant negative effect on software quality. The transcut construct was implemented as an extension to the AspectJ language and integrated into the AspectBench compiler. We used transcuts to modularize the concern of exception handling in two real-world software systems. The results show that transcuts are effective in designating target join points without unnecessary refactorings, even when the target code is written obliviously to the potential aspectization.
6

Opening the Black Box of Agency Behavior: Dimensionality and Stability of FCC Commissioner Voting

Hurst, Eric Demian 19 November 2008 (has links)
Traditional analyses of agency output are typically performed at the institutional level, characterizing the agency in question as a unitary actor with a singular preference. I test these assumptions using a variety of statistical methods, including a dynamic linear model that estimates ideal points of FCC commissioners for every year, 1975-2000. Voting within the FCC is essentially unidimensional and commissioner preferences are stable over time. Aggregate analyses of the ideal points of individual commissioners suggest that FCC commissioner voting has become profoundly ideological only recently. Future agency research must carefully consider the time period of analysis and previous findings should be reexamined.
7

Statistical model building and inference about the normalized site attenuation (NSA) measurements for electromagnetic interference (EMI)

Chiu, Shih-ting 09 August 2004 (has links)
Open-site measurement of electromagnetic interference (EMI) is the most direct and universally accepted standard approach for measuring radiated emissions from equipment or the radiation susceptibility of a component or piece of equipment. Whether a site is qualified for testing EMI is decided by the antenna measurements. In this work, we use data from setups with different factors to find relations between the measurements and the antenna configuration. A one-change-point model has been used to fit the observed measurements and to compare the differences between two kinds of antennas (broadband and dipole). However, a single one-change-point model may not give a suitable fit for all data sets in this work, so we have tried other models and applied them to the data. Furthermore, we try to set up a stricter standard than ±4 dB, based on statistical inference, for deciding whether a site measures EMI values with greater precision. Finally, a Matlab program implementing the complete analysis procedure is provided, so that it may be used as a standard tool for evaluating whether a site has good measurement quality in practice.
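The thesis provides a Matlab program; the sketch below is a hypothetical Python illustration of one way a criterion stricter than the flat ±4 dB rule could be formed from the data (my assumption, not the thesis's actual procedure): qualify a site only if a high-coverage bound on the deviations from the ideal NSA values lies inside the tolerance.

    import numpy as np

    def stricter_site_check(measured, ideal, max_tol_db=4.0, z=1.96):
        # Qualify the site only if a normal-theory bound covering ~95% of the
        # deviations from the ideal NSA values lies inside +/- max_tol_db.
        # Illustrative criterion only -- not the thesis's actual procedure.
        dev = np.asarray(measured, float) - np.asarray(ideal, float)
        bound = abs(dev.mean()) + z * dev.std(ddof=1)
        return bound <= max_tol_db, bound

    # Hypothetical example data.
    rng = np.random.default_rng(1)
    ideal = np.linspace(20.0, 10.0, 40)
    measured = ideal + rng.normal(0.2, 1.0, ideal.size)
    ok, bound = stricter_site_check(measured, ideal)
    print(f"qualified under the stricter rule: {ok}, 95% deviation bound = {bound:.2f} dB")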
8

Transactional pointcuts for aspect-oriented programming

Sadat Kooch Mohtasham, Seyed Hossein Unknown Date
No description available.
9

Should I Stay or Should I Go? Bayesian Inference in the Threshold Time Varying Parameter (TTVP) Model

Huber, Florian, Kastner, Gregor, Feldkircher, Martin 09 1900 (has links) (PDF)
We provide a flexible means of estimating time-varying parameter models in a Bayesian framework. By specifying the state innovations to be characterized through a threshold process that is driven by the absolute size of parameter changes, our model detects at each point in time whether a given regression coefficient is constant or time-varying. Moreover, our framework accounts for model uncertainty in a data-based fashion through Bayesian shrinkage priors on the initial values of the states. In a simulation, we show that our model reliably identifies regime shifts in cases where the data generating processes display high, moderate, and low numbers of movements in the regression parameters. Finally, we illustrate the merits of our approach by means of two applications. In the first application we forecast the US equity premium and in the second application we investigate the macroeconomic effects of a US monetary policy shock. / Series: Research Report Series / Department of Statistics and Mathematics
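To illustrate the thresholding mechanism in the state equation, the following Python sketch (an illustration of the general idea, not the paper's Bayesian estimation algorithm; all parameter values are invented) simulates a coefficient path that only moves when the latent innovation exceeds a threshold in absolute size:

    import numpy as np

    def simulate_ttvp(T=200, threshold=0.15, sigma_eta=0.2, seed=0):
        # Simulate one threshold time-varying parameter path: the coefficient
        # beta_t only updates when the proposed innovation eta exceeds
        # `threshold` in absolute value (illustrative mechanism only).
        rng = np.random.default_rng(seed)
        beta = np.empty(T)
        beta[0] = 1.0
        for t in range(1, T):
            eta = rng.normal(0.0, sigma_eta)          # proposed parameter change
            beta[t] = beta[t - 1] + (eta if abs(eta) > threshold else 0.0)
        return beta

    beta = simulate_ttvp()
    moves = np.abs(np.diff(beta)) > 0
    print(f"coefficient moved in {moves.mean():.0%} of periods")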
10

Should I stay or should I go? Bayesian inference in the threshold time varying parameter (TTVP) model

Huber, Florian, Kastner, Gregor, Feldkircher, Martin 09 1900 (has links) (PDF)
We provide a flexible means of estimating time-varying parameter models in a Bayesian framework. By specifying the state innovations to be characterized through a threshold process that is driven by the absolute size of parameter changes, our model detects at each point in time whether a given regression coefficient is constant or time-varying. Moreover, our framework accounts for model uncertainty in a data-based fashion through Bayesian shrinkage priors on the initial values of the states. In a simulation, we show that our model reliably identifies regime shifts in cases where the data generating processes display high, moderate, and low numbers of movements in the regression parameters. Finally, we illustrate the merits of our approach by means of two applications. In the first application we forecast the US equity premium and in the second application we investigate the macroeconomic effects of a US monetary policy shock. / Series: Department of Economics Working Paper Series
