181

Elastic Wave Propagation in Corrugated Wave Guides

Banerjee, Sourav January 2005 (has links)
Elastic wave propagation in structures with irregular boundaries is studied by transforming plates with irregular surfaces into sinusoidal waveguides. Guided elastic waves in a two-dimensional periodically corrugated plate are studied analytically. The plate material is considered homogeneous, isotropic, and linearly elastic. In a periodically corrugated waveguide, all possible spectral orders of wave numbers are considered. The dispersion equation is obtained by applying traction-free boundary conditions at the two surfaces. The analysis is carried out in the wave-number domain for both symmetric and anti-symmetric modes. Non-propagating 'stop bands' and propagating 'pass bands' are investigated. Experimental analyses with two different pairs of transducers are also performed and compared with the results of the mathematical analysis. The newly developed semi-analytical Distributed Point Source Method (DPSM) has also been adopted in this dissertation to model the ultrasonic field in a sinusoidally corrugated plate. DPSM is gradually gaining popularity in the field of Non-Destructive Evaluation (NDE). It can be used to calculate the ultrasonic field (pressure, velocity, and displacement in a fluid, or stress and displacement in a solid) generated by ultrasonic transducers. So far the technique had been used to model ultrasonic fields only in homogeneous or multilayered fluid structures. In this dissertation the method is extended to model the ultrasonic field generated in both fluid and solid media. The prime objective of using the DPSM technique in this dissertation is to model the ultrasonic field generated in the corrugated waveguide; the method had never before been used to model ultrasonic fields in solids. The development of stress and displacement Green's functions in solids is presented.
In addition to the wave propagation problem in the sinusoidal wave guide, a few unsolved problems such as ultrasonic field generated by bounded acoustic beams in multilayered fluid structures, near a fluid-solid interface and in flat solid isotropic plates are also presented in this dissertation.
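In generic notation, the spectral-order (Floquet) formulation summarized in the abstract can be sketched as follows; the corrugation profile, symbols, and truncation order are assumptions for illustration, not the dissertation's exact statement:

```latex
% Surfaces of the corrugated plate (period \Lambda, amplitude \epsilon):
%   z = \pm h + \epsilon \cos(2\pi x / \Lambda)
% All spectral orders of the wave number enter the Floquet expansion:
k_n = k + \frac{2\pi n}{\Lambda}, \qquad n = 0, \pm 1, \pm 2, \dots
\mathbf{u}(x,z,t) = \sum_{n=-N}^{N} \mathbf{A}_n(z)\, e^{i(k_n x - \omega t)}
% Traction-free conditions \sigma_{zz} = \sigma_{xz} = 0 on both surfaces
% give a homogeneous linear system; nontrivial solutions require
\det \mathbf{M}(k, \omega) = 0
% Real roots k form pass bands; stop bands appear where k is complex.
```

Solving the determinant condition over frequency traces out the symmetric and anti-symmetric dispersion branches mentioned above.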
182

Inflation targeting in emerging countries: the exchange rate issues

Reyes Altamirano, Javier Arturo 30 September 2004 (has links)
The current discussion of Inflation Targeting (IT) in emerging economies deals with the effects that nominal exchange rate movements have on the overall inflation rate. The literature has focused on the advantages and disadvantages of IT relative to other monetary policy regimes and on the relevance of the nominal exchange rate pass-through effect on inflation. So far, none of it has dealt with the differences arising from the policy instruments used to fight off inflationary pressure under an IT regime. The literature on IT for emerging economies can be separated into two categories. In the first, the monetary authority uses interest rate policy as the instrument variable to implement and control the inflation target. In the second, the monetary authority uses international reserves as the instrument to influence the nominal exchange rate so that the depreciation rate is consistent with the overall inflation target. This dissertation presents a model in which both policy instruments are available to the monetary authority. The model is used to address two questions: i) Is IT better than a monetary rule regime? and ii) Is it better to intervene directly in the foreign exchange market than to use interest rate policy to control exchange rate pressure on inflation, or are the two equivalent? The results show that there are important differences between these choices and that the answers are shock dependent. The differences arise because the intervention needed under IT is accompanied by important output costs or benefits, depending on the direction of the shock being analyzed. Regarding the pass-through effect, some studies have shown that pass-through from currency depreciation into inflation has been decreasing and is therefore becoming less of an issue for these countries.
The literature has offered different explanations for these declines, but so far they have not been directly linked to the adoption of IT. This dissertation shows that lower pass-through levels can be a natural result of the fear of floating observed in emerging countries that adopted IT, and that exchange rate effects on inflation therefore remain relevant.
183

Aging in an Era of Change: Contextualizing the Upcoming Demographic Shift in Marich Pass, North Western Kenya

Van De Keere, Laurel 23 November 2010 (has links)
Research on aging in low-income countries tends to overlook socio-cultural dimensions, including the significance of change. Because countries like Kenya are expected to witness a demographic shift in the coming years, and because different forms of change will place new pressures on existing resources over the same timeframe, there is a need to bridge this gap. This study therefore grapples with the following question: how is change shaping the characteristics and needs of Kenya's aging population? The thesis adopts a mixed-methods approach informed by critical gerontology and life course perspectives to examine the aging experiences of Pokot elders living in Marich Pass, northwestern Kenya, in the face of multi-scalar changes. Results highlight the challenges and opportunities created by change, and illuminate a need to develop resources to support informal caregivers, to buttress existing formal supports, and to develop additional formal supports to address the unique needs of Pokot elders.
184

Snowmelt energy balance in a burned forest stand, Crowsnest Pass, Alberta

Burles, Katie, University of Lethbridge. Faculty of Arts and Science January 2010 (has links)
Forested watersheds in western North America are subject to significant change from natural and anthropogenic disturbance, including wildfire. Changes to the forest canopy have subsequent impacts on sub-canopy snow processes. A simple, process-based point energy balance model was developed to quantify differences in energy balance characteristics between a burned and a healthy forest stand. Potential model uncertainties were identified using sensitivity analyses. Simulated snowmelt accurately recreated measured snowmelt, providing confidence in the model's ability to simulate energy balance processes in sub-canopy environments where wind redistribution and sublimation are not major drivers of the local snowmelt energy balance. In the burned stand, sub-canopy snow accumulation was greater but melted more rapidly than in the healthy stand. The removal of the forest canopy resulted in more energy available for snowmelt, including higher short-wave and lower long-wave radiation, and increased turbulent fluxes. Burned stands should therefore be considered a separate land cover type in larger-scale watershed models. / xii, 129 leaves : ill., map ; 29 cm
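A point energy balance model of this kind amounts to summing the flux terms and converting any positive net energy into melt. A minimal sketch, with assumed flux names and illustrative magnitudes rather than the thesis model or its calibration:

```python
# Toy point snowmelt energy balance; flux names and magnitudes are
# illustrative assumptions, not the thesis model or its data.
RHO_WATER = 1000.0        # density of water, kg m^-3
LATENT_FUSION = 0.334e6   # latent heat of fusion, J kg^-1

def melt_depth(k_star, l_star, q_h, q_e, q_g, seconds):
    """Snow water equivalent melted (m) from net flux terms (W m^-2):
    net short-wave, net long-wave, sensible, latent, and ground heat."""
    q_melt = k_star + l_star + q_h + q_e + q_g
    q_melt = max(q_melt, 0.0)  # an energy deficit produces no melt
    return q_melt * seconds / (RHO_WATER * LATENT_FUSION)

# A burned stand receives more short-wave and less long-wave input
# than a healthy canopy, so more energy is available for melt.
burned = melt_depth(k_star=150.0, l_star=-40.0, q_h=20.0, q_e=-5.0,
                    q_g=2.0, seconds=3600)
healthy = melt_depth(k_star=90.0, l_star=-10.0, q_h=10.0, q_e=-5.0,
                     q_g=2.0, seconds=3600)
```

With these assumed fluxes the burned stand melts more snow per hour than the healthy one, mirroring the canopy-removal effect described in the abstract.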
185

Information Centric Data Collection and Dissemination Fabric for Smart Infrastructures

Nigam, Aakash 09 December 2013 (has links)
Evolving smart infrastructures require both content distribution and event notification and processing support. Content Centric Networking (CCN), built around named data, is a clean-slate network architecture for supporting future applications. Because of its focus on content distribution, CCN does not inherently support publish-subscribe event notification, a fundamental building block in computer-mediated systems and a critical requirement for smart infrastructure applications. While the semantics of content distribution and event notification require different support from the underlying network infrastructure, the two can still be united by leveraging similarities in the routing infrastructure. Our Extended-CCN architecture (X-CCN) realizes this to provide a lightweight content-based pub-sub service at the network layer, which is used to provide advanced publish/subscribe services at higher layers. Lightweight content-based pub-sub and CCN communication at the network layer, together with advanced publish/subscribe, are presented as a data fabric for smart infrastructure applications.
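The idea of a lightweight content-based pub-sub service keyed on hierarchical names can be illustrated with a toy prefix matcher; the class and method names here are hypothetical, not the X-CCN implementation or the CCNx API:

```python
# Toy content-based pub-sub matcher over hierarchical names, in the
# spirit of a network-layer pub-sub service; everything here is a
# hypothetical sketch, not the X-CCN implementation or the CCNx API.
from collections import defaultdict

class PubSubFabric:
    def __init__(self):
        self.subs = defaultdict(list)  # name prefix -> subscriber ids

    def subscribe(self, prefix, subscriber):
        """Register interest in every event whose name starts with prefix."""
        self.subs[prefix].append(subscriber)

    def publish(self, name, payload):
        """Deliver the event to all subscribers with a matching prefix."""
        return [(sub, name, payload)
                for prefix, sinks in self.subs.items()
                if name.startswith(prefix)
                for sub in sinks]

fabric = PubSubFabric()
fabric.subscribe("/grid/meter", "monitor-1")
events = fabric.publish("/grid/meter/42/reading", b"230V")
missed = fabric.publish("/hvac/zone/3/temp", b"21C")
```

The prefix match plays the role CCN's longest-prefix name lookup would play in a real router; events with no matching prefix are simply not delivered.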
187

Development of the New Zealand Stimuli for the University of Canterbury Adaptive Speech Test-Filtered Words (UCAST-FW)

Murray, Sarah Louise January 2012 (has links)
Auditory processing disorder (APD) is a label that describes a variable set of symptoms sharing a common feature: difficulty listening to sounds in the absence of an actual audiological deficit (Moore, 2006). Clinical assessment of APD typically involves a battery of tests designed to examine the integrity of various auditory processes of the central auditory nervous system. Individuals with APD have difficulty recognising speech when parts of the signal are missing. One category of tests used to assess the extent to which this deficit is associated with reduced performance is the low-pass filtered speech test. The University of Canterbury Adaptive Speech Test-Filtered Words (UCAST-FW) is a computer-based adaptive low-pass filtered speech test developed for the assessment of auditory processing skills in adults and children. Earlier studies with the UCAST-FW (McGaffin, 2007; Sincock, 2008; Heidtke, 2010; Abu-Hijleh, 2011) have produced encouraging results. However, a number of confounding factors remain: the UCAST-FW tests New Zealand listeners using an Australian recording of American test material. The purpose of the current study was therefore to develop a new four-alternative forced-choice test to replace the Northwestern University Children's Perception of Speech (NU-CHIPS) stimuli the UCAST-FW currently utilises. The new test consists of 98 sets of four test items (one target item and three foil alternatives) designed to be used in a four-alternative forced-choice picture-pointing procedure. Phonemic analysis of the new word list and the NU-CHIPS word lists revealed a similar distribution of phonemes across the target words of both lists. The development of the new word list is described and its clinical applicability is explored.
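An adaptive low-pass filtered speech test adjusts the filter cut-off according to the listener's responses. A one-up/one-down staircase is a generic sketch of such a procedure; the starting frequency, step size, and floor are assumptions, not the UCAST-FW's actual parameters:

```python
# One-up/one-down staircase over the low-pass cut-off frequency: a
# correct answer lowers the cut-off (harder), an error raises it
# (easier). Start, step, and floor values are assumed, not UCAST-FW's.
def run_staircase(responses, start_hz=4000.0, step_hz=200.0, floor_hz=500.0):
    """Return the cut-off frequency presented on each trial."""
    cutoff, track = start_hz, []
    for correct in responses:
        track.append(cutoff)
        cutoff = cutoff - step_hz if correct else cutoff + step_hz
        cutoff = max(cutoff, floor_hz)  # never filter below the floor
    return track

# Two correct answers make the task harder; an error backs off.
track = run_staircase([True, True, False, True])
```

Averaging the reversal points of such a track is one conventional way to estimate the listener's filtered-speech threshold.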
188

THREE ESSAYS ON EXCHANGE RATE AND MONETARY POLICY

An, Lian 01 January 2006 (has links)
There are four chapters in my dissertation. Chapter one gives a brief introduction to the three essays. Chapter two empirically analyzes the interaction among conventional monetary policy, foreign exchange intervention, and the exchange rate in a unifying model for Japan. I have several findings. First, the results lend support to the leaning-against-the-wind hypothesis. Second, conventional monetary policy has as great an influence on the exchange rate as foreign exchange intervention in Japan. Third, intervention in Japan is ineffective or may be counter-effective, so escaping the liquidity trap by intervention alone may not be feasible. Chapter three empirically identifies the sources of exchange rate movements of Japan vis-à-vis the US and investigates the role of the exchange rate in macroeconomic adjustment. It finds that real shocks dominate nominal shocks in explaining exchange rate movements, with relative real demand shocks as the major contributor, and that the exchange rate market does not create many shocks. The overall result supports the view that the bilateral exchange rate in Japan is a shock absorber rather than a source of shocks. Chapter four provides cross-country and time-series evidence on the extent of exchange rate pass-through at different stages of distribution - import prices, producer prices, and consumer prices - for eight major industrial countries: the United States, Japan, Canada, Italy, the UK, Finland, Sweden, and Spain. I find exchange rate pass-through incomplete at many horizons, though complete pass-through is observed occasionally. The degree of pass-through declines, and the time needed for complete pass-through lengthens, along the distribution chain. Furthermore, I find that a greater pass-through coefficient is associated with an economy that is smaller in size, with higher import shares, more persistent and less volatile exchange rate shocks, more volatile monetary shocks, a higher inflation rate, and less volatile GDP.
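Pass-through at each stage of the distribution chain is commonly measured with a distributed-lag regression of stage-specific inflation on current and lagged depreciation; the specification below is a generic textbook form, not necessarily the chapter's exact model:

```latex
% Stage-s inflation (s = import, producer, consumer prices) regressed on
% current and lagged depreciation \Delta e_t; X_t collects controls.
\pi_t^{s} = \alpha^{s} + \sum_{j=0}^{J} \beta_j^{s}\, \Delta e_{t-j}
          + \gamma^{s\prime} X_t + \varepsilon_t^{s}
% Long-run pass-through at stage s:
\mathrm{LRPT}^{s} = \sum_{j=0}^{J} \beta_j^{s}
% Complete pass-through corresponds to \mathrm{LRPT}^{s} = 1.
```

Declining pass-through along the chain then shows up as LRPT falling from import to producer to consumer prices.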
189

STATISTICAL MODELS FOR CONSTANT FALSE-ALARM RATE THRESHOLD ESTIMATION IN SOUND SOURCE DETECTION SYSTEMS

Saghaian Nejad Esfahani, Sayed Mahdi 01 January 2010 (has links)
Constant False Alarm Rate (CFAR) processors are important for applications where thousands of detection tests are made per second, such as in radar. This thesis introduces a new method for CFAR threshold estimation that is particularly applicable to sound source detection with distributed microphone systems. The novel CFAR processor exploits the near symmetry about 0 of the acoustic pixel values created by steered-response coherent power, in conjunction with a partial whitening preprocessor, to estimate thresholds for positive values, which represent potential targets. To remove the low-frequency components responsible for degrading CFAR performance, fixed and adaptive high-pass filters are applied. A relation between the minimum high-pass cut-off frequency and the microphone geometry is proposed and tested. Experimental results for linear, perimeter, and planar arrays illustrate that for desired false alarm (FA) probabilities ranging from 10^-1 to 10^-6, good CFAR performance can be achieved by modeling the coherent power with Chi-square and Weibull distributions, and the ratio of desired to experimental FA probabilities can be kept within an order of magnitude.
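The use of the near symmetry about 0 can be illustrated with a toy estimator: mirror the negative (noise-only) pixels to model the noise seen on the positive side, then read the threshold off at the desired false-alarm quantile. This is a hypothetical sketch that replaces the thesis's Chi-square/Weibull fitting with a plain empirical quantile:

```python
# Toy CFAR threshold from the near symmetry of coherent-power pixels
# about 0: negative pixels are assumed noise-only, so their mirror
# image models the noise on the positive side. Hypothetical sketch,
# not the thesis's parametric (Chi-square/Weibull) procedure.
def cfar_threshold(pixels, p_fa):
    """Threshold T with P(noise > T) roughly p_fa, from mirrored negatives."""
    noise = sorted(-x for x in pixels if x < 0)
    if not noise:
        raise ValueError("no negative pixels to model the noise from")
    rank = min(int((1.0 - p_fa) * len(noise)), len(noise) - 1)
    return noise[rank]

def detect(pixels, p_fa):
    """Positive pixels exceeding the CFAR threshold are declared targets."""
    threshold = cfar_threshold(pixels, p_fa)
    return [x for x in pixels if x > threshold]

hits = detect([-3.0, -2.0, -1.0, 1.0, 2.0, 10.0], p_fa=0.34)
```

An empirical quantile needs on the order of 1/p_fa noise samples, which is exactly why the thesis fits parametric distributions to reach FA probabilities as small as 10^-6.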
190

Beers and Bonds : Essays in Structural Empirical Economics

Romahn, André January 2012 (has links)
This dissertation consists of four papers in structural empirics that can be broadly categorized into two areas. The first three papers revolve around the structural estimation of demand for differentiated products and several applications thereof (Berry (1994), Berry, Levinsohn and Pakes (1995), Nevo (2000)), while the fourth paper examines the U.S. Treasury yield curve by estimating yields as linear functions of observable state variables (Ang and Piazzesi (2003), Ang et al. (2006)). The central focus of each paper is the underlying economics. Nevertheless, all papers share a common empirical approach. Be it prices of beers in Sweden or yields of U.S. Treasury bonds, it is assumed throughout that the economic variables of interest can be modeled by imposing specific parametric functional forms. The underlying structural parameters are then consistently estimated from the variation in the available data. Consistent estimation naturally hinges on the assumption that the imposed functional forms are correct. Another way of viewing this is that the imposed functions must be flexible enough not to force restrictive patterns on the data that would ultimately lead to biased estimates of the structural parameters and thereby produce misleading conclusions about the underlying economics. In principle, the danger of misspecification could therefore be avoided by adopting sufficiently flexible functional forms. This, however, typically requires estimating a growing number of structural parameters. As an example, consider the estimation of differentiated product demand. The key objects of interest are the substitution patterns between the products. That is, we are interested in what happens to the demand for good X and all its rival products as the price of good X increases. With N products in total, we could collect the product-specific changes in demand in a vector with N entries.
It is also possible, however, that the price of any other good Y changes and thereby alters the demands for the remaining varieties. Thus, in total, we are interested in N² price effects on product-specific demand. With few products, these effects could be estimated directly and the risk of functional misspecification excluded (Goolsbee and Petrin (2004)). With 100 products, however, we would be required to estimate 10,000 parameters, which is rarely, if ever, feasible. This is the curse of dimensionality. Each estimation method employed in the four papers breaks this curse by imposing functions that depend on relatively few parameters, and thereby tries to strike a balance between the need for parsimonious structural frameworks and the risk of misspecification. This is a fundamental feature of empirical research in economics that makes it both interesting and challenging. / Diss. Stockholm : Stockholm School of Economics, 2012. Introduction together with 4 papers
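The logit demand model is the textbook example of breaking this curse: a single price parameter, combined with the market shares, generates all N² own- and cross-price effects. A minimal sketch with illustrative numbers, not the estimation carried out in the papers:

```python
# Logit demand: one price parameter alpha pins down the full N x N
# matrix of price effects through the shares alone. Values are
# illustrative assumptions, not estimates from the dissertation.
import math

def logit_shares(deltas):
    """Market shares from mean utilities, with an outside good at 0."""
    expd = [math.exp(d) for d in deltas]
    denom = 1.0 + sum(expd)
    return [e / denom for e in expd]

def price_derivatives(shares, alpha):
    """ds_j/dp_k for all j, k implied by alpha and the shares."""
    n = len(shares)
    return [[alpha * shares[j] * (1.0 - shares[j]) if j == k
             else -alpha * shares[j] * shares[k]
             for k in range(n)]
            for j in range(n)]

shares = logit_shares([0.5, -0.2, 0.1])      # three inside goods
jac = price_derivatives(shares, alpha=-2.0)  # alpha < 0: utility falls in price
```

The parsimony comes at a price of its own: plain logit imposes proportional substitution, which is why the papers cited above use richer random-coefficient variants.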
