141

An Asymptotic Approach to Progressive Censoring

Hofmann, Glenn, Cramer, Erhard, Balakrishnan, N., Kunert, Gerd 10 December 2002 (has links) (PDF)
Progressive Type-II censoring was introduced by Cohen (1963) and has since been the topic of much research. An open question is whether it is sensible to use this sampling plan by design, instead of regular Type-II right censoring. We introduce an asymptotic progressive censoring model and find optimal censoring schemes for location-scale families. Our optimality criterion is the determinant of the 2×2 covariance matrix of the asymptotic best linear unbiased estimators. We present an explicit expression for this criterion and give conditions for its boundedness. By means of numerical optimization, we determine optimal censoring schemes for the extreme value, Weibull, and normal distributions. In many situations, these progressive schemes are shown to improve significantly upon regular Type-II right censoring.
142

Peak Sidelobe Level Distribution Computation for Ad Hoc Arrays using Extreme Value Theory

Krishnamurthy, Siddhartha 25 February 2014 (has links)
Extreme Value Theory (EVT) is used to analyze the peak sidelobe level distribution for array element positions with arbitrary probability distributions. Computations are discussed in the context of linear antenna arrays using electromagnetic energy. The results also apply to planar arrays of random elements that can be transformed into linear arrays. / Engineering and Applied Sciences
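As a toy illustration of the setting (aperture, element count, grid, and mainlobe exclusion are assumptions, not the thesis's parameters), the peak sidelobe level of a random linear array can be sampled by Monte Carlo; a block-maxima (Gumbel-type) fit to such samples is the kind of EVT approximation analyzed here.

```python
import numpy as np

def peak_sidelobe(positions, u_grid, mainlobe_halfwidth):
    """Peak sidelobe level (normalized, linear scale) of a unit-weight linear
    array; u = sin(theta), element positions measured in wavelengths."""
    n = len(positions)
    af = np.abs(np.exp(2j * np.pi * np.outer(u_grid, positions)).sum(axis=1)) / n
    return af[np.abs(u_grid) > mainlobe_halfwidth].max()

rng = np.random.default_rng(0)
L, n_el, trials = 50.0, 64, 100          # aperture (wavelengths), elements
u = np.linspace(-1.0, 1.0, 2001)
psl = np.array([peak_sidelobe(rng.uniform(0.0, L, n_el), u, 2.0 / L)
                for _ in range(trials)])
# `psl` is a sample from the peak-sidelobe-level distribution; EVT (e.g. a
# Gumbel fit to these maxima) approximates its upper tail.
```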
143

Extreme-Value Analysis of Self-Normalized Increments / Extremwerteigenschaften der normierten Inkremente

Kabluchko, Zakhar 23 April 2007 (has links)
No description available.
144

Statistische Multiresolutions-Schätzer in linearen inversen Problemen - Grundlagen und algorithmische Aspekte / Statistical Multiresolution Estimatiors in Linear Inverse Problems - Foundations and Algorithmic Aspects

Marnitz, Philipp 27 October 2010 (has links)
No description available.
145

台灣銀行業系統重要性之衡量 / Measuring Systemic Importance of Taiwan’s Banking System

林育慈, Lin, Yu Tzu Unknown Date (has links)
In this thesis, we apply the measure proposed by Gravelle and Li (2013) to examine the contribution to systemic risk of nine listed financial-holding-company banks in Taiwan. The systemic importance of a bank is defined as the increase in systemic risk conditional on the crash of that bank, and is estimated using multivariate extreme value theory. Our empirical evidence shows, first, that the most systemically important bank is First Commercial Bank, while CTBC Bank is significantly less important than the others; the differences among the remaining banks are not significant. Second, banks established earlier have higher systemic importance, and the contribution to systemic risk of public banks is, on average, higher than that of private banks. Third, a bank's size and its risk contribution are positively related: the bigger a bank, the more important it is, so a too-big-to-fail problem may arise. Last, banks with lower loan-to-deposit ratios are less systemically important, while the relation between the capital adequacy ratio and systemic importance is unclear.
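The measure described above is, in spirit, a conditional tail probability. A crude finite-sample analogue is sketched below (the threshold and names are illustrative; Gravelle and Li estimate this with multivariate EVT rather than raw counts, which matters precisely when tail observations are scarce):

```python
import numpy as np

def systemic_importance(bank_returns, system_returns, q=0.95):
    """Empirical P(system loss exceeds its q-quantile | bank loss exceeds
    its q-quantile): a counting analogue of the EVT-based measure."""
    bl = -np.asarray(bank_returns, float)    # losses are negated returns
    sl = -np.asarray(system_returns, float)
    crash = bl > np.quantile(bl, q)          # the bank's own tail events
    if not crash.any():
        return 0.0
    return float((sl[crash] > np.quantile(sl, q)).mean())
```

A bank whose losses co-move with the system scores high; an independent bank scores near the unconditional tail probability 1 - q.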
146

Statistical Post-Processing Methods And Their Implementation On The Ensemble Prediction Systems For Forecasting Temperature In The Use Of The French Electric Consumption

Gogonel, Adriana Geanina 27 November 2012 (has links) (PDF)
The objective of this thesis is to study new statistical methods for correcting temperature predictions that may be implemented on the ensemble prediction system (EPS) of Meteo France, so as to improve its use for electric system management at EDF France. The Meteo France EPS we work with contains 51 members (forecasts per time-step) and gives temperature predictions for 14 days. The thesis contains three parts. In the first we present the EPS, implement two statistical methods that improve the accuracy or the spread of the EPS, and introduce criteria for comparing results. In the second part we introduce extreme value theory and the mixture models we use to combine the model built in the first part with models that fit the tails of the distributions. In the third part we introduce quantile regression as another way of studying the tails of the distribution.
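A minimal sketch of the first-part idea, correcting ensemble accuracy and spread (the function name, the mean-error debiasing, and the multiplicative spread adjustment are assumptions, not the thesis's exact methods):

```python
import numpy as np

def debias_and_inflate(ens, mean_error, spread_factor):
    """Correct an ensemble (members along the last axis): subtract the mean
    forecast error learned on a training period, then rescale the spread
    about the ensemble mean to fix over- or under-dispersion."""
    ens = np.asarray(ens, float) - mean_error
    m = ens.mean(axis=-1, keepdims=True)
    return m + spread_factor * (ens - m)
```

Here `mean_error` would be estimated as `np.mean(train_forecasts - train_obs)` over a training window, and `spread_factor > 1` widens an under-dispersive ensemble.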
147

Market Timing Ability of Bond-Equity Yield Ratio : A study of trading strategies in Japan, Malaysia and Singapore

Chit, Ngwe Lin Myat, Wang, Feiran January 2014 (has links)
Market timing is an active investment strategy in which investors base their decisions on the signals of indicators. Which variable makes a good indicator, one that delivers superior returns, has always been an open question. The bond-equity yield ratio (BEYR) is a relatively new indicator, widely researched in academic finance and extensively applied by market practitioners over the last two decades. The Efficient Market Hypothesis (EMH) states that stock prices always reflect all relevant information, so beating the market by predicting future price trends is not possible. If a market conforms to the EMH, market timing is therefore not useful and a passive investment strategy is preferable to an active one. Although the extant literature has found BEYR to be a good market-timing indicator, existing research focuses on the financial markets of the United States, the United Kingdom, and Europe; studies of Asian financial markets are very limited. This knowledge gap motivates the main objective of this research. This study uses an extreme-value strategy as an active trading strategy to investigate the market timing ability of BEYR in three Asian financial markets: Japan, Malaysia and Singapore. In addition, a passive trading strategy is compared with the active trading strategy in each country to identify whether the markets comply with the weak form of the EMH. A deductive, quantitative approach is taken and three main hypotheses are developed to achieve the research objective.
The empirical findings and the responses to the main hypotheses can be summarized as follows: the active trading strategy does perform better than the passive trading strategy in all countries, but the market timing ability of BEYR is not as good as that of the traditional indicators, dividend yield and earnings yield, in all countries. The financial markets of all countries under scrutiny therefore do not comply with the weak form of the EMH. It is worth noting, however, that the sample period chosen for this research includes the global financial crisis of 2008; the impact of the crisis is assumed to be the main reason for the difference between our findings and the existing literature. The differing nature of these financial markets can be considered another underlying factor behind the new perspective on BEYR that emerges from our empirical results.
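An extreme-value timing rule of the kind examined here can be sketched as follows (the rolling window and quantile thresholds are illustrative assumptions, not the thesis's calibrated values): hold equities while BEYR stays below its rolling upper quantile, switch to bonds when it breaches it, and switch back once it falls below the lower quantile.

```python
import numpy as np

def beyr_signals(bond_yield, earnings_yield, window=60, lo=0.05, hi=0.95):
    """Position series from an extreme-value rule on the bond-equity yield
    ratio: 1 = hold equities, 0 = hold bonds."""
    beyr = np.asarray(bond_yield, float) / np.asarray(earnings_yield, float)
    pos = np.ones(len(beyr), dtype=int)
    for t in range(window, len(beyr)):
        hist = beyr[t - window:t]
        if beyr[t] > np.quantile(hist, hi):      # extreme high: bonds cheap vs equities
            pos[t] = 0
        elif beyr[t] < np.quantile(hist, lo):    # extreme low: back into equities
            pos[t] = 1
        else:
            pos[t] = pos[t - 1]                  # otherwise keep current position
    return beyr, pos
```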
148

Modeling, analysis, and optimization for wireless networks in the presence of heavy tails

Wang, Pu 13 January 2014 (has links)
The heavy-tailed traffic from wireless users, driven by emerging Internet and multimedia applications, induces an extremely dynamic and variable network environment, which can fundamentally change the way wireless networks are conceived, designed, and operated. This thesis is concerned with the modeling, analysis, and optimization of wireless networks in the presence of heavy tails. First, a novel traffic model is proposed that captures the inherent relationship between traffic dynamics and the joint effects of users' mobility variability and the spatial correlation in the physical phenomena they observe. Next, the asymptotic delay distribution of wireless users is analyzed under different traffic patterns and spectrum conditions, revealing the critical conditions under which wireless users experience heavy-tailed delay with significantly degraded QoS performance. Based on the delay analysis, the fundamental impact of a heavy-tailed environment on network stability is studied. Specifically, a new network stability criterion, moment stability, is introduced to better characterize QoS performance in the heavy-tailed environment, and a throughput-optimal scheduling algorithm is proposed to maximize network throughput while guaranteeing moment stability. Furthermore, the impact of heavy-tailed spectrum on network connectivity is investigated: necessary conditions for the existence of delay-bounded connectivity are derived, and a mobility-assisted data forwarding scheme is exploited to enhance connectivity, with key design parameters such as the critical mobility radius derived. Finally, the latency in wireless mobile networks is analyzed and shown to be asymptotically linear in the initial distance between mobile users.
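The tail behavior that drives these results can be checked numerically with a standard Hill estimator (the Pareto traffic model and all parameters below are illustrative assumptions): when the estimated tail index alpha is below 2, the variance of the traffic or delay is infinite, which is exactly the regime where a moment-based stability criterion matters.

```python
import numpy as np

def hill_tail_index(x, k):
    """Hill estimator of the tail index alpha from the k largest order
    statistics; moments of order >= alpha diverge for a Pareto-type tail."""
    xs = np.sort(np.asarray(x, float))[::-1]
    logs = np.log(xs[:k]) - np.log(xs[k])
    return 1.0 / logs.mean()

rng = np.random.default_rng(1)
alpha = 1.5                         # finite mean, infinite variance
x = rng.pareto(alpha, 200_000) + 1.0   # exact Pareto(alpha) with x_min = 1
est = hill_tail_index(x, 2000)
```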
149

Climate variability and change impacts on coastal environmental variables in British Columbia, Canada

Abeysirigunawardena, Dilumie Saumedaka 29 April 2010 (has links)
The research presented in this dissertation attempts to determine whether climate variability is critical to sea level changes in coastal British Columbia (BC). To that end, a number of statistical models are proposed to clarify the relationships between five climate variability indices, representing large-scale atmospheric circulation regimes, and sea levels, storm surges, extreme winds, and storm track variability in coastal BC. The findings demonstrate that decadal to interdecadal climatic variability is fundamental to explaining the changing frequency and intensity of extreme atmospheric and oceanic environmental variables in coastal BC. The trends revealed by these analyses suggest that coastal flooding risks will increase in this region during the next few decades, especially if global sea levels continue to rise as predicted. The outcome of this study emphasizes the need to look beyond climatic means when completing climate impact assessments, by clearly showing that climate extremes currently cause the majority of weather-related damage along coastal BC. The findings highlight the need for knowledge of climate variability and change effects at regional to local scales to enable useful adaptation strategies. The major findings of this research resulted in five independent manuscripts: (i) Sea level responses to climatic variability and change in northern BC; published in the Journal of Atmosphere and Oceans (AO 46(3), 277-296). (ii) Extreme sea-level recurrences in the south coast of BC with climate considerations; in review with the Asia Pacific Journal of Climate Change (APJCC). (iii) Extreme sea-surge responses to climate variability in coastal BC; in review with the Annals of the AAG (AN-2009-0098). (iv) Extreme wind regime responses to climate variability and change in the inner south coast of BC; published in the Journal of Atmosphere and Oceans (AO 47(1), 41-62). (v) Sensitivity of winter storm track characteristics in the north-eastern Pacific to climate variability; in review with the Journal of Atmosphere and Oceans (AO (1113)). The findings of this research program made key contributions to the following regional sea-level-rise impact assessment studies in BC: (i) An Examination of the Factors Affecting Relative and Absolute Sea Level in Coastal BC (Thomson et al., 2008); (ii) Coastal Vulnerability to Climate Change and Sea Level Rise, Northeast Graham Island, Haida Gwaii (formerly the Queen Charlotte Islands), BC (Walker et al., 2007); (iii) Storm Surge: Atmospheric Hazards, Canadian Atmospheric Hazards Network - Pacific and Yukon Region, c/o Bill Taylor.
150

Construction of the Intensity-Duration-Frequency (IDF) Curves under Climate Change

2014 December 1900 (has links)
Intensity-Duration-Frequency (IDF) curves are among the standard design tools for various engineering applications, such as storm water management systems. The current practice is to use IDF curves based on historical extreme precipitation quantiles. A warming climate, however, might change the extreme precipitation quantiles represented by the IDF curves, emphasizing the need to update the IDF curves used for the design of urban storm water management systems in different parts of the world, including Canada. This study attempts to construct future IDF curves for Saskatoon, Canada, under possible climate change scenarios. For this purpose, LARS-WG, a stochastic weather generator, is used to spatially downscale the daily precipitation projected by Global Climate Models (GCMs) from coarse grid resolution to the local point scale. The stochastically downscaled daily precipitation realizations are further disaggregated into ensembles of hourly and sub-hourly (as fine as 5-minute) precipitation series, using a disaggregation scheme developed with the K-nearest neighbor (K-NN) technique. This two-stage modeling framework (downscaling to daily resolution, then disaggregating to finer resolutions) is applied to construct the future IDF curves for the city of Saskatoon. The sensitivity of the K-NN disaggregation model to the number of nearest neighbors (i.e., the window size) is evaluated over the baseline period (1961-1990), and the optimal window size is chosen based on performance in reproducing the historical IDF curves. Two optimal window sizes are selected, one each for the K-NN hourly and sub-hourly disaggregation models, appropriate for the hydrological system of Saskatoon.
Using the simulated hourly and sub-hourly precipitation series and the Generalized Extreme Value (GEV) distribution, future changes in the IDF curves and the associated uncertainties are quantified using a large ensemble of projections obtained from the Canadian and British GCMs (CanESM2 and HadGEM2-ES) under three Representative Concentration Pathways (RCP2.6, RCP4.5, and RCP8.5) available from CMIP5, the most recent product of the Intergovernmental Panel on Climate Change (IPCC). The constructed IDF curves are then compared with curves constructed using another method based on a genetic programming (GP) technique. The results show that the sign and magnitude of future variations in extreme precipitation quantiles are sensitive to the selection of GCMs and/or RCPs, and the variations appear to intensify towards the end of the 21st century. Generally, the relative change in precipitation intensities with respect to historical intensities for CMIP5 climate models (e.g., CanESM2: RCP4.5) is smaller than that for CMIP3 climate models (e.g., CGCM3.1: B1), which may be due to the inclusion of climate policies (i.e., adaptation and mitigation) in CMIP5 models. The two-stage downscaling-disaggregation method enables quantification of the uncertainty due to the natural internal variability of precipitation, the choice of GCMs and RCPs, and the downscaling methods. In general, uncertainty in the projections of future extreme precipitation quantiles increases for short durations and long return periods. Both the two-stage method adopted in this study and the GP method reconstruct the historical IDF curves quite successfully during the baseline period (1961-1990), suggesting that these methods can efficiently construct IDF curves at the local scale under future climate scenarios. The most notable precipitation intensification in Saskatoon is projected to occur at shorter storm durations, up to one hour, and longer return periods of more than 25 years.
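The core computation, fitting an extreme-value distribution to annual maxima and reading off return levels, can be sketched with a method-of-moments Gumbel fit (the zero-shape special case of the GEV used in the thesis; the synthetic 30-year record below is an assumption for illustration):

```python
import numpy as np

def gumbel_return_levels(annual_maxima, return_periods):
    """Method-of-moments Gumbel fit to annual maximum intensities; returns
    the intensity exceeded on average once every T years (one IDF point per
    return period, for a fixed storm duration)."""
    x = np.asarray(annual_maxima, float)
    scale = x.std(ddof=1) * np.sqrt(6.0) / np.pi
    loc = x.mean() - 0.5772156649 * scale       # Euler-Mascheroni constant
    T = np.asarray(return_periods, float)
    return loc - scale * np.log(-np.log(1.0 - 1.0 / T))

rng = np.random.default_rng(2)
# synthetic 30-year record of annual-max 1 h rainfall intensity (mm/h),
# drawn from Gumbel(loc=20, scale=5) by inverse-CDF sampling
amax = 20.0 - 5.0 * np.log(-np.log(rng.uniform(size=30)))
levels = gumbel_return_levels(amax, [2, 10, 25, 100])
# repeating this per duration (5 min, 15 min, 1 h, ...) traces the IDF curves
```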
