
Order-statistics-based inferences for censored lifetime data and financial risk analysis

Sheng, Zhuo January 2013 (has links)
This thesis focuses on applying order-statistics-based inferences to lifetime analysis and financial risk measurement. The first problem arises from fitting the Weibull distribution to progressively censored and accelerated life-test data. A new order-statistics-based inference is proposed for both parameter and confidence interval estimation. The second problem can be summarised as adopting the inference used in the first problem for fitting the generalised Pareto distribution, especially when the sample size is small. With some modifications, the proposed inference is compared with classical methods and several relatively new methods that have emerged from the recent literature. The third problem studies a distribution-free approach for forecasting financial volatility, which is essentially the standard deviation of financial returns. Classical models of this approach use the interval between two symmetric extreme quantiles of the return distribution as a proxy for volatility. Two new models are proposed, which use intervals of expected shortfalls and of expectiles instead of intervals of quantiles. The different models are compared on empirical stock-index data. Finally, attention is drawn to heteroskedastic quantile regression. The proposed joint modelling approach, which makes use of the parametric link between quantile regression and the asymmetric Laplace distribution, provides estimates of the regression quantile and of the log-linear heteroskedastic scale simultaneously. Furthermore, the use of the expectation of the check function as a measure of quantile deviation is discussed.
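The quantile-interval volatility proxy and its expected-shortfall counterpart described above can be sketched as follows. This is a minimal illustration on simulated Student-t returns, not the empirical index data used in the thesis; the scaling constants that map such an interval to a standard-deviation estimate, and the expectile variant, are omitted.

```python
import numpy as np

rng = np.random.default_rng(42)
# Simulated daily returns (a stand-in for real stock-index data).
returns = rng.standard_t(df=5, size=2500) * 0.01

def quantile_interval(x, alpha=0.05):
    """Classical proxy: distance between two symmetric extreme quantiles."""
    lo, hi = np.quantile(x, [alpha, 1 - alpha])
    return hi - lo

def expected_shortfall_interval(x, alpha=0.05):
    """Proposed variant: distance between the lower and upper expected
    shortfalls, i.e. the mean return beyond each extreme quantile."""
    lo, hi = np.quantile(x, [alpha, 1 - alpha])
    return x[x >= hi].mean() - x[x <= lo].mean()

q_proxy = quantile_interval(returns)
es_proxy = expected_shortfall_interval(returns)
# The ES interval is wider than the quantile interval, since it
# averages over the tails beyond the quantiles.
print(q_proxy, es_proxy)
```

Both proxies widen as the return distribution's tails grow heavier, which is what makes them usable as volatility measures.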

Sustainable Software Development: Evolving Extreme Programming

Sedano, Todd 01 April 2017 (has links)
Context: Software development is a complex socio-technical endeavor that involves coordinating different disciplines and skill sets. Practitioners experiment with and adopt processes and practices with the goal of making their work more effective. Objective: To observe, describe, and analyze software development processes and practices in an industrial setting. Our goal is to generate a descriptive theory of software engineering development that is rooted in empirical data. Method: Following Constructivist Grounded Theory, we conducted a 2.5-year participant-observation of eight software projects at Pivotal, a software development company. We interviewed 33 software engineers, interaction designers, and product managers, and analyzed one year of retrospection topics. We iterated between data collection, data analysis, and theoretical sampling until achieving theoretical saturation and generating a descriptive theory. Results: 1) This research introduces a descriptive theory of Sustainable Software Development. The theory encompasses principles, policies, and practices aimed at removing knowledge silos and improving code quality, hence leading to development sustainability. 2) At the heart of Sustainable Software Development is team code ownership. This research widens our understanding of team code ownership. Developers achieve higher team code ownership when they understand the system context, have contributed to the code in question, perceive code quality as high, believe the product will satisfy user needs, and perceive high team cohesion. 3) This research introduces the first evidence-based waste taxonomy, identifying eight wastes along with their causes and tensions, and compares it with Lean Software Development's waste taxonomy.
Conclusion: The Sustainable Software Development theory refines and extends our understanding of Extreme Programming by adding principles, policies, and practices (including Overlapping Pair Rotation) and aligning them with the business goal of sustainability. One key aspect of the theory is team code ownership, which is rooted in numerous cognitive, emotional, contextual and technical factors and cannot be achieved simply by policy. Another key dimension is waste identification and elimination, which has led to a new taxonomy of waste. Overall, this research contributes to the field of software engineering by providing new insights, rooted in empirical data, into how a software organization leverages and extends Extreme Programming to achieve software sustainability.

Using Empirical Mode Decomposition to Study Periodicity and Trends in Extreme Precipitation

Pfister, Noah 01 January 2015 (has links)
Classically, we look at annual maximum precipitation series from the perspective of extreme value statistics, which provides a useful statistical distribution but does not allow much flexibility in the context of climate change. Such distributions are usually assumed to be static, or else require some assumed information about possible trends within the data. For this study, we treat the maximum rainfall series as sums of underlying signals, upon which we perform a decomposition technique, Empirical Mode Decomposition. This not only allows the study of non-linear trends in the data, but could also give us some idea of the periodic forces that affect our series. To this end, data were taken from stations in the New England area, from different climatological regions, with the hope of seeing temporal and spatial effects of climate change. Although the results vary among the chosen stations, they show some weak signals, and in many cases a trend-like residual function is identified.
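The classical, static extreme-value baseline that the thesis departs from can be sketched in a few lines: fit a Generalized Extreme Value distribution to an annual-maximum series and read off a return level. The data below are synthetic, not the New England station records, and the EMD sifting procedure itself is not shown.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic 60-year annual-maximum precipitation series (mm), a stand-in
# for a real station record.
annual_max = stats.genextreme.rvs(c=-0.1, loc=50, scale=10,
                                  size=60, random_state=rng)

# Classical static fit: a single GEV distribution for the whole record,
# with no allowance for trends or periodic components.
shape, loc, scale = stats.genextreme.fit(annual_max)

# 100-year return level implied by the static fit.
return_level_100 = stats.genextreme.ppf(1 - 1 / 100, shape,
                                        loc=loc, scale=scale)
print(return_level_100)
```

The limitation the abstract points at is visible here: the fitted parameters are constants, so any non-linear trend or periodicity in the series has to be handled outside the distribution, which is where the EMD approach comes in.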

Analysis and processing of mechanically stimulated electrical signals for the identification of deformation in brittle materials

Kyriazis, Panagiotis A. January 2010 (has links)
The fracture of brittle materials is of utmost importance for civil engineering and seismology applications. A different approach towards the aim of early identification of fracture and the prediction of failure before it occurs is attempted in this work. Laboratory experiments were conducted on a variety of rock and cement-based material specimens of various shapes and sizes. The applied loading schemes were cyclic or increasing, and the specimens were subjected to compression and bending loading of various levels. The techniques of Pressure Stimulated Current and Bending Stimulated Current were used for the detection of electric signal emissions during the various deformation stages of the specimens. The detected signals were analysed macroscopically and microscopically so as to find suitable criteria for fracture prediction and correlations between the electrical and mechanical parameters. The macroscopic proportionality of the mechanically stimulated electric signal and the strain was experimentally verified, the macroscopic trends of the PSC and BSC electric signals were modelled, and the effects of material memory on the electric signals were examined. The current of a time-varying RLC electric circuit was tested against experimental data with satisfactory results, and it was proposed as an electrical equivalent model. Wavelet-based analysis of the signal revealed the correlation between the frequency components of the electric signal and the deformation stages of the material samples. In particular, the increase of the high-frequency component of the electric signal appears to be a good precursor of the macrocracking initiation point. The additional electric stimulus of a DC voltage application seems to boost the frequency content of the signal and better reveals the stages of the cracking process. The microscopic analysis method is scale-free and can thus cope with the problems of size effects and material-property effects.
The AC conductivity time series of fractured and pristine specimens were also analysed by means of wavelet transform and the spectral analysis was used to differentiate between the specimens. A non-destructive technique may be based on these results. Analysis has shown that the electric signal perturbation is an indicator of the forthcoming fracture, as well as of the fracture that has already occurred in specimens.
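The RLC electrical-equivalent idea mentioned above can be illustrated with a basic simulation. This sketch uses a constant-coefficient series RLC circuit with a step voltage input; the thesis proposes a time-varying RLC model, and the component values here are purely hypothetical.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Series RLC driven by a step voltage: L q'' + R q' + q/C = V.
# Component values are illustrative only, not taken from the thesis.
L, R, C, V = 1.0, 0.5, 1.0, 1.0  # henry, ohm, farad, volt

def rlc(t, y):
    q, i = y  # capacitor charge and loop current
    return [i, (V - R * i - q / C) / L]

sol = solve_ivp(rlc, (0.0, 30.0), [0.0, 0.0], max_step=0.05)
current = sol.y[1]
# With these values the circuit is underdamped: the current rises,
# oscillates, and decays toward zero as the capacitor charges.
print(current.max(), abs(current[-1]))
```

A time-varying version, as proposed in the thesis, would replace the constants `R`, `L`, `C` with functions of `t` inside `rlc`, letting the transient shape track the evolving state of the deforming specimen.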

La mise en discours et la mise en marché de la chirurgie esthétique / The discursive construction and marketing of cosmetic surgery

Mercier, Élisabeth January 2007 (has links)
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.

FITTING A DISTRIBUTION TO CATASTROPHIC EVENTS

Osei, Ebenezer 15 December 2010 (has links)
Statistics is a branch of mathematics that is heavily employed in the area of Actuarial Mathematics. This thesis first reviews the importance of statistical distributions in the analysis of insurance problems and the applications of statistics in the area of risk and insurance. The Normal, Log-normal, Pareto, Gamma, standard Beta, Fréchet, Gumbel, Weibull, Poisson, binomial, and negative binomial distributions are examined, and the importance of these distributions in general insurance is emphasized. A careful review of the literature provides practitioners in the general insurance industry with statistical tools of immediate application, including estimation methods and fit statistics popular in the industry. Finally, this thesis carries out the task of fitting statistical distributions to flood loss data for the 50 states of the United States.
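The distribution-fitting exercise described above can be sketched with scipy: fit several candidate loss distributions by maximum likelihood and rank them by a fit statistic. The losses below are synthetic (drawn from a lognormal, so the lognormal fit should win), not the actual flood-loss data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic positive loss amounts; a stand-in for flood-loss data.
losses = stats.lognorm.rvs(s=1.0, scale=np.exp(10), size=500,
                           random_state=rng)

candidates = {
    "lognorm": stats.lognorm,
    "gamma": stats.gamma,
    "weibull_min": stats.weibull_min,
    "pareto": stats.pareto,
}

results = {}
for name, dist in candidates.items():
    params = dist.fit(losses, floc=0)          # losses start at zero
    ks = stats.kstest(losses, name, args=params).statistic
    loglik = np.sum(dist.logpdf(losses, *params))
    k = len(params) - 1                        # location was fixed
    results[name] = {"ks": ks, "aic": 2 * k - 2 * loglik}

best = min(results, key=lambda n: results[n]["ks"])
print(best, results[best])
```

In practice one would also inspect Q-Q plots and tail behaviour, since catastrophic losses are dominated by the tail, where goodness-of-fit statistics computed over the whole sample are least informative.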

Právo na život a trest smrti / The right to life and capital punishment

Caletka, Lumír January 2011 (has links)
Resumé: The death penalty, also called capital punishment or the extreme penalty, is a punishment that assumes the killing (execution) of a condemned person for a criminal offence, provided it is possible to impose this sentence in accordance with the criminal law in force. In former times such crimes were called capital or hanging crimes. With regard to its definitiveness (the impossibility of any redress after its execution), it is a very controversial punishment. Its supporters argue that it is the sole just sentence for homicide and other felonies, that it excludes recidivism, and that it has significant deterrent (preventive) effects. On the other hand, the opponents put forward not only moral arguments concerning the right of each person to life but also point to the danger of judicial error, which is incorrigible in this case. Furthermore, the objectors note that the death sentence is not so intimidating, and that the costs of an execution tend to exceed those of life imprisonment. Since the 1970s capital punishment has been gradually abolished in the majority of European and South American countries. Some states of the USA and the People's Republic of China are among the countries that still practise the death penalty. I deal with the capital sentence in the first part...

Measuring Extremes: Empirical Application on European Markets

Öztürk, Durmuş January 2015 (has links)
This study employs Extreme Value Theory and several univariate methods to compare their Value-at-Risk and Expected Shortfall predictive performance. We conduct several out-of-sample backtesting procedures, such as unconditional coverage, independence, and conditional coverage tests. The dataset includes five different stock markets: PX50 (Prague, Czech Republic), BIST100 (Istanbul, Turkey), ATHEX (Athens, Greece), PSI20 (Lisbon, Portugal) and IBEX35 (Madrid, Spain). These markets have different financial histories, and the data span over twenty years. We analyze the global financial crisis period separately to inspect the performance of these methods during a high-volatility period. Our results support the common finding that Extreme Value Theory is one of the most appropriate risk measurement tools. In addition, we find that the GARCH family of methods, after accounting for asymmetry and the fat-tail phenomenon, can be equally useful and sometimes even better than the Extreme Value Theory based method in terms of risk estimation.
Keywords: Extreme Value Theory, Value-at-Risk, Expected Shortfall, Out-of-Sample Backtesting
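One of the backtests named above, the unconditional coverage (Kupiec) test, can be sketched briefly: count VaR violations and test whether their rate matches the nominal level. The returns below are synthetic Student-t draws, not the five indices studied, and the VaR forecast is plain historical simulation rather than the EVT or GARCH models compared in the thesis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
returns = rng.standard_t(df=4, size=2000) * 0.01  # synthetic index returns

alpha = 0.01     # 1% VaR level
window = 500     # rolling estimation window
violations = []
for t in range(window, len(returns)):
    var_t = np.quantile(returns[t - window:t], alpha)  # historical VaR
    violations.append(returns[t] < var_t)
violations = np.array(violations)

# Kupiec unconditional-coverage likelihood-ratio test: under a correct
# model, the violation rate pi_hat should equal alpha.
n, x = len(violations), int(violations.sum())
pi_hat = x / n
lr_uc = -2 * (
    (n - x) * np.log(1 - alpha) + x * np.log(alpha)
    - ((n - x) * np.log(1 - pi_hat) + x * np.log(pi_hat))
)
p_value = stats.chi2.sf(lr_uc, df=1)
print(x, n, p_value)
```

The independence and conditional coverage tests extend this by also checking that violations do not cluster in time, which matters precisely in crisis periods like the one the study analyzes separately.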

Krajní formulace ve výzkumném interview / Extreme Case Formulations in the Research Interview

Zaepernicková, Eliška January 2016 (has links)
The following thesis analyses the form and importance of extreme case formulations based on more than 10 research interviews. Extreme case formulations, i.e. formulations comprising expressions such as "nothing", "everything" or "no one", which present a phenomenon or a circumstance according to its minimal or maximal properties, can constitute a complication in the analysis of research interviews. This thesis expands the existing account of extreme case formulations by adding new findings, for example the bipolarity of extreme case formulations. In the analysed data, these often occur together with various classifiers, which either weaken (softeners) or strengthen (intensifiers) their extreme force. At the same time the thesis broadens the research on extreme case formulations through an interdisciplinary approach that draws on findings from psychology and rhetoric. The research is further complemented by an acoustic analysis of semantically extreme formulations, whose results show that extreme case formulations often tend to be acoustically emphasized. Keywords: extreme case formulation, conversation analysis, discourse analysis, interview, Czech

Voliči české krajní pravice po r. 1990 z geografické perspektivy / Electorate of the Czech Extreme Right since 1990 in the Geographic Perspective

Fráně, Luděk January 2013 (has links)
This diploma thesis deals with a political-geographical analysis of the Czech extreme right, its voters, and its electoral support in the period after 1990. The paper aims to evaluate the development of the Czech extreme right in the period 1990-2010 in terms of its importance in the Czech political system, and especially from the perspective of the regional differentiation of its electoral support, and to identify possible factors that lead voters to support this political stream. The objective is to answer the question of whether there are significant regional differences in the support of right-wing extremist parties and, if so, whether these differences are stable over time. Monitoring a longer period (1990-2010) makes it possible to determine whether the regions with above-average support for far-right parties are stable, or whether the extraordinary support for these parties varies regionally over the reported period, with some shifts occurring, for example, due to the socio-economic changes of the transformation period. The thesis also examines the factors that could explain the regional differences in the electoral support of the far right. The aim here is to use socio-demographic and economic characteristics expressing the structure of the population as well as the influence of the local...
