511

CARBON PRICING IN KOREA: EMPIRICAL STUDIES ON THE BUSINESS PERSPECTIVES / 韓国における炭素価格付け政策:産業観点についての実証研究

Suk, Sunhee 23 January 2018 (has links)
Kyoto University / 0048 / New degree system, doctoral program / Doctor of Economics / Kou No. 20785 / Keihaku No. 557 / 新制||経||282 (University Library) / Department of Economics, Graduate School of Economics, Kyoto University / (Chief examiner) Professor Toru Morotomi, Professor Deqiang Liu, Professor Takanori Ida / Qualifies under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Economics / Kyoto University / DFAM
512

Empirical Analysis of Joint Quantile and Expected Shortfall Regression Backtests

Ågren, Viktor January 2023 (has links)
In this work, we examine the practical applicability of three joint quantile and expected shortfall regression backtests. The strict, auxiliary, and intercept ESR backtests are applied to the historical log returns of the OMX Stockholm 30 market-weighted price index. We estimate the conditional variance using GARCH models for various rolling window lengths and refitting frequencies. We are particularly interested in the rejection rates of the one-sided intercept ESR backtest, as it is comparable to current standard backtests. The one-sided test is found to perform well when the conditional variance is estimated by the GARCH(1,1), GJR-GARCH(1,1), or EGARCH(1,1) model coupled with Student's t innovation residuals and a rolling window size of 1000 days.
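The forecasting step that feeds such backtests can be sketched briefly. The snippet below is a minimal illustration, assuming the `arch` Python package and a hypothetical `log_returns` series (neither taken from the thesis): it fits a GARCH(1,1) model with Student's t innovations on a 1000-day window and produces the one-day-ahead VaR and ES forecasts that joint quantile/ES regression backtests consume. It is a sketch of the general technique, not the author's code.

```python
import numpy as np
from scipy import stats
from arch import arch_model

ALPHA = 0.025   # tail probability for VaR/ES
WINDOW = 1000   # rolling estimation window, in trading days

def one_day_var_es(log_returns):
    """One-day-ahead VaR and ES (as return levels) from a GARCH(1,1)-t fit."""
    # Fit GARCH(1,1) with standardized Student's t innovations.
    res = arch_model(log_returns, vol="GARCH", p=1, q=1, dist="t").fit(disp="off")
    fc = res.forecast(horizon=1, reindex=False)
    mu = fc.mean.values[-1, 0]
    sigma = np.sqrt(fc.variance.values[-1, 0])
    nu = res.params["nu"]

    # Quantile and tail mean of a unit-variance Student's t innovation.
    t_q = stats.t.ppf(ALPHA, nu)
    scale = np.sqrt((nu - 2.0) / nu)
    z_q = scale * t_q
    z_es = -scale * stats.t.pdf(t_q, nu) * (nu + t_q**2) / ((nu - 1.0) * ALPHA)

    return mu + sigma * z_q, mu + sigma * z_es   # (VaR, ES) as return levels

# Rolling forecasts, refitting on every window (sparser refitting frequencies
# would simply reuse the fitted parameters for several days):
# forecasts = [one_day_var_es(log_returns[t - WINDOW:t])
#              for t in range(WINDOW, len(log_returns))]
```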
513

Assessment of Remotely Sensed Image Processing Techniques for Unmanned Aerial System (UAS) Applications

Zarzar, Christopher Michael 11 August 2017 (has links)
Unmanned Aerial Systems (UASs) offer a new era of local-scale environmental monitoring where access to invaluable aerial data no longer comes at a substantial cost. This provides the opportunity to vastly expand the ability to detect natural hazards impacts, observe environmental conditions, quantify restoration efforts, track species propagation, monitor land surface changes, cross-validate existing platforms, and identify hazardous situations. While UASs have the potential to accelerate understanding of natural processes, much of the research using UASs has applied current remote sensing image processing techniques without questioning the validity of these in UAS applications. With new scientific tools comes a need to affirm that previous techniques are still valid for the new systems. To this end, the objective of the current study is to provide an assessment regarding the use of current remote sensing image processing techniques in UAS applications. The research reported herein finds that atmospheric effects have a statistically significant impact on low altitude UAS imagery. Correcting for these external factors affecting the imagery was successful using an empirical line calibration (ELC) image correction technique and required little modification for use in a complex UAS application. Finally, it was found that classification performance of UAS imagery was reliant on training sample size more than classification technique, and that training sample size requirements are larger than previous remote sensing studies suggest.
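As a concrete illustration of the empirical line calibration (ELC) technique the abstract refers to, the sketch below fits a per-band linear relation between the raw digital numbers of ground calibration targets and their known reflectance, then applies the fitted gain and offset to the whole band. The target values and array shapes are placeholders, not data from the study.

```python
import numpy as np

def elc_fit(target_dn, target_reflectance):
    """Return (gain, offset) from a least-squares line through the targets."""
    gain, offset = np.polyfit(target_dn, target_reflectance, deg=1)
    return gain, offset

def elc_apply(band, gain, offset):
    """Convert a raw image band (2-D DN array) to estimated surface reflectance."""
    return gain * band.astype(float) + offset

# Example with two hypothetical targets (dark and bright panels) in one band.
target_dn = np.array([41.0, 213.0])          # mean DN over each panel
target_reflectance = np.array([0.05, 0.48])  # lab-measured panel reflectance
gain, offset = elc_fit(target_dn, target_reflectance)

raw_band = np.random.randint(0, 256, size=(100, 100))  # stand-in UAS image band
reflectance_band = np.clip(elc_apply(raw_band, gain, offset), 0.0, 1.0)
```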
514

Testing for the Existence of Distribution Effects in the Aggregate Consumption Function

Tahir, Sayyid 01 1900 (has links)
This thesis addresses a long-standing puzzle in empirical econometrics: does the size distribution of income matter in the aggregate consumption function? Current opinion on whether distribution matters is divided. There is also a lack of consensus, among those who believe distribution effects exist, on the nature of such effects; that is, whether a decrease or an increase in income inequality is needed to stimulate aggregate demand. In this thesis, the existing tests are challenged on the grounds that they do not properly take into account the link between the existence of distribution effects and the variability of the marginal, not the average, propensity to consume with respect to the income level. This link is accounted for, however, if one tests for the linearity (in income) of the micro relation underlying one's aggregate consumption function. Rejection of the linearity hypothesis establishes the existence of distribution effects. Ex post, if the nonlinear relation is such that the marginal propensity to consume declines with income, it also follows that an equalization of the income distribution produces greater aggregate consumption. The theoretical contribution of this thesis lies in the clarification of these issues.

On the empirical side, this thesis cautions against casual use of the term "distribution effects". In the current income-current expenditure framework of the Keynesians, it refers to "the effect of a redistribution of real disposable income" on aggregate real consumers' expenditure. In the Permanent Income Hypothesis framework, however, it could mean either "the effect of a redistribution of real disposable income" or "the effect of a redistribution of real permanent income" on aggregate real consumption. In this thesis, the distributions of real disposable income and real permanent income are alternatively assumed to follow the lognormal density, and two conclusions are determined empirically.
I. The distribution of real disposable income matters in the current income-current expenditure framework; this result is statistically significant at the 10% level after the correction for serial correlation and simultaneity bias. In particular, the estimates indicate that the marginal propensity to consume declines with the level of real disposable income and, hence, a decrease in inequality would stimulate aggregate demand.
II. The elasticity of consumption out of real permanent income is unity; therefore, the distribution of real permanent income does not matter in the Permanent Income Hypothesis framework. This result is statistically significant at all conventional levels of significance, both before and after the correction for serial correlation.

Both findings are based on aggregative time-series data for Canada. The consumer unit in this thesis is an individual income recipient, and the data period is 1947-1976. Maximum-likelihood procedures have been used in the estimation, with proper allowance for across-parameter constraints. Where a correction for serial correlation is made, the autocorrelation coefficient is constrained to the open interval (-1, +1). The results are also double-checked by examining many avenues that might affect the nature of the outcomes.

Another contribution of this study is the compilation of data on the distribution of pre-tax personal income (in current dollars) in Canada under the lognormality hypothesis. The parameters of this distribution are determined using the minimum chi-square method. Estimates of the variance (of logarithms of income) parameter show a slight increase in income inequality over the period 1946 to 1976. The data on this parameter are used to approximate the variance of logarithms for the distribution of real disposable income (in establishing result I) and of real permanent income (in establishing result II). / Thesis / Doctor of Philosophy (PhD)
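To see why linearity of the micro relation governs whether distribution matters, consider a constant-elasticity illustration (an assumed functional form for this note, not necessarily the thesis's specification) under the lognormality hypothesis used above:

```latex
% Micro relation c = \alpha y^{\beta}, income lognormal: \ln y \sim N(\mu, \sigma^2).
\begin{align*}
  E[c] &= \alpha\, e^{\beta\mu + \tfrac{1}{2}\beta^{2}\sigma^{2}}, \qquad
  E[y]  = e^{\mu + \tfrac{1}{2}\sigma^{2}}. \\
\intertext{Holding mean income $\bar{y}=E[y]$ fixed, so that $\mu = \ln\bar{y} - \tfrac{1}{2}\sigma^{2}$,}
  E[c] &= \alpha\, \bar{y}^{\,\beta}\, e^{\tfrac{1}{2}\beta(\beta-1)\sigma^{2}}.
\end{align*}
% If \beta = 1 (unit elasticity), E[c] does not depend on \sigma^2 and the income
% distribution is irrelevant, as in result II. If \beta < 1 (marginal propensity to
% consume declining with income), the exponent is negative, so a fall in inequality
% (a smaller \sigma^2) raises aggregate consumption, as in result I.
```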
515

Surviving the Surge: Real-time Analytics in the Emergency Department

Rea, David J. 05 October 2021 (has links)
No description available.
516

Semi-Empirical Lifetimes for High-Energy Rydberg States of ¹³³Cs Neutral Cesium in a Blackbody Radiation Field

Truxon, James M. 01 August 2013 (has links)
No description available.
517

The Outcomes of Just War: An Empirical Study of the Outcomes Associated with Adherence to Just War Theory, 1960-2000

Kauffman, Rudi D. January 2012 (has links)
No description available.
518

Laboratory Resilient Modulus Measurements of Aggregate Base Materials in Utah

Jackson, Kirk David 01 December 2015 (has links) (PDF)
The Utah Department of Transportation (UDOT) has fully implemented the Mechanistic-Empirical Pavement Design Guide for pavement design but has been using primarily level-three design inputs obtained from correlations to aggregate base materials developed at the national level. UDOT was interested in investigating correlations between laboratory measurements of resilient modulus, California bearing ratio (CBR), and other material properties specific to base materials commonly used in Utah; therefore, a statewide testing program was needed. The objectives of this research were to 1) determine the resilient modulus of several representative aggregate base materials in Utah and 2) investigate correlations between laboratory measurements of resilient modulus, CBR, and other properties of the tested materials. Two aggregate base materials were obtained from each of the four UDOT regions. Important material properties, including particle-size distribution, soil classification, and the moisture-density relationship, were investigated for each of the sampled aggregate base materials. The CBR and resilient modulus of each aggregate base material were determined in general accordance with American Society for Testing and Materials D1883 and American Association of State Highway and Transportation Officials T 307, respectively. After all of the data were collected, several existing models were evaluated to determine if one or more of them could be used to predict the resilient modulus values measured in this research. Statistical analyses were also performed to investigate correlations between measurements of resilient modulus, CBR, and other properties of the tested aggregate base materials, mainly including aspects of the particle-size distributions and moisture-density relationships. A set of independent predictor variables was analyzed using both stepwise regression and best subset analysis to develop a model for predicting resilient modulus. After a suitable model was developed, it was analyzed to determine the sensitivity of the model coefficients to the individual data points. For the aggregate base materials tested in this research, the average resilient modulus varied from 16.0 to 25.6 ksi. Regarding the correlation between resilient modulus and CBR, the test results show that resilient modulus and CBR are not correlated for the materials tested in this research. Therefore, a new model was developed to predict the resilient modulus based on the percent passing the No. 200 sieve, particle diameter corresponding to 30 percent finer, optimum moisture content, maximum dry density (MDD), and ratio of dry density to MDD. Although the equation may not be applicable for values outside the ranges of the predictor variables used to develop it, it is expected to provide UDOT with reasonable estimates of resilient modulus values for aggregate base materials similar to those tested in this research.
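The general shape of the best-subset analysis described above can be sketched as follows. The predictor names follow the abstract (percent passing the No. 200 sieve, D30, optimum moisture content, MDD, and the dry-density ratio), but the data values are placeholders rather than UDOT measurements, and the thesis's actual model and coefficients are not reproduced.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 8  # eight sampled aggregate base materials (placeholder data)
X_full = np.column_stack([
    rng.uniform(4, 12, n),      # P200: percent passing the No. 200 sieve
    rng.uniform(0.3, 2.0, n),   # D30: particle diameter at 30 percent finer, mm
    rng.uniform(5, 9, n),       # OMC: optimum moisture content, percent
    rng.uniform(130, 145, n),   # MDD: maximum dry density, pcf
    rng.uniform(0.92, 1.0, n),  # ratio of dry density to MDD
])
names = ["P200", "D30", "OMC", "MDD", "DD/MDD"]
y = rng.uniform(16.0, 25.6, n)  # resilient modulus, ksi (placeholder values)

def adjusted_r2(X, y):
    """Fit OLS with an intercept and return the adjusted R^2."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    k = X.shape[1]
    return 1 - (ss_res / (len(y) - k - 1)) / (ss_tot / (len(y) - 1))

# Best-subset search: evaluate every combination of candidate predictors and
# keep the subset with the highest adjusted R^2.
best = max(
    (combo for r in range(1, len(names) + 1)
     for combo in itertools.combinations(range(len(names)), r)),
    key=lambda combo: adjusted_r2(X_full[:, combo], y),
)
print("best subset:", [names[i] for i in best])
```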
519

Structuring Empirical Methods for Reuse and Efficiency in Product Development Processes

Bare, Marshall Edwin 21 December 2006 (has links) (PDF)
Product development requires that engineers have the ability to predict product performance. When product performance involves complex physics and natural phenomena, mathematical models are often insufficient to provide accurate predictions. Engineering companies compensate for this deficiency by testing prototypes to obtain empirical data that can be used in place of predictive models. The purpose of this work is to provide techniques and methods for efficient use of empirical methods in product development processes. Empirical methods involve the design and creation of prototype hardware and the testing of that hardware in controlled environments. Empirical methods represent a complete product development sub-cycle within the overall product development process. Empirical product development cycles can be expensive in both time and resources. Global economic pressures have caused companies to focus on improving the productivity of their product development cycles. A variety of techniques for improving the productivity of product development processes have been developed. These methods focus on structuring process steps and product artifacts for reuse and efficiency. However, these methods have, to this point, largely ignored the product development sub-cycle of empirical design. The same techniques used on the overall product development process can and should be applied to the empirical product development sub-cycle. This thesis focuses on applying methods of efficient and reusable product development processes to the empirical development sub-cycle. It also identifies how to efficiently link the empirical product development sub-cycle into the overall product development process. Specifically, empirical product development sub-cycles can be characterized by their purpose into three specific types: first, obtaining data for predictive model coefficients, boundary conditions, and driving functions; second, validating an existing predictive model; and third, providing the basis for predictions using interpolation and extrapolation of the empirical data when a predictive model does not exist. These three types of sub-cycles are structured as reusable processes in a standard form that can be used generally in product development. The roles of these three types of sub-cycles in the overall product development process are also established and the linkages defined. Finally, the techniques and methods provided for improving the efficiency of empirical methods in product development processes are demonstrated in a form that shows their benefits.
520

Observational Studies of Software Engineering Using Data from Software Repositories

Delorey, Daniel Pierce 06 March 2007 (has links) (PDF)
Data for empirical studies of software engineering can be difficult to obtain. Extrapolations from small controlled experiments to large development environments are tenuous, and observation tends to change the behavior of the subjects. In this thesis we propose the use of data gathered from software repositories in observational studies of software engineering. We present tools we have developed to extract data from CVS repositories and the SourceForge Research Archive. We use these tools to gather data from 9,999 Open Source projects. By analyzing these data we are able to provide insights into the structure of Open Source projects. For example, we find that the vast majority of the projects studied have never had more than three contributors and that the vast majority of authors studied have never contributed to more than one project. However, there are projects that have had up to 120 contributors in a single year and authors who have contributed to more than 20 projects, which raises interesting questions about team dynamics in the Open Source community. We also use these data to empirically test the belief that productivity is constant in terms of lines of code per programmer per year regardless of the programming language used. We find that yearly programmer productivity is not constant across programming languages, but rather that developers using higher-level languages tend to write fewer lines of code per year than those using lower-level languages.
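The aggregations behind such findings can be sketched as simple group-by computations over a commit table. The example below is illustrative only; the column names and values are hypothetical stand-ins for the CVS/SourceForge data described in the thesis.

```python
import pandas as pd

# One row per commit: project, author, year, language, and lines added.
commits = pd.DataFrame({
    "project":     ["p1", "p1", "p1", "p2", "p2", "p3"],
    "author":      ["alice", "bob", "alice", "alice", "carol", "dave"],
    "year":        [2003, 2003, 2004, 2004, 2004, 2005],
    "language":    ["C", "C", "C", "Python", "Python", "Java"],
    "lines_added": [120, 300, 80, 50, 400, 220],
})

# (a) How many distinct contributors has each project ever had?
contributors_per_project = commits.groupby("project")["author"].nunique()

# (b) Yearly lines of code per author, then averaged within each language.
loc_per_author_year = (
    commits.groupby(["language", "author", "year"])["lines_added"].sum()
)
mean_yearly_loc_by_language = loc_per_author_year.groupby("language").mean()

print(contributors_per_project)
print(mean_yearly_loc_by_language)
```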
