261

Decisions of capital structure in the presence of agency and collusive monopsony

Wallace, Gerald Leon January 2012 (has links)
The United States acute care hospital (ACH) market provides a unique environment in which to examine questions about market structure and performance. ACHs operate in a mature, highly regulated market for health services with one dominant primary consumer of services. This uncharacteristic industry structure offers the opportunity to analyze pervasive agency relationships and capital structure issues in a new setting. In addition, the policies of the U.S. Government have created an environment in which tacit collusion is likely to flourish, leading to market buyer power (monopsony, or buyers acting as one monopoly buyer). A key question is the extent to which monopsony and agency affect capital structure decisions. Agency is defined by Ross (1973, p. 134) as a relationship formed between a principal and their agents, “when one, designated as the agent, acts for, on behalf of, or as representative for the other, designated the principal, in a particular domain of decision problems.” This thesis extends the agency framework provided by Jensen and Meckling (1976), along with the econometric understanding of monopsony in healthcare via tacit collusion, as suggested by Pauly (1998) and Sevilla (2005), and the research constraints of monopsony under an all-or-nothing contract, as outlined by Taylor (2003). Using data on approximately 5,000 ACHs over the period 1995 to 2007, derived from the Medicare Cost Report, together with medical payment data for a sub-population of 1,500, this research examines the determinants of capital structure in a distorted market. Building upon this initial analysis, the research examines the effects of market distortions upon free cash flow and, ultimately, capital structure. Two theories of distortion affecting free cash flow are presented: the first is the agency cost of free cash flow and signaling, and the second is a theory of monopsony via tacit collusion between buyers. A model of the agency relationship between ACHs and the U.S. Government is proposed, positing agency costs (signaling and the agency cost of free cash flows) as causally related to free cash flows and capital structure (Jensen & Meckling 1976; Jensen 1986). Empirical models of agency are constructed, examining ACHs' dependence on government business and its relation to leverage (signaling) and free cash flows (agency cost of free cash flows). In addition, a complementary theory of capital structure determination via market power (monopsony) is formulated, suggesting that monopsony conditions within the ACH market affect free cash flows and capital structure. The analysis provides a framework for understanding the environments in which ACHs operate and the strength of bargaining within the market. The research concludes with a review of the determinants of capital structure in light of the inefficiencies and distortions of the industry and the relationships observed.
262

It's Not Black and White: An Empirical Study of the 2015-2016 U.S. College Protests

Kelleher, Kaitlyn Anne 01 January 2017 (has links)
Beginning in October 2015, student protests erupted at many U.S. colleges and universities. This wave of demonstrations prompted an ongoing national debate over the following question: what caused this activism? Leveraging existing theoretical explanations, this paper attempts to answer this question through an empirical study of the 73 most prominent college protests from October 2015 to April 2016. I use an original data set with information collected from U.S. News and World Report to determine what factors at these 73 schools were most predictive of the protests. My findings strongly suggest that the probability of a protest increases at larger, more selective institutions. I also find evidence against the dominant argument that the marginalization of minority students exclusively caused this activism. Using my empirical results, this paper presents a new theoretical explanation for the 2015-2016 protests. I argue that racial tensions sparked the first demonstration. However, as the protests spread to other campuses, they were driven less by racial grievances and more by a pervasive culture of political correctness. This paper concludes by applying this new theoretical framework to the budding wave of 2017 protests.
263

Effective ERP adoption processes: the role of project activators and resource investments

Bernroider, Edward January 2013 (has links) (PDF)
The aim of this paper is to demonstrate whether stakeholders activating a project shape team building, the structure and magnitude of resource investment levels, and to what extent these levels impact ERP project effectiveness. The process view of an ERP project includes project initiation, system justification and funding, implementation, and early system use. Results from a nationwide empirical survey conducted in Austria (N = 88) show that activating actors influence team formation and resource investments, which impact project effectiveness levels. Resource-intensive justification and funding phases tend to precede resource-intensive implementations in heavy-weight projects, which seem to be less effective than light-weight projects. Resource and change conflicts are associated with lower project performance and are more common in resource-intensive ERP projects, where early system use appears to be relatively less stable. (author's abstract)
264

Objective Climatological Analysis of Extreme Weather Events in Arizona during the North American Monsoon

Mazon, Jeremy J., Castro, Christopher L., Adams, David K., Chang, Hsin-I, Carrillo, Carlos M., Brost, John J. 11 1900 (has links)
Almost one-half of the annual precipitation in the southwestern United States occurs during the North American monsoon (NAM). Given favorable synoptic-scale conditions, organized monsoon thunderstorms may affect relatively large geographic areas. Through an objective analysis of atmospheric reanalysis and observational data, the dominant synoptic patterns associated with NAM extreme events are determined for the period from 1993 to 2010. Thermodynamically favorable extreme-weather-event days are selected on the basis of atmospheric instability and precipitable water vapor from Tucson, Arizona, rawinsonde data. The atmospheric circulation patterns at 500 hPa associated with the extreme events are objectively characterized using principal component analysis. The first two dominant modes of 500-hPa geopotential-height anomalies of the severe-weather-event days correspond to type-I and type-II severe-weather-event patterns previously subjectively identified by Maddox et al. These patterns reflect a positioning of the monsoon ridge to the north and east or north and west, respectively, from its position in the "Four Corners" region during the period of the climatological maximum of monsoon precipitation from mid-July to mid-August. An hourly radar gauge precipitation product shows evidence of organized, westward-propagating convection in Arizona during the type-I and type-II severe weather events. This new methodological approach for objectively identifying severe weather events may be easily adapted to inform operational forecasting or analysis of gridded climate data.
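As an illustration of the kind of objective pattern classification described above, the sketch below applies principal component analysis to a stack of 500-hPa geopotential-height anomaly maps for event days. It is a minimal example using scikit-learn on a synthetic anomaly array; the array shape, variable names, and the omission of latitude weighting are assumptions for illustration, not the authors' actual reanalysis workflow.

```python
import numpy as np
from sklearn.decomposition import PCA

def dominant_modes(z500_anom, n_modes=2):
    """Return the leading spatial modes (EOFs) and their time series (PCs).

    z500_anom: array of shape (n_days, n_lat, n_lon) holding 500-hPa
    geopotential-height anomalies (m) for the selected extreme-event days.
    """
    n_days, n_lat, n_lon = z500_anom.shape
    X = z500_anom.reshape(n_days, n_lat * n_lon)             # one row per event day
    pca = PCA(n_components=n_modes)
    pcs = pca.fit_transform(X)                               # principal-component series
    eofs = pca.components_.reshape(n_modes, n_lat, n_lon)    # spatial patterns
    return eofs, pcs, pca.explained_variance_ratio_

# Synthetic anomalies standing in for reanalysis data on a 20 x 30 grid
rng = np.random.default_rng(0)
fake_anomalies = rng.normal(size=(200, 20, 30))
eofs, pcs, variance = dominant_modes(fake_anomalies)
print("variance explained by the two leading modes:", variance)
```

In an analysis of the kind summarized above, the two leading spatial modes would then be compared against the subjectively derived type-I and type-II severe-weather patterns.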
265

Determination and applications of rock quality designation (RQD)

Zhang, Lianyang 06 1900 (has links)
Characterization of rock masses and evaluation of their mechanical properties are important and challenging tasks in rock mechanics and rock engineering. Since in many cases rock quality designation (RQD) is the only rock mass classification index available, this paper outlines the key aspects on determination of RQD and evaluates the empirical methods based on RQD for determining the deformation modulus and unconfined compressive strength of rock masses. First, various methods for determining RQD are presented and the effects of different factors on determination of RQD are highlighted. Then, the empirical methods based on RQD for determining the deformation modulus and unconfined compressive strength of rock masses are briefly reviewed. Finally, the empirical methods based on RQD are used to determine the deformation modulus and unconfined compressive strength of rock masses at five different sites including 13 cases, and the results are compared with those obtained by other empirical methods based on rock mass classification indices such as rock mass rating (RMR), Q-system (Q) and geological strength index (GSI). It is shown that the empirical methods based on RQD tend to give deformation modulus values close to the lower bound (conservative) and unconfined compressive strength values in the middle of the corresponding values from different empirical methods based on RMR, Q and GSI. The empirical methods based on RQD provide a convenient way for estimating the mechanical properties of rock masses but, whenever possible, they should be used together with other empirical methods based on RMR, Q and GSI. (C) 2016 Institute of Rock and Soil Mechanics, Chinese Academy of Sciences. Production and hosting by Elsevier B.V.
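For readers unfamiliar with the index, the sketch below shows the textbook RQD calculation (the percentage of a core run made up of intact pieces at least 10 cm long) together with one widely quoted RQD-only fit for the rock-mass to intact-rock modulus ratio, of the kind the paper reviews. The piece lengths are invented for illustration, and the modulus relation is drawn from the broader RQD literature rather than from this paper, so treat its constants as indicative only.

```python
def rqd(piece_lengths_mm, run_length_mm, threshold_mm=100.0):
    """Rock quality designation (%): share of the core run in intact pieces >= 10 cm."""
    sound_length = sum(p for p in piece_lengths_mm if p >= threshold_mm)
    return 100.0 * sound_length / run_length_mm

# Hypothetical core run of 1.5 m drilled length with eight recovered pieces (mm)
pieces = [250, 80, 120, 40, 310, 95, 180, 60]
print(f"RQD = {rqd(pieces, 1500):.1f} %")

# One widely quoted RQD-only fit for the rock-mass / intact-rock deformation
# modulus ratio, of the kind reviewed in the paper; constants are indicative.
def modulus_ratio(rqd_percent):
    return 10 ** (0.0186 * rqd_percent - 1.91)

print(f"Em/Er is roughly {modulus_ratio(rqd(pieces, 1500)):.2f}")
```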
266

Particle systems and SPDEs with application to credit modelling

Jin, Lei January 2010 (has links)
No description available.
267

Towards the development of 'priest researchers' in the Church of England

Barley, Lynda January 2014 (has links)
The Church of England is living through a time of significant change in attitudes towards local church ministry, congregational participation and pastoral practices. As it seeks to respond with integrity to changes in contemporary society, the Church's dialogue with empirical social research is beginning to develop more fully. This thesis focuses on a pioneering national project to explore the effectiveness of pastoral ministry in contemporary church weddings. The social science research methods used in this project revealed insights into the ministry of contemporary church weddings, with the intention of shaping responsive parochial wedding policies. This thesis considers the potential for further local enquiry by individual marrying clergy to understand the ordinary theology (proposed by Astley) of their communities, using methods of ordinary research alongside a shared reflective practice. It highlights the socio-theological interface within reflective empirical theology by pastoral practitioners in the Church. A model of participatory action research incorporating online clergy forums and change agent groups is explored to stimulate parochial and institutional change among clergy in partnership with each other. The role of priest researchers is proposed, and identified in other pastoral contexts, to examine factors that motivate clergy to participate in the development of pastorally responsive national policies. A methodology of personal diaries, focus groups and one-to-one interviews is used to explore the responses of clergy to participating in reflective praxis. The findings point to key factors in developing pastoral practice and policies, involving the place of ministerial development and attitudes towards collaborative working. A typology of pastoral ministry is developed towards identifying priest researchers in the Church. The research affirms the contribution of pastoral practitioners towards the development of pastorally responsive national policies, but the nature of parochial deployment and of clergy relationships with each other and with Church institutions frequently precludes much of this contribution.
268

Constructing Empirical Likelihood Confidence Intervals for Medical Cost Data with Censored Observations

Jeyarajah, Jenny Vennukkah 15 December 2016 (has links)
Medical cost analysis is an important part of treatment evaluation. Since resources in society are limited, it is important that new treatments are developed with proper cost considerations. The mean has been widely accepted as the summary measure in medical cost analysis. However, it is well known that cost data are highly skewed and the mean can be heavily influenced by outliers. Therefore, in many situations the mean cost alone cannot offer complete information about medical costs. The quantiles (e.g., the first quartile, median and third quartile) of medical costs can better represent the typical costs paid by a group of individuals, and can provide additional information beyond the mean cost. For a specified patient population, cost estimates are generally determined from the beginning of treatment until death or the end of the study period. A number of statistical methods have been proposed to estimate medical cost. Since medical cost data are skewed to the right, normal-approximation-based confidence intervals can have much lower coverage probability than the desired nominal level when the cost data are moderately or severely skewed. Additionally, the variance estimators of the cost estimates are analytically complicated. To address these issues, the first part of the dissertation proposes two empirical likelihood-based confidence intervals for the mean medical cost: one is an empirical likelihood interval (ELI) based on the influence function, and the other is a jackknife empirical likelihood (JEL) based interval. We prove that, under very general conditions, −2 log(empirical likelihood ratio) has an asymptotic standard chi-squared distribution with one degree of freedom for the mean medical cost. We also show that the log-jackknife empirical likelihood ratio statistic follows a standard chi-squared distribution with one degree of freedom for the mean medical cost. In the second part of the dissertation, we propose an influence-function-based empirical likelihood method to construct a confidence region for the vector of regression parameters in mean cost regression models with censored data. The proposed confidence region can be used to obtain a confidence interval for the expected total cost of a patient with given covariates. The new method has a sound asymptotic property (a Wilks-type theorem). In the third part of the dissertation, we propose an empirical likelihood method based on the influence function to construct confidence intervals for quantile medical costs with censored data. We prove that, under very general conditions, −2 log(empirical likelihood ratio) has an asymptotic standard chi-squared distribution with one degree of freedom for the quantile medical cost. Simulation studies are conducted to compare the coverage probabilities and interval lengths of the proposed confidence intervals with those of existing confidence intervals. The proposed methods are observed to have better finite-sample performance than existing methods. The new methods are also illustrated through a real example.
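To make the chi-squared(1) calibration concrete, the sketch below computes a standard empirical likelihood confidence interval for a mean from uncensored, right-skewed data: it profiles the Lagrange multiplier for each candidate mean and inverts the −2 log(likelihood ratio) statistic against the chi-squared critical value. This is only the classical uncensored case on simulated data; the censoring adjustments and the influence-function and jackknife constructions developed in the dissertation are not reproduced here.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_ratio_stat(x, mu):
    """-2 log empirical likelihood ratio for H0: mean(X) = mu (no censoring)."""
    d = x - mu
    if d.max() <= 0 or d.min() >= 0:              # mu outside the data's convex hull
        return np.inf
    lo = -1.0 / d.max() + 1e-10                   # valid range for the Lagrange multiplier
    hi = -1.0 / d.min() - 1e-10
    lam = brentq(lambda l: np.sum(d / (1.0 + l * d)), lo, hi)
    return 2.0 * np.sum(np.log1p(lam * d))

def el_ci_mean(x, level=0.95):
    """Invert the chi-squared(1) calibration to get a confidence interval for the mean."""
    crit = chi2.ppf(level, df=1)
    xbar = x.mean()
    g = lambda mu: el_ratio_stat(x, mu) - crit
    lower = brentq(g, x.min() + 1e-6, xbar)
    upper = brentq(g, xbar, x.max() - 1e-6)
    return lower, upper

rng = np.random.default_rng(1)
costs = rng.lognormal(mean=8.0, sigma=1.0, size=200)   # right-skewed stand-in for cost data
print("95% empirical likelihood interval for the mean cost:", el_ci_mean(costs))
```

Because the interval is obtained by inverting the likelihood ratio rather than symmetrizing around a point estimate, it can be asymmetric, which is one reason such methods behave better than normal approximations on skewed cost data.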
269

Using Empirical Mode Decomposition to Study Periodicity and Trends in Extreme Precipitation

Pfister, Noah 01 January 2015 (has links)
Classically, we look at annual maximum precipitation series from the perspective of extreme value statistics, which provides a useful statistical distribution but does not allow much flexibility in the context of climate change. Such distributions are usually assumed to be static, or else require some assumed information about possible trends within the data. For this study, we treat the maximum rainfall series as sums of underlying signals, upon which we perform a decomposition technique, Empirical Mode Decomposition. This not only allows the study of non-linear trends in the data, but could also give us some idea of the periodic forces that affect our series. To this end, data were taken from stations in the New England area, from different climatological regions, with the hope of seeing temporal and spatial effects of climate change. Although results vary among the chosen stations, they show some weak periodic signals, and in many cases a trend-like residual function is identified.
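The sketch below is a deliberately bare-bones, self-contained illustration of the sifting idea behind Empirical Mode Decomposition: each intrinsic mode function is obtained by repeatedly subtracting the mean of cubic-spline envelopes through local maxima and minima, and the slowly varying remainder serves as the trend. The fixed stopping rules, the neglect of end effects, and the synthetic series are simplifying assumptions for illustration; the thesis's actual algorithm and station data are not reproduced here.

```python
import numpy as np
from scipy.signal import argrelextrema
from scipy.interpolate import CubicSpline

def sift_once(x, t):
    """One sifting pass: subtract the mean of the upper and lower envelopes."""
    maxima = argrelextrema(x, np.greater)[0]
    minima = argrelextrema(x, np.less)[0]
    if len(maxima) < 4 or len(minima) < 4:        # too few extrema to build envelopes
        return None
    upper = CubicSpline(t[maxima], x[maxima])(t)  # end effects are ignored in this sketch
    lower = CubicSpline(t[minima], x[minima])(t)
    return x - 0.5 * (upper + lower)

def emd(x, t, max_imfs=4, n_sifts=10):
    """Return a list of IMFs plus the slowly varying residual (trend)."""
    imfs, residual = [], x.copy()
    for _ in range(max_imfs):
        h = residual.copy()
        for _ in range(n_sifts):                  # fixed number of sifts for simplicity
            h_new = sift_once(h, t)
            if h_new is None:
                return imfs, residual
            h = h_new
        imfs.append(h)
        residual = residual - h
    return imfs, residual

# Synthetic series standing in for an annual maximum precipitation record
t = np.arange(100, dtype=float)
rng = np.random.default_rng(2)
series = 2.0 * np.sin(2 * np.pi * t / 11) + 0.03 * t + rng.normal(0.0, 0.5, 100)
imfs, trend = emd(series, t)
print(f"{len(imfs)} IMFs extracted; residual trend ranges from {trend.min():.2f} to {trend.max():.2f}")
```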
270

Molekulové simulace nukleace ledu / Molecular simulations of ice nucleation

Pluhařová, Eva January 2010 (has links)
Title: Molecular simulations of ice nucleation Author: Eva Pluhařová Department: Department of Physical and Macromolecular Chemistry Faculty of Science UK Advisor: doc. Mgr. Pavel Jungwirth, DSc., IOCB AS CR, v.v.i. Advisor's e-mail address: pavel.jungwirth@uochb.cas.cz Abstract: By means of molecular dynamics simulations we have systematically investigated homogeneous ice nucleation in neat and surface contaminated water. As models of the adsorbates we have assumed pentanol and pentanoic acid. In neat water nucleation preferentially starts in the subsurface region, which accommodates better than the bulk the volume increase associated with freezing. Homogeneous ice nucleation is affected more by alcohol than by acid. Water slabs covered by a disordered layer of pentanol exhibit negligible preference for subsurface nucleation and longer nucleation times in comparison with neat water, while nucleation times are almost unaffected by the presence of pentanoic acid and the subsurface preference is only slightly decreased. We tried to rationalize the differences between the effects of different compounds by their ability to orient water molecules and to change their mobility. The fact that adsorbates differ in the influence on homogeneous ice nucleation has important implications for the microphysics of...
