121

Open stope hangingwall design based on general and detailed data collection in unfavourable hangingwall conditions

Capes, Geoffrey William 16 April 2009 (has links)
This thesis presents new methods to improve open stope hangingwall (HW) design, based on knowledge gained from site visits, observations, and data collection at underground mines in Canada, Australia, and Kazakhstan. The data for analysis were collected during two months of research at the Hudson Bay Mining and Smelting Ltd. Callinan Mine in Flin Flon, Manitoba; several trips to the Cameco Rabbit Lake mine in northern Saskatchewan; and three years of research and employment at the Xstrata Zinc George Fisher mine near Mount Isa, Queensland, Australia. Other sites visited where substantial stope stability knowledge was gained include the Inco Thompson mines in northern Manitoba; the BHP Cannington mine, Xstrata Zinc Lead Mine, and Xstrata Copper Enterprise Mine in Queensland, Australia; and the Kazzinc Maleevskiy Mine in north-eastern Kazakhstan. An improved understanding of the stability and design of open stope HWs was developed based on: 1) three years of data collection from various rock masses and mining geometries, used to develop new sets of design lines for an existing HW stability assessment method; 2) consideration of various scales of domains to examine HW rock mass behaviour, and development of a new HW stability assessment method; 3) investigation of the HW failure mechanism using analytical and numerical methods; 4) an examination of the effects of stress, undercutting, faulting, and time on stope HW stability through observations and case histories; and 5) innovative stope design techniques to manage predicted stope HW instability. An observational approach was used to formulate the new stope design methodology. To improve mine performance by reducing and/or controlling dilution of the ore by non-economic HW rock, the individual stope design methodology included creating vertical HWs, leaving ore skins or chocks where appropriate, and rock mass management. The work contributed to a reduction in annual dilution from 14.4% (2003) to 6.3% (2005), an increase in zinc grade from 7.4% to 8.7%, and an increase in production from 2.1 to 2.6 Mt (Capes et al., 2006).
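The abstract refers to an existing HW stability assessment method with empirical design lines without naming it. The stability graph method (a stability number N' plotted against hydraulic radius) is the standard tool of this kind, so the following minimal Python sketch is offered under that assumption; the factors and design line are purely illustrative, not the thesis's calibrated values.

    # Hypothetical sketch of a stability-graph style HW check.
    # N' and hydraulic radius HR are standard quantities in open stope
    # design; the specific design line below is illustrative only.

    def hydraulic_radius(strike_length_m: float, height_m: float) -> float:
        """HR = face area / face perimeter for a rectangular hangingwall."""
        area = strike_length_m * height_m
        perimeter = 2.0 * (strike_length_m + height_m)
        return area / perimeter

    def stability_number(q_prime: float, a: float, b: float, c: float) -> float:
        """N' = Q' * A * B * C (stress, joint orientation, gravity factors)."""
        return q_prime * a * b * c

    hr = hydraulic_radius(strike_length_m=25.0, height_m=40.0)
    n_prime = stability_number(q_prime=4.0, a=0.8, b=0.3, c=8.0)

    # Illustrative threshold only: a design line maps HR to the minimum
    # N' for a "stable" prediction; real lines come from case-history data.
    min_stable_n = 0.4 * hr ** 2
    print(f"HR = {hr:.1f} m, N' = {n_prime:.1f}, "
          f"predicted {'stable' if n_prime >= min_stable_n else 'unstable'}")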
122

Empirical Essays on the Efficiency of Heterogeneous Good Auction

Martin, Thomas A., IV 14 January 2010 (has links)
A recent pursuit of the auction design literature has been the development of an auction mechanism that performs well in a multi-good setting when the goods are not substitutes. This work began in earnest with the Federal Communications Commission spectrum license auctions in the early nineties and continues to this day. In a setting in which goods are not substitutes, the value of one good depends non-negatively on the quantities of the other goods won; for example, a bidder hedging a transaction across a network may value a set of rights together more than the sum of their stand-alone values. This type of interdependent value structure has proven difficult to account for in auction design. However, the need for mechanisms that account for such a value structure hinges on the magnitude of the interdependence, whose computation is an empirical exercise. I identify a setting in which to perform this computation. I develop an empirical methodology that allows me to recover bidders' value functions in a multi-good auction setting. This methodology allows me to assess the magnitude of any interdependence in the goods' value structure. Since the auction setting that I analyze is a variation of the standard uniform price auction, adapted for a multi-good setting, I am able to measure the benefit of having a direct revelation mechanism. This counterfactual study is performed by maximizing the value of the auction using the recovered bidder value functions. I find evidence that there is an interdependent value structure in this setting. The counterfactual auction shows that the standard uniform price auction, adapted to a multi-good setting, performs poorly in the presence of such a value structure. The setting for this analysis is an auction for financial transmission rights held in Texas in 2002. The auction involved twenty-two firms and collected almost $70 million in revenue. This research is the first to empirically assess efficiency in this type of auction setting.
123

The determinants of participation in voluntary association: An empirical analysis of social service organization.

Chiu, Po-ching 20 June 2000 (has links)
The voluntary association is a mediator between society and the individual, and it serves as a prime engine of social change. It is therefore important to identify who participates in voluntary associations. Previous research has focused primarily on the demographic attributes of participants. In addition to demographic factors, this study investigates the effects of job characteristics, work-related attitudes, perceptions of social inequality, and resource constraints on the probability of participation. Based on the 1996 "Taiwan Social Change Survey," this study uses logistic regression to model the determinants of participation in voluntary associations for more than 2,000 respondents. The results show that job satisfaction is negatively associated with participation and that job stability has a positive effect on participation. With respect to the perceived causes of social inequality, situational attribution is positively correlated with the probability of participation. Finally, respondents with lower family responsibilities and higher income are more likely to participate in voluntary associations.
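As a minimal sketch of the kind of model described above, fit on synthetic data with hypothetical variable names (not the survey's actual items), a logistic regression of participation on the named covariates might look like:

    # Toy logistic regression of participation; signs of the synthetic
    # effects follow the abstract's findings: satisfaction (-),
    # stability (+), income (+), family responsibilities (-).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 2000
    job_satisfaction = rng.normal(size=n)
    job_stability = rng.normal(size=n)
    income = rng.normal(size=n)
    family_burden = rng.normal(size=n)

    logit_p = (-0.5 * job_satisfaction + 0.4 * job_stability
               + 0.3 * income - 0.3 * family_burden)
    participate = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    X = sm.add_constant(np.column_stack(
        [job_satisfaction, job_stability, income, family_burden]))
    model = sm.Logit(participate, X).fit(disp=False)
    print(model.summary(xname=["const", "satisfaction", "stability",
                               "income", "family_burden"]))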
124

Bayesian Unit Root Test – Application for Exchange Rate Market

Liao, Siang-kai 24 June 2008 (has links)
Professional analysts should be able to bring more than the raw data to bear on their interpretations: empirical experience and prior knowledge are genuinely useful when making statistical inferences in econometrics. Classical statistical approaches draw conclusions from historical data alone. The advantage of this is objectivity, but it has a serious drawback: it ignores logically relevant prior information. This paper therefore applies Bayesian methods, which by nature accommodate subjective prior beliefs, together with empirical knowledge, to the unit root test in the exchange rate market. Furthermore, the distribution of the data directly affects classical inference procedures such as the Dickey-Fuller and Phillips-Perron tests. To take this into account, the paper does not assume normally distributed disturbances: heavy-tailed distributions are common in time series data on stocks and other investment targets. Hence, a more general model is used for the Bayesian unit root test. Specifically, the paper adopts the model of Schotman and van Dijk (1991), assumes disturbances that follow an independent Student-t distribution, revises the unit root test accordingly, and applies it to the exchange rate market. This is the motivation of the paper.
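A toy sketch of the flavor of this analysis, assuming a flat prior over a grid for the AR(1) coefficient and Student-t disturbances; this is a simplified illustration, not the full Schotman-van Dijk posterior-odds analysis the thesis uses:

    # Grid posterior for rho in y_t = rho * y_{t-1} + eps_t with
    # t-distributed errors, on a simulated near-unit-root series.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    T = 300
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = 0.98 * y[t - 1] + stats.t.rvs(df=4, random_state=rng)

    rho_grid = np.linspace(0.90, 1.02, 241)
    loglik = np.array([
        stats.t.logpdf(y[1:] - rho * y[:-1], df=4).sum() for rho in rho_grid
    ])
    post = np.exp(loglik - loglik.max())
    post /= post.sum()               # normalized over the grid (flat prior)

    print(f"posterior mean rho: {np.sum(rho_grid * post):.4f}")
    print(f"P(rho >= 1 | data): {post[rho_grid >= 1.0].sum():.4f}")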
125

Assessing GCM performance for use in greenhouse gas forced climate change predictions using multivariate empirical orthogonal functions

Picton, Jeffrey 26 November 2012 (has links)
Due to factors such as spatial discretization and the parameterization of certain processes, the presence of bias in models of the Earth's atmosphere is unavoidable. Whether we are selecting a model to explain past phenomena, forecast weather patterns, or make inferences about the future, the target of any selection process is to minimize the discrepancies between model output and observations. Some discrepancies, however, have a greater effect on the scatter of model predictions than others. We exemplify this in the case of CO2-forced warming using multivariate empirical orthogonal functions (EOFs) created from an ensemble of plausible parameter configurations of CAM3.1. When this ensemble is subjected to a doubling of atmospheric CO2, some EOFs exhibit significantly higher correlation than others with the resulting increase in mean global surface temperature. There are therefore discernible bias patterns that affect a model's predictive scatter. By targeting these patterns in the model evaluation process, it is plausible to use this information to constrain the resulting range of predictions. We take a first step towards showing this by creating a metric to evaluate model skill based on these EOFs and their correlation with a model's sensitivity to CO2 forcing. Using model output, for which we know the resulting temperature increase, as a surrogate for observations in this metric, the resulting distribution of skill scores indeed reflects agreement in sensitivity to CO2 forcing.
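As a hedged illustration of the EOF-correlation idea on synthetic stand-in data (not actual CAM3.1 output): decompose an ensemble's bias fields with an SVD, then correlate each EOF's expansion coefficients with the members' CO2 sensitivity.

    # EOFs of an ensemble bias matrix via SVD, correlated with a
    # synthetic per-member climate sensitivity.
    import numpy as np

    rng = np.random.default_rng(2)
    n_members, n_grid = 50, 500
    bias = rng.normal(size=(n_members, n_grid))      # member-by-gridpoint bias
    sensitivity = rng.normal(2.5, 0.5, n_members)    # delta-T under 2xCO2

    anomalies = bias - bias.mean(axis=0)             # remove ensemble mean
    # Rows of vt are EOF spatial patterns; u*s are expansion coefficients.
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    pcs = u * s

    for k in range(3):
        r = np.corrcoef(pcs[:, k], sensitivity)[0, 1]
        print(f"EOF {k + 1}: correlation with CO2 sensitivity r = {r:+.2f}")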
126

Fatigue damage prediction in deepwater marine risers due to vortex-induced vibration

Shi, Chen 10 January 2013 (has links)
Slender marine risers used in deepwater applications often experience vortex-induced vibration (VIV). Fatigue damage associated with VIV is of great concern to offshore engineers; however, it has proven difficult to predict this fatigue damage using existing semi-empirical tools. Similarly, approaches based on theoretical and computational fluid dynamics (CFD) generally rely on simplified assumptions on the fluid flow fields and response characteristics. To gain an understanding of VIV and associated fatigue damage, full-scale field monitoring campaigns as well as reduced-scale laboratory experiments are often carried out, wherein the riser response in the form of strains and/or accelerations is recorded using an array of a limited number of sensors distributed over the length of the riser. Simultaneously, current velocities at a proximate location are also recorded. Such measurements generally reveal complex characteristics of the dynamic response of a riser undergoing VIV, including the presence of multiple vibration harmonics, non-stationary behavior, and the existence of sustained or intermittent traveling wave patterns. Such complex features, often not accounted for in some semi-empirical and theoretical approaches, are critical to take into consideration for accurate fatigue damage estimation. In this study, several empirical methods are employed to first reconstruct the response of an instrumented riser and, then, estimate fatigue damage rates over the entire span of the riser based on a limited number of discrete measurements. The methods presented employ the measured data in different ways. One method, referred to as "weighted waveform analysis," relies on expressing the riser response as a summation of several weighted waveforms or riser modes; the mode shapes are "assumed" and time-varying weights for each mode are estimated directly from the measurements. The riser response over the entire span is reconstructed based on these assumed mode shapes and estimated modal weights. Other methods presented extract discrete mode shapes from the data directly. With the help of interpolation techniques, continuous mode shapes are formed, and the riser response is again reconstructed. Fatigue damage rates estimated based on the reconstructed strains obtained using the various empirical methods are cross-validated by comparing predictions against direct measurements available at the same locations (but not used in the analyses). Results show that the empirical methods developed here may be employed to accurately estimate fatigue damage rates associated with individual recorded segments of measurements. Finally, a procedure for prediction of long-term fatigue damage rates of an instrumented marine riser is presented that relies on combining (multiplying) the fatigue damage rates associated with short recorded segments for specific current profile types, with the relative likelihood of different incident current profiles, and integration over all current profiles. It should be noted that the empirical approaches to fatigue damage estimation presented in this study are based only on measured data; also, they explicitly account for different riser response characteristics and for site-specific current profiles developed from metocean studies. Importantly, too, such estimation procedures can easily accommodate additional data that become available in any ongoing field monitoring campaign to improve and update long-term fatigue damage prediction.
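A minimal sketch of the weighted waveform idea, assuming pinned-pinned sine mode shapes and a least-squares fit of the modal weights (details the abstract leaves open):

    # Express the riser response as a sum of assumed mode shapes,
    # estimate the modal weights from a few sensors, then reconstruct
    # the response over the whole span. Sensor layout is illustrative.
    import numpy as np

    L, n_modes = 1000.0, 5                       # riser length (m), modes
    z_full = np.linspace(0.0, L, 200)            # full-span grid
    z_sens = np.array([100., 300., 450., 600., 780., 900.])  # sensor depths

    def modes(z):
        """Assumed pinned-pinned shapes: sin(k*pi*z/L), k = 1..n_modes."""
        return np.sin(np.outer(z, np.arange(1, n_modes + 1)) * np.pi / L)

    true_w = np.array([0.2, 1.0, 0.1, 0.4, 0.05])     # toy modal weights
    measured = modes(z_sens) @ true_w                  # sensor readings

    w_hat, *_ = np.linalg.lstsq(modes(z_sens), measured, rcond=None)
    response = modes(z_full) @ w_hat                   # full-span estimate
    print("estimated weights:", np.round(w_hat, 3))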
127

A generalized flow rate model for primary production and an analysis of gravity drainage through numerical simulation

Vitter, Cameron Artigues 07 April 2015 (has links)
The age of “easy” oil has steadily declined through the years as many conventional land-based fields have been depleted to residual levels. Novel technologies, however, have reawakened old fields, allowing incremental oil to be added to their recoverable oil in place (ROIP). Underground Gravity Drainage (UGD), an example of one of these technologies, combines improved horizontal and deviated drilling technologies with the longstanding concept of gravity drainage. In this work, a better understanding of gravity drainage has been gained through (1) development of a numerical, three-dimensional, three-phase reservoir simulator (UT-EMPRES), (2) development of a universal, semi-empirical model of production rates through primary depletion, and (3) analysis of the important aspects of gravity drainage through simulation. UT-EMPRES is a new three-phase, finite-difference reservoir simulator, which utilizes a simple, easy-to-use Microsoft Excel interface to access MATLAB-programmed simulation code. This simulator produces nearly identical results to other well-established simulators, including UTCHEM and CMG. UT-EMPRES has some unique features, allows for easy post-processing in MATLAB, and has been utilized extensively in the other two areas of this thesis. The generalized flow rate model (GFRM) is a semi-empirical equation that is used to forecast the dynamic primary production rate of a reservoir with an arbitrary number of wells all operating at the same constant pressure condition. The model is an extension of the classic tank model, which is inherently a single flowing phase development. With the ability to make a priori predictions of production figures, users can screen various prospect assets on the basis of economic potential through optimization routines on the GFRM. Gravity drainage and its approximation through numerical simulation are analyzed. A sensitivity study was conducted on three-phase gravity drainage, leading to the conclusion that small changes in vertical permeability and portions of the relative permeability-saturation relationships can greatly affect production rates. Finally, two-phase (oil and air) and regions of three-phase (water, oil, air) flow simulations were found to exhibit exponential decline in phase production rates, which may enable the GFRM to be applicable to UGD-type processes.
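For reference, the classic single-phase tank model that the GFRM extends yields exponential rate decline under constant-pressure depletion, q(t) = q_i * exp(-D * t). A minimal sketch with illustrative numbers (the GFRM's exact multi-well, multi-phase form is not reproduced here):

    # Exponential decline forecast from the classic tank model.
    import numpy as np

    q_i = 2000.0          # initial rate, STB/day (illustrative)
    D = 0.15              # decline constant, 1/year (illustrative)
    t = np.linspace(0.0, 10.0, 11)                      # years

    q = q_i * np.exp(-D * t)                            # rate forecast
    cum = (q_i / D) * (1.0 - np.exp(-D * t)) * 365.25   # cumulative, STB

    for ti, qi_, ci in zip(t, q, cum):
        print(f"t = {ti:4.1f} yr  q = {qi_:7.1f} STB/d  Np = {ci:12.0f} STB")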
128

Achieving excellence in services : an empirical study in the UAE banking sector

Al-Marri, Khalid Sager January 2005 (has links)
Banking is perhaps the largest service industry catering to the needs of the various segments of the population, reflecting the diverse diasporas of society. Moreover, perceived service quality tends to play a significant role in high-involvement (high interaction between customers and service providers) industries like banking, and banks often have long-term business relationships with customers. In addition, the banking sector is large enough to capture and represent almost all the critical features of customer-perceived service quality and the critical dimensions of excellence that management may encounter in effectively managing a service organisation. However, there is a considerable lack of literature on service industry management, especially in the banking industry of developing economies. An analysis of banks in the UAE from a service-quality perspective is therefore timely, and such an investigation is vital for bankers seeking to enhance their business performance. The main objective of this research is to develop a theoretical framework for understanding and exploring the critical success factors (CSFs) for banks that succeed in the field of TQM, and to identify market-oriented activities that are affected by the use of this approach. The research adopts an interdisciplinary approach that draws on the TQM, service quality, IT, and information systems literature. It takes a holistic view of TQM in the banking sector and considers the different stages of implementation and the implications of TQM. The research design combines quantitative and qualitative methodologies to incorporate: (1) TQM development; (2) the identification of key TQM success factors commonly cited in the literature and endorsed by practitioners and experts as important to effective TQM implementation; (3) an in-depth case study approach to understand how the TQM processes and critical success factors identified are addressed and implemented; and (4) the possible impact of TQM practices on efficiency. Furthermore, the research framework that emerged from the literature search is tested and validated by rigorous quantitative analysis using SPSS, applying Factor Analysis, Regression Analysis, One-Sample Tests, and Ranking Analysis to test a series of relationships and research constructs and provide solid support for the resulting relationships. The study has identified twelve CSFs for the successful implementation of TQM: (1) Top Management Support, (2) Strategy, (3) Continuous Improvement, (4) Benchmarking, (5) Customer Focus, (6) Quality Department, (7) Human Resource Management, (8) Quality Technology, (9) Service Design, (10) Employees, (11) Servicescapes, and (12) Quality Systems. Furthermore, it has been found that organisational experience of TQM implementation in the UAE service sector is far from mature. The survey results and case studies presented in this study provide ample evidence that TQM is still a new and widely unknown management concept, and in many cases there is some reluctance to introduce it. The study highlights the CSFs for successful TQM implementation because it is vital for organisations to capture the minds of everybody, starting at the top and permeating throughout the whole organisation and beyond. The philosophy maintains that an organisation's primary objective is to enhance its ability to meet customer requirements by improving the quality of its services.
People are the most important management resource and the ultimate goal of business. TQM generally means a quest for excellence: creating the right attitudes and controls to prevent possible errors and to optimise customer satisfaction through increased efficiency and effectiveness. Further, this study identifies TQM as an organisation-wide activity that has to reach every employee; TQM is therefore an approach for continuously improving the quality of services delivered, through participation at all levels and functions of the organisation. From this study, it is evident that an effective transformation to TQM is linked to the extent to which firms implement certain CSFs. This study contributes to the emerging literature on TQM in the banking sector in a number of specific ways: (1) it provides new theoretical grounds for studying TQM in the banking sector in the context of CSFs that affect competition in a dynamic marketplace; (2) it computes and analyses total quality management indices with respect to the 16 factors developed from the literature for the banking industry as a whole; (3) it ascertains the level of TQM implementation in the UAE banking scene; (4) it offers key insights on the criticality of the different TQM dimensions with respect to the banking sector in the UAE; and (5) it provides a foundation and proposals for future research and investigation.
129

Empirical Likelihood Confidence Intervals for the Ratio and Difference of Two Hazard Functions

Zhao, Meng 21 July 2008 (has links)
In biomedical research and lifetime data analysis, the comparison of two hazard functions usually plays an important role in practice. In this thesis, we consider the standard independent two-sample framework under right censoring. We construct efficient and useful confidence intervals for the ratio and difference of two hazard functions using smoothed empirical likelihood methods. The empirical log-likelihood ratio is derived and its asymptotic distribution is a chi-squared distribution. Furthermore, the proposed method can be applied to medical diagnosis research. Simulation studies show that the proposed EL confidence intervals have better performance in terms of coverage accuracy and average length than the traditional normal approximation method. Finally, our methods are illustrated with real clinical trial data. It is concluded that the empirical likelihood methods provide better inferential outcomes.
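In outline, a standard empirical-likelihood calibration of this kind inverts the log-likelihood ratio via its chi-squared limit; the thesis's smoothed construction differs in detail, so the following is a sketch of the general idea rather than the exact statistic:

    -2 \log R(\theta_0) \;\xrightarrow{d}\; \chi^2_1,
    \qquad
    \mathcal{I}_{1-\alpha} \;=\; \{\, \theta : -2 \log R(\theta) \le \chi^2_{1,\,1-\alpha} \,\},

where R(\theta) is the profile empirical likelihood ratio for the hazard ratio (or difference) \theta at a fixed time point, and \mathcal{I}_{1-\alpha} collects all values of \theta that the data do not reject at level \alpha.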
130

Essays on empirical asset pricing

Wei, Chishen 24 October 2011 (has links)
This dissertation contains two essays that use empirical techniques to shed light on open questions in the asset pricing literature. In the first essay, I investigate whether foreign institutional investors affect stock liquidity in domestic equity markets. The evidence indicates that stocks with higher foreign institutional ownership subsequently experience higher liquidity. However, it is difficult to interpret the causal relation of this finding because institutional investors self-select into more liquid stocks. To solve this problem, I exploit a provision in the 2003 US dividend tax cut which extended tax relief to dividends from US tax-treaty countries but not to dividends from non-treaty countries. This natural experiment suggests a causal link between foreign institutional investors and liquidity. Consistent with the predictions of theoretical models, I find that liquidity improves because foreign institutional investors increase information competition. In the second essay, I introduce a new measure of difference of opinion using mutual fund portfolio weights to test prominent competing theories of the effect of heterogeneous beliefs on asset prices. The over-valuation theory (Miller, 1977) proposes that in the presence of short-sale constraints stock prices reflect only the views of optimistic investors, which implies lower subsequent returns. Alternatively, neo-classical asset pricing models (Williams, 1977; Merton, 1987) suggest that differences of opinion indicate high levels of information uncertainty or risk, which implies higher expected returns. My initial result finds no support for the over-valuation theory. Instead, the measure used in this study finds that high difference-of-opinion stocks weakly outperform low difference-of-opinion stocks by 2.42% annually, which is more consistent with the information uncertainty explanation.
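One plausible way to operationalize a difference-of-opinion measure from fund holdings (not necessarily the essay's exact measure, which the abstract does not spell out) is the cross-fund dispersion of portfolio weights in a stock:

    # Illustrative disagreement proxy: std of portfolio weights across
    # funds, scaled by the mean weight. Data are synthetic stand-ins.
    import numpy as np

    rng = np.random.default_rng(3)
    n_funds, n_stocks = 200, 50
    weights = rng.dirichlet(np.ones(n_stocks), size=n_funds)  # rows sum to 1

    mean_w = weights.mean(axis=0)
    disagreement = weights.std(axis=0) / mean_w

    top = np.argsort(disagreement)[-5:][::-1]
    print("highest difference-of-opinion stocks (by index):", top)
    print("their disagreement scores:", np.round(disagreement[top], 2))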
