161

Empirical Evaluation of a Universal Requirements Engineering Process Maturity Model

Nguyen, Mai January 2010 (has links)
Context. Software products are usually developed for either a specific customer (bespoke) or a broader market (market-driven). Due to their characteristics, bespoke and market-driven development face different problems, especially in the requirements engineering (RE) areas. Moreover, many of these problems are caused by an inadequate requirements engineering process. Hence, in order to improve the process and subsequently the software quality, the maturity of the RE process must be evaluated. Although there are many process assessment initiatives in bespoke development, there is a need for models covering both approaches. Uni-REPM, which can assess RE process maturity in all environments, is such a model. Objectives. This study presents an academic and industrial evaluation of the Uni-REPM model before transferring it to industry. Methods. The first validation was conducted in the form of interviews with seven academic experts, in which the model was scrutinized for its correctness and completeness. Subsequently, the model and the assessment method were applied and validated in four industrial organizations located in Denmark, Spain and Singapore. Results. Based on the feedback obtained in the validation, refinements were made to the model to improve its quality. In addition, the analysis of each industrial project's evaluation results is useful in identifying weak areas in the process and suggesting possible improvement practices. Conclusions. The study shows that Uni-REPM is a quick, simple and cost-effective solution for assessing the maturity level of the requirements engineering process of projects. Moreover, the checklist-based assessment method is highly usable and applicable in various international development environments.
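Editor's note: the abstract does not reproduce the Uni-REPM checklist itself. The sketch below is a purely hypothetical illustration of how a checklist-based maturity assessment can be aggregated per process area; the action names, levels, and completion criteria are invented for illustration and are not taken from Uni-REPM.

```python
from collections import defaultdict

# Hypothetical checklist entries: (process_area, maturity_level, action, completed).
# Action names and levels are illustrative only; they are NOT taken from Uni-REPM.
checklist = [
    ("Requirements Elicitation", 1, "Identify stakeholders", True),
    ("Requirements Elicitation", 1, "Record requirement sources", True),
    ("Requirements Elicitation", 2, "Use domain scenarios", False),
    ("Documentation", 1, "Maintain a requirements document", True),
    ("Documentation", 2, "Define a requirement template", False),
]

def achieved_level(entries):
    """Highest maturity level for which every listed action is completed."""
    by_level = defaultdict(list)
    for _, level, _, done in entries:
        by_level[level].append(done)
    achieved = 0
    for lvl in sorted(by_level):
        if all(by_level[lvl]):
            achieved = lvl
        else:
            break
    return achieved

areas = defaultdict(list)
for entry in checklist:
    areas[entry[0]].append(entry)

for area, entries in areas.items():
    print(f"{area}: achieved maturity level {achieved_level(entries)}")
```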
162

The European Employment Price Index: Implementation and Feasibility in Austria

Zagler, Martin, Mühlberger, Ulrike January 1998 (has links) (PDF)
The study on which this paper is based analyzed the implementation and feasibility of the European Employment Price Index (EEPI) in Austria. The European Employment Price Index is a Laspeyres measure of the change in the demand-transaction price of a standardized unit of labor. We find that it is feasible to construct the index from the available company data with an approximate lag length of five months. Most data were easily accessible within firms, with the exception of severance payments, company pensions, and hypothetical costs. Only 228 observations are required to obtain an aggregate EEPI for Austria within +/- one percentage point at the 95% confidence level, whilst some 4,800 observations are necessary for disaggregated series, which greatly increases the cost of provision.
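Editor's note: for reference, a Laspeyres index holds the base-period quantity weights fixed. The generic form is shown below; the EEPI's precise definition of standardized labor units and labor-cost components is given in the paper and is not reproduced here.

```latex
% Generic Laspeyres price index, base period 0 and current period t:
% p_{i,t} is the unit price of standardized labor category i in period t,
% q_{i,0} is the fixed base-period quantity (employment) weight.
L_t = \frac{\sum_i p_{i,t}\, q_{i,0}}{\sum_i p_{i,0}\, q_{i,0}}
```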
163

Vyhodnocování užitečnosti ladících nástrojů / Evaluation of Usefulness of Debugging Tools

Martinec, Tomáš January 2015 (has links)
Debugging is a very time-consuming activity for programmers. Although many debugging tools have been proposed, the number of tools that are actually adopted by practitioners and used during software development is smaller than one might expect. Many believe that one reason for this situation is that it is hard to estimate whether the implementation effort for a proposed debugging tool or approach is worth the gain. The first goal of this thesis is to propose a methodology for evaluating the usefulness of debugging tools. To provide an exemplary usage of the methodology, a study of the usefulness of typical debugging tools for operating system development is conducted. Secondly, the thesis also explores and documents further aspects of how programmers debug software.
164

A comparison of hypothesis testing procedures for two population proportions

Hort, Molly January 1900 (has links)
Master of Science / Department of Statistics / John E. Boyer Jr / It has been shown that the most straightforward approach to testing for a difference between two independent population proportions, called the Wald procedure, tends to declare differences too often. Because of this poor performance, various researchers have proposed simple adjustments to the Wald approach that tend to provide significance levels closer to the nominal level. Additionally, several tests that take advantage of different methodologies have been proposed. This paper extends the work of Tebbs and Roths (2008), who wrote an R program to compare confidence interval coverage for a variety of these procedures when used to estimate a contrast in two or more binomial parameters. Their program has been adapted to generate exact significance levels and power for the two-parameter hypothesis-testing situation. Several combinations of binomial parameters and sample sizes are considered. Recommendations for a choice of procedure are made for practical situations.
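Editor's note: as a hedged illustration of the baseline procedure being compared (not the authors' adapted R program, which is not reproduced here), the sketch below computes the classical Wald z-statistic for two independent proportions, plus one simple adjustment in the spirit of adding pseudo-observations (Agresti-Caffo style); the specific adjustments evaluated in the thesis may differ.

```python
from statistics import NormalDist

def wald_test(x1, n1, x2, n2, add=0.0):
    """Two-sided Wald test of H0: p1 == p2 for independent binomial samples.
    add > 0 adds that many pseudo-successes and pseudo-failures to each group
    (add=1 mimics an Agresti-Caffo-style adjustment); add=0 is the classical Wald test."""
    p1 = (x1 + add) / (n1 + 2 * add)
    p2 = (x2 + add) / (n2 + 2 * add)
    se = (p1 * (1 - p1) / (n1 + 2 * add) + p2 * (1 - p2) / (n2 + 2 * add)) ** 0.5
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Illustrative data: 18/40 successes versus 9/35 successes.
print(wald_test(18, 40, 9, 35))         # classical Wald
print(wald_test(18, 40, 9, 35, add=1))  # adjusted variant
```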
165

The Impact of Red Light Cameras on Injury Crashes within Miami-Dade County, Florida

Llau, Anthoni 27 April 2015 (has links)
Previous red light camera (RLC) studies have shown reductions in violations and in overall and right-angle collisions; however, they may also result in increases in rear-end crashes (Retting & Kyrychenko, 2002; Retting & Ferguson, 2003). Despite their apparent effectiveness, many RLC studies have produced imprecise findings due to inappropriate study designs and/or statistical techniques to control for biases (Retting & Kyrychenko, 2002); therefore, a more comprehensive approach is needed to accurately assess whether they reduce motor vehicle injury collisions. The objective of this study is to assess whether RLCs improve safety at signalized intersections within Miami-Dade County, Florida. Twenty signalized intersections with RLCs initiating enforcement on January 1st, 2011 were matched to two comparison sites located at least two miles from camera sites to minimize spillover effects. An Empirical Bayes analysis was used to account for regression to the mean. Incidences of all-injury, red light running-related injury, right-angle/turning, and rear-end collisions were examined. An index of effectiveness along with 95% CIs was calculated. During the first year of camera enforcement, RLC sites experienced a marginal decrease in right-angle/turning collisions, a significant increase in rear-end collisions, and significant decreases in all-injury and red light running-related injury collisions. An increase in right-angle/turning and rear-end collisions at the RLC sites was observed after two years despite camera enforcement. A significant reduction in red light running-related injury crashes, however, was still observed after two years. A non-significant decline in all-injury collisions was also noted. The findings of this research indicate that RLCs reduced red light running-related injury collisions at camera sites, yet the tradeoff was a large increase in rear-end collisions. Further, there was inconclusive evidence as to whether RLCs affected right-angle/turning and all-injury collisions. Statutory changes in crash reporting during the second year of camera enforcement affected the incidence of right-angle and rear-end collisions; nevertheless, a novelty effect could not be ruled out. A limitation of this study was the small number of injury crashes at each site. In conclusion, future research should consider factors such as low frequencies of severe injury/fatal collisions and changes in crash reporting requirements when conducting RLC analyses.
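Editor's note: the Empirical Bayes step is not detailed in the abstract. Below is a minimal sketch of the commonly used formulation in road-safety work (Hauer-style, with a negative-binomial safety performance function); the overdispersion parameter, SPF prediction, and crash counts are made-up numbers, not the study's data.

```python
def eb_expected_crashes(observed, spf_predicted, overdispersion_k):
    """Empirical Bayes estimate of the expected crash count at a site.
    Weights the safety-performance-function prediction against the observed
    count; with a negative-binomial SPF (Var = mu + k*mu^2) the weight is
    w = 1 / (1 + k*mu)."""
    w = 1.0 / (1.0 + overdispersion_k * spf_predicted)
    return w * spf_predicted + (1.0 - w) * observed

# Made-up illustration: 12 observed injury crashes, SPF predicts 8.5, k = 0.2.
print(round(eb_expected_crashes(observed=12, spf_predicted=8.5, overdispersion_k=0.2), 2))
```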
166

Pretreatment and hydrolysis of recovered fibre for ethanol production

Ruffell, John 11 1900 (has links)
Energy utilization is a determining factor for standards of living around the world, and the current primary source of energy is fossil fuels. A potential source of liquid fuels that could ease the strain caused by diminishing petroleum resources is bioethanol. Effective exploitation of biomass materials requires a pretreatment to disrupt the lignin and cellulose matrix. The pretreatment utilized for this research was oxygen delignification, which is a standard process stage in the production of bleached chemical pulp. The model substrate utilized as a feedstock for bioethanol was recovered fibre. An analysis of the substrate's digestibility resulted in a hexose yield of approximately 23%, which justified the need for an effective pretreatment. An experimental design was performed to optimize the delignification conditions by performing experiments over a range of temperatures, caustic loadings, and reaction times. Equations were developed that outline the dependence of various response parameters on the experimental variables. An empirical model that can predict sugar concentrations from enzymatic hydrolysis based on the Kappa number, enzyme loading, and initial fibre concentration was also developed. A study of hydrolysis feeding regimes for untreated recovered fibre (87 Kappa), pretreated recovered fibre (17 Kappa), and bleached pulp (6 Kappa) showed that the batch feeding regime offers reduced complexity and high sugar yields for lower-Kappa substrates. In order to evaluate the possibility of lignin recovery, the pH of the delignification liquor was reduced by the addition of CO₂ and H₂SO₄, resulting in up to 25% lignin yield. An experiment that looked at the effect of post-delignification fibre washing on downstream hydrolysis found that a washing efficiency of approximately 90% is required in order to achieve a hexose sugar yield of 85%. / Applied Science, Faculty of / Chemical and Biological Engineering, Department of / Graduate
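Editor's note: as a small worked illustration of how a hexose yield such as the 23% figure is typically computed, the sketch below assumes a generic cellulose-basis definition with the standard 162/180 anhydro correction for released glucose; the thesis's exact reporting basis may differ, and the numbers are invented.

```python
def hexose_yield(glucose_released_g, substrate_g, cellulose_fraction):
    """Fraction of the substrate's cellulose recovered as glucose after hydrolysis.
    Applies the 162/180 anhydro correction to convert glucose back to cellulose
    equivalents; assumes the yield is reported on a cellulose basis."""
    cellulose_equivalent = glucose_released_g * 162.0 / 180.0
    return cellulose_equivalent / (substrate_g * cellulose_fraction)

# Made-up numbers: 10 g of fibre at 70% cellulose releasing 1.8 g glucose.
print(round(hexose_yield(1.8, 10.0, 0.70), 2))  # ~0.23, i.e. roughly a 23% yield
```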
167

Analytical and empirical models of online auctions

Ødegaard, Fredrik 11 1900 (has links)
This thesis provides a discussion of some analytical and empirical models of online auctions. The objective is to provide an alternative framework for analyzing online auctions and to characterize the distribution of intermediate prices. Chapter 1 provides a mathematical formulation of the eBay auction format and background on the data used in the empirical analysis. Chapter 2 analyzes policies for optimally disposing of inventory using online auctions. It is assumed that a seller has a fixed number of items to sell using a sequence of, possibly overlapping, single-item auctions. The decision the seller must make is when to start each auction. The decision involves a trade-off between a holding cost for each period an item remains unsold and a cannibalization effect among competing auctions. Consequently, the seller must trade off the expected marginal gain from the ongoing auctions against the expected marginal cost of further deferring the release of the unreleased items. The problem is formulated as a discrete-time Markov Decision Problem. Conditions are derived to ensure that the optimal release policy is a control-limit policy in the current price of the ongoing auctions. Chapter 2 focuses on the two-item case, which has sufficient complexity to raise challenging questions. An underlying assumption in Chapter 2 is that the auction dynamics can be captured by a set of transition probabilities. Chapter 3 shows, for two fixed bidding strategies, how the transition probabilities can be derived for a given auction format and bidder arrival process. The two specific bidding strategies analyzed are when bidders bid: 1) a minimal increment, and 2) their true valuation. Chapters 4 and 5 provide empirical analyses of 4,000 eBay auctions conducted by Dell. Chapter 4 provides a statistical model in which, over discrete time periods, prices of online auctions follow a zero-inflated gamma distribution. Chapter 5 provides an analysis of the 44,000 bids placed in the auctions, based on bids following a gamma distribution. Both models presented in Chapters 4 and 5 are based on conditional probabilities given the price and elapsed time of an auction, and certain parameters of the competing auctions. Chapter 6 concludes the thesis with a discussion of the main results and possible extensions. / Business, Sauder School of / Graduate
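Editor's note: as a hedged sketch of the Chapter 4 idea — a zero-inflated gamma model for per-period auction prices — the code below writes the log-likelihood of such a mixture (a point mass at zero with probability pi, otherwise a gamma density). The parameter names, starting values, and price data are illustrative, not the fitted Dell-auction model.

```python
import math

def zig_loglik(prices, pi, shape, scale):
    """Log-likelihood of a zero-inflated gamma model:
    a price is 0 with probability pi, otherwise drawn from Gamma(shape, scale)."""
    ll = 0.0
    for p in prices:
        if p == 0:
            ll += math.log(pi)
        else:
            ll += math.log(1 - pi)
            # log of the gamma density: p^(a-1) e^(-p/s) / (Gamma(a) s^a)
            ll += ((shape - 1) * math.log(p) - p / scale
                   - math.lgamma(shape) - shape * math.log(scale))
    return ll

# Illustrative data: some periods with no bids (price 0), others with positive prices.
prices = [0, 0, 35.0, 52.5, 0, 80.0, 41.0]
print(zig_loglik(prices, pi=0.4, shape=2.0, scale=25.0))
```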
168

Hedge fund activism, corporate governance and corporate law : an empirical analysis across twenty-five countries

Katelouzou, Dionysia January 2013 (has links)
This study investigates the brand of shareholder activism that hedge funds deploy, by reference to a unique hand-collected dataset of activist hedge funds' campaigns across 25 countries over 11 years. The analysis has two core elements: one is to chart the emergence of hedge fund activism outside the United States, and the other is to account for why hedge fund activism has developed differently across the sample countries. Both issues have to date been only tangentially explored. This study is the first to seek to determine the extent to which corporate law is a determinant of the hedge fund activism phenomenon, using a fresh approach that combines theoretical and comparative legal analysis with empirical methods. While a single variable is unlikely to account for the emergence of hedge fund activism, the study describes hedge fund activism as a game of three sequential stages as a heuristic device and identifies market and legal parameters for each stage. To test the hypotheses advanced for the emergence of hedge fund activism, the study draws upon the law and finance literature. For instance, to assess the extent to which the rights bestowed on shareholders by corporate law influence hedge fund activism, the study uses the CBR shareholder rights index. The results indicate that the extent to which law matters depends on the stage that activism has reached. The study also puts hedge fund activism in its corporate governance context. Activist hedge funds' interventions have been envisioned as a mechanism for ensuring effective control of managerial discretion. Opponents of hedge fund activism contend, however, that this new breed of activists has a dark side that raises various concerns. Activist hedge funds have been considered: as exacerbating short-termism; as being mainly aggressive towards incumbents; as bearing similarities to the 1980s raiders; and as engaging in distorting equity decoupling techniques. The study presents new empirical data showing that the perceived negative side-effects of hedge fund activism are greatly exaggerated: they are myths. Cumulatively, these findings question whether hedge fund activism warrants any type of legislative response, so far as the goal of shareholder value maximization is concerned.
169

The Use of Intellectual Property Laws and Social Norms by Independent Fashion Designers in Montreal and Toronto: An Empirical Study

Doagoo, B. Courtney January 2017 (has links)
Intellectual property law theory is premised on a utilitarian justification: granting limited-time monopolies to encourage creation, innovation and its dissemination to society. However, in the last several decades, scholars have been amassing empirical evidence showing that in some industries, creativity and innovation exist outside the contours of intellectual property law and thrive despite the lack of reliance on the laws. Instead, what they uncovered is that creators in these industries follow norms that mitigate issues surrounding some kinds of copying. Intellectual property protection for fashion design in Canada is fragmented across a complex legal landscape that entails several different laws, each unique in scope, eligibility requirements and rights. This complex framework is not unique to the fashion design industry but is similar for design industries generally. Navigating these laws can be daunting, and thus inaccessible, for those segments of the design industry made up of small and medium-sized enterprises (SMEs) with limited resources to expend on legal advice and registration. Using grounded theory methodology and qualitative and quantitative methods, this research explored the use of intellectual property law and social norms by the independent fashion design segment in Montreal and Toronto, and the contours of copying and the public domain. What the empirical research reveals is that independent fashion designers do not use the law to protect their designs and instead use mechanisms that centre on the negative copying norm. Negative copying is copying that is negatively perceived. It is not necessarily legally infringing or economically harmful, although it can be both. Further, it can apply to subject matter that is not the subject matter of intellectual property law. This norm against negative copying is supported by extra-legal prevention and enforcement mechanisms that have been developed by individuals within the segment in order to mitigate the issue of copying. The empirical research also reveals that in addition to the economic incentives to create, there are a number of non-economic incentives, such as identity and reputational interests, that drive creativity and help reinforce the norm against negative copying. Using grounded theory enabled me to draw on literature from a number of disciplines in order to contextualize these findings and approach the analysis from the perspective of intellectual property theory, policy and law, social norms (sociology and psychology), as well as economic geography and design.
170

Analýza efektivity tréninkového programu v bance HSBC / Efficiency analysis of training program in HSBC bank

Šperlich, Marek January 2008 (has links)
The theme of this thesis is the research and analysis of a training program at HSBC Bank which took place during September 2007. The thesis is divided into five chapters. The first chapter is devoted to the theory of andragogy, corporate training and efficiency analysis. The second chapter is devoted to the empirical research; the problem and its solution are introduced there. The third chapter introduces HSBC Bank and the HSBC Bank training program. This training program was created for newly hired employees. The fourth chapter consists of three surveys of participants in this training. The first, a questionnaire, is focused on trainees. The second survey was completed by the company's trainers and was conducted via the internet. The last survey was realized through a series of interviews with trainees. Based on the results of the investigation, the fourth chapter concludes with a SWOT analysis of the training and an evaluation of its efficiency according to Kirkpatrick's scheme. The last chapter contains proposals to streamline the training process. These proposals will be submitted to the management of the company for possible use in further training.
