291

Using Benchmarking Methodology to Evaluate the Effectiveness of In-Home Parent-Child Interaction Therapy (PCIT)

Valente, Jessica R 06 August 2010 (has links)
Benchmarking offers community practitioners more systematic judgments about research effectiveness when control groups are not feasible, while also providing a standard for program transportability from clinical to community settings. The purpose of the current study was to outline the necessary decisions, calculations, and strengths and limitations of applying benchmarking methodologies to a behavioral parent training (BPT) program, a field in which benchmarking remains relatively underutilized. The implementation of in-home Parent-Child Interaction Therapy (PCIT), an evidence-based practice shown to be successful in reducing child maltreatment and neglect, was evaluated as a case study of the application of benchmarking. Among parents who completed in-home PCIT, a significant pre-post reduction in Eyberg Child Behavior Inventory (ECBI) scores was observed. Six randomized controlled trials (RCTs) were established as benchmarks based on similarity in parent and child demographics as well as use of the ECBI as a primary measure. Effect sizes of each benchmark study were aggregated to create a single benchmark effect size for treatment and control groups, respectively. The effect size of the current study was found to be significantly superior to the control benchmark effect size but not significantly equivalent to the treatment benchmark effect size. Although the current study demonstrates the use of benchmarking in community research, further guidelines are critically needed for researchers.
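
The aggregation step described above can be illustrated with a short sketch. This is not the author's analysis code; it is a minimal illustration assuming Cohen's d as the effect size metric and simple sample-size weighting across the benchmark studies, with invented numbers.

```python
def cohens_d(pre_mean, post_mean, pooled_sd):
    """Standardized pre-post change (e.g., in ECBI scores)."""
    return (pre_mean - post_mean) / pooled_sd

def aggregate_benchmark(effect_sizes, sample_sizes):
    """Combine per-study effect sizes into a single benchmark value,
    weighting each study by its sample size."""
    total_n = sum(sample_sizes)
    return sum(d * n for d, n in zip(effect_sizes, sample_sizes)) / total_n

# Hypothetical figures, for illustration only (not from the cited RCTs).
treatment_benchmark = aggregate_benchmark(
    effect_sizes=[0.9, 1.1, 0.8, 1.0, 1.2, 0.95],
    sample_sizes=[40, 55, 32, 60, 48, 37],
)
study_d = cohens_d(pre_mean=135.0, post_mean=110.0, pooled_sd=25.0)
print(f"study d = {study_d:.2f}, treatment benchmark d = {treatment_benchmark:.2f}")
```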
292

Benchmarking growth performance and feed efficiency of commercial rainbow trout farms in Ontario, Canada

Skipper-Horton, James Owen 16 May 2013 (has links)
Ontario cage culture operations produce the majority of farmed rainbow trout in Canada, using a diverse range of management practices that are expected to result in substantial variation in trout performance across the industry. A preliminary survey of performance data was undertaken, yielding data from five commercial sites between 2008 and 2012. Commercial performance was somewhat poorer than expected, particularly for mortality rates, thermal-unit growth coefficients, and economic feed conversion ratios (average weighted values of 12%, 0.165, and 1.36, respectively). Substantial variability in all performance parameters within and across operations suggests that continued production monitoring and benchmarking could be highly valuable for improving the economic sustainability of the sector. For future benchmarking efforts to be effective, data collection methods need to be improved and standardized. To that end, a number of recommendations are provided to the industry for the refinement and standardization of the performance recording protocols used by Ontario producers.
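
The two growth and feed metrics named in the abstract have standard definitions in aquaculture; a minimal sketch of how they are commonly computed is given below. The figures are invented for illustration and are not from the surveyed farms, and the scaling constant for the thermal-unit growth coefficient varies between conventions.

```python
def thermal_unit_growth_coefficient(initial_weight_g, final_weight_g, degree_days):
    """TGC = 100 * (final^(1/3) - initial^(1/3)) / accumulated degree-days.
    Some sources scale by 1000 instead of 100."""
    return 100.0 * (final_weight_g ** (1 / 3) - initial_weight_g ** (1 / 3)) / degree_days

def economic_fcr(total_feed_fed_kg, biomass_gain_kg):
    """Economic feed conversion ratio: all feed delivered divided by net biomass gained."""
    return total_feed_fed_kg / biomass_gain_kg

# Illustrative production cycle only.
tgc = thermal_unit_growth_coefficient(initial_weight_g=50, final_weight_g=800, degree_days=3400)
fcr = economic_fcr(total_feed_fed_kg=108_000, biomass_gain_kg=80_000)
print(f"TGC = {tgc:.3f}, economic FCR = {fcr:.2f}")
```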
293

Multiobjective Optimization Algorithm Benchmarking and Design Under Parameter Uncertainty

LALONDE, NICOLAS 13 August 2009 (has links)
This research aims to improve our understanding of multiobjective optimization by comparing the performance of five multiobjective optimization algorithms and by proposing a new formulation to consider input uncertainty in multiobjective optimization problems. Four deterministic multiobjective optimization algorithms and one probabilistic algorithm were compared: the Weighted Sum, Adaptive Weighted Sum, Normal Constraint, and Normal Boundary Intersection methods, and the Nondominated Sorting Genetic Algorithm-II (NSGA-II). The algorithms were compared using six test problems covering a wide range of optimization problem types (bounded vs. unbounded, constrained vs. unconstrained). Performance metrics used for quantitative comparison were total run (CPU) time, number of function evaluations, variance in solution distribution, and numbers of dominated and non-optimal solutions. Graphical representations of the resulting Pareto fronts were also presented. No single method outperformed the others on all performance metrics, and the two classes of algorithms were effective for different types of problems. NSGA-II did not effectively solve problems involving unbounded design variables or equality constraints. On the other hand, the deterministic algorithms could not solve a problem with a non-continuous objective function. In the second phase of this research, design under uncertainty was considered in multiobjective optimization. The effects of input uncertainty on a Pareto front were quantitatively investigated by developing a multiobjective robust optimization framework. Two possible effects on a Pareto front were identified: a shift away from the Utopia point, and a shrinking of the Pareto curve. A set of Pareto fronts was obtained in which the optimum solutions have different levels of insensitivity or robustness. Four test problems were used to examine the change in the Pareto front. Increasing the insensitivity requirement of the objective function with regard to input variations moved the Pareto front away from the Utopia point or reduced the length of the Pareto front. These changes were quantified, and the effects of changing robustness requirements were discussed. The approach provides designers not only with the choice of optimal solutions on a Pareto front, as in traditional multiobjective optimization, but also with an additional choice of a suitable Pareto front according to the acceptable level of performance variation. / Thesis (Master, Mechanical and Materials Engineering) -- Queen's University, 2009-08-10 21:59:13.795
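
As a small illustration of the simplest of the deterministic methods compared above, the sketch below traces a Pareto front for a two-objective problem with the Weighted Sum method. It assumes SciPy is available and uses a toy convex problem, not one of the six test problems from the thesis.

```python
import numpy as np
from scipy.optimize import minimize

# Toy bi-objective problem: f1 = x1^2 + x2^2, f2 = (x1 - 1)^2 + (x2 - 1)^2
def f1(x): return x[0] ** 2 + x[1] ** 2
def f2(x): return (x[0] - 1) ** 2 + (x[1] - 1) ** 2

pareto_points = []
for w in np.linspace(0.0, 1.0, 11):              # sweep the scalarization weight
    scalarized = lambda x, w=w: w * f1(x) + (1 - w) * f2(x)
    res = minimize(scalarized, x0=[0.5, 0.5])    # single-objective solve per weight
    pareto_points.append((f1(res.x), f2(res.x)))

for p1, p2 in pareto_points:
    print(f"f1 = {p1:.3f}, f2 = {p2:.3f}")
```

For convex problems like this one, sweeping the weight recovers the whole front; the thesis's comparison matters precisely because this breaks down on non-convex or non-continuous problems.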
294

A benchmarking of the South African Liquid Fuels industry with that of the United States of America.

Ballim, Kamil. January 2006 (has links)
The South African Liquid Fuels Industry is currently in a state of flux. A new fuel pricing mechanism has been rolled out, and new legislation enacted within the past few years has significantly impacted the business environment. The Main Supply Agreement, which governed the marketing and distribution of Sasol's liquid fuel products, also came to an end on the 1st of January 2004. All these drivers have had a major impact on the business environment in which the Sasol Liquid Fuels Business operated. The government has also stated its policy to further deregulate the industry, including the pricing structure of liquid fuel. The United States Liquid Fuels Industry is the largest in the world and is based on an unregulated fuel price, leading to price competition among competing retailers. It therefore serves as a good basis with which to compare and evaluate the South African industry. A comprehensive industry analysis is performed in order to explore, understand and describe the nature of the liquid fuels business environments in the USA and South Africa. The Delphi technique was used to gather primary data on the state of the South African Liquid Fuels industry. Thereafter a benchmarking of the South African Liquid Fuels Industry is carried out using primary data from the Delphi study and secondary data from the literature review. The techniques used include PEST analysis and Porter's Five Forces. Aspects of the United States industry that are similar to the South African industry are identified. A scenario for a future deregulated South African Liquid Fuels Industry is described, and a marketing and distribution strategy for Sasol is proposed. / Thesis (MBA)-University of KwaZulu-Natal, Westville, 2006.
295

Scalable and robust compute capacity multiplexing in virtualized datacenters

Kesavan, Mukil 27 August 2014 (has links)
Multi-tenant cloud computing datacenters run diverse workloads, inside virtual machines (VMs), with time-varying resource demands. Compute capacity multiplexing systems dynamically manage the placement of VMs on physical machines to ensure that their resource demands are always met while simultaneously optimizing the total datacenter compute capacity being used. In essence, they give the cloud its fundamental property of being able to dynamically expand and contract the resources required on demand. At large-scale datacenters, though, there are two practical realities that designers of compute capacity multiplexing systems need to deal with: (a) maintaining low operational overhead given the variable cost of performing the management operations necessary to allocate and multiplex resources, and (b) the prevalence of a large number and wide variety of faults, in hardware, in software, and due to human error, that impair multiplexing efficiency. In this thesis we propound the notion that explicitly designing the methods and abstractions used in capacity multiplexing systems for this reality is critical to better achieving administrator and customer goals at large scales. To this end the thesis makes the following contributions: (i) CCM, a hierarchically organized compute capacity multiplexer that demonstrates that simple designs can be highly effective at multiplexing capacity with low overheads at large scales compared to complex alternatives; (ii) Xerxes, a distributed load generation framework for flexibly and reliably benchmarking compute capacity allocation and multiplexing systems; and (iii) a speculative virtualized infrastructure management stack that dynamically replicates management operations on virtualized entities, together with a compute capacity multiplexer for this environment, which jointly provide fault-scalable management performance for a broad class of commonly occurring faults in large-scale datacenters. Our systems have been implemented in an industry-strength cloud infrastructure built on top of the VMware vSphere virtualization platform and the popular open source OpenStack cloud computing platform, running the ESXi and Xen hypervisors, respectively. Our experiments were conducted in a 700-server datacenter using the Xerxes benchmark replaying trace data from production clusters, simulating parameterized scenarios like flash crowds, and also using a suite of representative cloud applications. Results from these scenarios demonstrate the effectiveness of our design techniques in real-life large-scale environments.
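
The core placement decision a capacity multiplexer makes can be illustrated with a very small sketch. This is not CCM's actual algorithm (which is hierarchical); it is only a hedged first-fit-decreasing placement over a single normalized CPU dimension, to show the flavour of the problem.

```python
from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    capacity: float            # normalized CPU capacity
    used: float = 0.0
    vms: list = field(default_factory=list)

def first_fit_decreasing(vm_demands, hosts):
    """Place VMs (name -> demand) onto hosts, largest demand first.
    Returns VMs that did not fit, so the caller can expand capacity on demand."""
    unplaced = []
    for vm, demand in sorted(vm_demands.items(), key=lambda kv: -kv[1]):
        target = next((h for h in hosts if h.used + demand <= h.capacity), None)
        if target is None:
            unplaced.append(vm)
        else:
            target.used += demand
            target.vms.append(vm)
    return unplaced

hosts = [Host("pm1", 1.0), Host("pm2", 1.0)]
leftover = first_fit_decreasing({"vm-a": 0.6, "vm-b": 0.5, "vm-c": 0.4, "vm-d": 0.7}, hosts)
print([(h.name, h.vms) for h in hosts], "unplaced:", leftover)
```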
296

Wearable Heart Rate Measuring Unit

Patancheru, Govardhan Reddy January 2014 (has links)
Despite the many heart rate measuring devices that have evolved over the years and the progress in their development, implementing modern signal processing in a comparatively small wearable device remains a challenge. This thesis presents a wearable reflectance photoplethysmography (PPG) sensor system for measuring the heart rate of a user in both steady and moving states. The size and power consumption of the device were considered during development to ensure easy deployment of the unit at the measuring site and the ability to power the entire unit from a battery. The selection of both the electronic circuits and the signal processing techniques is based on their sensitivity to PPG signals, robustness against noise-inducing artifacts, and miniaturization of the entire measuring unit. The entire signal chain operates in discrete time, which allows all signal processing to be implemented in firmware on an embedded microprocessor. The PPG sensor system is implemented on a single PCB that consumes around 7.5 mW of power. Benchmarking tests against standard heart rate measuring devices reveal that the developed measurement unit (a combination of the PPG sensor system, an inertial measurement unit (IMU) developed in-house at Acreo Swedish ICT, and a battery) is comparable to those devices in detecting heart rate, even in the presence of motion artifacts. This thesis work was carried out at Acreo Swedish ICT, Gothenburg, Sweden, in collaboration with Mid Sweden University, Sundsvall, Department of Electronics Design. This report can be used as groundwork for future development of wearable heart rate measuring units at Acreo Swedish ICT.
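
A minimal sketch of the kind of discrete-time processing such a firmware pipeline performs is given below: band-limit the PPG samples and count peaks to estimate beats per minute. It assumes NumPy/SciPy, a 100 Hz sampling rate, and a synthetic signal, and it is not the algorithm implemented on the device.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 100  # assumed sampling rate in Hz

def heart_rate_bpm(ppg, fs=FS):
    """Estimate heart rate from a raw PPG segment by band-pass
    filtering (0.7-3.5 Hz, roughly 42-210 bpm) and counting peaks."""
    b, a = butter(2, [0.7 / (fs / 2), 3.5 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ppg)
    peaks, _ = find_peaks(filtered, distance=int(fs * 0.3))  # ~0.3 s refractory period
    duration_s = len(ppg) / fs
    return 60.0 * len(peaks) / duration_s

# Synthetic 10 s PPG-like signal at 72 bpm (1.2 Hz) with noise, for illustration.
t = np.arange(0, 10, 1 / FS)
ppg = np.sin(2 * np.pi * 1.2 * t) + 0.2 * np.random.randn(t.size)
print(f"estimated heart rate: {heart_rate_bpm(ppg):.0f} bpm")
```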
297

A benchmarking model for harmonic distortion in a power system / Johnny Rudolph

Rudolph, Johnny January 2011 (has links)
The present power system is loaded with sophisticated energy conversion technologies such as solid-state converters. With the rapid advance in semiconductor technology, power electronics have provided new devices that are highly efficient and reliable. These devices are inherently non-linear, which causes the current to deviate from sinusoidal conditions. This phenomenon is known as harmonic current distortion. Multiple consumers are connected to the utility at the point of common coupling. Harmonic currents are injected into the distribution system by various solid-state users, and this can lead to voltage distortion. Harmonic distortion is just one aspect of power quality and is not desirable in a power system. Distortion can cause multiple problems in the form of additional heating, increased power losses and even failure of sensitive equipment. Utility companies like Eskom have power quality monitors at various points in their distribution systems. Data measurements are taken at a single point of delivery during certain time intervals and stored in a database. Such individual harmonic measurements cannot, on their own, describe the distortion patterns of the whole distribution system; the data must be analysed and translated into useful managerial information. The aim of this project is to develop a benchmarking methodology that can provide the supply industry with useful information to effectively manage harmonic distortion in a distribution system. The methodology implements distortion indices set forth by the Electric Power Research Institute [3], which describe distortion levels in both a qualitative and a quantitative way. Harmonic measurements from the past two years will be used to test the methodology. The information is obtained from Eskom's database and will benchmark the North-West Province distribution network [40]. The proposed methodology aims to aid institutions like NERSA in establishing a reliable power quality management system. / Thesis (M.Ing. (Nuclear Engineering))--North-West University, Potchefstroom Campus, 2012
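
One of the core quantities such a methodology works with is total harmonic distortion. A minimal sketch of how THD can be computed from the harmonic magnitudes recorded by a power quality monitor is shown below; it assumes per-harmonic RMS voltage magnitudes are available and is not the EPRI index set referenced in the thesis.

```python
import math

def total_harmonic_distortion(harmonic_rms):
    """THD (%) from a dict of harmonic order -> RMS voltage magnitude.
    THD = sqrt(sum of V_h^2 for h >= 2) / V_1 * 100."""
    fundamental = harmonic_rms[1]
    distortion = math.sqrt(sum(v ** 2 for h, v in harmonic_rms.items() if h >= 2))
    return 100.0 * distortion / fundamental

# Illustrative single measurement at one point of delivery (volts, RMS).
measurement = {1: 230.0, 3: 6.9, 5: 9.2, 7: 4.6, 11: 2.3}
print(f"THD = {total_harmonic_distortion(measurement):.2f}%")
```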
298

The measurement of university performance using concepts derived from data envelopment analysis (DEA)

Wilkinson, Robert H. January 1991 (has links)
Performance measurement in higher education is examined in this study; in particular, university performance indicators are reviewed and discussed. The conclusion is drawn that appropriate input and output indicators must be combined in some form before practical consideration can be made of them. The technique of Data Envelopment Analysis (DEA) is reviewed and found to have a number of conceptual drawbacks. The model is considerably developed within the thesis, primarily by the introduction of weight restrictions on the variables. Taken as a whole, the developments, coined the DEAPMAS process, create a technique which can be used to assess cost effectiveness rather than simply efficiency. Data for two example subject areas, defined by recognised accounting units, are applied to the program, as inter-university comparison was felt to be impractical at the institutional level due to differing subject mixes. A substantial computer implementation of the developed theory was written and utilised to provide results over a number of data runs for the examples. It was concluded that the results obtained represented a considerable improvement over the separate consideration of numerous performance indicators.
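
The standard DEA building block that the thesis extends (by restricting the variable weights) is a small linear program solved once per decision-making unit. A minimal sketch of the input-oriented CCR multiplier model, using SciPy's linprog, is shown below; the data are invented and the weight restrictions of the DEAPMAS development are not included.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, k):
    """Input-oriented CCR efficiency of DMU k.
    inputs: (n_dmu, m) array, outputs: (n_dmu, s) array."""
    X, Y = np.asarray(inputs, float), np.asarray(outputs, float)
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: output weights u (length s), then input weights v (length m).
    c = np.concatenate([-Y[k], np.zeros(m)])                    # maximize u . y_k
    A_ub = np.hstack([Y, -X])                                   # u.y_j - v.x_j <= 0 for all j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[k]]).reshape(1, -1)   # v . x_k = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun                                             # efficiency score in (0, 1]

# Invented data: 4 departments, 2 inputs (staff, budget), 2 outputs (graduates, papers).
X = [[20, 1.2], [30, 2.0], [25, 1.5], [40, 2.8]]
Y = [[150, 60], [200, 70], [210, 90], [230, 80]]
for k in range(4):
    print(f"DMU {k}: efficiency = {ccr_efficiency(X, Y, k):.3f}")
```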
299

Airport car parking strategy : lessons from the non-airport sector

Straker, Ian January 2006 (has links)
Despite September 11th 2001, many international airports are operating close to capacity, a problem that is likely to become more acute given the projected long-term growth in air traffic. This growth is likely to have major implications for runway, terminal and surface access capacity, infrastructure which is already experiencing constraints. Surface access is as much an issue, if not more, for employees accessing the airport as it is for passengers. Typically, one third of access traffic can be attributed to employees. Employees represent a particular problem for airports in terms of surface access due to the frequent, peak-hour nature of the trips made and their higher rates of car use compared to passengers. A range of initiatives exists to encourage employees to use modes other than the private car, but overall these measures tend to be 'soft' in nature, and one generally has to look to the non-airport sector for examples of more innovative 'harder' initiatives such as financial incentive and disincentive car parking measures applied directly to employees. This thesis utilises a series of carefully selected interviews and focus groups at Heathrow Airport and three best-practice non-airport organisations, underpinned by a process grounded in the concept and methodology of best practice benchmarking, to suggest areas where BAA, and potentially airport authorities around the world, could learn from other organisations in the area of employee surface access and specifically car parking. It is concluded that there are four key areas airports should focus on to explore the issues surrounding the implementation of a car parking charge or car parking cash-out direct to employees, namely: the use of a package approach; a requirement for top management support; gaining acceptance from employees; and the issues surrounding the process of implementation. The thesis contributes to knowledge in a number of areas, predominantly the use of benchmarking in the area of car parking and the airport sector, an industry which has thus far not adopted the technique to any great extent.
300

Assessment of TQM implementation

Woon, Kin Chung. Unknown Date (has links)
Thesis (DBA (Doctorate of Business Administration))--University of South Australia, 2001.
