  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Achievement in education : improving measurement and testing models and predictors

McIlroy, David January 2000 (has links)
No description available.
2

An investigation of the role of simulation in the performance prediction of data parallel Fortran (HPF) programs

Vassiliou, Vassilios January 1999 (has links)
No description available.
3

Performance Prediction of Quantization Based Automatic Target Recognition Algorithms

Horvath, Matthew Steven January 2015 (has links)
No description available.
4

Iterative compilation and performance prediction for numerical applications

Fursin, Grigori G. January 2004 (has links)
As the current rate of improvement in processor performance far exceeds the rate of memory performance, memory latency is the dominant overhead in many performance critical applications. In many cases, automatic compiler-based approaches to improving memory performance are limited and programmers frequently resort to manual optimisation techniques. However, this process is tedious and time-consuming. Furthermore, a diverse range of a rapidly evolving hardware makes the optimisation process even more complex. It is often hard to predict the potential benefits from different optimisations and there are no simple criteria to stop optimisations i.e. when optimal memory performance has been achieved or sufficiently approached. This thesis presents a platform independent optimisation approach for numerical applications based on iterative feedback-directed program restructuring using a new reasonably fast and accurate performance prediction technique for guiding optimisations. New strategies for searching the optimisation space, by means of profiling to find the best possible program variant, have been developed. These strategies have been evaluated using a range of kernels and programs on different platforms and operating systems. A significant performance improvement has been achieved using new approaches when compared to the state-of-the-art native static and platform-specific feedback directed compilers.
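The iterative feedback-directed search described above can be sketched as a profile-and-keep-best loop over an optimisation space. The transformation parameters and cost function here are hypothetical stand-ins for real compile-and-profile runs:

```python
import itertools
import random

# Hypothetical "profiler": in a real setting this would compile the
# program with the given transformation parameters and time a run.
def measure_runtime(tile_size, unroll_factor):
    # Synthetic cost model with a best point at tile_size=64,
    # unroll_factor=4 (purely illustrative).
    return abs(tile_size - 64) * 0.01 + abs(unroll_factor - 4) * 0.05 + 1.0

def iterative_search(tile_sizes, unroll_factors, budget=20):
    """Profile a sample of the optimisation space; keep the best variant."""
    space = list(itertools.product(tile_sizes, unroll_factors))
    random.seed(0)
    candidates = random.sample(space, min(budget, len(space)))
    best = min(candidates, key=lambda v: measure_runtime(*v))
    return best, measure_runtime(*best)

best_variant, runtime = iterative_search([8, 16, 32, 64, 128], [1, 2, 4, 8])
print(best_variant, runtime)
```

The thesis's contribution is precisely in replacing the expensive `measure_runtime` profiling runs with a fast, reasonably accurate performance prediction, so the search can cover more of the space within the same budget.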
5

Evaluation of the NRC 1996 winter feed requirements for beef cows in western Canada

Bourne, Jodi Lynn 28 February 2007
A trial was conducted to evaluate the accuracy of the 1996 NRC beef model in predicting DMI and ADG of pregnant cows under western Canadian conditions. Over two consecutive years, 90 Angus cows (587±147 kg) were assigned to 15 pens (N=6) and fed typical diets ad libitum, formulated to stage of pregnancy. Data collection included pen DMI and ADG (corrected for pregnancy), calving date, calf weight, body condition scores, ultrasound fat measurements, weekly feed samples and daily ambient temperature. DMI and ADG for each pen of cows in each trimester were predicted using the computer program Cowbytes, based on the 1996 NRC beef model. The results indicate that in the 2nd and 3rd trimesters of both years the model under-predicted (P≤0.05) ADG based on observed DMI. Ad libitum intake was over-predicted (P≤0.05) during the 2nd trimester and under-predicted (P≤0.05) during the 3rd trimester of pregnancy. A second evaluation was carried out assuming thermoneutral (TN) conditions. In this case, during the 2nd and 3rd trimesters there was an over-prediction (P≤0.05) of ADG relative to observed values, while under these same TN conditions the ad libitum intake of the cows was under-predicted (P≤0.05) for both trimesters. These results suggest that the current energy equations for modelling environmental stress over-predict maintenance requirements for wintering beef cows in western Canada. They also suggest that the cows experienced some degree of cold stress, but not as severe as modelled by the NRC (1996) equations. Further research is required to more accurately model the cold stress experienced by mature cattle and their ability to acclimatise to western Canadian winter conditions.
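The kind of pen-level comparison of observed versus model-predicted values described above can be sketched as a paired test for prediction bias. The data below are hypothetical stand-ins, not values from the trial:

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical pen-level data: observed vs model-predicted DMI (kg/d)
# for one trimester. Real values would come from the trial itself.
observed  = [11.8, 12.1, 11.5, 12.4, 11.9, 12.0]
predicted = [12.6, 12.9, 12.2, 13.1, 12.7, 12.8]

def paired_t(obs, pred):
    """Mean prediction bias (pred - obs) and its paired t statistic."""
    diffs = [p - o for o, p in zip(obs, pred)]
    n = len(diffs)
    return mean(diffs), mean(diffs) / (stdev(diffs) / sqrt(n))

bias, t_stat = paired_t(observed, predicted)
print(f"mean over-prediction: {bias:.2f} kg/d, t = {t_stat:.1f}")
```

A t statistic beyond the critical value for n−1 degrees of freedom would indicate a significant over- or under-prediction, as reported at P≤0.05 above.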
6

Metareasoning about propagators for constraint satisfaction

Thompson, Craig Daniel Stewart 11 July 2011
Given the breadth of constraint satisfaction problems (CSPs) and the wide variety of CSP solvers, it is often very difficult to determine a priori which solving method is best suited to a problem. This work explores the use of machine learning to predict which solving method will be most effective for a given problem. We use four different problem sets to determine the CSP attributes that can be used to decide which solving method should be applied. After choosing an appropriate set of attributes, we determine how well J48 decision trees can predict which solving method to apply. Furthermore, we take a cost-sensitive approach that emphasizes problem instances where there is a great difference in runtime between algorithms. We also attempt to use information gained on one class of problems to inform decisions about a second class of problems. Finally, we show that the additional cost of deciding which method to apply is outweighed by the time savings compared to applying the same solving method to all problem instances.
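The algorithm-selection idea above can be sketched with a decision-tree classifier over per-instance CSP attributes. The features, labels, and propagation methods below are illustrative assumptions, not the thesis's actual attribute set (which used J48 in Weka rather than scikit-learn):

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training data: per-instance CSP attributes
# [n_variables, n_constraints, constraint_tightness] and the label of
# the fastest propagation method observed on that instance
# (0 = forward checking, 1 = arc consistency).
X = [
    [20, 30, 0.10], [25, 40, 0.20], [30, 50, 0.15],   # loose instances
    [20, 80, 0.70], [25, 90, 0.80], [30, 100, 0.75],  # tight instances
]
y = [0, 0, 0, 1, 1, 1]

clf = DecisionTreeClassifier(random_state=0).fit(X, y)

# Pick a solving method for an unseen instance before solving it.
choice = clf.predict([[22, 85, 0.72]])[0]
print("use", ["forward checking", "arc consistency"][choice])
```

In practice the classifier would be trained cost-sensitively, weighting instances by the runtime gap between methods, as the abstract describes.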
9

Evaluating MapReduce System Performance: A Simulation Approach

Wang, Guanying 13 September 2012 (has links)
The scale of data generated and processed is exploding in the Big Data era. The MapReduce system, popularized by open-source Hadoop, is a powerful tool for this exploding-data problem and is widely employed in many areas involving large-scale data. In many circumstances, hypothetical MapReduce systems must be evaluated, e.g. to provision a new MapReduce system to meet a certain performance goal, to upgrade a currently running system to meet increasing business demands, or to evaluate a novel network topology, new scheduling algorithms, or resource arrangement schemes. The traditional trial-and-error solution involves a time-consuming and costly process in which a real cluster is first built and then benchmarked. In this dissertation, we propose to evaluate hypothetical MapReduce systems using simulation. This approach offers significantly lower turn-around time and lower cost than experiments. Simulation cannot entirely replace experiments, but it can be used as a preliminary step to reveal potential flaws and gain critical insights. We studied MapReduce systems in detail and developed a comprehensive performance model for MapReduce, including sub-task phase-level performance models for both map and reduce tasks and a model for resource contention between multiple concurrently running processes. Based on this performance model, we developed a comprehensive simulator for MapReduce, MRPerf. MRPerf is the first full-featured MapReduce simulator; it supports both workload simulation and resource contention, and it still offers the most complete feature set among all MapReduce simulators to date. Using MRPerf, we conducted two case studies to evaluate scheduling algorithms and shared storage in MapReduce, without building real clusters.
Furthermore, to further integrate simulation and performance prediction into MapReduce systems and leverage predictions to improve system performance, we developed an online prediction framework for MapReduce, which periodically runs simulations within a live Hadoop MapReduce system. The framework can predict task execution within a window in the near future, and these predictions can be used by other components of a MapReduce system to improve performance. Our results show that the framework achieves high prediction accuracy and incurs negligible overhead. We present two potential use cases: prefetching and a dynamically adapting scheduler. / Ph. D.
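A phase-level MapReduce completion-time model of the kind described above can be sketched as follows. This is a toy wave-based estimate under assumed uniform task durations, not MRPerf's far more detailed sub-phase model:

```python
import math

def estimate_job_time(n_maps, n_reduces, map_slots, reduce_slots,
                      map_task_s, shuffle_s, reduce_task_s):
    """Phase-level completion-time estimate for one MapReduce job.

    Assumes uniform task durations and wave-based scheduling: each
    phase runs in ceil(n_tasks / slots) waves of equal length.
    """
    map_waves = math.ceil(n_maps / map_slots)
    reduce_waves = math.ceil(n_reduces / reduce_slots)
    return (map_waves * map_task_s      # map phase
            + shuffle_s                 # shuffle between phases
            + reduce_waves * reduce_task_s)  # reduce phase

# 100 map tasks on 25 map slots (4 waves), 10 reduce tasks on 10 slots.
print(estimate_job_time(100, 10, 25, 10, 30.0, 20.0, 45.0))
```

A full simulator additionally models per-sub-phase I/O, network topology, and contention between concurrent processes, which this sketch deliberately omits.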
10

An Enhanced MapReduce Workload Allocation Tool for Spot Market Resources

Hudzina, John Stephen 29 March 2015 (has links)
When a cloud user allocates a cluster to execute a map-reduce workload, the user must determine the number and type of virtual machine instances that minimize the workload's financial cost. The cloud user may rent on-demand instances at a fixed price or spot instances at a variable price to execute the workload. Although the cloud user may bid on spot virtual machine instances at a reduced rate, the spot market auction may delay the workload's start or terminate the spot instances before the workload completes. The cloud user therefore requires a forecast of the workload's financial cost and completion time to analyze the trade-offs between on-demand and spot instances. While existing estimation tools predict map-reduce workloads' completion times and costs, these tools do not provide spot instance estimates, because a spot market auction determines an instance's start time and duration. Ephemeral spot instances affect execution time estimates because the spot market auction forces map-reduce workloads to use different storage strategies to persist data after the spot instances terminate. The spot market also reduces the accuracy of existing tools' completion time and cost estimates because a tool must factor in spot instance wait times and early terminations. This dissertation updated an existing tool to forecast a map-reduce workload's monetary cost and completion time based on historical spot market traces. The enhanced estimation tool includes three new enhancements over existing tools. First, it models the impact of the new storage strategies on execution time. Second, it calculates the additional execution time caused by early spot instance termination. Finally, it predicts the workload's wait time and early termination probabilities from historical traces.
Based on two historical Amazon EC2 spot market traces, the enhancements reduce the average completion time prediction error by 96% and the average monetary cost prediction error by 99% over existing tools.
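Deriving wait-time and termination estimates from a historical price trace, as described above, can be sketched like this. The trace and bid below are made-up numbers, not values from the EC2 traces used in the dissertation:

```python
# Hypothetical hourly spot-price trace (USD/hour); a real analysis
# would replay historical Amazon EC2 spot market traces.
trace = [0.10, 0.12, 0.35, 0.11, 0.09, 0.40, 0.10, 0.12, 0.11, 0.10]

def spot_estimates(prices, bid):
    """Estimate availability, expected price paid, and per-hour
    termination probability for a given bid against a price trace."""
    granted = [p for p in prices if p <= bid]
    availability = len(granted) / len(prices)
    expected_price = sum(granted) / len(granted) if granted else None
    # Granted hours immediately followed by a price spike above the bid
    # correspond to early spot instance terminations.
    terminations = sum(
        1 for i in range(len(prices) - 1)
        if prices[i] <= bid < prices[i + 1]
    )
    term_prob = terminations / len(granted) if granted else None
    return availability, expected_price, term_prob

avail, price, term = spot_estimates(trace, bid=0.15)
print(f"available {avail:.0%}, avg ${price:.3f}/h, termination p={term:.2f}")
```

These per-hour probabilities would then feed the workload-level cost and completion-time forecast, alongside the storage-strategy and restart-overhead models.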
