551 |
Multidimensional approaches to performance evaluation of competing forecasting models. Xu, Bing, January 2009.
The purpose of my research is to contribute to the field of forecasting from a methodological perspective, as well as to the field of crude oil as an application area in which to test the performance of my methodological contributions and assess their merits. In sum, two main methodological contributions are presented. The first contribution consists of proposing a mathematical programming based approach, commonly referred to as Data Envelopment Analysis (DEA), as a multidimensional framework for the relative performance evaluation of competing forecasting models or methods. As opposed to other performance measurement and evaluation frameworks, DEA allows one to identify the weaknesses of each model, as compared to the best one(s), and suggests ways to improve their overall performance. DEA is a generic framework, and as such its implementation for a specific relative performance evaluation exercise requires a number of decisions to be made, such as the choice of the units to be assessed, the choice of the relevant inputs and outputs to be used, and the choice of the appropriate models. In order to present and discuss how one might adapt this framework to measure and evaluate the relative performance of competing forecasting models, we first survey and classify the literature on performance criteria and their measures – including statistical tests – commonly used in evaluating and selecting forecasting models or methods. This classification serves as a basis for the operationalisation of DEA. Finally, we test DEA's performance in evaluating and selecting models to forecast crude oil prices. The second contribution consists of proposing a Multi-Criteria Decision Analysis (MCDA) based approach as a multidimensional framework for the relative performance evaluation of competing forecasting models or methods. In order to present and discuss how one might adapt such a framework, we first revisit MCDA methodology, propose a revised methodological framework that consists of a sequential decision-making process with feedback adjustment mechanisms, and provide guidelines as to how to operationalise it. Finally, we adapt this methodological framework to address the problem of performance evaluation of competing forecasting models. For illustration purposes, we have chosen the forecasting of crude oil prices as an application area.
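To make the DEA-based evaluation concrete, the following is a minimal sketch of an input-oriented CCR efficiency score for competing forecasting models. The choice of error measures (RMSE, MAE) as inputs and directional accuracy as an output, and all the numbers, are illustrative assumptions, not the operationalisation developed in the thesis.

```python
"""Illustrative sketch: scoring competing forecasting models with an
input-oriented CCR DEA model.  Inputs/outputs below are hypothetical."""
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, unit):
    """CCR multiplier form for one decision-making unit (forecasting model).

    inputs  : (n_units, n_inputs)  array, smaller is better (e.g. RMSE, MAE)
    outputs : (n_units, n_outputs) array, larger is better (e.g. hit rate)
    unit    : index of the model being scored
    Returns efficiency in (0, 1]; 1 means the model lies on the best-practice frontier.
    """
    n, m_in = inputs.shape
    _, m_out = outputs.shape
    # decision variables: output weights u (m_out entries), then input weights v (m_in entries)
    c = np.concatenate([-outputs[unit], np.zeros(m_in)])         # maximise u.y_o
    A_eq = np.concatenate([np.zeros(m_out), inputs[unit]])[None, :]
    b_eq = [1.0]                                                 # normalise v.x_o = 1
    A_ub = np.hstack([outputs, -inputs])                         # u.y_j - v.x_j <= 0 for every model j
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (m_out + m_in), method="highs")
    return -res.fun

# Hypothetical scores for three crude-oil-price forecasting models:
# inputs = [RMSE, mean absolute error], output = [directional accuracy].
errs = np.array([[1.8, 1.2], [2.4, 1.9], [1.6, 1.5]])
acc  = np.array([[0.61], [0.55], [0.64]])
for k in range(3):
    print(f"model {k}: DEA efficiency = {ccr_efficiency(errs, acc, k):.3f}")
```

With these made-up numbers the dominated second model scores below 1, which is the kind of weakness, relative to the best model(s), that DEA is meant to flag.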
|
552 |
The quality of environmental management frameworks in South Africa / Marius Marais. Marais, Marius, January 2010.
Environmental assessments and authorisations surrounding project level developments are often made in
isolation, without consideration of the regional or strategic context within which individual developments
take place. This research investigates the quality of Environmental Management Frameworks (EMFs) as a strategic
environmental instrument. The EMF is a unique South African instrument that was first conceptualised in 1989,
enacted in 2006 and updated in 2010. EMFs were developed to map environmental sensitivity to aid the
screening out of undesired developments in sensitive environments and to minimise unnecessary project
level assessments in preferred development areas. EMFs form an important link between environmental
assessment (EA) processes and planning strategies such as Spatial Development Frameworks (SDFs) and
Integrated Development Plans (IDPs), due to their spatial output of environmental sensitivity maps and
their ability to feed strategic assessment processes required by SDFs. They have a legal mandate which
ensures their assimilation and use.
This research uses a multiple case study approach to review seven EMF documents for their quality. The
quality aspects identified are the process, methodology and documentation components, using the printed
EMF documentation as the primary information source. Quality review criteria were subsequently developed to
investigate these inputs, using the legal mandate of EMFs as a basis. Each case was rated for compliance with
the quality criteria using a six-level rating schedule. Further analyses compared the
performance of the cases against one another.
Public participation emerged as the weakest component of EMF practice, while aspects of sensitivity analysis
were also weaker than other aspects. More focus is required on aligning the scales and resolutions of map
inputs, on mapping methods and on the general integration of spatial data, especially those of adjoining districts. The
rationale for buffer determination also needs to be better substantiated. The practice of
conducting EMFs is well established, and they can be valuable in sustainable development planning and decision-making.
Recommendations are made to enhance the sustainability outcomes, and hence the effectiveness, of this
instrument, together with future research objectives for increasing its utility. / Thesis (M. Environmental Management)--North-West University, Potchefstroom Campus, 2011.
|
554 |
Performance evaluation of video streaming over multi-hop wireless local area networks. Li, Deer, 10 July 2008.
Internet Protocol Television (IPTV) has become the application that drives the
Internet to new heights. However, challenges remain in IPTV in-home distribution. High-quality video streaming in IPTV services demands that home networks deliver video packets under stringent Quality-of-Service (QoS) requirements. Currently, most service providers recommend Ethernet-based broadband home networks for IPTV. However, many existing houses are not wired with Ethernet cables and the cost of rewiring is prohibitive. Therefore, wireless solutions are preferred if their performance can meet the requirements. IEEE 802.11 wireless local area networks (WLANs) are pervasively adopted in home networks for their flexibility and affordability. However, through our experiments in a real environment, we found that conventional single-hop infrastructure-mode WLANs have very limited capacity and coverage in a typical indoor environment due to high
attenuation and interference. Single-hop wireless networks cannot support
high-quality video streaming throughout the entire house. Multi-hop wireless networks
are therefore used to extend the coverage. Contrary to the common belief that adding relay routers in the same wireless channel should reduce throughput, our experimental, analytical and simulation results show that multi-hop IEEE 802.11 WLANs can improve both capacity and coverage in certain scenarios, and sufficiently support high-quality video streaming in a typical house. In this research, we analyzed and evaluated the performance of H.264-based video streaming over multi-hop wireless networks. Our analysis and simulation results reveal a wide spectrum of
coverage-capacity tradeoffs for multi-hop wireless networks in generic scenarios. Moreover,
we discuss methods to further improve video streaming performance.
This research provides guidance on how to achieve the optimal balance for a given
scenario, which is of great importance when deploying end-to-end IPTV services with
QoS guarantees.
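As a rough illustration of why an extra hop can help rather than hurt, the sketch below compares a single long, low-rate link with two short, high-rate hops sharing the same channel. The rate-versus-distance steps and the 30 m span are hypothetical 802.11g-style numbers, not measurements from this thesis.

```python
"""Back-of-envelope sketch of the coverage/capacity trade-off discussed above.
All hops share one channel, so a multi-hop path's end-to-end throughput is the
harmonic combination of the per-hop rates (airtime per bit adds up)."""

def link_rate(distance_m):
    """Hypothetical effective MAC-layer throughput (Mbit/s) vs. link distance."""
    if distance_m <= 10: return 22.0   # close range, highest PHY rate
    if distance_m <= 20: return 12.0
    if distance_m <= 30: return 5.0    # heavy attenuation through walls
    return 1.0                          # barely associated

def end_to_end(hop_distances):
    """Hops contend for the same channel, so the end-to-end rate is 1 / sum(1/r_i)."""
    return 1.0 / sum(1.0 / link_rate(d) for d in hop_distances)

house_span = 30  # metres between the gateway and the far-end set-top box
direct = end_to_end([house_span])
relayed = end_to_end([house_span / 2, house_span / 2])
print(f"single hop : {direct:.1f} Mbit/s")
print(f"two hops   : {relayed:.1f} Mbit/s")
# With these numbers the relayed path wins (6.0 vs 5.0 Mbit/s): halving the link
# distance raises the per-hop rate faster than the second transmission costs airtime.
```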
|
555 |
Performance Evaluation and Prediction of Parallel Applications. Markomanolis, Georgios, 20 January 2014.
Analyzing and understanding the performance behavior of parallel applications on various compute infrastructures is a long-standing concern in the High Performance Computing community. When the targeted execution environments are not available, simulation is a reasonable approach to obtain objective performance indicators and explore various "what-if?" scenarios. In this work we present a framework for the off-line simulation of MPI applications. The main originality of our work with regard to the literature is to rely on time-independent execution traces. This allows for extreme scalability, as heterogeneous and distributed resources can be used to acquire a trace. We propose a format where, for each event that occurs during the execution of an application, we log the volume of instructions for a computation phase or the bytes and the type of a communication.
To acquire time-independent traces of the execution of MPI applications, we have to instrument them to log the required data. There exist many profiling tools which can instrument an application. We propose a scoring system that corresponds to our framework-specific requirements and evaluate the most well-known and open-source profiling tools according to it. Furthermore, we introduce an original tool called Minimal Instrumentation that was designed to fulfill the requirements of our framework. We study different instrumentation methods and we also investigate several acquisition strategies. We detail the tools that extract the time-independent traces from the instrumentation traces of some well-known profiling tools. Finally, we evaluate the whole acquisition procedure and we present the acquisition of large-scale instances.
We describe in detail the procedure to provide a realistic simulated platform file to our trace replay tool, taking into consideration the topology of the real platform and the calibration procedure with regard to the application that is going to be simulated. Moreover, we present the implemented trace replay tools that we used during this work. We show that our simulator can predict the performance of some MPI benchmarks with less than 11% relative error between the real execution and the simulation for the cases where there is no performance issue. Finally, we identify the reasons for the performance issues and we propose solutions.
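To illustrate the time-independent trace idea, here is a minimal sketch in which every event records work (instructions or bytes) rather than wall-clock time, and a crude replay turns that work back into time for an assumed platform. The record layout, the platform parameters and the replay rules are illustrative assumptions, not the actual format or replay tool described in the thesis.

```python
"""Minimal sketch of a time-independent trace and its replay: events log work,
never timestamps, so one trace can be replayed on any simulated platform."""
from dataclasses import dataclass

@dataclass
class Event:
    rank: int        # MPI rank that logged the event
    kind: str        # "compute" | "send" | "recv"
    amount: float    # instructions for compute, bytes for communication
    peer: int = -1   # partner rank for point-to-point communication

def replay(trace, flops_per_core=2e9, bandwidth=1e9, latency=5e-6):
    """Crude per-rank replay: compute time = instructions / core speed,
    communication time = latency + bytes / bandwidth.  No contention model
    and no synchronisation between matching send/recv pairs."""
    clock = {}
    for ev in trace:
        t = clock.get(ev.rank, 0.0)
        if ev.kind == "compute":
            t += ev.amount / flops_per_core
        else:
            t += latency + ev.amount / bandwidth
        clock[ev.rank] = t
    return max(clock.values())   # makespan estimate over all ranks

trace = [Event(0, "compute", 4e8), Event(0, "send", 1e6, peer=1),
         Event(1, "recv", 1e6, peer=0), Event(1, "compute", 6e8)]
print(f"predicted runtime ~ {replay(trace):.3f} s")
```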
|
556 |
An Approximation Method For Performance Measurement In Base-stock Controlled Assembly Systems. Rodoplu, Umut, 01 January 2004.
The aim of this thesis is to develop a tractable method for approximating the steady-state behavior of continuous-review base-stock controlled assembly systems with Poisson demand arrivals and manufacturing and assembly facilities modeled as Jackson networks. One class of systems studied produces a single type of finished product by assembling a number of components, and another class produces two types of finished products, allowing component commonality. The performance measures evaluated are the expected backorders, the fill rate and the stockout probability for the finished product(s). A partially aggregated but exact model is approximated by assuming that the state-dependent transition rates arising as a result of the partial aggregation are constant. This approximation leads to the derivation of a closed-form steady-state probability distribution, which is of product form. The adequacy of the proposed model in approximating the steady-state performance measures is tested against simulation experiments over a large range of parameters, and the approximation turns out to be quite accurate, with absolute errors of at most 10% for the fill rate and stockout probability, and of less than 1.37 (≈ 2) requests for
expected backorders. A greedy heuristic employing the approximate steady-state probabilities is devised to optimize base-stock levels while aiming at an overall service level for the finished product(s).
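For readers unfamiliar with these performance measures, the sketch below computes the fill rate, stockout probability and expected backorders for the simplest relative of such systems: a single-stage base-stock item whose lead-time demand is Poisson (an ample-server supply assumption). The assembly systems of the thesis are modeled as Jackson networks and are far richer; the single-stage setting and the parameter values here are illustrative assumptions only.

```python
"""Sketch: fill rate, stockout probability and expected backorders for a
single-stage base-stock item.  Assumes lead-time demand D ~ Poisson(lam * L),
which holds when each replenishment order is processed independently."""
from math import exp

def poisson_pmf(k, mu):
    p = exp(-mu)
    for i in range(1, k + 1):
        p *= mu / i
    return p

def base_stock_measures(S, lam, L, tail=200):
    mu = lam * L                                   # mean lead-time demand
    pmf = [poisson_pmf(k, mu) for k in range(S + tail)]
    fill_rate = sum(pmf[:S])                       # P(D <= S-1): demand met from stock
    stockout = 1.0 - fill_rate                     # P(D >= S)
    backorders = sum((k - S) * pmf[k] for k in range(S + 1, S + tail))  # E[(D-S)+]
    return fill_rate, stockout, backorders

fr, so, bo = base_stock_measures(S=8, lam=2.0, L=3.0)
print(f"fill rate {fr:.3f}, stockout prob {so:.3f}, expected backorders {bo:.2f}")
```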
|
557 |
On QoS Multicast Routing Protocols. Bereketli, Alper, 01 September 2005.
Multicasting is a technique used for distributing data packets from one or more sources to a set of receivers on interconnected networks. Emerging network applications bring specific quality of service (QoS) requirements such as bounded delay, minimum bandwidth, and a maximum data loss rate. Providing the required quality of service involves both routing and resource reservation. In this study, a literature survey is carried out on traditional and QoS multicast routing protocols, and the need for QoS routing protocols is investigated. QoS multicast routing protocols are classified and compared according to their multicast tree construction and resource reservation approaches. Two QoS protocols, QROUTE and QMBF, are selected, and their performance is experimentally compared using the network simulation tool Network Simulator-2 (ns-2). The objective of the simulations is to compare the QoS routing algorithms and the efficiency of the trees they construct. The first contribution of the thesis is the survey and classification of traditional and QoS multicast routing protocols. Another contribution is the ns-2 implementation of two QoS multicast routing protocols. The final contribution of the thesis is the performance evaluation of the recent protocols from a different perspective.
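As a toy illustration of the kind of QoS constraint such protocols handle, the sketch below builds a source-rooted multicast tree from least-delay paths and rejects receivers that cannot meet a delay bound. It is a generic delay-constrained heuristic over a made-up topology, not an implementation of QROUTE or QMBF.

```python
"""Toy delay-constrained multicast tree: graft each receiver's least-delay path
onto a source-rooted tree and check the end-to-end delay bound."""
import heapq

def least_delay_paths(graph, source):
    """Dijkstra over per-link delays.  graph: {node: [(neighbour, delay), ...]}."""
    dist, parent = {source: 0.0}, {source: None}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                                  # stale heap entry
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v], parent[v] = d + w, u
                heapq.heappush(heap, (d + w, v))
    return dist, parent

def multicast_tree(graph, source, receivers, delay_bound):
    dist, parent = least_delay_paths(graph, source)
    edges = set()
    for r in receivers:
        if dist.get(r, float("inf")) > delay_bound:
            raise ValueError(f"receiver {r} cannot meet the {delay_bound} ms bound")
        node = r
        while parent[node] is not None:               # graft r's path onto the tree
            edges.add((parent[node], node))
            node = parent[node]
    return edges

# hypothetical topology with per-link delays in milliseconds
g = {"s": [("a", 2), ("b", 5)], "a": [("b", 1), ("r1", 4)],
     "b": [("r2", 3)], "r1": [], "r2": []}
print(multicast_tree(g, "s", ["r1", "r2"], delay_bound=10))
```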
|
558 |
An Approximate Model For Kanban Controlled Assembly Systems. Topan, Engin, 01 September 2005.
In this thesis, an approximation is proposed to evaluate the steady-state performance of kanban controlled assembly systems. The approximation is developed for systems with two components making up an assembly, and is then extended to systems with more than two components. A continuous-time Markov model is aggregated while keeping the model exact, and this aggregate model is approximated by replacing some state-dependent transition rates with constant rates. Decomposition of the approximate aggregate model into submodels guarantees a product-form steady-state distribution for each subsystem. Finally, the submodels are combined in such a way that the size of the problem becomes independent of the number of kanbans. This brings a computational advantage when solving the combined model using numerical matrix-geometric solution algorithms. Based on numerical comparisons with simulation, the exact model, an approximate aggregate model and another approximation from a previous study in the literature, the approximation is observed to achieve good accuracy relative to its computational burden and has the potential to be a building block for the analysis of systems that are more complex but closer to real-life applications.
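The basic computation underlying such evaluations is the steady-state distribution of a continuous-time Markov chain, sketched below for a tiny made-up generator matrix. The aggregation, decomposition and matrix-geometric machinery of the thesis exist precisely because the full kanban state space is too large for this direct approach; the example only shows what solving for the steady state means.

```python
"""Generic building block: steady-state probabilities of a small continuous-time
Markov chain from its generator matrix (pi Q = 0, sum(pi) = 1).  The 3-state
generator is a toy, not the kanban model discussed above."""
import numpy as np

def ctmc_steady_state(Q):
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])      # stack pi Q = 0 with the normalisation row
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# toy generator: rows sum to zero, off-diagonal entries are transition rates
Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -1.5,  0.5],
              [ 0.5,  1.0, -1.5]])
print(ctmc_steady_state(Q))
```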
|
559 |
Perceived performance risk and its influence on Abandoned Cart Syndrome (ACS) - An exploratory study. Moore, Simon Scott, January 2004.
Despite predictions of Internet shopping reaching 6.9 trillion dollars by the end of 2004, research now suggests that many online consumers are still very reluctant to complete the online shopping process. A number of authors have attributed consumers' reluctance to purchase online to apparent barriers; however, such barriers have not been fully examined within a theoretical context. While most studies of consumers' decision to shop on the Internet have focussed on key shopping determinants, this thesis builds a conceptual model grounded in consumer behaviour theory. In particular, this thesis explores the application of the perceived risk theoretical framework, specifically looking at one dimension of perceived risk theory, performance risk, and the influence it has on the phenomenon of Internet Abandoned Cart Syndrome (ACS). To explore this phenomenon, a number of extrinsic cues are identified as playing a major role in the performance evaluation process of online purchases. The combination of these elements enabled the researcher to develop a conceptual model from which a series of propositions were drawn. To acquire pertinent data and investigate each proposition, this study used a combination of indirect and direct techniques, namely projective techniques in the form of a third-person vignette, a structured tick-box questionnaire and, finally, semi-structured interviews. The results suggest that, collectively, the extrinsic cues of brand, reputation, design and price have an overall impact on the performance evaluation process just prior to an online purchase. Varying these cues either positively or negatively had a strong impact on performance evaluation. The conclusion of this study suggests that consumers are often unable to measure the full extent of risk-taking directly. In the majority of cases, consumers are guided by numerous factors, some intrinsic, others extrinsic. E-tailers with an established reputation, a well-designed website with known brands and a balanced pricing strategy reduce the perceived performance risks associated with purchasing online, thus reducing the occurrence of ACS.
|
560 |
Compliance and impact of corporate governance best practice code on the financial performance of New Zealand listed companies : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Business Administration at Massey University, Auckland campus, New Zealand. Teh, Chor Tik, January 2009.
The corporate governance best practice code (Code) of the New Zealand Exchange (NZX) came into effect on October 29, 2003. However, so far there has been no systematic study of compliance with, and the impact of, the NZX Code on the performance of NZX companies. This study attempts to provide some answers to this perceived knowledge gap. The NZX Code recommends certain governance mechanisms to enhance corporate performance. The mechanisms analysed in this study are the percentage of independent directors, duality, the presence of board subcommittees (audit, remuneration, and nomination), and the performance evaluation of the board and individual directors. This thesis examines the possible relationship between the recommended governance structures and the performance of NZX companies for the years 2003 (pre-Code) and 2007 (post-Code), using data from the same 89 companies for each year. Although the number of companies adopting the NZX structures has increased, the rate of full compliance with the Code remains disappointingly low, rising from 5.6% in 2003 to just 22.5% in 2007. Probably due to the small sample size relative to the number of independent variables, and the problem of collinearity, the multiple linear regression results do not seem to be conclusive and may be unreliable as a basis for formal statistical inference. However, treating the 89 companies as the whole population (89 out of 90), and using a simpler and more descriptive statistical tool to analyse the impact of individual independent variables on firm performance, the 2007 results show a consistent pattern of a positive relationship between Code compliance and firm performance, all other factors being constant. This positive relationship is further reinforced by dividing the population into the various industry groupings classified by the NZX, which also shows a consistent pattern in 2007 of companies that comply fully with the Code structures financially outperforming companies that only partially comply. Surprisingly, the Chairman/CEO dual role does not appear to have impacted negatively on firm performance, contrary to agency theory expectations.
|