511

The quality of environmental management frameworks in South Africa / Marius Marais

Marais, Marius January 2010 (has links)
Environmental assessments and authorisations surrounding project-level developments are often made in isolation, without consideration of the regional or strategic context within which individual developments take place. This research investigates the quality of Environmental Management Frameworks (EMFs) as a strategic environmental instrument. The EMF is a uniquely South African instrument that was first conceptualised in 1989, enacted in 2006 and updated in 2010. EMFs were developed to map environmental sensitivity, to aid the screening out of undesired developments in sensitive environments and to minimise unnecessary project-level assessments in preferred development areas. EMFs form an important link between environmental assessment (EA) processes and planning strategies such as Spatial Development Frameworks (SDFs) and Integrated Development Plans (IDPs), due to their spatial output of environmental sensitivity maps and their ability to feed the strategic assessment processes required by SDFs. They have a legal mandate which ensures their assimilation and use. This research uses a multiple case study approach to review the quality of seven EMF documents. The quality aspects identified are the process, methodology and documentation components, using the printed EMF documentation as the primary information source. Quality review criteria were subsequently developed to investigate these inputs, using the legal mandate of EMFs as a basis. Each case was rated for compliance with the quality criteria using a six-level rating schedule. Further analyses were made by comparing the performance of cases against one another. Public participation emerged as the weakest component of EMF practice, while aspects of sensitivity analysis also performed worse than other aspects. More focus is required on aligning the scales and resolutions of map inputs, on mapping methods and on the general integration of spatial data, especially those of adjoining districts. The rationale for buffer determination also requires further substantiation and refinement. The practice of conducting EMFs is well established, and they can be valuable in sustainable development planning and decision-making. Recommendations are made to enhance the sustainability outcomes, and hence the effectiveness, of this instrument, as well as future research objectives for increasing its utility. / Thesis (M. Environmental Management)--North-West University, Potchefstroom Campus, 2011.
513

Performance evaluation of video streaming over multi-hop wireless local area networks

Li, Deer 10 July 2008 (has links)
Internet Protocol Television (IPTV) has become the application that drives the Internet to new heights. However, challenges remain in IPTV in-home distribution. The high-quality video streaming in IPTV services demands that home networks deliver video streaming packets with stringent Quality-of-Service (QoS) requirements. Currently, most service providers recommend Ethernet-based broadband home networks for IPTV. However, many existing houses are not wired with Ethernet cables, and the rewiring cost is prohibitively expensive. Therefore, wireless solutions are preferred if their performance can meet the requirements. IEEE 802.11 wireless local area networks (WLANs) are pervasively adopted in home networks for their flexibility and affordability. However, through our experiments in a real environment, we found that conventional single-hop infrastructure-mode WLANs have very limited capacity and coverage in a typical indoor environment due to high attenuation and interference. Single-hop wireless networks cannot support high-quality video streaming to the entire house. Multi-hop wireless networks are therefore used to extend the coverage. Contrary to the common belief that adding relay routers in the same wireless channel should reduce the throughput, our experiment, analysis and simulation results show that multi-hop IEEE 802.11 WLANs can improve both capacity and coverage in certain scenarios, and sufficiently support high-quality video streaming in a typical house. In this research, we analyzed and evaluated the performance of H.264-based video streaming over multi-hop wireless networks. Our analysis and simulation results reveal a wide spectrum of coverage-capacity tradeoffs of multi-hop wireless networks in generic scenarios. Moreover, we discuss methods to further improve video streaming performance. This research provides guidance on how to achieve the optimal balance for a given scenario, which is of great importance when deploying end-to-end IPTV services with QoS guarantees.
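The counterintuitive capacity claim is easy to see with a toy calculation. The sketch below is not from the thesis; all rates are assumed example values. It uses the standard observation that hops sharing one channel must take turns, so end-to-end throughput is roughly the harmonic combination of per-hop rates, while a single long, attenuated hop may be forced to a very low PHY rate.

```python
# Back-of-the-envelope model: in an 802.11 chain where all hops share
# one channel, only one hop transmits at a time, so the end-to-end
# throughput is roughly 1 / sum(1 / per_hop_rate).

def chain_throughput(hop_rates_mbps):
    """End-to-end throughput of a shared-channel relay chain (Mbps)."""
    return 1.0 / sum(1.0 / r for r in hop_rates_mbps)

# A long, highly attenuated single hop may fall back to a low PHY rate...
single_hop = chain_throughput([2.0])        # 2.0 Mbps end to end
# ...while two short relay hops can each sustain a high rate.
two_hops = chain_throughput([24.0, 24.0])   # 12.0 Mbps end to end

print(f"single hop: {single_hop:.1f} Mbps, two hops: {two_hops:.1f} Mbps")
```

Even though the relay halves the best-case rate, the multi-hop path wins whenever the single direct link degrades by more than that factor, which matches the coverage-capacity tradeoff the abstract describes.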
514

Performance Evaluation and Prediction of Parallel Applications

Markomanolis, Georgios 20 January 2014 (has links) (PDF)
Analyzing and understanding the performance behavior of parallel applications on various compute infrastructures is a long-standing concern in the High Performance Computing community. When the targeted execution environments are not available, simulation is a reasonable approach to obtain objective performance indicators and explore various "what-if?" scenarios. In this work we present a framework for the off-line simulation of MPI applications. The main originality of our work with regard to the literature is to rely on time-independent execution traces. This allows for extreme scalability, as heterogeneous and distributed resources can be used to acquire a trace. We propose a format where, for each event that occurs during the execution of an application, we log the volume of instructions for a computation phase or the bytes and the type of a communication. To acquire time-independent traces of the execution of MPI applications, we have to instrument them to log the required data. There exist many profiling tools which can instrument an application. We propose a scoring system that corresponds to our framework's specific requirements and evaluate the most well-known open-source profiling tools according to it. Furthermore we introduce an original tool called Minimal Instrumentation that was designed to fulfill the requirements of our framework. We study different instrumentation methods and we also investigate several acquisition strategies. We detail the tools that extract the time-independent traces from the instrumentation traces of some well-known profiling tools. Finally we evaluate the whole acquisition procedure and we present the acquisition of large-scale instances. We describe in detail the procedure to provide a realistic simulated platform file to our trace replay tool, taking into consideration the topology of the real platform and the calibration procedure with regard to the application that is going to be simulated. Moreover we present the implemented trace replay tools that we used during this work. We show that our simulator can predict the performance of some MPI benchmarks with less than 11% relative error between the real execution and the simulation for the cases where there is no performance issue. Finally, we identify the reasons for the performance issues and we propose solutions.
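To make the trace-format description concrete, here is a minimal sketch of what one such time-independent event record could look like: compute events carry a volume of instructions, communication events carry a type and a byte count. The field names and the example events are assumptions for illustration, not the thesis's actual format.

```python
# Hypothetical time-independent trace record: no timestamps, only
# platform-independent quantities (instruction counts, byte volumes).
from dataclasses import dataclass
from typing import Optional

@dataclass
class TraceEvent:
    rank: int                           # MPI process that produced the event
    kind: str                           # "compute" or a communication type, e.g. "send"
    instructions: Optional[int] = None  # volume of instructions (compute events)
    size_bytes: Optional[int] = None    # payload size (communication events)
    peer: Optional[int] = None          # partner rank (communication events)

trace = [
    TraceEvent(rank=0, kind="compute", instructions=1_500_000),
    TraceEvent(rank=0, kind="send", size_bytes=8192, peer=1),
    TraceEvent(rank=1, kind="recv", size_bytes=8192, peer=0),
]
```

Because nothing in such a record depends on the machine that produced it, the trace can be replayed against any simulated platform by converting instruction volumes and byte counts into times with that platform's calibrated rates.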
515

An Approximation Method For Performance Measurement In Base-stock Controlled Assembly Systems

Rodoplu, Umut 01 January 2004 (has links) (PDF)
The aim of this thesis is to develop a tractable method for approximating the steady-state behavior of continuous-review, base-stock controlled assembly systems with Poisson demand arrivals and manufacturing and assembly facilities modeled as Jackson networks. One class of systems studied produces a single type of finished product by assembling a number of components; another class produces two types of finished products, allowing component commonality. The performance measures evaluated are the expected backorders, the fill rate and the stockout probability for the finished product(s). A partially aggregated but exact model is approximated by assuming that the state-dependent transition rates arising from the partial aggregation are constant. This approximation leads to the derivation of a closed-form steady-state probability distribution, which is of product form. The adequacy of the proposed model in approximating the steady-state performance measures is tested against simulation experiments over a large range of parameters, and the approximation turns out to be quite accurate, with absolute errors of at most 10% for the fill rate and stockout probability, and of less than 1.37 (≈ 2) requests for the expected backorders. A greedy heuristic employing the approximate steady-state probabilities is devised to optimize base-stock levels while aiming at an overall service level for the finished product(s).
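For readers unfamiliar with the term, the "product-form" distribution referred to above is the classical Jackson-network factorization. As a generic reminder (not the thesis's specific derivation), for an open Jackson network of K stations with utilization below one at every station:

```latex
% Product-form steady-state distribution of an open Jackson network:
% the joint queue-length distribution factors into independent
% geometric marginals, one per station.
\[
  \pi(n_1, n_2, \ldots, n_K) \;=\; \prod_{i=1}^{K} (1 - \rho_i)\,\rho_i^{\,n_i},
  \qquad \rho_i = \frac{\lambda_i}{\mu_i} < 1 .
\]
```

It is this factorization that makes closed-form performance measures tractable: marginals can be evaluated per station instead of solving the full joint state space.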
516

On QoS Multicast Routing Protocols

Bereketli, Alper 01 September 2005 (has links) (PDF)
Multicasting is a technique for distributing data packets from one or more sources to a set of receivers on interconnected networks. Emerging network applications bring specific quality-of-service (QoS) requirements such as bounded delay, minimum bandwidth, and a maximum data loss rate. Providing the required quality of service involves both routing and resource reservation. In this study, a literature survey is carried out on traditional and QoS multicast routing protocols, and the need for QoS routing protocols is investigated. QoS multicast routing protocols are classified and compared according to their multicast tree construction and resource reservation approaches. Two QoS protocols, QROUTE and QMBF, are selected, and their performance is experimentally compared using the network simulation tool Network Simulator-2 (ns-2). The objective of the simulations is to compare the QoS routing algorithms and the efficiency of the trees they construct. The first contribution of the thesis is the survey and classification of traditional and QoS multicast routing protocols. Another contribution is the ns-2 implementation of two QoS multicast routing protocols. The final contribution of the thesis is the performance evaluation of the recent protocols from a different perspective.
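As a generic illustration of the constrained-routing idea such protocols build on (this is not QROUTE or QMBF, and not the thesis's code), one common pattern is to prune links that cannot satisfy a bandwidth bound and then run a least-delay shortest-path search over what remains; a multicast tree can be grown from such constrained paths. The graph, rates, and delays below are made-up example values.

```python
# Bandwidth-constrained least-delay path: prune, then Dijkstra.
import heapq

def qos_path(graph, src, dst, min_bw):
    """graph: {u: [(v, bandwidth_mbps, delay_ms), ...]}"""
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, bw, delay in graph.get(u, []):
            if bw < min_bw:   # prune links violating the bandwidth bound
                continue
            nd = d + delay
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    if dst not in dist:
        return None           # no path satisfies the QoS constraint
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    return [src] + path[::-1]

net = {"s": [("a", 10, 5), ("b", 2, 1)], "a": [("t", 10, 5)], "b": [("t", 10, 1)]}
print(qos_path(net, "s", "t", min_bw=5))  # ['s', 'a', 't']: the fast but narrow s-b link is pruned
```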
517

An Approximate Model For Kanban Controlled Assembly Systems

Topan, Engin 01 September 2005 (has links) (PDF)
In this thesis, an approximation is proposed to evaluate the steady-state performance of kanban-controlled assembly systems. The approximation is developed for systems with two components making up an assembly, and is then extended to systems with more than two components. A continuous-time Markov model is aggregated while keeping the model exact, and this aggregate model is approximated by replacing some state-dependent transition rates with constant rates. Decomposition of the approximate aggregate model into submodels guarantees a product-form steady-state distribution for each subsystem. Finally, the submodels are combined in such a way that the size of the problem becomes independent of the number of kanbans. This brings a computational advantage in solving the combined model using numerical matrix-geometric solution algorithms. Based on numerical comparisons with simulation, the exact model, an approximate aggregate model and another approximation from a previous study in the literature, the approximation is observed to be accurate relative to its computational burden, and it has the potential to be a building block for the analysis of systems that are more complex but closer to real-life applications.
518

Perceived performance risk and its influence on Abandoned Cart Syndrome (ACS) - An exploratory study

Moore, Simon Scott January 2004 (has links)
Despite predictions of Internet shopping reaching 6.9 trillion dollars by the end of 2004, research now suggests that many online consumers are still very reluctant to complete the online shopping process. A number of authors have attributed consumers' reluctance to purchase online to apparent barriers; however, such barriers have not been fully examined within a theoretical context. While most studies of consumers' decision to shop on the Internet have focussed on key shopping determinants, this thesis builds a conceptual model grounded in consumer behaviour theory. In particular, it explores the application of the perceived-risk theoretical framework, specifically looking at one dimension of perceived risk theory, performance risk, and the influence it has on the phenomenon of Internet Abandoned Cart Syndrome (ACS). To explore this phenomenon, a number of extrinsic cues are identified as playing a major role in the performance evaluation process of online purchases. The combination of these elements enabled the researcher to develop a conceptual model from which a series of propositions were drawn. To acquire pertinent data and investigate each proposition, this study used a combination of indirect and direct techniques: projective techniques in the form of a third-person vignette, a structured tick-box questionnaire and, finally, semi-structured interviews. The results suggest that, collectively, the extrinsic cues of brand, reputation, design and price have an overall impact on the performance evaluation process just prior to an online purchase. Varying these cues either positively or negatively had a strong impact on performance evaluation. The study concludes that consumers are often unable to measure the full extent of risk-taking directly. In the majority of cases, consumers are guided by numerous factors, some intrinsic, others extrinsic. E-tailers with an established reputation, a well-designed web site with known brands and a balanced pricing strategy reduce the perceived performance risks associated with purchasing online, thus reducing the occurrence of ACS.
519

Compliance and impact of corporate governance best practice code on the financial performance of New Zealand listed companies : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Business and Administration at Massey University, Auckland campus, New Zealand

Teh, Chor Tik January 2009 (has links)
The corporate governance best practice code (Code) of the New Zealand Exchange (NZX) came into effect on October 29, 2003. However, so far there has been no systematic study of compliance with, and the impact of, the NZX Code on the performance of NZX companies. This study attempts to provide some answers to this perceived knowledge gap. The NZX Code recommends certain governance mechanisms to enhance corporate performance. The mechanisms analysed in this study are the percentage of independent directors, duality, the presence of board subcommittees (audit, remuneration, and nomination), and the performance evaluation of the board and individual directors. This thesis examines the possible relationship between recommended governance structures and the performance of NZX companies for the years 2003 (pre-Code) and 2007 (post-Code), using data from the same 89 companies for each year. Although the number of companies adopting the NZX structures has increased, the rate of full compliance with the Code remains disappointingly low, rising from 5.6% in 2003 to just 22.5% in 2007. Probably due to the small sample size relative to the number of independent variables, and the problem of collinearity, the multiple linear regression results do not seem to be conclusive and may be unreliable as a basis for formal statistical inference. However, treating the 89 companies as the whole population (89 out of 90), and using a simpler, more descriptive statistical tool to analyse the impact of individual independent variables on firm performance, the 2007 results show a consistent pattern of a positive relationship between Code compliance and firm performance, all other factors being constant. This positive relationship is further reinforced by dividing the population into the various industry groupings classified by the NZX, which also yields a consistent pattern: companies that complied fully with the Code structures financially outperformed companies that only partially complied during 2007. Surprisingly, adherence to a dual Chairman/CEO role does not seem to have impacted negatively on firm performance, contrary to agency theory expectations.
520

Framework for identifying systemic environmental factors causing underperformance in business processes

Swanepoel, Leon D. 12 2013 (has links)
Thesis (MEng)--Stellenbosch University, 2013. / ENGLISH ABSTRACT: Performance management systems are integral to many organisations. On all levels of management, such performance measurements are used to drive a desired behaviour, and business units, departments and individuals are rewarded for meeting or exceeding set targets. In large silo-structured organisations, divisions are particularly focused on their own targets and responsibilities. This may result in a diminished view of the effect their strategies and processes have on overall stakeholder value. These divisions execute strategies to enhance the achievement of their own goals, and the execution of these strategies sometimes hampers other divisions in meeting theirs. The net effect of this hampering may be reduced stakeholder value. A mechanism is therefore needed through which organisational divisions can evaluate the systemic environment in order to identify hampering processes: their processes may be hampering other divisions, or their own processes may be the ones being hampered. The main objective of this research study was to develop such a mechanism. The mechanism emerged as a framework which can be used during investigations of hampering processes; such an investigation follows six predefined steps that guide the investigator in identifying the hampering factors. The framework was developed by combining primarily three disciplines: systems thinking, performance evaluation and supplier-perceived value. The evaluation framework was validated through three case studies, and in all of the cases it delivered the expected result. It is thus concluded that organisations can apply the framework to help identify systemic environmental factors that may hamper business processes. / AFRIKAANSE OPSOMMING: Performance management systems form an integral part of most organisations. Performance evaluation is employed at all levels of management to encourage the desired behaviour, and business units, departments and individuals are rewarded if they meet or exceed the set targets. In large silo-driven organisations, divisions are largely focused on their own responsibilities and on achieving their own goals. Consequently, the effect that the outcomes of their strategies and processes have on the organisation's stakeholders is sometimes diminished. These divisions thus execute strategies to achieve their own goals, and these strategies sometimes prevent other divisions from reaching theirs. The net effect of this can be that less value is passed on to stakeholders. An organisation therefore needs a mechanism with which to evaluate the systemic environment and so identify processes that are being hampered or that cause hampering. The main goal of this research study was to develop such a mechanism. The mechanism emerged as a framework that can be used during investigations to identify hampering. The framework was developed with primarily three disciplines in mind: the systems approach, performance evaluation, and the understanding of supplier value. The framework was tested against three case studies, and in all three cases it delivered the expected results. The conclusion is therefore that organisations can indeed apply the framework to evaluate the systemic environment and so point out the hampering of processes at the micro level.
