221

Inventory-Location Problems for Spare Parts with Time-Based Service Constraints

Wheatley, David Michael January 2014 (has links)
This thesis studies an inventory-location problem faced by a large manufacturer and supplier of small to medium-sized aircraft and their spare parts. The sale of aftermarket spare parts is a major source of revenue for the company, but it is a complex industry with many unique challenges. The original problem is a multi-echelon network design problem, which is decomposed into a facility location problem with consolidated shipping challenges, and a spare parts inventory problem. The facility location problem is solved a number of times under different scenarios to give the company's leadership team access to a wide range of feasible solutions. The model itself is an important contribution to industry, allowing the company to solve a spare parts network problem that will guide strategic decision-making for years. The chapter serves as a case study on how to accurately model a large and complicated service parts supply chain through the use of mathematical programming, part aggregation and scenarios. The company used the scenario results to redesign its spare parts distribution network, opening new hubs and consolidating existing service centres. The cost savings associated with this project are estimated to be $4.4 million USD annually. The proposed solution does increase the freight charges borne by the company's customers compared to the current network, but the operational savings are expected to more than outweigh the increase in customer shipping costs. The project team thus recommended that the company consider subsidizing customer freight costs to offset the expected increase, resulting in lower costs for both the company and its customers. This solution could set a new standard for aircraft spare parts suppliers to follow. Considered next is an integrated inventory-location problem with service requirements based on the first problem. 
Customer demand is Poisson distributed and the service levels are time-based, leading to highly nonlinear, stochastic service constraints and a nonlinear, mixed-integer optimization problem. Unlike previous works in the literature that propose approximations for the nonlinear constraints, this thesis presents an exact solution methodology using logic-based Benders decomposition. The problem is decomposed to separate the location decisions in the master problem from the inventory decisions in the subproblem. A new family of valid cuts is proposed and the algorithm is shown to converge to optimality. This is the first attempt to solve this type of problem exactly. This thesis then presents a new restrict-and-decompose scheme to further decompose the Benders master problem by part. The approach is tested on industry instances as well as random instances. The second algorithm is able to solve industry instances with up to 60 parts within two hours of computation time, while the maximum number of parts attempted in the literature is currently five. Finally, this thesis studies a second integrated inventory-location problem under different assumptions. While the previous model uses the backorder assumption for unfilled demand and a strict time window, the third model uses the lost-sales assumption and a soft time window for satisfying time-sensitive customer demand. The restrict-and-decompose scheme is applied with little modification, the main difference being the calculation of the Benders cut coefficients. The algorithm is again guaranteed to converge to optimality. Compared against previous work under the same assumptions, it delivers better solutions and certificates of optimality on a large set of test problems.
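As an illustration of the logic-based Benders idea described above, the sketch below separates location decisions (master) from base-stock inventory decisions (subproblem) on a tiny invented instance. The costs, Poisson demand rates and 95% service target are all hypothetical, the master is solved by enumeration rather than as a MIP, and the cuts are simple per-solution lower bounds, not the thesis's valid-cut family.

```python
import math
from itertools import combinations

# Hypothetical toy instance; all numbers are invented for illustration.
FIXED = {"A": 100.0, "B": 80.0, "C": 120.0}   # facility opening costs
DEMAND = {"A": 3.0, "B": 2.0, "C": 4.0}       # Poisson demand rate at each site
HOLD = 10.0                                    # holding cost per unit of base stock
FILL = 0.95                                    # service-level target

def poisson_cdf(k, lam):
    return sum(math.exp(-lam) * lam ** i / math.factorial(i) for i in range(k + 1))

def inventory_cost(open_set):
    """Subproblem: cheapest base stock meeting the service target at each open site."""
    total = 0.0
    for site in open_set:
        s = 0
        while poisson_cdf(s, DEMAND[site]) < FILL:
            s += 1
        total += HOLD * s
    return total

def solve():
    candidates = [frozenset(c) for r in range(1, len(FIXED) + 1)
                  for c in combinations(FIXED, r)]
    cuts = {}   # logic-based cuts: open set -> proven inventory cost
    while True:
        # Master: location cost plus the best known bound on inventory cost.
        best = min(candidates,
                   key=lambda s: sum(FIXED[i] for i in s) + cuts.get(s, 0.0))
        true_inv = inventory_cost(best)
        if cuts.get(best, 0.0) >= true_inv:   # bound is tight: optimal
            return best, sum(FIXED[i] for i in best) + true_inv
        cuts[best] = true_inv                 # otherwise add a cut and re-solve
```

Each iteration either proves the master's inventory bound tight for the chosen open set or strengthens it with the subproblem's true cost, so the loop converges after finitely many cuts.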
222

The management of operational value at risk in banks / Ja'nel Tobias Esterhuysen

Esterhuysen, Ja'nel Tobias January 2006 (has links)
The measurement of operational risk has surely been one of the biggest challenges for banks worldwide. Most banks have opted for a value-at-risk (VaR) approach, based on the success achieved with market risk, to measure and quantify operational risk. The problem banks have is not calculating the VaR figure itself, as there are numerous mathematical and statistical methods and models that can calculate VaR; rather, they struggle to understand and interpret the values that these models and methods produce. Senior management and ordinary staff do not always understand how these VaR values will impact their decision-making, and they do not always know how to incorporate these values in the day-to-day management of the bank. This study therefore aims to explain and discuss the calculation of VaR for operational risk as well as the factors that influence this figure, and then to discuss how this figure is managed and the impact it has on the management of a bank. The main goal of this study is thus to explain the management of VaR for operational risk in order to understand how it can be incorporated in the overall management of a bank. The methodology used includes a literature review, in-depth interviews and a case study on a South African Retail Bank to determine and evaluate some of the most renowned methods for calculating VaR for operational risk. The first objective of this study is to define operational risk and all its elements in order to distinguish it from the other risks the banking industry faces and to better understand its management. It is the view of this study that operational risk cannot be managed and measured unless it is clearly defined, and it is therefore important to have a clear and understandable definition of operational risk. 
The second objective is to establish an operational risk management process that will ensure a structured approach to the management of operational risk, by focusing on the different phases of operational risk. The process discussed by this study is a combination of some of the most frequently used processes in international banks, and is intended to guide the reader through the steps required for managing operational risk. The third objective of this study is to discuss and explain the qualitative factors that play a role in the management of operational risk, and to determine where these factors fit into the operational risk process and the role they play in calculating VaR for operational risk. These qualitative factors include, amongst others, key risk indicators (KRIs), risk and control self-assessments and the tracking of operational losses. The fourth objective is to identify and evaluate the quantitative factors that play a role in the management of operational risk, to distinguish them from the qualitative factors, and to determine where they fit into the operational risk management process and the role they play in calculating VaR for operational risk. Most of these quantitative factors are prescribed by the Basel Committee by means of its New Capital Accord, a framework that aims to measure operational risk in order to determine the amount of capital needed to safeguard a bank against operational risk. The fifth objective is to discuss and explain the calculation of VaR for operational risk by discussing all the elements of this calculation. This study mainly bases its discussion on the loss distribution approach (LDA), in which the frequency and severity of operational loss events are convolved by means of Monte Carlo simulation. This study uses real data obtained from a South African Retail Bank to illustrate this calculation on a practical level. 
The sixth and final objective of this study is to explain how VaR for operational risk is interpreted in order for management to deal with it and make proper management decisions based on it. The above-mentioned discussion is predominantly based on the two types of capital that are influenced by VaR for operational risk. / Thesis (Ph.D. (Risk Management))--North-West University, Potchefstroom Campus, 2007.
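The loss distribution approach described in the abstract, convolving an event-frequency distribution with a severity distribution by Monte Carlo simulation, can be sketched as follows. The lognormal severity choice and every parameter value are invented for illustration; they are not the bank's data or the study's calibration.

```python
import math
import random

def poisson_draw(rng, lam):
    """Draw an annual event count from a Poisson distribution (Knuth's method)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_annual_losses(lam, mu, sigma, n_sims=20000, seed=7):
    """Convolve Poisson frequency with lognormal severity by simulation."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        n_events = poisson_draw(rng, lam)              # loss events this year
        totals.append(sum(rng.lognormvariate(mu, sigma)
                          for _ in range(n_events)))   # aggregate annual loss
    return totals

def value_at_risk(losses, level=0.999):
    """Empirical quantile of the simulated aggregate-loss distribution."""
    return sorted(losses)[int(level * len(losses)) - 1]
```

`value_at_risk(losses, 0.999)` gives the high quantile typically used for operational risk capital; the gap between it and the mean of the simulated losses (the expected loss) approximates the unexpected loss that capital must cover.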
223

Microscopic evaluation of activated sludge from eleven wastewater treatment plants in Cape Town, South Africa / Pamela Welz

Welz, Pamela Jean January 2008 (has links)
From June to November 2007, a microscopic analysis was conducted on the activated sludge from eleven selected wastewater treatment plants (WWTPs) belonging to the City of Cape Town. The primary objective was the identification of the dominant and secondary filamentous organisms. Other important criteria included the floc character, diversity, filament index (FI) and identification of the protozoan and metazoan communities. The operational data determined from routine analyses of the sludge, influent and effluent were used to assess the relationship of the filamentous population to wastewater characteristics and to compare this with previous findings. FI values of >3 and dissolved sludge volume indices (DSVIs) of >150 were chosen as indicating the possibility of bulking conditions. The five most prevalent dominant filaments were Type 0092, Type 1851, actinomycetes, Microthrix parvicella and Type 021N, present in 74%, 31%, 22%, 17% and 14% of samples respectively. Type 0092 did not appear to be associated with bulking in any of the WWTPs, although it was often incidentally present as a co-dominant species when bulking conditions existed. All three WWTPs with the Modified Ludzack-Ettinger configuration harboured Type 1851 as the major dominant species, irrespective of whether the plants treated domestic or industrial effluent. Conditions suggestive of bulking were present in two of these WWTPs. Contrary to expectations, Type 1851 was often found as a dominant species where domestic waste was the primary influent. Type 021N and actinomycetes were strongly implicated when bulking occurred. The overgrowth of these filaments appeared to be related to factors such as nutrient deficiency (Type 021N) or the presence of large amounts of low molecular weight substances in the influent. Microthrix parvicella did not cause major bulking problems. 
There was a strong association between low levels of nitrates/nitrites in the clarifier supernatant and good phosphorus removal, irrespective of the configuration of the WWTP. The converse was also true. / Thesis (M. Environmental Science)--North-West University, Potchefstroom Campus, 2009.
224

Achieving operational efficiency within the local sphere of government / Manale Daniel Tsoai

Tsoai, Manale Daniel January 2008 (has links)
The study was done within the local government environment. It was conducted with the aim of assisting municipalities to achieve optimum levels of operational efficiency, following the realisation that all over the world there is an increasing need for organisations, including government organisations, to become efficient. One of the major challenges facing governments worldwide is the need to supply basic services to populations that are growing at a rate disproportionate to the resources available to sustain and improve their quality of life. Thus, to overcome these challenges, government needs to utilise its available resources cautiously in order to meet these escalating demands effectively. Ten practices were therefore presented in this study as key instruments capable of bringing efficiency to the manner in which local government operates and delivers services. The literature review found that, when these practices are deployed, they support the achievement of operational efficiency within the local sphere of government. The empirical study was conducted in Matjhabeng Local Municipality (MLM), located in the northern region of the Free State Province. This local municipality came into existence on 5 December 2000 after the amalgamation of six former transitional local councils into one financially viable and economically sustainable municipality. It incorporates the city of Welkom and the towns of Virginia, Odendaalsrus, Hennenman, Allanridge and Ventersburg, with an estimated population of more than 500 000 people. During the background review of the municipality, several challenges were encountered, including parts of the population lacking access to proper sanitation and electricity. 
However, in all the municipal challenges presented, it was argued that the solution lies in the effective management of municipal input in relation to its output, which means that the municipality has the huge task of managing its scarce resources efficiently in order to deliver on its mandate and meet the expectations of its residents. The study was conducted on a sample consisting of four senior managers and twenty line managers from the four departments within the municipality. A representative sampling method was employed to ensure that all aspects considered important for selecting a sample (such as race, gender and department) were included, and to obtain broad responses from the respondents selected to participate in the study. This sample was drawn from a population of nine senior managers and 35 line managers. The research was conducted with the express permission of the office of the accounting officer. The responses made by the participants were collected and analysed. Overall, the majority of participants responded negatively to most of the questions on the ten practices identified. Moreover, in terms of the devised model for measuring the efficiency of the municipality on three levels, the most desirable being Level 3, it was found that the municipality could be classified as a Level 1 organisation, meaning it is at an elementary phase in achieving efficiency in its operations. Lastly, recommendations were made based on the findings of the empirical research. / Thesis (M.B.A.)--North-West University, Vaal Triangle Campus, 2009.
228

A Linear Programming Framework for Models of Forest Management Strategy

Martin, Andrew B. 23 September 2013 (has links)
Results found in this thesis draw attention to limitations in the conventional approach to modelling forest management strategy, where models have insufficient spatial resolution and ignore industry. To address these limitations, a Model One linear programming framework was developed in which models can capture strategically relevant spatial resolution and include industry representation. In a case study on Nova Scotia's Crown Central Forest, models from this framework were compared with Woodstock™, a commercial modelling framework. When strategically relevant spatial resolution was modelled, these models found solutions in substantially less time than Woodstock. Of further interest, the framework's industry representation allows novel analysis to be performed. A comparison between a model that includes industry and a conventional model demonstrates that the conventional model schedules unprofitable stands for harvest. Models with industry representation are then used to demonstrate industry-based analysis, such as assessing the cost of a clearcut restriction policy and investigating the benefit of industrial expansion. Taken together, the results contained herein make an argument for modelling forest management strategy at strategically relevant spatial resolution, and for including industry representation in modelling.
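A minimal sketch of the Model One idea described above, where each stand is assigned one whole-horizon prescription, with an explicit industry margin in the objective. The stands, volumes, prices and even-flow rule are all invented, and the optimization here is brute-force enumeration over prescriptions rather than the thesis's linear programming formulation.

```python
from itertools import product

# Hypothetical Model One instance: each stand gets one whole-horizon
# prescription ("never", "early", "late"); entries are harvest volumes (m^3)
# in periods 1 and 2. All numbers are invented for illustration.
STANDS = {
    "S1": {"never": [0, 0], "early": [120, 0], "late": [0, 140]},
    "S2": {"never": [0, 0], "early": [90, 0],  "late": [0, 100]},
    "S3": {"never": [0, 0], "early": [60, 0],  "late": [0, 70]},
}
MILL_MARGIN = 25.0                                   # $ per m^3 delivered
HARVEST_COST = {"S1": 18.0, "S2": 20.0, "S3": 30.0}  # $ per m^3 harvested
MAX_FLOW_DROP = 0.25      # period-2 volume may drop at most 25% below period 1

def best_plan():
    """Brute-force search over prescriptions (a stand-in for the LP solve)."""
    names = list(STANDS)
    best, best_profit = None, float("-inf")
    for choice in product(*(STANDS[s] for s in names)):
        vol, profit = [0.0, 0.0], 0.0
        for s, rx in zip(names, choice):
            for t, v in enumerate(STANDS[s][rx]):
                vol[t] += v
                profit += v * (MILL_MARGIN - HARVEST_COST[s])
        if vol[0] > 0 and vol[1] < (1 - MAX_FLOW_DROP) * vol[0]:
            continue  # violates the even-flow constraint
        if profit > best_profit:
            best_profit, best = profit, dict(zip(names, choice))
    return best, best_profit
```

Because stand S3's harvest cost exceeds the mill margin, the industry-aware objective leaves it unharvested, whereas a volume-driven conventional model would schedule it at a loss, mirroring the comparison described in the abstract.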
229

Dynamic Operational Risk Assessment with Bayesian Network

Barua, Shubharthi 2012 August 1900 (has links)
Oil/gas and petrochemical plants are complicated and dynamic in nature. Dynamic characteristics include ageing of equipment/components, seasonal changes, stochastic processes, operator response times, inspection and testing time intervals, sequential dependencies of equipment/components and the timing of safety system operations, all of which are time-dependent criteria that can influence dynamic processes. Conventional risk assessment methodologies can quantify dynamic changes in processes only with limited capacity. It is therefore important to develop methods that can address time-dependent effects. The primary objective of this study is to propose a risk assessment methodology for dynamic systems. In this study, a new technique for dynamic operational risk assessment is developed based on Bayesian networks, a structure well suited to organizing cause-effect relations. A Bayesian network graphically describes the dependencies of variables, and a dynamic Bayesian network captures changes in variables over time. This study proposes to develop a dynamic fault tree for a chemical process system/sub-system and then to map it into a Bayesian network, so that the developed method can capture dynamic operational changes in a process due to the sequential dependency of one equipment/component on others. The developed Bayesian network is then extended to a dynamic Bayesian network to demonstrate dynamic operational risk assessment. A case study on a holdup tank problem is provided to illustrate the application of the method. A dryout scenario in the tank is quantified. It has been observed that the developed method is able to provide updated probabilities of different equipment/component failures over time, incorporating the sequential dependencies of event occurrence. Another objective of this study is to show the parallelism of Bayesian networks with other available risk assessment methods such as event trees, HAZOP and FMEA. 
In this research, a procedure for mapping an event tree into a Bayesian network is described. A case study on a chemical reactor system is provided to illustrate the mapping procedure and to identify factors that have a significant influence on event occurrence. This study therefore provides a method for dynamic operational risk assessment capable of producing updated probabilities of event occurrences, considering sequential dependencies over time, together with a model for mapping an event tree into a Bayesian network.
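A two-component sketch of mapping a fault-tree gate into Bayesian-network-style inference, with exponential component ageing supplying the time dependence. The valve/alarm structure and the failure rates are invented and far simpler than the thesis's holdup-tank model; the point is only the mechanics of enumerating root-node states.

```python
import math
from itertools import product

def fail_prob(rate, t):
    """Probability a component has failed by time t under exponential ageing."""
    return 1.0 - math.exp(-rate * t)

def p_dryout(t, valve_rate=0.01, alarm_rate=0.005):
    """Enumerate the joint states of the network's root nodes and sum the
    probability of the states in which the fault-tree AND gate fires."""
    p = 0.0
    for valve_failed, alarm_failed in product([True, False], repeat=2):
        pv = fail_prob(valve_rate, t) if valve_failed else 1 - fail_prob(valve_rate, t)
        pa = fail_prob(alarm_rate, t) if alarm_failed else 1 - fail_prob(alarm_rate, t)
        if valve_failed and alarm_failed:   # AND gate: dryout needs both failures
            p += pv * pa
    return p
```

Exact inference here is a plain enumeration over root-node states; replacing the constant rates with time-sliced conditional probability tables is, in essence, what a dynamic Bayesian network adds.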
230

Three new perspectives for testing stock market efficiency

Chandrashekar, Satyajit, January 1900 (has links) (PDF)
Thesis (Ph. D.)--University of Texas at Austin, 2006. / Vita. Includes bibliographical references.
