  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

A methodology for modelling Leontief input-output economic systems using IDEF0 and simulation

El Abani, Ali January 2000 (has links)
No description available.
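For context, the Leontief input-output system named in the title balances each sector's gross output x against intermediate demand Ax and final demand d; assuming I - A is invertible, the standard open model is

x = Ax + d  =>  x = (I - A)^{-1} d,

where A is the matrix of technical coefficients (input from sector i required per unit of output of sector j). Any IDEF0 or simulation representation of such an economy must, at equilibrium, satisfy this balance.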
2

On the design of hybrid simulation models, focussing on the agent-based system dynamics combination

Swinerd, C. January 2014 (has links)
There is a growing body of literature reporting the application of hybrid simulations to inform decision making. However, guidance for the design of such models, where the output depends upon more than one modelling paradigm, is limited. The benefits of realising this guidance include facilitating efficiencies in the general modelling process and a reduction in project risk (across measures of time, cost and quality). Focussing on the least well-researched modelling combination, that of agent-based simulation with system dynamics, a combination potentially suited to modelling complex adaptive systems, the research contribution presented here seeks to address this shortfall. Within a modelling process, conceptual modelling is linked to model specification via the design transition. Using standards for systems engineering to formally define this transition, a critical review of the published literature reveals that it is frequently documented. However, coverage is inconsistent and consequently it is difficult to draw general conclusions and establish best practice. Therefore, methods for extracting this information, whilst covering a diverse range of application domains, are investigated. A general framework is proposed to consistently represent the content of conceptual models, characterising the key elements of the content and the interfaces between them. Integrating this content in an architectural design, design classes are then defined. Building on this analysis, a decision process is introduced that can be used to determine the utility of these design classes. This research is benchmarked against reported design studies considering system dynamics and discrete-event simulation, and demonstrated in a case study where each design archetype is implemented. Finally, the potential for future research to extend this guidance to other modelling combinations is discussed.
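As a minimal illustration of the agent-based / system dynamics combination studied here (a generic sketch in Python, not the design framework proposed in the thesis), the fragment below couples a system-dynamics stock, advanced by an Euler step, with an agent population whose decisions depend on that stock and feed back into its flows; all class names and parameter values are invented for the example.

import random

class Consumer:
    """Agent whose adoption decision depends on the shared 'awareness' stock."""
    def __init__(self):
        self.adopted = False

    def step(self, awareness):
        # Higher awareness raises the probability of adopting in this step.
        if not self.adopted and random.random() < 0.05 * awareness:
            self.adopted = True

def run(steps=50, n_agents=200, dt=1.0):
    agents = [Consumer() for _ in range(n_agents)]
    awareness = 0.1                                   # system-dynamics stock
    for _ in range(steps):
        adopters = sum(a.adopted for a in agents)
        # SD update (Euler step): marketing inflow, decay outflow, and a
        # word-of-mouth term driven by the agent population (the coupling).
        d_awareness = 0.02 + 0.001 * adopters - 0.05 * awareness
        awareness = max(0.0, min(1.0, awareness + dt * d_awareness))
        # ABM update: each agent reacts to the current stock level.
        for agent in agents:
            agent.step(awareness)
    return awareness, sum(a.adopted for a in agents)

if __name__ == "__main__":
    print(run())

The point of the sketch is the interface: the stock enters each agent's decision rule, and the aggregate agent state enters the stock's rate equation, the two directions of influence that any such hybrid design has to specify.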
3

Emergent tendencies in multi-agent-based simulations using constraint-based methods to effect practical proofs over finite subsets of simulation outcomes

Teran Villegas, Oswaldo Ramon January 2001 (has links)
No description available.
4

Qualitative and semi-quantitative modelling and simulation of the software engineering processes

Zhang, He, Computer Science & Engineering, Faculty of Engineering, UNSW January 2008 (has links)
Software process modelling has been identified as a vehicle for understanding development processes, controlling development cost and duration, and achieving product quality. In recent years, software process simulation has become one of the essential techniques for effectively investigating and managing software development processes. To date, most research has focused on the quantitative aspects of process simulation and modelling. However, purely quantitative process modelling requires a very detailed understanding and accurate measurement of the software process, which relies on reliable and precise historical data. When such data are lacking or of dubious quality, quantitative models must impose severe constraints that restrict the model's value. Unfortunately, these data are not readily available in most cases, especially in organisations at low process maturity levels. In addition, software development is a highly complex, human-centred endeavour that involves many uncertain factors over the course of the development process. Although quantitative modelling employs statistical techniques to handle this inherent uncertainty and contingency, its conditional capability and underlying assumptions limit its performance on large-scale problems. As an alternative to quantitative approaches, qualitative modelling can cope with a lack of complete knowledge and predicts qualitative process behaviours. Semi-quantitative modelling, in turn, can handle process uncertainty with limited knowledge and achieves a trade-off between quantitative and qualitative approaches. However, most previous research has omitted these approaches, and the associated methods and applications are far from developed. The main contribution of this research lies in pioneering work on the models, methods, and applications of qualitative and semi-quantitative software process modelling and simulation, and on their relations with conventional quantitative modelling approaches. The novelty of this dissertation is twofold. Firstly, it explores methods and techniques to qualitatively and semi-quantitatively model and simulate software processes at different levels, i.e. the project, a portion of the development process, and product evolution. Secondly, some exclusive applications of these modelling approaches are developed for aspects of software engineering practice. Moreover, a proposed framework integrates these approaches with typical quantitative paradigms to guide the adoption of process simulation modelling in software organisations. As a comprehensive reflection of the state of the art of software process simulation modelling, a systematic review is also reported in this dissertation.
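As a toy illustration of the semi-quantitative idea (not an example taken from the dissertation), uncertain process parameters can be carried as intervals rather than point estimates, so that simulated outcomes come out as bounds rather than single numbers; the quantities and interval values below are invented.

def interval_mul(a, b):
    """Multiply two intervals (lo, hi), returning the tightest enclosing interval."""
    products = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(products), max(products))

# Invented bounds for a simple defect-flow fragment of a process model.
size_kloc = (40.0, 60.0)        # delivered size, thousand lines of code
defect_density = (2.0, 6.0)     # defects injected per KLOC
escape_fraction = (0.1, 0.4)    # fraction of injected defects missed by testing

injected = interval_mul(size_kloc, defect_density)
escaped = interval_mul(injected, escape_fraction)
print("defects injected: between %.0f and %.0f" % injected)
print("defects escaping to the field: between %.0f and %.0f" % escaped)

Even with no historical data beyond rough bounds, such a model yields a defensible range for the outcome, which is the kind of trade-off between purely qualitative and fully quantitative simulation that the dissertation explores.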
5

Guelph Flood Forecasting Model (GFLOOD): An Innovative Modelling Approach for Estimating Flows and Water Levels in Streams

Perdikaris, John 09 May 2013 (has links)
The analysis of large-scale watershed processes and the development of an efficient, integrated modelling platform are the focus of this research. The research focused on developing a series of modelling tools that can be used in the simulation of the overall response of a watershed based on a localized or distributed hydrologic event. This is achieved through the introduction of a hybrid modelling concept using a combination of empirically based lumped hydrologic processes and a physics-based distributed model representation. The watershed simulation model (GFLOOD) was developed to account for the complexity of the watershed, including the variations in climate, soils, topography, and land-use conditions across the watershed. GFLOOD stands for Guelph Flood Forecasting Model, a river basin or watershed scale flow prediction model. Two major modelling components of the GFLOOD model are the time parameters (time of concentration (Tc) and recession constant (K)) and the channel routing component. Each of these modelling components is evaluated separately. The equations developed in this study for estimating the time parameters can be used as initial estimates of Tc and K for ungauged basins, and through calibration and/or sensitivity analysis the values of Tc and K can be finalized. The Saint Venant equations for flood routing are solved by transforming the momentum equation into a partial differential equation with six parameters related to the cross-sectional area and discharge of the channel, left floodplain and right floodplain. The simplified dynamic model was further modified to account for transmission losses, evaporation losses and bank storage within the channel. The model was compared with the solutions of the general dynamic wave model, the diffusion wave model and the more complex dynamic wave model. The comparison shows that there is good agreement between the results of the simplified dynamic model and the other models; however, the simplified dynamic model is easier to formulate and compute than the other models. The complete GFLOOD model was applied to the Welland River Watershed within Southern Ontario. The model was evaluated for its ability to predict streamflow and water levels along the main branch of the Welland River.
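For reference, the Saint Venant equations mentioned above can be written, in one common conservation form (the thesis's six-parameter transformation of the momentum equation is not reproduced here), as

\frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = q_L,
\qquad
\frac{\partial Q}{\partial t} + \frac{\partial}{\partial x}\left(\frac{Q^2}{A}\right) + gA\,\frac{\partial y}{\partial x} = gA\,(S_0 - S_f),

where A is the flow cross-sectional area, Q the discharge, q_L the lateral inflow per unit length, y the flow depth, S_0 the bed slope and S_f the friction slope.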
6

A semi-empirical approach to modelling well deliverability in gas condensate reservoirs

Ugwu, Johnson Obunwa January 2011 (has links)
A critical issue in the development of gas condensate reservoirs is accurate prediction of well deliverability. In this investigation a procedure has been developed for accurate prediction of well production rates using a semi-empirical approach. State-of-the-art fine-grid numerical simulation is time-consuming and computationally demanding, and therefore not suitable for the rapid, real-time production management decisions required on site. Development of accurate fit-for-purpose correlations for fluid property prediction below the saturation pressure was a major consideration, to properly allow for retrograde condensation, the complications of multiphase flow, and mobility issues. Previous work is limited to the use of experimentally measured pressure-volume-temperature (PVT) property data, together with static relative permeability correlations, for simulation of well deliverability. To overcome these limitations, the fluid property correlations required for prediction of well deliverability, together with a dynamic three-phase relative permeability correlation, have been developed to enable forecasting of these properties at all desired reservoir conditions. The developed correlations include condensate hybrid compressibility factor, viscosity, density, compositional pseudo-pressure, and dynamic three-phase relative permeability. The study made use of published databases of experimentally measured gas condensate PVT properties and three-phase relative permeability data. The developed correlations have been implemented in both vertical and horizontal well models, and parametric studies have been performed to determine the critical parameters that control productivity in gas condensate reservoirs, using specific case studies. On validation, the improved correlations showed superior performance over existing correlations. The investigation has built on the relevant literature to present an approach that modifies the black oil model for accurate well deliverability prediction in condensate reservoirs at conditions normally ignored by the conventional approach. The original contribution to knowledge and practice includes (i) the improved property correlation equations (4.44, 4.47, 4.66, 4.69, 4.75 and 5.21) and (ii) the extension of gas rate equations to condensate rate prediction in both vertical and horizontal wells. Standard industry software, the Eclipse compositional model E-300, has been used to validate the procedure. The results show higher well performance compared with the industry standard. The new procedure is able to model well deliverability with limited PVT and rock property data, which is not possible with most available methods. It also makes possible the evaluation of various enhanced hydrocarbon recovery techniques and the optimisation of gas condensate recovery.
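For orientation, the conventional single-phase real-gas pseudo-pressure and the backpressure deliverability relation that the compositional treatment above generalises are (standard textbook forms, not the thesis's equations 4.44-5.21)

m(p) = 2\int_{p_{\mathrm{ref}}}^{p} \frac{p'}{\mu(p')\,Z(p')}\,\mathrm{d}p',
\qquad
q_g = C\,\bigl[m(\bar{p}_R) - m(p_{wf})\bigr]^{\,n},

where \mu is the gas viscosity, Z the compressibility factor, \bar{p}_R the average reservoir pressure, p_{wf} the flowing bottom-hole pressure, and C and n are empirically fitted deliverability constants.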
7

Novel modelling and simulation approaches to support electronics manufacturing in the UK

Huertas-Quintero, Lina A. M. January 2010 (has links)
High-value-added products are the only segment of the electronics sector in which the UK is likely to remain competitive and where manufacturing will be retained within the country. Even though UK companies have a competitive advantage in this market, they also face a range of new challenges, including demanding customer requirements, constantly changing conditions, and highly complex products and technologies. Consequently, effective product and process (re-)design that encourages continuous improvement and innovation to satisfy highly demanding customers has become vital. Additionally, support is required to undertake design in an agile manner while managing complexity at the same time. The research described in this thesis addresses this problem by developing a software tool (i.e. INMOST - INtegrated MOdelling and Simulation Tool) that supports agile design. This support is provided through modelling, simulation and root cause analysis (i.e. the functional modules within the tool). The functionality of the software is enabled by two novel concepts. The first is an integrated modelling framework that combines different modelling techniques in a single structure to enable more complete and realistic models. The second is a Hierarchical Object Oriented Simulation Structure (HOOSS) that unifies generalisation and customisation ideas to facilitate the utilisation of INMOST in an industrial context. The functionality of INMOST was tested with three case studies. The case studies prove the capability of the software to be easily adopted in an industrial context, to provide predictive feedback that identifies potential problems, and to complete the design cycle by providing decision support to solve the identified problems. In this way, the compliance of the software with the domain requirements and needs is demonstrated. The research is completed by providing recommendations for the adoption of INMOST in industry and by establishing clear directions for future work.
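Purely as a generic illustration of a hierarchical object-oriented simulation structure (the actual HOOSS design is not reproduced here), composite process objects can delegate each simulation step to their children, so that a line model is assembled from reusable station objects; all class and station names below are invented.

class SimObject:
    """Generic node in a simulation hierarchy; composites delegate to children."""
    def __init__(self, name):
        self.name = name
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def step(self, t):
        # Advance this node, then every sub-process beneath it.
        self.update(t)
        for child in self.children:
            child.step(t)

    def update(self, t):
        pass  # leaf classes override this with process-specific behaviour

class SolderPasteStation(SimObject):
    def update(self, t):
        print(f"t={t}: {self.name} prints solder paste")

class ReflowOven(SimObject):
    def update(self, t):
        print(f"t={t}: {self.name} reflows the board")

line = SimObject("assembly line")
line.add(SolderPasteStation("printer 1"))
line.add(ReflowOven("oven 1"))
for t in range(3):
    line.step(t)

Generalisation lives in the base class and customisation in the leaves, which is the spirit of the generalisation/customisation split described above.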
8

Modelling of Ore Floatability in Industrial Flotation Circuits

Kym Runge Unknown Date (has links)
Ore floatability is defined as the propensity of particles to float within a flotation environment, and any effective mathematical model of the flotation process must incorporate its effect. The objective of this thesis was to review the ore floatability models in the literature and, for those considered appropriate:
• investigate their basic assumptions,
• evaluate the type of experimental data required to derive model parameters, and
• test their predictive capability.
A review of the literature identified three different methods of representing ore floatability in flotation circuit models. Two approaches were studied within this thesis, namely the empirically derived floatability component model and the property-based floatability component model. The third approach, shaped distribution modelling, was considered too inflexible a methodology to represent all types of ore floatability distributions. Ore floatability cannot be directly measured but must be inferred from a flotation response. In this thesis, it was investigated using batch laboratory flotation tests which, for a particular system, were all performed using the same set of operating conditions. Any difference in response between tests performed in this way was attributed to a change in ore floatability. Within this thesis, hundreds of batch laboratory flotation tests were performed using samples collected from the streams of seven different industrial flotation circuits. These tests, in combination with circuit survey data, were used to investigate various assumptions underpinning the ore floatability models. The tests also provided the experimental information required to derive the flotation properties of the two chosen ore floatability models. Both models investigated in this thesis assume that the floatability of a particle remains the same before and after processes in the circuit. A nodal analysis technique was developed by the author to compare the floatability in the feed and product of a flotation circuit process using batch laboratory flotation test information. This technique was used to show that ore floatability is a conserved property across most flotation, mixing and cycloning processes. In the cases where floatability was not conserved, it is suspected that the surfaces of the particles had changed due to oxidation, adsorption of hydrophilic species or decomposition of reagent surface species. Ore floatability was usually not conserved across processes which are designed to change particle properties (e.g. reagent addition and regrinding). An additional requirement of the ore floatability models is that all particles assigned to a particular component must float with a similar flotation rate. Sizing and liberation analysis of batch laboratory flotation test data showed that property-based floatability component models based on size and liberation alone do not satisfy this criterion. It was concluded that a property-based model would need to include information about the chemical state of the particle surfaces. As this type of measurement was considered beyond the scope of this thesis, no further analysis of this type of model was possible. Methods for determining the parameters of the empirically derived floatability component model were studied using statistical techniques. This analysis showed that unique, stable parameters could be obtained by fitting the models to match multiple batch flotation test data collected at the same time as a circuit survey.
It was found that a minimum of four batch laboratory flotation tests were required to derive statistically stable parameters. The use of one laboratory batch flotation test (the traditional method of parameter derivation) results in parameters which are highly sensitive to the error in the batch laboratory flotation test data. A methodology to simulate and predict ore grade and recovery in a flotation circuit based on different ore floatability particle groupings was developed by the author. A theoretical analysis performed using this algorithm showed that a model with two floating components and one non-floating component produced predictions similar to those of a multi-component model developed using size and liberation information, in a circuit subject to changes in cell operation, residence time and circuit configuration. It was therefore concluded that a discrete floatability component model has the ability to represent what is, in reality, a more complex particle floatability distribution. The analysis performed in this thesis shows that the empirically derived ore floatability component model is a valid method of representing ore floatability within a flotation circuit model which does not contain grinding or reagent addition processes. Parameters of the model can be derived with statistical confidence using multiple batch flotation test data. To effectively model ore floatability in circuits containing regrinding or staged reagent addition, ore floatability models need to be developed which incorporate parameters related to the physical properties of the ore. It is therefore recommended that research be performed to determine the effect of size, liberation and chemical conditioning on the ore floatability of a particle, and how these effects are best incorporated into an ore floatability model.
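For reference, a common form of the empirically derived floatability component model fitted to batch flotation test data, with two floating components and a non-floating remainder as discussed above, expresses cumulative recovery as

R(t) = m_{\mathrm{fast}}\bigl(1 - e^{-k_{\mathrm{fast}} t}\bigr) + m_{\mathrm{slow}}\bigl(1 - e^{-k_{\mathrm{slow}} t}\bigr),
\qquad
m_{\mathrm{fast}} + m_{\mathrm{slow}} + m_{\mathrm{nonfloat}} = 1,

where the m_i are the mass fractions assigned to each component and the k_i their first-order flotation rate constants; the exact parameterisation used in the thesis may differ.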
9

Adapting Sensing and Transmission Times to Improve Secondary User Throughput in Cognitive Radio Ad Hoc Networks

January 2012 (has links)
Cognitive Radios (CR) are designed to dynamically reconfigure their transmission and/or reception parameters to utilize the bandwidth efficiently. With a rapidly fluctuating radio environment, spectrum management becomes crucial for cognitive radios. In a Cognitive Radio Ad Hoc Network (CRAHN) setting, the sensing and transmission times of the cognitive radio play a more important role because of the decentralized nature of the network: they have a direct impact on throughput. Due to the tradeoff between throughput and sensing time, finding optimal values for the sensing time and transmission time is difficult. In this thesis, a method is proposed to improve the throughput of a CRAHN by dynamically changing the sensing and transmission times. To simulate the CRAHN setting, ns-2, the network simulator, is used with an extension for CRAHNs. The CRAHN extension module implements the required Primary User (PU), Secondary User (SU) and other CR functionalities to simulate a realistic CRAHN scenario. First, this work presents a detailed analysis of various CR parameters, their interactions, and their individual contributions to throughput, in order to understand how they affect transmissions in the network. Based on the results of this analysis, changes to the system model in the CRAHN extension are proposed. The instantaneous throughput of the network is introduced in the new model, which helps to determine how the parameters should adapt based on the current throughput. Along with instantaneous throughput, checks are made for interference with the PUs and their transmission power before these CR parameters are modified. Simulation results demonstrate that the throughput of the CRAHN with adaptive sensing and transmission times is significantly higher than that with non-adaptive parameters. / Dissertation/Thesis / M.S. Computer Science 2012
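As a rough sketch of the sensing/transmission-time trade-off described above (an illustration only, not the thesis's ns-2 model), secondary-user throughput can be taken as the fraction of each frame left for data transmission, discounted by the probability that sensing raises a false alarm; the energy-detection approximation and every parameter value below are assumptions of the example.

import math
import statistics

def q(x):
    # Gaussian tail probability Q(x)
    return 0.5 * math.erfc(x / math.sqrt(2))

QINV = statistics.NormalDist().inv_cdf  # inverse of the standard normal CDF

def su_throughput(tau, frame=0.1, fs=6e6, snr=0.01, p_d_target=0.9,
                  p_idle=0.8, capacity=6.6):
    """Average SU throughput (bit/s/Hz) when tau seconds of each frame are sensed."""
    # Energy-detection false-alarm probability at the threshold that just meets
    # the target detection probability (Gaussian approximation; placeholder values).
    p_fa = q(math.sqrt(2 * snr + 1) * QINV(1 - p_d_target)
             + math.sqrt(tau * fs) * snr)
    # Data is sent only in the remaining (frame - tau) seconds, and only when the
    # primary user is idle and sensing raised no false alarm.
    return (frame - tau) / frame * p_idle * (1 - p_fa) * capacity

best = max((su_throughput(t / 1000.0), t / 1000.0) for t in range(1, 100))
print("throughput peaks at %.2f bit/s/Hz for a sensing time of %.3f s" % best)

Sensing longer drives the false-alarm probability down but shrinks the transmission window, so throughput is maximised at an interior sensing time; that interior optimum is the kind of operating point the adaptive scheme described above is chasing.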
