21

Agent-based approaches to pedestrian modelling

Ronald, Nicole Amy Unknown Date (has links) (PDF)
This thesis investigates the early stages of the software development process for agent-based models of pedestrian behaviour. Planning for pedestrians is becoming more important as planners and engineers become more aware of the sustainability and environmental aspects of transport and infrastructure. It is also necessary for the planning and management of pedestrian areas and events. Pedestrian behaviour is more difficult to model than other transport modes, as it is less constrained and operates at a finer scale. Many approaches have been developed for modelling pedestrian behaviour. The simplest involve a single mathematical equation that uses the size and attractiveness of an area to calculate its maximum capacity. More complicated mathematical models involving differential equations have also been used. Agent-based modelling is a recent development in modelling and simulation. These simulations contain agents who interact with each other and with the environment in which they are situated. Their similarity to human societies has led to their use for many social applications. Many modellers are unsure of what agents are and how to develop models using them. In some cases, agents may be useful; in other cases, the model outputs and realism may not offset the learning curve, development time, and increased complexity of an agent-based model. (For complete abstract open document)
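The kind of agent-based simulation the abstract contrasts with equation-based capacity models can be illustrated with a minimal sketch. Everything below — the goal-seeking rule, the repulsion term, and all parameter values — is an illustrative assumption, not the model developed in the thesis:

```python
import numpy as np

# Minimal agent-based pedestrian sketch: each agent walks toward a shared
# goal and is repelled by agents within 1 m. All parameters are invented.
rng = np.random.default_rng(0)
n_agents, steps, dt = 50, 200, 0.1
pos = rng.uniform(0, 10, size=(n_agents, 2))   # starting positions (m)
goal = np.array([10.0, 5.0])                   # shared exit point
v_max, repulse = 1.4, 0.5                      # walking speed (m/s), repulsion gain

for _ in range(steps):
    to_goal = goal - pos
    desired = v_max * to_goal / np.linalg.norm(to_goal, axis=1, keepdims=True)
    diff = pos[:, None, :] - pos[None, :, :]   # pairwise separation vectors
    dist = np.linalg.norm(diff, axis=2) + np.eye(n_agents)  # avoid self-division
    push = np.where(dist[..., None] < 1.0, diff / dist[..., None] ** 2, 0.0).sum(axis=1)
    pos += dt * (desired + repulse * push)     # explicit Euler update
```

Even this toy version shows the trade-off the abstract describes: emergent crowding behaviour comes at the cost of more code and more parameters than a one-line capacity equation.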
22

Issues of scale in individual-based models : applications in fungal and plant community dynamics

Bown, James Louis January 2000 (has links)
The central question addressed in this thesis is whether descriptions of the dynamics of ecological systems at one scale may be effectively used as descriptions of the dynamics of ecological systems at larger scales. This question is addressed in the context of the dynamics of fungal communities. A simple experimental system and complementary theoretical approach, in the form of an individual-based (cellular automaton) model, is presented. Experimental results derived from small-scale systems are used to quantify parameters of the model; results from large-scale experimental systems serve to test the model. The theoretical analyses clearly demonstrate that the dynamics observed are a result of both local and non-local features of the experimental system. In cases such as this, the immediate extrapolation of results derived from experiments conducted out of the context of the community to represent system-scale behaviour is not possible. In response to this observation, a generic framework is developed to allow the consideration of effects at a range of scales through contextual parameterisation of localised dynamics. The framework is directed toward plant systems, where a large body of experimental data exists, and may be parameterised by that experimental data. It represents the essential features of individual interactions in terms of competition for space and resource, and the behaviour of a given plant is described in terms of functional traits. Model runs demonstrate complex community patterns suggestive of a known biological phenomenon, succession, which arises as a consequence of the coupling between the community and environment. This coupling may allow the long-term coexistence of species through some particular balance in individual function (traits) across the community. A search mechanism is developed to allow combinations of trait values at the scale of the individual to be assessed for a particular community-scale phenomenon. Initial results demonstrate that this mechanism may identify and converge on combinations of trait values that give rise to, in this case, a simple measure of diversity. The manner in which the generic framework developed may be applied to further the investigation into fungal community dynamics is addressed.
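The individual-based (cellular automaton) approach the abstract describes can be sketched as a grid of cells updated by local rules. The two-species colonisation rule and its rates below are invented for illustration; they are not the experimentally parameterised model of the thesis:

```python
import numpy as np

# Toy cellular-automaton sketch of two species competing for space.
# 0 = empty, 1/2 = species; an empty cell is colonised with probability
# proportional to the number of neighbouring cells each species occupies.
rng = np.random.default_rng(1)
grid = rng.choice([0, 1, 2], size=(64, 64), p=[0.9, 0.05, 0.05])

def neighbour_count(g, species):
    # count 4-neighbours of a given species, with non-periodic edges
    padded = np.pad((g == species).astype(int), 1)
    return (padded[:-2, 1:-1] + padded[2:, 1:-1] +
            padded[1:-1, :-2] + padded[1:-1, 2:])

for _ in range(100):
    n1, n2 = neighbour_count(grid, 1), neighbour_count(grid, 2)
    p1, p2 = 0.1 * n1, 0.1 * n2          # local colonisation pressure (invented rate)
    r = rng.random(grid.shape)
    empty = grid == 0
    grid[empty & (r < p1)] = 1
    grid[empty & (r >= p1) & (r < p1 + p2)] = 2
```

The scale question the thesis poses maps onto this sketch directly: rates measured for an isolated cell need not reproduce the community-scale patterns that emerge once neighbours interact.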
23

Proper orthogonal decomposition with interpolation-based real-time modelling of the heart

Rama, Ritesh Rao January 2017 (has links)
Several studies have been carried out recently with the aim of achieving cardiac modelling of the whole heart for a full heartbeat. However, within the context of the Galerkin method, those simulations require high computational demand, ranging from 16 to 200 CPUs, and long calculation times, lasting from 1 h to 50 h. To solve this problem, this research proposes to make use of a Reduced Order Method (ROM) called the Proper Orthogonal Decomposition with Interpolation method (PODI) to achieve real-time modelling with an adequate level of solution accuracy. The idea behind this method is to first construct a database of pre-computed full-scale solutions using the Element-free Galerkin method (EFG) and then project a selected subset of these solutions to a low-dimensional space. Using the Moving Least Square method (MLS), an interpolation is carried out for the problem at hand, before the resulting coefficients are projected back to the original high-dimensional solution space. The aim of this project is to tackle real-time modelling of a patient-specific heart for a full heartbeat in different stages, namely: modelling (i) the diastolic filling with variations of material properties, (ii) the isovolumetric contraction (IVC), ejection and isovolumetric relaxation (IVR) with arbitrary time evolutions, and (iii) variations in heart anatomy. For the diastolic filling, computations are carried out on a bi-ventricle model (BV) to investigate the performance and accuracy of varying the material parameters. The PODI calculations of the LV are completed within 14 s on a normal desktop machine with a relative L₂-error norm of 6×10⁻³. These calculations are about 2050 times faster than EFG, with each displacement step generated at a calculation frequency of 1074 Hz. An error sensitivity analysis is subsequently carried out to find the most sensitive parameter and the optimum dataset to be selected for the PODI calculation. In the second phase of the research, a so-called "time standardisation scheme" is adopted to model a full heartbeat cycle. This is due to the simulation of the IVC, ejection, and IVR phases being carried out using a displacement-driven calculation method which does not use uniform simulation steps across datasets. Generated results are accurate, with the PODI calculations being 2200 times faster than EFG. The PODI method is, in the third phase of this work, extended to deal with arbitrary heart meshes by developing a method called "Degrees of freedom standardisation" (DOFS). DOFS consists of using a template mesh over which all dataset result fields are projected. Once the result fields are standardised, they are used for the PODI calculation, before the PODI solution is projected back to the mesh of the problem at hand. The first template mesh to be considered is a cube mesh. However, it is found to produce results with high errors and non-physical behaviour. The second template mesh used is a heart template. In this case, a preprocessing step is required where a non-rigid transformation based on the coherent point drift method is used to transform all dataset hearts onto the heart template. The heart template approach generates a PODI solution of higher accuracy at a relatively low computational time. Following these encouraging results, a final investigation is carried out where the PODI method is coupled with a computationally expensive gradient-based optimisation method, the Levenberg-Marquardt method (PODI-LVM). It is then compared against the full-scale approach in which EFG is used with the Levenberg-Marquardt method (EFG-LVM). In this case, the PODI-LVM simulations are 1025 times faster than EFG-LVM, while the error is less than 1%. It is also observed that, since the PODI database is built using EFG simulations, PODI-LVM behaves similarly to EFG-LVM.
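The PODI pipeline described above — build a reduced basis from precomputed snapshots, interpolate the reduced coefficients at a new parameter value with MLS, and project back — can be sketched in a few lines. The snapshot data, the scalar parameter, and the Gaussian MLS weight are illustrative assumptions standing in for the thesis's EFG database and material parameters:

```python
import numpy as np

# PODI sketch: reduced basis from snapshots + moving-least-squares (MLS)
# interpolation of the reduced coefficients over a scalar parameter mu.
mus = np.linspace(1.0, 2.0, 8)                     # dataset parameter values
snapshots = np.stack([np.sin(m * np.linspace(0, 3, 500)) for m in mus], axis=1)

# POD: left singular vectors of the snapshot matrix form the basis
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
k = 4                                              # retained modes
basis = U[:, :k]                                   # (500, k)
coeffs = basis.T @ snapshots                       # (k, 8) reduced coordinates

def mls_interpolate(mu_new, h=0.3):
    # linear MLS in 1D: weighted least-squares fit a + b*mu centred at mu_new
    w = np.exp(-((mus - mu_new) / h) ** 2)         # Gaussian weights
    A = np.stack([np.ones_like(mus), mus], axis=1) # (8, 2) linear basis
    W = np.diag(w)
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ coeffs.T)  # (2, k)
    return np.array([1.0, mu_new]) @ beta          # interpolated coefficients (k,)

u_new = basis @ mls_interpolate(1.37)              # back-projected full-field solution
```

The speed-ups the abstract reports come from exactly this structure: once the expensive snapshots exist, a new solution costs one small weighted least-squares solve and one matrix-vector product.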
24

Comparative metabolic modeling and analysis of human pathogens

Abdel-Haleem, Alyaa M. 08 1900 (has links)
Infectious diseases continue to be major health concerns worldwide. Although major advances have led to the accumulation of genomic data about human pathogens, there clearly exists a gap between genome information and studies aiming at identifying potential drug targets. Here, constraint-based modeling (CBM) was deployed to integrate disparate data types with genome-scale metabolic models (GEMs) to advance our understanding of the pathogenesis of infectious agents with respect to identifying and prioritizing drug targets. Specifically, genome-scale metabolic modeling of multiple stages and species of Plasmodium, the causative agent of malaria, was used to prioritize potential drug targets that could be used both to treat the disease (anti-malarials) and to block transmission of the parasite. In addition, species-specific metabolic models were used to guide translation of findings from non-human experimental disease models to human-infecting species. Further, comparative analysis of the essentiality of metabolic genes for the growth and survival of V. cholerae, the causative agent of cholera, in single and co-infections with other enteric pathogens led to prioritizing conditionally independent essential genes that would be potential drug targets in both single and co-infection scenarios. Taken together, our findings highlight the utility of using genome-scale metabolic models to prioritize druggable targets that would be of broader spectrum against human pathogens.
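The essentiality screening that constraint-based modeling enables boils down to repeatedly solving a flux-balance linear program — maximise growth subject to steady-state stoichiometry and flux bounds — with one gene's reactions knocked out at a time. The four-reaction network below is a toy stand-in for a genome-scale model, not any of the pathogen GEMs analysed in the thesis:

```python
import numpy as np
from scipy.optimize import linprog

# Flux-balance analysis sketch: maximise biomass flux subject to S v = 0
# and flux bounds, then knock each reaction out in turn and flag it
# essential if growth collapses. Toy network: r0 imports A; r1: A->B;
# r2: A->C; r3 (biomass): B + C -> growth.
S = np.array([[ 1, -1, -1,  0],    # metabolite A
              [ 0,  1,  0, -1],    # metabolite B
              [ 0,  0,  1, -1]])   # metabolite C
bounds = [(0, 10), (0, 10), (0, 10), (0, 10)]
biomass = 3                         # index of the biomass reaction

def max_growth(knockout=None):
    b = list(bounds)
    if knockout is not None:
        b[knockout] = (0, 0)        # force the knocked-out reaction's flux to zero
    c = np.zeros(S.shape[1]); c[biomass] = -1.0   # linprog minimises, so negate
    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=b)
    return -res.fun

wild_type = max_growth()
for rxn in range(S.shape[1]):
    if rxn == biomass:
        continue                    # skip the biomass pseudo-reaction itself
    if max_growth(knockout=rxn) < 0.01 * wild_type:
        print(f"reaction {rxn} is essential")
```

Conditionally essential genes fall out of the same loop by changing the bounds (the simulated growth medium or co-infection context) and comparing which knockouts abolish growth in every condition.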
25

Extended modelling methodology to facilitate integrated decision making in manufacturing enterprises

Vacharaphol, Oratai January 2013 (has links)
This research has highlighted the importance of the multiple roles in design and change of organisations and the benefits of having quantitative and qualitative tools to support decision making. Hence the aim of this research is a model-driven approach to support integrated decision making in manufacturing enterprises (MEs). The author has identified a gap: the lack of a systematic way to model MEs to facilitate integrated decision making. An Initial Modelling Methodology (IMM) was established based on the work of the Manufacturing System Integration (MSI) group at Loughborough University, so that improvements to this methodology could be investigated to facilitate integrated decision making. Artwork, an industrial furniture manufacturer based in Loughborough, is used as the company case study of this research. The IMM was tested in exploratory research case 1 at a low level of the production system at Artwork; the experiments of case 1 studied the impact of product volume and variety on a specific process section at that level. The results showed that the IMM lacks the ability to facilitate integrated decision making and that it could be developed further to achieve the research aims. This led to three additional concepts: (1) modelling at different levels of abstraction to realise the benefit of multiple levels of modelling, (2) deploying the improved views of W, P and R sub-systems to assist in exercising simulation modelling, and (3) identifying possible users in planning, with the scope and focus of their decision making. These three concepts were added to the IMM to form the Extended Modelling Methodology (EMM). The EMM was tested in exploratory research case 2 at a mid-level of the production system at Artwork with four types of experiment: 1) balancing resources of the production system, 2) demand change, 3) rework impact and 4) delay impact. The evaluation of results showed that the EMM provides a systematic way to facilitate decision making, both individually and collectively. The overall research contribution is therefore a new model-driven approach to support conceptual design and change of manufacturing systems with respect to integrated decision making. However, this research is limited by the availability of data, the range and detail of the case studies, and the range of modelling techniques explored. Future work therefore includes utilisation of the EMM in other domains, consideration of other reference models, investigation of the EMM in other company case studies, establishment of a comprehensive database for the EMM, and development of coherent simulation models. In addition, this thesis presents ongoing research on developing and testing the EMM in another company case study, an aircraft engine manufacturer.
26

Multilevel regression modelling of melanoma incidence

Brown, Antony Clark January 2007 (has links)
This thesis is concerned with developing and implementing a method for modelling and projecting cancer incidence data. The burden of cancer is an increasing problem for society, and the ability to analyse and predict trends in large-scale populations is therefore vital. These predictions, based on incidence and mortality data collected by cancer registries, can be used to estimate current and future rates, which is helpful for public health planning. A large body of work already exists on the use of various modelling strategies, methods and fitting techniques. A multilevel method of preparing the data is proposed, fitted to historical data using regression modelling, to predict future rates of incidence for a given population. The proposed model starts with a model for the total incidence of the population, with each successive level stratifying the data into progressively more specific groupings based on age. Each grouping is partitioned into subgroups, and each subgroup is expressed as a proportion of its parent group. Models are fitted to each of the proportional age groups, and a combination of these models produces a model that predicts incidence for a specific age. A simple, efficient implementation of the modelling procedure is described, including key algorithms and measures of performance. The method is applied to data from populations that have very different melanoma incidence (the USA and Australia). The proportional structure reveals that the proportional age trends present in both populations are remarkably similar, indicating that there are links between causative factors in both populations. The method is applied fully to data from a variety of populations and compared with results from existing models. The method is shown to produce results that are reliable and stable, and generally significantly more accurate than those of other models.
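The layered proportional structure the abstract describes can be sketched directly: fit a trend to total incidence, fit separate trends to each age group's share of its parent total, and multiply the projections. The data and the linear trend choice below are invented for illustration, not the thesis's fitted models:

```python
import numpy as np

# Multilevel proportional sketch: model total incidence, model each age
# group's *share* of the total, then combine. Two groups for brevity.
years = np.arange(1990, 2011)
total = 50 + 1.2 * (years - 1990) + np.random.default_rng(3).normal(0, 2, years.size)
shares = np.stack([0.2 + 0.003 * (years - 1990),   # share of cases aged <50 (invented)
                   0.8 - 0.003 * (years - 1990)])  # share of cases aged 50+

total_fit = np.polyfit(years, total, 1)            # linear model of the total
share_fits = [np.polyfit(years, s, 1) for s in shares]

year_new = 2015
total_pred = np.polyval(total_fit, year_new)
for name, fit in zip(["<50", "50+"], share_fits):
    share_pred = np.polyval(fit, year_new)
    print(f"age {name}: predicted incidence {total_pred * share_pred:.1f}")
```

Because each level models proportions of its parent rather than raw counts, trends in the age structure can be compared across populations of very different overall incidence — the property the abstract exploits for the USA–Australia comparison.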
27

Impact of Transcatheter Aortic Valve Replacement on Coronary Hemodynamics using Clinical Measurements and an Image-Based Patient-Specific Lumped Parameter Model

Garber, Louis January 2023 (has links)
Cardiovascular disease, including coronary artery disease and aortic valve stenosis, impacts tens of millions of people annually and carries a massive global economic burden. Advances in medical imaging, hardware and software are leading to an increased interest in the field of cardiovascular computational modelling to help combat the devastating impact of cardiovascular disease. Lumped parameter modelling (a branch of computational modelling) holds the potential of aiding in the early diagnosis of these diseases, assisting clinicians in determining personalized and optimal treatments, and offering a unique in-silico setting to study cardiac and circulatory diseases, owing to its rapid computation time, ease of automation and relative simplicity. In this thesis, cardiovascular lumped parameter modelling is presented in detail and a patient-specific framework capable of simulating blood flow waveforms and hemodynamic data in the heart and coronary arteries was developed. The framework used only non-invasive clinical data and images (Computed Tomography images, echocardiography data and cuff blood pressure) as inputs. The novel model was then applied to 19 patients with aortic stenosis who underwent transcatheter aortic valve replacement. The diastolic coronary flow waveforms in the left anterior descending artery, left circumflex artery and right coronary artery were validated against a previously developed patient-specific 3D fluid-structure interaction model for all 19 subjects (pre and post intervention). There was strong qualitative and quantitative agreement between the two models. After the procedure, aortic valve area and net pressure gradient across the aortic valve improved for almost all the subjects. As for the hemodynamic data, according to the model, there was substantial variability in terms of the increase or decrease post intervention. On average, left ventricle workload and maximum left ventricle pressure decreased by 4.5% and 13.0%, while cardiac output, mean arterial pressure and resting heart rate increased by 9.9%, 6.9% and 1.9%, respectively. There were also subject-specific changes in coronary blood flow (37% had increased flow in all three coronary arteries, 32% had decreased flow in all coronary arteries, and 31% had both increased and decreased flow in different coronary arteries). All in all, a proof-of-concept cardiac and coronary lumped parameter framework was developed, validated, and applied in this thesis. / Thesis / Master of Applied Science (MASc) / The heart is a vital part of the cardiovascular system, which helps deliver and regulate blood flow through the entire human body. The coronary arteries are a crucial part of this system since they deliver blood directly to the heart muscle. For numerous reasons, the cardiovascular system can become diseased over time and require clinical treatment. Coronary artery disease and aortic valve stenosis are among the most prevalent cardiovascular diseases globally. While medical imaging on its own is a crucial part of the disease management and treatment process, advanced computational models can further enhance the process and provide clinicians with data and predictions they might otherwise miss. In this thesis, a patient-specific computational framework capable of simulating blood flow waveforms and cardiovascular data in the heart and coronary arteries using only non-invasive clinical data and images was developed and validated. The novel model was applied to a series of patients with aortic stenosis who underwent heart valve replacement, with the aim of studying the impact on coronary blood flow and global cardiovascular metrics.
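A lumped parameter model represents parts of the circulation as electrical analogues: resistors for vascular beds, capacitors for vessel compliance. The two-element Windkessel below is a standard textbook building block of such frameworks, with typical illustrative parameter values; it is not the thesis's patient-specific network:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-element Windkessel sketch: aortic pressure P driven by pulsatile
# inflow Q(t), with peripheral resistance R and arterial compliance C:
#     dP/dt = (Q(t) - P/R) / C
R, C, T = 1.0, 1.5, 0.8          # mmHg·s/mL, mL/mmHg, cardiac period (s)

def inflow(t):
    # half-sine ejection during the first 35% of each beat, zero in diastole
    phase = t % T
    return 300.0 * np.sin(np.pi * phase / (0.35 * T)) if phase < 0.35 * T else 0.0

sol = solve_ivp(lambda t, p: (inflow(t) - p / R) / C,
                t_span=(0, 10 * T), y0=[80.0], max_step=1e-3)
print(f"pressure range over the last beats: {sol.y[0][-800:].min():.0f}-"
      f"{sol.y[0][-800:].max():.0f} mmHg")
```

A patient-specific framework of the kind described above chains many such compartments (heart chambers, valves, aorta, coronary branches) and tunes R and C values per subject from the non-invasive measurements, which is why it runs in seconds where a 3D fluid-structure model needs hours.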
28

Agricultural economics within the NERC/ESRC Land Use Programme

Moxey, Andrew Paul January 1999 (has links)
No description available.
29

A multilayered approach to two-dimensional urban flood modelling

Evans, Barry January 2010 (has links)
With urbanisation continuing to encroach upon flood plains, the constant replacement of permeable land with impermeable surfaces, and changes in global climate, the need for improved flood modelling is ever more apparent. A wide range of methods exist that simulate surface flow, most commonly in one-dimensional (1D) or two-dimensional (2D) form, and more recently on smaller scales in three-dimensional (3D) models. In urban flood modelling, 2D models are often the preferred choice as they can simulate surface flow more accurately than their 1D counterparts; they are, however, more computationally demanding and thereby usually require greater simulation time. With the vast amount of information used in flood modelling, generalisation techniques are often employed to reduce the computational load within a simulation. The objective of this thesis is to improve 2D flood modelling in urban environments by introducing a novel approach to representing fine-scale building features within coarse grids. This is achieved by creating an automated approach that data-mines key features such as buildings and represents their effects numerically within a multiple-layer grid format. This new approach is tested against two other, already established generalising techniques which are single-layer based. The effectiveness of each model is assessed by its ability to accurately represent surface flow at different grid resolutions and by how each copes with varying building orientations and distributions within the test datasets. The performance of each generalising approach is therefore determined by its accuracy in relation to the fine-scale model and the difference in the computational time required to complete the simulation. Finally, the multilayered methodology is applied to a real case scenario to test its applicability further. Overall, it is revealed, as predicted, that the multilayered approach enables far greater accuracy in routing surface flow within coarse grids whilst still greatly reducing computational time. As a further benefit in urban flood modelling, this thesis shows that using a multilayered data format it is possible to simulate the influence of features that have a grid resolution finer than the initial terrain topography data, thus enabling, for example, the routing of surface water through alleyways between buildings that have a width of less than one metre.
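The multilayer idea — keeping sub-grid building coverage as its own layer of the coarse grid rather than burning it into the terrain elevation — can be sketched with a toy routing step. The head-difference flux rule and the blockage scaling below are illustrative assumptions, not the thesis's numerical scheme:

```python
import numpy as np

# Toy multilayer grid: layer 0 is terrain elevation, layer 1 is the
# fraction of each coarse cell blocked by buildings (data-mined from
# fine-scale footprints in the thesis; invented here). Water exchanged
# between cells is scaled by the open fraction, so sub-grid buildings
# impede flow without refining the grid. Edges are periodic for brevity.
rng = np.random.default_rng(4)
terrain = rng.uniform(0, 0.5, (20, 20))                   # elevation (m)
blocked = np.zeros((20, 20)); blocked[8:12, 8:12] = 0.9   # building block
depth = np.zeros((20, 20)); depth[0, :] = 1.0             # inflow along one edge
open_frac = 1.0 - blocked

for _ in range(500):
    head = terrain + depth
    new_depth = depth.copy()
    for axis, shift in [(0, 1), (0, -1), (1, 1), (1, -1)]:
        dh = np.clip(head - np.roll(head, shift, axis=axis), 0, None)
        flux = 0.1 * dh * depth                            # outflow toward lower head
        flux *= open_frac * np.roll(open_frac, shift, axis=axis)  # building blockage
        new_depth -= flux
        new_depth += np.roll(flux, -shift, axis=axis)      # receiving cells gain it
    depth = new_depth
```

In a single-layer generalisation the 90%-blocked cells would have to be either fully open or fully walled; carrying the coverage fraction as a second layer is what lets a coarse cell partially impede flow, mimicking alleyways narrower than the cell size.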
30

Compartmental flow modelling of acute care hospital bed occupancy for strategic decision-making

Mackay, Mark January 2007 (has links)
The research presented in this thesis focuses on the investigation of whether the compartmental flow models of bed occupancy originally described by Harrison and Millard (1991) for decision-making around geriatric care in the English National Health Service can be used to describe data from acute care hospitals in Australia and New Zealand. Australia’s total health expenditure for 2004-05 was $87.3 billion. The pattern of health care service use and expenditure is well established, and Australia follows the pattern found in most developed countries, with the greatest expenditure occurring on services for the elderly. Australia is experiencing a shift in population structure, with the proportion of older people forecast to increase. It is expected there will be a need for a greater level of expenditure on health care as the number of elderly people increases. There is an emerging gap between the ability to supply health services and the demand for them. Furthermore, acute care hospital treatment is generally considered expensive and governments have been keen to control this expenditure. It is imperative that governments are able to make decisions based upon robust policy advice. There are serious consequences in both economic resource allocation and patient (and population) health outcomes if decisions about future health service structures are incorrect. In particular, there is a need for better decision-making around bed management at the strategic level, that is, for decisions made over a longer time frame. Decision-making can benefit from the use of modelling. Models represent a simplified version of reality that preserves the essential features of the situation being examined, and can be used as a tool to investigate decision-making options, particularly in complex environments such as the health sector. Historically, decision-making relating to hospital beds has used either simple “back of the envelope” calculations or adherence to “rule of thumb” approaches, most of them relying upon the average length of stay metric. While the modelling of hospital bed numbers is not new, much of this work has relied upon the average length of stay, which is known to be a poor measure. Harrison and Millard (1991) introduced the application of the compartmental flow model for modelling hospital bed occupancy and noted its potential to influence policy decision-making. The flow model results are plausible and easily interpreted. However, relatively little work has focussed on the ability of these models to be generalised and used for predictive purposes. The research undertaken for this thesis consisted of a series of modelling experiments that can be grouped into two key stages: whether the models could be successfully applied to the acute care data, and whether they could be used for novel purposes such as forecasting, evaluation of service change, and benchmarking. This entailed further development of the model and a consideration of basic modelling issues, such as the balance between data-fit and model complexity, in order to better capture variation in the data and to facilitate linkage to changes in population and seasonality. / http://library.adelaide.edu.au/cgi-bin/Pwebrecon.cgi?BBID=1301698 / Thesis (Ph.D.) -- School of Psychology, 2007
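In the Harrison–Millard approach, a midnight bed census arranged by patients' current length of stay is fitted by a mixture of exponential streams — typically a fast acute stream and a slow long-stay stream, one per compartment. A minimal sketch, with a synthetic census invented for illustration:

```python
import numpy as np
from scipy.optimize import curve_fit

# Two-compartment flow model sketch: occupancy at length-of-stay x is
# modelled as A*exp(-x/t1) + B*exp(-x/t2), an acute stream plus a
# long-stay stream. Census data below are synthetic.
def occupancy(x, A, t1, B, t2):
    return A * np.exp(-x / t1) + B * np.exp(-x / t2)

los = np.arange(0, 60)                              # days since admission
rng = np.random.default_rng(5)
census = occupancy(los, 40, 3.0, 8, 25.0) + rng.normal(0, 1, los.size)

params, _ = curve_fit(occupancy, los, census, p0=(30, 2, 5, 20))
A, t1, B, t2 = params
print(f"acute stream: {A:.0f} beds, mean stay {t1:.1f} d; "
      f"long-stay stream: {B:.0f} beds, mean stay {t2:.1f} d")
```

The fitted stream sizes and mean stays are what make the model's output interpretable for strategic planning, and they provide a richer summary than a single average length of stay — the measure the abstract criticises.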
