
Corridors and Elk Migration: A Comparative Analysis of Landscape Connectivity Models and GPS Data in the Greater Yellowstone Ecosystem

Chambers, Samuel Norton January 2015 (has links)
Landscape connectivity models aim to map the links or corridors that wildlife do or would use between patches of habitat. Migratory species such as elk traverse between such patches, which serve as seasonal ranges. The goal of this study was to compare and contrast the suitability of several landscape connectivity models for describing and predicting migration in a long-distance migrant. We measured the suitability of connectivity models for covering and predicting the migratory movements of elk in the Greater Yellowstone Ecosystem. GPS point data was converted to sequential networks for multiple populations of elk. GPS data was also used to delineate the summer and winter ranges of each population. The kernel density of routes in the networks was measured for comparison to connectivity models. The ranges served as the patches to be connected by such models. A resistance surface was produced using reclassified landcover data for mapping habitat suitability and linear road data for human presence or obstruction to movement. Landscape connectivity was measured for eleven migratory elk populations using three distinct models: the first measured connectivity using circuit theory; the second, agent-based modeling; the third, least cost corridors. The model results were compared to the migratory network density by measuring correlation. This was followed by a new method of measuring the influence of autocorrelation between the models and networks. Some of the models were then altered to test for suspected influences. This study shows that least cost corridors and circuit theory are limited in their ability to predict the migratory movements between summer and winter ranges. They lack the ability to predict exploratory movements that do not link conspicuous ranges to each other, and they cannot account for all avoidance behaviors in the landscape. Our results suggest that connectivity models need improvement by accounting for exploration outside of prime habitat. They also suggest that connectivity models are not adequate predictors of migratory movements and are not suited to conservation planning of migratory networks. This supports the ungulate conservation planning approach of Sawyer et al. (2009), which considers connectivity but bases priority on migratory landscape usage. It is commonly assumed that fragmentation or loss in connectivity impedes seasonal migration, cutting off wildlife from resources (Rudnick et al. 2012). This study shows that migratory elk actually use less-than-prime and supposedly fragmented habitat during migration, and that there is more than connectivity at play.
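A minimal sketch of the least-cost-corridor idea described in this abstract (not the author's code): a toy resistance surface, two hypothetical range cells, an accumulated-cost surface computed from each range with Dijkstra on an 8-connected grid, and a rank correlation against a stand-in route-density raster. Grid size, resistance values and range locations are all illustrative assumptions.

```python
import heapq
import numpy as np
from scipy.stats import spearmanr  # rank correlation against the route-density raster

def accumulated_cost(resistance, sources):
    """Dijkstra accumulated-cost surface from a set of source cells (8-connected grid)."""
    rows, cols = resistance.shape
    cost = np.full((rows, cols), np.inf)
    pq = [(0.0, r, c) for r, c in sources]
    for _, r, c in pq:
        cost[r, c] = 0.0
    heapq.heapify(pq)
    moves = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    while pq:
        d, r, c = heapq.heappop(pq)
        if d > cost[r, c]:
            continue
        for dr, dc in moves:
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # step cost: average resistance of the two cells, scaled by step length
                step = (resistance[r, c] + resistance[nr, nc]) / 2 * (2 ** 0.5 if dr and dc else 1)
                nd = d + step
                if nd < cost[nr, nc]:
                    cost[nr, nc] = nd
                    heapq.heappush(pq, (nd, nr, nc))
    return cost

rng = np.random.default_rng(0)
resistance = rng.uniform(1, 10, size=(60, 60))     # hypothetical resistance surface
summer = [(5, 5)]                                  # hypothetical summer-range cell
winter = [(55, 55)]                                # hypothetical winter-range cell

# least-cost corridor surface: sum of accumulated costs from both ranges (0 along the path)
corridor = accumulated_cost(resistance, summer) + accumulated_cost(resistance, winter)
corridor -= corridor.min()

route_density = rng.uniform(0, 1, size=(60, 60))   # stand-in for a GPS-route kernel density
rho, p = spearmanr(-corridor.ravel(), route_density.ravel())
print(f"rank correlation between corridor value and route density: {rho:.3f}")
```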

On the Design and Numerical Analysis of Tradable Mobility Credit Strategies

Tian, Ye January 2015 (has links)
Traffic congestion places an extremely high burden on the development of modern cities. Congestion can be alleviated either by increasing road capacity or by reducing traffic demand. For decades, increasing capacity by building more roads and lanes has been the major solution applied to accommodate ever-growing traffic demand. However, it has turned out to be of limited effect due to well-known phenomena such as latent demand. Controlling and managing traffic demand has in turn been viewed as a cost-effective alternative to increasing road capacity, as demonstrated by many successful applications around the world. Within the conceptual framework of Traffic Demand Management (TDM), Active Transportation and Demand Management (ATDM) is the dynamic management, control, and influence of traffic demand and traffic flow on transportation facilities. ATDM strategies attempt to influence traveler behavior and further manage traffic flow in a time-dependent manner within the existing infrastructure. Successful ATDM applications include congestion pricing, adaptive ramp metering, dynamic speed limits, dynamic lane use control, etc. Singapore stands out as an excellent success story of ATDM: the implementation of "Cap and Trade" license plates and electronic road pricing makes motoring a high-cost privilege for citizens of Singapore and leads the public to rely on transit. Monetary leverage is an effective instrument to facilitate ATDM; examples of ATDM applications adopting monetary instruments include dynamic congestion pricing, "Cap and Trade" of car licenses, etc. Taking congestion pricing as an example, policy makers induce travelers' behavior and alter their preferences toward different decisions by attaching price tags to different choices. As an important underpinning of rational choice theory, an individual assigns an ordinal number to each available action, calculated by a utility or payoff function, and the individual's preference is expressed as the relationship between those ordinal assignments. In the implementation of congestion pricing, policy makers impose an additional high disutility on congested roads, thereby pushing some travelers to take alternative routes, shift to alternative departure times, or even cancel their trips. However, congestion pricing suffers from public aversion because it burdens the motoring of low-income people and therefore does not help to alleviate social inequality. The concept of Tradable Mobility Credit (TMC) has been proposed by a group of researchers as another innovative application to facilitate dynamic traffic demand management and address social inequality issues using pricing instruments. The concept of TMC is borrowed from carbon trading in environmental control. A limited quota of personal auto usage is issued to eligible travelers, and credits can be traded in a free-market fashion. This guarantees that roadway usage does not exceed capacity while avoiding the negative effects of shortages normally associated with quota systems. TMC is not yet a market-ready policy, as the integration of the supporting infrastructures, including the trading market, the credit assignment component, and the credit charging component, has not been fully explored. Existing TMC research focuses on explaining and exploring the equilibrium condition through analytical methods such as mathematical modeling. Analytical models produce perfect convergence curves and deterministic equilibrium traffic flow patterns. They provide influential guidance for further work, but the solution procedures may encounter problems when dealing with larger real-world networks and scenarios. Meanwhile, current analytical models do not sufficiently consider the microstructure of the credit trading market, even though it is the most distinctive component of a TMC system. Motivated by these concerns, an integrated TMC evaluation platform consisting of a policy-making module and traveler behavior modules is proposed in this research. The concept of Agent-Based Modeling and Simulation (ABMS) is extensively adopted in this integrated platform, as each individual traveler carries his/her personal memory across iterations. The goal of establishing this framework is to better predict a traveler's route choice and trading behavior if TMC is imposed and, further, to provide intelligence to potential policy makers' decision-making processes. The proposed integrated platform is able to generate results at different aggregation levels, including individual-level microscopic behavior data as well as aggregated traffic flow and market performance data. In order to calibrate the proposed integrated platform, an online interactive experiment was designed based on an experimental economics package, and a human-subjects study with 22 participants was conducted on this experiment platform to gather field data on real people's route choice and credit trading behavior in an artificial TMC system. Participants were recruited from forums, listservs, social media, etc. The calibrated platform is shown to predict travelers' behavior accurately. A prototype market microstructure is proposed in this research as well; it proves to be a cost-effective setting and results in substantial economic savings, provided that travelers behave similarly to the predictions generated by the traveler behavior module. It is also demonstrated that the principle of Pareto improvement is not achieved in the proposed ABMS models.
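The core TMC mechanism, credits charged on a priced route and a market price that adjusts until credit demand respects the cap, can be sketched with a toy agent-based loop. This is not the dissertation's calibrated platform; the traveler counts, values of time, cap and adjustment rule are illustrative assumptions.

```python
import random

random.seed(1)

N_TRAVELERS = 500
CAP = 300                 # total credits charged on the tolled route must stay within this cap
CHARGE = 1                # credits charged per trip on the fast (tolled) route
FAST_TIME, SLOW_TIME = 20.0, 35.0   # travel times in minutes, hypothetical

# heterogeneous values of time (dollars per minute), hypothetical distribution
vot = [random.uniform(0.1, 0.8) for _ in range(N_TRAVELERS)]

price = 2.0               # initial market price of one credit (dollars)
for day in range(50):     # day-to-day price adjustment (simple tatonnement)
    fast_users = 0
    for v in vot:
        # a traveler takes the fast route if the time saving is worth the credit cost;
        # credits are tradable, so spending one has an opportunity cost equal to `price`
        if v * (SLOW_TIME - FAST_TIME) > price * CHARGE:
            fast_users += 1
    demand = fast_users * CHARGE
    # raise the credit price when demand exceeds the cap, lower it otherwise
    price = max(0.0, price + 0.01 * (demand - CAP))

print(f"equilibrium credit price ~ ${price:.2f}, fast-route users: {fast_users}")
print(f"credits demanded: {demand} vs cap {CAP}")
```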

Sustainability of multimodal intercity transportation using a hybrid system dynamics and agent-based modeling approach

Hivin, Ludovic F. 12 January 2015 (has links)
Demand for intercity transportation has increased significantly in the past decades and is expected to continue to follow this trend in the future. In the meantime, concern about the environmental impact and potential climate change associated with this demand has grown, resulting in an increasing importance of climate impact considerations in the overarching issue of sustainability. This results in discussions on new regulations, policies and technologies to reduce transportation's climate impact. Policies may affect the demand for the different transportation modes through increased travel costs, increased market share of more fuel efficient vehicles, or even the introduction of new modes of transportation. However, the effect of policies and technologies on mobility, demand, fleet composition and the resulting climate impact remains highly uncertain due to the many interdependencies. This motivates the creation of a parametric modeling and simulation environment to explore a wide variety of policy and technology scenarios and assess the sustainability of transportation. In order to capture total transportation demand and the potential mode shifts, a multimodal approach is necessary. The complexity of the intercity transportation System-of-Systems calls for a hybrid Agent-Based Modeling and System Dynamics paradigm to better represent both micro-level and macro-level behaviors. Various techniques for combining these paradigms are explored and classified to serve as a hybrid modeling guide. A System Dynamics approach is developed that integrates socio-economic factors, mode performance, aggregated demand and climate impact. It is used to explore different policy and technology scenarios, and to better understand the dynamic behavior of the intercity transportation System-of-Systems. In order to generate the necessary data to create and validate the System Dynamics model, an Agent-Based model is used due to its capability to better capture the behavior of a collection of sentient entities. Equivalence of the two models is ensured through a rigorous cross-calibration process. Through the use of fleet models, the fuel burn and life cycle emissions from different modes of transportation are quantified. The radiative forcing from the main gaseous and aerosol species is then obtained through radiative transfer calculations, and regional variations are discussed. This new simulation environment, called the environmental Ground and Air Mode Explorer (eGAME), is then used to explore different policy and technology scenarios and assess their effect on transportation demand, fleet efficiencies and the resulting climate impact. The results obtained with this integrated assessment tool aim to support a scenario-based decision making approach and provide insight into the future of the U.S. transportation system in a climate constrained environment.
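A hypothetical sketch of the hybrid idea, not the eGAME environment itself: an agent-level multinomial logit mode choice (the agent-based side) feeds an emissions stock that accumulates year by year (the system-dynamics side) under a simple cost-escalation scenario. All coefficients, costs and emission factors are illustrative.

```python
import math
import random

random.seed(2)

N_TRAVELERS = 1000
YEARS = 20
EMISSIONS_FACTOR = {"car": 0.25, "air": 0.18, "rail": 0.05}  # kg CO2 per passenger-km, illustrative
DISTANCE_KM = 500                                            # one intercity round trip per year

def mode_choice(cost, time, beta_cost=-0.01, beta_time=-0.02):
    """Multinomial logit over modes given cost ($) and time (min) by mode."""
    utility = {m: beta_cost * cost[m] + beta_time * time[m] for m in cost}
    denom = sum(math.exp(u) for u in utility.values())
    probs = {m: math.exp(u) / denom for m, u in utility.items()}
    r, acc = random.random(), 0.0
    for m, p in probs.items():
        acc += p
        if r <= acc:
            return m
    return m

cost = {"car": 120.0, "air": 220.0, "rail": 90.0}   # illustrative generalized costs
time = {"car": 300.0, "air": 180.0, "rail": 270.0}  # illustrative door-to-door times

co2_stock = 0.0                                      # system-dynamics style stock (kg CO2)
for year in range(YEARS):
    cost["car"] *= 1.03                              # e.g. a carbon-price scenario raising car cost
    flow = 0.0
    for _ in range(N_TRAVELERS):
        m = mode_choice(cost, time)
        flow += EMISSIONS_FACTOR[m] * DISTANCE_KM    # emissions flow generated this year
    co2_stock += flow                                # the stock accumulates the flow

print(f"cumulative CO2 after {YEARS} years: {co2_stock / 1000:.1f} tonnes")
```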

Exurban land cover and land market evolution: Analysis, review and computational experimentation of spatial and agent heterogeneity from the bottom up

Huang, Qingxu 22 January 2013 (has links)
This dissertation investigates selected empirical and theoretical aspects of land-use and land-cover change (LUCC) in exurban areas. Two challenges – observation and monitoring of LUCC, and spatially explicit modeling – are addressed using three main approaches: measuring, reviewing and agent-based modeling (ABM). All of these approaches focus on LUCC at the individual household level, investigating how micro-scale elements interact to influence macro-scale functional patterns from the bottom up. First, the temporal change in the quantity and pattern of land-cover types within exurban residential parcels in three townships in southeastern Michigan is examined using landscape metrics and local indicators of spatial association at the parcel and parcel-neighborhood levels, respectively. The results demonstrate that the number and area of exurban residential parcels increased steadily from 1960 to 2000, and that different land-cover types exhibit distinctive changes over time. The results also indicate that there is a convergence process at the neighborhood level through which the quantity and pattern of land cover in parcels conform with the neighborhood appearance. Second, 51 urban residential choice models based on ABM are reviewed. The review divides these models into three categories (i.e., models based on classical theories, models focusing on different stages of the urbanization process, and integrated ABM and microsimulation models). It also compares the differences among these models in their representations of three essential features enabled by the technique of ABM: agent heterogeneity, the land market and output measurement. Challenges in incorporating these features, such as the trade-off between the simplicity and abstraction of a model and the complexity of the urban residential system, interactions among multiple features, and demands for data at the individual level, are also discussed. Third, the effects of agent heterogeneity on spatial and socioeconomic outcomes under different levels of land-market representation are explored through three experiments using a stylized agent-based land-market model. The results reveal that budget heterogeneity has prominent effects on socioeconomic outcomes, while preference heterogeneity is highly pertinent to spatial outcomes. The relationship between agent heterogeneity and macro-measures becomes more complex as more land-market mechanisms are represented. The results also imply that land-market representation (e.g., competitive bidding) is indispensable for reproducing the results of classical urban land market models (e.g., the monocentric city model) in a spatial ABM when agents are heterogeneous.
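A minimal sketch of a stylized land-market ABM of the kind described here (not the dissertation's model): buyers with heterogeneous budgets and preferences bid on cells of a monocentric grid, the highest available bid wins each cell, and a price gradient with distance from the center emerges. All parameters are illustrative.

```python
import random

random.seed(3)

GRID = 15                  # cells on one side of a square city; CBD at the center
CBD = (GRID // 2, GRID // 2)
N_BUYERS = 150
COMMUTE_COST = 4.0         # dollars per unit distance, illustrative

cells = [(x, y) for x in range(GRID) for y in range(GRID)]
dist = {c: ((c[0] - CBD[0]) ** 2 + (c[1] - CBD[1]) ** 2) ** 0.5 for c in cells}

# heterogeneous agents: budget and a taste for peripheral amenities (proxied by distance)
buyers = [{"budget": random.uniform(60, 120), "green_pref": random.uniform(0, 10)}
          for _ in range(N_BUYERS)]

def bid(buyer, cell):
    """Willingness to pay: budget minus commuting cost plus amenity taste times distance."""
    return buyer["budget"] - COMMUTE_COST * dist[cell] + buyer["green_pref"] * dist[cell] * 0.3

price, unsold = {}, set(cells)
for buyer in buyers:
    # competitive bidding, simplified: each buyer takes the available cell where their bid
    # is highest and pays that bid (a sequential first-price abstraction)
    best = max(unsold, key=lambda c: bid(buyer, c))
    if bid(buyer, best) > 0:
        price[best] = bid(buyer, best)
        unsold.remove(best)

# summarize the emerging rent gradient by distance ring
for ring in range(0, 8, 2):
    ring_prices = [p for c, p in price.items() if ring <= dist[c] < ring + 2]
    if ring_prices:
        print(f"distance {ring}-{ring + 2}: mean price {sum(ring_prices) / len(ring_prices):.1f}")
```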

Adaptive Interactive Expectations: Dynamically Modelling Profit Expectations

William Paul Bell Unknown Date (has links)
This thesis aims to develop an alternative expectations model to the Rational Expectations Hypothesis (REH) and adaptive-expectations models, one that provides more accurate temporal predictive performance and more closely reflects recent advances in behavioural economics, the ‘science of complexity’ and network dynamics. The model the thesis develops is called Adaptive Interactive Expectations (AIE), a subjective dynamic model of the process of expectations formation. To REH, the AIE model provides both an alternative and a complement. AIE and REH complement one another in that they are diametrically opposite in the following five dimensions: agent intelligence, agent interaction, agent homogeneity, equilibrium assumptions and the rationalisation process. REH and AIE stress the importance of hyper-intelligent agents interacting only via a price signal and near zero-intelligent agents interacting via a network structure, respectively. The complementary nature of AIE and REH provides dual perspectives that enhance analysis. The Dun & Bradstreet (D&B 2008) profit expectations survey is used in the thesis to calibrate AIE and make predictions, and the predictive power of the AIE and REH models is compared. The thesis introduces the ‘pressure to change profit expectations index’, px. This index provides the ability to model unknowns within an adaptive dynamic process and to combine the beliefs from interactive-expectations, adaptive-expectations and biases that include pessimism, optimism and ambivalence. AIE uses networks to model the flow of interactive-expectations between firms. To overcome the uncertainty over the structure of the interactive network, the thesis uses model-averaging over 121 network topologies. These networks are defined by three variables regardless of their complexity. Unfortunately, the Bayesian technique’s use of the number of variables as a measure of complexity makes it unsuitable for model-averaging over the network topologies. To overcome this limitation, the thesis introduces two model-averaging techniques, ‘runtime-weighted’ and ‘optimal-calibration’, which are benchmarked against ‘Bayes-factor model-averaging’ and ‘equal-weighted model-averaging’. In addition to the aggregate called all-firms, the D&B (2008) survey has four divisions: manufacturing durables, manufacturing non-durables, wholesale and retail. To make use of the four divisions, the thesis introduces a ‘link-intensity matrix’ based upon an ‘input-output table’ to improve the calibration of the networks; the transpose of the table is also used. The two ‘link-intensity matrices’ are benchmarked against the default, a ‘matrix of ones’. The aggregated and disaggregated versions of AIE are benchmarked against adaptive-expectations to establish whether the interactive-expectations component of AIE adds value to the model. The thesis finds that AIE has more predictive power than REH. ‘Optimal-calibration model-averaging’ improves the predictive performance of the better-fitting versions of AIE, which are those versions that use the ‘input-output table’ and ‘matrix of ones’ link-intensity matrices. The ‘runtime-weighted model-averaging’ improves the predictive performance of only the ‘input-output table’ version of AIE. The interactive component of the AIE model improves the predictive performance of all versions of AIE over adaptive-expectations. There is an ambiguous effect on prediction performance from introducing the ‘input-output table’; however, there is a clear reduction in predictive performance from introducing its transpose. AIE can inform the debate on government intervention by providing an Agent-Based Model (ABM) perspective on the conflicting mathematical and narrative views proposed by the Greenwald–Stiglitz Theorem and the Austrian school, respectively. Additionally, AIE can play a role complementary to REH, the two being descriptive/predictive and normative, respectively. The AIE network calibration uses an ‘input-output table’ to determine link-intensity; this method could provide Computable General Equilibrium (CGE) and Dynamic Stochastic General Equilibrium (DSGE) models with a way to improve their transmission mechanisms. Furthermore, the AIE network calibration and prediction methodology may help overcome the validation concerns of practitioners when they implement ABM.
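A hypothetical numerical sketch of the interactive-expectations idea (not the calibrated D&B model): each firm's expectation is pulled partly toward realized profit (the adaptive term) and partly toward a link-intensity-weighted average of connected firms' expectations (the interactive term), and equal-weighted model-averaging is taken over a few stand-in network topologies. All matrices, weights and the profit signal are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
N_FIRMS, T = 4, 40                      # e.g. four divisions treated as firm groups

def simulate(W, alpha=0.4, beta=0.4):
    """One AIE-style run: expectations move toward realized profit (adaptive term)
    and toward network neighbours' expectations (interactive term)."""
    W = W / W.sum(axis=1, keepdims=True)        # row-normalise the link-intensity matrix
    expect = rng.normal(0, 1, N_FIRMS)
    path = []
    for t in range(T):
        realized = 0.5 * np.sin(t / 5) + rng.normal(0, 0.2, N_FIRMS)   # toy profit signal
        adaptive = alpha * (realized - expect)
        interactive = beta * (W @ expect - expect)
        expect = expect + adaptive + interactive
        path.append(expect.mean())
    return np.array(path)

# three illustrative network topologies (stand-ins for the 121 used in the thesis)
topologies = [
    np.ones((N_FIRMS, N_FIRMS)),                         # fully connected, a 'matrix of ones'
    np.eye(N_FIRMS) + np.diag(np.ones(N_FIRMS - 1), 1) + np.diag(np.ones(N_FIRMS - 1), -1),
    rng.uniform(0.1, 1.0, (N_FIRMS, N_FIRMS)),           # stand-in for an input-output table
]

predictions = np.stack([simulate(W) for W in topologies])
equal_weighted = predictions.mean(axis=0)                # equal-weighted model-averaging
print("last five averaged expectation values:", np.round(equal_weighted[-5:], 3))
```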

Modeling Frameworks for Supply Chain Analytics

January 2012 (has links)
abstract: Supply chains are increasingly complex as companies branch out into newer products and markets. In many cases, multiple products with moderate differences in performance and price compete for the same unit of demand. Simultaneous occurrences of multiple scenarios (competitive, disruptive, regulatory, economic, etc.), coupled with business decisions (pricing, product introduction, etc.), can drastically change demand structures within a short period of time. Furthermore, product obsolescence and cannibalization are real concerns due to short product life cycles. Analytical tools that can handle this complexity are important for quantifying the impact of business scenarios and decisions on supply chain performance. Traditional analysis methods struggle in this environment, where large, complex datasets with hundreds of features are becoming the norm in supply chains. We present an empirical analysis framework termed Scenario Trees that provides a novel representation for impulse and delayed scenario events and a direction for modeling multivariate constrained responses. Among potential learners, supervised learners and feature extraction strategies based on tree-based ensembles are employed to extract the most impactful scenarios and predict their outcome on metrics at different product hierarchies. These models are able to provide accurate predictions in modeling environments characterized by incomplete datasets due to product substitution, missing values, outliers, redundant features, mixed variables and nonlinear interaction effects. Graphical model summaries are generated to aid model understanding. Models in complex environments benefit from feature selection methods that extract non-redundant feature subsets from the data. Additional model simplification can be achieved by extracting specific levels/values that contribute to variable importance. We propose and evaluate new analytical methods to address this problem of feature value selection and study their comparative performance using simulated datasets. We show that supply chain surveillance can be structured as a feature value selection problem. For situations such as new product introduction, a bottom-up approach to scenario analysis is designed using an agent-based simulation and data mining framework. This simulation engine incorporates utility theory, discrete choice models and diffusion theory and acts as a test bed for enacting different business scenarios. We demonstrate the use of machine learning algorithms to analyze scenarios and generate graphical summaries to aid decision making. / Dissertation/Thesis / Ph.D. Industrial Engineering 2012
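A generic illustration of the tree-based-ensemble step, not the Scenario Trees framework itself: a random forest fitted to a synthetic scenario table, with its feature importances used to rank which scenario variables most influence a demand metric. Column names and data are invented for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)
n = 2000

# synthetic scenario table: each row is a (scenario, product) observation
X = np.column_stack([
    rng.integers(0, 2, n),        # promo_active      (impulse scenario flag)
    rng.integers(0, 4, n),        # competitor_action (categorical code)
    rng.normal(0, 1, n),          # price_change
    rng.normal(0, 1, n),          # macro_index
])
feature_names = ["promo_active", "competitor_action", "price_change", "macro_index"]

# synthetic demand response with a main effect and an interaction, plus noise
y = 5 * X[:, 0] - 2 * X[:, 2] + 1.5 * X[:, 0] * X[:, 2] + rng.normal(0, 0.5, n)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, y)

# rank scenario features by impurity-based importance
for name, imp in sorted(zip(feature_names, model.feature_importances_), key=lambda p: -p[1]):
    print(f"{name:18s} importance {imp:.3f}")
```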

An Agent-Based Model of Wildlife Migratory Patterns in Human-Disturbed Landscapes

Tierney, Lauren 18 August 2015 (has links)
In recent years, human decision-making has led to significant landscape impacts in the western United States. Specifically, migratory wildlife populations have increasingly been impacted by rural urban development and energy resource development. This research presents the application of agent-based modeling to explore how such impacts influence the characteristics of migratory animal movement, focusing on mule deer (Odocoileus hemionus) in Western Wyoming. This study utilizes complex adaptive systems and agent-based modeling frameworks to increase understanding of migratory patterns in a changing landscape and explores thresholds of interference to migration patterns due to increased habitat degradation and fragmentation. The agent-based model utilizes GPS-collar data to examine how individual processes lead to population-level patterns of movement and adaptation. The assessment incorporates elements from both human and natural systems to explore potential future scenarios for human development in the natural landscape and incorporates adaptive behaviors, as well as animal-movement ecology, in changing landscapes.
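A toy sketch of the threshold question posed here (not the GPS-calibrated model): random-walk agents cross a gridded corridor, developed cells act as barriers, and the migration completion rate is recorded as the developed fraction increases. Grid dimensions, movement rules and development levels are assumptions.

```python
import random

random.seed(6)

WIDTH, LENGTH = 20, 100        # corridor grid: agents travel from column 0 toward the last column
N_DEER = 200
MAX_STEPS = 400

def run(developed_fraction):
    # randomly developed cells act as barriers the agents will not enter
    developed = {(x, y) for x in range(LENGTH) for y in range(WIDTH)
                 if random.random() < developed_fraction}
    completed = 0
    for _ in range(N_DEER):
        x, y = 0, random.randrange(WIDTH)
        for _ in range(MAX_STEPS):
            options = [(x + 1, y), (x + 1, y - 1), (x + 1, y + 1), (x, y - 1), (x, y + 1)]
            options = [(nx, ny) for nx, ny in options
                       if 0 <= ny < WIDTH and (nx, ny) not in developed]
            if not options:
                break                               # hemmed in by development
            forward = [(nx, ny) for nx, ny in options if nx == x + 1]
            x, y = random.choice(forward if forward else options)   # bias toward the destination
            if x >= LENGTH - 1:
                completed += 1
                break
    return completed / N_DEER

for frac in (0.0, 0.1, 0.2, 0.3, 0.4):
    print(f"developed fraction {frac:.1f}: migration completion rate {run(frac):.2f}")
```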

Assessing the Effects of Institutional and Spatial Arrangements in Analytical and Computational Models of Conservation

January 2013 (has links)
abstract: This work is an assemblage of three applied projects that address the institutional and spatial constraints to managing threatened and endangered (T & E) terrestrial species. The first project looks at the role of the Endangered Species Act (ESA) in protecting wildlife and whether banning non-conservation activities on multi-use federal lands is socially optimal. A bioeconomic model is used to identify scenarios where ESA-imposed regulations emerge as optimal strategies and to facilitate discussion on feasible long-term strategies in light of the ongoing public land-use debate. Results suggest that banning harmful activities is a preferred strategy when valued species are in decline or exposed to poor habitat quality. However, such a strategy cannot be sustained in perpetuity; a switch to land-use practices characteristic of habitat conservation plans is recommended. The spatial portion of this study is motivated by the need for a more systematic quantification and assessment of landscape structure ahead of species reintroduction; this portion is further broken up into two parts. The first explores how connectivity between habitat patches promotes coexistence among multiple interacting species. An agent-based model of a two-patch metapopulation is developed with local predator-prey dynamics and density-dependent dispersal. The simulation experiment suggests that connectivity levels at both extremes, representing very little risk and high risk of species mortality, do not augment the likelihood of coexistence, while intermediate levels do. Furthermore, the probability of coexistence increases and spans a wide range of connectivity levels when individual dispersal is less probabilistic and more dependent on population feedback. Second, a novel approach to quantifying network structure is developed using the statistical method of moments. This measurement framework is then used to index habitat networks and assess their capacity to drive three main ecological processes: dispersal, survival, and coexistence. Results indicate that the moments approach outperforms single summary metrics and accounts for a majority of the variation in process outcomes. The hierarchical measurement scheme is helpful for indicating when additional structural information is needed to determine ecological function. However, the qualitative trend between network indicator and function is, at times, unintuitive and unstable in certain areas of the metric space. / Dissertation/Thesis / Ph.D. Applied Mathematics for the Life and Social Sciences 2013
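A minimal sketch of the two-patch setup described above (illustrative parameters, not the dissertation's model): discrete-time predator-prey dynamics in each patch, density-dependent prey dispersal, and a connectivity parameter giving the probability that a disperser survives the move; coexistence frequency is reported across connectivity levels.

```python
import random

random.seed(7)

def run(connectivity, steps=300):
    """connectivity = probability that a dispersing individual survives the move between patches."""
    prey = [40.0, 40.0]
    pred = [8.0, 8.0]
    K = 100.0
    for _ in range(steps):
        new_prey, new_pred = [0.0, 0.0], [0.0, 0.0]
        for i in (0, 1):
            # local dynamics: logistic prey growth, Lotka-Volterra style coupling
            growth = 0.4 * prey[i] * (1 - prey[i] / K)
            predation = 0.02 * prey[i] * pred[i]
            new_prey[i] = max(prey[i] + growth - predation, 0.0)
            new_pred[i] = max(pred[i] + 0.01 * prey[i] * pred[i] - 0.3 * pred[i], 0.0)
        # density-dependent dispersal: crowded prey patches export more migrants
        for i, j in ((0, 1), (1, 0)):
            migrants = 0.2 * (new_prey[i] / K) * new_prey[i]
            new_prey[i] -= migrants
            new_prey[j] += migrants * connectivity      # only a share survives the corridor
        prey = [max(p + random.gauss(0, 0.5), 0.0) for p in new_prey]  # demographic noise
        pred = new_pred
    return all(p > 1.0 for p in prey) and all(q > 1.0 for q in pred)

for c in (0.1, 0.5, 0.9):
    coexist = sum(run(c) for _ in range(50)) / 50
    print(f"connectivity {c:.1f}: coexistence frequency {coexist:.2f}")
```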

A Multi-agent Based Model for Inter Terminal Transportation

Iqbal, Mubashir January 2015 (has links)
Context: Due to an increase in inter-terminal transportation (ITT) volume within container ports, the complexity of transportation processes between terminals has also increased. Problems with the current way of handling ITT resources are expected to grow in the near future. Different types of vehicles are already in place for transporting containers between different terminals in a container port. However, these vehicle types need to be used efficiently and effectively in order to get maximum benefit out of these resources. Objectives: In this thesis, we investigate and propose a solution model for ITT considering the combination of both manned (MTS, trucks) and unmanned (AGV) vehicles. An agent-based model is proposed for ITT focusing on three ITT vehicle types. The objective of the proposed model is to investigate the capabilities and combinations of different vehicles for transporting containers between different container terminals in a port. Methods: A systematic literature review is conducted to identify the problems, methods, and approaches for solving those problems in the domain of container transportation. As a case, an agent-based model is proposed for the Maasvlakte area of the Rotterdam port. Simulations are performed on different scenarios to compare three different road vehicle types, i.e., AGV, MTS, and truck, in a network comprising ten terminals. Results: The literature review indicates that heuristics are the most commonly used methods to solve different problems of container transportation in the recent past. The review also shows that limited research has been published focusing on ITT compared to intra-terminal transportation. Simulation results of our proposed model indicate that AGVs outperform trucks in terms of loading/unloading time and the number of vehicles required to handle the given volume in all scenarios. In most cases, the number of trucks required is about twice the number of AGVs needed to transport containers between terminals. Results also show that fewer MTS vehicles (compared to AGVs) are required for handling containers in certain scenarios; however, the loading/unloading time for MTS is much higher than that of AGVs. Conclusions: Using agent-based simulation experiments, we propose a model that can help in estimating the resources (vehicles) required to handle the ITT container volume and improve the utilization of different resources in a network of terminals. From the comparison of three road vehicle types, it was concluded that trucks are incapable of handling higher container volumes in ITT. It was also concluded that AGVs can be an appropriate choice if automated operations are supported in the terminals; otherwise, MTS is the best choice concerning the number of vehicles required to handle containers. Our simulation results may help ITT planners make better estimations and plans for ITT to meet current and future challenges of transporting high container volumes.
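A back-of-the-envelope sketch, not the thesis simulation: fleet sizes for each vehicle type estimated from assumed capacities, travel times and handling times against a given daily container volume. All figures are illustrative assumptions and will not reproduce the thesis's scenario results.

```python
import math

# illustrative vehicle characteristics (capacity in TEU, times in minutes)
VEHICLES = {
    #            capacity  one-way travel  load/unload per visit
    "AGV":   dict(cap=2,  travel=18, handle=4),
    "MTS":   dict(cap=10, travel=22, handle=20),   # multi-trailer system: large but slow to (un)load
    "Truck": dict(cap=2,  travel=20, handle=10),
}

DAILY_TEU = 4000          # containers (TEU) to move between terminals per day, assumed
SHIFT_MIN = 20 * 60       # 20 operating hours per day, assumed

def fleet_size(v):
    cycle = 2 * v["travel"] + 2 * v["handle"]        # round trip with handling at both ends
    trips_per_vehicle = SHIFT_MIN / cycle
    teu_per_vehicle = trips_per_vehicle * v["cap"]
    return math.ceil(DAILY_TEU / teu_per_vehicle), cycle

for name, v in VEHICLES.items():
    n, cycle = fleet_size(v)
    print(f"{name:6s}: cycle {cycle:3d} min, vehicles needed ~ {n}")
```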
