91

Exurban land cover and land market evolution: Analysis, review and computational experimentation of spatial and agent heterogeneity from the bottom up

Huang, Qingxu 22 January 2013 (has links)
This dissertation investigates selected empirical and theoretical aspects of land-use and land-cover change (LUCC) in exurban areas. Two challenges – the observation and monitoring of LUCC, and spatially explicit modeling – are addressed using three main approaches: measurement, review and agent-based modeling (ABM). All three approaches focus on LUCC at the individual-household level, investigating from the bottom up how micro-scale elements interact to influence macro-scale functional patterns. First, the temporal change in the quantity and pattern of land-cover types within exurban residential parcels in three townships in southeastern Michigan is examined using landscape metrics and local indicators of spatial association at the parcel and parcel-neighborhood levels, respectively. The results demonstrate that the number and area of exurban residential parcels increased steadily from 1960 to 2000, and that different land-cover types followed distinctive temporal trajectories. The results also indicate a convergence process at the neighborhood level through which the quantity and pattern of land cover in parcels come to conform to the appearance of the neighborhood. Second, 51 urban residential choice models based on ABM are reviewed. The review divides these models into three categories: models based on classical theories; models focusing on different stages of the urbanization process; and integrated ABM and microsimulation models. It also compares how these models represent three essential features enabled by ABM: agent heterogeneity, the land market and output measurement. Challenges in incorporating these features – such as the trade-off between the simplicity and abstraction of a model and the complexity of the urban residential system, interactions among multiple features, and the demand for individual-level data – are also discussed.
Third, the effects of agent heterogeneity on spatial and socioeconomic outcomes under different levels of land-market representation are explored through three experiments using a stylized agent-based land-market model. The results reveal that budget heterogeneity has prominent effects on socioeconomic outcomes, while preference heterogeneity is highly pertinent to spatial outcomes. The relationship between agent heterogeneity and macro-measures becomes more complex as more land-market mechanisms are represented. The results also imply that land-market representation (e.g., competitive bidding) is indispensable for reproducing the results of classical urban land-market models (e.g., the monocentric city model) in a spatial ABM when agents are heterogeneous.
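The interplay of budget and preference heterogeneity with competitive bidding can be illustrated with a minimal sketch; all agents, utility forms and parameters below are invented for illustration and are not the dissertation's model.

```python
import random

random.seed(0)

# Hypothetical parameters: agents differ in budget (budget heterogeneity)
# and in the weight they place on access vs. amenity (preference
# heterogeneity). For simplicity, an agent may purchase multiple cells.
N_AGENTS, N_CELLS = 30, 50

agents = [{"budget": random.uniform(50, 150),
           "alpha": random.uniform(0.2, 0.8)}
          for _ in range(N_AGENTS)]

# Each land cell has a distance to the urban center and an amenity level.
cells = [{"dist": d, "amenity": random.random(), "owner": None, "price": 0.0}
         for d in range(1, N_CELLS + 1)]

def willingness_to_pay(agent, cell):
    # Utility falls with distance and rises with amenity; willingness to
    # pay is capped by the agent's budget.
    utility = (agent["alpha"] / cell["dist"]
               + (1 - agent["alpha"]) * cell["amenity"])
    return min(agent["budget"], 100.0 * utility)

# Competitive bidding: each cell goes to the highest bidder, who pays the
# second-highest bid (the outcome of an ascending auction).
for cell in cells:
    bids = sorted(((willingness_to_pay(a, cell), i)
                   for i, a in enumerate(agents)), reverse=True)
    cell["owner"], cell["price"] = bids[0][1], bids[1][0]

prices = [c["price"] for c in cells]
print(f"mean transaction price: {sum(prices) / len(prices):.1f}")
```

In this toy market, removing either source of heterogeneity, or replacing the auction with fixed prices, changes both the spatial pattern of ownership and the distribution of transaction prices, which is the kind of interaction the experiments probe.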
92

Feature Based Design Of Rotational Parts Based On Step

Fidan, Tahir 01 December 2004 (has links) (PDF)
The implicit, low-level part-definition data provided by geometric modeling cannot be used by downstream applications. The feature-based modeling concept was therefore introduced to integrate CAD with downstream applications. However, owing to the lack of standard implicit and explicit representations for features, and the unmanageable number of possible predefined features in the absence of standardization, the feature-based modeling approach has proved inadequate. STEP AP224 provides a standard for both implicit and explicit representations of manufacturing features. This thesis presents STEP AP224 feature-based modeling for rotational parts. It covers the features extracted from STEP AP224 for rotational parts and their definitions, classifications, attributes, generation techniques, attachment methods and geometric constraints. A feature modeler for rotational parts has been developed in which the STEP AP224 features generated are used as the basic entities for part design. The architecture of the proposed system consists of three phases: (1) the feature library, (2) the feature modeler and (3) the preprocessor. The preprocessor is responsible for creating the STEP-XML data file. The data file created can be used to integrate CAPP/CAM systems without a complex feature-recognition process. An object-oriented design approach is used in developing the feature modeler to support incremental system development and reusability.
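An object-oriented feature modeler of this kind can be sketched as follows; the classes, attributes and XML layout are illustrative stand-ins, not the actual AP224 entity definitions or the thesis's STEP-XML schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Feature:
    """Base class for a manufacturing feature (hypothetical attributes)."""
    name: str
    def validate(self) -> bool:      # hook for geometric constraints
        return True

@dataclass
class OuterDiameter(Feature):
    diameter: float = 0.0
    length: float = 0.0
    def validate(self) -> bool:
        return self.diameter > 0 and self.length > 0

@dataclass
class Groove(Feature):
    width: float = 0.0
    depth: float = 0.0
    position: float = 0.0            # attachment point along the axis
    def validate(self) -> bool:
        return self.width > 0 and self.depth > 0

@dataclass
class RotationalPart:
    features: List[Feature] = field(default_factory=list)
    def add(self, f: Feature):
        # Constraint checking happens at attachment time.
        if not f.validate():
            raise ValueError(f"invalid feature: {f.name}")
        self.features.append(f)
    def to_xml(self) -> str:
        # Simplified stand-in for the preprocessor's STEP-XML export.
        rows = "".join(f"  <feature type='{type(f).__name__}' name='{f.name}'/>\n"
                       for f in self.features)
        return f"<part>\n{rows}</part>"

part = RotationalPart()
part.add(OuterDiameter("od1", diameter=40.0, length=120.0))
part.add(Groove("g1", width=4.0, depth=2.0, position=60.0))
print(part.to_xml())
```

Because every feature carries its own attributes and validation, new feature types can be added incrementally without touching the modeler core, which is the reusability argument made for the object-oriented design.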
93

Adaptive Interactive Expectations: Dynamically Modelling Profit Expectations

William Paul Bell Unknown Date (has links)
This thesis aims to develop an alternative expectations model to the Rational Expectations Hypothesis (REH) and adaptive-expectations models, one that provides more accurate temporal predictive performance and more closely reflects recent advances in behavioural economics, the ‘science of complexity’ and network dynamics. The model the thesis develops is called Adaptive Interactive Expectations (AIE), a subjective dynamic model of the process of expectations formation. The AIE model is both an alternative and a complement to REH. AIE and REH complement one another in that they are diametrically opposite along five dimensions: agent intelligence, agent interaction, agent homogeneity, equilibrium assumptions and the rationalisation process. REH stresses hyper-intelligent agents interacting only via a price signal, whereas AIE stresses near zero-intelligence agents interacting via a network structure. The complementary nature of AIE and REH provides dual perspectives that enhance analysis. The Dun & Bradstreet (D&B 2008) profit expectations survey is used to calibrate AIE and make predictions, and the predictive power of the AIE and REH models is compared. The thesis introduces the ‘pressure to change profit expectations index’, px. This index makes it possible to model unknowns within an adaptive dynamic process and to combine the beliefs from interactive expectations, adaptive expectations and biases that include pessimism, optimism and ambivalence. AIE uses networks to model the flow of interactive expectations between firms. To overcome uncertainty over the structure of the interactive network, the thesis uses model-averaging over 121 network topologies. These networks are defined by three variables regardless of their complexity. Unfortunately, the Bayesian technique’s use of the number of variables as a measure of complexity makes it unsuitable for model-averaging over the network topologies.
To overcome this limitation of the Bayesian technique, the thesis introduces two model-averaging techniques, ‘runtime-weighted’ and ‘optimal-calibration’, which are benchmarked against ‘Bayes-factor model-averaging’ and ‘equal-weighted model-averaging’. In addition to the aggregate called all-firms, the D&B (2008) survey has four divisions: manufacturing durables, manufacturing non-durables, wholesale and retail. To make use of the four divisions, the thesis introduces a ‘link-intensity matrix’ based upon an ‘input-output table’ to improve the calibration of the networks; the transpose of the table is also used. The two ‘link-intensity matrices’ are benchmarked against the default, a ‘matrix of ones’. The aggregated and disaggregated versions of AIE are benchmarked against adaptive expectations to establish whether the interactive-expectations component of AIE adds value to the model. The thesis finds that AIE has more predictive power than REH. ‘Optimal-calibration model-averaging’ improves the predictive performance of the better-fitting versions of AIE, namely those that use the ‘input-output table’ and ‘matrix of ones’ link-intensity matrices. ‘Runtime-weighted model-averaging’ improves the predictive performance of only the ‘input-output table’ version of AIE. The interactive component of the AIE model improves the predictive performance of all versions of AIE over adaptive expectations. Introducing the ‘input-output table’ has an ambiguous effect on predictive performance, whereas introducing its transpose clearly reduces it. AIE can inform the debate on government intervention by providing an Agent-Based Model (ABM) perspective on the conflicting mathematical and narrative views proposed by the Greenwald–Stiglitz Theorem and the Austrian school, respectively.
Additionally, AIE can play a complementary role to REH, AIE being descriptive/predictive and REH normative. The AIE network calibration uses an ‘input-output table’ to determine link-intensity; this method could offer Computable General Equilibrium (CGE) and Dynamic Stochastic General Equilibrium (DSGE) models a way to improve their transmission mechanisms. Furthermore, the AIE network calibration and prediction methodology may help overcome the validation concerns of practitioners when they implement ABM.
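The model-averaging step can be sketched as follows, with invented forecast data and with inverse calibration error standing in as a simple proxy for the thesis's ‘optimal-calibration’ weights (the actual weighting schemes are defined in the thesis, not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: each candidate network topology yields a calibrated
# model producing a forecast series; we average the forecasts under
# different weighting schemes.
n_models, n_periods = 5, 40
truth = np.sin(np.linspace(0, 4, n_periods))

# Simulated per-model forecasts: truth plus model-specific noise.
noise_sd = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
forecasts = truth + rng.normal(0, noise_sd[:, None], (n_models, n_periods))

calib_err = np.mean((forecasts - truth) ** 2, axis=1)   # calibration fit

def average(forecasts, weights):
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()       # normalise to sum to one
    return weights @ forecasts

equal = average(forecasts, np.ones(n_models))           # equal-weighted
inv_err = average(forecasts, 1.0 / calib_err)           # better fit -> more weight

def mse(f):
    return float(np.mean((f - truth) ** 2))

print(f"equal-weighted MSE: {mse(equal):.4f}  inverse-error MSE: {mse(inv_err):.4f}")
```

With models of very different quality in the pool, fit-based weights concentrate on the better-calibrated topologies and the averaged forecast improves over the equal-weighted baseline, which is the intuition behind benchmarking the weighting schemes against each other.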
95

Modeling the Role and Influence of Children in Household Activity-Based Travel Model Systems

January 2010 (has links)
abstract: Rapid developments are occurring in the arena of activity-based microsimulation models. Advances in computational power, econometric methodologies and data collection have all contributed to the development of microsimulation tools for planning applications. There has also been interest in modeling children's daily activity-travel patterns and their influence on those of adults in the household using activity-based microsimulation tools. Most children are largely dependent on adults for their activity engagement and travel needs and would therefore be expected to considerably influence the activity-travel schedules of adult household members. In this context, a detailed comparison of the activity-travel characteristics of adults in households with and without children is made using National Household Travel Survey (NHTS) data. The analysis is used to quantify and decipher the nature of the impact of children's activities on the daily activity-travel patterns of adults. It is found that adults in households with children make a significantly higher proportion of high-occupancy vehicle (HOV) trips and a lower proportion of single-occupancy vehicle (SOV) trips than those in households without children. They also engage in more serve-passenger activities and fewer personal business, shopping and social activities. A framework for modeling the activities and travel of dependent children is proposed. The framework consists of six sub-models that simulate the choice of going to school/pre-school on a travel day, the dependency status of the child, the activity type, the destination, the activity duration, and joint activity engagement with an accompanying adult. Econometric formulations such as binary probit and multinomial logit are used to obtain behaviorally intuitive models that predict children's activity skeletons.
The model framework is tested using a 5% sample of a synthetic population of children for Maricopa County, Arizona and the resulting patterns are validated against those found in NHTS data. Microsimulation of these dependencies of children can be used to constrain the adult daily activity schedules. The deployment of this framework prior to the simulation of adult non-mandatory activities is expected to significantly enhance the representation of the interactions between children and adults in activity-based microsimulation models. / Dissertation/Thesis / M.S. Civil and Environmental Engineering 2010
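A multinomial-logit sub-model of the kind described can be sketched as follows; the alternatives and coefficients are hypothetical, invented for the example rather than estimated from the NHTS.

```python
import math
import random

random.seed(3)

# Hypothetical activity-type alternatives for a child's travel day.
ALTERNATIVES = ["school", "recreation", "social", "stay_home"]

def utilities(age, weekend):
    # Linear-in-parameters systematic utilities (illustrative coefficients).
    return {
        "school":     2.0 - 1.5 * weekend + 0.05 * age,
        "recreation": 0.5 + 0.8 * weekend,
        "social":     0.2 + 0.4 * weekend + 0.02 * age,
        "stay_home":  0.0,                    # reference alternative
    }

def mnl_probabilities(age, weekend):
    # Standard multinomial-logit choice probabilities.
    expu = {a: math.exp(u) for a, u in utilities(age, weekend).items()}
    total = sum(expu.values())
    return {a: v / total for a, v in expu.items()}

def simulate_choice(age, weekend):
    # Monte Carlo draw from the choice probabilities (microsimulation step).
    p = mnl_probabilities(age, weekend)
    r, cum = random.random(), 0.0
    for a in ALTERNATIVES:
        cum += p[a]
        if r <= cum:
            return a
    return ALTERNATIVES[-1]

probs = mnl_probabilities(age=10, weekend=0)
print(max(probs, key=probs.get))           # "school" dominates on a weekday
print(simulate_choice(age=10, weekend=1))  # one simulated weekend activity
```

In a full framework, draws like this would be chained across the six sub-models (participation, dependency, type, destination, duration, joint engagement) to assemble each child's activity skeleton.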
96

Modeling Frameworks for Supply Chain Analytics

January 2012 (has links)
abstract: Supply chains are increasingly complex as companies branch out into newer products and markets. In many cases, multiple products with moderate differences in performance and price compete for the same unit of demand. Simultaneous occurrences of multiple scenarios (competitive, disruptive, regulatory, economic, etc.), coupled with business decisions (pricing, product introduction, etc.), can drastically change demand structures within a short period of time. Furthermore, product obsolescence and cannibalization are real concerns due to short product life cycles. Analytical tools that can handle this complexity are important for quantifying the impact of business scenarios and decisions on supply chain performance. Traditional analysis methods struggle in this environment, where large, complex datasets with hundreds of features are becoming the norm in supply chains. We present an empirical analysis framework termed Scenario Trees that provides a novel representation for impulse and delayed scenario events and a direction for modeling multivariate constrained responses. Among candidate learners, supervised learners and feature-extraction strategies based on tree ensembles are employed to extract the most impactful scenarios and predict their outcomes on metrics at different levels of the product hierarchy. These models are able to provide accurate predictions in modeling environments characterized by incomplete datasets due to product substitution, missing values, outliers, redundant features, mixed variables and nonlinear interaction effects. Graphical model summaries are generated to aid model understanding. Models in complex environments benefit from feature-selection methods that extract non-redundant feature subsets from the data. Additional model simplification can be achieved by extracting the specific levels/values that contribute to variable importance.
We propose and evaluate new analytical methods to address this problem of feature-value selection and study their comparative performance using simulated datasets. We show that supply-chain surveillance can be structured as a feature-value selection problem. For situations such as new-product introduction, a bottom-up approach to scenario analysis is designed using an agent-based simulation and data-mining framework. This simulation engine combines utility theory, discrete choice models and diffusion theory, and acts as a test bed for enacting different business scenarios. We demonstrate the use of machine learning algorithms to analyze scenarios and generate graphical summaries to aid decision making. / Dissertation/Thesis / Ph.D. Industrial Engineering 2012
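Tree-ensemble variable importance of the kind relied on here can be sketched with a self-contained bagged-stump example; the data, feature roles and importance measure are invented for illustration and are far simpler than the framework's actual learners.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy stand-in for scenario analysis: predict whether a demand metric drops
# after a scenario from a feature table in which only feature 0 is
# informative; the rest are noise.
n, p = 400, 5
X = rng.normal(size=(n, p))
y = (X[:, 0] + 0.3 * rng.normal(size=n) > 0).astype(int)

def best_split_feature(X, y):
    # Pick the feature whose median split yields the lowest weighted
    # Gini impurity (a one-node decision tree, i.e. a stump).
    def gini(labels):
        if len(labels) == 0:
            return 0.0
        q = labels.mean()
        return 2 * q * (1 - q)
    scores = []
    for j in range(X.shape[1]):
        mask = X[:, j] > np.median(X[:, j])
        w = mask.mean()
        scores.append(w * gini(y[mask]) + (1 - w) * gini(y[~mask]))
    return int(np.argmin(scores))

# Bagging: count how often each feature is chosen as the best splitter
# across bootstrap replicates -- a minimal ensemble importance measure.
importance = np.zeros(p)
for _ in range(100):
    idx = rng.integers(0, n, n)
    importance[best_split_feature(X[idx], y[idx])] += 1
importance /= importance.sum()
print(int(np.argmax(importance)))   # expect feature 0 to dominate
```

Full tree ensembles generalize this idea by accumulating impurity reductions over all splits in all trees, which is what supports both the scenario ranking and the follow-on feature-value selection.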
97

An Agent-Based Model of Wildlife Migratory Patterns in Human-Disturbed Landscapes

Tierney, Lauren 18 August 2015 (has links)
In recent years, human decision-making has led to significant landscape impacts in the western United States. In particular, migratory wildlife populations have been increasingly affected by rural-urban development and energy-resource development. This research presents the application of agent-based modeling to explore how such impacts influence the characteristics of migratory animal movement, focusing on mule deer (Odocoileus hemionus) in western Wyoming. The study draws on complex adaptive systems and agent-based modeling frameworks to improve understanding of migratory patterns in a changing landscape, and explores the thresholds at which increasing habitat degradation and fragmentation interfere with migration patterns. The agent-based model uses GPS-collar data to examine how individual processes lead to population-level patterns of movement and adaptation. The assessment incorporates elements of both human and natural systems to explore potential future scenarios of human development, along with adaptive behaviors and animal-movement ecology in changing landscapes.
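A toy corridor model illustrates how a population-level migration threshold can emerge from individual-level blocking; the corridor geometry and probabilities below are invented for illustration and bear no relation to the calibrated Wyoming model.

```python
import random

random.seed(11)

# A migration corridor of LENGTH steps, each 5 cells wide. Each cell is
# independently "developed" (impassable) with probability p_developed; a
# migration succeeds only if every step has at least one open cell.
LENGTH = 100

def migration_success(p_developed, trials=500):
    success = 0
    for _ in range(trials):
        corridor = [[random.random() > p_developed for _ in range(5)]
                    for _ in range(LENGTH)]
        if all(any(step) for step in corridor):
            success += 1
    return success / trials

for p in (0.1, 0.5, 0.9):
    print(p, migration_success(p))
```

Because success requires every step to remain passable, the success rate does not decline linearly with development: it stays near one at low disturbance and collapses sharply past a threshold, which is the qualitative behavior the thesis probes with far richer movement rules.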
98

Assessing the Effects of Institutional and Spatial Arrangements in Analytical and Computational Models of Conservation

January 2013 (has links)
abstract: This work is an assemblage of three applied projects that address the institutional and spatial constraints to managing threatened and endangered (T&E) terrestrial species. The first project examines the role of the Endangered Species Act (ESA) in protecting wildlife and asks whether banning non-conservation activities on multi-use federal lands is socially optimal. A bioeconomic model is used to identify scenarios where ESA-imposed regulations emerge as optimal strategies and to facilitate discussion of feasible long-term strategies in light of the ongoing public land-use debate. Results suggest that banning harmful activities is a preferred strategy when valued species are in decline or exposed to poor habitat quality. However, such a strategy cannot be sustained in perpetuity; a switch to land-use practices characteristic of habitat conservation plans is recommended. The spatial portion of this study is motivated by the need for a more systematic quantification and assessment of landscape structure ahead of species reintroduction, and is further divided into two parts. The first explores how connectivity between habitat patches promotes coexistence among multiple interacting species. An agent-based model of a two-patch metapopulation is developed with local predator-prey dynamics and density-dependent dispersal. The simulation experiment suggests that connectivity levels at both extremes, representing very little risk and high risk of species mortality, do not augment the likelihood of coexistence, whereas intermediate levels do. Furthermore, the probability of coexistence increases and spans a wide range of connectivity levels when individual dispersal is less probabilistic and more dependent on population feedback. Second, a novel approach to quantifying network structure is developed using the statistical method of moments.
This measurement framework is then used to index habitat networks and assess their capacity to drive three main ecological processes: dispersal, survival and coexistence. Results indicate that the moments approach outperforms single summary metrics and accounts for a majority of the variation in process outcomes. The hierarchical measurement scheme helps indicate when additional structural information is needed to determine ecological function. However, the qualitative trend between network indicator and function is at times unintuitive and unstable in certain regions of the metric space. / Dissertation/Thesis / Ph.D. Applied Mathematics for the Life and Social Sciences 2013
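The moments idea can be sketched on degree distributions: two networks with the same number of links and the same mean degree are separated only by their higher moments. This is a simplified reading of the approach; the thesis's actual indices may be defined differently.

```python
import numpy as np

def degree_moments(adjacency, k_max=3):
    """Return the first k_max raw moments of the degree distribution."""
    degrees = np.asarray(adjacency).sum(axis=1)
    return [float(np.mean(degrees ** k)) for k in range(1, k_max + 1)]

# Two 4-patch habitat networks, each with 4 links but different shapes:
# a ring (every patch has degree 2) vs. a hub-dominated network.
ring = np.array([[0, 1, 0, 1],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [1, 0, 1, 0]])
hub = np.array([[0, 1, 1, 1],
                [1, 0, 0, 0],
                [1, 0, 0, 1],
                [1, 0, 1, 0]])

print(degree_moments(ring), degree_moments(hub))
# same first moment (mean degree); higher moments separate the two shapes
```

A single summary metric such as mean degree would call these networks identical, while the second and third moments distinguish them, which is exactly the extra discriminating power the hierarchical scheme exploits.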
99

A Multi-agent Based Model for Inter Terminal Transportation

Iqbal, Mubashir January 2015 (has links)
Context: As inter-terminal transportation (ITT) volumes within container ports have increased, so has the complexity of transportation processes between terminals. Problems with the current way of handling ITT resources are expected to grow in the near future. Different types of vehicles are already in place for transporting containers between terminals in a container port; however, these vehicle types need to be used efficiently and effectively to obtain maximum benefit from these resources. Objectives: In this thesis, we investigate and propose a solution model for ITT considering a combination of manned (MTS, trucks) and unmanned (AGV) vehicles. An agent-based model is proposed for ITT focusing on three ITT vehicle types. The objective of the proposed model is to investigate the capabilities and combinations of different vehicles for transporting containers between container terminals in a port. Methods: A systematic literature review is conducted to identify the problems, and the methods and approaches for solving them, in the domain of container transportation. As a case, an agent-based model is proposed for the Maasvlakte area of the port of Rotterdam. Simulations are performed on different scenarios to compare three road vehicle types, i.e., AGV, MTS and truck, in a network of ten terminals. Results: The literature review indicates that heuristics have been the most commonly used methods for solving container-transportation problems in the recent past. The review also shows that little research has been published on ITT compared with intra-terminal transportation. Simulation results from our proposed model indicate that AGVs outperform trucks in terms of loading/unloading time and the number of vehicles required to handle the given volume in all scenarios.
In most cases, roughly twice as many trucks as AGVs are required to transport containers between terminals. The results also show that fewer MTS vehicles than AGVs are required to handle containers in certain scenarios; however, the loading/unloading time for MTS is much higher than that of AGVs. Conclusions: Using agent-based simulation experiments, we propose a model that can help estimate the resources (vehicles) required to handle ITT container volumes and improve the utilization of different resources in a network of terminals. From the comparison of the three road vehicle types, it is concluded that trucks are incapable of handling higher container volumes in ITT. It is also concluded that AGVs can be an appropriate choice if automated operations are supported in the terminals; otherwise, MTS is the best choice with respect to the number of vehicles required. Our simulation results may help ITT planners make better estimates and plans for ITT to meet current and future challenges of transporting high container volumes.
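The headline resource comparison can be reproduced in spirit with back-of-the-envelope arithmetic; all capacities, cycle times and the demand figure below are illustrative assumptions, not the study's calibrated values.

```python
import math

SHIFT_MIN = 8 * 60   # minutes available per shift

# Assumed per-vehicle characteristics: containers per trip and round-trip
# cycle time in minutes (including loading/unloading).
vehicle_types = {
    "AGV":   (2, 30),
    "MTS":   (10, 45),   # multi-trailer system: large capacity, slower cycle
    "truck": (2, 60),
}

def vehicles_needed(volume_teu, capacity, cycle_min):
    # Trips one vehicle can complete in a shift, then ceil to whole vehicles.
    trips_per_vehicle = SHIFT_MIN // cycle_min
    return math.ceil(volume_teu / (capacity * trips_per_vehicle))

demand = 1200   # TEU to move between terminals in one shift (assumed)
for name, (cap, cycle) in vehicle_types.items():
    print(name, vehicles_needed(demand, cap, cycle))
```

Even this crude arithmetic reproduces the qualitative pattern reported: with equal capacity but a slower cycle, trucks need roughly twice the fleet of AGVs, while the high-capacity MTS needs far fewer vehicles at the cost of longer handling time per trip.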
100

Anomaly Detection in Industrial Processes Using Similarity-Based Models

León Olivares, Alejandro Samir January 2012 (has links)
Anomaly detection in industrial processes is a high-impact topic that has been analyzed and studied in diverse areas of engineering and research. Most currently available detection methods enable the study of irregularities found in a process's history, helping to extract significant (and sometimes critical) information in a wide variety of applications, and thereby becoming a fundamental and integral part of schemes for reducing both human and economic costs in contemporary industry. The overall objective of this work is to develop and implement a modular anomaly-detection approach, applicable to multivariate industrial processes and founded on the analysis of residuals generated from non-parametric similarity-based models (similarity-based modeling, SBM). The tool consists mainly of a system for the automatic generation of SBM models, an event-study methodology and a statistical detection algorithm. The work was carried out within a collaborative project between the companies CONTAC, INGENIEROS LTDA. and ENDESA-CHILE, which made it possible to evaluate the proposed system using operating data from a combined-cycle thermoelectric plant belonging to the latter company. Performance comparisons of the implemented modeling system show that the algorithm generates a more appropriate representation of the process: the error obtained with the SBM technique is close to 25% of the error obtained with a linear-in-the-parameters modeling technique.
In addition, the event-study methodology correctly detects the variables that do not contribute to the detection of a particular event, and identifies the most significant variables for that purpose, reducing the number of variables analyzed and, with it, the computational requirements of online operation. Validation of the results produced by the developed anomaly-detection method supports the claim that non-parametric SBM-type models, combined with the event-study methodology and the statistical detection algorithm, are effective at raising alarms and detecting the anomalies studied.
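A minimal similarity-based model can be sketched as follows; the Gaussian kernel, bandwidth and percentile threshold are illustrative choices, not the thesis's exact formulation, and the "process data" is synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)

# Memory of historical "normal" operating states (synthetic, 3 variables).
history = rng.normal(0, 1, (200, 3))

def sbm_estimate(x, memory, bandwidth=1.0):
    # Reconstruct x as a similarity-weighted combination of stored states.
    d2 = np.sum((memory - x) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))   # Gaussian similarity kernel
    return (w[:, None] * memory).sum(axis=0) / w.sum()

def residual_norm(x, memory):
    # Detection statistic: how poorly the memory reconstructs x.
    return float(np.linalg.norm(x - sbm_estimate(x, memory)))

# Statistical detection: threshold set from the in-sample residuals.
threshold = np.percentile([residual_norm(x, history) for x in history], 99)

normal_obs = np.zeros(3)                 # near the bulk of normal states
faulty_obs = np.array([6.0, -6.0, 6.0])  # far outside the normal region
print(residual_norm(faulty_obs, history) > threshold)
```

Observations resembling the stored normal states are reconstructed almost exactly and yield small residuals, while states outside the memory's coverage reconstruct poorly, which is what the residual-based alarm exploits.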
