101

Environmental Persistence of Foot and Mouth Disease Virus and the Impact on Transmission Cycles in Endemic Regions

Mielke, Sarah Rebecca January 2019 (has links)
No description available.
102

Time-Memory Behavior Yields Energetically Optimal Foraging Strategy in Honey Bees.

Van Nest, Byron N. 08 May 2010 (has links) (PDF)
Classical experiments on honey bee time-memory showed that foragers trained to collect food at a fixed time of day return the following day with a remarkable degree of time-accuracy. A series of field experiments revealed that not all foragers return to a food source on unrewarded test days. Rather, there exist two subgroups: "persistent" foragers reconnoiter the source; "reticent" foragers wait in the hive for confirmation of source availability. A forager's probability of being persistent is dependent both on the amount of experience it has had at the source and the environmental conditions present, but the probability is surprisingly high (0.4-0.9). Agent-based simulation of foraging behavior indicated these high levels of persistence represent an energetically optimal strategy, which is likely a compromise solution to an ever-changing environment. Time-memory, with its accompanying anticipation, enables foragers to improve time-accuracy, quickly reactivating the foraging group to more efficiently exploit a food source.
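For illustration, the energetic trade-off between persistent and reticent foragers can be sketched with a toy simulation (all parameter values below are hypothetical, not taken from the thesis): persistent foragers pay a flight cost to reconnoiter a source that may or may not be rewarding, while reticent foragers avoid that cost but collect less when the source turns out to be active.

```python
import random

# Hypothetical parameters (illustrative only, not values from the thesis)
FLIGHT_COST = 1.0        # energy spent on one reconnaissance trip
EARLY_GAIN = 10.0        # nectar gain if the source is found active early
LATE_GAIN = 6.0          # reduced gain for foragers that wait for confirmation
P_SOURCE_ACTIVE = 0.6    # chance the source is rewarding on a given day

def daily_energy(p_persist: float, days: int = 10000) -> float:
    """Mean net energy per forager-day for a given probability of persistence."""
    total = 0.0
    for _ in range(days):
        active = random.random() < P_SOURCE_ACTIVE
        if random.random() < p_persist:          # persistent: scout regardless
            total += (EARLY_GAIN if active else 0.0) - FLIGHT_COST
        else:                                    # reticent: wait in the hive
            total += LATE_GAIN if active else 0.0
    return total / days

if __name__ == "__main__":
    for p in (0.1, 0.4, 0.7, 0.9):
        print(f"persistence={p:.1f}  mean net energy={daily_energy(p):.2f}")
```

Sweeping the persistence probability shows how the balance between flight cost, source reliability, and the early-arrival advantage determines which mix of strategies pays off.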
103

STUDY OF THE STOCHASTIC DYNAMICS OF WEALTH REDISTRIBUTION USING A FOKKER-PLANCK EQUATION

HUGO LEONARDO LEITE LIMA 22 December 2020 (has links)
The dynamics of wealth distribution for the so-called Yard-Sale Model can be described by a Fokker-Planck equation for the probability density function P(w, t) of wealth w at time t. In this work, the effect of a nonlinear redistributive drift on these dynamics was investigated. Two schemes were considered: (I) a piecewise-linear tax, where only those with wealth above a certain threshold are taxed, and (II) a power-law tax, which includes both progressive and regressive types. In all cases, the collected amount of wealth is redistributed equally. We analyze how these rules modify the distribution of wealth across the population and, in particular, the level of inequality measured by the Gini index.
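For readers unfamiliar with the underlying microscopic model, a minimal Monte Carlo sketch of the Yard-Sale Model with a simple flat redistributive drift is given below; the flat tax is an illustrative stand-in for the piecewise-linear and power-law rules studied in the thesis, and all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def gini(w: np.ndarray) -> float:
    """Gini index of a non-negative wealth vector."""
    w = np.sort(w)
    n = len(w)
    cum = np.cumsum(w)
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

def yard_sale(n_agents=500, steps=200_000, frac=0.1, tax=0.01):
    """Yard-Sale exchanges with a flat redistributive drift (illustrative).

    frac: fraction of the poorer agent's wealth wagered per transaction.
    tax:  flat fraction of every agent's wealth collected in each sweep and
          redistributed equally.
    """
    w = np.ones(n_agents)
    for step in range(steps):
        i, j = rng.integers(n_agents, size=2)
        if i == j:
            continue
        dw = frac * min(w[i], w[j])          # wager limited by the poorer agent
        if rng.random() < 0.5:
            w[i] += dw; w[j] -= dw
        else:
            w[i] -= dw; w[j] += dw
        if step % n_agents == 0:             # periodic redistribution sweep
            collected = tax * w
            w -= collected
            w += collected.sum() / n_agents
    return w

if __name__ == "__main__":
    for tax in (0.0, 0.01, 0.05):
        print(f"tax={tax:.2f}  Gini={gini(yard_sale(tax=tax)):.3f}")
```

Raising the redistribution rate typically lowers the Gini index relative to the untaxed case, which is the kind of effect the thesis studies analytically through the Fokker-Planck description.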
104

Three Essays on Exploration and Exploitation: Behavioral Insights and Individual Decision-Making

Guida, Vittorio 14 December 2022 (has links)
Since James G. March introduced the concepts of exploration and exploitation in 1991, they have become ubiquitous in research on organizations and management. According to March (1991), exploration and exploitation are two sets of activities that allow systems (i.e., agents, either organizations or individuals) to adapt to their environment. On the one hand, exploitation activities are based on pre-existing knowledge and consist of its implementation and/or refinement (e.g., production). On the other hand, exploration is based on knowledge that is not currently possessed by the system and, hence, refers to those activities that allow the system to acquire such new knowledge (e.g., search and experimentation). Scholars have produced a large number of contributions that have expanded our knowledge of exploration and exploitation, extending well beyond the initial boundaries of the field of organizational learning. Today, this large body of contributions, developed over 30 years, appears complex and divided into a plethora of research subfields (e.g., Almahendra and Ambos, 2015). Thus, research on exploration and exploitation has reached a level of conceptual and methodological sophistication that demands a high level of effort from researchers wishing to approach it. Among the multiple strands of emerging research, some scholars (such as Wilden et al., 2018) have recently begun to propose a return to a behavioral approach to the study of exploration and exploitation. The earliest behavioral approach adopted in organizational studies is that of the "Carnegie School", which included Herbert Simon, Richard Cyert, and James March himself. This approach centers the investigation of organizations on human behavior. In other words, adopting a behavioral approach involves studying organizations through the attitudes of their members, their cognition, rationality, motivation, relationships, conflicts, and the many other psychological, economic, and social factors that influence human behavior (see, for example, March and Simon, 1958; Cyert and March, 1963). Today, this return to the behavioral approach is also associated with the "micro-foundations of strategy" movement (e.g., Felin et al., 2015) and so-called behavioral strategy (Powell et al., 2011). In essence, while the former is based on the importance of studying organizations and strategy at a level of analysis below the collective/systemic (i.e., organizational) level, the latter includes all the elements that already characterized the behavioral approach (i.e., psychological and social factors), reinforced by insights from the behavioral economics literature and the adoption of multiple methods, including experiments. This doctoral dissertation enters this discussion and aims to investigate exploration and exploitation by adopting a behavioral approach, a "micro-foundational" perspective, and research methods that include laboratory experiments and computer simulations. The first study is a literature review paper with three purposes, each pursued in one of its three sections. First, it addresses the conceptual development of the exploration-exploitation literature that led to the emergence of the complex body of contributions mentioned above, providing a kind of "road map" of the research field based on the major literature reviews published over the past three decades. This is intended as a contribution for researchers taking their first steps in exploration-exploitation research.
At the end of this road map, the paper by Wilden et al. (2018) is presented, linking the entire field to an emerging stream of research directed toward a return to James March's behavioral approach, enhanced by contributions in the areas of "micro-foundations" (e.g., Felin et al., 2015) and behavioral strategy (Powell et al., 2011). Second, based on the approach promoted in this research stream, a review of the literature on experimental studies of exploration and exploitation is provided. Laboratory experiments are considered key methods for advancing the study of exploration and exploitation under a behavioral approach. Finally, the first essay concludes with three suggested directions for further research: the improvement of existing conceptualizations through modeling; the further sophistication of existing experimental designs to capture features of managerial decision making that lie beyond the scope of the state-of-the-art models underlying most current experimental investigations; and the adoption of a multilevel approach to the study of individual exploration and exploitation, which consists of examining the variables that influence individual behavior at different organizational levels. The second study consists of an experimental investigation of the role of different sources of uncertainty in individual exploration-exploitation. It is based on the rationale underlying the third research direction proposed in the first study. Although the adoption of laboratory experiments in the field is increasing, it is argued here that scholars have not experimentally disentangled the effects of two different types of uncertainty that emerge in the managerial and psychological literature, namely internal uncertainty and external uncertainty. The former consists of individuals' inability to predict their future performance, while the latter stems from the external environment and consists of unknown information about phenomena that may affect the final outcome of a decision. The experimental design exposes a control group of participants to internal uncertainty alone and a treatment group to the combined presence of the two sources. Findings show that the combined presence of these two sources of uncertainty may lead to the over-exploitation of initial routines and, consequently, to the inability of individuals to exploit new opportunities stemming from alternatives discovered over time. Finally, the third study focuses on imitation, exploration, and exploitation, and builds on an agent-based model and computer simulations. This essay follows the first research direction suggested in the first study. While prominent research has defined imitation as a less costly alternative to experimentation (i.e., exploration), the possible role of imitation in the exploration-exploitation trade-off appears to be under-investigated. The interplay between imitation and exploration is rendered by modeling two types of agents: imitators and explorers. Unlike previous modeling studies, both agent types are explicitly modeled as Simonian "satisficers". Experimentation is modeled as random search, whereas imitation builds on research on imitative heuristics. When adapting in a competitive environment, both types of agents experience "over-crowding" effects that depend on the characteristics of their type.
The paper concludes by acknowledging the limitations of the adopted model and proposing further lines of investigation, including calibration of the model with experimental data.
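As background for the behavioral experiments and simulations described above, the individual-level exploration-exploitation trade-off is often formalized as a multi-armed bandit problem. The sketch below is a generic epsilon-greedy agent, offered purely as an illustration of the trade-off; it is not the experimental task or the agent-based model used in the dissertation.

```python
import random

def epsilon_greedy(true_means, epsilon=0.1, horizon=1000, seed=1):
    """Epsilon-greedy agent on a stationary multi-armed bandit.

    With probability epsilon the agent explores a random arm (search,
    experimentation); otherwise it exploits the arm with the best
    estimated payoff observed so far.
    """
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms
    estimates = [0.0] * n_arms
    total_reward = 0.0
    for _ in range(horizon):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)                            # explore
        else:
            arm = max(range(n_arms), key=lambda a: estimates[a])   # exploit
        reward = rng.gauss(true_means[arm], 1.0)
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]  # running mean
        total_reward += reward
    return total_reward

if __name__ == "__main__":
    arms = [0.2, 0.5, 0.8]
    for eps in (0.0, 0.1, 0.3):
        print(f"epsilon={eps:.1f}  cumulative reward={epsilon_greedy(arms, eps):.1f}")
```

Comparing cumulative rewards across epsilon values makes the cost of pure exploitation (epsilon = 0) and of excessive exploration visible in a few lines.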
105

Simulation of airborne transmission of infection in a confined space using an agent-based model

Lützow, Joel, Mikiver, Cecilia January 2020 (has links)
As the world observes a new pandemic with COVID-19, it is clear that pathogens can spread rapidly and without recognition of borders. Outbreaks will continue to occur, so the transmission methods of these diseases must be thoroughly understood in order to minimize their impact. Some infections, such as influenza, tuberculosis and measles, are known to spread through droplets in the air. In a confined space the concentration can grow as more droplets are released. This study examined a simulated confined space modelled as a hospital waiting area, where people who may have underlying conditions congregate and mix with potentially infectious individuals. It further investigated the impact of the volume of the waiting area, the number of people in the room, their placement, and their body weight. The simulation is an agent-based model (ABM), a computational model with the purpose of analysing a system through the actions and cumulative consequences of autonomous agents. The presented ABM features embodied agents with differing body weights that can move, breathe and cough in a ventilated room. An investigation into current epidemiological models led to the hypothesis that one could be implemented as a corresponding ABM, where it could possibly also be improved upon. In this paper, it is shown that all parameters of the Gammaitoni and Nucci model can be taken into account in an ABM via the MASON library. In addition, evidence is presented to suggest that some flaws of the epidemiological model can be mended in the ABM. It is demonstrated that the constructed model can account for proximity between susceptible people and infectors, an expressed limitation of the original model.
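For context, a common statement of the Gammaitoni and Nucci model tracks the concentration of infectious quanta in a well-mixed room of volume V, with I infectors each emitting q quanta per hour and a ventilation rate expressed in air changes per hour; the infection risk for a susceptible person breathing at rate p then follows a Wells-Riley-type exponential of the inhaled dose. The sketch below integrates this model numerically; all parameter values are illustrative assumptions, not those used in the thesis.

```python
import math

# Illustrative parameters (assumptions, not values from the thesis)
V = 100.0   # room volume, m^3
I = 1       # number of infectors present
q = 10.0    # quanta emitted per infector per hour
ach = 3.0   # air changes per hour (ventilation)
p = 0.5     # breathing rate of a susceptible person, m^3 per hour
T = 2.0     # exposure duration, hours
dt = 0.001  # integration step, hours

def gammaitoni_nucci_risk() -> float:
    """Risk from dN/dt = I*q - ach*N and inhaled dose = (p/V) * integral of N dt."""
    n_quanta = 0.0  # quanta present in the room at t = 0
    dose = 0.0
    for _ in range(int(T / dt)):
        n_quanta += (I * q - ach * n_quanta) * dt  # forward-Euler update
        dose += (p / V) * n_quanta * dt
    return 1.0 - math.exp(-dose)

if __name__ == "__main__":
    print(f"Predicted infection risk after {T:.0f} h of exposure: {gammaitoni_nucci_risk():.3f}")
```

An ABM such as the one in the thesis replaces this well-mixed assumption with spatially explicit agents, which is precisely what allows it to account for proximity between susceptibles and infectors.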
106

AN AGENT–BASED COMPUTATIONAL MODEL FOR BANK FORMATION AND INTERBANK NETWORKS

Ismail, Omneia R.H. 10 1900 (has links)
The aim of this thesis is to study the role of banking in society and the effect of the interbank market on the performance of the banking system. It starts by reviewing several studies conducted on empirical banking networks and highlighting their salient features in the context of modern network theory. A simulated network resembling the characteristics documented in the empirical studies is then built and its resilience is analyzed, with a particular emphasis on documenting the crucial role played by highly interconnected banks.

It is our belief that the study of systemic risk and contagion in a banking system is an integral part of the study of the economic role of banks themselves. Thus the current work focuses on the fundamentals of banking and aims at identifying the necessary drivers for a dynamical setup of the interbank market.

Through an agent-based model, we address the issues of bank formation, bank runs and the emergence of an interbank market. Starting with heterogeneous individuals, bank formation is viewed as an emergent phenomenon arising to meet the need for investment opportunities in the face of uncertain liquidity preferences. When banks work in isolation (no interbank market), in the long run and through a long experience with bank failures, banking turns into a monopoly or a market with few players.

By equipping banks with their own learning tools and allowing an interbank market to develop, fewer bank failures and a less concentrated banking system are witnessed. In addition, through a scenario analysis, it is demonstrated that allowing banks to interact does not weaken the banking system in almost all cases, and improves performance on multiple occasions.

The work is concluded by studying the effects of a banking system on individuals and the economy through what are called social measures. We establish that the effects of banking on social measures such as consumption level, consumption inequality between individuals, long-term investment and economic waste vary significantly based on the structure of the society. / Doctor of Philosophy (PhD)
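The resilience analysis described in the first part of the thesis is often illustrated with a counterfactual default cascade on an interbank exposure network. The sketch below is a generic Furfine-style cascade, not the thesis's agent-based model: one bank is assumed to fail, and losses propagate through bilateral exposures until no further bank's capital buffer is exhausted. The exposure matrix, capital buffers, and loss-given-default are arbitrary assumptions.

```python
import numpy as np

def default_cascade(exposures: np.ndarray, capital: np.ndarray,
                    seed_bank: int, lgd: float = 1.0) -> set:
    """Furfine-style contagion: exposures[i, j] is bank i's lending to bank j.

    When bank j defaults, each creditor i loses lgd * exposures[i, j]; a bank
    defaults once its accumulated losses exceed its capital buffer.
    """
    n = len(capital)
    losses = np.zeros(n)
    defaulted = {seed_bank}
    newly_defaulted = {seed_bank}
    while newly_defaulted:
        for j in newly_defaulted:
            losses += lgd * exposures[:, j]      # creditors absorb the loss
        next_round = {i for i in range(n)
                      if i not in defaulted and losses[i] > capital[i]}
        defaulted |= next_round
        newly_defaulted = next_round
    return defaulted

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 20
    # Sparse random exposure matrix and thin capital buffers (illustrative)
    exposures = rng.random((n, n)) * (rng.random((n, n)) < 0.15)
    np.fill_diagonal(exposures, 0.0)
    capital = 0.4 * exposures.sum(axis=1) + 0.05
    failed = default_cascade(exposures, capital, seed_bank=0)
    print(f"{len(failed)} of {n} banks fail after the initial default")
```

Repeating the cascade for different network densities or seed banks is the simplest way to see why highly interconnected banks play the crucial role the thesis documents.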
107

Assessing Working Models' Impact on Land Cover Dynamics through Multi-Agent Based Modeling and Artificial Neural Networks:  A Case Study of Roanoke, VA

Nusair, Heba Zaid 30 May 2024 (has links)
The transition towards flexible work arrangements, notably work-from-home (WFH) practices, has prompted significant discourse on their potential to reshape urban landscapes. While existing urban growth models (UGMs) offer insights into environmental and economic impacts, there is a need to study urban phenomena from the bottom up, considering the essential influence of individuals' behavior and decision-making processes at disaggregate and local levels (Brail, 2008, p. 89). Addressing this gap, this study aims to comprehensively understand how evolving work modalities influence urban form and land use patterns by focusing on socioeconomic and environmental factors. This research employs an Agent-Based Model (ABM) and an Artificial Neural Network (ANN), integrated with GIS technologies, to predict future Land Use and Land Cover (LULC) changes within Roanoke, Virginia. The study uniquely explores the dynamic interplay between macro-level policies and micro-level individual behaviors (categorized by employment types, social activities, and residential choices), shedding light on their collective impact on urban morphology. Contrary to conventional expectations, findings reveal that the current low rate of WFH practices has not significantly redirected urban development trends towards sprawl but rather has reinforced urban densification, largely driven by on-site work modalities. This observation is corroborated by WFH ratios not exceeding 10% in any analyzed census tract. Regarding model performance, the integration of micro-agents into the model substantially improved its accuracy from 86% to 89.78%, enabling a systematic analysis of residential preferences between WFH and on-site working (WrOS) agents. Furthermore, logistic regression analysis and decision-score maps delineate the distinct spatial preferences of these agent groups, highlighting a pronounced suburban and rural preference among WFH agents, in contrast to the urban-centric inclination of WrOS agents. By integrating ABM and ANN with GIS technologies, this research advances the precision and complexity of urban growth predictions. The findings contribute valuable insights for urban planners and policymakers and underline the intricate relationships between work modalities and urban structure, challenging existing paradigms and setting a precedent for future urban planning methodologies. / Doctor of Philosophy / As more people start working from home, cities might change in unexpected ways. This study of Roanoke, Virginia, explores how work-from-home (WFH) practices affect urban development. Traditional city growth models look at big-picture trends, but this study dives into the details of workers' individual behaviors and their residential choices. Using advanced computer models such as machine learning and geographic information systems (GIS), predictions are made about how different work arrangements influence where workers live and how cities expand. Surprisingly, fewer people work from home than expected, and this has not caused cities to spread out more. Instead, Roanoke is expected to become denser in the next ten years, because on-site workers tend to live in urban centers, while those who work from home prefer suburban and rural areas and, occasionally, urban locations. Different work arrangements lead to distinct residential preferences. By including workers' individual behaviors in the models, the model's accuracy increased from 86% to 89.78%.
Logistic regression analysis highlights the factors influencing land use changes, such as proximity to roads, slopes, home values, and wages. This research helps city planners and policymakers understand working arrangement trends and create better policies to manage urban development. It shows the complex relationship between work practices and city structures, providing valuable insights for future city planning.
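As a rough illustration of the driver analysis mentioned above, a logistic regression can relate the probability that a land cell converts to developed use to candidate drivers such as road proximity, slope, and home value. The data, variable names, and coefficients below are synthetic assumptions, not the study's GIS layers or results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_cells = 5000

# Synthetic per-cell drivers (illustrative stand-ins for the study's GIS layers)
dist_to_road = rng.exponential(2.0, n_cells)   # km
slope = rng.uniform(0.0, 30.0, n_cells)        # degrees
home_value = rng.normal(250.0, 60.0, n_cells)  # thousands of dollars

# Synthetic ground truth: development is more likely near roads and on flat land
logit = 1.5 - 0.8 * dist_to_road - 0.1 * slope + 0.004 * home_value
developed = rng.random(n_cells) < 1.0 / (1.0 + np.exp(-logit))

# Fit the driver model and report the estimated coefficients
X = np.column_stack([dist_to_road, slope, home_value])
model = LogisticRegression(max_iter=1000).fit(X, developed)
for name, coef in zip(["dist_to_road", "slope", "home_value"], model.coef_[0]):
    print(f"{name:>13}: {coef:+.3f}")
```

In an ABM-plus-ANN pipeline like the one described above, coefficients of this kind feed the per-cell transition scores that the agents evaluate when choosing residential locations.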
108

Integrated Multimodal Analysis: Evaluating the Impacts of Chemotherapy and Electroporation-Based Therapy on Lymphatic and Blood Microvasculature in Cancer

Esparza, Savieay Luis 05 June 2024 (has links)
The lymphatic and blood vascular systems are two important vessel networks that serve different roles in healthy states and in cancer. In breast cancer, the most common cancer among women, mortality remains high despite increased treatment response, owing to metastatic spread that occurs preferentially through the lymphatics. One aggressive subtype, triple-negative breast cancer (TNBC), accounts for 15 to 30 percent of cases and is characterized by the absence of expression of three therapeutic biomarkers. As targeted therapy is limited, treatment relies on the standard of care of surgery, radiotherapy, and chemotherapy, with limited efficacy and survival benefit. Chemotherapies negatively alter the lymphatic vasculature to the tumor's benefit, through lymphangiogenesis. This dissertation seeks to understand how commonly used chemotherapeutics, such as carboplatin, and a novel second-generation ablative therapy called High Frequency Irreversible Electroporation (H-FIRE), which uses electric pulses to ablate tumor cells, affect the lymphatic and blood microvasculature in the tumor, the surrounding fat pad, and the tumor-draining lymph node (TDLN), using multiple analysis methods. This was accomplished through three main approaches: 1) identification of the oxidative-stress effects of carboplatin on lymphatic endothelial cells in vitro; 2) characterization of lymphatic and blood microvascular dynamics in a 4T1 breast cancer mouse model treated with sub-ablative H-FIRE; and 3) development of a novel habitat imaging method to identify treatment-specific changes in the tumor-draining lymph node, together with a hybrid agent-based model (ABM) to test flow-mediated cancer cell invasion in brain cancer. Herein, the work showed that carboplatin-induced lymphatic phenotypic changes occurred through the generation of reactive oxygen species in a VEGFR3-dependent manner and were reversed by treatment with the antioxidant N-acetylcysteine. In the 4T1 model, sub-ablative H-FIRE induced temporal remodeling of the lymphatic and blood vasculature within the viable tumor, the surrounding fat pad, and the tumor-draining lymph node over seven days, suggesting an optimal window for applying adjuvant therapy. A habitat imaging analysis method was developed to identify TDLN vascular habitats and their perturbation by treatment in a retrospective analysis of prior work. Lastly, a hybrid ABM was developed by incorporating experimentally measured fluid-flow fields from dynamic contrast-enhanced MRI, building upon existing work and demonstrating its usefulness in comparing mechanisms of fluid-flow-mediated cancer cell invasion. Altogether, this work presents novel insight into the lymphatic system in cancer within various treatment contexts, along with new methods for quantifying treatment-induced changes. These findings can hopefully inform the field towards a more comprehensive understanding of treatment effects in breast cancer. / Doctor of Philosophy / The lymphatic and blood vascular systems are two important vessel networks that serve different purposes in healthy states and in cancer. In breast cancer, a common form of cancer in women, the disease tends to spread through the lymphatic vasculature and eventually to other parts of the body. Triple-negative breast cancer (TNBC), a less common but more aggressive form, relies on standard clinical treatments with anti-tumor drugs called chemotherapies.
These chemotherapies negatively alter the lymphatic vasculature to the tumor's benefit, leaving a need for new treatment methods. This dissertation seeks to understand how commonly used chemotherapeutics and a promising new pulsed electric field therapy, High Frequency Irreversible Electroporation (H-FIRE), change the lymphatic and blood vessels over time and in different locations, using different tools. This was done through three main approaches: 1) studying the effects of chemotherapy on lymphatic vascular cells; 2) studying a breast cancer mouse model treated with H-FIRE; and 3) building mathematical models of the draining lymphatic organ, called the lymph node, and an agent-based mathematical model (ABM) of cancer cell movement driven by fluid flow. The work showed that, in the lymphatic cells, carboplatin, a type of chemotherapeutic used to treat breast cancer, changed the lymphatic vasculature by generating oxidative stress, and that this change was reversed by treatment with an antioxidant. In the breast cancer mouse model, incomplete ablation with H-FIRE caused time-dependent changes to the lymphatic and blood vasculature in the tumor, in the surrounding tissue, and in the lymph node over seven days. This work shows the novel finding that pulsed electric field therapy causes changes to the lymphatic vasculature. A new method of identifying habitats of the lymph node was created and used to compare treatment-induced changes to the lymphatic and blood vasculature. Lastly, an ABM was created that adds measured fluid-flow maps from medical imaging to build upon existing work, showing its usefulness for comparing mechanisms of cancer cell invasion due to fluid flow. Altogether, this work presents novel insight into the lymphatic system in cancer after various treatments are applied, and new methods of measuring these changes. It is our hope that these findings can be used to further inform the field towards a more comprehensive understanding of treatment effects in breast cancer.
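The flow-mediated invasion idea behind the hybrid ABM can be illustrated with a minimal sketch in which each cancer-cell agent takes an unbiased random-motility step plus an advection step along a prescribed fluid-flow field. The flow field, step sizes, and cell counts below are arbitrary assumptions, not quantities derived from the DCE-MRI data used in the dissertation.

```python
import numpy as np

rng = np.random.default_rng(7)

def flow_field(xy: np.ndarray) -> np.ndarray:
    """Toy interstitial flow field: uniform drift in +x, illustrative only."""
    v = np.zeros_like(xy)
    v[:, 0] = 1.0
    return v

def simulate_invasion(n_cells=200, steps=500, dt=0.01,
                      motility=0.5, flow_bias=0.3):
    """Cells move by unbiased random motility plus advection along the flow."""
    xy = np.zeros((n_cells, 2))                      # all cells start at the origin
    for _ in range(steps):
        random_step = motility * np.sqrt(dt) * rng.standard_normal(xy.shape)
        advection = flow_bias * flow_field(xy) * dt
        xy += random_step + advection
    return xy

if __name__ == "__main__":
    for bias in (0.0, 0.3, 1.0):
        cells = simulate_invasion(flow_bias=bias)
        print(f"flow_bias={bias:.1f}  mean displacement along flow = {cells[:, 0].mean():.2f}")
```

Comparing runs with and without the advection term shows, in miniature, how a measured flow map can bias simulated invasion, which is the comparison the hybrid ABM makes with real imaging-derived fields.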
109

Probing the roles of actin dynamics in the cytoskeleton of animal and plant cells

June hyung Kim (18432030) 26 April 2024 (has links)
The actin cytoskeleton is a dynamic structure that regulates various important cellular processes, such as cell protrusion, migration, transport, and cell shape changes. Cells employ different actin architectures best suited for each of these functions. We have employed an agent-based model to illuminate how the actin cytoskeleton performs these functions in animal and plant cells via dynamic interactions between molecular players.

Lamellipodia found in animal cells are two-dimensional actin protrusions formed at the leading edge of cells, playing an important role in sensing the surrounding mechanical environment via focal adhesions. The molecular players, architecture, and dynamics of the lamellipodia have been investigated extensively in recent decades. Nevertheless, it remains elusive how the components of the lamellipodia mechanically interact with each other to attain a stable, dynamic steady state characterized by a retrograde flow emerging in the branched actin network. Using the agent-based model, we investigated how the balance between different subcellular processes is achieved in this dynamic steady state. We simulated a branched network found in the lamellipodia, consisting of actin filaments (F-actin), myosin motors, the Arp2/3 complex, and actin cross-linking proteins. We found the importance of a balance between F-actin assembly at the leading edge of cells and F-actin disassembly at the rear end of the lamellipodia. We also found that F-actin severing is crucial for the proper disassembly of actin bundles formed via network contraction induced by motor activity. In addition, we found that various dynamic steady states can exist.

The actin cytoskeleton in plant cells plays a crucial role in intracellular transport and cytoplasmic streaming, and its structure is very different from that of the actin cytoskeleton in animal cells. The plant actin cytoskeleton is known to show distinct dynamic behaviors with homeostasis. We used the agent-based model to simulate the plant actin cytoskeleton, considering the key governing mechanisms, including F-actin polymerization/depolymerization, different types of F-actin nucleation events, severing, and capping. We succeeded in reproducing experimental observations in terms of F-actin density, length, nucleation frequency, and rates of severing, polymerization, and depolymerization. We found that the removal of nucleators results in lower F-actin density in the network, which supports recent experimental findings.
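As a minimal caricature of the turnover processes modeled in the plant-cytoskeleton study, a stochastic simulation can track filament lengths under nucleation, polymerization, depolymerization, and severing, and report the filament count and mean length at the end of the run. All rate constants below are arbitrary illustrative values, not parameters from the dissertation.

```python
import random

# Illustrative per-step probabilities (assumptions, not values from the dissertation)
P_NUCLEATE = 0.2    # chance a new short filament is nucleated each step
P_GROW = 0.45       # barbed-end polymerization (adds one subunit)
P_SHRINK = 0.55     # pointed-end depolymerization (removes one subunit)
P_SEVER = 0.002     # per-subunit severing probability

def simulate(steps=50000, seed=3):
    rng = random.Random(seed)
    filaments = []                          # filament lengths in subunits
    for _ in range(steps):
        if rng.random() < P_NUCLEATE:
            filaments.append(3)             # nucleate a short seed
        updated = []
        for length in filaments:
            if rng.random() < P_GROW:
                length += 1
            if rng.random() < P_SHRINK:
                length -= 1
            if length <= 0:
                continue                    # filament fully disassembled
            if rng.random() < P_SEVER * length:
                cut = rng.randint(1, length)
                updated.extend(x for x in (cut, length - cut) if x > 0)
            else:
                updated.append(length)
        filaments = updated
    return filaments

if __name__ == "__main__":
    fils = simulate()
    if fils:
        print(f"{len(fils)} filaments, mean length {sum(fils) / len(fils):.1f} subunits")
```

Because disassembly slightly outpaces assembly per filament while nucleation and severing add new ends, the filament population settles into a fluctuating steady state, the same kind of homeostatic balance the full agent-based model reproduces quantitatively.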
110

Real-time Traffic State Prediction: Modeling and Applications

Chen, Hao 12 June 2014 (has links)
Travel-time information is essential in Advanced Traveler Information Systems (ATISs) and Advanced Traffic Management Systems (ATMSs). A key component of these systems is the prediction of the spatiotemporal evolution of roadway traffic state and travel time. From the perspective of travelers, such information can result in better traveler route choice and departure time decisions. From the transportation agency perspective, such data provide enhanced information with which to better manage and control the transportation system to reduce congestion, enhance safety, and reduce the carbon footprint of the transportation system. The objective of the research presented in this dissertation is to develop a framework that includes three major categories of methodologies to predict the spatiotemporal evolution of the traffic state. The proposed methodologies include macroscopic traffic modeling, computer vision, and recursive probabilistic algorithms. Each developed method attempts to predict traffic state, including roadway travel times, for different prediction horizons. In total, the developed multi-tool framework produces traffic state prediction algorithms ranging from short-term (0-5 minutes) to medium-term (1-4 hours), considering departure times up to an hour into the future. The dissertation first develops a particle filter approach for use in short-term traffic state prediction. The flow continuity equation is combined with the Van Aerde fundamental diagram to derive a time series model that can accurately describe the spatiotemporal evolution of traffic state. The developed model is applied within a particle filter approach to provide multi-step traffic state prediction. The testing of the algorithm on a simulated section of I-66 demonstrates that the proposed algorithm can accurately predict the propagation of shockwaves up to five minutes into the future. The developed algorithm is further improved by incorporating on- and off-ramp effects and more realistic boundary conditions. Furthermore, the case study demonstrates that the improved algorithm produces a 50 percent reduction in the prediction error compared to the classic LWR density formulation. Considering that the prediction accuracy deteriorates significantly for longer prediction horizons, historical data are integrated into the measurement update of the developed particle filter approach to extend the prediction horizon up to half an hour into the future. The dissertation then develops a travel time prediction framework using pattern recognition techniques to match historical data with real-time traffic conditions. The Euclidean distance is initially used as the measure of similarity between current and historical traffic patterns. This method is further improved using a dynamic template matching technique developed as part of this research effort. Unlike previous approaches, which use fixed template sizes, the proposed method uses a dynamic template size that is updated each time interval based on the spatiotemporal shape of the congestion upstream of a bottleneck. In addition, the computational cost is reduced using a Fast Fourier Transform instead of a Euclidean distance measure. Subsequently, the historical candidates that are similar to the current conditions are used to predict the experienced travel times.
Test results demonstrate that the proposed dynamic template matching method produces significantly better and more stable prediction results for prediction horizons up to 30 minutes into the future for a two hour trip (prediction horizon of two and a half hours) compared to other state-of-the-practice and state-of-the-art methods. Finally, the dissertation develops recursive probabilistic approaches including particle filtering and agent-based modeling methods to predict travel times further into the future. Given the challenges in defining the particle filter time update process, the proposed particle filtering algorithm selects particles from a historical dataset and propagates particles using data trends of past experiences as opposed to using a state-transition model. A partial resampling strategy is then developed to address the degeneracy problem in the particle filtering process. INRIX probe data along I-64 and I-264 from Richmond to Virginia Beach are used to test the proposed algorithm. The results demonstrate that the particle filtering approach produces less than a 10 percent prediction error for trip departures up to one hour into the future for a two hour trip. Furthermore, the dissertation develops an agent-based modeling approach to predict travel times using real-time and historical spatiotemporal traffic data. At the microscopic level, each agent represents an expert in the decision making system, which predicts the travel time for each time interval according to past experiences from a historical dataset. A set of agent interactions are developed to preserve agents that correspond to traffic patterns similar to the real-time measurements and replace invalid agents or agents with negligible weights with new agents. Consequently, the aggregation of each agent's recommendation (predicted travel time with associated weight) provides a macroscopic level of output – predicted travel time distribution. The case study demonstrated that the agent-based model produces less than a 9 percent prediction error for prediction horizons up to one hour into the future. / Ph. D.
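At its core, the recursive probabilistic machinery used throughout the dissertation is a bootstrap particle filter: particles representing candidate traffic states are propagated through a time-update model, weighted by their agreement with the latest measurement, and resampled. The generic sketch below uses a trivial random-walk transition as a placeholder for the Van Aerde-based traffic model and the historical-data propagation described above; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(measurements, n_particles=500,
                    process_std=2.0, meas_std=5.0):
    """Bootstrap particle filter for a scalar state (e.g., segment travel time).

    The random-walk time update below is a placeholder; in the dissertation the
    time update comes from a macroscopic traffic-flow model or historical trends.
    """
    particles = rng.normal(measurements[0], meas_std, n_particles)
    estimates = []
    for z in measurements:
        # Time update: propagate particles through the (placeholder) dynamics
        particles += rng.normal(0.0, process_std, n_particles)
        # Measurement update: weight particles by likelihood of the observation
        weights = np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
        weights /= weights.sum()
        estimates.append(np.sum(weights * particles))
        # Resample to mitigate degeneracy (a partial strategy is used in the thesis)
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
    return np.array(estimates)

if __name__ == "__main__":
    true_tt = 60 + 15 * np.sin(np.linspace(0, 3, 40))            # minutes
    observed = true_tt + rng.normal(0, 5, size=true_tt.shape)
    est = particle_filter(observed)
    print(f"mean absolute error: {np.mean(np.abs(est - true_tt)):.2f} min")
```

The dissertation's contribution lies in what replaces the placeholder steps: a traffic-flow-based time update for short horizons, historical particles and partial resampling for long horizons, and an agent-based ensemble whose weighted recommendations play the role of the weighted particles here.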
