141
Modèle global et paramétrable, pour la gestion des foules d'agents en animation comportementale / Global and configurable model for crowd control in behaviour animation. Soussi, Hakim, 06 December 2011.
Le réalisme d'une application traitant de l'animation comportementale de foules est fondé d'une part sur le rendu graphique des scènes produites par l'application, mais aussi sur le réalisme du comportement lui-même. C'est ce dernier point qui est notre objet d'étude. Le réalisme du comportement d'une foule est avant tout global (réalisme macroscopique) : elle doit avoir des propriétés statistiques (densité, dispersion, vitesse moyenne,…) proches de celles d'une foule réelle. Il est aussi local (réalisme microscopique), c'est-à-dire que les agents ou groupes d'agents doivent idéalement avoir des comportements proches de ceux des humains ou groupes humains pris pour référence, tout au moins dans le domaine d'application restreint considéré. L'objet de cette thèse est de proposer un modèle générique pour effectuer des simulations comportementales de foules, pour pouvoir satisfaire simultanément les deux types de réalisme macroscopique et microscopique et rendre compte de la plupart des types de foules (foules homogènes, groupes, collection d'agents). Pour ce faire, nous avons dégagé quelques principes simples et peu coûteux en ressources. Nous introduirons la notion de contexte (global, localisé, et propagateur). Les contextes globaux et localisés appliqués à un groupe d'agents tendent à leur donner un même comportement et constituent ainsi un moyen de contrôle global de ces acteurs (réalisme macroscopique). Le contexte propagateur donne un moyen de propagation d'informations entre les agents de la foule (communication). Les agents eux-mêmes sont dotés de tendances qui, en les différenciant dans un même contexte, leur donnent une variété de comportements qui concourt au réalisme microscopique. / The realism of crowd behavioral animation rests on one hand on the graphic rendering of the scenes generated by the application, and on the other hand on the realism of the behavior itself. This last point is our object of study. The realism of a crowd's behavior is first of all global (macroscopic realism): its statistical properties (density, dispersal, average speed, ...) must be close to those of a real crowd. It is also local (microscopic realism), i.e. agents or groups of agents should ideally behave like the humans or human groups taken as reference, at least within the restricted application domain considered. The aim of our research is to propose a generic model for crowd behavior simulation that simultaneously satisfies the criteria of macroscopic and microscopic realism and that can account for most types of crowds (homogeneous crowds, groups, collections of agents). To do this, we identified a few simple principles that are inexpensive in terms of resources. We introduce the notion of context (global, localized, and propagator). Global and localized contexts applied to a group of agents tend to give them the same behavior and thus constitute a means of global control of those agents (macroscopic realism). Propagator contexts provide a means of propagating information among the agents of the crowd (communication). The agents themselves are equipped with tendencies that differentiate them within the same context and give them a variety of behaviors, which contributes to microscopic realism.
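For illustration, a minimal sketch of how such a scheme could be realised, assuming a hypothetical global context that imposes a shared goal on a group while per-agent tendencies modulate individual speed; the class names, update rule and parameters are assumptions made for the example, not the model actually developed in the thesis.

```python
# Illustrative sketch only: a shared context drives a group towards a common
# goal (macroscopic control), per-agent tendencies differentiate individuals
# (microscopic variety). Not the thesis's actual model.
import random

class GlobalContext:
    """A context shared by a group of agents, e.g. 'move towards this exit'."""
    def __init__(self, target, base_speed):
        self.target = target          # (x, y) goal imposed on the whole group
        self.base_speed = base_speed  # nominal speed for the group

class Agent:
    def __init__(self, position, tendency):
        self.position = position      # (x, y)
        self.tendency = tendency      # in [0, 1]: individual speed modulation

    def step(self, context, dt=0.1):
        # Direction imposed by the global context...
        dx = context.target[0] - self.position[0]
        dy = context.target[1] - self.position[1]
        norm = max((dx * dx + dy * dy) ** 0.5, 1e-9)
        # ...modulated by the agent's own tendency.
        speed = context.base_speed * (0.5 + self.tendency)
        self.position = (self.position[0] + speed * dt * dx / norm,
                         self.position[1] + speed * dt * dy / norm)

context = GlobalContext(target=(50.0, 50.0), base_speed=1.4)
crowd = [Agent((random.uniform(0, 10), random.uniform(0, 10)),
               tendency=random.random()) for _ in range(100)]
for _ in range(200):
    for agent in crowd:
        agent.step(context)
```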
142
Understanding Structure And Growth Of Physisorbed Films: A Combined Atomic Force Microscopy And Modeling Study. Patil Kalyan, G, 01 1900.
Surface modification has wide-ranging implications in lubrication, microelectromechanical systems (MEMS), colloidal systems and biological membranes. It plays an important role in stabilizing gold nanoparticles, which have applications in targeted drug delivery and catalysis. A variety of surface modification techniques are used for controlling corrosion and wettability, and are also used extensively to understand the nature of interactions between surfaces. This thesis is mainly focused on understanding the kinetics, film growth and surface modification by long-chain molecules physisorbed on a surface.
The time evolution of film growth and domain formation of octadecylamine on a mica surface is studied using ex-situ AFM and reflectance FTIR. A novel interface-creation technique is developed to measure the height of the adsorbed film. The results show three distinct regions in the film growth mechanism. Region I corresponds to a thin film, with the interface height in the monolayer regime. The transient regime (II) consists of a sharp increase in film thickness, from 1.5 nm to 25 nm within a time span of 180 s. In the final stage of film growth (III), the film thickness is invariant with time, during which domain coarsening is observed. Domain evolution reveals a non-monotonic variation of the domain size as a function of adsorption time. A three-stage mechanism is proposed to explain the domain evolution on the surface.
In order to explain the observed film thickness variation, we have developed and tested various models for the thin-to-thick film transition observed in the AFM experiments. A model based on adsorption kinetics is solved to obtain the evolution of the adsorbed film. The model with a two-step adsorption isotherm quantitatively captures the thin-to-thick film transition observed in the AFM experiments. The statistical thermodynamics of adsorption of long-chain molecules on a surface has been studied using a lattice model. The molecules are characterized by a backbone chain lying either parallel or perpendicular to the surface. A square lattice with nearest-neighbour interactions and a mean-field approximation are used to generate the adsorption isotherms for different molecules as a function of chain length. The molecules change their orientation from a surface-parallel to an upright configuration with an increase in chemical potential. A similar transition (with time) in the molecular orientation has been observed in the AFM experiments. The transition between these two orientations is accompanied by an entropy maximum.
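As a point of reference only, a standard mean-field (Fowler-Guggenheim) lattice-gas isotherm of the general kind that nearest-neighbour lattice models with a mean-field approximation produce, written for coverage θ, coordination number z, nearest-neighbour interaction energy ε and chemical potential μ; this generic form is an assumption used for illustration and is not the specific two-orientation isotherm derived in the thesis.

```latex
% Generic mean-field (Fowler--Guggenheim) lattice-gas isotherm, for illustration.
% theta: fractional coverage, z: coordination number, varepsilon: nearest-
% neighbour interaction energy, mu: chemical potential, k_B T: thermal energy.
\begin{equation}
  \frac{\theta}{1-\theta}
    = \exp\!\left(\frac{\mu + z\,\varepsilon\,\theta}{k_{\mathrm{B}}\,T}\right)
\end{equation}
```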
The last part of the thesis is concerned with carbon-carbon interactions. More specifically, we are interested in the interactions between graphite surfaces and their modification in the presence of a lubricant or base oil. Diamond-like carbon (DLC) AFM tips and highly oriented pyrolytic graphite (HOPG) have been used for this study. Experiments were carried out by treating HOPG in hexadecane oil at different temperatures. It is observed that pull-off forces on bare graphite are smaller than on the treated surface. The magnitude of the pull-off forces increases with the temperature of the hexadecane oil bath. The presence of charged patches responsible for the higher adhesion has been confirmed using surface potential microscopy. The results also confirm the presence of a thin liquid-like hexadecane film at room temperature.
143
Evaluation of bus terminals using microscopic traffic simulation. Askerud, Caroline; Wall, Sara, January 2017.
Traffic simulation is a safe and efficient tool for investigating infrastructural changes as well as traffic conditions. This master thesis analyses a microscopic traffic simulation method for evaluating bus terminal capacity. The evaluation is performed through a case study of the bus terminal at Norrköping travel centre. The analysed method, referred to in the thesis as the terminal logic, uses a combination of time-based and event-based simulation. Through this combination it is possible to capture all movements of individual vehicles within the terminal. The simulation model is built in the software Vissim. A new travel centre for Norrköping is under development, motivated among other things by the railway project Ostlänken in eastern Sweden. An evaluation of the bus terminal is of interest because overcapacity is suspected and redesign is an opportunity. To investigate both the terminal capacity and the terminal logic, three scenarios were implemented:
Scenario 1: current design and frequency.
Scenario 2: current design with higher frequency.
Scenario 3: decreased number of bus stops with current frequency.
The results from the scenarios confirm the assumption of overcapacity. Capacity was evaluated using several different measures, all indicating low utilization. Even so, utilization was uneven over time, and congestion could still occur when several buses departed at the same time, which was also visible when observing the simulation. The case study established that the terminal logic is useful for evaluating capacity at bus terminals: it provides a good understanding of how the terminal operates and captures the movements. However, adjusting the logic to the studied terminal was time-consuming, which is a disadvantage when investigating more than one alternative. The thesis led to two main conclusions. First, a more optimised planning of the buses at Norrköping bus terminal would probably be achievable and would lead to less congestion at the exits. Second, the terminal logic is a good method for evaluating bus terminals, but it is not straightforward to implement.
144
Amélioration de l'imprégnabilité aux solutions aqueuses des duramens des résineux : le cas du Douglas (Pseudotsuga menziesii Franco) / Improving the impregnability of softwood heartwood to aqueous solutions: the case of Douglas fir (Pseudotsuga menziesii Franco). Elaieb, Mohamed Tahar, 19 December 2014.
Le Douglas sera la première essence résineuse en France dans les 10 ans à venir, avec des volumes commercialisables de l'ordre de 3 millions de m3/an. Sa valorisation par déroulage se heurte à deux caractéristiques défavorables de son duramen : une humidité à l'état vert proche du point de saturation des fibres (entre 30 et 40 %) et une très mauvaise imprégnabilité à l'eau. Ceci rend ce bois très difficile et très long à chauffer par bouillottage avant déroulage. La matière ligneuse étant un bon isolant thermique, l'eau libre constitue généralement le milieu chauffant privilégié dans l'opération d'étuvage préalable au déroulage. Le temps de chauffe dans le cas du Douglas est doublé, voire triplé, par rapport à d'autres essences plus humides. Cela se traduit par un gaspillage énergétique et une immobilisation accrue des stocks. En vue d'améliorer la cinétique d'imprégnabilité du bois rond de Douglas, nous avons testé un certain nombre de modalités d'imprégnation à deux échelles différentes. À l'échelle de paillasse, des barreaux de 20 mm (R) x 20 mm (T) x 120 mm (L) prélevés dans le duramen ont subi différents essais de trempage en faisant varier la température de l'eau, la durée du trempage et le type de refroidissement. Certaines modalités ont été répétées en plaçant le bain sous ultrasons (fréquence 20 kHz, puissance 400 W), d'autres en ajoutant un tensio-actif dans l'eau ou en procédant à un séchage préalable (thermique, naturel, sous vide) des éprouvettes. Après chaque essai, la reprise d'eau a été quantifiée par double pesée. Nous avons montré un effet de bord répétable sur toutes les modalités, mais aucune de celles-ci ne permet une amélioration significative de la reprise en eau du duramen, sauf le séchage préalable, qui a profondément amélioré l'imprégnabilité. La transposition des traitements à l'échelle industrielle, sur des billons de 50 cm de longueur et 20 cm de diamètre, a montré l'efficacité du séchage préalable sur la capacité du bois à être pénétré par l'eau, mais celle-ci reste insuffisante pour améliorer significativement les conditions de déroulage. Le suivi de l'imprégnabilité par scanner à rayons X a confirmé la persistance de l'hétérogénéité de la répartition d'humidité sur toutes les modalités d'expériences réalisées. Les observations par microscopie confocale à balayage laser (CLSM) ont montré que le processus de séchage à 103 °C a généré des micro-fissures dans les parois cellulaires des ponctuations. Les essais de déroulage réalisés sur les billons issus des différentes modalités étudiées n'ont montré aucune différence de comportement, aussi bien en termes d'efforts de coupe qu'en qualité des placages obtenus. / In the next ten years, Douglas fir will be the main softwood resource harvested in France, with marketable volumes of around 3 million m3/year. Its valorisation by peeling comes up against two unfavourable characteristics of its heartwood: (i) a green moisture content close to the fibre saturation point (30 to 40%), i.e. there is almost no free water in the tracheids, and (ii) it is impossible to impregnate this heartwood with water at atmospheric pressure. Since wood is a good thermal insulator and free water is the main medium for heat transfer into green wood, boiling Douglas fir prior to peeling for veneer production takes a very long time: the heating time is doubled or even tripled compared to wetter species. This results in wasted energy and increased immobilization of stock. In order to improve the impregnation kinetics of Douglas fir heartwood, we tested a number of impregnation treatments at two different scales. At the bench scale, samples of 20 mm (R) x 20 mm (T) x 120 mm (L) taken from the heartwood were subjected to various soaking tests, varying the water temperature, the soaking duration and the type of cooling. Some treatments were repeated with the bath placed under ultrasound (frequency 20 kHz, power 400 W), others with a surfactant added to the water or with prior drying of the samples (thermal, natural, vacuum) before impregnation. After each test, the water uptake was quantified by double weighing. We showed a repeatable edge effect across all treatments, but none of them yields a significant improvement in heartwood water uptake, except prior drying, which profoundly improved impregnability. Transposing the treatments to the industrial scale, on logs 50 cm long and 20 cm in diameter, showed the effectiveness of prior drying on the ability of wood to be penetrated by water, but this remained insufficient to significantly improve the peeling conditions. X-ray scanner observations confirmed the persistence of a heterogeneous moisture distribution across all the experimental treatments. Confocal laser scanning microscopy (CLSM) showed that the drying process at 103 °C generated micro-cracks in the cell walls of the pits. The peeling tests performed on logs from the different treatments showed no difference in behaviour, either in terms of cutting forces or in the quality of the veneers obtained.
145
Thermalization and its Relation to Localization, Conservation Laws and Integrability in Quantum Systems. Ranjan Krishna, M, January 2015.
In this thesis, we have explored the commonalities and connections between different classes of quantum systems that do not thermalize. Specifically, we have (1) shown that localized systems possess conservation laws like integrable systems, which can be constructed in a systematic way and used to detect localization-delocalization transitions, (2) studied the phenomenon of many-body localization in a model with a single-particle mobility edge, (3) shown that interesting finite-size scaling with universal exponents emerges when athermal quantum systems are forced to thermalize through the application of perturbations, and (4) shown that these scaling laws also arise when a perturbation causes a crossover between quantum systems described by different random matrix ensembles. We conclude with a brief summary of each chapter.
In Chapter 2, we have investigated the effects of finite size on the crossover between quantum integrable systems and non-integrable systems. Using exact diagonalization of finite-sized systems, we have studied this crossover by obtaining the energy level statistics and the Drude weight associated with transport. Our results reinforce the idea that, for system size L → ∞, non-integrability sets in for an arbitrarily small integrability-breaking perturbation. The crossover value of the perturbation scales as a power law ∼ L^-3 when the integrable system is gapless, and this scaling appears to be robust to microscopic details and the precise form of the perturbation.
In Chapter 3, we have studied the crossover among different random matrix ensembles [Poissonian, Gaussian Orthogonal Ensemble (GOE), Gaussian Unitary Ensemble (GUE) and Gaussian Symplectic Ensemble (GSE)] realized in different microscopic models. We have found that the perturbation causing the crossover among the different ensembles scales to zero with system size as a power law, with an exponent that depends on the ensembles between which the crossover takes place. This exponent is independent of the microscopic details of the perturbation. We have also found that the crossover from the Poissonian ensemble to the other three is dominated by the Poissonian-to-GOE crossover, which introduces level repulsion, while the crossover from GOE to GUE or GOE to GSE, associated with symmetry breaking, introduces a subdominant contribution. Finally, we have conjectured that the exponent depends on whether the system contains interactions among the elementary degrees of freedom and is independent of the dimensionality of the system.
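As an illustration of the kind of level-statistics diagnostic used in these chapters (not the code used in the thesis), the adjacent-gap ratio of Oganesyan and Huse distinguishes Poissonian from GOE level statistics without any spectral unfolding; its mean is roughly 0.39 for a Poissonian spectrum and roughly 0.53 for the GOE. A minimal sketch, with an arbitrary matrix size:

```python
# Illustrative sketch: mean adjacent-gap ratio <r> for a GOE matrix versus an
# uncorrelated (Poissonian) spectrum. Benchmarks: ~0.53 (GOE), ~0.39 (Poisson).
import numpy as np

def mean_gap_ratio(levels):
    s = np.diff(np.sort(levels))                    # nearest-neighbour spacings
    r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
    return r.mean()

rng = np.random.default_rng(0)
n = 2000

# GOE: symmetrize a real Gaussian matrix and diagonalize.
a = rng.normal(size=(n, n))
goe_levels = np.linalg.eigvalsh((a + a.T) / 2.0)

# Poissonian spectrum: independent, uniformly distributed levels.
poisson_levels = rng.uniform(0.0, 1.0, size=n)

print("GOE      <r> ~", round(mean_gap_ratio(goe_levels), 3))      # ~0.53
print("Poisson  <r> ~", round(mean_gap_ratio(poisson_levels), 3))  # ~0.39
```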
In Chapter 4, we have outlined a procedure to construct conservation laws for Anderson-localized systems. These conservation laws are found as power series in the hopping parameters. We have also obtained the conservation laws for the disorder-free Aubry-André model, where the states are either localized or extended depending on the strength of a coupling constant. We have formulated a specific procedure for averaging over disorder in order to examine the convergence of the power series. Using this procedure for the Aubry-André model, we show that the integrals of motion given by our construction are well defined in the localized phase but not in the extended phase. Finally, we also obtain the integrals of motion for a model with interactions, to lowest order in the interaction.
In Chapter 5, we have studied many-body localization and investigated its nature in the presence of a single-particle mobility edge. Employing exact diagonalization of finite-sized systems, we have calculated the level spacing distribution, the time evolution of the entanglement entropy, the optical conductivity and the return probability to characterize the nature of the localization. The localization that develops in the presence of interactions in these systems appears to differ from regular many-body localization (MBL), in that the growth of entanglement entropy with time is linear (as in a thermal phase) instead of logarithmic, but saturates to a value much smaller than the thermal value (as for MBL). All other diagnostics seem consistent with regular MBL.
146
Detecção de ovos de S. mansoni a partir da detecção de seus contornos / Schistosoma mansoni egg detection from contour detection. Edwin Delgado Huaynalaya, 25 April 2012.
Schistosoma mansoni é o parasita causador da esquistossomose mansônica que, de acordo com o Ministério da Saúde do Brasil, afeta atualmente vários milhões de pessoas no país. Uma das formas de diagnóstico da esquistossomose é a detecção de ovos do parasita através da análise de lâminas microscópicas com material fecal. Esta tarefa é extremamente cansativa, principalmente nos casos de baixa endemicidade, pois a quantidade de ovos é muito pequena. Nesses casos, uma abordagem computacional para auxílio na detecção de ovos facilitaria o trabalho de diagnóstico. Os ovos têm formato ovalado, possuem uma membrana translúcida, apresentam uma espícula e sua cor é ligeiramente amarelada. Porém nem todas essas características são observadas em todos os ovos e algumas delas são visíveis apenas com uma ampliação adequada. Além disso, o aspecto visual do material fecal varia muito de indivíduo para indivíduo em termos de cor e presença de diversos artefatos (tais como partículas que não são desintegradas pelo sistema digestivo), tornando difícil a tarefa de detecção dos ovos. Neste trabalho investigamos, em particular, o problema de detecção das linhas que contornam a borda de vários dos ovos. Propomos um método composto por duas fases. A primeira fase consiste na detecção de estruturas do tipo linha usando operadores morfológicos. A detecção de linhas é dividida em três etapas principais: (i) realce de linhas, (ii) detecção de linhas, e (iii) refinamento do resultado para eliminar segmentos de linhas que não são de interesse. O resultado dessa fase é um conjunto de segmentos de linhas. A segunda fase consiste na detecção de subconjuntos de segmentos de linha dispostos em formato elíptico, usando um algoritmo baseado na transformada Hough. As elipses detectadas são fortes candidatas a contorno de ovos de S. mansoni. Resultados experimentais mostram que a abordagem proposta pode ser útil para compor um sistema de auxílio à detecção dos ovos. / Schistosoma mansoni is one of the parasites that cause schistosomiasis. According to the Brazilian Ministry of Health, several million people in the country are currently affected by schistosomiasis. One way of diagnosing it is by identifying eggs in microscopic slides of stool samples. This task is extremely time-consuming and tiring, especially in cases of low endemicity, when only a few eggs are present. In such cases, a computational approach to help detect the eggs would greatly facilitate the diagnostic task. Schistosome eggs have an oval shape, a translucent membrane and a spike, and their color is slightly yellowish. However, not all of these features are observed in every egg, and some of them are visible only at an adequate microscopic magnification. Furthermore, the visual aspect of the fecal material varies widely from person to person in terms of color and the presence of different artifacts (such as particles that are not disintegrated by the digestive system), making it difficult to detect the eggs. In this work we investigate the problem of detecting the lines that delimit the contour of the eggs. We propose a method comprising two phases. The first phase consists in detecting line-like structures using morphological operators and is divided into three steps: (i) line enhancement, (ii) line detection, and (iii) refinement of the result to eliminate line segments that are not of interest. The output of this phase is a set of line segments. The second phase consists in detecting subsets of line segments arranged in an elliptical shape, using an algorithm based on the Hough transform. The detected ellipses are strong candidates for the contours of S. mansoni eggs. Experimental results show that the proposed approach has the potential to be used effectively as a component of a computer system to aid egg detection.
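A minimal sketch of this two-phase pipeline, assuming scikit-image; the file name, structuring-element size and Hough parameters are placeholders chosen for the example, not the values used in the thesis.

```python
# Illustrative sketch: morphological line detection followed by Hough-based
# ellipse detection. Parameters and the input file are assumptions.
import numpy as np
from skimage import io, color, morphology, filters, transform

image = color.rgb2gray(io.imread("stool_slide.png"))  # hypothetical input image

# Phase 1a: line enhancement with a black top-hat (dark contour lines on a
# brighter background become bright).
enhanced = morphology.black_tophat(image, morphology.disk(5))

# Phase 1b: line detection by thresholding the enhanced image.
lines = enhanced > filters.threshold_otsu(enhanced)

# Phase 1c: refinement, discarding small spurious segments.
lines = morphology.remove_small_objects(lines, min_size=50)
lines = morphology.skeletonize(lines)

# Phase 2: Hough transform for ellipses; detected ellipses are candidate egg
# contours. (Slow on full images; best applied to regions of interest.)
ellipses = transform.hough_ellipse(lines, accuracy=20, threshold=100,
                                   min_size=30, max_size=120)
ellipses = np.sort(ellipses, order="accumulator")[::-1]  # best candidates first
for e in ellipses[:5]:
    print("centre=(%.0f, %.0f)  axes=(%.0f, %.0f)" % (e["xc"], e["yc"], e["a"], e["b"]))
```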
147
Apprentissage par noyaux multiples : application à la classification automatique des images biomédicales microscopiques / Multiple kernel learning: contribution to the automatic classification of microscopic medical images. Zribi, Abir, 17 March 2016.
Cette thèse s'inscrit dans le contexte de diagnostic assisté par ordinateur pour la localisation subcellulaire des protéines dans les images microscopiques. L'objectif est la conception et le développement d'un système de classification automatique permettant d'identifier le compartiment cellulaire dans lequel une protéine d'intérêt exerce son activité biologique. Afin de surmonter les difficultés rencontrées pour discerner les compartiments cellulaires présents dans les images microscopiques, les systèmes décrits dans la littérature proposent d'extraire plusieurs descripteurs associés à une combinaison de classifieurs. Dans cette thèse, nous proposons un schéma de classification différent répondant mieux aux besoins de généricité et de flexibilité pour traiter différentes bases d'images. Dans le but de fournir une caractérisation riche des images microscopiques, nous proposons un nouveau système de représentation permettant d'englober de multiples descripteurs visuels identifiés dans les différentes approches d'extraction de caractéristiques : locale, fréquentielle, globale et par région. Nous formulons ensuite le problème de fusion et de sélection des caractéristiques sous forme d'un problème de sélection de noyaux. Basé sur l'apprentissage de noyaux multiples (MKL), les tâches de sélection et de fusion de caractéristiques sont considérées simultanément. Les expériences effectuées montrent que la plateforme de classification proposée est à la fois plus simple, plus générique et souvent plus performante que les autres approches de la littérature. Dans le but d'approfondir notre étude sur l'apprentissage de noyaux multiples, nous définissons un nouveau formalisme d'apprentissage MKL réalisé en deux étapes. Cette contribution consiste à proposer trois termes régularisants liés à la résolution du problème d'apprentissage des poids associés à une combinaison linéaire de noyaux, problème reformulé en un problème de classification à vaste marge dans l'espace des couples. Le premier terme régularisant proposé assure une sélection parcimonieuse des noyaux. Les deux autres termes ont été conçus afin de tenir compte de la similarité entre les noyaux via une métrique basée sur la corrélation. Les différentes expérimentations réalisées montrent que le formalisme proposé permet d'obtenir des résultats de même ordre que les méthodes de référence, mais offrant l'avantage d'utiliser moins de fonctions noyaux. / This thesis arises in the context of computer-aided analysis for subcellular protein localization in microscopic images. The aim is to establish an automatic classification system able to identify the cellular compartment in which a protein of interest exerts its biological activity. In order to overcome the difficulty of discerning the cellular compartments in microscopic images, the existing state-of-the-art systems use several descriptors to train an ensemble of classifiers. In this thesis, we propose a different classification scheme which better copes with the requirements of genericity and flexibility needed to treat various image datasets. Aiming to provide an efficient characterization of microscopic images, a new feature system combining local, frequency-domain, global and region-based features is proposed. Then, we formulate the problem of heterogeneous feature fusion as a kernel selection problem. Using multiple kernel learning (MKL), the problems of optimal feature-set selection and classifier training are resolved simultaneously. The proposed combination scheme leads to a simple and generic framework capable of providing high performance for microscopy image classification. Extensive experiments were carried out using widely used and well-known datasets. Compared with state-of-the-art systems, our framework is more generic and outperforms other classification systems. To further expand our study of multiple kernel learning, we introduce a new formalism for learning with multiple kernels performed in two steps. This contribution consists in proposing three regularization terms for the problem of learning the weights of a linear combination of kernels, which is reformulated as a large-margin classification problem on the space of pairs of data. The first term ensures that kernel selection leads to a sparse representation, while the second and third terms introduce a notion of kernel similarity based on a correlation measure. Experiments on various biomedical image datasets show a promising performance of our method compared to state-of-the-art methods, with the advantage of using fewer kernel functions.
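A simplified sketch of classifying with a weighted combination of kernels, assuming scikit-learn; in genuine MKL the weights are learned jointly with the classifier, whereas here, to keep the example short, they are set by a centred kernel-alignment heuristic. The feature matrices, parameters and weighting rule are illustrative assumptions, not the formalism proposed in the thesis.

```python
# Simplified sketch: classification with a weighted combination of kernels.
# True MKL optimizes the weights jointly with the classifier; this example uses
# a centred kernel-alignment heuristic instead, on synthetic placeholder data.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel, chi2_kernel, linear_kernel
from sklearn.svm import SVC

def alignment(K, y_pm):
    """Centred alignment between a kernel K and the ideal kernel y y^T."""
    Y = np.outer(y_pm, y_pm)
    Kc = K - K.mean(axis=0) - K.mean(axis=1)[:, None] + K.mean()
    return (Kc * Y).sum() / (np.linalg.norm(Kc) * np.linalg.norm(Y))

rng = np.random.default_rng(0)
y = np.repeat([0, 1], 50)
X_local = rng.random((100, 64)) + 0.3 * y[:, None]    # weakly informative features
X_freq = rng.random((100, 32))                        # uninformative features
X_region = rng.random((100, 16)) + 0.1 * y[:, None]   # weakly informative features

kernels = [rbf_kernel(X_local), chi2_kernel(X_freq), linear_kernel(X_region)]
weights = np.array([max(alignment(K, 2.0 * y - 1.0), 0.0) for K in kernels])
weights /= weights.sum()                              # normalised kernel weights

K_combined = sum(w * K for w, K in zip(weights, kernels))
clf = SVC(kernel="precomputed").fit(K_combined, y)
print("kernel weights:", np.round(weights, 3))
print("training accuracy:", clf.score(K_combined, y))
```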
148
Contribution aux équations aux dérivées partielles non linéaires et non locales et application au trafic routier / Contribution to nonlinear and nonlocal partial differential equations and application to traffic flow. Salazar, Wilfredo, 07 October 2016.
Cette thèse porte sur la modélisation, l'analyse et l'analyse numérique des équations aux dérivées partielles non-linéaires et non-locales avec des applications au trafic routier. Le trafic routier peut être modélisé à différentes échelles. En particulier, on peut considérer l'échelle microscopique, qui décrit la dynamique de chaque véhicule individuellement, et l'échelle macroscopique, qui voit le trafic comme un fluide et le décrit en utilisant des quantités macroscopiques comme la densité des véhicules et la vitesse moyenne. Dans cette thèse, en utilisant la théorie des solutions de viscosité, on fait le passage entre les modèles microscopiques et les modèles macroscopiques. L'intérêt de ce passage est que les modèles microscopiques sont plus intuitifs et faciles à manipuler pour simuler des situations particulières (bifurcations, feux tricolores,…) mais ils ne sont pas adaptés à des grosses simulations (pour simuler le trafic dans toute une ville par exemple). Au contraire, les modèles macroscopiques sont moins évidents à modifier (pour simuler une situation particulière) mais ils peuvent être utilisés pour des simulations à grande échelle. L'idée est donc de trouver le modèle macroscopique équivalent à un modèle microscopique qui décrit un scénario précis (une jonction, une bifurcation, des différents types de conducteurs, une zone scolaire,…). La première partie de cette thèse contient un résultat d'homogénéisation et d'homogénéisation numérique pour un modèle microscopique avec différents types de conducteurs. Dans une seconde partie, on obtient des résultats d'homogénéisation et d'homogénéisation numérique pour des modèles microscopiques contenant une perturbation locale (ralentisseur, zone scolaire,…). Finalement, on présente un résultat d'homogénéisation dans le cadre d'une bifurcation. / This work deals with the modelling, analysis and numerical analysis of non-linear and non-local partial differential equations and their application to traffic flow. Traffic can be simulated at different scales: mainly, the microscopic scale, which describes the dynamics of each vehicle individually, and the macroscopic scale, which describes traffic as a fluid using macroscopic quantities such as the density of vehicles and the average speed. In this PhD thesis, using the theory of viscosity solutions, we derive macroscopic models from microscopic models. The interest of these results is that microscopic models are very intuitive and easy to manipulate to describe a particular situation (a bifurcation, a traffic light, ...), but they are not suited to large simulations (to simulate the traffic of an entire city, for example). Conversely, macroscopic models are less easy to modify (to simulate a particular situation) but they can be used for large-scale simulations. The idea is therefore to find the macroscopic model equivalent to a microscopic model describing a particular scenario (a junction, a bifurcation, different types of drivers, a school zone, ...). The first part of this work contains a homogenization result and a numerical homogenization result for a microscopic model with different types of drivers. The second part contains homogenization and numerical homogenization results for microscopic models with a local perturbation (a speed bump, a school zone, ...). Finally, we present a homogenization result in the setting of a bifurcation.
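For orientation only, a standard illustration of the micro-to-macro correspondence for first-order models: a generic follow-the-leader rule and the LWR-type conservation law to which such models formally homogenise. The precise rescaling and the Hamilton-Jacobi viscosity-solution framework used in the thesis are not reproduced here, so this generic form should be read as an illustrative assumption.

```latex
% Generic first-order follow-the-leader model and its formal LWR-type limit.
\begin{align}
  \dot{x}_j(t) &= V\bigl(x_{j+1}(t) - x_j(t)\bigr)
    && \text{microscopic: each vehicle's speed is set by the gap to its leader} \\
  \partial_t \rho + \partial_x \bigl(\rho\, v(\rho)\bigr) &= 0,
  \quad v(\rho) = V\!\left(\tfrac{1}{\rho}\right)
    && \text{macroscopic: LWR conservation law for the density } \rho
\end{align}
```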
149
Revize vybraných spodnokarbonských členovců z Moravského krasu / The Revision of selected Lower Carboniferous Arthropods from the Moravian Karst. Rak, Štěpán, January 2020.
In this dissertation thesis, I revise two selected groups of Lower Carboniferous arthropods from the Moravian Karst: trilobites and thylacocephalans. I studied their systematics, autecology and taxonomy. In the first part, I mapped the occurrence of an atheloptic trilobite association in the vicinity of the village of Březina; based on this study, a new trilobite association is erected. The second part is focused on the first discovery of a Lower Carboniferous thylacocephalan outside the US, and its specific carapace structure is studied. The new species Paraconcavicaris viktoryni (Broda et al., 2020) is determined and compared to known species.
150
Simulating Autonomous Vehicles in a Microscopic Traffic Simulator to Investigate the Effects of Autonomous Vehicles on Roadway Mobility. Lackey, Nathan, 27 August 2019.
No description available.