21

Approximations, simulation, and accuracy of multivariate discrete probability distributions in decision analysis

Montiel Cendejas, Luis Vicente 17 July 2012
Many important decisions must be made without full information. For example, a woman may need to make a treatment decision regarding breast cancer without full knowledge of important uncertainties, such as how well she might respond to treatment. In the financial domain, in the wake of the housing crisis, the government may need to monitor the credit market and decide whether to intervene. A key input in this case would be a model describing the chance that one person (or company) will default given that others have defaulted. However, such a model requires addressing the lack of knowledge regarding the correlation between groups or individuals. How to model and make decisions when only partial information is available is a significant challenge. In the past, researchers have made arbitrary assumptions regarding the missing information. In this research, we developed a modeling procedure that can be used to analyze many possible scenarios subject to strict conditions. Specifically, we developed a new Monte Carlo simulation procedure to create a collection of joint probability distributions, all of which match whatever information we have. Using this collection of distributions, we analyzed the accuracy of different approximations, such as maximum entropy or copula models. In addition, we proposed several new approximations that outperform previous methods. The objective of this research is four-fold. First, provide a new framework for approximation models. In particular, we presented four new models to approximate joint probability distributions based on geometric attributes and compared their performance to existing methods. Second, develop a new joint distribution simulation procedure (JDSIM) to sample joint distributions from the set of all possible distributions that match the available information. This procedure can then be applied to different scenarios to analyze the sensitivity of a decision or to test the accuracy of an approximation method. Third, test the accuracy of seven approximation methods under a variety of circumstances. Specifically, we addressed the following questions within the context of multivariate discrete distributions: Are there new approximations that should be considered? Which approximation is the most accurate, according to different measures? How accurate are the approximations as the number of random variables increases? How accurate are they as we change the underlying dependence structure? How does accuracy improve as we add lower-order assessments? What are the implications of these findings for decision analysis practice and research? While the above questions are easy to pose, they are challenging to answer. For decision analysis, the answers open a new avenue to address partial information, which brings us to the last contribution. Fourth, propose a new approach to decision making with partial information. The exploration of old and new approximations and the capability of creating large collections of joint distributions that match expert assessments provide new tools that extend the field of decision analysis. In particular, we presented two sample cases that illustrate the scope of this work and its impact on decision making under uncertainty.
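To make the idea of sampling distributions consistent with partial assessments concrete, here is a minimal sketch (not the thesis' JDSIM procedure): it rejection-samples 2×2 joint distributions whose one-way marginals match two assessed probabilities, then compares the collection against the maximum-entropy approximation, which reduces to independence when only marginals are known. The marginal values and tolerance are illustrative assumptions.

```python
# Hedged sketch (not the thesis' JDSIM): sample joint PMFs over {0,1}^2 that
# match assessed marginals, then measure their divergence from the
# maximum-entropy approximation. p_x, p_y and tol are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
p_x, p_y, tol = 0.3, 0.6, 0.02        # assessed P(X=1), P(Y=1); tolerance

qs = rng.dirichlet(np.ones(4), size=200_000).reshape(-1, 2, 2)
keep = (np.abs(qs[:, 1, :].sum(axis=1) - p_x) < tol) & \
       (np.abs(qs[:, :, 1].sum(axis=1) - p_y) < tol)
samples = qs[keep]                    # collection of joints matching the assessments

max_ent = np.outer([1 - p_x, p_x], [1 - p_y, p_y])   # independent joint
kl = np.array([(q * np.log(q / max_ent)).sum() for q in samples if (q > 0).all()])
print(f"{len(samples)} joints kept; mean KL from max-entropy = {kl.mean():.4f}")
```

The spread of the KL values over the collection is exactly the kind of accuracy question the abstract poses: how far can a true joint, consistent with the assessments, sit from the approximation a decision analyst would use?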
22

Probabilistic Fault Management in Networked Systems

Steinert, Rebecca January 2014
Technical advances in network communication systems (e.g. radio access networks) combined with evolving concepts based on virtualization (e.g. clouds) require new management algorithms in order to handle the increasing complexity of network behavior and the variability of the network environment. Current network management operations are primarily centralized and deterministic, and are carried out via automated scripts and manual interventions, which work for mid-sized and fairly static networks. The next generation of communication networks and systems will be of significantly larger size and complexity, and will require scalable and autonomous management algorithms in order to meet operational requirements on reliability, failure resilience, and resource efficiency. A promising approach to addressing these challenges is the development of probabilistic management algorithms, following three main design goals. The first goal relates to all aspects of scalability, ranging from efficient usage of network resources to computational efficiency. The second goal relates to adaptability in keeping the models up to date so that they accurately reflect the network state. The third goal relates to reliability of the algorithm performance, in the sense of improved performance predictability and simplified algorithm control. This thesis is about probabilistic approaches to fault management that follow the concepts of probabilistic network management (PNM). An overview of existing network management algorithms and methods in relation to PNM is provided. The concepts of PNM and the implications of employing PNM algorithms are presented and discussed. Moreover, some of the practical differences between using a probabilistic fault detection algorithm and a deterministic method are investigated. Further, six probabilistic fault management algorithms that implement different aspects of PNM are presented. The algorithms are highly decentralized, adaptive and autonomous, and cover several problem areas, such as probabilistic fault detection with controllable detection performance; distributed and decentralized change detection in modeled link metrics; root-cause analysis in virtual overlays; event correlation and pattern mining in data logs; and probabilistic failure diagnosis. The probabilistic models (largely based on Bayesian parameter estimation) are memory-efficient and can be used and re-used for multiple purposes, such as performance monitoring, detection, and self-adjustment of the algorithm behavior.
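As a flavor of Bayesian fault detection with controllable detection performance, here is a minimal sketch rather than any of the thesis' six algorithms: a Beta-Bernoulli estimate of a link's packet-loss rate, raising an alarm when the posterior probability that the rate exceeds a tolerated level passes a confidence threshold. The threshold values and the synthetic probe stream are illustrative assumptions.

```python
# Hedged sketch of probabilistic fault detection (not the thesis algorithms):
# a conjugate Beta posterior over a link's loss rate drives the alarm, so the
# operator controls detection behavior via two interpretable knobs.
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(1)
tolerated_loss, confidence = 0.05, 0.99       # operator-set knobs (assumed)
a, b = 1.0, 1.0                               # uniform Beta prior

true_loss = np.concatenate([np.full(300, 0.01), np.full(300, 0.12)])  # fault at t=300
for t, p in enumerate(true_loss):
    lost = rng.random() < p                   # one probe outcome
    a, b = a + lost, b + (1 - lost)           # conjugate posterior update
    if beta.sf(tolerated_loss, a, b) > confidence:   # P(rate > tolerated | data)
        print(f"alarm raised at probe {t}")
        break
```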
23

Modeling the rise toward a critical state of the low-flow situation under natural and anthropogenic pressures in the Mediterranean region

Canovas, Ingrid 12 December 2016
Water and life are unequally distributed. This hinders the development of societies and creates situations of tension whose intensity and frequency vary with climatic conditions, population density and, therefore, demand. With ongoing climate change and rising water requirements (including the share reserved for the natural environment), the level of tension is likely to rise, as is the frequency of very low flows. The French Mediterranean territories are particularly exposed to the fragility of water resources during summer (a season with little rainfall and strong tourist appeal). In addition, any modification of the climate regime (decreasing rainfall inputs, more marked extremes, etc.) will disproportionately affect small catchments fed by very local resources, i.e. rivers and living areas that do not benefit from the influence of the major rivers alien to the Mediterranean climate zone (Rhône, Durance, etc.). The reflection on water scarcity can therefore focus on how such a risk builds up (the sequence of steps, the impact of one factor on the others, etc.) and on the conditions under which the system approaches an undesirable situation. The approach adopted here applies this perspective to low flows. Three objectives are stated: (1) identify and compile, within a dedicated database, the information essential to understanding the phenomenon; (2) characterize the mean state of low flows from local situations using robust statistical descriptors; (3) determine how that state evolves toward critical states. The work first leads to a probabilistic model designed to be deployed at different spatial scales and over different time frames. The reflection is then extended to the methodological and technical arrangements for building analytical indicators and synthetic indexes that assess the level of tension on the resource. The approach finally results in simple, intuitive digital tools that can be mobilized for operational purposes, such as decision support for anticipatory and regulatory measures, and that are transferable to any Mediterranean territory.
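For a flavor of what an analytical low-flow indicator can look like, here is a minimal sketch, far simpler than the thesis' indicators and synthetic indexes: an empirical estimate of the probability that daily summer discharge falls below an environmental-flow threshold, flagged as critical past an assumed level. The discharge series and both thresholds are synthetic assumptions.

```python
# Hedged sketch of a low-flow tension indicator (not the thesis' indexes):
# empirical P(Q < environmental flow) over a synthetic summer season.
import numpy as np

rng = np.random.default_rng(2)
discharge = rng.lognormal(mean=1.0, sigma=0.6, size=92)   # synthetic daily m^3/s
env_flow, critical_level = 1.5, 0.30                      # assumed thresholds

p_low = np.mean(discharge < env_flow)     # fraction of days below environmental flow
print(f"low-flow indicator = {p_low:.2f}",
      "-> critical" if p_low > critical_level else "-> acceptable")
```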
24

2.5-D robotic mapping with an occupancy-elevation grid representation

Souza, Anderson Abner de Santana 03 August 2012
This work introduces a new method for mapping environments with three-dimensional information extracted from visual data, for accurate robot navigation. Many 3D mapping approaches based on occupancy grids require high computational effort to both build and store the map. We introduce 2.5-D occupancy-elevation grid mapping, a discrete approach in which each cell stores an occupancy probability, the height of the terrain at that place in the environment, and the variance of this height. This 2.5-dimensional representation lets a mobile robot know whether a place in the environment is occupied by an obstacle and how tall that obstacle is, so it can decide whether the obstacle can be traversed. The sensory information needed to construct the map is provided by a stereo vision system, modeled with a robust probabilistic approach that accounts for the noise present in stereo processing. The resulting maps support tasks such as decision making in autonomous navigation, exploration, localization and path planning. Experiments carried out with a real mobile robot demonstrate that the proposed approach yields useful maps for autonomous robot navigation.
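A minimal sketch of what such a cell can look like, under assumptions not taken from the thesis (a fixed log-odds inverse sensor model and inverse-variance height fusion): each cell carries exactly the three quantities named above, an occupancy probability, a height estimate and its variance.

```python
# Hedged sketch of a 2.5-D occupancy-elevation grid cell: log-odds occupancy
# plus a height estimate fused across noisy stereo readings with a 1-D
# Kalman-style update. Sensor noise values are illustrative assumptions.
import math

class Cell:
    def __init__(self):
        self.log_odds = 0.0          # occupancy in log-odds form (p = 0.5)
        self.height = 0.0            # height estimate (m)
        self.height_var = 1e6        # large variance = uninformative prior

    def update(self, hit, z, z_var):
        # occupancy: add the measurement's log-odds (assumed inverse sensor model)
        self.log_odds += math.log(0.7 / 0.3) if hit else math.log(0.3 / 0.7)
        if hit:
            # height: precision-weighted fusion of old estimate and reading z
            k = self.height_var / (self.height_var + z_var)
            self.height += k * (z - self.height)
            self.height_var *= (1 - k)

    def occupancy(self):
        return 1.0 - 1.0 / (1.0 + math.exp(self.log_odds))

c = Cell()
for z in (0.31, 0.28, 0.30):         # three stereo height readings of a kerb
    c.update(hit=True, z=z, z_var=0.02**2)
print(f"p(occ)={c.occupancy():.2f}  h={c.height:.3f} m  var={c.height_var:.2e}")
```

The variance term is what lets a traversability check distinguish a confidently-measured low kerb from a poorly-observed obstacle of the same nominal height.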
25

Virtual tests for the furniture industry

Makhlouf, Heba 14 December 2015
This work is part of a collaboration between the FCBA Furnishing Technology Institute and the MSME Laboratory of UPEM. The objective was to develop a simulation tool allowing FCBA to carry out a validation study (resistance to standardized tests) before a piece of furniture is manufactured. The work was supported by the collective funds of the furnishing profession. It led to developments in the identification of the anisotropic behavior of wood by image analysis coupled with the finite element method, in a multi-scale approach to identifying the behavior of the connections between furniture components, and in a finite element program using a beam formulation to carry out a statistical study of the furniture's behavior that accounts for the scatter in the wood's properties. Each step was validated experimentally. The finite element simulation focused on a solid-wood bunk bed application, for which a beam-based code was developed in the Matlab environment in order to implement: an anisotropic elastic theory using Timoshenko beams, to take into account the deformations due to the low transverse stiffness of wood compared with its longitudinal stiffness; point connection elements representing the contribution of hardware components (screws, embedded nuts, dowels, etc.) and the local 3D effects at the joints between beams; and the ability to account for the uncertainty in material parameters from one beam to the next, depending on board orientation, wood density, etc., via Monte Carlo simulation.
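To illustrate the Timoshenko beam formulation and the Monte Carlo treatment of material scatter, here is a minimal sketch (not the FCBA/MSME code): the tip deflection of a cantilever, where the shear term P·L/(κGA) captures the low transverse stiffness relative to bending, with the longitudinal modulus E sampled per draw. The section, load, E/G ratio and scatter values are all illustrative assumptions.

```python
# Hedged sketch: Timoshenko cantilever tip deflection = bending + shear terms,
# Monte Carlo over the scattered wood modulus E. Numbers are assumed.
import numpy as np

rng = np.random.default_rng(3)
P, L = 100.0, 1.0                      # end load (N), beam length (m)
b = h = 0.04                           # square section (m)
A, I, kappa = b*h, b*h**3/12, 5.0/6.0  # area, second moment, shear factor

E = rng.normal(11e9, 1.5e9, size=10_000)   # longitudinal modulus (Pa), scattered
G = E / 16.0                                # assumed low shear stiffness of wood

defl = P*L**3 / (3*E*I) + P*L / (kappa*G*A)  # Euler-Bernoulli + shear contribution
print(f"tip deflection: mean={defl.mean()*1e3:.2f} mm, "
      f"95th pct={np.percentile(defl, 95)*1e3:.2f} mm")
```

Running the same loop over a full furniture model is what turns a single pass/fail simulation into a statistical statement about the fraction of produced pieces that would survive the standardized test.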
26

Characterizing Internet Worm Spatial-Temporal Infection Structures

Wang, Qian 15 October 2010
Since the Morris worm was released in 1988, Internet worms have remained one of the top security threats. For example, the Conficker worm infected 9 to 15 million machines in early 2009 and shut down the services of some critical government and medical networks; moreover, it constructed a massive peer-to-peer (P2P) botnet. Botnets are zombie networks that attackers control to launch coordinated attacks, and in recent years they have become the number one threat to the Internet. The objective of this research is to characterize the spatial-temporal infection structures of Internet worms, and to apply the observations to study P2P-based botnets formed by worm infection. First, we infer temporal characteristics of the Internet worm infection structure, i.e., the host infection time and the worm infection sequence, and thus pinpoint patient zero or the initially infected hosts. Specifically, we apply statistical estimation techniques to Darknet observations. We show analytically and empirically that our proposed estimators can significantly improve the inference accuracy. Second, we reveal two key spatial characteristics of the Internet worm infection structure, i.e., the number of children and the generation of the underlying tree topology formed by worm infection. Specifically, we apply probabilistic modeling methods and a sequential growth model. We show analytically and empirically that the number of children has asymptotically a geometric distribution with parameter 0.5, and that the generation closely follows a Poisson distribution. Finally, we evaluate bot detection strategies and the effects of user defenses in P2P-based botnets formed by worm infection. Specifically, we apply the observations on the number of children and demonstrate analytically and empirically that targeted detection focusing on the nodes with the largest number of children is an efficient way to expose bots. However, we also point out that future botnets may self-stop scanning to weaken targeted detection, without greatly slowing down the speed of worm infection. We then extend the worm spatial infection structure and show empirically that user defenses, e.g., patching or cleaning, can significantly mitigate the robustness and effectiveness of P2P-based botnets. As a counterattack, we evaluate a simple measure by which future botnets could enhance topology robustness through worm re-infection.
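The claimed children distribution is easy to reproduce with a minimal sketch of a sequential growth model: assuming each new victim's infector is uniform over the already-infected hosts (as under uniform random scanning), the infection tree is a random recursive tree, whose out-degree distribution converges to the geometric law with parameter 0.5 stated above. The population size is an illustrative assumption.

```python
# Hedged sketch: grow a random recursive infection tree and compare the
# empirical children distribution with the geometric(0.5) law, P(k) = 0.5**(k+1).
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
children = np.zeros(n, dtype=int)
for v in range(1, n):                  # host v is infected by a uniformly
    children[rng.integers(v)] += 1     # random earlier-infected host

for k in range(5):
    print(f"P(children={k}): observed {np.mean(children == k):.4f}  "
          f"vs geometric {0.5**(k+1):.4f}")
```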
27

A simulation method suited to complex systems: self-adaptive and self-learning proof of concept applied to coupled heat transfer

Spiesser, Christophe 20 June 2017
As computing power increases, engineers and designers tackle increasingly complex problems using simulation (multiphysics, multiscale, intricate geometries, etc.). In this context, discretization-based quadratures (FDM, FEM, FVM) show their limits: the need for a great number of sub-domains induces prohibitive consumption of RAM and CPU power. The Monte Carlo method appears more appropriate, but the difficulty of building probabilistic models of complex systems forms a bottleneck. A systemic approach is proposed to alleviate it and is implemented as a proof of concept dedicated to coupled heat transfer simulation. After a successful validation step against analytical solutions, the tool is applied to illustrative cases (heat transfer in buildings and in solar heating systems) in order to study its simulation capabilities. This approach exhibits a behavior that is particularly advantageous for complex systems simulation: the computation time depends only on the influential parts of the problem. These parts are identified automatically, even in intricate or extensive geometries, which makes the simulation self-adaptive. In addition, the computational performance is uncorrelated with the scale ratio characterizing the simulated system. Consequently, the approach shows an exceptional capacity to tackle problems that are both multiphysics and multiscale. Each temperature is estimated using exploration paths. By statistically analyzing these paths during the process, the tool can generate a reduced predictive model of this physical quantity, bringing a self-learning capacity to the simulation. Such a model can improve optimization and process control, or simplify inverse measurements. Furthermore, an uncertainty propagation analysis based on this model was performed, quantifying the effect of uncertainties in the boundary conditions on the temperature. Finally, a particle swarm optimization (PSO) process based on the generated reduced models was successfully carried out.
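For intuition about path-based temperature estimation, here is a minimal sketch of the classic estimator the proof of concept generalizes (not the thesis code): for steady-state conduction governed by the Laplace equation on a square, the temperature at a probe point equals the expected boundary temperature where a symmetric random walk started there first exits. Geometry, grid size and boundary temperatures are illustrative assumptions.

```python
# Hedged sketch: walk-on-grid Monte Carlo for the Laplace equation. Each path
# contributes the boundary temperature at its exit point; the average over
# paths estimates the temperature at the start point.
import random

def boundary_temp(x, y, n):
    return 100.0 if y == n else 20.0        # hot top edge, cool elsewhere (assumed)

def temperature(x, y, n=20, paths=5_000):
    total = 0.0
    for _ in range(paths):
        px, py = x, y
        while 0 < px < n and 0 < py < n:    # walk until the boundary is hit
            dx, dy = random.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
            px, py = px + dx, py + dy
        total += boundary_temp(px, py, n)
    return total / paths

print(f"T(center) ~ {temperature(10, 10):.1f} C")   # by symmetry, exact answer is 40 C
```

Note the self-adaptive property the abstract describes: the walks only ever visit the region that influences the probe point, so inert parts of a large geometry cost nothing.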
28

vehicleLang: a probabilistic modeling and simulation language for vehicular cyber attacks

Katsikeas, Sotirios January 2018
The technological advances made in the automotive industry and in the field of communication technologies over recent years have transformed vehicles into complex machines that include not only electrical and mechanical components but also a great number of electronic components. Furthermore, modern vehicles are now connected to the Wide Area Network (WAN), and in the near future communication will also take place between cars (Vehicle-to-Vehicle, V2V) and between cars and infrastructure (Vehicle-to-Infrastructure, V2I), something referred to in the literature as the Internet of Vehicles (IoV). The main motivations behind all the aforementioned changes in modern vehicles are, of course, improved road safety, greater passenger convenience, increased efficiency and higher user friendliness. On the other hand, connecting vehicles to the Internet opens them up to a new domain of concern, namely cyber security. In practice, this means that while we previously considered cyber attacks only on computational systems, we now need to consider them for vehicles as well. This creates a new field of research: vehicular cyber security. The field covers not only the possible vehicular cyber attacks and their corresponding defenses, but also their modeling and simulation with vehicular security analysis tools, as recommended by the ENISA report "Cyber Security and Resilience of smart cars: Good practices and recommendations". Building on this need for vehicular security analysis tools, this work aims to create and evaluate a domain-specific probabilistic modeling and simulation language for cyber attacks on modern connected vehicles. The language is designed on top of the existing threat modeling and risk management tool securiCAD® by foreseeti AB, and more specifically on its underlying mechanisms for describing and probabilistically evaluating the cyber threats of a model. The outcome of this work is the probabilistic modeling and simulation language for connected vehicles, called vehicleLang, ready for future use in the securiCAD® software.
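To give a feel for the kind of probabilistic evaluation such languages perform, here is a minimal sketch, not vehicleLang or securiCAD® itself: attack steps carry effort distributions, and Monte Carlo sampling over a small attack graph yields the distribution of time-to-compromise of a target asset. The graph, step names and distributions are illustrative assumptions.

```python
# Hedged sketch: sample time-to-compromise over an assumed vehicular attack
# graph, expanding steps in arrival order with exponentially distributed efforts.
import random
import statistics

# attack step -> list of (next step, mean effort in days) edges -- assumed model
GRAPH = {
    "connect_WAN":    [("bypass_gateway", 2.0)],
    "bypass_gateway": [("access_CAN_bus", 5.0)],
    "access_CAN_bus": [("spoof_ECU", 1.0), ("firmware_flash", 10.0)],
}
TARGETS = {"spoof_ECU", "firmware_flash"}

def sample_ttc(start="connect_WAN"):
    frontier, reached = [(0.0, start)], set()
    while frontier:
        frontier.sort()                    # expand the earliest pending step
        t, step = frontier.pop(0)
        if step in reached:
            continue
        reached.add(step)
        if step in TARGETS:
            return t                       # earliest compromised target
        for nxt, mean in GRAPH.get(step, []):
            frontier.append((t + random.expovariate(1.0 / mean), nxt))
    return float("inf")

ttcs = [sample_ttc() for _ in range(10_000)]
print(f"median time-to-compromise ~ {statistics.median(ttcs):.1f} days")
```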
29

An Assessment and Modeling of Copper Plumbing Pipe Failures due to Pinhole Leaks

Farooqi, Owais Ehtisham 15 August 2006
Pinhole leaks in copper plumbing pipes are a major concern for homeowners. The problem is spread across the nation and remains a threat to plumbing systems of all ages. In the absence of a single accepted mechanistic theory, no preventive measure is available to date. Most present mechanistic theories are based on analysis of failed pipe samples; however, an objective comparison with other pipes that did not fail is seldom made. The variability in hydraulic and water quality parameters has made the problem complex and difficult to quantify in terms of a plumbing system's susceptibility to pinhole leaks. The present work determines the spatial and temporal spread of pinhole leaks across the United States. Hotspot communities are identified based on repair histories and surveys. An assessment of variability in water quality is presented based on nationwide water quality data. A synthesis of causal factors is presented, and a scoring system for copper pitting is developed using goal programming. A probabilistic model is presented to evaluate the optimal replacement time for plumbing systems. Methodologies for mechanistic modeling based on corrosion thermodynamics and kinetics are presented.
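For a flavor of an optimal-replacement calculation in this spirit (not the thesis model), here is a minimal sketch: leaks accumulate following an assumed power-law (Weibull-intensity) process, and the long-run cost rate of replacing the plumbing every T years is minimized numerically. All cost and intensity parameters are illustrative assumptions.

```python
# Hedged sketch: age-replacement trade-off for a plumbing system. Replacing
# early wastes the repiping cost; replacing late accumulates leak repairs.
import numpy as np

c_replace, c_leak = 4000.0, 400.0          # repiping vs. single-leak repair ($), assumed
a, b = 0.02, 2.5                           # expected leaks by age t: a * t**b, assumed

def cost_rate(T):
    return (c_replace + c_leak * a * T**b) / T   # $ per year if replaced at age T

T = np.linspace(1, 60, 1000)
best = T[np.argmin(cost_rate(T))]
print(f"optimal replacement age ~ {best:.1f} years, "
      f"cost rate ~ {cost_rate(best):.0f} $/yr")
```

With these numbers the minimum lands near 10 years; the point of the sketch is the shape of the trade-off, not the specific value.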
30

Analysis of large-scale spiking network dynamics with spatio-temporal constraints: application to multi-electrode acquisitions in the retina

Nasser, Hassan 14 March 2014
Recent experimental advances have made it possible to record up to several hundred neurons simultaneously in the cortex or in the retina. Analyzing such data requires mathematical and numerical methods to describe the spatio-temporal correlations in population activity, which can be done with the Maximum Entropy method. Here, a crucial parameter is the product N×R, where N is the number of neurons and R the memory depth of the correlations (how far in the past the spike activity affects the current state). Standard statistical mechanics methods are limited to a spatial correlation structure with R = 1 (e.g. the Ising model), whereas methods based on transfer matrices, which allow the analysis of spatio-temporal correlations, are limited to N×R ≤ 20. In the first part of the thesis we propose a modified version of the transfer matrix method, based on a parallel Monte Carlo algorithm, allowing us to go up to N×R = 100. In the second part we present EnaS, a C++ library with a graphical user interface developed for neuroscientists. EnaS offers highly interactive tools that allow users to manage data, compute empirical statistics, fit statistical models and visualize results. Finally, in the third part, we test our method on synthetic and real data sets, the real data being retinal recordings provided by our neuroscientist partners. Our non-exhaustive analysis shows the advantage of considering spatio-temporal correlations in the analysis of retinal spike trains, but it also outlines the limits of Maximum Entropy methods.
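To ground the N×R discussion, here is a minimal sketch of the empirical statistics that spatio-temporal maximum-entropy fits are constrained to match (not EnaS itself): firing rates and time-lagged pairwise correlations up to memory depth R. The synthetic raster is an illustrative assumption.

```python
# Hedged sketch: empirical constraints for a spatio-temporal max-entropy model.
# For N spike trains with memory depth R, the model couples N*R binary variables.
import numpy as np

rng = np.random.default_rng(5)
N, T, R = 10, 50_000, 3
spikes = (rng.random((N, T)) < 0.05).astype(float)   # synthetic binary raster

rates = spikes.mean(axis=1)                          # <s_i>
lagged = np.empty((R, N, N))                         # <s_i(t) s_j(t-r)>
for r in range(R):
    lagged[r] = spikes[:, r:] @ spikes[:, :T - r].T / (T - r)

print(f"N*R = {N * R} binary variables in the max-entropy model")
print(f"mean rate {rates.mean():.3f}; zero-lag pair term {lagged[0, 0, 1]:.4f}")
```

The combinatorial growth in the state space with N×R is exactly why transfer-matrix methods stall near N×R = 20 and why the parallel Monte Carlo variant matters.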
