41
Commande faible coût pour une réduction de la consommation d'énergie dans les systèmes électroniques embarqués / Reduction of the energy consumption in embedded electronic devices with low control computational cost. Durand, Sylvain, 17 January 2011.
The demand for electronic components in embedded and miniaturized applications encourages the development of low-cost components, in terms of both energy consumption and computational resources. Power consumption can be reduced by decreasing the supply voltage and/or the clock frequency, but the device then runs more slowly in return. A fast predictive control strategy makes it possible to manage this tradeoff dynamically, minimizing the energy consumption while ensuring good performance of the device. Furthermore, the proposed strategies are highly robust to process variability, which is a recurrent problem in nanometric systems on chip. Solutions are also suggested in this thesis to reduce the control computational cost. Contrary to a time-triggered system, where the controller calculates the control law at each constant, periodic sampling time, an event-based controller updates the control signal only when the measurement changes sufficiently. Such a paradigm hence calls for computing resources only when they are indeed necessary, that is, when required from a performance or stability point of view. The idea is to soften the computational load by reducing the number of samples and, consequently, the CPU utilization. Simulation and, above all, experimental results validate the interest of such an approach.
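A minimal sketch of the event-triggered update rule described in this abstract, written in Python: the control law is recomputed only when the measurement has drifted sufficiently from its value at the last update. The first-order plant, the proportional gain and the threshold are illustrative assumptions, not values from the thesis.

```python
# Minimal sketch of an event-based controller: the control law is recomputed
# only when the measurement has changed sufficiently since the last update.
# The first-order plant, the gain and the threshold are illustrative assumptions.

def simulate(steps=500, dt=0.01, threshold=0.05):
    a, b = -1.0, 1.0              # assumed plant: dx/dt = a*x + b*u
    k_p = 2.0                     # assumed proportional gain
    x, u = 1.0, 0.0               # initial state and control signal
    y_at_last_update = x
    updates = 0

    for _ in range(steps):
        y = x                                 # measurement (full-state feedback)
        if abs(y - y_at_last_update) > threshold:
            u = -k_p * y                      # control law evaluated only on events
            y_at_last_update = y
            updates += 1
        x += dt * (a * x + b * u)             # plant integration runs every step

    return x, updates

if __name__ == "__main__":
    final_state, control_updates = simulate()
    print(f"final state {final_state:.4f} after {control_updates} control updates")
```

A purely time-triggered controller would recompute u at every one of the 500 steps; here the update count printed at the end stays far lower while the state still settles near the origin.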
42
Design och utvärdering av händelsebaserad Android-applikation / Design and Evaluation of an Event-Based Android Application. Olsson Appler, Sebastian, January 2013.
In this paper, an event-based Android application is designed and evaluated. MyWorld is a mobile application in which events taking place in the user's vicinity can be viewed on a map. Throughout the work, well-established methods are used to create a well-defined design, and problems and solutions related to the design of an event-based application are discussed. The design is then evaluated through user tests that measure success, missteps, and satisfaction. Over a process of three iterations, the work has given the application a new design that is more usable for its users. The end result is a finished design prototype that is implemented in the application.
43
A scheduling model for a coal handling facility. Swart, Marinda, 10 June 2005.
The objective of this project is to develop an operational scheduling model for Sasol Mining's coal handling facility, Sasol Coal Supply (referred to as SCS), to optimise daily operations. In this document, the specific scheduling problem at SCS is presented and solved using Mixed Integer Non-Linear Programming (MINLP) continuous-time representation techniques. The most recent MINLP scheduling techniques are presented and applied to an example problem. The assumption is made that the results from the example problem will display trends that apply to the SCS scheduling problem as well. Based on this assumption, the unit-specific event-based continuous-time formulation is chosen for the SCS scheduling problem. The detailed mathematical formulation of the SCS scheduling problem, based on the chosen technique, is discussed, and the changes needed to customise the formulation for the SCS situation are presented. The results show that the first-phase model does not solve within 72 hours. A solution time of more than three days is not acceptable for an operational scheduling model in a dynamic system like SCS. Various improvement approaches are applied during the second phase of the model development. Special Ordered Sets of Type 1 (SOS1) variables are successfully applied in the model to reduce the number of binary variables. The time and duration constraints are restructured to simplify the structure of the model. A specific linearization and solution technique is applied to the non-linear equations to ensure reduced model solution times and reliable results. The improved model for one period solves to optimality within two minutes. This dramatic improvement ensures that the model will be used operationally at SCS to optimise daily operations. The scheduling model is currently being implemented at SCS, and examples of the input variables and output results are presented. It is concluded that the unit-specific event-based MINLP continuous-time formulation method, as presented in the literature, is not robust enough to be applied directly to an operational industrial-sized scheduling problem such as the SCS problem. Customised modifications to the formulation are necessary to ensure that the model solves in a time acceptable for operational use. However, it is shown that MINLP can successfully be applied to optimise the scheduling of an industrial-sized plant such as SCS. Although more research is required to derive robust formulation techniques, the principle of using mathematical methods to optimise operational scheduling in industry can dramatically change the way plants are operated. The optimisation of daily schedules at SCS by applying the MINLP continuous-time scheduling technique has made a significant contribution to the coal handling industry. Finally, it can be concluded that the SCS scheduling problem was successfully modelled and that the operational scheduling model will add significant value to the Sasol Group. / Dissertation (MEng (Industrial Engineering)), University of Pretoria, 2006. / Industrial and Systems Engineering / unrestricted
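The abstract above mentions a linearization applied to the non-linear equations so that the model solves quickly. One common device for this kind of reformulation (an assumption here, not necessarily the technique used for SCS) is to replace a product of a binary and a bounded continuous variable with an auxiliary variable constrained by big-M inequalities. The sketch below shows this with the PuLP modelling library on a toy problem; all variable meanings and bounds are illustrative.

```python
# Illustrative big-M linearization of z = y * x, where y is binary and
# 0 <= x <= X_MAX. This is a generic modelling device, not the SCS model itself.
from pulp import LpBinary, LpMinimize, LpProblem, LpVariable

X_MAX = 100.0                                    # assumed upper bound on x

prob = LpProblem("bilinear_linearization_sketch", LpMinimize)
x = LpVariable("x", lowBound=0, upBound=X_MAX)   # e.g. tons handled in a slot (assumed)
y = LpVariable("y", cat=LpBinary)                # e.g. unit active in the slot (assumed)
z = LpVariable("z", lowBound=0)                  # auxiliary variable standing in for y * x

# Big-M constraints forcing z == y * x at every feasible solution:
prob += z <= x
prob += z <= X_MAX * y
prob += z >= x - X_MAX * (1 - y)

# Toy requirements and objective so the sketch solves on its own.
prob += x >= 30                                  # at least 30 tons must be handled
prob += z >= 30                                  # ...by an active unit
prob += x + 10 * y                               # minimise tonnage plus activation cost

prob.solve()
print(x.value(), y.value(), z.value())
```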
44
Parallelism in Event-Based Computations with Applications in Biology. Bauer, Pavol, January 2017.
Event-based models find frequent use in fields such as computational physics and biology, as they may contain both continuous and discrete state variables and may incorporate both deterministic and stochastic state transitions. If the state transitions are stochastic, computer-generated random numbers are used to obtain the model solution. This type of event-based computation is also known as Monte Carlo simulation. In this thesis, I study different approaches to execute event-based computations on parallel computers. This ultimately allows users to retrieve their simulation results in a fraction of the original computation time. As system sizes continue to grow and models have to be simulated over longer time scales, this is a necessary approach for current computational tasks. More specifically, I propose several ways to asynchronously simulate such models on parallel shared-memory computers, for example using parallel discrete-event simulation or task-based computing. The particular event-based models studied herein find applications in systems biology, computational epidemiology and computational neuroscience. In the presented studies, the proposed methods allow for high efficiency of the parallel simulation, typically scaling well with the number of computer cores used. As the scaling typically depends on individual model properties, the studies also investigate which quantities have the greatest impact on simulation performance. Finally, the presented studies include other insights into event-based computations, such as methods to estimate parameter sensitivity in stochastic models and to simulate models that include both deterministic and stochastic state transitions. / UPMARC
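As a concrete instance of the stochastic, event-based computations described above, the following Python sketch runs Gillespie's direct-method simulation of a simple birth-death process, where each event time and event type is drawn from computer-generated random numbers. The reaction network and rate constants are illustrative assumptions, and the parallel execution strategies studied in the thesis are not shown.

```python
import random

def gillespie_birth_death(x0=50, birth=1.0, death=0.02, t_end=100.0, seed=1):
    """Direct-method SSA for the assumed reactions: 0 -> X (rate birth) and
    X -> 0 (rate death * x)."""
    random.seed(seed)
    t, x = 0.0, x0
    trajectory = [(t, x)]
    while t < t_end:
        a_birth = birth
        a_death = death * x
        a_total = a_birth + a_death
        if a_total == 0.0:
            break
        # The waiting time to the next event is exponentially distributed.
        t += random.expovariate(a_total)
        # Choose which event fires, proportionally to its propensity.
        if random.random() * a_total < a_birth:
            x += 1
        else:
            x -= 1
        trajectory.append((t, x))
    return trajectory

if __name__ == "__main__":
    traj = gillespie_birth_death()
    print(f"{len(traj) - 1} events, final copy number {traj[-1][1]}")
```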
45
Contribution à la prévision des crues sur le bassin du Lez : modélisation de la relation pluie-débit en zone karstique et impact de l'assimilation de débits / Improving flood forecasting in the Lez Catchment: modeling the rainfall-runoff relationship in karstic regions and the impact of assimilating discharge data. Coustau, Mathieu, 13 December 2011.
The sometimes devastating flash floods which affect the Mediterranean watersheds of the South of France are difficult to anticipate. Flood forecasting requires the use of rainfall-runoff models, whose efficiency is still limited by uncertainty related to the spatial variability of Mediterranean rainfall and to the characterization of the initial hydric state of the system. In karstic catchments, these uncertainties add to those due to aquifer dynamics and their role in flood genesis. The first part of this work presents a distributed, event-based, parsimonious hourly rainfall-runoff model to reconstruct flash flood events at the outlet of the 114 km² karstic Lez catchment (Montpellier).
The model is evaluated not only on the quality of the discharge simulations produced, but also on the quality of its initialization, obtained through a relationship between its initial condition and various hydric state indicators of the system. Calibrated on 21 flood episodes, the model produces satisfactory simulations, and its initial condition is significantly correlated with the Hu2 soil humidity index of the Météo-France SIM model and with piezometric levels in the Lez aquifer. Radar rainfall measured in early autumn is of good quality and leads to improved discharge simulations and a better estimate of the model's initial condition. However, radar rainfall measured in late autumn is of poorer quality and does not improve the simulations. Faced with the uncertainty related to model parametrization and to the estimation of radar rainfall, the second part of this work analyzes the benefit of assimilating observed discharge in order to correct, in real time, the most sensitive model parameters, notably the initial condition and the radar rainfall input to the model. The data assimilation procedure was implemented with the PALM coupler, which links the hydrological model to the assimilation algorithm. Correcting the initial condition generally improves the forecasts (under the hypothesis of known future rainfall); correcting the rainfall has similar effects. Nevertheless, the limits of this correction are reached when the model does not satisfactorily reproduce the initial rising limb of the hydrograph, which could be improved in future work. Finally, this work shows that the complexity of a karstic catchment can be represented efficiently with a reduced number of parameters to simulate discharge, and contributes to the improvement of operational tools for flood forecasting.
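As a minimal illustration of correcting a model's initial condition against observed discharge (the thesis couples a distributed event-based model to an assimilation algorithm through PALM, which is not reproduced here), the sketch below tunes the initial storage of a single linear reservoir by grid search over candidate values. The reservoir model, parameters and synthetic observations are all assumptions.

```python
# Toy illustration: adjust the initial storage S0 of a linear reservoir
# (Q = k * S) so that simulated discharge best matches observed discharge.
# Model, parameter values and "observations" are illustrative assumptions.

def simulate_discharge(s0, rainfall, k=0.1):
    s, discharge = s0, []
    for p in rainfall:
        q = k * s
        s = s + p - q
        discharge.append(q)
    return discharge

def assimilate_initial_condition(rainfall, observed, candidates):
    def cost(s0):
        sim = simulate_discharge(s0, rainfall)
        return sum((qs - qo) ** 2 for qs, qo in zip(sim, observed))
    return min(candidates, key=cost)

if __name__ == "__main__":
    rain = [0, 5, 20, 10, 2, 0, 0, 0]                    # assumed rainfall (mm/h)
    truth = simulate_discharge(30.0, rain)               # pretend S0 = 30 is the truth
    observed = [q + 0.5 * ((-1) ** i) for i, q in enumerate(truth)]  # noisy observations
    candidates = [float(s) for s in range(0, 101, 5)]
    print("estimated initial storage:", assimilate_initial_condition(rain, observed, candidates))
```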
46
Documenting and Improving the Design of a Large-scale System. Toresson, Gabriel, January 2019.
As software systems become increasingly large and complex, the need to make them easily maintainable increases, since large systems are expected to last for many years. It has been estimated that system maintenance accounts for a large part of many IT departments' software development costs. In order to design a complex system to be maintainable, it is necessary to introduce structure, often as models in the form of a system architecture and a system design. As development of complex large-scale systems progresses over time, the models may need to be reconstructed, perhaps because development has diverged from the initial plan, or because changes had to be made during implementation. This thesis presents a reconstructed documentation of a complex large-scale system, as well as suggestions for how to improve the existing design based on identified needs and insufficiencies. The work was performed primarily through a qualitative manual review of the source code, and the proposal was generated iteratively. The proposed design was evaluated, and it was concluded that it does address the needs and insufficiencies and that it can realistically be implemented.
47
An Overview of Event-based Facades for Modular Composition and Coordination of Multiple Applications. Malakuti, Somayeh, 18 May 2016.
Complex software systems are usually developed as systems of systems (SoSs) in which multiple constituent applications are composed and coordinated to fulfill desired system-level requirements. The constituent applications must be augmented with suitable coordination-specific interfaces, through which they can participate in coordinated interactions. Such interfaces, as well as the coordination rules, have a crosscutting nature. Therefore, to increase the reusability of the applications and the comprehensibility of SoSs, suitable mechanisms are required to modularize the coordination rules and interfaces away from the constituent applications. We introduce a new abstraction named architectural event modules (AEMs), which facilitate defining constituent applications and desired coordination rules as modules of SoSs. AEMs augment the constituent applications with event-based facades to let them participate in coordinated interactions. We introduce the EventArch language, in which the concept of AEMs is implemented, and illustrate its suitability using a case study.
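The idea of letting applications participate in coordinated interactions through event-based facades, while the coordination rule stays in its own module, can be sketched with a simple publish/subscribe mediator in Python. The class and event names below are illustrative assumptions and do not reflect the EventArch syntax.

```python
# Minimal sketch of coordinating two applications through event-based facades:
# the coordination rule is a separate module that only talks to the facades.
# All class and event names here are illustrative assumptions.
from collections import defaultdict
from typing import Callable, DefaultDict, Dict, List

class EventBus:
    def __init__(self) -> None:
        self._subscribers: DefaultDict[str, List[Callable[[Dict], None]]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable[[Dict], None]) -> None:
        self._subscribers[event].append(handler)

    def publish(self, event: str, payload: Dict) -> None:
        for handler in self._subscribers[event]:
            handler(payload)

class SensorAppFacade:
    """Event-based facade of one constituent application (assumed example)."""
    def __init__(self, bus: EventBus) -> None:
        self.bus = bus

    def report_temperature(self, celsius: float) -> None:
        self.bus.publish("temperature_reported", {"celsius": celsius})

class CoolerAppFacade:
    """Facade of a second application that reacts to coordination events."""
    def __init__(self, bus: EventBus) -> None:
        bus.subscribe("cooling_requested", self.on_cooling_requested)

    def on_cooling_requested(self, payload: Dict) -> None:
        print(f"cooler: starting, target {payload['target']} C")

def coordination_rule(bus: EventBus, threshold: float = 30.0) -> None:
    """System-level rule kept outside both applications."""
    def on_temperature(payload: Dict) -> None:
        if payload["celsius"] > threshold:
            bus.publish("cooling_requested", {"target": threshold})
    bus.subscribe("temperature_reported", on_temperature)

if __name__ == "__main__":
    bus = EventBus()
    CoolerAppFacade(bus)
    coordination_rule(bus)
    SensorAppFacade(bus).report_temperature(34.5)
```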
48
Training of Object Detection Spiking Neural Networks for Event-Based Vision. Johansson, Olof, January 2021.
Event-based vision offers higher dynamic range, finer time resolution and lower latency than conventional frame-based vision sensors. These attributes are useful in varying light conditions and under fast motion. However, there are no neural network models and training protocols optimized for object detection with event data, and conventional artificial neural networks for frame-based data are not directly suitable for that task. Spiking neural networks are natural candidates, but further work is required to develop an efficient object detection architecture and an end-to-end training protocol. Object detection in varying light conditions is, for example, identified as a challenging problem for the automation of construction equipment such as earth-moving machines, where the aim is to increase the safety of operators and make repetitive processes less tedious. This work focuses on the development and evaluation of a neural network for object detection with data from an event-based sensor. Furthermore, the strengths and weaknesses of an event-based vision solution are discussed in relation to the known challenges described in earlier work on the automation of earth-moving machines. A solution for object detection with event data is implemented as a modified YOLOv3 network with spiking convolutional layers, trained with a backpropagation algorithm adapted for spiking neural networks. The performance is evaluated on the N-Caltech101 dataset with classes for airplanes and motorbikes, resulting in a mAP of 95.8% for the combined network and 98.8% for the original YOLOv3 network with the same architecture. The solution is investigated as a proof of concept, and suggestions for further work are described based on a recurrent spiking neural network.
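Training spiking layers with backpropagation requires a differentiable stand-in for the spike non-linearity; a widely used option (assumed here, not necessarily the exact method of the thesis) is a surrogate gradient. The PyTorch sketch below shows a leaky integrate-and-fire step whose Heaviside spike uses a fast-sigmoid surrogate in the backward pass; the neuron constants are illustrative.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, fast-sigmoid surrogate in the backward pass."""

    @staticmethod
    def forward(ctx, membrane_potential):
        ctx.save_for_backward(membrane_potential)
        return (membrane_potential > 0.0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        surrogate = 1.0 / (1.0 + 10.0 * v.abs()) ** 2   # assumed surrogate slope of 10
        return grad_output * surrogate

def lif_step(input_current, membrane, threshold=1.0, decay=0.9):
    """One leaky integrate-and-fire update; the constants are illustrative assumptions."""
    membrane = decay * membrane + input_current
    spikes = SurrogateSpike.apply(membrane - threshold)
    membrane = membrane - spikes * threshold            # soft reset after a spike
    return spikes, membrane

if __name__ == "__main__":
    torch.manual_seed(0)
    current = torch.randn(4, 8, requires_grad=True)     # batch of 4, 8 neurons
    membrane = torch.zeros(4, 8)
    spikes, membrane = lif_step(current, membrane)
    spikes.sum().backward()                             # gradients flow via the surrogate
    print(spikes.sum().item(), current.grad.abs().mean().item())
```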
49
Haptic optical tweezers with 3D high-speed tracking / Pinces optiques haptiques avec 3D haute vitesse de suivi. Yin, Munan, 3 February 2017.
Micromanipulation has great potential to revolutionize biological research and medical care. At small scales, microrobots can perform medical tasks minimally invasively and explore life at a fundamental level. Optical tweezers are one of the most popular techniques for biological manipulation. Small-batch production, which demands high flexibility, mainly relies on teleoperation. However, the limited level of intuitiveness makes it more and more difficult to conduct manipulation and exploration tasks effectively in the complex microworld. Under such circumstances, pioneering researchers have proposed to incorporate haptics into the control loop of the optical tweezers system, aiming to handle micromanipulation tasks in a more flexible and effective way.
However, the solution is not yet complete, and there are two main challenges to resolve in this thesis: 3D force detection, which should be accurate, fast, and robust in a large enough workspace; and high-speed force feedback, up to 1 kHz, which is indispensable to allow a faithful tactile sensation and to ensure system stability. In optical tweezers micromanipulation, vision is a sound candidate for force estimation since the position-force model is well established. However, 1 kHz tracking is beyond the speed of conventional processing methods. The emerging discipline of biomorphic engineering, which aims to integrate the behaviors of living systems into large-scale computer hardware or software, breaks this bottleneck. The Asynchronous Time-Based Image Sensor (ATIS) is the latest generation of neuromorphic silicon retina prototype, which records only scene contrast changes in the form of a stream of events. This property excludes the redundant background and allows high-speed motion detection and processing. Event-based vision has thus been applied to address the requirement of 3D high-speed force feedback. As a result, the first 3D high-speed haptic optical tweezers for biological applications have been achieved. The optical realization and event-based tracking algorithms for 3D high-speed force detection have been developed and validated. Reproducible exploration of 3D biological surfaces has been demonstrated for the first time. As a powerful 3D high-speed force sensor, the developed optical tweezers system holds significant potential for various applications.
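The position-force model mentioned above treats the optical trap as a linear spring, so the force on a trapped bead can be read from its displacement out of the trap center. The Python sketch below combines that Hookean model with an event-driven running estimate of the bead position; the trap stiffness, pixel calibration, event format and exponential update are illustrative assumptions rather than the ATIS pipeline developed in the thesis.

```python
# Sketch: event-driven 2D tracking of a trapped bead plus a Hookean force estimate.
# Events are (t, x, y, polarity) tuples; stiffness, pixel size and the smoothing
# factor are assumed values, not calibrated quantities from the thesis.

TRAP_STIFFNESS = 50e-6      # N/m, assumed trap stiffness
PIXEL_SIZE = 100e-9         # m per pixel, assumed optical calibration
ALPHA = 0.05                # weight of each new event in the running centroid

def track_and_estimate_force(events, trap_center=(64.0, 64.0)):
    cx, cy = trap_center                     # start the estimate at the trap center
    forces = []
    for _t, x, y, _polarity in events:
        # Event-driven update: every incoming event nudges the centroid estimate.
        cx += ALPHA * (x - cx)
        cy += ALPHA * (y - cy)
        dx = (cx - trap_center[0]) * PIXEL_SIZE
        dy = (cy - trap_center[1]) * PIXEL_SIZE
        # Hookean model: restoring force proportional to displacement from the trap.
        forces.append((-TRAP_STIFFNESS * dx, -TRAP_STIFFNESS * dy))
    return (cx, cy), forces

if __name__ == "__main__":
    # Synthetic burst of events around pixel x = 70: bead displaced ~6 px from the trap.
    fake_events = [(i * 1e-5, 70.0 + (i % 3) - 1, 64.0, 1) for i in range(200)]
    (cx, cy), forces = track_and_estimate_force(fake_events)
    fx, fy = forces[-1]
    print(f"centroid ({cx:.1f}, {cy:.1f}) px, force ({fx * 1e12:.1f}, {fy * 1e12:.1f}) pN")
```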
50
Highly reliable, low-latency communication in low-power wireless networks. Brachmann, Martina, 11 January 2019.
Low-power wireless networks consist of spatially distributed, resource-constrained devices (also referred to as nodes) that are typically equipped with integrated or external sensors and actuators. Nodes communicate with each other using wireless transceivers and thus relay data, such as collected sensor values or commands for actuators, cooperatively through the network. In this way, low-power wireless networks can support a plethora of applications, including monitoring the air quality in urban areas or controlling the heating, ventilation and cooling of large buildings. The use of wireless communication in such monitoring and actuating applications allows for higher flexibility and ease of deployment, and thus overall lower costs, compared to wired solutions. However, wireless communication is notoriously error-prone. Message losses happen often and unpredictably, making it challenging to support applications requiring both high reliability and low latency. Highly reliable, low-latency communication, along with high energy efficiency, is nevertheless a key requirement in several important application scenarios, most notably the open- and closed-loop control functions found in industry and factory automation applications.
Communication protocols that rely on synchronous transmissions have been shown to be able to overcome this limitation. These protocols depart from traditional single-link transmissions and do not attempt to avoid concurrent transmissions from different nodes to prevent collisions. On the contrary, they make nodes send the same message at the same time over several paths. Phenomena like constructive interference and capture then ensure that messages are received correctly with high probability.
While many approaches relying on synchronous transmissions have been presented in the literature, two important aspects have received only little consideration: (i) reliable operation in harsh environments and (ii) support for event-based data traffic. This thesis addresses these two open challenges and proposes novel communication protocols to overcome them.
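The synchronous-transmission flooding described above, where every node that has just received the packet retransmits it in the same slot and relies on constructive interference and capture, can be illustrated with a small slot-based simulation. The topology, the per-slot reception probability and the retransmission budget below are assumptions for illustration, not a protocol from the thesis.

```python
import random

def flood(neighbors, initiator, p_rx=0.9, max_tx=3, seed=0):
    """Slot-based sketch of synchronous-transmission flooding.

    In every slot, all nodes that already hold the packet transmit simultaneously;
    a concurrent transmission is assumed to be received with probability p_rx
    thanks to constructive interference and the capture effect.
    """
    random.seed(seed)
    received_at = {initiator: 0}            # node -> slot of first reception
    tx_count = {}
    transmitters = {initiator}
    slot = 0
    while transmitters:
        slot += 1
        new_transmitters = set()
        for tx in transmitters:
            tx_count[tx] = tx_count.get(tx, 0) + 1
            for rx in neighbors[tx]:
                if rx not in received_at and random.random() < p_rx:
                    received_at[rx] = slot
                    new_transmitters.add(rx)
        # Nodes keep relaying until they exhaust their retransmission budget.
        transmitters = {n for n in (transmitters | new_transmitters)
                        if tx_count.get(n, 0) < max_tx}
    return received_at

if __name__ == "__main__":
    # Assumed 5-node line topology: 0 - 1 - 2 - 3 - 4
    topo = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
    print(flood(topo, initiator=0))
```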