101 |
Privacy-Preserving Public Verification via Homomorphic Encryption
Becher, Kilian, 07 February 2024 (has links)
Nachhaltige und ethisch vertretbare Beschaffung und Produktion gehören zu den großen Herausforderungen, die aus dem rasanten Klimawandel und der wachsenden Weltbevölkerung resultieren. Die Erneuerbare-Energien-Richtlinie II der EU und das deutsche Lieferkettensorgfaltspflichtengesetz sind nur zwei Beispiele für die Vielzahl von Gesetzen und Vorschriften, die Standards für nachhaltige und ethisch vertretbare Beschaffung und Produktion vorgeben. Sie implizieren einen Bedarf an Transparenz, Rückverfolgbarkeit und Verifizierbarkeit von Lieferketten und Transaktionen.
Öffentliche Verifikationen von Transaktionen entlang von Lieferketten ermöglichen es Dritten, die Einhaltung von Standards und Richtlinien und den Wahrheitsgehalt von Nachhaltigkeitsversprechen zu überprüfen. Folglich kann die öffentliche Überprüfbarkeit Kunden, öffentlichen Stellen und Nichtregierungsorganisationen dabei helfen, Verstöße und Betrug in Lieferketten aufzudecken. Dies wiederum kann dazu beitragen, den Druck zur Einhaltung geltender Standards und Vorschriften zu erhöhen.
Transaktionen in Lieferketten basieren oft auf vertraulichen Informationen, wie beispielsweise Mengen und Preise. Die Transparenz derartiger Daten könnte auf Geschäftsgeheimnisse schließen lassen, was direkten Einfluss auf die Wettbewerbsvorteile der beteiligten Firmen hätte. Die Vereinbarkeit von Transparenz und Vertraulichkeit scheint jedoch auf den ersten Blick widersprüchlich zu sein.
Diese Dissertation stellt sich der Herausforderung, die öffentliche Verifizierbarkeit von Transaktionen in Lieferketten unter Wahrung der Vertraulichkeit zu ermöglichen. Ausgehend von zwei Fallbeispielen für Lieferketten-Verifikationen werden zunächst Anforderungen an Lösungen untersucht und fünf Forschungsfragen abgeleitet. Anschließend wird eine universelle Lösung entworfen, welche Transparenz und Vertraulichkeit in Einklang bringt. Das vorgestellte Systemmodell ermöglicht sichere öffentliche Verifikationen durch den Einsatz von Fully Homomorphic Encryption (FHE) und Proxy Re-Encryption (PRE).
Um die Eignung des Systemmodells für eine Vielzahl realer Szenarien zu verdeutlichen, werden in dieser Dissertation Protokolle für verschiedene Verifikationsfunktionen entworfen. Dies umfasst die Verifikation von Bilanzen, motiviert durch den Handel mit nachhaltigem Palmöl, sowie die Verifikation von Verhältnissen, veranschaulicht durch die Verarbeitung verschiedener Arten von Kobalt. Durch theoretische und empirische Untersuchungen wird nachgewiesen, dass die Protokolle sichere öffentliche Verifikationen für realitätsnahe Szenarien in praktikabler Zeit ermöglichen.
Im Weiteren werden die Sicherheitseigenschaften und -implikationen des vorgeschlagenen Systemmodells und der Protokolle untersucht. Dies beinhaltet eine formale Analyse des Risikos, vertrauliche Informationen im Falle wiederholter, gleicher Verifikationen preiszugeben. Aufgrund der Anfälligkeit gegenüber derartigen Angriffen beim Verwenden probabilistischer Output Obfuscation, wird das Paradigma der Data-Dependent Deterministic Obfuscation (D3O) vorgestellt. D3O ist ein universelles Konzept und damit unabhängig vom Anwendungsfall der Lieferketten-Verifikation. Daher kann es in einer Vielzahl weiterer Protokolle für sichere Berechnungen eingesetzt werden, um das Abfließen vertraulicher Informationen zu reduzieren. / Sustainable and ethical sourcing and production are major challenges that arise from rapid climate change and our growing world population. The EU's Renewable Energy Directive II and the German Supply Chain Act are just two examples of the multitude of laws and regulations that define standards for sustainable and ethical sourcing and production. They imply a need for supply chain transparency, traceability, and verification.
Public verification of supply chain transactions gives any third-party verifier the chance to evaluate compliance and the correctness of claims based on supply chain transaction details. Therefore, public verification can help customers, buyers, regulators, and non-governmental organizations uncover non-compliance and fraud committed by supply chain actors. This, in turn, can help increase the pressure to comply with applicable standards and regulations.
Supply chain transactions often involve confidential data like amounts or prices. Transparency of such data could leak trade secrets and affect companies' competitive advantages. However, reconciling transparency with confidentiality seems contradictory at first glance.
This thesis takes up the challenge of enabling privacy-preserving public verification of confidential supply chain transactions. Given two exemplary real-world use cases for supply chain verification, the thesis first investigates requirements for valid solutions and infers five research questions. It then designs a universal solution that combines transparency with confidentiality. The proposed system model achieves privacy-preserving public verification by employing the cryptographic techniques of fully homomorphic encryption (FHE) and proxy re-encryption (PRE).
To demonstrate the suitability of the system model for a large variety of realistic supply chain verification scenarios, the thesis designs privacy-preserving protocols for different verification functions. This includes the verification of balances, using the trade in sustainable palm oil as an example, as well as the verification of ratios, motivated by different forms of cobalt sourcing. Theoretical and extensive empirical evaluation shows that the proposed protocols enable privacy-preserving public verification for the mentioned supply chain scenarios in practical time.
Additionally, this thesis investigates the security implications of the proposed system model and protocols and formally analyzes the risk of leaking information through repeated similar verifications. Based on the identified vulnerability to such attacks in the case of probabilistically obfuscated protocol outputs, the thesis introduces and investigates the paradigm of data-dependent deterministic obfuscation (D3O). D3O is a universal concept that is independent of the field of supply chain verification. It can reduce the leakage of confidential information in a large class of privacy-preserving protocols.
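The core idea of verifying balances over encrypted transaction amounts can be sketched with a plain additively homomorphic scheme. The toy Paillier implementation below is purely illustrative: the thesis's system model uses FHE and PRE, the parameters here are far too small to be secure, and the final decryption merely stands in for the step that would reveal only the verification result to the public verifier.

```python
import math, random

def keygen(p=10007, q=10009):
    # Toy Paillier key pair from fixed small primes (illustration only,
    # nowhere near secure parameter sizes).
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    return n, (lam, pow(lam, -1, n))

def enc(n, m):
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    # (1+n)^m * r^n mod n^2 -- multiplying ciphertexts adds plaintexts
    return pow(1 + n, m % n, n * n) * pow(r, n, n * n) % (n * n)

def dec(n, sk, c):
    lam, mu = sk
    m = (pow(c, lam, n * n) - 1) // n * mu % n
    return m if m <= n // 2 else m - n      # decode negative balances

n, sk = keygen()
inputs = [120, 80, 50]    # hypothetical certified-sourcing amounts
outputs = [100, 90]       # hypothetical sold amounts
c = enc(n, 0)
for a in inputs:
    c = c * enc(n, a) % (n * n)                    # homomorphic addition
for a in outputs:
    c = c * pow(enc(n, a), -1, n * n) % (n * n)    # homomorphic subtraction
balance = dec(n, sk, c)
print(balance >= 0)   # the public check: sourced amounts cover sold amounts
```

The supply chain actors never reveal individual amounts; only the aggregated balance (or, in a stricter design, just its sign) becomes visible.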
|
102 |
Users Attitudes on Changing Anonymity on Social Media / Användares attityder gällande förändring av anonymitet på sociala medier
Toblad, Simon, January 2023 (has links)
Anonymity on social media is a complex topic that has garnered attention from politicians and legislators alike. However, the opinions of end users on this matter have not been given due consideration. This study aims to explore the attitudes of active social media users (aged 30 and below) towards a possible change in how anonymity operates on social media platforms, in order to increase accountability and improve safety. A sample of 112 respondents in Sweden was surveyed, and five users were interviewed to gain deeper insights. Although a majority of users express a desire for change in how anonymity operates on social media platforms, there is currently no sense of urgency to effect change. Educating users and providing better reporting tools may be an effective alternative for increasing safety online, as a small percentage of users are perceived to be responsible for the majority of harm on social media platforms. Different kinds of user verification have been suggested as a possible solution to the issue of malicious bots and impersonation accounts, with some social media platforms introducing paid verification subscriptions. However, the results suggest that the vast majority of users are not willing to pay, which could further exacerbate the issue at hand. Paid verification services bring with them the potential of dividing the user base (as verified paying users are attributed a status badge), whilst simultaneously continuing to provide malicious anonymous actors with a way to hide in the masses. Hence, whilst paid verification subscriptions may be a lucrative solution for the social media companies, providing them with an additional high-margin revenue stream, they may ultimately be to the detriment of safety and inclusivity. Further research on anonymity in relation to demographics, societal norms, and backgrounds is needed to determine definitively whether a change to anonymity at scale is plausible.
Furthermore, future work on the potential impact of paid verification subscriptions is also needed, as these recent implementations have consequences for both the user base and the platforms as a whole. / Anonymitet på sociala medier är ett komplext ämne som har uppmärksammats av politiker och lagstiftare. Trots detta har användares åsikter i frågan inte fått tillräckligt med uppmärksamhet. Denna studie syftar till att undersöka attityder hos aktiva användare av sociala medier (30 år och yngre) gentemot hur anonymitet fungerar på sociala medieplattformar, i syfte att öka ansvarstagande och säkerhet. En grupp på 112 respondenter i Sverige undersöktes och fem användare intervjuades för att få djupare insikter. Resultaten visar att även om majoriteten av användare uttrycker en önskan om förändringar i hur anonymitet fungerar på sociala medieplattformar, finns det för närvarande ingen känsla av brådska att åstadkomma förändring. Att utbilda användare och tillhandahålla bättre rapporteringsverktyg kan vara ett effektivt alternativ för att öka säkerheten online, eftersom en liten andel av användarna uppfattas som ansvariga för den största delen av skadorna på sociala medieplattformar. Olika typer av användarverifiering har föreslagits som en möjlig lösning för att adressera problemet med skadliga botar och falska konton, vissa sociala medieplattformar har även introducerat betalda verifieringsprenumerationer. Resultaten tyder på att majoriteten av användarna inte är villiga att betala för en sådan prenumeration, vilket kan förvärra problemet. Betalda verifieringstjänster riskerar också att dela upp användarbasen (eftersom betalande användare tilldelas en statussymbol), samtidigt som det fortsätter att ge skadliga anonyma aktörer ett sätt att gömma sig bland massorna.
Betalda verifieringsprenumerationer kan vara en lukrativ lösning för sociala medieföretag och ge dem ytterligare en intäktsström med hög marginal, men det kan i slutändan vara på bekostnad av säkerhet och inkludering. Ytterligare forskning om anonymitet i relation till demografi, samhällsnormer och bakgrund behövs för att kunna fastställa om en förändring av anonymitet i stor skala är möjlig. Dessutom behövs framtida arbete om de potentiella konsekvenserna av betalda verifieringsprenumerationer, eftersom dessa nyligen införda implementeringar har konsekvenser både för användarbasen och plattformarna som helhet.
|
103 |
Analyse von Prompt Gamma-Ray Timing Spektren mit Deep-Learning Methoden zur Behandlungsverifikation in der Protonentherapie
Ritscher, Noah, 29 October 2024 (has links)
In der Protonentherapie können aufgrund verschiedener Unsicherheiten Reichweitenveränderungen von Protonen auftreten. Bei der Prompt-Gamma-Ray Timing Methode wird eine Reichweitenveränderung von Protonen anhand der zeitlichen Verteilung der erzeugten prompten Gammastrahlung ermittelt. Die Eignung von neuronalen Netzen zur Reichweitenbestimmung anhand der zeitlichen und spektralen Verteilung von prompten Gammastrahlen wurde untersucht. Es wurden Modelle für die Untersuchung von 1D-Zeitspektren und 2D-Energie-Zeit-Spektren erstellt. Die entwickelten Modelle wurden auf statisch applizierten Spots hoher Intensität trainiert und auf gescannten Spots mit niedrigerer Intensität validiert. Es wurde festgestellt, dass die parallele Analyse von Energie und Zeit die beste Vorhersagekraft der Modelle erreichte. Es wurden Reichweitenveränderungen für statisch applizierte Spots mit einer Genauigkeit von 3,10 mm detektiert, während für gescannt applizierte Spots unter Verwendung von Datenakkumulation ein RMSE von 3,70 mm erreicht wurde. Im Vergleich zu vorherigen multivariaten Modellen konnte keine Verbesserung mit neuronalen Netzen demonstriert werden. Komplexere Modelle und ein umfangreicherer Datensatz könnten die Genauigkeit der Reichweitenvorhersage mittels Deep Learning noch weiter verbessern.
1 Einleitung 1
2 Theoretische Grundlagen 3
2.1 Protonentherapien 3
2.1.1 Wechselwirkungen 3
2.1.2 Behandlungsverifikation 4
2.2 Gamma-Ray-basierte Behandlungsverifikation 6
2.2.1 Prompt Gamma-Ray Timing 7
2.2.2 Prompt Gamma-Ray Spectroscopy 9
2.3 Deep Learning 9
2.3.1 Allgemeine Funktionsweise 9
2.3.2 Feedforward Neural Network 10
2.3.3 Faltendes Neuronales Netz 12
2.3.4 Regression und Klassifikation 14
3 Material und Methoden 16
3.1 Softwareumgebung und Infrastruktur 16
3.2 Datengrundlage 16
3.2.1 Datenaufnahme 16
3.2.2 Datenvorverarbeitung 18
3.2.3 Datenakkumulation 20
3.3 Dateneinteilung 21
3.4 Regression und Klassifikation 22
3.5 Modellarchitekturen 22
3.5.1 FFNN 22
3.5.2 1D-CNN 23
3.5.3 2D-CNN 24
3.6 Voruntersuchungen 25
3.6.1 Konstante Hyperparameter 26
3.6.2 Schichtanzahl von FFNN und CNN 26
3.6.3 Filtergröße von CNNs 27
3.6.4 Datennormierung 27
3.6.5 Untersuchung der Architekturen und Energie 28
3.7 Finale Vorhersage der Reichweitenverschiebung 28
4 Ergebnisse 30
4.1 Voruntersuchungen 30
4.1.1 Schichtanzahl von FFNN und CNN 30
4.1.2 Filtergröße von CNNs 31
4.1.3 Datennormierung 32
4.1.4 Untersuchung der Architektur und Energie 32
4.2 Finale Vorhersage der Reichweitenverschiebung: Regression 35
4.3 Finale Vorhersage der Reichweitenverschiebung: Klassifikation 39
5 Diskussion und Ausblick 42
6 Zusammenfassung 48
Anhang A Loss 58
Anhang B Untergrundkorrektur 60
Anhang C Leistungsparameter 61
Anhang D Ergebnistabellen Regression 63
Anhang E Ergebnistabellen Klassifikation 67 / In proton therapy, changes in the range of protons can occur due to various uncertainties. In the prompt gamma-ray timing method, a change in the range of protons is determined based on the temporal distribution of the generated prompt gamma radiation. The suitability of neural networks for range determination based on the temporal and spectral distribution of prompt gamma rays was investigated. Models were created for the investigation of 1D time spectra and 2D energy-time spectra. The developed models were trained on statically applied high-intensity spots and validated on scanned spots applied with lower intensity. It was found that the parallel analysis of energy and time achieved the best predictive power of the models. Range changes for statically applied spots were detected with a root mean square error (RMSE) of 3.10 mm, and for dynamically scanned spots an RMSE of 3.70 mm was achieved when utilizing data accumulation. Compared to previous multivariate models, no improvement could be demonstrated using simple neural networks. More complex models and a more comprehensive data set could further improve the accuracy of range prediction with deep learning.
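The underlying PGT principle that the models learn, namely that the prompt-gamma arrival-time distribution shifts with proton range, can be sketched without any neural network. All numbers below (the 2 ps-per-mm slope, the widths, the counts) are invented for illustration and do not come from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def pgt_spectrum(shift_mm, n=20000):
    # Toy prompt-gamma arrival-time spectrum: a Gaussian whose mean
    # moves with proton range; the 2 ps/mm slope is an assumed,
    # purely illustrative calibration.
    t = rng.normal(loc=1000.0 + 2.0 * shift_mm, scale=150.0, size=n)
    hist, edges = np.histogram(t, bins=100, range=(0.0, 2000.0))
    return hist, edges

def spectrum_mean(hist, edges):
    centers = 0.5 * (edges[:-1] + edges[1:])
    return np.average(centers, weights=hist)

ref_hist, edges = pgt_spectrum(0.0)        # calibration spot, no shift
ref_mean = spectrum_mean(ref_hist, edges)

hist, _ = pgt_spectrum(5.0)                # "measured" shifted spot
shift_est = (spectrum_mean(hist, edges) - ref_mean) / 2.0
# shift_est recovers roughly the injected 5 mm range shift
```

The thesis's contribution is to replace such hand-crafted distribution statistics with 1D and 2D convolutional models over the full time and energy-time spectra.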
|
104 |
Automatisierte Ausführungsemulation von Testprogrammen für Sensorsysteme
Mayer, Franziska; Schott, Christian; Markert, Erik; Heinkel, Ulrich, 16 August 2024 (links)
A methodology for test-program verification and its automated execution on a well-known, industrially deployed test system is presented. The approach is based on the specification of device variants. During execution, the device responses of a variant are emulated and replace the test system's measurements of physical devices. Generation and emulation of the device variants are fully automated. Only the software environment of the test system is required; neither physical devices nor online test time on the test system are necessary.
|
105 |
High-Level-Synthese von Operationseigenschaften
Langer, Jan, 23 November 2011 (links)
In der formalen Verifikation digitaler Schaltkreise hat sich die Methodik der vollständigen Verifikation anhand spezieller Operationseigenschaften bewährt. Operationseigenschaften beschreiben das Verhalten einer Schaltung in einem festen Zeitintervall und können sequentiell miteinander verknüpft werden, um so das Gesamtverhalten zu spezifizieren. Zusätzlich beweist eine formale Vollständigkeitsprüfung, dass die Menge der Eigenschaften für jede Folge von Eingangssignalwerten die Ausgänge der zu verifizierenden Schaltung eindeutig und lückenlos determiniert.
In dieser Arbeit wird untersucht, wie aus Operationseigenschaften, deren Vollständigkeit erfolgreich bewiesen wurde, automatisiert eine Schaltungsbeschreibung abgeleitet werden kann. Gegenüber der traditionellen Entwurfsmethodik auf Register-Transfer-Ebene (RTL) bietet dieses Verfahren zwei Vorteile. Zum einen vermeidet der Vollständigkeitsbeweis viele Arten von Entwurfsfehlern, zum anderen ähnelt eine Beschreibung mit Hilfe von Operationseigenschaften den in Spezifikationen häufig genutzten Zeitdiagrammen, sodass die Entwurfsebene der Spezifikationsebene angenähert wird und Fehler durch manuelle Verfeinerungsschritte vermieden werden.
Das Entwurfswerkzeug vhisyn führt die High-Level-Synthese (HLS) einer vollständigen Menge von Operationseigenschaften zu einer Beschreibung auf RTL durch. Die Ergebnisse zeigen, dass sowohl die verwendeten Synthesealgorithmen, als auch die erzeugten Schaltungen effizient sind und somit die Realisierung größerer Beispiele zulassen. Anhand zweier Fallstudien kann dies praktisch nachgewiesen werden. / The complete verification approach using special operation properties is an accepted methodology for the formal verification of digital circuits. Operation properties describe the behavior of a circuit during a certain time interval. They can be sequentially concatenated in order to specify the overall behavior. Additionally, a formal completeness check proves that the sequence of properties consistently determines the exact value of the output signals for every valid sequence of input signal values.
This work examines how a circuit description can be automatically derived from a set of operation properties whose completeness has been proven. In contrast to the traditional design flow at register-transfer level (RTL), this method offers two advantages. First, the proof of completeness helps to avoid many design errors. Second, the design of operation properties resembles the timing diagrams often used in textual specifications. Therefore, the design level is closer to the specification level, and errors caused by manual refinement steps are avoided.
The design tool vhisyn performs the high-level synthesis from a complete set of operation properties to a description at RTL. The results show that both the synthesis algorithms and the generated circuit descriptions are efficient and allow the design of larger applications. This is demonstrated by means of two case studies.
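A miniature, hypothetical version of the completeness idea behind this flow: each operation property has a trigger and a determined output, and completeness requires that for every possible input exactly one property applies. This sketch is neither the thesis's property language nor the vhisyn tool, just the check in its smallest form.

```python
# Hypothetical "operation properties" in miniature: a trigger predicate
# on the current input plus the output value it determines.
props = [
    ("idle", lambda req: not req, 0),
    ("busy", lambda req: req, 1),
]

# Completeness check: for every possible input exactly one property
# triggers, so the output is uniquely and gaplessly determined.
for req in (False, True):
    matching = [out for _, trig, out in props if trig(req)]
    assert len(matching) == 1
```

Once such a set is complete, the triggers can be turned into control logic and the determined outputs into datapath assignments, which is the essence of synthesizing RTL from properties.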
|
106 |
Static Partial Order Reduction for Probabilistic Concurrent Systems
Fernández-Díaz, Álvaro; Baier, Christel; Benac-Earle, Clara; Fredlund, Lars-Åke, 10 September 2013 (links) (PDF)
Sound criteria for partial order reduction for probabilistic concurrent systems have been presented in the literature. Their realization relies on a depth-first search-based approach for generating the reduced model. The drawback of this dynamic approach is that it can hardly be combined with other techniques to tackle the state explosion problem, e.g., symbolic probabilistic model checking with multi-terminal variants of binary decision diagrams. Following the approach presented by Kurshan et al. for non-probabilistic systems, we study partial order reduction techniques for probabilistic concurrent systems that can be realized by a static analysis. The idea is to inject the reduction criteria into the control flow graphs of the processes of the system to be analyzed. We provide the theoretical foundations of static partial order reduction for probabilistic concurrent systems and present algorithms to realize them. Finally, we report on some experimental results.
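The notion of independence that partial order reduction exploits can be sketched in a few lines: two actions are independent when executing them in either order from the same state yields the same successor, so a reduced exploration may fix one representative interleaving. The state encoding and action names below are illustrative only.

```python
# Minimal commutativity check behind partial order reduction
# (toy state = dict of shared variables).
def step(state, action):
    s = dict(state)   # actions are applied to a copy, never in place
    action(s)
    return s

def inc_x(s): s["x"] = s["x"] + 1   # action of process 1
def inc_y(s): s["y"] = s["y"] + 1   # action of process 2

s0 = {"x": 0, "y": 0}
# Both interleavings reach the same state, so exploring only one
# of them preserves the properties of interest.
assert step(step(s0, inc_x), inc_y) == step(step(s0, inc_y), inc_x)
```

The static approach studied here decides such conditions by analyzing the control flow graphs of the processes up front, instead of checking them during a depth-first search.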
|
107 |
Verification of Branching-Time and Alternating-Time Properties for Exogenous Coordination Models
Klüppelholz, Sascha, 24 April 2012 (links) (PDF)
Information and communication systems enter an increasing number of areas of daily life. Our reliance and dependence on the functioning of such systems is growing rapidly, together with the costs and the impact of system failures. At the same time, the complexity of hardware and software systems extends to new limits as modern hardware architectures become more and more parallel, dynamic, and heterogeneous. These trends demand a closer integration of formal methods and system engineering to show the correctness of complex systems within the design phase of large projects.
The goal of this thesis is to introduce a formal holistic approach for modeling, analysis, and synthesis of parallel systems that potentially addresses complex system behavior at any layer of the hardware/software stack. Due to the complexity of modern hardware and software systems, we aim for a hierarchical modeling framework that allows the behavior of a parallel system to be specified at various levels of abstraction and that facilitates designing complex systems in an iterative refinement procedure, in which more detailed behavior is added successively to the system description. In this context, the major challenge is to provide modeling formalisms that are expressive enough to address all of the above issues and that are at the same time amenable to the application of formal methods for proving that the system behavior conforms to its specification. In particular, we are interested in specification formalisms that allow formal verification techniques to be applied such that the underlying model checking problems remain decidable within reasonable time and space bounds.
The presented work relies on an exogenous modeling approach that allows a clear separation of coordination and computation and provides an operational semantic model where formal methods such as model checking are well suited and applicable. The channel-based exogenous coordination language Reo is used as modeling formalism as it supports hierarchical modeling in an iterative top-down refinement procedure. It facilitates reusability, exchangeability, and heterogeneity of components and forms the basis to apply formal verification methods. At the same time Reo has a clear formal semantics based on automata, which serve as foundation to apply formal methods such as model checking.
In this thesis, new modeling languages are presented that allow specifying complex systems in terms of Reo and automata models, which form the basis for a holistic approach to modeling, verification, and synthesis of parallel systems. The second main contribution of this thesis is tailored branching-time and alternating-time temporal logics together with corresponding model checking algorithms. The thesis includes results on the theoretical complexity of the underlying model checking problems as well as practical results. For the latter, the presented approach has been implemented in the symbolic verification tool set Vereofy. The implementation within Vereofy and the evaluation of the branching-time and alternating-time model checkers are the third main contribution of this thesis.
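Reo's automata-based semantics can be hinted at with the standard textbook example of a FIFO1 channel: an automaton with states "empty" and "full" whose transitions fire on synchronized port sets. The encoding below is a toy illustration, not Vereofy's input language.

```python
# Toy automaton semantics for Reo's FIFO1 channel: port A writes into
# an empty one-place buffer, port B reads from a full one.
def fire(state, stored, ports, d):
    if state == "empty" and ports == {"A"}:
        return "full", d            # A writes datum d into the buffer
    if state == "full" and ports == {"B"}:
        return "empty", None        # B reads the stored datum
    return None                     # no enabled transition

def run(word):
    state, stored = "empty", None
    for ports, d in word:
        nxt = fire(state, stored, ports, d)
        if nxt is None:
            return None             # behavior rejected by the channel
        state, stored = nxt
    return state

assert run([({"A"}, 1), ({"B"}, None)]) == "empty"   # write then read
assert run([({"A"}, 1), ({"A"}, 2)]) is None         # cannot write when full
```

Composing such automata for all channels of a Reo circuit yields the operational model on which the branching-time and alternating-time model checking of the thesis operates.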
|
108 |
Reverse Engineering of Passenger Jets - Classified Design Parameters
De Grave, Emiel, January 2017 (links) (PDF)
This thesis explains how the classified design parameters of existing passenger jets can be determined. The classified design parameters are: the maximum lift coefficients for landing and take-off, the maximum aerodynamic efficiency, and the specific fuel consumption. The entire concept is based on the preliminary sizing of jet-powered civil aeroplanes. This preliminary sizing is explained in detail because it is the foundation of the final result. It is combined with reverse engineering, which is not a strict method; therefore, only the basics are explained. By applying reverse engineering to the preliminary sizing, aiming for the classified design parameters as output, formulas are derived to calculate the maximum lift coefficients, the maximum aerodynamic efficiency, and the specific fuel consumption. The goal is to calculate these parameters using only aircraft specifications that are made public by the manufacturer. The calculations are complex, with mutual relations, iterative processes, and optimizations, so it is convenient to integrate everything in a tool. The tool is built in Microsoft Excel and explained in detail, with operating instructions. The program is executed for miscellaneous aeroplanes, supported with the necessary comments. Investigated aeroplanes are: Caravelle 10B (Sud-Aviation), Boeing 707-320C, BAe 146-200 (British Aerospace), A320-200 (Airbus), "The Rebel" (based on the A320), Boeing SUGAR High, Boeing 747-400, Blended Wing Body VELA 2 (VELA), and Dassault Falcon 8X.
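The flavor of such a reverse-engineered parameter can be sketched for the landing maximum lift coefficient: from a published maximum landing mass, wing area, and an assumed approach speed, the stall speed and then C_L,max follow from the lift equation. The mass and area below are published A320-200 figures; the approach speed and the 1.23 stall-speed factor (CS-25 style V_APP = 1.23 V_S0) are assumptions for illustration.

```python
# Back-of-envelope landing C_Lmax from public A320-200 data.
g, rho = 9.81, 1.225          # sea-level ISA gravity and density
m_ml = 64500.0                # max landing mass, kg (published)
s_wing = 122.6                # wing reference area, m^2 (published)
v_app_kt = 136.0              # typical approach speed, kt (assumed)

v_app = v_app_kt * 0.5144     # knots -> m/s
v_stall = v_app / 1.23        # invert V_APP = 1.23 * V_S0 (CS-25 style)
# Lift equals weight at stall: L = 0.5 * rho * V^2 * S * C_Lmax
cl_max_landing = 2.0 * m_ml * g / (rho * s_wing * v_stall ** 2)
print(round(cl_max_landing, 2))   # 2.6
```

The thesis's tool chains many such relations (including the aerodynamic efficiency and SFC estimates) iteratively; this one-liner only shows the first link of that chain.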
|
109 |
Verification of Branching-Time and Alternating-Time Properties for Exogenous Coordination Models
Klüppelholz, Sascha, 19 March 2012 (links)
Information and communication systems enter an increasing number of areas of daily life. Our reliance and dependence on the functioning of such systems is growing rapidly, together with the costs and the impact of system failures. At the same time, the complexity of hardware and software systems extends to new limits as modern hardware architectures become more and more parallel, dynamic, and heterogeneous. These trends demand a closer integration of formal methods and system engineering to show the correctness of complex systems within the design phase of large projects.
The goal of this thesis is to introduce a formal holistic approach for modeling, analysis, and synthesis of parallel systems that potentially addresses complex system behavior at any layer of the hardware/software stack. Due to the complexity of modern hardware and software systems, we aim for a hierarchical modeling framework that allows the behavior of a parallel system to be specified at various levels of abstraction and that facilitates designing complex systems in an iterative refinement procedure, in which more detailed behavior is added successively to the system description. In this context, the major challenge is to provide modeling formalisms that are expressive enough to address all of the above issues and that are at the same time amenable to the application of formal methods for proving that the system behavior conforms to its specification. In particular, we are interested in specification formalisms that allow formal verification techniques to be applied such that the underlying model checking problems remain decidable within reasonable time and space bounds.
The presented work relies on an exogenous modeling approach that allows a clear separation of coordination and computation and provides an operational semantic model where formal methods such as model checking are well suited and applicable. The channel-based exogenous coordination language Reo is used as modeling formalism as it supports hierarchical modeling in an iterative top-down refinement procedure. It facilitates reusability, exchangeability, and heterogeneity of components and forms the basis to apply formal verification methods. At the same time Reo has a clear formal semantics based on automata, which serve as foundation to apply formal methods such as model checking.
In this thesis, new modeling languages are presented that allow specifying complex systems in terms of Reo and automata models, which form the basis for a holistic approach to modeling, verification, and synthesis of parallel systems. The second main contribution of this thesis is tailored branching-time and alternating-time temporal logics together with corresponding model checking algorithms. The thesis includes results on the theoretical complexity of the underlying model checking problems as well as practical results. For the latter, the presented approach has been implemented in the symbolic verification tool set Vereofy. The implementation within Vereofy and the evaluation of the branching-time and alternating-time model checkers are the third main contribution of this thesis.
|
110 |
Static Partial Order Reduction for Probabilistic Concurrent Systems
Fernández-Díaz, Álvaro; Baier, Christel; Benac-Earle, Clara; Fredlund, Lars-Åke, January 2012 (links)
Sound criteria for partial order reduction for probabilistic concurrent systems have been presented in the literature. Their realization relies on a depth-first search-based approach for generating the reduced model. The drawback of this dynamic approach is that it can hardly be combined with other techniques to tackle the state explosion problem, e.g., symbolic probabilistic model checking with multi-terminal variants of binary decision diagrams. Following the approach presented by Kurshan et al. for non-probabilistic systems, we study partial order reduction techniques for probabilistic concurrent systems that can be realized by a static analysis. The idea is to inject the reduction criteria into the control flow graphs of the processes of the system to be analyzed. We provide the theoretical foundations of static partial order reduction for probabilistic concurrent systems and present algorithms to realize them. Finally, we report on some experimental results.
|