
Konzeption und prototypische Implementierung eines Generators zur Softwarevisualisierung in 3D

Müller, Richard 25 April 2018 (has links)
Software development projects produce many different artifacts. Artifacts represent different aspects of software systems, such as structure (static information), behaviour (dynamic information), or evolution (historical information). Software visualization aims to transfer such artifacts into a visual form. External surveys and an extensive literature review, however, reveal deficits in this field: many visualization tools are decoupled from the development process, offer insufficient import and export of visualizations, and in some cases provide only a low degree of automation of the visualization process, particularly among tools for three-dimensional visualization. In this thesis, a concept was developed, by adapting and combining existing theories and tools from generative and model-driven software development together with techniques from software visualization, that describes how three-dimensional visualizations of software systems can be generated fully automatically. At its core is a generator that, starting from a requirements specification, produces 3D models fully automatically. To validate the concept, a prototype was implemented that targets the visualization of the structure of software systems. It can be integrated into the Eclipse development environment as a plugin and, using model transformations, generates from Ecore-based models, according to user requirements, a 3D model in the free and standardized X3D format. The transformations are realized with the openArchitectureWare tool. Finally, the prototype itself was evaluated against established criteria from software visualization.
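As a loose illustration of the generator idea described above, a structural model can be mapped to X3D scene nodes by a recursive transformation. This is a hedged sketch, not the thesis's actual openArchitectureWare transformations: the names (`Package`, `Class`, `to_x3d`) and the simple city-like layout are invented for the example.

```python
# Illustrative sketch only: map a package/class structure to X3D nodes.
# All names and the layout scheme are assumptions, not the thesis's code.

from dataclasses import dataclass, field

@dataclass
class Class:
    name: str

@dataclass
class Package:
    name: str
    classes: list = field(default_factory=list)
    subpackages: list = field(default_factory=list)

def to_x3d(pkg, depth=0):
    """Recursively emit an X3D fragment: each package becomes a flat plate,
    its classes become boxes placed on top, subpackages nest one level up."""
    indent = "  " * depth
    parts = [f'{indent}<Transform translation="0 {depth} 0">']
    parts.append(f'{indent}  <Shape><Box size="4 0.1 4"/></Shape> <!-- {pkg.name} -->')
    for i, cls in enumerate(pkg.classes):
        parts.append(
            f'{indent}  <Transform translation="{i} 0.5 0">'
            f'<Shape><Box size="0.8 1 0.8"/></Shape></Transform> <!-- {cls.name} -->'
        )
    for sub in pkg.subpackages:
        parts.append(to_x3d(sub, depth + 1))
    parts.append(f"{indent}</Transform>")
    return "\n".join(parts)

model = Package("app", classes=[Class("Main"), Class("Util")],
                subpackages=[Package("io", classes=[Class("Reader")])])
scene = to_x3d(model)
print(scene.splitlines()[0])  # outermost package Transform
```

A real generator would of course read the structure from an Ecore model and route it through intermediate models, but the shape of the transformation — model elements in, scene-graph nodes out — is the same.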

Ansatz zur Interaktion mit dreidimensional visualisierten Softwaremodellen

Kovacs, Pascal 22 April 2010 (has links)
Software systems are complex, immaterial systems with a multitude of components and relationships. Software visualizations, which transfer the abstract underlying data into a visual representation, are well suited to better understanding the construction, behaviour, and evolution of software systems. Due to the mass and complexity of the information contained in a visualization, it quickly becomes cluttered, which hampers the process of comprehension. To master this complexity, the viewer must first divide the whole into several perspectives in order to then examine each one specifically with respect to different aspects. The interaction capabilities required for this are the subject of the investigations in this thesis, with visualizations of the structure of software essentially serving as the starting point. In particular, the question is pursued of how interaction can be designed so that the user gains as comprehensive an understanding of the structure as possible. To put the theoretical findings into practice, a prototype is presented that automatically generates, from the structural information of an Ecore model, an interactive three-dimensional software visualization of the structure in the free, standardized Extensible 3D format. The visualization process is realized using tools from the openArchitectureWare framework.
For integration into the development process, the prototype is embedded in a plugin for Eclipse. Contents: 1 Introduction 1.1 Motivation 1.2 Goals 1.3 Structure 2 Software Visualization 2.1 Visualization 2.1.1 Fundamentals 2.1.2 Tasks and Goals 2.1.3 Subfields 2.2 Definition 2.3 Goals and Tasks 2.4 Visualization Pipeline 2.5 Visualization Techniques 2.5.1 Graphs 2.5.2 Visual Metaphors 2.6 A Taxonomy for Software Visualizations 3 Interacting with a Software Visualization 3.1 Classification 3.2 Definition 3.3 Goals 3.4 User Interface 3.5 Structure of the User Interface 3.5.1 Application Interface 3.5.2 Dialogue Interface 3.5.3 Input/Output Interface 3.6 Interaction Techniques 3.7 A Taxonomy of Interaction Techniques 3.8 A Concept for Interacting with a Software Visualization 4 Technical Foundations of the Prototype 4.1 Eclipse 4.2 The openArchitectureWare Framework 4.2.1 Model-Driven Software Development 4.2.2 Structure of the Framework 4.2.3 The oAW Workflow 4.2.4 Ecore 4.2.5 Xtend Model-to-Model Transformations 4.3 Extensible 3D 4.3.1 Fundamentals 4.3.2 Scene Graph 4.3.3 Event Model 4.3.4 X3D Prototypes 5 Base Prototype 5.1 Mode of Operation 5.2 Classification 5.3 The Generator's Visualization Process 5.3.1 Model Transformation from Ecore to Graph 5.3.2 Model Modification of the Graph 5.3.3 Model Transformation from Graph to X3D 5.4 Starting Points for the Extension 6 Extending the Prototype for Interaction 6.1 User Interface 6.1.1 Direct Manipulation 6.1.2 Manipulation by Element Type 6.1.3 Navigation through the Package Hierarchy and Class Graph 6.1.4 Identification by Name and Tooltip 6.2 Architecture of the Interactive X3D Model 6.3 Extending the Generator 6.3.1 Adapting the Base Prototype's Transformations 6.3.2 Model Transformation into an Interactive X3D Model 6.4 Integration into the Eclipse Plugin 7 Conclusion and Outlook

Creating An Editor For The Implementation of WorkFlow+: A Framework for Developing Assurance Cases

Chiang, Thomas January 2021 (has links)
As vehicles become more complex, the work required to ensure that they are safe increases enormously. This in turn results in a much more complicated task of testing systems, subsystems, and components to ensure that they are safe individually as well as when they are integrated. As a result, managing the safety engineering process for vehicle development is of major interest to all automotive manufacturers. The goal of this research is to introduce a tool that supports a new framework for modeling safety processes, which can partially address some of these challenges. WorkFlow+ is a framework developed to combine data flow and process flow in order to increase traceability, let users model their safety engineering workflow at the desired granularity, and produce assurance cases with which regulators and evaluators can validate that a product is safe for its users and the public. The development of an editor will bring WorkFlow+ to life. / Thesis / Master of Applied Science (MASc)

Analysis and Characterization of Author Contribution Patterns in Open Source Software Development

Taylor, Quinn Carlson 02 March 2012 (has links) (PDF)
Software development is a process fraught with unpredictability, in part because software is created by people. Human interactions add complexity to development processes, and collaborative development can become a liability if not properly understood and managed. Recent years have seen an increase in the use of data mining techniques on publicly-available repository data with the goal of improving software development processes, and by extension, software quality. In this thesis, we introduce the concept of author entropy as a metric for quantifying interaction and collaboration (both within individual files and across projects), present results from two empirical observational studies of open-source projects, identify and analyze authorship and collaboration patterns within source code, demonstrate techniques for visualizing authorship patterns, and propose avenues for further research.
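The author entropy metric introduced above is, in essence, Shannon entropy over the per-author share of a file's lines. A minimal sketch of the idea — the function name and the line-author input format are assumptions for illustration, not the thesis's actual code:

```python
import math
from collections import Counter

def author_entropy(line_authors):
    """Shannon entropy (in bits) of the author distribution for one file.
    0.0 means a single author wrote everything; log2(k) means k authors
    contributed equally (maximal mixing)."""
    counts = Counter(line_authors)
    total = sum(counts.values())
    # sum of -p*log2(p) over the authors' line-count proportions
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

# Two equal contributors give the maximal two-author entropy of 1 bit:
print(author_entropy(["alice"] * 50 + ["bob"] * 50))  # 1.0
print(author_entropy(["alice"] * 100))                # 0.0
```

Higher values flag files where authorship is interleaved, which is exactly the kind of collaboration (or contention) signal the studies above set out to characterize.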

Scalable analysis of stochastic process algebra models

Tribastone, Mirco January 2010 (has links)
The performance modelling of large-scale systems using discrete-state approaches is fundamentally hampered by the well-known problem of state-space explosion, which causes exponential growth of the reachable state space as a function of the number of components which constitute the model. Because they are mapped onto continuous-time Markov chains (CTMCs), models described in the stochastic process algebra PEPA are no exception. This thesis presents a deterministic continuous-state semantics of PEPA which employs ordinary differential equations (ODEs) as the underlying mathematics for the performance evaluation. This is suitable for models consisting of large numbers of replicated components, as the ODE problem size is insensitive to the actual population levels of the system under study. Furthermore, the ODE is given an interpretation as the fluid limit of a properly defined CTMC model when the initial population levels go to infinity. This framework allows the use of existing results which give error bounds to assess the quality of the differential approximation. The computation of performance indices such as throughput, utilisation, and average response time is interpreted deterministically as functions of the ODE solution and is related to corresponding reward structures in the Markovian setting. The differential interpretation of PEPA provides a framework that is conceptually analogous to established approximation methods in queueing networks based on mean-value analysis, as both approaches aim at reducing the computational cost of the analysis by providing estimates for the expected values of the performance metrics of interest. The relationship between these two techniques is examined in more detail in a comparison between PEPA and the Layered Queueing Network (LQN) model. General patterns of translation of LQN elements into corresponding PEPA components are applied to a substantial case study of a distributed computer system.
This model is analysed using stochastic simulation to gauge the soundness of the translation. Furthermore, it is subjected to a series of numerical tests to compare execution runtimes and accuracy of the PEPA differential analysis against the LQN mean-value approximation method. Finally, this thesis discusses the major elements concerning the development of a software toolkit, the PEPA Eclipse Plug-in, which offers a comprehensive modelling environment for PEPA, including modules for static analysis, explicit state-space exploration, numerical solution of the steady-state equilibrium of the Markov chain, stochastic simulation, the differential analysis approach herein presented, and a graphical framework for model editing and visualisation of performance evaluation results.
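The fluid approach described above can be illustrated with a toy client/server population model, where synchronisation between components is approximated by the minimum of the cooperating populations and the ODE is integrated with a simple Euler scheme. This is a hedged sketch of the general technique only — the model, rates, and function names are invented, and it is not PEPA's actual semantics or the PEPA Eclipse Plug-in's solver:

```python
def simulate(n_clients=100, n_servers=10, r_req=1.0, r_srv=2.0,
             dt=0.001, t_end=10.0):
    """Euler-integrate a fluid model of clients cycling between 'thinking'
    and 'waiting'; service requires a free server, so the service flow is
    bounded by min(waiting, n_servers) (minimum-based cooperation)."""
    thinking, waiting = float(n_clients), 0.0
    t = 0.0
    while t < t_end:
        req = r_req * thinking                 # flow: thinking -> waiting
        srv = r_srv * min(waiting, n_servers)  # flow: waiting -> thinking
        thinking += dt * (srv - req)
        waiting += dt * (req - srv)
        t += dt
    return thinking, waiting

thinking, waiting = simulate()
# With these rates the servers saturate: at equilibrium
# r_req * thinking = r_srv * n_servers, i.e. thinking -> 20, waiting -> 80.
print(round(thinking, 2), round(waiting, 2))
```

The key property the abstract points at is visible here: the cost of solving the ODE does not grow with `n_clients`, whereas the underlying CTMC's state space would.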

Mission Concept for a Satellite Mission to Test Special Relativity

Anadol, Volkan January 2016 (has links)
In 1905 Albert Einstein developed the theory of Special Relativity. This theory describes the relation between space and time and revolutionized the understanding of the universe. While the concept is generally accepted, new experimental setups are constantly being developed to challenge the theory, but so far no contradictions have been found. One of the postulates on which Einstein's theory of Relativity is based states that the speed of light in vacuum is the highest possible velocity. Furthermore, it is demanded that the speed of light is independent of any chosen frame of reference. If an experiment were to find a contradiction of these demands, the theory as such would have to be revised. To challenge the constancy of the speed of light, the so-called Kennedy-Thorndike experiment was developed. A possible setup for a Kennedy-Thorndike experiment consists of comparing two independent clocks, and such experiments have been executed in laboratory environments. Within the scope of this work, the orbital requirements for the first space-based Kennedy-Thorndike experiment, called BOOST, are investigated. BOOST consists of an iodine clock, which serves as a time reference, and an optical cavity, which serves as a length reference. The mechanisms of the two clocks are different and can therefore be employed to investigate possible deviations in the speed of light. While similar experiments have been performed on Earth, space offers many advantages for the setup. First, one orbit takes roughly 90 min for a satellite-based experiment; compared with the 24 h rotation period on Earth, a space-based experiment clearly offers higher statistics. Second, the optical clock stability has to be maintained only over shorter periods, increasing the sensitivity. Third, the velocity of the experimental setup is larger, which increases the experiment's accuracy, since any deviation in the speed of light would grow with increasing orbital velocity.
A satellite placed in a Low Earth Orbit (LEO) travels at roughly 7 km/s; establishing an Earth-bound experiment that travels with a constant velocity of that order is impossible. Finally, space offers a very quiet environment in which no disturbances, such as vibrations, act upon the experiment — something practically unavoidable in a laboratory. This thesis comprises two main chapters. The chapter titled "Mission Level" explores orbital candidates: possible orbits are explained in detail and the associated advantages and problems are investigated. It also contains a discussion of ground visibility and downlink feasibility for each option. Finally, a nominal mission scenario is sketched. The other chapter, "Sub-Systems", examines the subsystems of the spacecraft. To examine the possible orbits, it is necessary to define criteria by which the quality of the orbits can be judged. The first criterion reflects the scientific outcome of the mission, governed mainly by the achievable velocity and the orbital geometry. The second criterion discriminates according to the mission costs, which include the launch, orbital injection, de-orbiting, satellite development, and orbital maintenance. The final criterion defines the requirements in terms of mission feasibility and risks, e.g. radiation. The criteria definition is followed by the mission objectives and requirements; each requirement is then discussed in terms of feasibility. The most important parameters, such as altitude, inclination, and the right ascension of the ascending node (RAAN), are discussed for each orbital option and an optimal range is picked. The optimal altitude depends on several factors, such as the decay rate, radiation concerns, experimental contributions, and eclipse duration. For the presented mission, an altitude of 600 km appears to be the best fit.
Alongside the optimal altitude, possible de-orbiting scenarios are investigated. It is concluded that de-orbiting the satellite is possible without any further external influence; thus, no additional thrusters are required. The de-orbiting scenario was simulated with Systems Tool Kit (STK), from which it can be concluded that the satellite can be de-orbited within 25 years, meeting the requirements set for the mission. Another very important parameter is the accumulated eclipse duration per year for a given orbit. This calculation requires the relative positions and motion of the Earth and the Sun, from which the eclipse duration per orbit for different altitudes is obtained. Ground visibilities for the orbital options are examined for two possible ground stations, based on the geometrical relation between the satellite and the stations. The results agree with the related STK simulations, and both ground stations are found adequate to maintain the necessary contact with the satellite. In the trade-off section, the orbit candidates are examined in more detail: results from the previous sections, together with additional issues such as the experiment sensitivities, radiation concerns, and thermal stability, are discussed to conclude which candidate is best for the mission. As a result of the trade-off, two scenarios are described in the "Nominal Mission Scenario" section, covering a baseline scenario and a secondary scenario. After selecting a baseline orbit, two sub-systems of the satellite are examined. The "Attitude Control System (ACS)" section addresses the question of which attitude control method is more suitable for the mission, through a trade-off between two common control methods: 3-axis stabilization and spin stabilization.
To make this trade-off possible, the external disturbances in space are estimated for two notional satellite bodies. It is concluded that maintaining the attitude with spin stabilization is not feasible; the ACS should therefore be built on 3-axis stabilization. As the second sub-system, the possible power system of the satellite is examined. The total size and weight of the solar arrays are estimated for two different power loads, and the battery capacity sufficient for the power budget is estimated together with the total mass of the batteries. In the last section, the thesis work is concluded and possible future work for the BOOST mission is stated.
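The orbital figures quoted above — roughly 7 km/s in LEO, an orbit of about 90 minutes, and the 600 km baseline altitude — can be sanity-checked with the standard circular-orbit relations v = sqrt(mu/r) and T = 2*pi*r/v. This is a back-of-envelope sketch; the constants and function name are illustrative, not from the thesis:

```python
import math

MU_EARTH = 3.986004418e14   # m^3/s^2, Earth's standard gravitational parameter
R_EARTH = 6371e3            # m, mean Earth radius

def circular_orbit(alt_m):
    """Return (velocity in m/s, period in s) of a circular orbit
    at the given altitude above the mean Earth radius."""
    r = R_EARTH + alt_m
    v = math.sqrt(MU_EARTH / r)
    period = 2 * math.pi * r / v
    return v, period

v, T = circular_orbit(600e3)
print(f"{v / 1000:.2f} km/s")   # about 7.56 km/s
print(f"{T / 60:.1f} min")      # about 96-97 min
```

Both numbers are consistent with the abstract's "roughly 7 km/s" and "roughly 90 min" claims for the chosen 600 km orbit.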

SOFTVIZ... A Step Forward

Singh, Mahim 30 April 2004 (has links)
Complex software systems are difficult to understand and very hard to debug. Programmers trying to understand or debug these systems must read through source code which may span thousands of files. Software visualization tries to ease this burden by using graphics and animation to convey important information about the program to the user, which may be used either for understanding the behavior of the program or for detecting defects within the code. SoftViz is one such software visualization system, developed by Ben Kurtz under the guidance of Prof. George T. Heineman at WPI. We carry forward the work initiated with SoftViz. Our preliminary study showed various avenues for making the system more effective and user-friendly. Specifically, I completed the unfinished work, made optimizations, implemented new functionality, and added new visualization plug-ins, all aimed at making the system a more versatile and user-friendly debugging framework. We built solid core functionality able to support various features and created new plug-ins to make understanding and bug detection easier. Further, we integrated SoftViz with the Eclipse development environment, making the system easily accessible and potentially widely used. We created an error classification framework relating the common error classes to the visualizations that could be used to detect them. We believe this will be helpful both in selecting the right visualization options and in constructing new plug-ins.

A Metamodel-independent approach for Conflict Detection to support distributed development in MDE

Pordel, Mostafa January 2009 (has links)
The need for a change of direction in Software Engineering has been suggested in several resources [Power of Models]. In recent years many organizations have focused on Model Driven Engineering (MDE) as an approach to application design and implementation. Model Driven Architecture (MDA) was introduced by the Object Management Group (OMG) in 2001 in support of MDE. Models are the basic elements in MDE; its focus is on the concept of "Everything is a model". So far, several languages, tools, and platforms have been created for MDE. In particular, models can be developed in a distributed environment, so once they are merged, conflicts and inconsistencies should be detected in a convenient way and reconciled (automatically by software or manually by developers). This project builds on previous work that defines difference and conflict metamodels of a source model. In this report, we introduce the benefits of versioning systems in an MDE framework. A conflict metamodel generated from the input metamodel and an architecture for detecting conflicts are presented. The proposed approach is metamodel-independent, meaning a conflict metamodel is created for any input model that conforms to the Kernel Meta Meta Model (KM3); the underlying notion of conflict can also be changed independently of the models. The extended idea for conflict detection, the presented architecture for model code management, and the tools that can detect conflicts between concurrent models can help improve model code management in MDE. For this report, an implementation on the Eclipse platform has been produced, and some future work is suggested. / University of L'Aquila, Project group in Moedling with Alfonso Pierantonio
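The core of conflict detection between concurrently edited models can be illustrated with a three-way comparison against the common base version: a conflict arises only when both sides change the same element differently. This sketch flattens models into element-id/value maps purely for illustration — the ids, property strings, and function name are invented, and it is not the KM3-based architecture of the report:

```python
def detect_conflicts(base, left, right):
    """Three-way conflict detection over flat element maps (id -> value).
    An element conflicts when both copies changed it relative to the base
    and their changes disagree; one-sided changes merge cleanly."""
    conflicts = []
    for key in set(base) | set(left) | set(right):
        b, l, r = base.get(key), left.get(key), right.get(key)
        if l != b and r != b and l != r:
            conflicts.append((key, l, r))
    return sorted(conflicts)

base  = {"Class:Order": "abstract=false", "Attr:total": "type=int"}
left  = {"Class:Order": "abstract=true",  "Attr:total": "type=int"}
right = {"Class:Order": "abstract=false", "Attr:total": "type=float"}

# The two developers touched different elements, so nothing conflicts:
print(detect_conflicts(base, left, right))  # []

# But if both edit Class:Order differently, the conflict is reported:
right2 = {"Class:Order": "abstract=true, final=true", "Attr:total": "type=int"}
print(detect_conflicts(base, left, right2))
```

A metamodel-aware implementation would compare typed model elements rather than strings, but the decision rule — "changed on both sides, and not to the same thing" — is the same one a conflict metamodel encodes.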

Δίκτυα υποβρύχιων ασύρματων αισθητήρων: Εϕαρμογή σε δεξαμενές βιομηχανικών λυμάτων

Γκικόπουλι, Αντριάνα 30 April 2014 (has links)
The object of this thesis is to create an underwater wireless sensor network for measuring the level of a tank filled with water and wastewater. A full literature search was conducted on the topic of underwater wireless sensor networks, and the appropriate equipment was then purchased to perform the experiments. Using the integrated evaluation kit EK010-JN5148, a network was created between a coordinator device, a router device, and various terminal devices. The router and terminal devices take temperature measurements in the aquatic environment, while the router has the additional role of transferring the gathered information to the coordinator, which is placed outside the aquatic environment. The coordinator then displays the data packets on the LCD screen, so the user can see the measurements of interest at any time and concurrently monitor the strength of the network at the various depths to which the sensor-carrying microprocessors are submerged. The ultimate goal of this work is to familiarize the reader with underwater wireless sensor networks and to highlight their usefulness through their numerous applications. The experiments that were carried out provide a number of criteria for determining the strength of electromagnetic waves in water. Finally, through coding in the C language, an application was created that offers a significant solution to the problem of detecting the wastewater level in an industrial tank filled with water and wastewater, substances that must subsequently be separated.
During the thesis, a detailed study was also made of the operation and needs of the Glavkos hydroelectric power plant in the region of Achaea, Greece. Solutions using the purchased EK010-JN5148 kit were proposed to enhance and facilitate the annual measurements of the quality of the irrigable water.

Integrated tooling framework for software configuration analysis

Singh, Nieraj 05 May 2011 (has links)
Configurable software systems adapt to changes in hardware and execution environments, and often exhibit a variety of complex maintenance issues. Many tools exist to aid developers in analysing and maintaining large configurable software systems. Some are standalone applications, while a growing number are becoming part of Integrated Development Environments (IDE) like Eclipse. Reusable tooling frameworks can reduce development time for tools that concentrate on software configuration analysis. This thesis presents C-CLEAR, a common, reusable, and extensible tooling framework for software configuration analysis, where clear separation of concern exists between tooling functionality and definitions that characterise a software system. Special emphasis will be placed on common mechanisms for data abstraction and automatic IDE integration independent of the software system that is being analysed. / Graduate
