1

Generating Accurate Dependencies for Large Software

Wang, Pei 06 November 2014 (has links)
Dependencies between program elements can reflect the architecture, design, and implementation of a software project. According to an industry report, intra- and inter-module dependencies can be a significant source of latent threats to software maintainability in long-term software development, especially when the software has millions of lines of code. This thesis introduces the design and implementation of an accurate and scalable analysis tool that extracts code dependencies from large C/C++ software projects. The tool analyzes both symbol-level and module-level dependencies of a software system and provides a utilization-based dependency model. The accurate dependencies generated by the tool can be provided as input to other software analysis suites; the results alone can also help developers identify potentially underutilized and inconsistent dependencies in the software. Such information points to potential refactoring opportunities and assists developers with large-scale refactoring tasks.
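The utilization-based dependency model described above can be sketched in a few lines: compare the symbols a client module actually uses against everything the target module exports, and flag low-utilization edges as refactoring candidates. This is an illustrative Python sketch, not the thesis's C/C++ tool; all module and symbol names are invented.

```python
# Utilization of a module-level dependency = used symbols / exported symbols.
# Low ratios point at underutilized dependencies worth reviewing.

def utilization(symbol_deps, exports):
    """symbol_deps: {(src_module, dst_module): set of symbols src uses from dst}
    exports: {module: set of symbols the module exports}
    Returns {(src, dst): utilization ratio in [0, 1]}."""
    return {
        (src, dst): len(used) / len(exports[dst])
        for (src, dst), used in symbol_deps.items()
        if exports.get(dst)
    }

# Invented example: "parser" uses 2 of the 4 symbols "util" exports.
deps = {("parser", "util"): {"trim", "split"}}
exports = {"util": {"trim", "split", "join", "pad"}}
ratios = utilization(deps, exports)
underused = [d for d, r in ratios.items() if r < 0.25]
```

A real extractor would derive `symbol_deps` from compiled symbol references rather than hand-written sets; the ratio computation is the same either way.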
2

Automatic Restoration and Management of Computational Notebooks

Venkatesan, Satish 03 March 2022 (has links)
Computational notebook platforms are very commonly used by programmers and data scientists. However, due to the interactive development environment of notebooks, developers struggle to maintain effective code organization, which has an adverse effect on their productivity. In this thesis, we research and develop techniques that help solve the code organization issues developers face, in an effort to improve productivity. Notebooks are often executed out of order, which adversely affects their portability. To determine cell execution orders in computational notebooks, we develop a technique that determines the execution order for a given cell and, if need be, attempts to rearrange the cells to match the intended execution order. With such a tool, users would not need to determine execution orders manually. In a user study with 9 participants, our approach on average saved users about 95% of the time required to determine execution orders manually. We also developed a technique to support insertion of cells in rows, in addition to the standard column insertion, to better represent multiple contexts. In a user study with 9 participants, this technique was on average rated 8.44 on a ten-point scale for representing multiple contexts, as opposed to 4.77 for the standard view. / Master of Science / In the field of data science, computational notebooks are a very commonly used tool. They allow users to create programs to perform computations and to display graphs, tables, and other visualizations to supplement their analysis. Computational notebooks have some limitations in the development environment that can make it difficult for users to organize their code. This can make it very difficult to read through and analyze the code to find or fix errors, which in turn can have a very negative effect on developer productivity.
In this thesis, we research methods to improve the development environment and increase developer productivity. We achieve this by offering tools to the user that can help organize and cleanup their code making it easier to comprehend the code and make any necessary changes.
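The cell-ordering technique can be approximated as a def/use topological sort: each cell's defined and used variables induce an order in which definitions precede uses. This is a minimal sketch of that idea, not the thesis's actual implementation; the cell contents are invented.

```python
# Infer a safe execution order from each cell's defined and used variables.
from graphlib import TopologicalSorter

def execution_order(cells):
    """cells: {cell_id: (defs, uses)} -> list of cell_ids in a valid order."""
    # Map each variable to the cell that defines it.
    defined_in = {v: cid for cid, (defs, _) in cells.items() for v in defs}
    # A cell depends on every cell that defines a variable it uses.
    graph = {
        cid: {defined_in[v] for v in uses if v in defined_in}
        for cid, (_, uses) in cells.items()
    }
    return list(TopologicalSorter(graph).static_order())

cells = {
    "c1": ({"df"}, set()),        # e.g. df = load()
    "c2": ({"clean"}, {"df"}),    # e.g. clean = prep(df)
    "c3": (set(), {"clean"}),     # e.g. plot(clean)
}
order = execution_order(cells)
```

In practice the defs/uses sets would be extracted by parsing each cell's source (e.g. with `ast`), and cycles or ambiguous definitions would need user input, which is where an interactive tool earns its keep.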
3

Rosie - A Recovery-oriented Security System

Chow, Shun Yee 11 July 2013 (has links)
Recovery is a time-consuming and computationally expensive operation. If an attacker can affect heavily-shared objects on a machine, then many other processes and files can be compromised by accessing them, greatly increasing the recovery effort. Since intrusions start with a network connection, we argue that the integrity of heavily-shared objects should be protected from the network in order to minimize the recovery effort. We discuss our prototype Rosie, which is designed with incident response and post-intrusion recovery in mind. Rosie predicts how heavily shared each file or process is, based on previously observed system activity. Rosie enforces appropriate mandatory access control and uses techniques such as sandboxing to protect the integrity of heavily-shared objects. Rosie provides an important recovery guarantee: the number of files that need to be recovered is at most the dependency threshold, a value that can be adjusted by a system administrator.
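The recovery guarantee can be illustrated with a toy policy check: writes by network-tainted processes are denied once an object's dependency count reaches the threshold, so only lightly-shared objects can be compromised. This sketch is hypothetical and far simpler than Rosie's actual enforcement; the threshold, paths, and counts are invented.

```python
# Toy access-control check: protect heavily-shared objects from
# network-tainted writers, bounding post-intrusion recovery effort.

THRESHOLD = 3  # illustrative dependency threshold, admin-tunable

def allowed_writes(dep_counts, tainted_writes, threshold=THRESHOLD):
    """dep_counts: {object: number of dependents}
    tainted_writes: objects a network-tainted process wants to write.
    Returns the writes permitted; heavily-shared objects are protected."""
    return [obj for obj in tainted_writes if dep_counts.get(obj, 0) < threshold]

deps = {"/etc/passwd": 40, "/tmp/scratch": 1, "libfoo.so": 12}
writes = ["/etc/passwd", "/tmp/scratch", "libfoo.so"]
permitted = allowed_writes(deps, writes)
# Only lightly-shared objects remain writable by tainted processes,
# so the blast radius of a compromise stays small.
```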
5

The Effects of Open Source License Choice on Software Reuse

Brewer, John VIII 08 June 2012 (has links)
Previous research shows that software reuse can have a positive impact on software development economics, and that the adoption of a specific open source license can influence how a software product is received by users and programmers. This study attempts to bridge these two research areas by examining how the adoption of an open source license affects software reuse. Two reuse metrics were applied to 9,570 software packages contained in the Fedora Linux software repository. Each package was evaluated to determine how many external components it reuses, as well as how many times it is reused by other software packages. The data was divided into subsets according to license type and software category. The study found that, in general, (1) software released under a restrictive license reuses more external components than software released under a permissive license, and (2) software released under a permissive license is more likely to be reused than software released under a restrictive license. However, there are exceptions to these conclusions, as the effect of license choice on reuse varies by software category. / Master of Science
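The two reuse metrics can be illustrated with a toy dataset: count each package's outgoing reuse (external components it depends on) and incoming reuse (times other packages depend on it), then average per license group. The packages and numbers below are invented, not drawn from the Fedora data.

```python
# Two reuse metrics per package: outgoing (components it reuses) and
# incoming (times it is reused), averaged by license group.
from collections import defaultdict

packages = [
    # (name, license_type, packages it depends on) -- invented data
    ("gnu-tool", "restrictive", ["libperm", "libcopyleft"]),
    ("bsd-lib", "permissive", ["libperm"]),
    ("libperm", "permissive", []),
    ("libcopyleft", "restrictive", []),
]

incoming = defaultdict(int)
for _, _, deps in packages:
    for d in deps:
        incoming[d] += 1

def avg_reuse(license_type):
    """Return (avg outgoing reuse, avg incoming reuse) for a license group."""
    group = [(len(deps), incoming[name])
             for name, lic, deps in packages if lic == license_type]
    n = len(group)
    return (sum(o for o, _ in group) / n, sum(i for _, i in group) / n)
```

On this invented sample the restrictive group averages more outgoing reuse and the permissive group more incoming reuse, mirroring the direction of the study's general findings.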
6

An Impact Assessment and Dependency Management Framework for BPM

Christie, Adrian Troy 07 June 2019 (has links)
Business Process Management (BPM) is a relatively new development paradigm that takes a high-level approach to coding by leveraging a graphical, "flow chart" aesthetic that allows users to assemble modular tasks into a larger process. The resulting diagrams effectively enshrine the organization's processes into an executable model that provides an objective and transparent view of the process and the activities contained within. In doing so, BPM models serve both as a system to guide employees through proper business procedures and as documentation of the business's processes. Like most other software development environments, BPM development platforms possess several features intended to address the needs of code versioning, dependency management, and impact assessment. However, due to the unique way that development is done in BPM platforms, the more traditional functionality of these features sometimes renders them ineffective and ill-suited to the task of BPM development. Changes to lower-level reusable components in these BPM models can result in impacts to diverse processes across an organization that are difficult to predict and onerous to locate. There is much room for improvement in BPM development tools. This thesis proposes a new framework for dependency management and impact assessment that improves their usability, effectiveness, and efficiency. The framework is composed of a Business Process Component Architecture, a Dependency Data Model, and an Upgrade Algorithm, which together provide increased visibility over dependent processes and superior guidance during upgrade operations. Several example case scenarios are used to evaluate the proposed framework. The cases represent progressive degrees of complexity to test the capabilities and robustness of the framework. Overall, the framework was able to appropriately handle the case examples and showed promise in terms of providing practical effort, time, and cost savings for BPM developers. The framework can also assist developers in locating circular dependencies, but is subject to the same limitations as developers when attempting to upgrade these relationships.
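The impact-assessment side of such a framework boils down to a reverse traversal of the dependency graph: starting from a changed component, collect every process or component that transitively uses it. This is a hypothetical sketch, not the thesis's framework; the process names are invented.

```python
# Reverse BFS over "who uses what" to list everything impacted by a change.
from collections import deque

def impacted_by(changed, dependents):
    """dependents: {component: [components/processes that use it]}
    Returns the set of everything transitively impacted by `changed`."""
    seen, queue = set(), deque([changed])
    while queue:
        node = queue.popleft()
        for user in dependents.get(node, []):
            if user not in seen:
                seen.add(user)
                queue.append(user)
    return seen

dependents = {
    "validate-address": ["shipping-subprocess", "billing-subprocess"],
    "shipping-subprocess": ["order-process"],
    "billing-subprocess": ["order-process", "refund-process"],
}
impact = impacted_by("validate-address", dependents)
```

A change to one shared task thus surfaces every top-level process that would need retesting, which is precisely the visibility that manual inspection of BPM diagrams struggles to provide.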
7

Package Dependencies Analysis and Remediation in Object-Oriented Systems

Laval, Jannik 17 June 2011 (has links) (PDF)
Software evolves over time through the modification, addition, and removal of classes, methods, functions, and dependencies. As a consequence, behavior can end up in the wrong packages, breaking the software's modularity. A good organization of classes into identifiable packages eases comprehension, maintenance, testing, and evolution of the software. We argue that maintainers lack tools to support software remodularization. Software maintenance requires approaches that help with (i) understanding the package-level structure and assessing its quality; (ii) identifying modularity problems; and (iii) making decisions about changes. In this thesis we propose ECOO, an approach that supports remodularization. It addresses the following three research areas: (i) understanding dependency problems between packages, for which we propose visualizations highlighting cyclic dependencies at the package level; (ii) proposing dependencies that should be changed, to make the system more modular; and (iii) analyzing the impact of changes, with a change-impact analysis for trying out modifications before applying them to the real system. The approach presented in this thesis has been validated qualitatively, and the results were taken into account in the reengineering of the analyzed systems. The results obtained demonstrate the usefulness of our approach.
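The package-level cyclic dependencies that the visualizations highlight can be found with a straightforward depth-first search over the package dependency graph. This sketch is illustrative, not part of ECOO; the package names are invented.

```python
# DFS with white/gray/black coloring to report one package dependency cycle.

def find_cycle(deps):
    """deps: {package: [packages it depends on]}
    Returns a list of packages forming a cycle (first == last), or None."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {p: WHITE for p in deps}
    stack = []

    def dfs(p):
        color[p] = GRAY
        stack.append(p)
        for q in deps.get(p, []):
            if color.get(q, WHITE) == GRAY:       # back edge: cycle found
                return stack[stack.index(q):] + [q]
            if color.get(q, WHITE) == WHITE:
                found = dfs(q)
                if found:
                    return found
        stack.pop()
        color[p] = BLACK
        return None

    for p in list(deps):
        if color[p] == WHITE:
            found = dfs(p)
            if found:
                return found
    return None

deps = {"ui": ["core"], "core": ["model"], "model": ["core"]}
cycle = find_cycle(deps)
```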
8

Dataflow Analysis and Workflow Design in Business Process Management

Sun, Xiaoyun January 2007 (has links)
Workflow technology has become a standard solution for managing increasingly complex business processes. Successful business process management depends on effective workflow modeling, which has been limited mainly to modeling the control and coordination of activities, i.e., the control flow perspective. However, given a workflow specification that is flawless from the control flow perspective, errors can still occur due to incorrect dataflow specification, referred to as dataflow anomalies. Currently, there are no sufficient formalisms for discovering and preventing dataflow anomalies in a workflow specification. Therefore, the goal of this dissertation is to develop formal methods for automatically detecting dataflow anomalies in a given workflow model, and a rigorous approach to workflow design that can help avoid dataflow anomalies during the design stage. We first propose a formal approach for dataflow verification, which can detect dataflow anomalies such as missing data, redundant data, and potential data conflicts. In addition, we propose the dataflow matrix, a two-dimensional table showing the operations each activity performs on each data item, as a way to specify dataflow in workflows. We believe our dataflow verification framework adds analytical rigor to business process management by enabling the systematic elimination of dataflow errors. We then propose a formal, dependency-analysis-based approach for workflow design. A new concept called "activity relations" and a matrix-based analytical procedure enable the derivation of workflow models in a precise and rigorous manner. Moreover, we decouple the correctness issue from the efficiency issue to reduce the complexity of workflow design, and apply the concept of inline blocks to further simplify the procedure. These novel techniques make it easier to handle complex and unstructured workflow models, including overlapping patterns. In addition to proving the core theorems underlying the formal approaches and illustrating their validity on real-world cases, we provide detailed algorithms and system architectures as a roadmap for implementing the dataflow verification and workflow design procedures.
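The dataflow-matrix idea can be illustrated with a simplified checker: activities in control-flow order read ("r") or write ("w") data items, and the checker reports missing data (read before any write) and redundant data (written but never subsequently read). This is a toy linear-order version of the analysis, not the dissertation's formalism; the workflow is invented.

```python
# Detect two dataflow anomalies from a dataflow-matrix-style specification:
# missing data (read with no prior write) and redundant data (written,
# never read afterward). Assumes a single linear control flow.

def check_dataflow(matrix, items):
    """matrix: ordered list of (activity, {item: "r" or "w"}) pairs."""
    missing, written, last_read = [], set(), {}
    for activity, ops in matrix:
        for item, op in ops.items():
            if op == "r":
                if item not in written:
                    missing.append((activity, item))
                last_read[item] = True
            elif op == "w":
                written.add(item)
                last_read[item] = False   # a new write awaits a reader
    redundant = [i for i in items
                 if i in written and last_read.get(i) is False]
    return missing, redundant

matrix = [
    ("receive-order", {"order": "w"}),
    ("check-credit", {"order": "r", "score": "w"}),
    ("ship", {"address": "r"}),
]
missing, redundant = check_dataflow(matrix, ["order", "score", "address"])
```

Branching and parallel workflows require reasoning over all execution paths rather than one linear scan, which is where the dissertation's formal machinery comes in.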
9

System Dependency Analysis for Evolving Space Exploration System of Systems

Christopher T Brand (9189131) 31 July 2020 (has links)
Evolution is a key distinguishing trait of Systems-of-Systems (SoS) that introduces a layer of analytical complexity not present when considering static systems. Some SoS analysis tools exist to determine and evaluate the evolution of an SoS, while other tools are better suited to studying individual instances of an SoS. System Operational Dependency Analysis (SODA) is one such method that has previously been used to study static SoS networks. SODA has proven effective in investigating the impacts of partial system disruptions, but would benefit from a framework for applying it to evolving SoS. This thesis provides an approach to modeling evolving SoS in SODA and presents new data visualization methods to highlight the effects of changing network configurations across evolutionary phases. These visualization enhancements include Failure Impact Range sequence plots, which show the effects of deterministic system disruptions on capabilities of interest across evolutionary phases, and Stochastic Impact plots, which quantify the impact of disruptions in particular systems in the context of the probabilistic operating statuses assigned to each system. Integration of SODA and the related method of System Developmental Dependency Analysis (SDDA) is explored to model how operational disruptions and developmental delays might interact and compound during the evolution of an SoS. The SODA enhancements provide decision makers with new information that can be used to explore design and implementation tradeoffs in an evolving SoS under budget and scheduling constraints. These ideas are demonstrated through a case study based on NASA's Artemis program to return humans to the Moon in commercially-built Human Landing Systems (HLS). The HLS concepts proposed to NASA consist of multiple elements that provide distinct capabilities in different phases of the lunar mission, and can therefore be considered an evolving SoS architecture.
The operational dependencies of two HLS concepts are modeled across a four-phase lunar landing mission and results are generated using the new visualization methods to highlight the impacts of changing SoS configuration on the performance of key mission capabilities. The development timeline of the first three planned Artemis lunar landing missions is analyzed with SDDA and integrated with SODA results from one HLS concept to explore how developmental delays impact the likelihood of HLS mission completion and how operational failures requiring system redesign impact the program schedule. Connections between SDDA and Integrated Master Schedules (IMS) are discussed to show how SDDA results can be useful in a context more familiar to program managers.
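A crude stand-in for dependency-driven operability propagation (not the actual SODA formulation, which has its own dependency parameters) can be written as follows: each system's operability is its self-status scaled by the weighted operability of its suppliers, propagated in topological order. The systems, weights, and statuses below are invented.

```python
# Toy operability propagation: a disruption in one system degrades
# everything that depends on it, directly or transitively.

def propagate(statuses, deps, order):
    """statuses: {system: self-status 0..100}
    deps: {system: [(supplier, weight)]}, weights per system summing to 1
    order: a topological order of the dependency network."""
    oper = {}
    for s in order:
        if not deps.get(s):
            oper[s] = statuses[s]          # no suppliers: self-status only
        else:
            supplied = sum(w * oper[d] for d, w in deps[s])
            oper[s] = statuses[s] * supplied / 100.0
    return oper

# Invented three-system network: a 50% power disruption degrades comms
# and the lander even though their own statuses are nominal.
statuses = {"power": 50, "comms": 100, "lander": 100}
deps = {"comms": [("power", 1.0)],
        "lander": [("power", 0.5), ("comms", 0.5)]}
oper = propagate(statuses, deps, ["power", "comms", "lander"])
```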
10

Analysis of a flight mechanics simulator

Helgesson, Fredrik January 2019 (has links)
Aircraft design is an act of art requiring dedication and careful work to ensure good results. An essential tool in that work is a flight mechanics simulator. Such simulators are often built of modules/models that are executed in sequential order in each time iteration. This project analyzes potential improvements to the model execution order based on the dependency structure of one such simulator. The Design Structure Matrix (DSM) analysis method was used to map the dependencies, and Binary Linear Programming (BLP) was then used to find five new, potentially improved model orders that minimize the number of feedbacks from one iteration to the next. Those five proposed execution orders were then compared and evaluated. The result is a model order that reduces the number of models receiving feedback from the previous iteration from 13 to 6, with insignificant changes in the precision of the simulator.
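The reordering objective can be illustrated in miniature: given a DSM-style mapping from each model to the models whose outputs it consumes, count how many models must take feedback from the previous iteration under a given execution order, and search for an order minimizing that count. A real formulation would use BLP as in the thesis; this toy uses exhaustive search over three invented models.

```python
# Minimize cross-iteration feedbacks: a model receives feedback if any of
# its inputs is produced by a model that runs later in the same iteration.
from itertools import permutations

def feedback_count(order, deps):
    """deps: {model: [models whose outputs it consumes]}."""
    pos = {m: i for i, m in enumerate(order)}
    return sum(
        1 for m in order
        if any(pos[src] > pos[m] for src in deps.get(m, []))
    )

# Invented three-model cycle: any linear order leaves at least one feedback.
deps = {"aero": ["state"], "engine": ["aero"], "state": ["engine"]}
models = list(deps)
best = min(permutations(models), key=lambda o: feedback_count(o, deps))
```

Exhaustive search is only viable for a handful of models; with the dozens of models in a real simulator, the same objective is encoded as a BLP and handed to a solver.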
