171 |
Reverse Engineering Behavioural Models by Filtering out Utilities from Execution Traces / Braun, Edna 10 September 2013 (has links)
An important issue in software evolution is the time and effort needed to understand existing applications. Reverse engineering software to recover behavioural models is difficult, and the task is further complicated by the lack of a standardized way of extracting and visualizing knowledge. In this thesis, we study a technique for automatically extracting static and dynamic data from software, filtering and analysing the data, and visualizing the behavioural model of a selected feature of a software application. We also investigate the usefulness of the generated diagrams as documentation for the software.
We present a literature review of studies that have used static and dynamic data analysis for software comprehension. A set of criteria is created, and each approach, including this thesis’ technique, is compared using those criteria.
We propose an approach to simplify lengthy traces by filtering out software components that are too low level to give a high-level picture of the selected feature. We use static information to identify and remove small and simple (or uncomplicated) software components from the trace. We define a utility method as any element of a program designed for the convenience of the designer and implementer and intended to be accessed from multiple places within a certain scope of the program. Utilityhood is defined as the extent to which a particular method can be considered a utility. Utilityhood is calculated using different combinations of selected dynamic and static variables. Methods with high utilityhood values are detected and removed iteratively. By eliminating utilities, we are left with a much smaller trace which is then visualized using the Use Case Map (UCM) notation. UCM is a scenario language used to specify and explain behaviour of complex systems.
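A minimal sketch of this iterative filtering idea follows. The metric names, weights, and threshold below are illustrative assumptions, not the thesis's actual utilityhood formula:

```python
# Hypothetical sketch of utility filtering: score each method by a
# weighted combination of static/dynamic metrics and iteratively drop
# high-scoring "utilities" from the trace. Weights and threshold are invented.

def utilityhood(m):
    # Small, simple, widely-called methods score high.
    return (0.5 * m["fan_in"] / 10.0
            + 0.3 * (1.0 - min(m["loc"], 50) / 50.0)
            + 0.2 * (1.0 - min(m["complexity"], 10) / 10.0))

def filter_trace(trace, metrics, threshold=0.7):
    """Iteratively remove the highest-utilityhood method until none
    exceeds the threshold, then return the reduced trace."""
    kept = set(metrics)
    while kept:
        score, worst = max((utilityhood(metrics[m]), m) for m in kept)
        if score < threshold:
            break
        kept.remove(worst)
    return [call for call in trace if call in kept]

metrics = {
    "parseOrder":    {"fan_in": 2,  "loc": 120, "complexity": 9},
    "log":           {"fan_in": 40, "loc": 5,   "complexity": 1},
    "applyDiscount": {"fan_in": 1,  "loc": 60,  "complexity": 7},
}
trace = ["parseOrder", "log", "applyDiscount", "log", "log"]
print(filter_trace(trace, metrics))  # utilities like "log" are removed
```

The reduced trace keeps only the feature-level methods, which is what makes the subsequent UCM visualization tractable.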
By doing so, we can identify the algorithm that generates a UCM closest to the mental model of the designers. Although our results did not reveal a single algorithm that was best in all cases, there is a clear trend: three of the four best algorithms (out of a total of eight investigated) used method complexity and method lines of code as parameters. We also validated the algorithm results by comparing them against a list of methods provided by the creators of the software and computing precision and recall. Seven of the eight participants agreed or strongly agreed that using UCM diagrams to visualize reduced traces is a valid approach, and none disagreed.
|
172 |
Building reverse engineering tools with software components / Kienle, Holger M. 20 November 2006 (has links)
This dissertation explores a new approach to construct tools in the
domain of reverse engineering. The approach uses already available
software components -- such as off-the-shelf components and integrated
development environments -- as building blocks, combining and
customizing them programmatically to realize the desired functional
and non-functional requirements. This approach can be characterized as
component-based tool-building, as opposed to traditional
tool-building, which typically develops most of the tool's
functionalities from scratch.
The dissertation focuses on research tools that are constructed in a
university or research lab (and then possibly evaluated in an
industrial setting). Often the motivation to build a research tool is
a proof-of-concept implementation. Tool-building is a necessary part
of research -- but it is a costly one. Traditional approaches to tool
building have resulted in tools that have a high degree of custom code
and exhibit little reuse. Building from scratch offers the most
flexibility, but it is costly and can result in highly idiosyncratic
tools that are difficult to use. To compensate for the drawbacks of building
tools from scratch, researchers have started to reuse existing
functionality, leading towards an approach that leverages components
as building blocks. However, this emerging approach is pursued in an
ad hoc manner reminiscent of craftsmanship rather than professional
engineering.
The goal of this dissertation is to advance the current state of
component-based tool-building towards a more disciplined, predictable
approach. To achieve this goal, the dissertation first summarizes and
evaluates relevant tool-building experiences and case studies, and
then distills these into practical advice in the form of lessons
learned, and a process framework for tool builders to follow.
The dissertation uniquely combines two areas, reverse engineering and
software components. The former addresses the constructed tool's
application domain, the latter forms the foundation of the
tool-building approach. Since this dissertation mostly focuses on
tools for reverse engineering, a thorough understanding of this
application domain is necessary to elicit its requirements. This is
accomplished with an in-depth literature survey, which synthesizes
five major requirements. The elicited requirements are used as a
yardstick for the evaluation of component-based tools and the proposed
process framework. There are diverse kinds of software components that
can be leveraged for component-based tool building. However, not all
of these components are suitable for the proposed tool-building
approach. To characterize the kinds of applicable components, the
dissertation introduces a taxonomy to classify components. The
taxonomy also makes it possible to reason about characteristics of
components and how these characteristics affect the construction of
tools.
This dissertation introduces a catalog of components that are
applicable for the proposed tool-building approach in the reverse
engineering domain. Furthermore, it provides a detailed account of
several case studies that pursue component-based tool-building. Six of
these case studies represent the author's own tool-building
experiences. They have been performed over a period of five years
within the Adoption-Centric Reverse Engineering project at the
University of Victoria. These case studies, along with relevant
experiences reported by other researchers, constitute a body of
valuable tool-building knowledge. This knowledge base provides the
foundation for this dissertation's two most important contributions.
First, it distills the various experiences -- the author's as well as
others' -- into ten lessons learned. The lessons cover important
requirements for tools as uncovered by the literature
survey. Addressing these requirements promises to result in better
tools that are more likely to meet the needs of tool users. Second,
the dissertation proposes a suitable process framework for
component-based tool development that can be instantiated by tool
builders. The process framework encodes desirable properties of a
process for tool-building, while providing the necessary flexibility
to account for the variations of individual tool-building projects.
|
173 |
The reverse engineering notebook / Wong, Kenny 17 December 2007 (has links)
Software must evolve over time or it becomes useless. Much of software production today is involved not in creating wholly new code from scratch but in maintaining and building upon existing code. Much of this code resides in old legacy software systems.
Unfortunately, these systems are often poorly documented. Typically, they become more complex and difficult to understand over time. Thus, there is a need to better understand existing software systems. Addressing this need would be a first step toward easing changes and supporting the continuous evolution of these systems.
This dissertation addresses the problem by enabling continuous software understanding. There should be a base of reverse engineering abstractions that are carried forward during evolution. The proposed approach seeks to redocument existing software structure, capture the analysis decisions made, and support personal, customizable, and live perspectives of the software in an online journal called the Reverse Engineering Notebook.
The premise that software reverse engineering be applied continuously throughout the lifetime of the software has major tool design implications. Thus, tool integration, process, and adoption are key issues for the Notebook. In particular, the dissertation describes data integration requirements, control integration via pervasive scripting, presentation integration through the management of views, user roles, methodology, end-user needs, and a goal-directed framework for the Notebook.
A major theme of the dissertation is learning from the successes and failures of studies involving tool integration and reverse engineering technologies. Case studies and user experiments helped to evaluate various aspects of the Notebook approach and provide feedback into software understanding tool requirements.
|
174 |
Reverse engineering of UML sequence diagrams using dynamic information / Miao, Yucong, January 1900 (has links)
Thesis (M. Sc.)--Carleton University, 2003. / Includes bibliographical references (p. 77-79). Also available in electronic format on the Internet.
|
175 |
Towards reverse engineering of UML sequence diagrams of real-time, distributed systems through dynamic analysis / Leduc, Johanne, January 1900 (has links)
Thesis (M. App. Sc.)--Carleton University, 2004. / Includes bibliographical references (p. 116-119). Also available in electronic format on the Internet.
|
176 |
Automated generation of SW design constructs from MESA source code / Egerton, David. January 1993 (has links)
Thesis (M.S.)--Rochester Institute of Technology, 1993. / Typescript. Includes bibliographical references (vol. 1, leaves 155-160).
|
177 |
Reverse engineering of web applications using modeling methodologies / Σουρλά, Γεωργία 08 May 2013 (has links)
Web applications provide many services and are no longer used merely to display content, as was the case for the first web sites. Due to this growing complexity, the design, development and maintenance of these applications has become one of the major challenges that the developer has to face. In an attempt to address this growing complexity, the research community has proposed a number of model-based approaches and methods. The task of the hypertext architect may be further facilitated by reusing the experience of other hypertext architects. This reuse is achieved by means of design patterns that have been defined by experts. If we make use of a modeling method in combination with design patterns when designing a web application, the final result will be more efficient and of higher quality. Usually, due to pressing market demands, modeling methods and techniques are not applied during the design and development of web applications. A number of reverse engineering methods and tools have been proposed in order to analyze, comprehend and model the architectures of such applications.
When the architecture of traditional software systems is modeled, the internal structure of the operating and windowing system does not appear. Similarly, in the field of web applications, the internal structure of the web server and browser does not appear. The architectural diagrams of web applications need to show the main components of an application, which are linked together in order to create large and complex applications. The main components of an application are the objects of the Web, the database tables and the multimedia objects.
In this thesis, we analyzed the most popular open source Content Management System (CMS), Joomla!. More specifically, we applied reverse engineering to it in order to model its architecture. For this purpose, we developed an application that explores the HTML code of all the pages of a web site built with Joomla!, so that its model is extracted automatically rather than manually. The main goal is to evaluate the design of its model and to provide an automated way of performing this evaluation. On this basis, we can recommend ways to restructure the system to improve the usability and efficiency of this content management system.
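A minimal sketch of the kind of extraction such an application might perform (the class-name heuristic and page snippet below are illustrative assumptions, not the thesis's actual implementation):

```python
# Hypothetical sketch: walk a page's HTML and record links and
# "component-like" blocks, here crudely identified by a class name that
# Joomla templates commonly emit. The model-building logic is invented.
from html.parser import HTMLParser

class ModelExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links, self.components = [], []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a" and "href" in a:
            self.links.append(a["href"])          # edges of the site model
        if "moduletable" in a.get("class", ""):    # a common Joomla hint
            self.components.append(a.get("class")) # candidate components

page = '<div class="moduletable"><a href="index.php?option=com_content">News</a></div>'
ex = ModelExtractor()
ex.feed(page)
print(ex.links)       # ['index.php?option=com_content']
print(ex.components)  # ['moduletable']
```

Repeating this over every page of the site and merging the results yields the automatically extracted model mentioned above.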
|
178 |
CAD/CAM/CNC machining integrated with reverse engineering / Santos, Edgar Pereira dos January 2017 (has links)
Advisor: Ruis Camargo Tokimatsu / Abstract: Society constantly seeks innovations that facilitate processes and help achieve goals in different areas, using a variety of instruments, methods and techniques. Among these is Reverse Engineering (RE), the engineering process of obtaining a product or object from an original model. In recent years, with rapid technological evolution involving machines, tools and software, RE has become a resource used even by small and medium-sized companies. The purpose of this research is to evaluate the performance of engineering resources, such as low-cost or freely available hardware and software, applied to RE in order to obtain a part machined on a CNC machine that is as faithful as possible to the original physical model. To this end, the necessary concepts, processes and resources were applied in the stages of reconstructing a physical model: 3D digitization using three-dimensional scanners, processing of the captured object, reconstruction of the model in CAD software, and generation of the G-code by means of CAM software, to be sent to a CNC machine to carry out the machining and produce the model part. The Ciclop manual 3D scanner was used, whose digitization resulted in dimensions very close to the original object. The Meshlab software was used for processing the digitized model, and AutoCad Fusion 3D for creating the CAD model and generating the machining processes and G-code, chosen due... (Complete abstract: click electronic access below) / Master's
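As a toy illustration of the final step in this pipeline, the CAM stage reduces to emitting G-code moves. The sketch below generates a square contour; coordinates, feed rate, and depth are invented, and real toolpaths come from the CAM software named above:

```python
# Toy CAM step: emit G-code for a simple square pocket contour.
# All numbers are illustrative; real toolpaths are produced by CAM software.
def square_contour(side, depth, feed=300):
    lines = ["G21 ; millimetres",
             "G90 ; absolute coordinates",
             f"G1 Z-{depth} F{feed} ; plunge to cutting depth"]
    # Trace the four sides of the square, ending back at the origin.
    for x, y in [(side, 0), (side, side), (0, side), (0, 0)]:
        lines.append(f"G1 X{x} Y{y} F{feed}")
    lines.append("G0 Z5 ; retract")
    return lines

for line in square_contour(40, 2):
    print(line)
```

A program like this is what ultimately gets streamed to the CNC controller to machine the reconstructed part.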
|
179 |
Analysis of low-level implementations of cryptographic protocols / Gkaniatsou, Andriana Evgenia January 2018 (has links)
This thesis examines the vulnerabilities due to low-level implementation deficiencies of otherwise secure communication protocols in smart-cards. Smart-cards are considered to be one of the most secure, tamper-resistant, and trusted devices for implementing confidential operations, such as authentication, key management, encryption and decryption for financial, communication, security and data management purposes. The self-containment of smart-cards makes them resistant to attacks as they do not depend on potentially vulnerable external resources. As such, smart-cards are often incorporated in formally-verified protocols that require strong security of the cryptographic computations. Such a setting consists of a smart-card which is responsible for the execution of sensitive operations, and an Application Programming Interface (API) which implements a particular protocol. For the smart-card to execute any kind of operation there exists a confidential low-level communication with the API, responsible for carrying out the protocol specifications and requests. This communication is kept secret on purpose by some vendors, under the assumption that hiding implementation details enhances the system's security. The work presented in this thesis analyses such low-level protocol implementations in smart-cards, especially those whose implementation details are deliberately kept secret. In particular, the thesis consists of a thorough analysis of the implementation of PKCS#11 and Bitcoin smart-cards with respect to the low-level communication layer. Our hypothesis is that by focusing on reverse-engineering the low-level implementation of the communication protocols in a disciplined and generic way, one can discover new vulnerabilities and open new attack vectors that are not possible when looking at the highest levels of implementation, thereby compromising the security guarantees of the smart-cards.
We present REPROVE, a system that automatically reverse-engineers the low-level communication of PKCS#11 smart-cards, deduces the card's functionalities and translates PKCS#11 cryptographic functions into communication steps. REPROVE deals with both standard-conforming and proprietary implementations, and does not require access to the card. We use REPROVE to reverse-engineer seven commercially available smart-cards. Moreover, we conduct a security analysis of the obtained models and expose a set of vulnerabilities which would have otherwise been unknown. To the best of our knowledge, REPROVE is the first system to address proprietary implementations and the only system that maps cryptographic functions to communication steps and on-card operations. We showcase REPROVE's usefulness to a security ecosystem by integrating it with an existing tool to extract meaningful state-machines of the cards' implementations. To conduct a security analysis of the results we obtained, we define a threat model that addresses low-level PKCS#11 implementations. Our analysis indicates a series of implementation errors that leave the cards vulnerable to attacks. We showcase how these vulnerabilities can be exploited by presenting practical attacks. The results we obtained from the PKCS#11 smart-card analysis showed that proprietary implementations commonly hide erroneous behaviours. To test the assumption that the same practice is also adopted by other protocols, we further examine the low-level implementation of the only available smart-card based Bitcoin wallets, LEDGER. We extract the different protocols that the LEDGER wallets implement and conduct a thorough analysis. Our results indicate a set of vulnerabilities that expose the wallets as well as the processed transactions to multiple threats.
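The low-level communication in question consists of ISO 7816 APDUs exchanged with the card. A minimal, hypothetical sketch of the very first step of such reverse engineering is decoding each command's header; the INS-to-operation table below is illustrative, and REPROVE's actual inference is far more involved:

```python
# Illustrative sketch (not REPROVE itself): decode the CLA/INS/P1/P2/Lc
# header of an ISO 7816 command APDU and guess the operation from the
# instruction byte. The small lookup table is an assumption for the demo.
CANDIDATE_OPS = {0xA4: "SELECT", 0x20: "VERIFY PIN", 0x84: "GET CHALLENGE"}

def parse_apdu(raw: bytes):
    cla, ins, p1, p2 = raw[0], raw[1], raw[2], raw[3]
    # Lc (raw[4]) gives the length of the command data field, if present.
    data = raw[5:5 + raw[4]] if len(raw) > 4 else b""
    return {"cla": cla, "ins": ins, "p1": p1, "p2": p2, "data": data,
            "guess": CANDIDATE_OPS.get(ins, "unknown")}

# SELECT file 3F00 (the master file): 00 A4 00 00 02 3F 00
apdu = parse_apdu(bytes.fromhex("00A40000023F00"))
print(apdu["guess"], apdu["data"].hex())
```

Correlating sequences of such decoded commands with the API calls that triggered them is what lets a tool map cryptographic functions onto on-card operations.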
We then present how we successfully mounted attacks on the LEDGER wallets that lead to the loss of the wallet's ownership and, consequently, of its funds. We address the lack of well-defined security properties that Bitcoin wallets should conform to by introducing a general threat model. We further use that threat model to propose a lightweight fix that can be adopted by other, not necessarily smart-card-based, wallets.
|
180 |
Rétro-ingénierie de programmes binaires en une exécution : une analyse dynamique légère basée au niveau des fonctions / Reverse-engineering of binaries in a single execution: a lightweight function-grained dynamic analysis / Goër de Herve, Franck de 20 October 2017 (has links)
In this thesis, we propose a new dynamic approach to analyzing binary programs. The context of this work is reverse engineering binaries, with motivations related to security: understanding malware, detecting vulnerabilities, etc. Concretely, we focus on retrieving high-level information from a binary in a single execution: function prototypes, a new notion we name coupling, and memory allocators. The proposed approach is based on heuristics in order to analyze large programs efficiently, and experimental results show that such an approach leads to accurate results. The three main objectives of the approach are: 1) universality - the hypotheses on the program to analyze are as weak as possible (no recompilation, no source code, possibly stripped binaries); 2) scalability - the analysis aims to be lightweight enough to handle large programs; 3) soundness - we aim to minimize false positives in the results (e.g., detecting parameters of functions that do not actually exist). The thesis is divided into three parts: a first part presenting the context of reverse engineering in which we work, a second part in which we present our approach, and a third part describing our implementation and numerical results.
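As a toy illustration of the heuristic flavour of this approach (not the thesis's actual algorithm), one common way to bound a function's arity from a single execution is to watch which argument registers are read before being written at function entry:

```python
# Toy arity heuristic: under the System V x86-64 ABI, the first integer
# arguments arrive in rdi, rsi, rdx, rcx, r8, r9. A register read before
# being written near function entry is likely a genuine parameter.
ARG_REGS = ["rdi", "rsi", "rdx", "rcx", "r8", "r9"]

def infer_arity(trace):
    """trace: list of ("read"|"write", register) events at function entry."""
    written, used = set(), set()
    for kind, reg in trace:
        if kind == "read" and reg in ARG_REGS and reg not in written:
            used.add(reg)           # read-before-write: likely a parameter
        elif kind == "write":
            written.add(reg)        # later reads of reg are not parameters
    # Arity = highest argument register observed as a genuine input.
    return max((ARG_REGS.index(r) + 1 for r in used), default=0)

# e.g. int f(int a, int b): reads rdi and rsi, then clobbers and reuses rdx.
print(infer_arity([("read", "rdi"), ("write", "rdx"),
                   ("read", "rsi"), ("read", "rdx")]))  # 2
```

Heuristics of this kind trade completeness for speed, which is exactly the universality/scalability/soundness balance described above: they need no source or recompilation, scale to large binaries, and the read-before-write condition keeps false parameters rare.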
|