Master of Science / Department of Computer Science / Eugene Vasserman

Since 2014, over 120 million new malicious programs have been registered every year. Due to this volume of new malware, analysts have automated large sections of the malware reverse engineering process. Many automated analysis systems are created by re-implementing analysis techniques rather than automating existing tools that use the same techniques. New implementations take longer to create and lack the proven quality of tools that have evolved alongside malware for many years.
The goal of this study is to assess the efficiency and effectiveness of using existing tools for automated malware analysis. The study focuses on the problem of discovering how malware persists on an infected system. Six tools are chosen based on their usefulness in manual analysis for revealing the different persistence techniques employed by malware. The functions of these tools are automated in a fashion that emulates how they are used manually, producing information about a tested sample. These six tools are tested against a collection of real malware samples, drawn from malware families known for employing various persistence techniques. The tools' findings are then scanned for indicators of persistence. The results of these tests are used to determine the smallest tool subset that discovers the largest range of persistence mechanisms. For each tool, implementation difficulty is compared against the number of indicators discovered to gauge the effectiveness of similar tools for future analysis applications.
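As an illustration of this kind of automation (a minimal sketch, not the thesis's actual harness), the snippet below wraps a command-line analysis tool with Python's subprocess module and scans its output for common Windows persistence indicators such as registry Run keys, scheduled tasks, and service installation. The tool name, its flags, and the indicator list are all assumptions for illustration.

```python
import subprocess

# Common Windows persistence indicators (illustrative list, not the thesis's).
PERSISTENCE_INDICATORS = [
    r"CurrentVersion\Run",  # registry Run/RunOnce keys
    "schtasks",             # scheduled task creation
    "CreateService",        # service installation
    "Startup",              # startup folder entries
]


def run_tool_and_scan(tool_cmd, sample_path):
    """Run an analysis tool on a sample and collect output lines that
    match known persistence indicators. tool_cmd is a list such as
    ["sometool", "--report"] -- a hypothetical CLI tool, not a real one."""
    result = subprocess.run(
        tool_cmd + [sample_path],
        capture_output=True, text=True, timeout=300,
    )
    hits = []
    for line in result.stdout.splitlines():
        if any(ind.lower() in line.lower() for ind in PERSISTENCE_INDICATORS):
            hits.append(line.strip())
    return hits


if __name__ == "__main__":
    # Hypothetical invocation against a single sample.
    for hit in run_tool_and_scan(["sometool", "--report"], "sample.exe"):
        print(hit)
```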
The conclusion is that while the tools covered a wide range of persistence mechanisms, standalone tools designed with scripting in mind were more effective than those with multiple system requirements or with only a graphical interface. It was also discovered that the automation process limits the functionality of some tools, since they are designed for interactive use by an analyst. Restoring the functionality lost to automation, so that the tools can be used for other reverse engineering applications, could be cumbersome and could require substantial implementation overhauls. Finally, the more successful tools were able to detect a broad range of techniques, while some less successful tools could detect only a portion of the same techniques. This study concludes that while an analysis system can be created by automating existing tools, the characteristics of the chosen tools determine the workload required to automate them. A well-documented tool controllable through a command-line interface with many configuration options requires less work for an analyst to automate than a poorly documented tool that can only be controlled through a graphical interface.
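To make the workload difference concrete: a scriptable tool that exposes its options on the command line can be driven over an entire sample set with a few lines, whereas a GUI-only tool would need screen automation or an implementation overhaul. A minimal sketch, assuming a hypothetical CLI tool named analyzer with --format and --output flags:

```python
import subprocess
from pathlib import Path

# Batch-drive a hypothetical CLI tool over every sample in a directory.
# "analyzer", "--format", and "--output" are illustrative names, not a real tool.
for sample in Path("samples").glob("*.exe"):
    subprocess.run(
        ["analyzer", "--format", "json",
         "--output", f"reports/{sample.stem}.json", str(sample)],
        check=False,  # keep going even if one sample crashes the tool
    )
```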
Identifier | oai:union.ndltd.org:KSU/oai:krex.k-state.edu:2097/38783 |
Date | January 1900 |
Creators | Webb, Matthew S. |
Publisher | Kansas State University |
Source Sets | K-State Research Exchange |
Language | en_US |
Detected Language | English |
Type | Thesis |