
Analyzing Mobile App Privacy Using Computation and Crowdsourcing

Mobile apps can make use of the rich data and sensors available on smartphones to offer compelling services. However, the use of sensitive resources by apps is not always justified, which has led to new kinds of privacy risks and challenges. While it is possible for app market owners and third parties to analyze the privacy-related behaviors of apps, current approaches are tedious and require significant manual effort.
I present two iterations of the design, implementation, and evaluation of a system, Gort, which enables more efficient app analysis by reducing the burden of instrumenting apps, making it easier to find potential privacy problems, and presenting sensitive behavior in context. Gort interacts with apps while instrumenting them to detect transmissions of sensitive information. It then presents this information, along with the associated app context, to a crowd of users to obtain their expectations and comfort regarding the privacy implications of using the app. Gort also runs a set of heuristics on the app to flag potential privacy problems. Finally, Gort synthesizes the information obtained through its analysis and presents it in an interactive GUI built specifically for privacy analysts.
This work offers three distinct advances over the state of the art. First, Gort uses a set of heuristics, elicited through interviews with 12 experts, to identify potential app privacy problems. These heuristics surface high-level privacy problems instead of the overwhelming amount of low-level information offered by existing tools. Second, Gort automatically interacts with apps by discovering and exercising UI elements while instrumenting app behavior. This eliminates the need for analysts to manually interact with apps or to script interactions. Third, Gort uses crowdsourcing in a novel way to determine whether app privacy leaks are legitimate and desirable, and to raise red flags about potentially suspicious app behavior. While existing tools can detect privacy leaks, they cannot determine whether those leaks are beneficial or desirable to the user. Gort was evaluated through two separate user studies. The experiences from building Gort and the insights from the user studies can guide the creation of future systems, especially systems intended for the inspection and analysis of software.
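The automated interaction described above, discovering UI elements and exercising them while instrumentation records sensitive transmissions, can be thought of as a graph traversal over app screens. The sketch below is purely illustrative: the function and parameter names (`explore`, `get_elements`, `tap`) are hypothetical, and this is not Gort's actual implementation, only a minimal breadth-first version of the general technique.

```python
# Hypothetical sketch of breadth-first UI exploration. Screens are nodes,
# tappable UI elements are edges; instrumentation would log any sensitive
# transmissions triggered by each tap. All names are illustrative.
from collections import deque

def explore(start_screen, get_elements, tap, max_screens=50):
    """Visit app screens breadth-first, tapping each discovered element once.

    get_elements(screen) -> iterable of tappable elements on that screen.
    tap(screen, element) -> the screen reached by tapping that element
    (in a real system, the instrumented app would run during this step).
    """
    seen = {start_screen}
    queue = deque([start_screen])
    visited = []
    while queue and len(visited) < max_screens:
        screen = queue.popleft()
        visited.append(screen)
        for element in get_elements(screen):
            next_screen = tap(screen, element)
            if next_screen not in seen:
                seen.add(next_screen)
                queue.append(next_screen)
    return visited
```

Under this framing, "eliminating the need to script interactions" amounts to supplying generic `get_elements` and `tap` callbacks backed by the platform's UI inspection APIs, so the same traversal works for any app.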

Identifier: oai:union.ndltd.org:cmu.edu/oai:repository.cmu.edu:dissertations-1327
Date: 01 May 2014
Creators: Amini, Shahriyar
Publisher: Research Showcase @ CMU
Source Sets: Carnegie Mellon University
Detected Language: English
Type: text
Format: application/pdf
Source: Dissertations
