
Quantifying information flow with constraints

Quantifying the flow of information in a program involves calculating how much information (e.g. about secret inputs) can be leaked by observing the program's public outputs. This field has recently attracted considerable research interest, most of which makes use of Shannon's information theory, e.g. mutual information and conditional entropy. Computability considerations entail that any automated analysis of information flow is necessarily incomplete; quantitative flow analyses therefore aim to compute upper bounds on the sizes of the flows in a program. Virtually all current quantitative analyses treat program variables independently, which significantly limits the potential for deriving tight upper bounds. Our work is motivated by the intuition that knowledge of the dependencies between program variables should allow the derivation of more precise upper bounds on the size of flows, and that classical abstract interpretation provides an effective mechanism for determining such dependencies in the form of linear constraints. Our approach is then to view the problem as one of constrained optimization (maximum entropy), allowing us to apply the standard method of Lagrange multipliers. Applying this technique turns out to require the development of some novel methods, owing to the essential use of non-linear (entropy) constraints in conjunction with the linear dependency constraints. Using these methods we obtain more precise upper bounds on the size of information flows than is possible with existing analysis techniques.
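To illustrate the kind of constrained optimization the abstract refers to (this is a generic sketch, not the thesis's actual analysis): maximizing Shannon entropy subject to one linear constraint sum(a_i * p_i) = c, plus normalization, yields by the Lagrange-multiplier conditions a distribution of the form p_i ∝ exp(lam * a_i). The multiplier lam can then be found numerically. All names below (`max_entropy_dist`, the example scores `a`) are illustrative assumptions.

```python
import math

def max_entropy_dist(a, c, lo=-50.0, hi=50.0):
    """Maximum-entropy distribution over outcomes with scores a,
    subject to sum(p) = 1 and sum(a_i * p_i) = c.

    The Lagrange conditions give p_i proportional to exp(lam * a_i);
    since the constrained mean is increasing in lam, we can solve for
    lam by bisection. Illustrative sketch only."""
    def mean(lam):
        w = [math.exp(lam * ai) for ai in a]
        z = sum(w)
        return sum(ai * wi for ai, wi in zip(a, w)) / z

    for _ in range(200):  # bisection on the multiplier lam
        mid = (lo + hi) / 2
        if mean(mid) < c:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * ai) for ai in a]
    z = sum(w)
    return [wi / z for wi in w]

def entropy_bits(p):
    """Shannon entropy in bits; an upper bound on information leaked."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Hypothetical example: a 2-bit secret (4 outcomes, scores 0..3).
# Unconstrained, up to log2(4) = 2 bits could leak; a linear
# constraint on the mean (here 0.5) forces a skewed distribution
# whose entropy is strictly below 2 bits - a tighter bound.
p = max_entropy_dist([0, 1, 2, 3], 0.5)
print(entropy_bits(p))
```

The thesis's contribution lies in handling many such linear constraints (derived by abstract interpretation) together with non-linear entropy constraints, which this one-constraint sketch does not attempt.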

Identifier: oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:528483
Date: January 2010
Creators: Zhu, Ping
Publisher: City University London
Source Sets: Ethos UK
Detected Language: English
Type: Electronic Thesis or Dissertation
Source: http://openaccess.city.ac.uk/12101/
