Title

Highly Scalable Distributed Dataflow Analysis

Conference

Published in the Proceedings of the 9th Annual IEEE/ACM International Symposium on Code Generation and Optimization (CGO 2011), April 2011 (acceptance rate: 28/105 ≈ 27%)

Authors

Joseph L. Greathouse, Chelsea LeBlanc, Todd Austin, Valeria Bertacco

Abstract

Dynamic dataflow analyses find software errors by tracking meta-values associated with a program's runtime data. Despite their benefits, the orders-of-magnitude slowdowns that accompany these systems limit their use to the development stage; few users would tolerate such overheads.

This work extends dynamic dataflow analyses with a novel sampling system that ensures runtime slowdowns do not exceed a user-defined threshold. While previous sampling methods are inadequate for dataflow analyses, our technique efficiently reduces the number and size of analyzed dataflows. In doing so, it allows individual users to test large, stochastically chosen sets of a process's dataflows. Large populations can therefore, in aggregate, analyze a larger portion of the program than any single user could with a complete, but slow, analysis. In our experimental evaluation, we show that 1 out of every 10 users exposes a number of security exploits while experiencing only a 10% performance slowdown, in contrast with the 100x overhead caused by a complete analysis that exposes the same problems.
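To make the sampling idea concrete, the following is a minimal Python sketch, not the system described in the paper (which instruments binaries at runtime). Every name in it, such as SamplingTaintTracker, account_native, and the 0.5 sample rate, is an illustrative assumption: meta-values ("taint") are propagated only while the measured analysis overhead stays under a user-defined budget, and each input dataflow is followed with some probability, so different users end up analyzing different stochastically chosen subsets of a program's flows.

```python
# Minimal sketch (not the paper's implementation) of a sampled dynamic
# taint analysis: meta-values are propagated only while the measured
# analysis overhead stays under a user-defined budget, and each input
# dataflow is followed with some probability.
import random
import time


class SamplingTaintTracker:
    def __init__(self, overhead_budget=0.10):
        self.shadow = {}            # variable name -> tainted? (the meta-value)
        self.budget = overhead_budget
        self.native_time = 0.0      # time spent running the original work
        self.analysis_time = 0.0    # extra time spent propagating meta-values

    def account_native(self, seconds):
        # The instrumentation layer reports how long the uninstrumented
        # work took, so the overhead ratio can be computed.
        self.native_time += seconds

    def _over_budget(self):
        return (self.native_time > 0 and
                self.analysis_time / self.native_time > self.budget)

    def taint_input(self, var, sample_rate=0.5):
        # Stochastically choose whether to follow this dataflow at all.
        if not self._over_budget() and random.random() < sample_rate:
            self.shadow[var] = True

    def propagate(self, dst, *srcs):
        # Once the budget is exhausted, drop the tracked dataflows
        # instead of slowing the program down further.
        if self._over_budget():
            self.shadow.clear()
            return
        start = time.perf_counter()
        self.shadow[dst] = any(self.shadow.get(s, False) for s in srcs)
        self.analysis_time += time.perf_counter() - start

    def check_sink(self, var, sink_name):
        # A tainted value reaching a sensitive sink is a potential exploit.
        if self.shadow.get(var, False):
            print(f"potential exploit: tainted {var!r} reaches {sink_name}")


if __name__ == "__main__":
    tracker = SamplingTaintTracker(overhead_budget=0.10)
    tracker.taint_input("user_input")          # attacker-controlled data
    tracker.account_native(0.001)              # pretend native work happened
    tracker.propagate("buffer", "user_input")  # buffer derived from input
    tracker.check_sink("buffer", "strcpy")     # reported in sampled runs only
```

In this toy model, any single run may miss a flow it chose not to follow or dropped when the budget ran out, but across many runs (or many users) the sampled subsets collectively cover far more of the program's dataflows than one heavily slowed-down complete analysis, which is the intuition behind the paper's approach.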

Awards

Best Student Presentation, CGO 2011

Paper

IEEE | PDF

Presentation

PPTX | PPT | PDF

Copyright © 2011 IEEE. Hosted on this personal website as per this IEEE policy.