Data-flow analysis is a technique for gathering information about the possible set of values calculated at various points in a computer program. It forms the foundation for a wide variety of compiler optimizations and program verification techniques. A program's control-flow graph (CFG) is used to determine those parts of a program to which a particular value assigned to a variable might propagate. The information gathered is often used by compilers when optimizing a program. A canonical example of a data-flow analysis is reaching definitions. Other commonly used data-flow analyses include live variable analysis, available expressions, constant propagation, and very busy expressions, each serving a distinct purpose in compiler optimization passes.
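The reaching-definitions analysis mentioned above can be sketched on a tiny hand-built control-flow graph. The three-block program, node names, and GEN/KILL sets below are illustrative assumptions for this sketch, not taken from any particular compiler:

```python
# Assumed example program (B1 branches to B2 and B3; B2 falls through to B3):
#   B1: x = 1        (definition d1)
#   B2: x = 2        (definition d2, on one branch)
#   B3: y = x        (definition d3, at the join point)
preds = {"B1": [], "B2": ["B1"], "B3": ["B1", "B2"]}
gen   = {"B1": {"d1"}, "B2": {"d2"}, "B3": {"d3"}}
kill  = {"B1": {"d2"}, "B2": {"d1"}, "B3": set()}   # d1 and d2 both define x

# Equations: IN[n] = union of OUT[p] over predecessors p
#            OUT[n] = GEN[n] | (IN[n] - KILL[n])
out = {n: set() for n in preds}
changed = True
while changed:                       # iterate until nothing changes (a fixpoint)
    changed = False
    for n in preds:
        in_n = set().union(*(out[p] for p in preds[n]))
        new_out = gen[n] | (in_n - kill[n])
        if new_out != out[n]:
            out[n], changed = new_out, True

print(sorted(out["B3"]))             # definitions reaching the exit of B3
```

At the join point B3, both d1 and d2 reach, because the analysis cannot tell which branch executed; this loss of precision at merges is exactly what the join operation models.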
A simple way to perform data-flow analysis of programs is to set up data-flow equations for each node of the control-flow graph and solve them by repeatedly calculating the output from the input locally at each node until the whole system stabilizes, i.e., it reaches a fixpoint. The efficiency and precision of this process are significantly influenced by the design of the data-flow framework, including the direction of analysis (forward or backward), the domain of values, and the join operation used to merge information from multiple control paths. This general approach, also known as Kildall's method, was developed by Gary Kildall while teaching at the Naval Postgraduate School.[1][2][3][4][5][6][7][8]
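The iteration just described can be sketched as a small generic framework in the style of Kildall's method, parameterized by the edges to follow (which fixes the forward or backward direction), the join operation over the value domain, and a per-node transfer function. The `solve` helper, the node names, and the live-variables instance used to exercise it are all illustrative assumptions, not a fixed API:

```python
from collections import deque

def solve(nodes, edges_in, join, bottom, transfer):
    """Iterate transfer functions until the whole system reaches a fixpoint."""
    out = {n: bottom() for n in nodes}
    work = deque(nodes)
    while work:
        n = work.popleft()
        in_n = join(out[p] for p in edges_in[n])    # merge incoming facts
        new = transfer(n, in_n)
        if new != out[n]:                           # fact changed: re-queue
            out[n] = new                            # nodes that consume it
            work.extend(s for s in nodes if n in edges_in[s])
    return out

# Live-variable analysis is a backward analysis, so a node's "input" edges
# are its successors, and the result is each block's live-in set.
# Assumed program:  B1: x = 1   B2: y = x   B3: return y
succ = {"B1": ["B2"], "B2": ["B3"], "B3": []}
use  = {"B1": set(), "B2": {"x"}, "B3": {"y"}}
defn = {"B1": {"x"}, "B2": {"y"}, "B3": set()}

live_in = solve(
    nodes=["B3", "B2", "B1"],
    edges_in=succ,                                  # backward direction
    join=lambda facts: set().union(*facts),         # may-analysis: set union
    bottom=set,
    transfer=lambda n, live_out: use[n] | (live_out - defn[n]),
)
print(sorted(live_in["B2"]))                        # x is live entering B2
```

Swapping `edges_in` for a predecessor map and adjusting the transfer function would turn the same solver into a forward analysis such as reaching definitions; the direction, domain, and join are the framework design choices the paragraph above refers to.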