Changing the granularity of collected data

Quantify can collect function time information at different levels of detail, or granularity. As the level of detail increases, so does the cost of collecting the data. By default, Quantify determines the level of detail based on the type of code your compiler emits.

Quantify can collect data at the following levels of granularity:

Collection granularity   Description
----------------------   -------------------------------------------
Function                 Distinguishes counts for each function only
Basic-block              Distinguishes counts for each basic block
Line                     Distinguishes counts for each line

You can use the -collection-granularity option to control the level of detail at which Quantify collects data. You can specify function, basic-block, or line.

You can use basic-block and line only if debugging information is available; that is, if you compile your application using the -g debugging option.
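For example, assuming a small program built from a single source file (the names hello.c and hello are illustrative, not from the original text), you could compile with -g and collect line-level data by running the build line under Quantify:

% quantify -collection-granularity=line \
      cc -g -o hello hello.c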

Counting for basic blocks and lines is substantially slower than function-granularity counting, requiring approximately seven machine cycles per basic-block or line counter. This is because Quantify must update both its function counter and the corresponding basic-block counter each time it enters a basic block, and updating a basic-block counter means updating a counter in memory.

The average cost of each basic block in normal code is only 5 to 10 machine cycles, so adding roughly seven cycles per block means that, exclusive of function entry and exit overhead, Quantify's counting insertion slows such a function on average by about a factor of two. This contrasts with function granularity, which slows the same code by only about 20 percent.
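As a rough check of that factor (the figures below simply reuse the approximate cycle counts quoted above; actual costs vary by machine and by code), take a basic block whose native cost is in the middle of the 5-to-10-cycle range:

      native cost per basic block     ~7 cycles
      added counter update            ~7 cycles
      instrumented cost               ~14 cycles
      slowdown factor                 14 / 7  =  about 2x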

To speed up data collection in an application that was compiled for debugging, without recompiling the code, use the -force-rebuild option together with the -collection-granularity option:

% quantify -force-rebuild \
      -collection-granularity=function cc ...