(This is more of an MRTG question, and is dealt with in more detail in the MRTG documentation.)
In RRD, each RRA is defined by a number of data points and a consolidation function (such as AVERAGE or MAX).
If your sample interval is 5 minutes, then your Daily Average line is drawn from an RRA defined as 1 data point, AVERAGE -- i.e., the sampled data itself.
The weekly graph uses 6 data points, AVERAGE (and the purple Max line is 6dp, MAX). This means that what you are seeing is the average over 6 x 5 min -- i.e., the 30-minute average. Similarly, the monthly graph gives a 2-hour average and the yearly graph a 1-day average.
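To make that concrete, here is a sketch of an `rrdtool create` command laying out those RRAs for a 5-minute step. The filename, DS name, and row counts are illustrative (check your own file with `rrdtool info`); the RRA column of interest is the fourth field -- 1, 6, 24, and 288 steps per consolidated point:

```shell
# Illustrative only: RRAs for daily (1dp), weekly (6dp = 30 min),
# monthly (24dp = 2 hr) and yearly (288dp = 1 day) graphs.
rrdtool create traffic.rrd --step 300 \
    DS:ds0:COUNTER:600:0:U \
    RRA:AVERAGE:0.5:1:600 \
    RRA:AVERAGE:0.5:6:700 \
    RRA:AVERAGE:0.5:24:775 \
    RRA:AVERAGE:0.5:288:797 \
    RRA:MAX:0.5:1:600 \
    RRA:MAX:0.5:6:700 \
    RRA:MAX:0.5:24:775 \
    RRA:MAX:0.5:288:797
```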
Where it gets a bit more complex is the Max lines. These show the maximum *data point* value, so the yearly Max line actually shows the maximum 5-min average over the period of a day, since a data point is 5 min. If your sample interval is 1 minute then things are similar, but with five times as many data points (the 6-hour graph is the 1dp graph).
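A quick sketch (my own illustration, not RRDtool code) of how one 6-point consolidation turns six 5-min samples into a single 30-min value, and why the Average and Max RRAs diverge:

```python
# Six 5-min primary data points (e.g. Mbit/s), one of them a spike.
samples = [10, 80, 20, 15, 25, 30]

avg_cdp = sum(samples) / len(samples)  # value stored in the AVERAGE RRA
max_cdp = max(samples)                 # value stored in the MAX RRA

print(avg_cdp)  # 30.0 -> the weekly Average line smooths the spike away
print(max_cdp)  # 80   -> the purple Max line keeps the 5-min peak
```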
People with a mathematical bent will by now have realised why the 95th percentile calculations are necessarily inaccurate, and become more so as the timeframe widens. This is again explained mathematically in the routers2 documentation.
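A toy illustration of that effect (my own sketch, using a simple nearest-rank percentile): once the data has been averaged into wider buckets, the peaks that drive the 95th percentile are smoothed away, so the percentile computed from a consolidated RRA comes out lower than the one computed from the raw samples:

```python
import math

def p95(xs):
    """Nearest-rank 95th percentile."""
    xs = sorted(xs)
    return xs[math.ceil(0.95 * len(xs)) - 1]

# 24 raw 5-min samples: a 100 Mbit/s spike in every 30-min window.
raw = [10, 10, 10, 10, 10, 100] * 4

# The weekly RRA sees only the 6-point (30-min) averages.
consolidated = [sum(raw[i:i + 6]) / 6 for i in range(0, len(raw), 6)]

print(p95(raw))           # 100  (raw data catches the spikes)
print(p95(consolidated))  # 25.0 (averaged data has lost them)
```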
So, in summary, the sampling period is still 5 min, but RRD collects the required number of data points and consolidates them to obtain the value that is stored. The XFF value, which defaults to 0.5, is the proportion of data points in the set that may be UNKNOWN while the consolidated value is still considered valid; if more than that are missing, an UNKNOWN is stored instead.
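The XFF rule can be sketched like this (again my own illustration, with `None` standing in for UNKNOWN primary data points):

```python
def consolidate(samples, xff=0.5):
    """Average a set of primary data points, honouring the XFF rule:
    if more than xff of the points are UNKNOWN, store UNKNOWN."""
    known = [s for s in samples if s is not None]
    unknown_fraction = 1 - len(known) / len(samples)
    if unknown_fraction > xff:
        return None                  # too many unknowns: store UNKNOWN
    return sum(known) / len(known)   # average of the known points

print(consolidate([10, 20, None, 40, None, 50]))      # 30.0 (2/6 unknown, OK)
print(consolidate([10, None, None, None, None, 50]))  # None (4/6 unknown)
```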
_________________
Steve Shipway
UNIX Systems, ITSS, University of Auckland, NZ
Woe unto them that rise up early in the morning... -- Isaiah 5:11