Calculating variance and standard deviation in a single pass
I have a very large network trace file with two timestamps on each
packet. I calculate the difference between the timestamps for each pair of
consecutive packets:
delta_ts1 = ts1(packet N) - ts1(packet N-1)
delta_ts2 = ts2(packet N) - ts2(packet N-1)
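The per-pair delta computation above could be sketched as follows. This is an illustrative sketch only: the `packets` list of `(ts1, ts2)` tuples and the `timestamp_deltas` name are assumptions, not part of the original trace-file format.

```python
def timestamp_deltas(packets):
    # packets: assumed list of (ts1, ts2) tuples, one per packet, in order.
    # Returns (delta_ts1, delta_ts2) for each consecutive pair:
    #   delta_ts = ts(packet N) - ts(packet N-1)
    deltas = []
    for (prev_ts1, prev_ts2), (ts1, ts2) in zip(packets, packets[1:]):
        deltas.append((ts1 - prev_ts1, ts2 - prev_ts2))
    return deltas
```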
Assume ts2 is the reference value and I want to test ts1 against ts2.
The variance is then ts_variance = sum((delta_ts2 - mean_ts)^2) / packet_count,
where mean_ts is the mean of delta_ts2 over the whole file and the sum runs
over all consecutive packet pairs.
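This is the standard two-pass computation: one pass for the mean, a second for the squared deviations. A minimal sketch (the function name and plain-list input are my assumptions):

```python
def two_pass_variance(values):
    # First pass: mean of the deltas.
    n = len(values)
    mean = sum(values) / n
    # Second pass: average squared deviation (population variance).
    return sum((v - mean) ** 2 for v in values) / n
```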
Now the problem with the above approach is that I don't get the mean until
I reach the end of the file. I want to achieve this in a single pass. I am
thinking of using an approach like the one below:
running_mean_till_now += delta_ts2/packet_count_till_now
ts_variance = (delta_ts2 - running_mean_till_now)^2/packet_count_till_now
Is this approach acceptable? How accurate will the estimated variance, and
hence the standard deviation, be with this approach?
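For comparison, a well-known single-pass method that computes the mean and variance exactly (up to floating-point rounding) is Welford's online algorithm. This is not the proposal above but a standard alternative; a minimal sketch, assuming the deltas arrive as an iterable of floats:

```python
import math

def welford(values):
    # Welford's online algorithm: exact single-pass mean and variance.
    mean = 0.0
    m2 = 0.0   # running sum of squared deviations from the current mean
    n = 0
    for x in values:
        n += 1
        delta = x - mean
        mean += delta / n          # update the running mean
        m2 += delta * (x - mean)   # uses the mean both before and after update
    variance = m2 / n              # population variance; use m2/(n-1) for sample
    return mean, variance, math.sqrt(variance)
```

Because it updates the mean incrementally and accumulates deviations against it, this avoids both the second pass and the numerical-cancellation problems of the naive sum-of-squares shortcut.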