DTT time axis scaling becomes corrupt when the time series is long enough
Taken from bugzilla ticket 207
From Keita
Created attachment 10: sad plot
Take a long time series (say 300 seconds) of a 32 kHz channel using the DTT "triggered time response" measurement.
In DTT the time of the last data point appears larger than it should be, and if you export the time series to ASCII it certainly is.
The number of data points is correct, though. It seems the accumulated error in dt*N becomes bigger than dt when N is big.
From Jim Batch
I have confirmed this issue. A 400 second time series of a 16384 sample/sec channel results in final x values of:
399.99997 400.00003 400.00009 400.00015 400.00021
even though the total number of points collected indicates the axis should stop at 400. The numerical accuracy should be better than this.
Investigation is ongoing.
Again from Jim Batch
The function that generates the excitation points is calcAWGPeriodic() in awg.c, which is part of awgtpman, not DTT.