As of July 17, 2015, the LabJack forums here at forums.labjack.com are shut down. New registrations, topics, and replies are disabled. All forums are in a read-only state for archive purposes.

Please visit our current forums at labjack.com/forums to view and make new posts. To post on the current forums, use your labjack.com login account. Your old LabJack forums login credentials have been retired. There are no longer separate logins for labjack.com and LabJack forums.



timestamps when streaming

Tags: timing, time, timestamps, stream

3 replies to this topic

#1 mikelwrnc

  • Members
  • 17 posts

Posted 11 August 2014 - 12:25 PM

I'm streaming data from an analog photocell (as in http://labjack.com/s...ingle-ended-AIN ) and need to apply timestamps to the resulting samples. So I have code like:

 

import u3, time
freq = 10000  #scan frequency in Hz
d = u3.U3()
d.configU3()
d.getCalibrationData()
d.configAnalog(u3.FIO0)  #configure FIO0 as an analog input (AIN0)
#stream one single-ended channel (AIN0) at 10 kHz
d.streamConfig( NumChannels = 1, PChannels = [ 0 ], NChannels = [ 31 ], Resolution = 1, ScanFrequency = freq )
d.streamStart()
for r in d.streamData():
	now = time.time()  #time when this block of samples arrives in Python
	if r is not None:
		data = r['AIN0']
		#interpolate backwards from "now" to estimate each sample's timestamp
		times = [ now - t*1.0/freq for t in reversed(range(len(data))) ]

That is, my first attempt was to grab the time at which Python finishes receiving a block of samples, then interpolate backwards to estimate the time of each sample in that block. However, I'm occasionally getting results where the last entry from one block has a later timestamp (by up to 2 ms) than the first entry of the next block, which obviously shouldn't happen. I'm guessing there is variability in how long it takes for control to return to Python when reading sample data, so using time.time() to stamp the final entry of a given block is unreliable. Can anyone recommend a better method?



#2 LabJack Support

  • Admin
  • 8677 posts

Posted 11 August 2014 - 05:03 PM

Stream mode is hardware timed. On the U3, channels are continuously scanned at the configured scan frequency and buffered. streamData() gets the scanned samples from the U3's stream buffer. Stream mode is documented further here:

 

http://labjack.com/s...users-guide/3.2

 

So basically you shouldn't use your computer's timestamp inside your stream read loop for the scans' timestamps, since the samples come from a buffer and are not real time. Your timestamp calculation is close in that you do need to use the scan frequency, but the times should be counted forward from the start of the stream. For example, at 10000 scans per second the first three scan times would be [startTime, startTime+0.1ms, startTime+0.2ms], and so on for the lifetime of the stream. In your code that could look like (untested):

tbs = 1.0/freq #Time between scans in seconds
d.streamStart()
curTime = time.time() #Get a timestamp for our start time.
for r in d.streamData():
    if r is not None:
        data = r['AIN0']
        times = [ curTime + t*tbs for t in range(len(data)) ]
        curTime = times[-1] + tbs #The current time for our next set of stream scans.


#3 mikelwrnc

  • Members
  • 17 posts

Posted 12 August 2014 - 09:42 AM

Thanks! One note for posterity is that this can be tricky if using a Windows machine, on which time.time() has mere millisecond precision. I've actually devised a method of solving this by using time.clock(), which counts the time since its first call with microsecond precision, in combination with time.time(). That is, you can compute the offset between the high-precision time.clock() and the low-precision time.time(), then use time.clock() in place of time.time() to mark the time of stream start. Here's the code to estimate the offset:

import time
time.clock() #start the session counter

import numpy

offsets = []
waits = []

#collect a bunch of candidate offsets between time.time() and time.clock()
for i in range(1000):
	last = time.time()
	clockTime1 = time.clock()
	this = last
	#spin until time.time() ticks over to its next (low-precision) value
	while (this-last)==0:
		this = time.time()
		clockTime2 = time.clock()
	offsets.append( this - clockTime2 )
	waits.append(clockTime2-clockTime1)

waits = numpy.array(waits)
offsets = numpy.array(offsets)
#keep the offset from the iteration whose wait was closest to a full 1 ms tick
offset_estimate = offsets[numpy.argmin(numpy.absolute(waits-0.001))]
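
To actually use this, the idea is just to swap time.time() for time.clock() plus the estimated offset when stamping the start of the stream. A minimal sketch (assuming d, freq, and offset_estimate are already set up as in the posts above):

tbs = 1.0/freq #time between scans in seconds
d.streamStart()
curTime = time.clock() + offset_estimate #high-precision stamp on the time.time() scale
for r in d.streamData():
	if r is not None:
		data = r['AIN0']
		times = [ curTime + t*tbs for t in range(len(data)) ]
		curTime = times[-1] + tbs #start time for the next block of scans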

I've run a simulation on a Unix machine, on which time.time() has microsecond precision, to evaluate the accuracy of this estimate (code here: https://gist.github....00ca4f76255e65b), and it looks pretty good: about 90% of estimated offsets differ from the true offset by less than 0.5 ms, and about 70% differ by less than 0.1 ms.
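
In case that gist link dies, the simulation is roughly along these lines (a sketch, not the exact gist code; the 1 ms tick and the 0.12345 offset are just illustration values): fake a millisecond-resolution clock by shifting and truncating the machine's high-precision time.time(), run the same offset-estimation loop against it, and compare the estimate to the known offset.

import time
import numpy

true_offset = 0.12345 #known, arbitrary offset between the two simulated clocks

def coarse_time():
	#simulated low-precision clock: the real clock shifted by true_offset,
	#truncated to whole milliseconds (assumed 1 ms tick)
	return numpy.floor((time.time() + true_offset) * 1000.0) / 1000.0

offsets = []
waits = []
for i in range(1000):
	last = coarse_time()
	clockTime1 = time.time()
	this = last
	while (this - last) == 0:
		this = coarse_time()
		clockTime2 = time.time()
	offsets.append(this - clockTime2)
	waits.append(clockTime2 - clockTime1)

waits = numpy.array(waits)
offsets = numpy.array(offsets)
offset_estimate = offsets[numpy.argmin(numpy.absolute(waits - 0.001))]
print(abs(offset_estimate - true_offset)) #estimation error in seconds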



#4 mikelwrnc

  • Members
  • 17 posts

Posted 12 August 2014 - 12:24 PM

Oops. Turns out there's an easier solution, at least if you're using Windows 8, where you can call GetSystemTimePreciseAsFileTime (http://msdn.microsof...5(v=vs.85).aspx) via ctypes:

import ctypes
t = ctypes.c_int64()
ctypes.windll.Kernel32.GetSystemTimePreciseAsFileTime(ctypes.byref(t))
t = t.value/1e7 #convert to seconds since the FILETIME epoch (1601-01-01, UTC)
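
One thing to keep in mind: GetSystemTimePreciseAsFileTime counts 100-nanosecond intervals since January 1, 1601 (UTC), so the value above is seconds since that epoch rather than the Unix epoch that time.time() uses. To put it on the same scale as time.time(), subtract the fixed 1601-to-1970 difference, e.g.:

import ctypes, time
ft = ctypes.c_int64()
ctypes.windll.Kernel32.GetSystemTimePreciseAsFileTime(ctypes.byref(ft))
#FILETIME epoch (1601-01-01) to Unix epoch (1970-01-01) is 11644473600 seconds
unix_time = ft.value/1e7 - 11644473600
print(unix_time - time.time()) #should differ by no more than time.time()'s resolution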


