A pulse accurate to 1.5% is good enough for this project.
The problem is that I can't get that close with software timing.
AddRequest(LJID, LJ_ioPUT_CONFIG, LJ_chSTREAM_WAIT_MODE, LJ_swNONE, 0, 0)
AddRequest(LJID, LJ_ioCLEAR_STREAM_CHANNELS, 0, 0, 0, 0)
AddRequest(LJID, LJ_ioADD_STREAM_CHANNEL_DIFF, 0, 0, 1, 0) ' AIN0-AIN1
AddRequest(LJID, LJ_ioADD_STREAM_CHANNEL_DIFF, 2, 0, 3, 0) ' AIN2-AIN3
AddRequest(LJID, LJ_ioADD_STREAM_CHANNEL_DIFF, 4, 0, 5, 0) ' AIN4-AIN5
AddRequest(LJID, LJ_ioADD_STREAM_CHANNEL_DIFF, 6, 0, 7, 0) ' AIN6-AIN7
AddRequest(LJID, LJ_ioADD_STREAM_CHANNEL, 193, 0, 0, 0) ' EIO/CIO states, to catch when the digital channel is actually on
GoOne(LJID) ' execute the queued requests
eGet(LJID, LJ_ioSTART_STREAM, 0, dblValue, 0) ' start streaming
' Method 1: turn on the device under measurement
Threading.Thread.Sleep(20) ' 20 ms settling delay
ePut(LJID, LJ_ioPUT_DIGITAL_BIT, 8, 1, 0) ' EIO0 (digital channel 8): turn on source
Threading.Thread.Sleep(intPulseWidth) ' wait intPulseWidth = 100 ms
ePut(LJID, LJ_ioPUT_DIGITAL_BIT, 8, 0, 0) ' turn off
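(For context on the sleep granularity I'm seeing: Thread.Sleep only guarantees a *minimum* delay, and on Windows the default scheduler tick is roughly 15.6 ms, which would line up with the behaviour below 15 ms. A quick Python sketch, nothing LabJack-specific, shows the same effect by measuring the overshoot directly:)

```python
import time

def sleep_overshoot_ms(requested_ms, trials=20):
    """Measure how far time.sleep() overshoots the requested delay.
    Sleep calls guarantee only a minimum; the scheduler tick sets the error."""
    overshoots = []
    for _ in range(trials):
        start = time.perf_counter()
        time.sleep(requested_ms / 1000.0)
        elapsed_ms = (time.perf_counter() - start) * 1000.0
        overshoots.append(elapsed_ms - requested_ms)
    return overshoots

if __name__ == "__main__":
    results = sleep_overshoot_ms(100)
    print(f"min {min(results):.2f} ms, max {max(results):.2f} ms overshoot")
```

On a default Windows timer tick the worst-case overshoot can be a full tick (~15 ms), which is the same order as the pulse-width error I'm fighting.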
' Method 2: turn on the device under measurement (no 20 ms pause this time)
AddRequest(LJID, LJ_ioPUT_TIMER_MODE, 0, LJ_tmPWM16, 0, 0) ' set up for ~5 Hz: 48 MHz timer clock with divisor = 148
AddRequest(LJID, LJ_ioPUT_TIMER_VALUE, 0, 32768, 0, 0) ' 50% duty cycle gives the ~100 ms pulse
AddRequest(LJID, LJ_ioPUT_TIMER_MODE, 1, LJ_tmTIMERSTOP, 0, 0)
AddRequest(LJID, LJ_ioPUT_TIMER_VALUE, 1, 1, 0, 0) ' stop after one pulse
GoOne(LJID) ' execute the queued requests
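(Sanity-checking my timer numbers with the PWM16 arithmetic from the U3 datasheet, frequency = clock / (divisor × 65536): with the 48 MHz clock and divisor 148 the output is actually ~4.95 Hz, so the 50% high time is closer to 101 ms than 100 ms; a divisor of 146 would land nearer 5 Hz. Worked in Python:)

```python
CLOCK_HZ = 48_000_000   # U3 timer clock (48 MHz base with divisor enabled)
DIVISOR = 148
TIMER_VALUE = 32768     # PWM16 duty cycle = (65536 - value) / 65536

period_s = 65536 * DIVISOR / CLOCK_HZ
freq_hz = 1.0 / period_s
high_ms = period_s * (65536 - TIMER_VALUE) / 65536 * 1000.0

print(f"{freq_hz:.3f} Hz, high time {high_ms:.2f} ms")
# prints: 4.949 Hz, high time 101.03 ms
```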
' get the data
eGet(LJID, LJ_ioGET_STREAM_DATA, LJ_chALL_CHANNELS, numScans, dblData) ' retrieve data
eGet(LJID, LJ_ioSTOP_STREAM, 0, dblValue, 0)
' With method 1, everything looks okay: the trigger event happens around 20 ms into the stream
' (small changes in this delay don't seem to affect it until you go below 15 or above 23 ms),
' but the event lasts 108 ms. I tried reducing the Sleep time, and nothing changed until around
' 93 ms, when the pulse width dropped to 92 ms.
Okay - some strange behaviour with the Threading.Thread.Sleep method.
' With method 2, I now get a fairly accurate 100 ms pulse (<1 ms error), but I can't seem to get it to
' start at a consistent place. Sometimes it starts immediately; other times it waits as much as 170 ms
' to get started (no discernible pattern to the delay is apparent).
I don't do any data collection from the buffer during the stream, since my collection times are quite short and I will never collect
more than 100 scans (500 samples across the 5 channels), so I shouldn't overrun the U3 buffer.
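(Putting numbers on that claim — simple arithmetic; the buffer size used for comparison is only an assumption, since the driver-side buffer is configurable via LJ_chSTREAM_BUFFER_SIZE and I haven't shown my setting:)

```python
scans = 100         # worst-case scans per collection
channels = 5        # 4 differential AIN pairs + channel 193 (digital states)
samples = scans * channels

# Hypothetical buffer of 4096 samples for comparison; the real capacity
# depends on the LJ_chSTREAM_BUFFER_SIZE configuration in use.
BUFFER_SAMPLES = 4096
print(samples, samples <= BUFFER_SAMPLES)  # 500 True
```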
Why is getting the start with the TimerStop mode method so inaccurate?
Have I done something wrong that you can see?
Not sure if the strange Sleep behaviour is a LabJack result or something in my system parameters.
Does the automatic transfer of data from the U3 to the driver interfere with the execution of events?
(If so, is there a way to hold off the transfer until requested? I never collect large quantities of data, but I
do want to take short-interval measurements.)
I suppose I could put the Timer requests into the stream setup and execute everything as one GoOne,
but I'd rather have a delay between the stream start and the trigger start.
Any suggestions as to what I'm doing wrong or as to a better way?