Data streaming without clearing buffer and blocking the code using U3

WMarin

Hi there,

I have used National Instruments hardware and Matlab to run experiments that required retrieving data from the buffer at set intervals synced with the refresh rate of the monitor. 32-bit versions of Matlab had a function called PEEKDATA (Data Acquisition Toolbox) that allowed me to obtain the most recent data points in the buffer without clearing it. The values retrieved from the buffer were used to present online feedback to the user on the monitor screen. Say I had a 5-second data collection period: I could present the last 100 data points on every screen refresh and still keep the whole 5 seconds of data. I have been trying to replicate this using a U3-HV DAQ, but without much success. I am following the streamTest-threading.py example and managed to stream data to the command window and save it, but when I tried to integrate PsychoPy commands (controlling the visual stimulus) with the data call "result = sdr.data.get(True, 1)" (see streamTest-threading.py), the data collection goes on for much longer, as if that command blocks the code until a certain number of data points is available. This seems related to the acquisition rate: the greater the acquisition rate, the lower the lag.

Bottom line, my question is: is it possible to retrieve the 50 most recent data points online after data acquisition has started (say every 16.66 ms, the monitor refresh rate) without blocking the presentation of the stimulus on screen controlled by PsychoPy, and also without clearing the buffer on the U3-HV, so that the whole dataset can be analysed offline? Any advice would be appreciated.

Cheers,

W


LabJack Support

sdr.data is a Queue object:

https://docs.python.org/3.7/library/queue.html#queue.Queue

You can set the block parameter to False so the call doesn't block; an Empty exception will be raised when the queue is empty (no stream data available). Queue does not have a peek-like method, but it does have an empty() method that tells you whether data is available.
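
For instance, a minimal non-blocking sketch, assuming the example's sdr object is already streaming:

from queue import Empty  # Raised by a non-blocking get on an empty queue.

try:
    result = sdr.data.get(False)  # block=False: return immediately.
except Empty:
    result = None  # No stream data available yet.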

As for storing the samples, you can keep a list that stores all samples. You can retrieve the last 50 samples from the list and analyze all samples later. Here is a modified chunk of the streamTest-threading.py code that stores the samples to an overall list and then gets the last 50 samples:

        # Convert the raw bytes (result['result']) to voltage data.
        r = d.processStreamData(result['result'])
        
        # Add the latest voltage data to the overall AIN0 samples list (ain0).
        ain0.extend(r['AIN0'])
        
        # Get the last 50 AIN0 samples
        last50 = ain0[-50:]

Note that the above chunk of code would be in the example's "while True:" loop, and ain0 needs to be declared/initialized before the loop.
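
Put together, a simplified sketch of that arrangement, assuming the example's d and sdr objects and omitting the example's error handling:

ain0 = []  # Overall AIN0 samples list, declared before the loop.
while True:
    result = sdr.data.get(True, 1)  # Next chunk of raw stream data.

    # Convert the raw bytes (result['result']) to voltage data.
    r = d.processStreamData(result['result'])

    # Append the latest voltages to the overall list.
    ain0.extend(r['AIN0'])

    # The 50 most recent samples, for online feedback.
    last50 = ain0[-50:]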

WMarin

Thanks for the quick reply. Adding data points to a list declared outside the loop works great, thanks for that. However, changing the block parameter of sdr.data.get to False (as in result = sdr.data.get(False, 1)) crashed every single time I tried it. Perhaps I am missing something elsewhere in the code. I have uploaded a file in the hope you might be able to spot what is happening. I modified streamTest-threading.py a bit, so this might be the source of the error.

LabJack Support

By crash, do you mean you are getting an Empty exception? That would be expected for a "get" call when the Queue is empty, as documented in my previous link. Add exception handling to the "get" call to handle the case where no stream data is available yet (an empty Queue), which will happen at times since the "get" call is no longer waiting.

WMarin

Using "result = sdr.data.get(False, 1)", the error message is: "File "D:\Anaconda3\lib\queue.py", line 167, in get
    raise Empty"

I tried to follow some examples on how to handle the empty queue, but no luck so far. Is it something that I need to do when defining the StreamDataReader class or the readStreamData function? I typically use Matlab, and translating to Python doesn't come naturally yet, so any tips would be appreciated.

LabJack Support

The example does this to get StreamDataReader streaming:

sdr = StreamDataReader(d)
sdrThread = threading.Thread(target=sdr.readStreamData)

# Start the stream and begin loading the results into a Queue.
sdrThread.start()
The above creates a StreamDataReader object, then creates a Thread object, passing it the callable readStreamData method, and then starts the thread, which makes readStreamData run. After that, sdr.data.get can be used. Setting "sdr.finished = True" will stop readStreamData's streaming.
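
When you are done streaming, a minimal shutdown sketch; sdr.finished comes from the example, and the join call is standard Python threading usage:

# Signal readStreamData's loop to stop, then wait for the thread to exit.
sdr.finished = True
sdrThread.join()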

sdr.data.empty() returns True or False. When it returns False, data is available and you can then run your routine with sdr.data.get to retrieve the waiting stream data.

When using empty() or sdr.data.get with block set to False, there can be times when no data is available, and your code needs to handle that. In the case of the empty() method, that is an if/else condition; in the case of sdr.data.get raising an Empty exception, you handle the exception with something like:

import queue  # On Python 2, use "import Queue as queue".

try:
    result = sdr.data.get(False, 1)
    # Code for handling retrieved stream data goes here.
except queue.Empty:
    # No stream data available.
    # Code for handling when no stream data is available goes here.
    pass
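
The equivalent using the empty() method would be a sketch along these lines:

# Poll the queue without blocking; only read when data is waiting.
if not sdr.data.empty():
    result = sdr.data.get(False, 1)
    # Code for handling retrieved stream data goes here.
else:
    # No stream data available yet; carry on with the rest of the loop.
    pass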

If there is never stream data available, you will need to debug whether readStreamData is actually running its stream read loop.

WMarin

Great, that worked. I also blocked the code for 1 second before my first "sdr.data.get(False, 1)" call so some data would already be available in the buffer for the first frame of the animation.
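
Something like this sketch, using Python's time module:

import time

time.sleep(1.0)  # Let the stream buffer some data before the first get.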

Thanks heaps!