Bit latency for digital output via FIO0-FIO7

KPNL

Hello,

In my electroencephalography (EEG) lab we use the Python-based PsychoPy software (http://www.psychopy.org/) and the LabJack U3 DIO device to send event markers to a digital acquisition channel of the EEG system. While testing this setup with a photodiode and an oscilloscope, I observed a latency of approximately 400 µs between each bit being set. So sending “00000011” takes slightly longer than sending “00000001”, because the least significant bit is set high, then the second-least-significant bit, and so on. This can add up: sending a code of “255” can take more than 3 ms longer than sending a code of “1”, which creates a potential confound when contrasting the peak latencies of EEG trials carrying these codes.

This also creates a problem because the acquisition device samples faster than the LabJack U3 updates its digital lines. For example, sending a code of “00000011” results in the LabJack setting the least significant bit high and then the next bit high. There is a latency of ~400 µs between these events, and my EEG system samples at 2048 Hz (488 µs/sample), so the system records a “00000001” immediately followed by a “00000011”. The “00000011” is then immediately followed by a “00000010”, presumably because the code “0” (which I send after the event code) results in the LabJack clearing the bits in the same sequential fashion. In short, having the LabJack send a “3” followed by a “0” results in the following being recorded on the digital channel of the BioSemi: 00000000 => 00000001 => 00000011 => 00000010 => 00000000.

Is this the normal latency? Is there any way to set the whole byte simultaneously, so that it changes within a single 488 µs sample?

Apologies if any of this is unclear or if I've omitted diagnostic info (I'm a bit out of my depth), but any help would be greatly appreciated.

Thanks in advance!

LabJack Support

As for whether your 400 µs delay is expected, I would have to see which LabJack calls are used to send 00000001 versus 00000011. Can you provide more details about that?

As for a solution, you want to do a PORT write with the iotype LJ_ioPUT_DIGITAL_PORT:

https://labjack.com/support/datasheets/u3/high-level-driver/example-pseu...

That is the high-level Windows iotype, and it uses the low-level PortStateWrite function:

https://labjack.com/support/datasheets/u3/low-level-function-reference/f...

That documentation, in Section 5.2.5.10, points out that the firmware has to use 9 different instructions to update the 20 digital outputs, but all of that happens in less than 1 microsecond. Some groups of digital outputs, such as EIO0-5, are updated at the exact same instant, so they can be used when that is desirable.
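On the Python side, LabJackPython exposes that same low-level command through getFeedback. Here is a minimal sketch, assuming LabJackPython's u3 module, that the event code goes out on EIO0-EIO7, and that the lines still need to be configured as digital outputs (adjust the setup calls to your configuration):

    import u3  # LabJackPython's U3 module (uses the Exodriver on Mac/Linux)

    d = u3.U3()              # open the first U3 found
    d.configIO(EIOAnalog=0)  # assumption: the EIO lines should be digital
    d.getFeedback(u3.PortDirWrite(Direction=[0, 0xFF, 0],
                                  WriteMask=[0, 0xFF, 0]))    # EIO0-EIO7 as outputs

    # One PortStateWrite updates FIO, EIO, and CIO together. State is
    # [FIO, EIO, CIO]; WriteMask selects which bits are allowed to change.
    d.getFeedback(u3.PortStateWrite(State=[0, 0x03, 0],
                                    WriteMask=[0, 0xFF, 0]))  # EIO = 00000011
    d.getFeedback(u3.PortStateWrite(State=[0, 0x00, 0],
                                    WriteMask=[0, 0xFF, 0]))  # back to 00000000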

Someone else here can provide more Python-specific help if needed.

KPNL

Hi,

I should have mentioned in my initial post that I'm using a Mac (OS X Yosemite). The relevant Python code (I think/hope) from the PsychoPy software is copy/pasted below. I'm afraid I do not know Python, so please let me know if it was something else you were asking to see. Thanks for your help!

    # Part of the PsychoPy library
    # Copyright (C) 2015 Jonathan Peirce
    # Distributed under the terms of the GNU General Public License (GPL).

    """This provides a basic ButtonBox class, and imports the `ioLab python library
        <http://github.com/ioLab/python-ioLabs>`_.
    """
    #  This file can't be named ioLabs.py, otherwise "import ioLabs" doesn't work.
    #  And iolabs.py (lowercase) did not solve it either, something is case insensitive somewhere

    from __future__ import division
    from psychopy import core, event, logging

    try:
        from labjack import u3
    except ImportError:
        import u3  # Could not load the Exodriver driver "dlopen(liblabjackusb.dylib, 6): image not found"

    class U3(u3.U3):
        def setData(self, byte, endian='big', address=6008):
            """Write 1 byte of data to the U3 port

            parameters:

                - byte: the value to write (must be an integer 0:255)
                - endian: ['big' or 'small'] determines whether the first pin is the least significant bit or most significant bit
                - address: the memory address to send the byte to
                    - 6008 = EIO (the DB15 connector)
            """
            if endian == 'big':
                byteStr = '{0:08b}'.format(byte)[-1::-1]
            else:
                byteStr = '{0:08b}'.format(byte)
            [self.writeRegister(address + pin, int(entry))
                for (pin, entry) in enumerate(byteStr)]

LabJack Support

From the amount of time you mentioned setting digital I/O takes, it sounds like you are setting lines individually, and only the lines whose bits are 1. The setData method in the code you sent shouldn't do that; it sets 8 consecutive digital lines individually regardless of whether each bit is a 1 or a 0, so I don't see how the code provided causes your issue. How is setData being used, and what other calls are being performed?

A modified version of setData that sets 8 lines (FIO/EIO/CIO) in one call would look something like this:

    #Only showing the setData portion of the code
    def setData(self, byte, endian='big', address=6701):
        """Write 1 byte of data to the U3 port

        parameters:

            - byte: the value to write (must be an integer 0:255)
            - endian: ['big' or 'small'] (ignored in this version) determines whether the first pin is the least significant bit or most significant bit
            - address: the memory address to send the byte to
                - 6700 = FIO
                - 6701 (default) = EIO (the DB15 connector)
                - 6702 = CIO
        """
        # The upper byte is the write mask, and the lower byte holds the 8
        # lines/bits to set. Bit 0 = line 0, bit 1 = line 1, etc.
        self.writeRegister(address, 0xFF00 + (byte & 0xFF))
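For example, a quick usage sketch (assuming the U3 subclass above; u3.U3 opens the first device found by default):

    d = U3()      # u3.U3.__init__ auto-opens the first U3 found
    d.setData(3)  # all 8 EIO bits update in one register write
    d.setData(0)  # clearing the port is also a single write
    d.close()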

KPNL

Thank you!!

That seems to have done the trick. The full byte is now being set instantaneously, or at least as close to instantaneous as I need: sampling at 2048 Hz, I can now see the full byte acquired within a single ~488 µs sample.