Multifunction DAQ


nidaqmx AnalogUnscaledReader data type as unsigned int



I'm writing a Python script to read "raw" AI data (in continuous mode). Right now I'm using a PCI-6361 board. I got it to work using the AnalogUnscaledReader. However, through trial and error, I found out that the data is int16. I have two questions.

First, how can I automatically detect in code that the data received from the card is signed int16? On the (opened) AIChannel, I found ai_raw_samp_size = 16, and I assume that's how I'd know it's 16 bits (and not 32 bits). However, I cannot find any indication of whether it's signed or unsigned. Is there an attribute for that?


Next, is there a way to request a different format from the card? I would actually prefer to read the data as uint16 (because that's the format I have to save it in, so it would save a conversion step).


For reference, here is my current code (which works fine, but it's stuck reading int16s):




import logging
import queue
import threading
from typing import List

import numpy
import nidaqmx
from nidaqmx.constants import AcquisitionType
from nidaqmx.stream_readers import AnalogUnscaledReader

DEVICE = "Dev1"

def read_ai_cont(channels: List[int], sample_rate: float,
                 should_stop: threading.Event, q: queue.Queue):

    buffer_size = min(1_000_000, int(sample_rate * 0.1))  # Enough for 0.1 s
    # Generate the channel names: a comma-separated list of [device]/[channel]
    # ex: "Dev1/ai0,Dev1/ai4"
    channel_names = ",".join(f"{DEVICE}/ai{n}" for n in channels)
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan(channel_names)
        task.timing.cfg_samp_clk_timing(
            sample_rate,
            sample_mode=AcquisitionType.CONTINUOUS,
            samps_per_chan=buffer_size,  # In continuous mode, indicates the size of the buffer
        )
        reader = AnalogUnscaledReader(task.in_stream)
        while not should_stop.is_set():
            # TODO how do we know it's int16? (and not uint16 or int32?)
            # TODO how to change the device to read as uint?
            data = numpy.empty((len(channels), buffer_size), dtype=numpy.int16)
            data[:] = 0
            reader.read_int16(data, number_of_samples_per_channel=buffer_size)
            q.put(data)
            logging.debug("Acquired one buffer")





Message 1 of 4
Accepted by topic author pieleric

According to the source of nidaqmx._task_modules.in_stream (NI-DAQmx Python API 0.7 documentation), since you specify an input range that spans negative to positive voltages, the device returns signed integers.
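Based on that heuristic, you could pick the numpy dtype from the channel attributes you already found. The helper below is just a sketch (the function name is mine, not part of nidaqmx), but the two attributes it reads, AIChannel.ai_raw_samp_size and AIChannel.ai_rng_low, do exist in the Python API:

```python
import numpy

def raw_dtype(raw_samp_size: int, rng_low: float) -> numpy.dtype:
    """Guess the numpy dtype of the board's raw AI samples.

    raw_samp_size: bits per raw sample (AIChannel.ai_raw_samp_size)
    rng_low: lower bound of the configured input range (AIChannel.ai_rng_low);
             a negative lower bound implies signed (two's-complement) data.
    """
    signed = rng_low < 0
    if raw_samp_size <= 16:
        return numpy.dtype(numpy.int16 if signed else numpy.uint16)
    return numpy.dtype(numpy.int32 if signed else numpy.uint32)

# On an open task, e.g. with a PCI-6361 channel configured for -10..+10 V:
#   chan = task.ai_channels[0]
#   dtype = raw_dtype(chan.ai_raw_samp_size, chan.ai_rng_low)  # -> int16
```

This way the script can allocate the read buffer with the right dtype instead of hard-coding numpy.int16.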



Message 2 of 4

Thanks a lot for pointing to that code. That seems to explain it.

I've looked further, but I still couldn't find any way to force the DAQ board to return a different type. (I'd like to get unsigned, even though the board's voltage range is actually -5 V to +5 V.)


Also, do you know if it's the same for the AO raw type? There I care less (I can generate whatever pleases the board), but it would be handy to know in advance instead of resorting to trial and error!

Message 3 of 4

I've looked further, but still couldn't find any way to force the DAQ board to return a different type?

I believe you cannot change this behaviour. You can post-process the data and add an offset to convert the int16 values to uint16, but you have to remember that you made the change.
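As a sketch of that post-processing step (the function name is mine): adding 2**15 to two's-complement int16 samples is the same as flipping the sign bit, which numpy can do in place on a reinterpreted view without any widening:

```python
import numpy

def int16_to_uint16_offset(data: numpy.ndarray) -> numpy.ndarray:
    """Map signed int16 samples to offset-binary uint16.

    Flipping the sign bit is equivalent to adding 2**15:
    -32768 -> 0, 0 -> 32768, 32767 -> 65535.
    """
    assert data.dtype == numpy.int16
    # view() reinterprets the same bytes as uint16; XOR flips the sign bit
    return data.view(numpy.uint16) ^ 0x8000

samples = numpy.array([-32768, -1, 0, 32767], dtype=numpy.int16)
print(int16_to_uint16_offset(samples))  # [0 32767 32768 65535]
```

Just remember, as noted above, that the saved values are now offset binary, so anything reading them back has to subtract 32768 to recover volts via the usual scaling.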


Also, do you know if that's the same for the AO raw type?

I am not sure about this. Since both the ADC and the DAC on the PCIe-6361 are 16-bit, I would guess they use the same data type.

Message 4 of 4