
DAQmx Read data to JSON file

I've got 9 channels being read by a 1Chan, NSamp (Waveform) instance of the DAQmx Read node.
I'm taking 5000 samples at 51.3 kS/s.
I would like to log the data to a JSON file, but I'm having trouble getting the output into a format like this:

[
    {
        "channel": "one",
        "data": [
            {
                "time(ms)": 0.000,
                "value": 12.2376
            },
            {
                "time(ms)": 0.0194,
                "value": 13.5832
            },
            ...
        ]
    },
    {
        "channel": "two",
        "data": [
            {
                "time(ms)": 0.000,
                "value": 12.2376
            },
            {
                "time(ms)": 0.0194,
                "value": 13.5832
            },
            ...
        ]
    },
    ...
]

The main issues are:
1. I can't extract the time of each sample; as I understand it, it isn't necessarily 1/51300 of a second per sample.
2. I can't think of the best way to convert the read output into something I can build a JSON object from.

Kind regards,
Henry

Message 1 of 6

DAQmx data will be hardware-timed and evenly sampled, which is why waveforms have a delta-t (dt) value rather than individual timestamps.
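To make that concrete, here is a minimal sketch of the idea in Python (rather than G, since a LabVIEW diagram can't be pasted as text): every sample time follows from dt alone. The 51.3 kS/s rate and 5000-sample count are taken from the original post.

# Minimal sketch: reconstructing per-sample times from dt, assuming a
# hardware-timed acquisition at 51.3 kS/s as described above.
sample_rate = 51300.0                  # Hz, from the task's sample clock
dt = 1.0 / sample_rate                 # seconds between samples (the waveform's dt)
n_samples = 5000

# time of sample i relative to the first sample, in milliseconds
times_ms = [i * dt * 1000.0 for i in range(n_samples)]

print(times_ms[:3])                    # [0.0, 0.01949..., 0.03898...]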

Message 2 of 6

@HenryHEXR wrote:

The main issues are:
1. I can't extract the time of each sample; as I understand it, it isn't necessarily 1/51300 of a second per sample.
2. I can't think of the best way to convert the read output into something I can build a JSON object from.


Assuming you are using hardware timing, which it appears you are, you can trust the sample rate to be accurate, at least to the degree you would care about.

 


@HenryHEXR wrote:

I've got 9 channels being read by a 1Chan, NSamp (Waveform) instance of the DAQmx Read node.


Why are you not putting all of the channels into a single task and doing only one read?  It would make things a lot simpler.
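Something like this is the idea, shown here with the nidaqmx Python API purely for illustration; the device/channel range "Dev1/ai0:8" is a placeholder. In LabVIEW it would be one task containing all nine channels read with a single DAQmx Read call.

# Rough sketch of the "single task, one read" idea using the nidaqmx
# Python API; "Dev1/ai0:8" is a placeholder device/channel range.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:8")    # all 9 channels in one task
    task.timing.cfg_samp_clk_timing(rate=51300,
                                    sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=5000)
    # one read returns a list of 9 lists, one inner list per channel
    data = task.read(number_of_samples_per_channel=5000)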

 

And why JSON?  This seems like a horribly inefficient format for storing your data.  I would lean towards a TDMS or tab-delimited text file, which is a lot more readable.
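If tab-delimited text does turn out to be acceptable, writing it is simple; the sketch below uses NumPy, and the file name, column headers, and placeholder data are just examples.

# Sketch of the tab-delimited alternative; `data` stands in for the
# 9 x 5000 list-of-lists returned by a multi-channel read.
import numpy as np

data = [[0.0] * 5000 for _ in range(9)]     # placeholder for the acquired samples
samples = np.array(data)                    # shape (9, 5000)
header = "\t".join(f"ch{i}" for i in range(samples.shape[0]))
np.savetxt("acquisition.txt", samples.T, delimiter="\t",
           header=header, comments="")      # one row per sample, one column per channel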


Message 3 of 6

Hello both,

Noted on the timing, thank you; I'll just derive it from the sample rate. In that case a better format would be:

[
    {
        "channel": "one",
        "data": [
            12.2376,
            13.5832,
            ...
        ]
    },
    {
        "channel": "two",
        "data": [
            12.2376,
            13.5832,
            ...
        ]
    },
    ...
]

in JSON, as I need to pipe the data into other software; in fact, it's quite likely I'll be making a POST request with the data as the body.
I am doing one read using one 1Chan, NSamp (Waveform) instance of the DAQmx Read node, so all the data is coming out of one node. I just don't know how to put it into the above (or similar) JSON format; roughly what I'm after is sketched below.
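(Sketched in Python only because it's easy to paste as text; the channel names, endpoint URL, and placeholder samples are not real, and in practice the values would come straight from the DAQmx read.)

# Rough Python sketch of the payload I have in mind; names, URL and
# sample values are placeholders.
import json
import requests

channel_names = ["one", "two", "three", "four", "five",
                 "six", "seven", "eight", "nine"]
data = [[12.2376, 13.5832] for _ in channel_names]    # placeholder samples per channel

payload = [
    {"channel": name, "data": list(samples)}
    for name, samples in zip(channel_names, data)
]

with open("acquisition.json", "w") as f:              # log to a JSON file...
    json.dump(payload, f, indent=4)

requests.post("http://example.com/ingest", json=payload)   # ...or POST it as the body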

Kind regards,
Henry

Message 4 of 6

In a single task, use the N Ch, N Sample version of DAQmx Read to read all the channels at once. You can then write the data to a cluster as shown. Note: I wasn't able to flatten any data structure that included the timestamp. Apparently, the Flatten To JSON function does not like timestamps - who knew?

 

[Image: Flattening Multiple Channels to JSON.png]

 

This code produces the JSON string as shown below:

 

[Image: dsbNI_0-1629778017312.png]

 

Doug
NI Sound and Vibration
Message 5 of 6

Hi Doug,

I was able to get the timestamp value to flatten to JSON by first converting it to a string. When flattening to XML I could see that timestamp values in LabVIEW aren't a simple key:value pair, so Flatten To JSON struggles with them.
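For reference, the same workaround sketched outside LabVIEW: format t0 as a plain string and it serializes without complaint. The field names and the stand-in t0 below are just examples.

# Sketch of the timestamp workaround: turn t0 into a string before
# building the JSON. Field names and the stand-in t0 are examples only.
import json
from datetime import datetime, timezone

t0 = datetime.now(timezone.utc)             # stand-in for the waveform's t0
dt = 1.0 / 51300.0

record = {
    "channel": "one",
    "t0": t0.isoformat(),                   # a plain string, so it flattens cleanly
    "dt(s)": dt,
    "data": [12.2376, 13.5832],
}
print(json.dumps(record, indent=4))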

H

Message 6 of 6