07-16-2020 03:41 AM
Hello:
I am building an application in which I read 4 channels from an NI 9234 and write the values into a 1024-element FIFO in order to transfer the data from the cRIO-9053 FPGA to the CPU.
On the CPU side I read that FIFO and write a CSV measurement file every 5 seconds.
If I set the acquisition module's sample rate to 51.2 kS/s and measure for 5 seconds, I should have 256,000 samples in the file, but when I open it I only find 94,000.
There are three VIs: the FPGA VI where I read the inputs, one on the cRIO's CPU, and a similar VI on a host PC.
I cannot figure out what is wrong with the code.
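For reference, the numbers in the post work out like this (a trivial Python check, no hardware involved); the ~63% shortfall is consistent with samples being dropped somewhere in the FPGA-to-CPU chain:

```python
# Sanity check of the numbers in the post: pure arithmetic, no hardware.
rate = 51200          # NI 9234 sample rate, samples/s per channel
duration = 5          # seconds per measurement file
expected = rate * duration
print(expected)       # 256000 samples per channel
actual = 94000        # samples actually found in the CSV
print(round(1 - actual / expected, 2))   # 0.63 -> ~63% of samples missing
```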
07-16-2020 04:43 AM
Why don't you stop using "Write to Measurement File"? It does too much work per call.
How to Write Data to File Faster in LabVIEW
Then, try these approaches:
High CPU Usage When Reading Data from Target-to-Host DMA FIFOs
Creating Deterministic Applications Using the Timed Loop (Real-Time Module)
07-16-2020 06:19 AM
Hello:
@Tepig wrote:
Why don't you stop using "Write to Measurement File"? It does too much work per call.
How to Write Data to File Faster in LabVIEW
I am using "Write to Measurement File" because I need to store the values.
What is the difference from what I am implementing?
Then, try these approaches:
High CPU Usage When Reading Data from Target-to-Host DMA FIFOs
Creating Deterministic Applications Using the Timed Loop (Real-Time Module)
I have read those links, but I cannot figure out how to write the values to a file that way.
07-16-2020 07:02 AM - edited 07-16-2020 07:03 AM
File I/O is SLOW, and Write to Measurement File makes it even slower since it is constantly opening and closing the file. So it is very likely you are getting FIFO overflows, causing data loss. My suggestions are as follows:
1. Do not use a CSV file. Converting the values to text is overhead you cannot afford at this stage. Instead, write to a TDMS file using the TDMS API.
2. Create/open the file only when you have to and close it when that file is complete.
3. Use a Producer/Consumer setup to do the logging in a separate loop.
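The Producer/Consumer pattern in suggestion 3 would be two LabVIEW loops passing data through a queue; the same idea can be sketched in a few lines of Python for illustration. The chunk count and size below are arbitrary stand-ins for the DMA FIFO reads:

```python
# Sketch of the Producer/Consumer logging pattern from suggestion 3,
# in Python for illustration only. Chunk count/size are arbitrary
# stand-ins for the DMA FIFO reads.
import os
import queue
import struct
import tempfile
import threading

def producer(q, n_chunks, chunk_size):
    # Stand-in for the acquisition loop reading the DMA FIFO.
    for i in range(n_chunks):
        q.put([float(i)] * chunk_size)
    q.put(None)  # sentinel: acquisition finished

def consumer(q, path):
    # Logging loop: open the file ONCE and keep it open (suggestion 2),
    # and write raw binary rather than formatted text (suggestion 1).
    with open(path, "wb") as f:
        while True:
            chunk = q.get()
            if chunk is None:
                break
            f.write(struct.pack(f"{len(chunk)}d", *chunk))

fd, log_path = tempfile.mkstemp(suffix=".bin")
os.close(fd)
q = queue.Queue()
threads = [threading.Thread(target=producer, args=(q, 100, 1024)),
           threading.Thread(target=consumer, args=(q, log_path))]
for t in threads:
    t.start()
for t in threads:
    t.join()
# log_path now holds 100 * 1024 float64 samples (819200 bytes)
```

Because only the consumer loop touches the disk, a slow write stalls the queue rather than the acquisition itself.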
07-16-2020 07:37 AM
@crossrulz wrote:
File I/O is SLOW, and Write to Measurement File makes it even slower since it is constantly opening and closing the file. So it is very likely you are getting FIFO overflows, causing data loss. My suggestions are as follows:
1. Do not use a CSV file. Converting the values to text is overhead you cannot afford at this stage. Instead, write to a TDMS file using the TDMS API.
2. Create/open the file only when you have to and close it when that file is complete.
3. Use a Producer/Consumer setup to do the logging in a separate loop.
I put the data in a CSV because I need to run different analyses once the file has been captured. I don't think TDMS is the most convenient format to handle with Python scripts.
As I said before, I need the complete data for 5 seconds; after 5 seconds I need a new file with new data.
That is why I use Write to Measurement File: once the time has elapsed, it sends a boolean to close the file and reset it.
Do you know a better way to do it?
Thank you
07-16-2020 08:14 AM
Looking a little closer, I see you are using the cRIO-9053 and not really anything special in the FPGA. So I would get rid of the FPGA and perform the readings with DAQmx. You can then use the DAQmx Configure Logging to save the data to a TDMS file. There are settings so that you can have DAQmx start a new file every X samples. This would GREATLY simplify your code.
Once the acquisition and logging is complete, you can run a small application to convert from the TDMS format into a csv to allow Python to parse it.
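As a sketch of what this looks like outside LabVIEW: the Python nidaqmx package exposes the same DAQmx logging feature that the DAQmx Configure Logging VI does. This is untested, hardware-dependent code; the channel string, rate, and file name are assumptions:

```python
# Hypothetical sketch of DAQmx TDMS logging via the Python nidaqmx
# package (the LabVIEW equivalent is the DAQmx Configure Logging VI).
# "Mod1/ai0:3" and the file name are assumptions; requires NI-DAQmx.
def log_to_tdms(channels="Mod1/ai0:3", rate=51200.0,
                samps_per_file=256000, path="acq.tdms"):
    # Imported inside the function so this file loads without NI-DAQmx.
    import nidaqmx
    from nidaqmx.constants import (AcquisitionType, LoggingMode,
                                   LoggingOperation)

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan(channels)
        task.timing.cfg_samp_clk_timing(
            rate, sample_mode=AcquisitionType.CONTINUOUS)
        # Stream straight to a TDMS file; DAQmx does the file I/O.
        task.in_stream.configure_logging(
            path, LoggingMode.LOG_AND_READ,
            operation=LoggingOperation.CREATE_OR_REPLACE)
        # Start a new TDMS file every N samples (5 s at 51.2 kS/s).
        task.in_stream.logging_samps_per_file = samps_per_file
        task.start()
        # Read one file's worth; DAQmx has also logged it to disk.
        task.read(number_of_samples_per_channel=samps_per_file)
```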
07-17-2020 05:33 AM
@crossrulz wrote:
Looking a little closer, I see you are using the cRIO-9053 and not really anything special in the FPGA. So I would get rid of the FPGA and perform the readings with DAQmx. You can then use the DAQmx Configure Logging to save the data to a TDMS file. There are settings so that you can have DAQmx start a new file every X samples. This would GREATLY simplify your code.
Once the acquisition and logging is complete, you can run a small application to convert from the TDMS format into a csv to allow Python to parse it.
I have tried your suggestion, but I need to research more into how to convert the TDMS file for use with Python.
08-03-2020 10:17 AM
https://pypi.org/project/npTDMS/
I've been using the Python package in the link above to do data analysis on data stored in TDMS files without any issue. It's pretty quick; I was up and running with it in under 15 minutes.
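For anyone landing here later, a minimal npTDMS-based conversion might look like the sketch below. The group/channel layout of a TDMS file depends on whatever wrote it, so taking the first group is just an assumption for illustration:

```python
# Minimal sketch: dump the first group of a TDMS file to CSV with npTDMS.
# Assumes `pip install npTDMS`; group/channel layout depends on the writer.
def tdms_to_csv(tdms_path, csv_path):
    # Imported inside the function so this file loads without npTDMS.
    import csv
    from nptdms import TdmsFile

    tdms_file = TdmsFile.read(tdms_path)
    group = tdms_file.groups()[0]            # first group in the file
    channels = group.channels()
    with open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([ch.name for ch in channels])      # header row
        # Slicing a channel yields its data as a NumPy array.
        writer.writerows(zip(*(ch[:] for ch in channels)))
```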