

sampling rate vs. # of points written to file (RMS)

We are sampling three phases of AC voltage (approximately ±181 V peak per phase) on a generator running at 60 Hz. Each phase goes through the Average DC/RMS VI to calculate the RMS value of the AC signal. This value is approximately 128 Vrms when we read each phase on a voltmeter.
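As a sanity check on those numbers, the RMS of an ideal sine is its peak divided by the square root of 2, so 181 V peak gives about 128 Vrms, which matches the voltmeter. A quick Python sketch of the same calculation (the rate, frequency, and amplitude come from the description above; this is only an illustration, not our LabVIEW program):

import numpy as np

fs = 150.0        # sampling rate (samples/s), from the post
f_line = 60.0     # generator line frequency (Hz)
v_peak = 181.0    # approximate peak voltage per phase

t = np.arange(0.0, 1.0, 1.0 / fs)              # one second of sample times
v = v_peak * np.sin(2.0 * np.pi * f_line * t)  # idealized single phase

v_rms = np.sqrt(np.mean(v ** 2))               # RMS by definition
print(v_rms)                                   # ~128, i.e. v_peak / sqrt(2)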

The program we are writing should read the voltage values sampled by the DAQ, graph them on a waveform graph, and write the data to an Excel file. We want our sampling rate to be 150 samples per second, and we want each of those values written to the Excel file. So basically, we are looking to sample the AC signal off the generator set, convert those values to RMS, graph RMS voltage versus time, and write the RMS voltage data to file.

When I run this portion of the application, we are able to sample an RMS voltage signal, which is displayed on the graph, but the number of data points being written to file is not equal to 150 points/second. It is usually much lower than that. If we run the test for 10 seconds, we would expect to find 1500 rows of data points in the file. The Excel file has four columns: column 1 is the voltage for phase 1, column 2 is the voltage for phase 2, column 3 is the voltage for phase 3, and column 4 should be the time (in seconds from the beginning of sampling at t = 0 seconds) corresponding to each data point.

As a side note, column 4 is derived from the Get Waveform Time Array.vi, and at low sampling rates it writes a number with around 15 digits to the Excel file. At higher sampling rates, the time data written to column 4 seems to be correct (3.01 seconds, 3.02 seconds, etc.). We have absolutely no explanation for this phenomenon.
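For reference, here is what we expect the row count and the relative time column to look like, sketched in Python (the numbers are from the description above; our actual code is LabVIEW):

fs = 150.0                   # sampling rate (samples/s)
duration = 10.0              # test length (s)
n_rows = int(fs * duration)  # 1500 rows expected in the file

# Relative time column: sample index / rate, starting at t = 0 s.
t = [i / fs for i in range(n_rows)]
print(n_rows, t[0], t[1], t[-1])   # 1500 0.0 0.00667 9.99333 (approx.)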

If you have the time, please let me know if you can find any reason why the number of data points written to file does not equal the sampling rate. Ideally, we would like the scan rate (set at 150) to equal the number of data points written to the file per second, so the user knows that if they increase the sampling rate to 200, they will get 200 points per second in the file. This problem is severely hindering our progress on this project and has been a burden for some time now.

We are using an SCXI-1000 chassis, an SCXI-1120 module, and an SCXI-1327 terminal block. Our DAQ card is a 6040E.
Message 1 of 2
Some thoughts.

If you save n points, then n points should be saved. Where are they being
lost? How are you saving them? When data is lost, what does the timestamp
information reveal: is only the initial data present, only the final data, or
is the missing data evenly or randomly distributed?
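For example, you could look at the spacing of the saved time column to see how the losses are distributed. A rough Python sketch, assuming a tab-delimited file with the time (in seconds) in the fourth column (the filename and layout are just guesses at your format):

import numpy as np

# Assumed: tab-delimited file, time column (seconds) is the fourth.
data = np.loadtxt("rms_log.txt", delimiter="\t")
t = data[:, 3]
dt = np.diff(t)

expected = 1.0 / 150.0                      # nominal spacing at 150 S/s
gaps = np.flatnonzero(dt > 1.5 * expected)  # indices where samples went missing
print(f"{len(t)} rows, {len(gaps)} gaps, largest gap = {dt.max():.4f} s")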

Are you saving straight to Excel via ActiveX? If so, are you re-opening the
reference to Excel, or even re-opening the spreadsheet, on each iteration?
That would cause a major speed loss in the data saving.

If you have to save straight to Excel, consider saving a second's worth of
data at a time by sending a 2D array of data to Excel. Each data transfer
has some overhead, and transferring individual numbers in many operations is
significantly slower than transferring the same data as a 2D array of
numbers.
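Here is a rough sketch of the batching idea in Python (the acquisition function below is a stand-in for your DAQ/RMS pipeline, not a real API; the point is the once-per-second bulk write):

import csv

FS = 150  # samples per second

def acquire_rms_row(i):
    """Placeholder for the DAQ/RMS pipeline; returns [v1, v2, v3, t]."""
    return [0.0, 0.0, 0.0, i / FS]

with open("rms_log.txt", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    block = []
    for i in range(10 * FS):          # 10 s of data
        block.append(acquire_rms_row(i))
        if len(block) == FS:          # flush once per second ...
            writer.writerows(block)   # ... as one bulk transfer of 150 rows
            block.clear()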

If you don't have to save to Excel, consider saving to a simpler file on
disk and importing the data into Excel later. If you save a "spreadsheet" file
(tab-delimited text), you can import it into Excel manually as text. It's
probably not worth the extra hassle of saving raw binary data at these
sampling frequencies. If this is the technique you're using, make sure you're
not overwriting the file with new data; the timestamps will reveal it if you are.
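To illustrate the overwrite pitfall in a Python sketch (the filename and voltage values are made up): opening the file in write mode inside the loop truncates it each time, so only the last block survives, while append mode keeps everything.

# "w" truncates the file on every open, so opening it inside the loop
# would keep only the last block of data; "a" appends.
for i in range(3):
    with open("rms_log.txt", "a") as f:    # "a", not "w", inside a loop
        f.write(f"128.01\t127.95\t128.10\t{i / 150.0:.4f}\n")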

Posting the program somewhere might be useful; your message is pretty vague
about what you're doing. Whether anyone will have time to take a look is of
course another matter! 🙂

Message 2 of 2