
LabVIEW


10 kS/s data for 28 channels

Solved!

Hi All,

 

From the field I receive data from 28 sensor channels; the data rate is 10 kS/s per channel.

I have to store the field data for up to 3 months to generate historical reports and to analyze the field conditions.

The data volume is very large, so I would like some suggestions from you all on a technique or preferred way to store this much data.

 


Thanks and Regards
Himanshu Goyal | LabVIEW Engineer- Power System Automation
Values that steer us ahead: Passion | Innovation | Ambition | Diligence | Teamwork
It Only gets BETTER!!!
Message 1 of 9

Wow, that's a huge amount of data! I think the best approach is a producer/consumer architecture: implement a queue that the acquisition loop (the producer) fills with data while it runs, and a consumer loop, which may be slower than the acquisition loop, writes the queued data to the hard drive. I honestly have no idea how you can store and handle this amount of data after you have read it, though.
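In LabVIEW this is two parallel while loops sharing a queue refnum. A minimal sketch of the same pattern in text form (Python here; the `acquire_chunk` callback and chunk sizes are hypothetical stand-ins for the DAQ read):

```python
import queue

def producer(q, acquire_chunk, n_chunks):
    # Acquisition loop: push each freshly acquired chunk onto the queue
    # so disk latency never blocks the DAQ read.
    for _ in range(n_chunks):
        q.put(acquire_chunk())
    q.put(None)  # sentinel tells the consumer to stop

def consumer(q, path):
    # Logging loop: drain the queue and append to disk. It may run
    # slower than the producer as long as the queue absorbs the bursts.
    with open(path, "ab") as f:
        while True:
            chunk = q.get()
            if chunk is None:
                break
            f.write(chunk)

# A bounded queue caps memory use if the disk falls behind for too long.
data_queue = queue.Queue(maxsize=1024)
```

The consumer runs in its own thread; the bounded `maxsize` is the equivalent of a fixed-size LabVIEW queue, which makes a sustained disk shortfall visible instead of silently eating RAM.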

Ricky
Italian Developer engineer
www.www.selt-sistemi.com
Message 2 of 9

Your biggest problem isn't going to be collecting and storing the data.  That is simply a matter of having a hard-disk array large enough to hold it (possibly several terabytes).  Where you will run into problems is finding the information you're looking for in the terabytes of data you're going to collect.  Having been in similar situations before (although with only hundreds of GB of data), I can offer a few suggestions:

 

1. Find out what information your user/customer is actually looking for.  You may be able to store only data which meets certain criteria and discard the rest.  Be sure to keep a sufficient data buffer so that you can store the data from both before and after the events your customer is looking for.

 

2. Store the data in a format, such as TDMS, that allows you to embed metadata in the data files.  Then store channel minimums, maximums, averages, and any other relevant measurements in the metadata.  When it comes time to analyze the data, this will make the job 1000 times easier.  For example, if your user/customer wants a report of every time channel 17 drops below 3.9 V, you'll only have to read the data from the files where the minimum is less than 3.9 V and the maximum is greater than 3.9 V.

 

3. Over a period of months, you're almost certain to have interruptions due to equipment failures, power outages, janitors tripping over power cords, etc.  Write the software so that it can restart cleanly, preferably automatically, no matter what it was doing when it was terminated.
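Suggestion 2 can be sketched in a few lines. This is only an illustration of the idea, not TDMS itself: a hypothetical JSON sidecar file stands in for the properties TDMS lets you attach directly to each channel, and the channel/file names are made up:

```python
import json

def write_summary(data_path, channel_stats):
    # channel_stats example: {"ch17": {"min": 3.2, "max": 4.4}}
    # The JSON sidecar stands in for per-channel TDMS properties.
    with open(data_path + ".meta.json", "w") as f:
        json.dump(channel_stats, f)

def files_crossing(data_paths, channel, threshold):
    # Only a file whose [min, max] range straddles the threshold can
    # contain a crossing event, so only those files need a full read.
    hits = []
    for p in data_paths:
        with open(p + ".meta.json") as f:
            stats = json.load(f)[channel]
        if stats["min"] < threshold < stats["max"]:
            hits.append(p)
    return hits
```

The search then touches a few bytes of metadata per file instead of gigabytes of waveform data.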

 

Mark Moss

Electrical Validation Engineer

GHSP

Message 3 of 9

What hardware are you using?  If it's NI-DAQmx hardware, then you can use our TDMS logging feature to automatically split the TDMS log into files of a manageable size, and then set up some sort of process to copy those files to more permanent long-term storage.  Splitting the log this way also keeps each individual file small, which makes the large total data volume easier to handle.

Seth B.
Principal Test Engineer | National Instruments
Certified LabVIEW Architect
Certified TestStand Architect
Message 4 of 9
Solution
Accepted by topic author Himanshu_Goyal

Thanks for your suggestions.

 

The project requirements are already defined by the customer. The customer wants the complete 10 kS/s data for all 28 channels; he cannot accept a lower sample rate or any data loss.

 

I tried storing 10 kS/s data (SGL format) for 28 channels (sine waves) in a TDMS file; the file size is around 3.5 GB for 1 hour, so a complete day comes to around 84 GB. I am not sure whether a single TDMS file can grow to that size; if it can't, I will have to save each day's data as a set of TDMS files. Also, when the customer wants to see a complete 1-hour report in graphical or tabular format, the operation takes around 2-3 minutes, and sometimes it fails with an error like "Not Enough Memory". So at the end of the day, if the user wants the complete daily report, how can I produce a report from this much data?

 

I am using a PXI RT system with a local hard disk. The PXI always keeps 7 days of data as a backup, so in case the connection to the server fails, the data is still backed up on the PXI.

 

So my question is still the same: which database or file format should I choose to store the complete data?

If there is any technique to compress the data, please suggest it.

Thanks and Regards
Himanshu Goyal | LabVIEW Engineer- Power System Automation
Values that steer us ahead: Passion | Innovation | Ambition | Diligence | Teamwork
It Only gets BETTER!!!
Message 5 of 9

If your ADC is only 16-bit, then store the results as 16-bit integers (raw data); you can scale the data back to the correct units when reading/processing the file. That will cut the file size in half compared to using the SGL data type.
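The raw-counts idea looks like this in sketch form (Python with the stdlib `array` module standing in for a LabVIEW I16 array; the ±10 V full-scale range is an assumption about the ADC):

```python
from array import array

def to_raw(volts, full_scale=10.0):
    # Map +/- full_scale volts onto the signed 16-bit range, mirroring
    # what a 16-bit ADC already does internally. Store `scale` once as
    # file metadata so the conversion is reversible.
    scale = 32767.0 / full_scale
    return array("h", (round(v * scale) for v in volts)), scale

def from_raw(counts, scale):
    # Recover engineering units when reading/processing the file.
    return [c / scale for c in counts]
```

Each sample occupies 2 bytes instead of the 4 bytes of an SGL, and the only cost is one multiply per sample on the way back out.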



Besides which, my opinion is that Express VIs Carthage must be destroyed deleted
(Sorry no Labview "brag list" so far)
Message 6 of 9

3.5 GB for one hour is too much! I think it is better to organize the archive as a series of 1-hour folders, each collecting files of 1 minute of recorded data. If you split the data this way, a single file will be around 60 MB, which is still a large file but far more manageable than 1 hour of recorded data. To show the data, load it dynamically, only what the user needs to see: for example, to show the data in a graph, load 1 minute at a time and drive the file loading from the graph's x scroll bar; when the user scrolls the bar and reaches the end of the loaded minute, load the next file...
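A sketch of that folder-per-hour, file-per-minute layout and the scroll-driven loading (Python; the path scheme, `.bin` extension, and `read_file` callback are all hypothetical choices):

```python
import os
from datetime import datetime, timedelta

def minute_path(root, t):
    # One folder per hour, one file per minute, e.g.
    # root/2009-06-15_13/2009-06-15_13-05.bin
    folder = t.strftime("%Y-%m-%d_%H")
    name = t.strftime("%Y-%m-%d_%H-%M") + ".bin"
    return os.path.join(root, folder, name)

def load_window(root, start, minutes, read_file):
    # Load only the minutes the graph currently displays; when the user
    # scrolls past the loaded span, call again with the next start time.
    chunks = []
    for i in range(minutes):
        path = minute_path(root, start + timedelta(minutes=i))
        if os.path.exists(path):
            chunks.append(read_file(path))
    return chunks
```

Because the timestamp is encoded in the path, locating any window is pure string formatting; nothing has to scan the archive.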


I hope I was clear!

Ricky
Italian Developer engineer
www.www.selt-sistemi.com
Message 7 of 9

If I assume a 16-bit A/D converter, then each sample takes 2 bytes.  28 channels at 10 kHz works out to approximately 48 GB of data per day, so a 2 TB hard drive (pretty big!) will only (?!?) hold about 40 days of data, not 3 months.  It does make sense to break this enormous amount of data into smaller file-sized chunks, either a day at a time, an hour at a time, whatever makes sense to your customer.  If you can tolerate a brief break in the data stream, you simply close one file and open the next (name them sensibly so you can "stitch" the data stream back together).  On the other hand, if you really need to record every byte, then you'll need a pretty big buffer (queue) to handle the switch-over.
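The back-of-envelope numbers can be checked directly (assuming 2 bytes per sample, i.e. raw 16-bit counts rather than SGL):

```python
CHANNELS = 28
RATE = 10_000          # samples per second, per channel
BYTES_PER_SAMPLE = 2   # 16-bit raw ADC counts

bytes_per_second = CHANNELS * RATE * BYTES_PER_SAMPLE  # 560 kB/s
gb_per_day = bytes_per_second * 86_400 / 1e9           # ~48 GB per day
days_on_2tb = 2e12 / (bytes_per_second * 86_400)       # ~41 days
```

So 3 months of continuous raw data is on the order of 4-5 TB, and roughly double that if it stays in SGL format.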

 

I've not tried this, myself, but you might be able to use a "double-buffering" type of technique.  Open one file and start writing it.  While doing this write, open a second file that will hold the next "chunk" of data.  When you are ready to switch over, simply start writing to File 2, and, in parallel, close File 1.  You now get ready for the third chunk by opening File 1 with the new file name.  By flipping between writing to File 1 and File 2, you may be able to break your file into "bite-sized" (pun intended) pieces without losing data.
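A single-threaded sketch of that double-buffered switch-over (Python; `make_path` and the chunk counts are hypothetical, and in LabVIEW the close/open would run in a parallel loop rather than inline):

```python
def write_rotating(chunks, chunks_per_file, make_path):
    # Keep the *next* file already open while the current one is being
    # written, so the switch-over is just a handle swap; the open() for
    # the file after that happens off the critical write path.
    file_no = 0
    current = open(make_path(0), "wb")
    pending = open(make_path(1), "wb")
    for i, chunk in enumerate(chunks):
        if i > 0 and i % chunks_per_file == 0:
            current.close()
            current, pending = pending, open(make_path(file_no + 2), "wb")
            file_no += 1
        current.write(chunk)
    current.close()
    pending.close()  # the pre-opened spare stays empty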

 

Bob Schor

Message 8 of 9

With a huge amount of data like you have to manage, I think it is a good idea to consider a SQL server to store the data and to analyze it after it has been stored. I think that is simpler than working out how to split the data you are acquiring, because you don't have to care about how the server splits the data and saves it to the hard drive. It can also help you find the portion of data you need when you have to extract it to show in a graph or to compute some statistics. Nowadays it is not so expensive to buy a cloud service that provides this kind of solution.
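A minimal sketch of the idea, with SQLite standing in for a real SQL server (the table layout, blocking of samples into BLOBs, and function names are all assumptions, not a recommendation of SQLite itself for this data rate):

```python
import sqlite3

def open_store(path=":memory:"):
    # An indexed (channel, start_time) table lets the server locate any
    # time window without the application managing file splits itself.
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS blocks (channel INTEGER, t0 REAL, data BLOB)"
    )
    conn.execute("CREATE INDEX IF NOT EXISTS idx ON blocks (channel, t0)")
    return conn

def store_block(conn, channel, t0, samples):
    # One row per acquired block of samples, stamped with its start time.
    conn.execute("INSERT INTO blocks VALUES (?, ?, ?)", (channel, t0, samples))
    conn.commit()

def fetch_window(conn, channel, t_start, t_end):
    # The index makes this a range scan, not a full-table read.
    cur = conn.execute(
        "SELECT t0, data FROM blocks "
        "WHERE channel = ? AND t0 BETWEEN ? AND ? ORDER BY t0",
        (channel, t_start, t_end),
    )
    return cur.fetchall()
```

Storing whole blocks as BLOBs keeps the row count manageable (one row per block rather than one per sample), while the time index still gives fast access to any window.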

Ricky
Italian Developer engineer
www.www.selt-sistemi.com
Message 9 of 9