LabVIEW

Periodically send NEW data in a file

Hi,

I need help finding a possible strategy for this problem. I record sensor data to a text file, which can become very (very!) big after a few months. I want to periodically send the data in that file over the internet and insert it into my remote database. To optimize internet traffic I would send only the last data in the file, where "last" means, for example, the last hour of data acquisition. My idea was to use a shift register containing the row of the last sent data in the file, and update it when data is transmitted (it is not updated if, for example, there is no internet connection for some reason). The problem is: what happens if the VI stops working? The shift register is re-initialized, obviously. So how can I manage this problem? Any suggestions? (A possible solution is saving the value of the last sent row in another file, but it seems inelegant to me.)

Message 1 of 10

Dindex82 wrote: So how can i manage this problem?

How often are you logging the data in the file?

 

I would suggest you just let the shift register initialize to an empty string.  As long as the value is empty, do not allow sending to your database.  As soon as you get a new line, you can update your database.
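The "empty until proven otherwise" idea can be sketched outside of LabVIEW. This is a minimal Python sketch (not LabVIEW code) of the shift-register logic: the marker starts empty, and it only advances when there is a new line *and* the send succeeded. The function name and parameters are hypothetical, just to illustrate the rule.

```python
def update_marker(last_sent, new_line, send_ok):
    """Mimic the shift register: advance the marker only when a new
    line exists AND the transmission to the database succeeded."""
    if new_line and send_ok:
        return new_line     # database updated; remember what was sent
    return last_sent        # empty input or failed send: keep old marker

# The marker stays empty until there is data and a successful send.
assert update_marker("", "", True) == ""
assert update_marker("", "row 1", False) == ""
assert update_marker("", "row 1", True) == "row 1"
```

The key property is that a failed send leaves the marker untouched, so the next iteration naturally retries.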


Message 2 of 10

The file is updated every 30 seconds in my example, so every hour I should send 120 rows of data (if the internet connection is available; if it is not, I should send 240 rows the next hour, and so on).

Message 3 of 10

@Dindex82 wrote:

The file is updated every 30 seconds in my example, so every hour I should send 120 rows of data (if the internet connection is available; if it is not, I should send 240 rows the next hour, and so on).


Text compresses really well. Why not just use LabVIEW's built-in ZIP file functions to compress a day's worth of data, email it as an attachment, and start a new file every day?
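To illustrate how well repetitive log text compresses, here is a Python sketch of the same idea (LabVIEW has its own Zip VIs; this just demonstrates the concept). The file naming scheme is an assumption for the example.

```python
import zipfile
from datetime import date

def archive_day(log_path, out_dir="."):
    """Compress one day's log file into a dated ZIP archive.
    Repetitive sensor readings typically shrink dramatically."""
    name = f"{out_dir}/log_{date.today().isoformat()}.zip"
    with zipfile.ZipFile(name, "w", zipfile.ZIP_DEFLATED) as z:
        z.write(log_path, arcname="day.txt")
    return name
```

After archiving, the original file can be truncated and a new day started, keeping the full history as small attachments.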

 

I have found that if you are going to bother collecting data, you might as well save it all, as it could come in handy for reconstructing an event.

 

What if something that happened six hours ago set up a chain of events that caused a failure, and you only saved the last hour of data?

 

BTW: I have a long-term test running that saves data every minute and has been running for a couple of YEARS!

Message 4 of 10

How much storage space do you have locally?  Since you don't want to lose data, and you are uncertain whether you will always have a valid internet connection, be sure to save all the data locally.  Don't rely on memory in a shift register.

 

I don't think saving additional data in a separate file is that bad of an idea.  It may seem inelegant to you, but for a remote device you want reliability, not elegance.

 

I would store your data in a file locally.  When the time comes to send the data over the internet, do it.  I would set up a communication scheme so that the distant server responds back that it received it.  The local device can then update a shift register with info.  Store that to a separate file as well if you want to create a log of all the send dates.

 

Later, when the time to update occurs again, grab that info from the shift register or from the log file.  Open your data file and move your file pointer (or read through the file) past the data that has already been sent, as compared to that date.  Then read and transmit the new data to the server.  The server acknowledges, and you update the "last sent" date.

 

In the event the transmit fails (and you'll know, because you won't get a response from the server), the next time you are ready to send, you repeat the process.  You still have the older data in your file, and you'll send it again along with any newer data you've collected since.

 

If by any chance you lose track of what the last sent date is, just send the entire file.  Let your server worry about whether there are any duplicates.

 

Periodically, you'll want to check the size of your local file and delete any data older than the last sent date to reduce the file's size.
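The whole scheme described above can be sketched in a few lines of Python (not LabVIEW — the function names, the side file holding the byte offset, and the `transmit` callback are all hypothetical, just to make the logic concrete): read the last acknowledged position from a small side file, seek past already-sent data, transmit the remainder, and advance the stored position only when the server acknowledges.

```python
def read_offset(offset_path):
    """Return the byte offset of the last acknowledged send (0 if none)."""
    try:
        with open(offset_path) as f:
            return int(f.read().strip())
    except (FileNotFoundError, ValueError):
        return 0    # no record yet: (re)send from the start of the file

def send_new_data(data_path, offset_path, transmit):
    """Send everything after the last acknowledged offset.
    `transmit` must return True only when the server acknowledges receipt."""
    offset = read_offset(offset_path)
    with open(data_path, "rb") as f:
        f.seek(offset)          # skip rows that were already sent
        chunk = f.read()
    if chunk and transmit(chunk):
        with open(offset_path, "w") as f:
            f.write(str(offset + len(chunk)))   # advance only on an ack
```

Because the offset lives in a file rather than in a shift register, a restart of the VI (or the whole machine) loses nothing: the next send simply resumes from the recorded position, and a failed transmit leaves the offset untouched so the same data is retried.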

Message 5 of 10

Hi,

Please can you post an example of your code?

Many thanks,

Message 6 of 10

Your solution seems perfect, thank you, and it is logically strong. I will try it in the following days.

 

I would like to take advantage of your kindness and ask you something (this is my ignorance): when I add a row to a file using Write to Spreadsheet File, is the file loaded entirely into RAM? I ask because I see high memory usage increasing over loop cycles, which cannot be explained in any other way (I checked all references, shift registers, etc.).

 

Message 7 of 10

Appending new data to the end of a spreadsheet file should not be causing increasing memory usage.  By "spreadsheet file", I assume you mean a text file created by the Write to Spreadsheet File functions, which you can open in Excel.

 

If memory usage is growing, you'll need to take a close look at your code and see if you have arrays growing endlessly in shift registers.  Once you have data written out to a file, you can put an empty array back into the shift register and build from there.
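The "write, then reset the array" pattern can be sketched in Python (not LabVIEW; the function name, batch size, and `flush` callback are hypothetical illustrations). The point is that the buffer is emptied after each write instead of growing forever, which is the analogue of wiring an empty array back into the shift register:

```python
def logging_loop(samples, flush, batch=10):
    """Accumulate readings and write them out in batches, resetting the
    buffer after every write so memory use stays bounded."""
    buffer = []
    for s in samples:
        buffer.append(s)
        if len(buffer) >= batch:
            flush(buffer)       # data is safely on disk now...
            buffer = []         # ...so reset instead of growing forever
    if buffer:
        flush(buffer)           # write any leftover partial batch
```

If the reset is forgotten (the classic bug), the buffer grows by one element per loop iteration indefinitely, which is exactly the steadily climbing memory profile described above.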

Message 8 of 10

Are you sure that when I open a file it is not loaded into the memory? There are no growing shift registers in that loop.

 

Tomorrow I will try to upload my code for this task (which is a 1 hour data buffering on disk). Thank you

 

PS: I use "Write to Spreadsheet" to write to a .txt file; I don't use Excel.

Message 9 of 10

Yes, I'm sure that just opening the file doesn't load it into memory.  If it did, you'd have huge problems trying to read through large files; there would be no way to do it piecemeal.
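Piecemeal reading is easy to demonstrate in Python (again, a sketch, not LabVIEW): opening a file only creates a handle with a position pointer, and each read pulls in just the requested number of bytes, so memory stays constant regardless of file size.

```python
def read_chunks(path, chunk_size=4096):
    """Yield a file's contents piece by piece; only chunk_size bytes
    are ever held in memory at once, no matter how big the file is."""
    with open(path, "rb") as f:          # opening loads nothing yet
        while True:
            chunk = f.read(chunk_size)   # pulls in at most chunk_size bytes
            if not chunk:
                break                    # end of file reached
            yield chunk
```

This is the same mechanism that lets the earlier offset-based scheme seek straight to the unsent portion of a months-old log without touching the rest.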

Message 10 of 10