10-07-2008 06:38 AM
Hi,
I have a large database in which hundreds of shared variables are logged continuously.
The database is growing at a rate of about 0.3 GB per day.
I would like to create an application that programmatically keeps the database smaller than 3-4 GB,
while storing the collected data that is older than one week on the hard disk (I have a very large hard disk).
Which is preferable:
1. To create a new database each week and divert all the data to it (by changing the data-logging properties of the library that contains all the shared variables in the project), and then detach the older database?
2. To archive the last week's data each week to a new database (deleting the archived data from the original database),
and then detach the new database?
thank you for your help in advance,
Amitai Abramson
10-07-2008 07:48 AM
I forgot to mention that I use LabVIEW DSC RT 8.2.1 with Citadel 5.
Another question (related to the last one):
the data is stored in the database continuously,
so if I choose option 2,
the archive process will be active while the database is still collecting data.
Can that cause any problems?
Also, Citadel stores its data in pages.
So if some variables are in the middle of a page,
does archiving (with deletion of the original data) interrupt the operation of the original database,
or does it just open new pages?
Amitai Abramson.
10-08-2008 02:19 PM
Hi Amitai,
I think you may find a number of answers to your questions in this white paper on Citadel, particularly the sections under Citadel Operations (such as Backing Up and Restoring a Database, Archiving Data, etc.). Take some time to read through this paper, and I believe you will find many answers to your specific questions.
In general, it sounds like your best option would be to use the DSC VIs under DSC Module>>Historical>>Database Management to programmatically archive your data (using the Archive Traces VI) after a set period of time. This post shows an example of this.
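Citadel is not a SQL database, and in your case the Archive Traces VI would do the real work, but the shape of the weekly job in option 2 (copy everything older than one week to an archive, then delete it from the live database) can be sketched in Python, with SQLite standing in for the historian. The `traces` table, its schema, and the file names here are made up purely for illustration:

```python
import sqlite3
import time

ONE_WEEK_S = 7 * 24 * 60 * 60  # one week, in seconds


def archive_old_rows(live_path, archive_path, now=None):
    """Copy rows older than one week from the live DB into the archive DB,
    then delete them from the live DB. Illustrative sketch only -- a real
    DSC application would call the Archive Traces VI instead."""
    now = time.time() if now is None else now
    cutoff = now - ONE_WEEK_S

    live = sqlite3.connect(live_path)
    arch = sqlite3.connect(archive_path)
    arch.execute(
        "CREATE TABLE IF NOT EXISTS traces (ts REAL, name TEXT, value REAL)"
    )

    # 1) Copy the old rows into the archive database.
    rows = live.execute(
        "SELECT ts, name, value FROM traces WHERE ts < ?", (cutoff,)
    ).fetchall()
    arch.executemany("INSERT INTO traces VALUES (?, ?, ?)", rows)
    arch.commit()

    # 2) Only after the copy is committed, trim the live database.
    live.execute("DELETE FROM traces WHERE ts < ?", (cutoff,))
    live.commit()

    live.close()
    arch.close()
    return len(rows)
```

The point of the ordering is that the archive commit happens before the delete, so a failure mid-job leaves at worst duplicated data, never lost data.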
I hope this helps!