Data Acquisition Idea Exchange


We really need a hard drive cRIO module for long-term monitoring and reliably storing large amounts of data remotely.

 

[Image: Hard-Drive-Module-Concept.png]

 

Options:

 

1. Solid State Drive: Fast, reliable, and durable, with extremely high data rates. It would be a very expensive module, but it could be made to handle extreme temperatures and harsh conditions. It should be available in different capacities at varying prices.

 

2. Conventional Hard Drive: This would give any user the ability to store large amounts of data, on the order of hundreds of gigabytes. This type should also come in varying storage capacities.

 

For this to be usable:

 

1. It would need to support a file system other than FATxx. The risk of data corruption from power loss or power cycling during recording makes anything that uses this file system completely unreliable and utterly useless for long-term monitoring. You can record for two months straight, then something goes wrong and you have nothing but a dead USB drive. Any other file system that is not so susceptible to corruption or damage from power loss would be fine: Reliance, NTFS, etc. (See the logging sketch after this list.)

 

2. You should be able to plug in multiple modules and RAID them together for redundancy. This would ensure data security and increase the usability of the cRIO for long-term remote monitoring in almost any situation. (See the mirroring sketch after this list.)
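To make item 1 concrete, here is a minimal, hypothetical sketch (in Python, simply because it is easy to show as text; the real implementation would be LabVIEW on the cRIO) of the kind of power-loss-tolerant logging a proper file system makes possible: data is written in chunks, forced to disk with flush/fsync, and only renamed to its final name once complete, so a power failure can cost at most the chunk currently being written. The mount point and chunk length are made-up values.

```python
import os
import time

LOG_DIR = "/media/ssd/logs"   # hypothetical mount point for the drive module
CHUNK_SECONDS = 60            # made-up value: close out a file every minute


def acquire():
    """Stand-in for the real DAQ read loop; returns (timestamp, value) pairs."""
    return [(time.time(), 0.0)]


def write_chunk(samples, index):
    """Write one chunk so a power failure never corrupts completed chunks:
    write to a temporary name, fsync, then atomically rename."""
    tmp_path = os.path.join(LOG_DIR, f"chunk_{index:06d}.tmp")
    final_path = os.path.join(LOG_DIR, f"chunk_{index:06d}.csv")
    with open(tmp_path, "w") as f:
        for t, value in samples:
            f.write(f"{t},{value}\n")
        f.flush()
        os.fsync(f.fileno())          # push the data out of OS buffers onto the disk
    os.rename(tmp_path, final_path)   # atomic on POSIX file systems


if __name__ == "__main__":
    index = 0
    while True:
        start = time.time()
        samples = []
        while time.time() - start < CHUNK_SECONDS:
            samples.extend(acquire())
            time.sleep(0.1)
        write_chunk(samples, index)
        index += 1
```

Even with this pattern, FAT can still corrupt its own allocation tables on an ill-timed power loss, which is why a journaling or transactional file system such as Reliance or NTFS is the real requirement.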

 

 
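For item 2, in the absence of a true RAID-capable module, the same redundancy idea can at least be approximated in software by copying every completed chunk to a second drive and verifying the copy. This is only a hypothetical sketch (the mount points and helper names are made up), not a substitute for real RAID 1 done at the block level by the hardware or driver:

```python
import hashlib
import os
import shutil

MIRROR_A = "/media/ssd1/logs"   # hypothetical first drive module
MIRROR_B = "/media/ssd2/logs"   # hypothetical second drive module


def file_digest(path):
    """SHA-256 of a file, so the two copies can be compared byte for byte."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()


def mirror_chunk(filename):
    """Copy a completed chunk from drive A to drive B and verify the copy."""
    src = os.path.join(MIRROR_A, filename)
    dst = os.path.join(MIRROR_B, filename)
    shutil.copy2(src, dst)
    if file_digest(src) != file_digest(dst):
        raise IOError(f"Mirror copy of {filename} failed verification")
```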

Current cRIO storage issues:

We use NI products primarily in our lab, and LabVIEW is awesome. I hope that in being very direct about our issues we will not upset anyone or turn anyone away from NI products. However, attempting to use a cRIO device for long-term remote monitoring has brought the current storage shortfalls to the forefront, and data loss has cost us dearly. These new hard drive modules would solve all the shortfalls of the current storage options for the cRIO.

The biggest limitation of the cRIO for long-term monitoring at the moment is that it does not support a reliable file system on any external storage. The SD card module has extremely fast data transfer rates, but if power is lost while the SD card is mounted, not only is all the data lost, the card also has to be physically removed from the device and reformatted with a PC. Even with the best UPS, this module is not suitable for long-term monitoring. USB drives have a much slower data transfer rate and are susceptible to the same corruption from power loss.

 

When we have brought up these issues in the past, the solution offered has been to set up a reliable power backup system. It seems that those suggesting this have never tried to run a large application on a device they have no physical access to, say 500 miles away. Unfortunately, the cRIO is susceptible to freezing or hanging and becoming so unresponsive over the network that it cannot be rebooted remotely at all (yes, even with the setting that halts all processes if TCP becomes unresponsive). We would have to send someone all the way out to the device to hit the reset button or cycle power. Programs freeze, operating systems freeze or crash, drivers crash; stuff happens. None of that should put the data being stored at risk.
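As a partial mitigation for the hang-with-no-remote-reboot problem (separate from the storage modules themselves), an on-target software watchdog can reboot the controller when the main application stops updating a heartbeat. This is only a hypothetical sketch, assuming a Linux-based real-time target where a privileged process can invoke reboot; the heartbeat path and timeout are made-up values.

```python
import os
import subprocess
import time

HEARTBEAT_FILE = "/var/run/app_heartbeat"   # hypothetical file the main app touches periodically
TIMEOUT_SECONDS = 300                       # made-up value: reboot after 5 minutes of silence


def heartbeat_age():
    """Seconds since the main application last touched the heartbeat file."""
    try:
        return time.time() - os.path.getmtime(HEARTBEAT_FILE)
    except OSError:
        return float("inf")   # a missing file is treated as a stalled application


if __name__ == "__main__":
    while True:
        if heartbeat_age() > TIMEOUT_SECONDS:
            # Last resort: reboot the target rather than sit hung and unreachable.
            subprocess.run(["reboot"])
        time.sleep(30)
```

Combined with the chunked, fsynced logging sketched earlier, a forced reboot then costs at most the chunk that was open when the application hung.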

 

I would put money on something like this already being in development at NI. I hope you guys think the module is a good idea, even if you don't agree with all of the problems I brought up. I searched around for a similar idea; my apologies if this is a re-post.

 

 

The DAQ Assistant was presumably created to simplify data acquisition. The idea seems to be to put all of the needed pieces in one place, so that the low-level 'traditional' DAQ VI functions are not needed.

 

Consider the following simple VI:

 

[Image: Demo VI]

 

This could be as simple as one analog input channel.

 

The program will compile into an .exe and work just fine, as long as you don't use one of the features of the DAQ Assistant: custom scales.

 

Custom scales are not stored with the VI or project, but in a system file that does not automatically get included in an .exe build.  The .exe will work fine on the original PC that built it, but it will not work when the .exe is loaded on a different PC.

 

There is a method that allows the user to port the custom scales to another PC, but it is not automatic.

 

http://digital.ni.com/public.nsf/allkb/12288DEB3C6A185B862572A70043C353
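For comparison, when the driver is used directly rather than through the DAQ Assistant, the scale can be created programmatically at startup so its definition travels with the program instead of living only in MAX on the build PC. Here is a rough sketch of that approach using the nidaqmx Python package rather than LabVIEW (purely to show the idea in text form); the scale name, slope, and channel are made-up values, and the exact API names should be checked against that package's documentation.

```python
import nidaqmx
from nidaqmx.constants import UnitsPreScaled, VoltageUnits
from nidaqmx.scale import Scale

# Recreate the linear custom scale in code at startup, so nothing has to be
# exported from MAX and imported on the target PC.
Scale.create_lin_scale(
    "PressureScale",                       # hypothetical scale name
    50.0,                                  # made-up slope: 50 psi per volt
    y_intercept=0.0,
    pre_scaled_units=UnitsPreScaled.VOLTS,
    scaled_units="psi",
)

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai0",                        # hypothetical physical channel
        min_val=0.0,
        max_val=500.0,                     # limits expressed in scaled units
        units=VoltageUnits.FROM_CUSTOM_SCALE,
        custom_scale_name="PressureScale",
    )
    print(task.read(number_of_samples_per_channel=10))
```

Something equivalent is possible in LabVIEW with the DAQmx scale-creation VIs, but the whole point of the DAQ Assistant is that users should not have to know any of this: the build process should simply carry the scale along.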

 

 

The fundamental problem is that the DAQ Assistant is intended to make life simple and give you everything you need to build a program. Custom scales are included in the DAQ Assistant so that the programmer does not need to create scaling manually in their VI. But what good does that do if the scales are not included in the .exe build, and there is no obvious clue that extra work is required, or what that work is?

 

The .exe build process needs to be upgraded to automatically include custom scales, and possibly other MAX settings that are essential to the operation of a compiled program. It does not matter whether the build process works out and includes only the specific scales or settings used by the particular program/VI, or whether it simply takes all of the settings.

 

These are critical pieces for making the final compiled program run on another machine. The user should not have to somehow know that these pieces are separate but required, and then take extra steps to go out and select them so they are included in the build. That is totally counterintuitive to the simplicity the DAQ Assistant is intended to provide.