We have a real-time application where Windows has to respond to new data arriving on a PCI-6509 within 20 ms. New data (54 16-bit words) will continue to arrive every 100 ms. There must be enough time left over to put the data in a buffer and write it to a disk file. While the amount of data is small, missing any is unacceptable. Is there some way to ensure that the priority can be set high enough to guarantee that no data is lost? It could be lowered to do disk writes, setup, and other housekeeping. The Windows PC would not be connected to the internet and could be devoted entirely to this task if possible.
Is it absolutely necessary for your application to put the data in a buffer and write it to a disk file within that specified amount of time, or is it just necessary to acquire the data at a specific rate? If it's the latter, I would recommend looking into the producer/consumer loop design pattern. If you're not familiar with this design, I have provided a link to an overview of the producer/consumer loop:
-Application Design Pattern: Producer/Consumer
Also, if you're looking to run a real-time application, National Instruments has a LabVIEW module specifically designed for real-time:
-NI LabVIEW Real-Time Module
The Producer/Consumer is definitely the way you want to go. Have one loop acquire the data every so often (100 ms?). The data is then sent via a queue to another loop that can format the data and save it to disk.
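To make the pattern concrete, here is a minimal sketch of the same idea in Python rather than LabVIEW: one fast loop acquires a block every 100 ms and only enqueues it, while a second loop drains the queue at its own pace. The DAQ read is a placeholder, and the sizes and timing are taken from the numbers in this thread; a real application would replace them with actual hardware calls.

```python
import queue
import threading
import time

SAMPLES = 54        # 16-bit words per read (from the original post)
PERIOD_S = 0.1      # new data arrives every 100 ms

data_queue = queue.Queue()

def producer(n_reads):
    """Acquisition loop: read a block every 100 ms and hand it off immediately."""
    for _ in range(n_reads):
        block = [0] * SAMPLES      # placeholder for a real PCI-6509 read
        data_queue.put(block)      # enqueue is fast, so no data is missed
        time.sleep(PERIOD_S)
    data_queue.put(None)           # sentinel: acquisition is finished

def consumer(results):
    """Consumer loop: format/store blocks at whatever pace disk I/O allows."""
    while True:
        block = data_queue.get()
        if block is None:
            break
        results.append(block)      # stand-in for formatting and writing to disk

results = []
c = threading.Thread(target=consumer, args=(results,))
p = threading.Thread(target=producer, args=(5,))
c.start(); p.start()
p.join(); c.join()
print(len(results))                # all 5 blocks arrive intact
```

The key design point is the same in LabVIEW: the queue decouples the two loops, so a slow disk write in the consumer never delays the next acquisition in the producer.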
The data doesn't have to be written to a file until the observation is complete; it amounts to maybe 3 MB per observation and could be stored in memory. The data flow would have to be switched on and off during the observation setup and left on for the 50-minute elapsed time of the observation. The PC would have to respond every 100 ms and have enough time to move (enqueue?) the 54 16-bit words.
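A quick sanity check of the memory budget, using only the figures stated in this thread (54 16-bit words per 100 ms read over a 50-minute observation), confirms the ~3 MB estimate:

```python
# Rough memory budget for one observation, from the numbers in the posts.
words_per_read = 54        # 16-bit words arriving each 100 ms
bytes_per_word = 2
reads_per_second = 10      # one read every 100 ms
duration_s = 50 * 60       # 50-minute observation

total_bytes = words_per_read * bytes_per_word * reads_per_second * duration_s
print(total_bytes / 1e6)   # 3.24 (MB), consistent with the ~3 MB estimate
```

So buffering the whole observation in memory and writing the file afterward is easily within reach on any modern PC.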
What we did before was to set interrupt priorities in software for different operations, with data acquisition being the highest. The data "consuming" took place at a lower priority and involved writing data to disk or tape and displaying some of the acquired data. Once we started observing, the data flow continued so the setup program could know what was going on; it wasn't recorded in that case. The computer was totally dedicated to the observation at that time.
I would definitely look into the Producer/Consumer loop architecture. You can actually create a new VI from a template with the Producer/Consumer Design Pattern.