Hi,
I am following this thread as this is my first attempt at actually looking at data acquired with VS2011. If need be, I can start a new thread.
So, I have an application running on an RT target with VS2011, with hardware I/Os, models, a custom device... Currently the HW loop rate is 4 kHz, and when the target runs, all four of its CPUs are under a 50-60% load on average. A LabVIEW program uses the VS API to start the target, set model parameters and inputs, etc.
The system definition has about 50 channels defined; some are model I/Os, some are hardware I/Os.
To acquire data I am using the host API's Start Data Logging, passing it the channels of interest. The sampling rate was initially set to 1 kHz.
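In case it helps to pin down where things go wrong, here is roughly the host-side sequence we follow, sketched in Python via pythonnet rather than the LabVIEW VIs we actually use. The install path, system definition path, and channel name are placeholders, and the .NET ClientAPI method names are as I understand them, so please double-check them against the VeriStand 2011 documentation:

```python
import sys
import clr

# Assumed install location of the VeriStand 2011 .NET ClientAPI; adjust as needed.
sys.path.append(r"C:\Program Files\National Instruments\VeriStand 2011")
clr.AddReference("NationalInstruments.VeriStand.ClientAPI")
from NationalInstruments.VeriStand.ClientAPI import Factory

GATEWAY = "localhost"                      # VeriStand Gateway address (placeholder)
SYSDEF = r"C:\sysdefs\MyProject.nivssdf"   # system definition file (placeholder)

workspace = Factory().GetIWorkspace2(GATEWAY)

# Deploy the system definition and start the target (60 s timeout).
workspace.ConnectToSystem(SYSDEF, True, 60000)

# Set a model parameter / input, the way our LabVIEW host program does.
channel = "Targets/Controller/Simulation Models/Models/Engine/Inports/Throttle"
workspace.SetSingleChannelValue(channel, 42.0)

# Spot-check: read the value back, to see whether a "constant" channel really
# holds its value at the source or is already toggling before logging.
print(workspace.GetSingleChannelValue(channel))

# Data logging itself is started from the host with the Start Data Logging call
# (channels of interest, 1 kHz); I have not reproduced that call here.
```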
What we observe at this point is that the data set in the TDMS file is inconsistent: some channels have fewer data points than would be expected for the given test duration and sampling rate, and some logged channels that are held constant in the code come back toggling between 0 and their actual set value.
At this point I am not excluding that something is wrong on our side; however, I am wondering whether similar issues have been observed before. Also, what is the maximum Target --> Host throughput one can expect?
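For what it's worth, my own back-of-the-envelope estimate (assuming each sample is an 8-byte double) is 50 channels x 1000 samples/s x 8 bytes, so roughly 400 kB/s, which seems well below what the Ethernet link between the target and the host should handle. So I would not expect raw bandwidth to be the limit, but I may be missing overhead in how the Gateway streams and logs the data.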
Thanks,
Laurent