07-10-2009 05:22 PM
Okay, here's an update on where I am with the sample timing. Using the maximum time setting and a fixed number of samples, we've determined that there are 1696 ms between samples. I figured this out by requesting 1000 samples and "closing in" on 1696 ms as the smallest maximum time that doesn't give me an error.
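Incidentally, the "closing in" described above is effectively a binary search on the timeout. A minimal Python sketch of that idea, where `acquire_ok` is a hypothetical stand-in for "try one acquisition with this maximum time and report whether it succeeded" (function and parameter names are mine, not part of any driver):

```python
def smallest_passing_timeout(acquire_ok, lo_ms=0, hi_ms=5000):
    """Binary-search the smallest max time (ms) at which the
    acquisition does not raise a timeout error.

    Assumes acquire_ok(t) is monotone: once a timeout t passes,
    every larger timeout also passes.
    """
    while lo_ms < hi_ms:
        mid = (lo_ms + hi_ms) // 2
        if acquire_ok(mid):
            hi_ms = mid          # mid works; try something smaller
        else:
            lo_ms = mid + 1      # mid fails; need more time
    return lo_ms

# Stand-in for the real meter: pretend 1696 ms is the true threshold.
print(smallest_passing_timeout(lambda t: t >= 1696))  # -> 1696
```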
Is it safe to assume that the multimeter takes readings at equal time intervals (i.e., the first sample at t = 1696 ms, the second at t = 3392 ms, the third at t = 5088 ms, and so on)? It seems intuitive that this would be the case, but I just want to make sure. If so, I guess I don't have to worry about a specific timestamp for each measurement.
One more question: in the "multiple points" setting under the Read VI, does the multimeter take and store all the data, which is then sent to LabVIEW?
Thanks so much for all of your help.
07-11-2009 07:01 PM
Could you post the edited Multiple Points VI? If you edited it correctly, the meter would take a sample every 1 ms, and if you requested 1000 samples, you would get all of them every time you read the meter. There would be some time required before you could request another 1000 samples, but your VI is not written to do that very efficiently.
The Get Date/Time function would be used to give a timestamp to when you start each acquisition. The timestamp for each reading would be the start time plus the interval.
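The arithmetic behind that suggestion, sketched in Python (the function name is mine; in LabVIEW, Get Date/Time In Seconds would supply the start timestamp and the rest is just addition). Note this only holds under the assumption that the samples really are equally spaced:

```python
from datetime import datetime, timedelta

def stamp_readings(readings, start, interval_ms):
    """Pair each reading with start + i * interval, assuming the
    meter samples at equal intervals."""
    step = timedelta(milliseconds=interval_ms)
    return [(start + i * step, r) for i, r in enumerate(readings)]

# Hypothetical start time and readings, using the 1696 ms interval
# reported earlier in the thread.
t0 = datetime(2009, 7, 11, 19, 0, 0)
stamped = stamp_readings([1.01, 1.02, 0.99], t0, 1696)
print(stamped[2][0])  # -> 2009-07-11 19:00:03.392000
```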
07-13-2009 10:44 AM
Here is the modified Read Multiple Points VI.
thanks
07-14-2009 04:48 PM
I tried wiring the get date/time function to the build array function, but I can't get it to work. I also talked to an Agilent applications engineer who told me that:
a. The 34401A does not have an internal clock, so a time stamp is not stored with each measurement
b. The samples are not taken at equal time intervals; they can occur at any point in the sample interval
c. The datalogger that they sell can only take 300 readings per second (not enough for us)
Another strange thing happened when I was playing around with the VI: I set the sample interval to 0.01 seconds (10 ms), the number of samples to 100, and the maximum time to 1000 ms. This, as I expected, worked. So I shortened the maximum time to see at what point I would get the "maximum time exceeded" error. First I shortened it to 990 ms. No error. Then 900 ms. No error. Then 500 ms. No error. I finally shortened it all the way to 143 ms with no error. I don't understand how the multimeter can take 100 samples in 143 ms with a sample interval of 10 ms.
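To spell out the arithmetic that makes this confusing (a plain-Python check, using only the numbers quoted above):

```python
samples = 100
interval_ms = 10

# If the meter honored the 10 ms sample interval, 100 samples
# would need at least this long end to end:
expected_ms = samples * interval_ms
print(expected_ms)  # -> 1000

# Yet a maximum time of only 143 ms succeeded, which is possible
# only if the interval setting is not actually taking effect.
observed_max_ms = 143
print(observed_max_ms < expected_ms)  # -> True
```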
thanks.
07-15-2009 03:47 PM
I hate to do this, but can someone give me some insight on some possible sources of the problem?
Thanks in advance
07-16-2009 04:33 PM
The manual for the 34401A does not mention this SAMP:TIM property. I thought that was odd until I looked further into the drivers. If you look into the 34401 case, it does state that this property is not supported for the 34401 and is only used for the other models supported by this driver.
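That would explain the behavior above: the interval setting is silently a no-op on this model. A hypothetical sketch of how a multi-model driver can gate a property like this (the table and function names are my invention, not the actual NI driver's internals, and the list of supporting models is only a guess):

```python
# Hypothetical per-model capability table; the driver comment only
# says the 34401A is NOT supported, so other entries are guesses.
SUPPORTS_SAMPLE_TIMER = {"34401A": False, "34410A": True, "34411A": True}

def set_sample_time(model, seconds):
    """Build the SAMP:TIM command, refusing models that lack it."""
    if not SUPPORTS_SAMPLE_TIMER.get(model, False):
        raise ValueError(f"SAMP:TIM is not supported on the {model}")
    return f"SAMP:TIM {seconds}"

print(set_sample_time("34410A", 0.01))  # -> SAMP:TIM 0.01
```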
07-17-2009 10:56 AM