11-27-2024 09:06 PM
Hello,
I have the Agilent 34401 Read Multiple Measurements.vi example,
and it appears to read as fast as possible.
Does anyone have any examples where you can set the sampling frequency?
Google AI says the maximum is 1 kHz (1 sample per millisecond),
so if I set it to 10 kHz, I assume I would get an error, or maybe it would default to the maximum it can do.
But if I want to sample every 100 ms, for example, how could I do that?
I suppose I could put a 100 ms delay in the while loop, but this doesn't seem efficient.
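For a software-timed rate like one reading per 100 ms, the usual trick is to sleep until an absolute deadline rather than inserting a fixed delay after each read, so the variable time each query takes doesn't accumulate as drift. A minimal sketch in Python (the thread is about LabVIEW; this just illustrates the scheduling idea, and the VISA address is a made-up example):

```python
import time

def deadlines(period_s, start=None):
    """Generate absolute deadlines spaced period_s apart.

    Sleeping until an absolute deadline (rather than a fixed delay
    after each read) keeps the average rate correct even when each
    read takes a variable amount of time.
    """
    t = time.monotonic() if start is None else start
    while True:
        t += period_s
        yield t

# Usage sketch (requires pyvisa and a real instrument; address is hypothetical):
# import pyvisa
# dmm = pyvisa.ResourceManager().open_resource("GPIB0::22::INSTR")
# for deadline in deadlines(0.100):          # one reading every 100 ms
#     value = float(dmm.query("READ?"))      # triggers and returns one reading
#     time.sleep(max(0.0, deadline - time.monotonic()))
```

In LabVIEW the equivalent idea is a Timed Loop or a "Wait Until Next ms Multiple" node rather than a plain Wait, for the same reason.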
Thank you
11-28-2024 01:24 AM
Hi cwhw,
@cwhw112 wrote:
if I want to sample every 100ms for example, how could I do that?
I guess you could read the manual for your device to find the answer.
Did you?
11-29-2024 07:13 PM
It looks like it can be set via remote interface operation,
but I don't know how to do this.
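"Remote interface operation" in the 34401A manual just means sending SCPI text commands over GPIB or RS-232 (in LabVIEW, via VISA Write/Read). A hedged sketch of a typical configuration sequence for a burst of readings — the range, resolution, and count values here are illustrative assumptions, not from the original post:

```python
def burst_setup(sample_count):
    """Build a typical SCPI sequence for a burst of DC-volt readings
    on a 34401A. Parameter values are illustrative."""
    return [
        "*RST",                       # reset to a known state
        "CONF:VOLT:DC 10,0.001",      # DC volts, 10 V range, 1 mV resolution
        f"SAMP:COUN {sample_count}",  # readings taken per trigger
        "TRIG:SOUR IMM",              # trigger immediately once initiated
    ]

# With pyvisa (hypothetical setup), you would then send each command and read:
# for cmd in burst_setup(10):
#     dmm.write(cmd)
# readings = dmm.query_ascii_values("READ?")
```

In LabVIEW the same strings go into VISA Write nodes, followed by a VISA Read after `READ?`.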
11-29-2024 08:54 PM
I have been searching online and not finding much of anything.
I made the code:
and ran it, and didn't get an error,
but when I ran the example program
with 10 samples, it only took a few seconds, not 100
12-05-2024 05:21 PM
Has anyone ever come across this, or have any ideas?
I am willing to try any suggestions.
12-06-2024 03:59 AM
Hi cwhw,
@cwhw112 wrote:
Has anyone ever come across this, or have any ideas?
I am willing to try any suggestions.
The command in your code image is different from the command in the manual image in the previous message.
And it seems you don't send an EndOfLine character with the command…
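Most GPIB/serial instruments will not act on a command until they see the terminator (typically a newline). PyVISA handles this automatically via its `write_termination` attribute; a raw VISA Write in LabVIEW may not. A tiny helper to illustrate the point:

```python
def terminate(cmd, eol="\n"):
    """Append the end-of-line terminator if the command lacks one.

    Instrument libraries (e.g. pyvisa's write_termination) usually do
    this for you; hand-built command strings often forget it.
    """
    return cmd if cmd.endswith(eol) else cmd + eol
```

In LabVIEW, the equivalent fix is concatenating an End of Line constant to the command string before VISA Write, or enabling the termination character in the VISA session properties.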
12-08-2024 08:09 PM
Yes, I am sure I didn't implement it correctly. I am just guessing and hoping my guesses will spark an idea in someone who is more familiar with this type of thing.
I would think this is a fundamental, commonly used feature,
but maybe the simpler answer is that the sampling rate cannot be set or determined with LabVIEW and the 34401A DMM.
Does it just take the specified number of samples at a rate that we cannot set or verify?
The manual references remote interface operation in several places but nowhere shows how to use or implement it, and my internet searches have not yielded any LabVIEW examples.
12-08-2024 08:29 PM
I put the Feedback Node in from this post
https://forums.ni.com/t5/LabVIEW/Timing-a-while-loop/td-p/3922638
into the Read Multiple Points VI from instr.lib.
For 10 samples it reports zero; for 100 it is 1.7, so shouldn't it be 0.17 for 10?
50 samples is also 1.7, and so is 2.1???
12-08-2024 08:42 PM
I tried a flat sequence (times in ms):
1 sample: 673
2 samples: 842
3 samples: 1014
5 samples: 1350
10 samples: 2194
20 samples: 3881
50 samples: 8945
100 samples: 17383
200 samples: 34262
It appears to have a setup time of about 500 ms and then a sampling frequency of around 6 Hz (1/0.16878 s per sample).
This method seems reasonably accurate for checking the frequency,
but 6 Hz is quite a way from the claimed 1000 Hz,
which can supposedly be configured. So the question would be: how do you configure the reading rate?
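The setup-time/per-sample split can be checked with a straight-line fit of the timings above (total_ms ≈ setup + per_sample × n). Using the posted numbers:

```python
# Least-squares fit of the posted flat-sequence timings:
# total_ms ~= setup + per_sample_ms * n
samples = [1, 2, 3, 5, 10, 20, 50, 100, 200]
total_ms = [673, 842, 1014, 1350, 2194, 3881, 8945, 17383, 34262]

n = len(samples)
mx = sum(samples) / n
my = sum(total_ms) / n
slope = sum((x - mx) * (y - my) for x, y in zip(samples, total_ms)) \
        / sum((x - mx) ** 2 for x in samples)
setup = my - slope * mx          # fixed overhead per run, in ms
rate_hz = 1000.0 / slope         # effective readings per second

print(f"setup ~ {setup:.0f} ms, {slope:.1f} ms/sample, ~{rate_hz:.1f} Hz")
```

The fit lands close to the eyeballed values in the post: roughly 500 ms of setup and about 169 ms per reading, i.e. just under 6 Hz.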
12-09-2024 01:01 AM
Hi cwhw,
@cwhw112 wrote:
So the question would be: how do you configure the reading rate?
By reading the manual and implementing ALL the recommendations needed to achieve this high sampling rate.
(I will not do the reading for you…)
From my experience:
All of this will be mentioned in the manual, and the programming manual will show all the commands needed to set those prerequisites…
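For a concrete picture of what "ALL the recommendations" typically means on a 34401A, the manual's fast-reading recipe boils down to a short list of SCPI settings: fixed range, shortest integration time, autozero off, display off, and buffering readings in internal memory. A hedged sketch of that list (the specific range and sample count are illustrative assumptions):

```python
# SCPI settings the 34401A manual associates with its fastest reading
# rate (about 1000 readings/s). Values here are illustrative, not tuned.
FAST_SETUP = [
    "CONF:VOLT:DC 10",    # fixed 10 V range (autoranging slows readings)
    "VOLT:DC:NPLC 0.02",  # shortest integration time (0.02 power-line cycles)
    "ZERO:AUTO OFF",      # disable autozero (roughly halves per-reading time)
    "DISP OFF",           # turn the front-panel display off
    "TRIG:SOUR IMM",      # trigger immediately once initiated
    "TRIG:DEL 0",         # no trigger delay
    "SAMP:COUN 512",      # take a burst of readings into internal memory
]

# Then send "INIT" to start the burst and query "FETC?" afterwards to
# retrieve the buffered readings in one transfer.
```

The key point is that the ~6 Hz measured above is dominated by per-`READ?` bus round-trips and default settings (NPLC 10, autozero on); the 1000 readings/s figure only applies once all of these are changed and readings are pulled from memory in bulk.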