I am reading two oscilloscope traces from an Agilent scope over an NI GPIB-USB-HS adapter, using Measurement Studio and VS 2008 C#. On the scope trigger, my code tells the scope to dump two 10,000-point, 2-byte traces to scope memory, and then my code grabs the two traces via GPIB. This all happens about 3 times per second, for a total data transfer rate of 10000*2*2*3 = 120 kB/s. The GPIB-USB-HS spec sheet at http://sine.ni.com/nips/cds/view/p/lang/en/nid/201586 quotes a standard data transfer rate of 1.8 MB/s. Does anyone have any thoughts as to why our data transfer is so much slower than that, or what types of things we can check? Thank you in advance.
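For reference, the arithmetic above can be checked in a couple of lines of C# (figures taken from the question; this is just a sanity check, not measured data):

```csharp
using System;

class ThroughputCheck
{
    static void Main()
    {
        // Figures from the question.
        const int pointsPerTrace = 10000;
        const int bytesPerPoint = 2;
        const int tracesPerTrigger = 2;
        const double triggersPerSecond = 3.0;

        double bytesPerSecond =
            pointsPerTrace * bytesPerPoint * tracesPerTrigger * triggersPerSecond;

        Console.WriteLine("Effective rate: {0} kB/s", bytesPerSecond / 1000.0);
        Console.WriteLine("Fraction of 1.8 MB/s spec: {0:P1}", bytesPerSecond / 1.8e6);
    }
}
```

So the sustained rate works out to 120 kB/s, roughly 7% of the quoted 1.8 MB/s bus maximum.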
First, your math is flawed. The max GPIB transfer rate does not include the time for the scope to trigger, capture, and store the waveforms. Second, the instrument itself is usually the bottleneck. You need to read the manual and see what its maximum transfer rate is. You also need to check the particulars required to achieve that maximum rate.
If you really need high transfer rates, you probably need to look at a different architecture such as a PCI or PXI scope card.
As others have pointed out, many factors play into the actual achieved transfer rate. One thing you can do in your application is to make sure that you are reading the largest possible packets from the bus. For example, if you read your 10000 points in a single read command, you should see pretty high transfer rates, because you are minimizing the overhead of going out over USB to communicate with the hardware. On the other hand, if you perform 10000 2-byte reads, you should expect very low performance, because every one of these reads will incur all the USB overhead.
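With the NI-488.2 .NET API that ships alongside Measurement Studio, a single block read might look roughly like this. The board index, GPIB primary address, and the waveform query string are all assumptions here; check your Agilent scope's programming manual for the exact command and your actual bus configuration:

```csharp
using NationalInstruments.NI4882;

class TraceReader
{
    static void Main()
    {
        // Assumption: GPIB board 0, scope at primary address 7 -- adjust for your setup.
        Device scope = new Device(0, new Address(7));

        // Hypothetical Agilent waveform query; the real command depends on your scope model.
        scope.Write(":WAVEFORM:DATA?");

        // One 20 kB transaction (10000 points * 2 bytes), instead of 10000 2-byte reads.
        byte[] trace = scope.ReadByteArray(20000);

        scope.Dispose();
    }
}
```

The key point is the single `ReadByteArray` call sized to the whole trace: the USB round-trip overhead is paid once per transaction, so one large read amortizes it while thousands of small reads pay it every time.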
If you want a general idea of how long your reads are taking, you can view the duration of your calls in NI I/O Trace (called NI Spy in older software distributions). You can use this to isolate the portions of the application that are actually causing the bottleneck, and identify whether it is a software or an instrument issue.
Thanks very much. We are reading the 10000 2-byte points all at once, 3 times per second. We will try the I/O Trace idea to see if it can tell us anything. It may very well be dead time on the scope while we are communicating with it remotely, in which case a dedicated digitizer might solve our problem (?).