
data communication speed for sequential commands via usb

Hi, I would like to know the 'dead time' between commands sent by LabVIEW over USB.

My application requires small, similar commands to be sent roughly every microsecond.
I run LabVIEW 8.2 and have a USB-to-RS-232 adapter to connect the computer to the module. Can LabVIEW handle it?

Thanks in advance for your help!

Robert
Message 1 of 9

The problem isn't LV. The problem is going to be producing 1 usec timing with a non-deterministic operating system - and we haven't even begun talking about propagation delay through the USB interface or how long it takes to send each message... Do you perhaps mean every millisecond?

Mike...

Certified Professional Instructor
Certified LabVIEW Architect
LabVIEW Champion

"... after all, He's not a tame lion..."

For help with grief and grieving.
Message 2 of 9

I haven't seen any USB-to-RS-232 converters that support a baud rate greater than 115k, and Mike is correct to mention the problem with USB itself. I think the latency between separate writes is a millisecond or more. NI sells a PCI serial card with baud rates up to 1 Mb/s, so if your instrument supports it, that would be your best chance. You'd also have to move to a real-time OS.
Message 3 of 9


Hi

Could you please let me know how you communicate with the device through the USB interface in LabVIEW? Do you use LabVIEW VISA?

Mike

Message 4 of 9

USB adapters typically emulate a serial port, so you would use the same VISA drivers as you would for talking to any of the serial ports built into your computer.
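
For what it's worth, here is a minimal sketch of that idea in Python with PyVISA, a binding to the same VISA layer the LabVIEW VISA VIs sit on; the resource name and the command string are placeholders for whatever your adapter and instrument actually use:

    import pyvisa  # Python binding for the same VISA layer LabVIEW uses

    rm = pyvisa.ResourceManager()
    # 'ASRL3::INSTR' is a placeholder; use whichever COM port the adapter shows up as
    port = rm.open_resource("ASRL3::INSTR")
    port.baud_rate = 115200           # same settings as VISA Configure Serial Port
    port.write_termination = "\r\n"
    port.write("TRG")                 # written exactly as to a built-in serial port
    port.close()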

Mike...

Message 5 of 9

The USB adapter is standard and, as I understand it, just emulates a serial port. I plan to use the LabVIEW VISA functions to communicate. I run Windows XP on a dual-processor Intel machine.

The project is in its infancy - at this stage I would just like to know the fastest I could possibly send timed commands; that is, the shortest possible 'dead' time using LabVIEW on XP through a serial/USB port.

I'm sorry I can't be more clear!
Message 6 of 9

Well, it is possible to calculate a theoretical maximum, but even if you took the time it would probably end up giving you a useless number. There are a few things, though, that can give you a sense of scale. Assuming 10 bits per character (1 start bit, 8 data bits, and 1 stop bit), you can take the reciprocal of the data rate and multiply it by ten to determine how long it takes to transmit each character. For example, with a data rate of 115,000 bps:

1/115,000 = 8.696 usec per bit, or 86.96 usec per character
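
The same arithmetic in a few lines of Python, as a quick sanity check (the baud rates below are just examples):

    # Time to send one character, assuming 1 start + 8 data + 1 stop = 10 bits/char.
    BITS_PER_CHAR = 10

    def char_time_us(baud):
        return BITS_PER_CHAR * 1e6 / baud   # microseconds per character

    for baud in (9600, 115200, 921600):
        print(f"{baud:>7} bps: {char_time_us(baud):8.2f} usec per character")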

Already you're the better part of 2 orders of magnitude over your microsecond repetition rate and we haven't started talking about:
  • Actual message length
  • Variation in the operating system
  • Program execution time
  • Throughput delay in the USB interface
  • Turn-around time in the device you're talking to.
Of those, the big unknown is probably going to be the OS latency. In the end you'll most likely end up having to determine the maximum rate empirically and accept that whatever number you come up with only holds with some statistical probability.
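
One way to make that empirical measurement, sketched in Python with pyserial (the port name, baud rate, and test message are placeholder assumptions): time a burst of back-to-back writes and look at the spread of the gaps. Note this captures the cost of the write calls as the program sees them; OS and driver buffering mean the bytes may hit the wire later.

    import statistics
    import time
    import serial  # pyserial

    port = serial.Serial("COM3", 115200, timeout=1)  # placeholder port and baud
    gaps_us = []
    last = time.perf_counter()
    for _ in range(1000):
        port.write(b"TRG\r\n")                 # small fixed command, like the application's
        now = time.perf_counter()
        gaps_us.append((now - last) * 1e6)     # gap between successive writes, usec
        last = now
    port.close()

    print(f"min {min(gaps_us):.0f}  median {statistics.median(gaps_us):.0f}  "
          f"max {max(gaps_us):.0f} usec between writes")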

What is it exactly that you are trying to do?

Mike...

Message 7 of 9

Dear Mike,

Thanks for your useful reply. This is Jeff, Robert's supervisor, with some additional info.

We wish to simulate the response of a detector system (hardware) to a particular set of events.  Every ~ 3 milliseconds a pulse of x rays will produce some excited nuclei in a sample.  Then, during the time between these pulses we hope to see some gamma rays emitted as these excited nuclei decay.  The emission times of any such decays will be pseudo-random, but with an exponentially-decreasing rate with a characteristic time parameter.  Prior to doing this experiment, we'd like to simulate this type of event to check the system behavior.

Ideally, we'd like to produce a logic (TTL) pulse with a fixed rate to simulate the x ray pulses.  Then we'd like to have another electronic signal produced randomly, but with an exponentially-decreasing probability, after each simulated x-ray pulse.  Since we would know the set time characteristics, we would then know if the detection system electronics gave us data that we could interpret as giving the correct time characteristics.

We have many pulse generators (logic and "tail") in the lab, including one that produces pseudo-random pulses with a Gaussian distribution around the set rate.  However, this doesn't simulate a decay.  A vendor has a pulse generator that can be set to accept "software" triggers, so that it will produce a TTL pulse or pulses every time it receives a so-called "TRG" via USB from a PC.  So the question then becomes whether we can produce pseudo-random TRGs, with an exponentially-decreasing probability, from the PC and send them to the external pulse generator.

I had already calculated the ~8 microseconds/bit and have asked the vendor how many characters are required for the TRG command; my guess is at least 3.  At roughly 87 microseconds per character, this means we could not produce simulated decay pulses more often than about every 300 microseconds.  Robert mentioned 1 microsecond, but 10 microseconds was really the shortest that we would try to measure.  Since that seems infeasible, we might have to make do with 300 microseconds.  Since the time between x-ray pulses is around 3 milliseconds, we could still produce a simulated decay, just not with as short a time characteristic as what we will eventually try to measure.  It would still be a reasonable system test.
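
For the software side of that, drawing the decay delays themselves is straightforward; here is a sketch in Python (the tau and window values are assumptions, not your experiment's numbers) using the standard exponential-variate generator and discarding draws that miss the 3 ms window between x-ray pulses:

    import random

    TAU_US = 500.0      # characteristic decay time - assumed value
    WINDOW_US = 3000.0  # ~3 ms between simulated x-ray pulses

    def decay_delay_us():
        """Delay after an x-ray pulse at which to fire a simulated decay,
        or None if the exponentially distributed draw misses the window."""
        t = random.expovariate(1.0 / TAU_US)  # mean = TAU_US
        return t if t < WINDOW_US else None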

Hope that explains our aims.  I look forward to any comments.

Sincerely,
Jeff
Message 8 of 9

I see what you are trying to do. The way to approach it, though, is to control the pulse generator through GPIB. The GPIB standard specifies a trigger command that is a digital signal - like an interrupt line on a microprocessor. It can be generated much faster than a command sent serially over an RS-232 interface, certainly at the rates you are talking about. To make the GPIB connection there are interfaces that plug directly into the computer as well as USB ones, but given your timing constraints I would recommend the kind that plugs into the computer.
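
In VISA terms that trigger is the Assert Trigger operation rather than a written command string; a sketch with PyVISA (the GPIB address is a placeholder), equivalent to calling the VISA Assert Trigger function in LabVIEW:

    import pyvisa

    rm = pyvisa.ResourceManager()
    gen = rm.open_resource("GPIB0::5::INSTR")  # placeholder address of the pulse generator
    gen.assert_trigger()  # asserts GPIB GET (Group Execute Trigger) on the bus
    gen.close()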

Mike...

Message 9 of 9