Data Transfer Rate

Hi everyone,

I have an image that I need to acquire, analyze, and present within a few microseconds.

I need to know the highest data transfer rate of an NI DAQ for acquiring and displaying an image.

 

'' A professional is someone who can do his best work when he doesn't feel like it''...........
Message 1 of 6

We need more information.

 

  1. How large is the image?
  2. What type of camera or device is taking it?
  3. What interface are you using to transfer the image?
  4. Why do you need to acquire, analyze, and present in microseconds (your eye cannot see things faster than about 60Hz / 17ms)?
  5. What operating system are you using?

However, up front, you need to know that taking an image, analyzing it, and showing it on a video screen will probably take milliseconds just due to video screen technology. Yes, you can get high speed cameras and video screens, but they are very expensive.

Message 2 of 6

That question doesn't make sense. Have you ever thought about how much information a human eye can capture within a SECOND?

Trust me, it is far less than 100,000 updates; it's even less than 1,000.

So why do you even want to try updating at that rate?

 

Make it a rule: visual updates make sense up to about 50 per SECOND, not more.

 

This does not, however, answer your question about the transfer rate. What hardware are you looking at? PCI or PCI Express?

What kind of camera are you going to use? What is its resolution/color depth? How many images per second can it acquire? Or is it a line camera?

 

Norbert
----------------------------------------------------------------------------------------------------
CEO: What exactly is stopping us from doing this?
Expert: Geometry
Marketing Manager: Just ignore it.
Message 3 of 6

Image size - 1024 x 768

Camera - Basler acA2500-14gm

Interface - Ethernet (GigE)

We need to acquire the image and check it for faults, adjusting brightness/contrast programmatically; the whole process has to complete in microseconds.

OS - Windows 7 Professional (32-bit), CPU G2020 @ 2.90 GHz.

 

We also have an ADC with some sampling rate, and I need to acquire and plot its data simultaneously; will a slow data transfer through buffers cause the PC to hang?

I need to know the data transfer rate.

'' A professional is someone who can do his best work when he doesn't feel like it''...........
Message 4 of 6

From Basler's Web site:  The Basler acA2500-14gm GigE camera with the Aptina MT9P031 CMOS sensor delivers 14 frames per second at 5 MP resolution.

 

Just to "take a picture" takes about 70 milliseconds, which leaves time for a whole lot of image processing in LabVIEW!  Even if you could process the data infinitely fast, you still need an image to process (which takes ~70 ms to acquire).  You should certainly have enough time to analyze one image and be ready (possibly with updated parameters for processing the next image) before the next image arrives.
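To make the frame budget above concrete, the per-frame period follows directly from the camera's rated frame rate (14 fps is from Basler's spec; the snippet is just the arithmetic):

```python
# Frame period implied by the Basler acA2500-14gm's rated 14 fps
fps = 14
frame_period_ms = 1000 / fps  # time between consecutive frames
print(f"{frame_period_ms:.1f} ms between frames")  # ~71.4 ms
```

So even before any protocol overhead, a new frame can only arrive every ~71 ms; microsecond-scale end-to-end latency is physically impossible with this camera.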

 

BS

Message 5 of 6

First, what is the bit depth of your image? The camera offers quite a few options. If you are interested in speed, I would recommend the 8-bit one. If you are interested in bit depth resolution, one of the 12-bit ones.

 

Your camera is connected via GigE Ethernet, so it is limited to gigabit-per-second speeds. Doing the numbers, your theoretical best-case transfer time is:

 

(1024 x 768) pixels x 8 bits/pixel = 6,291,456 bits

6,291,456 bits / 1 Gbit/s = 6.3 ms transfer time

 

If you go to packed 12-bit, this increases to 9.4 ms; if you go to unpacked 12-bit, 12.6 ms (the 12 bits are put into two 8-bit bytes, so 16 bits travel per pixel). Note that these numbers are theoretical and a host of things could slow them down.
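The same calculation for all three bit depths can be run as a quick script. It repeats the hand arithmetic above against the raw 1 Gbit/s line rate; real GigE Vision throughput will be lower due to Ethernet and protocol overhead:

```python
# Theoretical GigE transfer time for one 1024 x 768 frame at several bit depths.
# Ignores all Ethernet/GigE Vision protocol overhead, so these are best cases.
LINK_BPS = 1e9  # raw GigE line rate, bits per second
pixels = 1024 * 768

for name, bits_per_pixel in [("8-bit", 8), ("packed 12-bit", 12), ("12-bit in 16", 16)]:
    transfer_ms = pixels * bits_per_pixel / LINK_BPS * 1000
    print(f"{name}: {transfer_ms:.1f} ms")
# 8-bit: 6.3 ms, packed 12-bit: 9.4 ms, 12-bit in 16: 12.6 ms
```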

 

As mentioned above, your maximum frame rate is 14 Hz, or about 71 ms per frame, so you have a lot of time to run algorithms on the images as they come in. How much time that takes will depend on how much you need to do and how efficiently you write your code.

 

Synchronizing an external DAQ board to the acquisition should not be too difficult, provided you use the hardware triggers of your camera and the DAQ board. Do not depend upon software to synchronize your devices. It will not work (unless you are using an FPGA). How much data are you taking with the DAQ board? At what sample rate and bit depth? What DAQ board are you using?

 

Note that LabVIEW makes it very easy to run acquire, analyze, and present stages in separate threads. Acquire your data (images and DAQ) in separate loops. Send the data via queues to processing loops. Send the processed data to another loop via a queue for display. This lets the operating system schedule the processing whenever time is available. The separate loops can (and probably should) be in subVIs.
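The loop structure described above is LabVIEW's producer/consumer pattern. As a language-neutral sketch, here is the same acquire → process → display pipeline in Python; the camera and analysis calls are placeholder stubs, not a real Basler or IMAQ API:

```python
import queue
import threading

raw_q = queue.Queue(maxsize=8)     # acquire -> process
result_q = queue.Queue(maxsize=8)  # process -> display
STOP = object()                    # sentinel that shuts the pipeline down
shown = []                         # what the display loop presented

def acquire():
    # Stand-in producer: a real app would grab camera frames here.
    for n in range(5):
        raw_q.put(f"frame-{n}")
    raw_q.put(STOP)

def process():
    # Stand-in analysis: adjust brightness/contrast, look for faults, etc.
    while (frame := raw_q.get()) is not STOP:
        result_q.put(f"analyzed({frame})")
    result_q.put(STOP)

def display():
    # Only this consumer touches the screen (the "front panel" in LabVIEW terms).
    while (result := result_q.get()) is not STOP:
        shown.append(result)

threads = [threading.Thread(target=f) for f in (acquire, process, display)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(shown)
```

The bounded queues give you backpressure: if analysis falls behind, the acquire loop blocks instead of exhausting memory, which is the same role LabVIEW's queue size limit plays.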

 

Let us know if you need more help.

Message 6 of 6