LabVIEW


SCB-68 counter

I am using an SCB-68 connected to a PCI-6221.
I need an application where the 16 analog inputs are used to count events. Every analog input has two values (1 or 0), like a TTL signal. The problem is that I am using the "Elapsed Time" VI to measure the duration of the events, and the result is neither accurate nor constant. I've read that counters should be used, but that is impossible because the SCB-68 only exposes 2 counters and I require 16.
Any suggestions?
I've thought of building the counters in hardware and just reading the result with LabVIEW, although then the program and the SCB-68 card would be doing almost nothing.
 
Miguel Angel 
Message 1 of 10
Miguel Angel,

If you use hardware timing on the analog input, you can get the duration of each pulse from the interval between samples. This will give more consistent and accurate results than the "Elapsed Time" VIs. You will need to code your own logic for detecting the beginning and end of each event, and you will need to sample fast enough to capture the edges with enough resolution that you do not miss any.

Your analog data for one channel would look something like this: 0.1, 0.2, 0.1, 0.0, 1.2, 1.1, 1.3, 1.1, 1.1, 0.5, 0.1, 0.2 ... You would need to define a threshold (probably around 0.6 for the sample data) and calculate the timing from the sample rate and the array indices where the data crosses from below the threshold to above it.
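That threshold logic can be sketched in ordinary code. Here is a minimal Python version, purely as an illustration of the algorithm (the thread itself is about LabVIEW); the 0.6 threshold and 10 kS/s rate are the example values above:

```python
def pulse_durations(samples, threshold, sample_rate):
    """Return the duration in seconds of each pulse (run of samples above threshold)."""
    durations = []
    start = None
    for i, value in enumerate(samples):
        if value > threshold and start is None:
            start = i                                     # rising edge found
        elif value <= threshold and start is not None:
            durations.append((i - start) / sample_rate)   # falling edge: close the pulse
            start = None
    return durations

# The example channel data from above: one pulse of 5 samples at 10 kS/s.
data = [0.1, 0.2, 0.1, 0.0, 1.2, 1.1, 1.3, 1.1, 1.1, 0.5, 0.1, 0.2]
print(pulse_durations(data, 0.6, 10000.0))  # -> [0.0005], i.e. 0.5 ms
```

In LabVIEW terms this is just a comparison against the threshold plus Boolean edge detection inside a loop, done once per channel.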

Lynn
Message 2 of 10
Thanks Lynn,
 
I have a few questions.
I do not really understand the concept of hardware timing. Does it mean that I have to connect a TTL clock signal to PFI0 (for example), or that I have to create my own counters?
In addition, I have found many examples about counters but none related to using 16 analog input channels. Do you have a good example of using 2 or more analog input channels to measure the duration of a pulse?
I'm new to LabVIEW, and any good suggestion would be helpful.
 
thanks
 
Miguel Angel
Message 3 of 10
Miguel Angel,

The 6221 has an internal sample clock that allows sampling rates up to 250,000 samples per second (aggregate). This internal timing source is the one to use. The 16 channels share a common analog-to-digital (A/D) converter, so you can sample each channel roughly 15,000 times per second when measuring all 16 channels; the exact value depends on how long the channel switching takes.

If your events occur faster than about 5000 times per second, you will not be able to get good results with this measurement device. Is this within the parameters of the data you wish to measure?

You only need one analog channel per signal to measure the timing. To simplify the example, let's say you are sampling 16 channels at 10,000 samples per second each. After one second of acquisition you will have 160,000 data points, 10,000 for each of the 16 channels. On any one channel, each sample is acquired 100 microseconds after the previous sample on that channel. This timing is accurate to 50 ppm according to the 6221 data sheet.
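As a quick sanity check on those numbers (the 250 kS/s aggregate rate is the one quoted above from the 6221 spec):

```python
aggregate_rate = 250_000                       # PCI-6221 max aggregate rate, S/s
channels = 16                                  # all 16 analog inputs scanned
per_channel_max = aggregate_rate / channels    # theoretical per-channel maximum

per_channel_rate = 10_000                      # the example rate used above, S/s
dt = 1 / per_channel_rate                      # time between samples on one channel
points_per_second = channels * per_channel_rate

print(per_channel_max, dt, points_per_second)  # -> 15625.0 0.0001 160000
```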

What version of LabVIEW and what version of the DAQ software are you using? There are some good examples of data acquisition which come with LV. These are a good place to start.

Lynn
Message 4 of 10
Lynn,
 
Thanks for your response and advice.
My events take between 0.05 and 0.1 seconds, so the sampling rate available per channel is enough.
My LabVIEW version is 7.1 and the DAQ software is 7.2.0f1, as reported by Measurement & Automation Explorer (ver. 4.0.0.3010).
 
Finally, as I understand from your comments, what I have to do is define a sampling rate per channel (around 10,000 S/s), which means each sample takes 0.0001 seconds; my program should count the samples where the analog value is over a certain threshold, and from that count and the sample interval I can compute the time. If that is correct, the last thing to do is work through some examples. Does the LabVIEW software come with good ones to start from?
 
Thanks
 
Miguel Angel
 
Message 5 of 10
Miguel Angel,

Yes. Your last paragraph indicates that you have the concept I am suggesting. Since your events are slow (50 to 100 ms), you could use a slower sampling rate. Sampling at 1,000 samples per second should be fast enough and would give a dt (time between samples) of 1 millisecond.
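In code form, that summary amounts to counting the samples above the threshold and multiplying by dt. A Python sketch with invented channel data (the 0.6 threshold is carried over from the earlier example):

```python
sample_rate = 1000.0                 # 1 kS/s, so dt is 1 ms
dt = 1.0 / sample_rate
threshold = 0.6                      # assumed, as in the earlier example

samples = [0.1, 0.9, 1.1, 1.0, 0.2]  # made-up data: 3 samples above threshold
event_time = sum(1 for v in samples if v > threshold) * dt
print(event_time)                    # 3 samples * 1 ms = 3 ms
```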

The first place to look is the data acquisition examples. I am not sure what was available in version 7.1, but look for something that acquires multiple channels continuously. You will need to make your own VI to compare the data to the threshold and count the samples.

Lynn
Message 6 of 10

Lynn,

I have done what you suggested. Nevertheless, I have another problem. I am using 16 analog input channels on my SCB-68 and I would like to run the acquisition for 2 minutes, reading 500 samples per iteration at a 10 kS/s rate. I used a for loop with 2400 iterations (2400 = 120 s / 0.05 s), but it takes more than 2 minutes. I timed the loop and found that each iteration takes around 0.065 s. I think it may be due to a high input impedance, the time consumed by some VIs I have inside the for loop, or some other cause. Do you have any idea? Finally, it is important to note that the timing seems constant; that is, the total run time is the same every run.

thanks

miguel

Message 7 of 10
Miguel,

Without seeing your code, I can only guess at the problem. What are you doing inside the loop besides reading the DAQ?

The best performance comes when you do all the initialization before the loop and all clearing of tasks after the loop. If you are doing a lot of data processing on the data as it comes in, it is better to move the processing to a separate parallel loop.
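The parallel-loop idea here is LabVIEW's classic producer/consumer pattern: the acquisition loop only reads the DAQ and hands blocks of samples to a second loop through a queue. A rough Python analogue (the block contents are invented stand-ins for real samples):

```python
import queue
import threading

blocks = queue.Queue()

def acquisition_loop():
    """Producer: stands in for the loop that only reads the DAQ."""
    for i in range(5):
        blocks.put([0.1 * i] * 4)     # a fake block of 4 samples
    blocks.put(None)                  # sentinel: acquisition finished

def processing_loop(results):
    """Consumer: stands in for the loop doing the filtering/thresholding."""
    while True:
        block = blocks.get()
        if block is None:
            break
        results.append(max(block))    # placeholder for the real processing

results = []
producer = threading.Thread(target=acquisition_loop)
producer.start()
processing_loop(results)
producer.join()
print(len(results))                   # -> 5 blocks processed
```

In LabVIEW this would be two while loops on the same diagram connected with the Queue Operations functions (Obtain Queue, Enqueue Element, Dequeue Element).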

Please post your code so that we can offer more specific suggestions.

Lynn
Message 8 of 10
Lynn,
 
I have attached the block diagram of my program. It shows only the DAQ acquisition VI and a subVI containing just a filter and a comparator. The question again: is it possible to measure 2 minutes based on the number of iterations of the for loop? In theory each iteration should take 0.05 seconds, but it takes around 0.065 seconds. I checked it with a stopwatch and it seems repeatable; I just want to know whether it is constant or not.
Thanks
Miguel
Message 9 of 10
Miguel,

It is hard to tell from an image, but here are a few things to consider:

1. The DAQ Assistant is not considered to be optimized for high performance. It is set up for ease of use. It is often better to use the low-level DAQ VIs. Do the configuration and initialization outside the loop and clear the task after the loop finishes. Just read the data inside the loop.

2. The filters may take longer than you think; it depends on the complexity of the filter and the amount of data. Is the filter VI reentrant? If not, you may have issues with the filter transients. If you have the VI Profiler, use it to see where the timing bottleneck is.

3. Local variables are slower than wires. I do not see the v1.2, c1.2, ... indicators or controls in the image posted. Setting up a shift register to pass the data from iteration to iteration or an Action Engine to pass data from a parallel loop may be faster.

4. Do you need to update the indicators 20 times per second? The user cannot see updates that fast, and it looks like that data is just incrementing. If you only write to the indicators 2-3 times per second (or, better yet, in a parallel loop), this might speed things up a bit. Panel updates require OS action, and your program has no control over how the OS handles that.
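Point 4 can be as simple as gating the indicator writes on the iteration count. A Python sketch of the idea (the 2400-iteration, 50 ms loop is the one from this thread; the update interval is an assumption):

```python
ITERATIONS_PER_UPDATE = 10    # at ~20 iterations/s this is ~2 panel updates/s
updates = 0

for i in range(2400):         # the 2-minute acquisition loop
    # ... read and process one block of samples here ...
    if i % ITERATIONS_PER_UPDATE == 0:
        updates += 1          # only here would the front-panel indicators be written

print(updates)                # -> 240 writes instead of 2400
```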

Lynn
Message 10 of 10