
Fast detection of rising AND falling digital edges - change detection event missing events

Hi everyone,

I have searched the forum and it seems that several people have trouble detecting fast signal changes/edges with the change detection event method, but I have not found an answer to my problem, so I am posting my specific case. Maybe someone can help or suggest a better approach.

I am operating a laser scanning microscope with a piezo stage. Say I want to raster-scan an area of 400x400 pixels and record fluorescence at each pixel for 100 µs (the pixel dwell time). To synchronize the data acquisition software (LabVIEW 2009), the controller for the piezo stage is set up so that it outputs a HIGH TTL level on every second pixel. The controller therefore produces the digital pattern 1 0 1 0 1 0 ... for 400x400 = 160000 pixels. The duration of each HIGH/LOW state equals the pixel dwell time, so the controller holds each 1 or 0 for 100 µs: the first pixel is 100 µs HIGH, the next is 100 µs LOW, and so on for 160000 pixels.

After setting up the controller I wanted to check whether I can actually read all edges without data loss. I basically copied the example shipped with LabVIEW that explains the change detection event, in order to count all falling AND rising edges (every edge signals a new pixel, as explained above). When I run the program, however, it does not count 160000 events but about 158200 ± a few events. The longer the pixel dwell time, the fewer events are missed, but the data loss is still far too high.

How well is this method (the change detection event) suited for detecting digital TTL edges that occur every 100 µs? Are there better concepts to implement this? Eventually I do not only want to count the pixels but to read two counters every time the event fires and put those counter values into an array (the image array with the fluorescence signal in counts), so there are additional calculations to do during the 100 µs pixel dwell time. Therefore I need a robust and fast method to tell me when a new pixel is to be read. Any suggestions? I thought the change detection would perform well, since it is advertised as causing no CPU load...

I am using LabVIEW 2009 on Windows XP with a PCIe-6259.

Thanks for your help,
Christian
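
Since a LabVIEW block diagram can't be posted as text, here is a minimal sketch of one possible hardware-timed alternative, written against the same DAQmx driver via the Python nidaqmx wrapper. The idea is to avoid per-edge software events entirely: a buffered change-detection task latches a sample on every rising AND falling edge of the pixel clock, and its exported ChangeDetectionEvent signal clocks a buffered edge-counting task, so the counter value is stored in hardware at every pixel boundary. The device name "Dev1", the pixel clock on port0/line0, the detector pulses on PFI8, and the availability of the /Dev1/ChangeDetectionEvent terminal on the 6259 are all assumptions, not tested settings.

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

PIXELS = 400 * 400  # 160000 samples, one per pixel edge

with nidaqmx.Task() as di_task, nidaqmx.Task() as ctr_task:
    # Buffered change-detection task on the pixel-clock line: every rising or
    # falling edge latches a sample in hardware and pulses the (assumed)
    # /Dev1/ChangeDetectionEvent terminal.
    di_task.di_channels.add_di_chan("Dev1/port0/line0")
    di_task.timing.cfg_change_detection_timing(
        rising_edge_chan="Dev1/port0/line0",
        falling_edge_chan="Dev1/port0/line0",
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=PIXELS,
    )

    # Buffered edge counting of the detector pulses, sampled once per pixel
    # edge by the change-detection event (no software event handling involved).
    ci_chan = ctr_task.ci_channels.add_ci_count_edges_chan("Dev1/ctr0")
    ci_chan.ci_count_edges_term = "/Dev1/PFI8"
    ctr_task.timing.cfg_samp_clk_timing(
        rate=10000.0,  # expected external clock rate (1 edge / 100 us), for buffer sizing
        source="/Dev1/ChangeDetectionEvent",
        active_edge=Edge.RISING,
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=PIXELS,
    )

    ctr_task.start()  # counter arms and waits for its external sample clock
    di_task.start()

    counts = ctr_task.read(number_of_samples_per_channel=PIXELS,
                           timeout=PIXELS * 100e-6 + 10.0)
    # Per-pixel photon counts are the differences between successive samples.
    photons = [b - a for a, b in zip(counts, counts[1:])]
```

The same structure should be reproducible in LabVIEW with the DAQmx Timing VI set to Change Detection for the DI task and to Sample Clock (with the change detection event as source) for the counter task.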

Message 1 of 4

Hi Christian,

 

just a question: why aren't you using two counters, one to count the rising edges and one to count the falling edges?
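
For illustration, a minimal sketch of that two-counter idea in text form (Python nidaqmx rather than LabVIEW; "Dev1" and PFI0 as the pixel-clock terminal are assumptions): two on-demand edge-counting tasks on the same terminal, one armed for rising and one for falling edges.

```python
import nidaqmx
from nidaqmx.constants import Edge

rising = nidaqmx.Task()
falling = nidaqmx.Task()

# Both counters watch the same (assumed) pixel-clock terminal.
r_chan = rising.ci_channels.add_ci_count_edges_chan("Dev1/ctr0", edge=Edge.RISING)
r_chan.ci_count_edges_term = "/Dev1/PFI0"

f_chan = falling.ci_channels.add_ci_count_edges_chan("Dev1/ctr1", edge=Edge.FALLING)
f_chan.ci_count_edges_term = "/Dev1/PFI0"

rising.start()
falling.start()
# ... run the scan ...
total_edges = rising.read() + falling.read()  # should be 160000 after a full frame
rising.close()
falling.close()
```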

 

Kind regards

Carsten

 

Message 2 of 4

Hi Carsten,

 

my two counters are already in use by my two detectors, which is why I need some other means of dealing with this. In the meantime I am no longer synchronizing at every pixel but only at the first pixel of each new line (in the 2D raster scan). I am in the process of implementing this; if it works, I will write a follow-up.

 

Thanks for your help,

 

Christian

Message 3 of 4

Hi everyone,

 

after reading around on the forum, I started a new thread discussing ways of implementing a retriggerable AI measurement without the counters that are commonly used for this.

 

Here is the link: Click here to go to new thread
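
For completeness, a minimal sketch of what a counter-free retriggerable AI acquisition can look like (again in Python nidaqmx for readability; "Dev1", ai0 and PFI0 as the line-start trigger are placeholders). Note that a directly retriggerable AI start trigger is a hardware feature: newer X Series boards support it, while an M Series board such as the 6259 generally relies on the commonly used counter-based workaround.

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

SAMPLES_PER_LINE = 400   # one sample per pixel along a scan line
LINES = 400

with nidaqmx.Task() as ai_task:
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    ai_task.timing.cfg_samp_clk_timing(
        rate=10000.0,                      # 100 us per sample
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=SAMPLES_PER_LINE,
    )
    # Re-arm the finite acquisition on every line-start pulse.
    ai_task.triggers.start_trigger.cfg_dig_edge_start_trig(
        "/Dev1/PFI0", trigger_edge=Edge.RISING)
    ai_task.triggers.start_trigger.retriggerable = True

    ai_task.start()
    for _ in range(LINES):                 # one read per scanned line
        line = ai_task.read(number_of_samples_per_channel=SAMPLES_PER_LINE)
```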

 

Cheers,

 

Christian

Message 4 of 4