Hi, I have two cards: a PCI-6032E (16-bit) and a PCI-MIO-16E-1 (12-bit). The cards are synchronized over a RTSI cable. I am acquiring three channels on the 12-bit card (at 3840 samples/sec) and six channels on the 16-bit card (at 15360 samples/sec). The problem I am having is that the 12-bit card's buffer fills (half-ready flag) faster than the 16-bit card's. I checked the buffer sizes of both cards and they are OK. The timebase for both cards is the 1 MHz clock. When I slightly reduce the total sampling rate of the 12-bit card (computed as (number of channels + 1, one extra to be on the safe side) * sampling rate, minus a small amount), it runs fine. Likewise, when I double the sampling rate on all channels of the 12-bit card, i.e. to 7680 samples/sec, they run fine. Can you suggest a solution to this problem?
If you are using DAQ_Rate to calculate your sample interval and your scan interval, you are more than likely requesting an unsupported timebase on both of your E-Series boards. Since you mention that the timebase for both boards is 1 MHz, this is almost certainly what is happening. DAQ_Rate has no way of knowing what kind of device it is calculating these parameters for, so it returns whatever parameters make the most sense according to its algorithm. E-Series boards only have two possible timebases, 100 kHz and 20 MHz, yet DAQ_Rate suggested 1 MHz. Unfortunately, you have no way of knowing what values SCAN_Start coerces these to. You are much better off calculating the sample and scan parameters manually and passing them directly to SCAN_Start.