I have a PCIe-7852 and am trying to figure out the precise timing associated with the analog input channel. On other boards, like my PCI-6259, I am able to externalize something called AI_Convert and look on my oscilloscope to see the signal being sampled relative to AI_Convert.
My question has to do with the timing of the ADC's sample and hold, not with configuring the sampling frequency.
I am looking for a way of telling exactly when the ADC clamps the analog signal relative to the DIO outputs that are controlling the sensor I am looking at. As I mentioned, on the M-series I can just send AI_Convert to an output and look at its timing on a scope.
The spec for the PCIe-7852 says that the AI is rated at 750 kHz. So, when I do an analog-in measurement while twiddling DIO control signals to the sensor using the 40 MHz clock, I need to understand when the S&H on the signal occurred.
R-Series cards work a little differently than you are used to. Typically, these events are based on "ticks" as opposed to actual time measurements. This is because the user has control over the clock rate, and the card simply requires a number of tick iterations, not amounts of time (some tasks can be faster or slower).
As we have said, the clock rate on your PCIe-7852R card is 40 MHz. For the ADC, the FPGA requires 3 ticks. Therefore, running your clock at 40 MHz, it will convert in 75 nanoseconds (3 ticks × 25 ns). However, the maximum sample rate for this card is 750 kS/s, which translates to about 1,333 nanoseconds per sample. The reason for the difference has to do with settling, conversions, buffering, and all the other requirements associated with taking an analog measurement.
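The arithmetic above can be sketched in a few lines. This is just a check of the numbers stated in this thread (40 MHz clock, 3 ticks per conversion, 750 kS/s rated maximum), not anything read from the hardware:

```python
# Timing arithmetic for the PCIe-7852R analog input, per the figures above.
CLOCK_HZ = 40_000_000    # FPGA default clock rate
ADC_TICKS = 3            # ticks the FPGA needs for one ADC conversion
MAX_RATE_SPS = 750_000   # rated maximum AI sample rate

tick_ns = 1e9 / CLOCK_HZ           # 25 ns per tick at 40 MHz
convert_ns = ADC_TICKS * tick_ns   # 3 ticks -> 75 ns per conversion
period_ns = 1e9 / MAX_RATE_SPS     # ~1333 ns minimum spacing at 750 kS/s

print(tick_ns, convert_ns, round(period_ns))  # -> 25.0 75.0 1333
```

Note that the 75 ns conversion is much shorter than the ~1333 ns sample period; the remainder is settling, buffering, and the other overhead mentioned above.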
Two methods of viewing this clock with your R-Series are shown below. The top example utilizes the actual clock signal, while the bottom one generates a signal in parallel. Because this is an FPGA, they run exactly the same.
Does this make sense?
So, for a Single-Cycle Timed Loop running at 40 MHz, concurrently reading an analog input channel and writing a DIO:
Tick 0 (T=0): Set DIO0 low and read AI0
Tick 1 (T+25 ns): DIO is still low, and AI0 is still converting
Tick 2 (T+50 ns): DIO is still low, and AI0 is still converting
Tick 3 (T+75 ns): DIO is still low, and AI0 is done converting
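The tick-by-tick sequence above can be laid out programmatically. This is only a sketch of the timeline as I understand it (25 ns per tick, a 3-tick conversion started at tick 0), not a statement about when the S&H actually happens inside those ticks:

```python
# Sketch of the single-cycle timed loop timeline described above.
TICK_NS = 25        # one tick at the 40 MHz default clock
CONVERT_TICKS = 3   # ticks until the ADC result is available

timeline = []
for tick in range(4):
    t = tick * TICK_NS
    state = "still converting" if tick < CONVERT_TICKS else "done converting"
    timeline.append(f"Tick {tick} (T+{t} ns): DIO0 low, AI0 {state}")

print("\n".join(timeline))
```

The open question is where inside ticks 0..2 the sample and hold actually clamps the input.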
What I need to know is, relative to DIO going low, when does the ADC do its sample & hold, and how long does the signal need to be stable? Is the S&H occurring concurrently with the falling edge of DIO, or a tick later?
I understand that at 750 kHz, I can only read that analog channel once every 1/750 kHz = 1333 ns, but my question has to do with the fine timing within those 3 ticks when the conversion is occurring.
I am controlling a detector using various FPGA-controlled DIO signals from the card, and the state of the detector's output depends on those signals. When I toggle the last control DIO line in the control sequence and issue an FPGA I/O Item read of AI0, there is no way for me to know when the actual sample and hold occurred (I am aware that a successive-approximation ADC then needs some amount of time to process the result after the input has been sampled). So I know when the DIO line was asserted, as I can see that on the oscilloscope, but I have no way of knowing when the S&H actually occurred.
We seem to continue to be on different pages regarding my question, and I apologize if I am not being clear.
Any information about the ADC and its timing relative to the FPGA I/O Item read being executed would be appreciated.