Multifunction DAQ

Can I use AI Clock Config.vi to set an external timebase?

DAQ PCI-MIO-16XE-50 board

According to the NI website, I can use the "alternate clock rate specification" input of AI Clock Config.vi to set the scan clock using an external timebase. Does this method in fact change the DAQ card's Master Timebase?

According to NI, I can also use RTSI to change the DAQ card's Master Timebase, so that all on-board clocks will be based on the external timebase.

Do the two methods have the same result of changing the DAQ card's Master Timebase?
Message 1 of 15
Dear May Lee,

Thank you for contacting National Instruments.

To address your question, you can acquire with external clock rates using several methods. Your external scan clock can be input on any PFI pin on E Series devices; the Traditional NI-DAQ default pin is PFI7/STARTSCAN. Most DAQ devices also have a PFI pin on the I/O connector for providing your own channel clock. Also, as you mentioned, devices with a RTSI connector can get their clock from other NI DAQ devices. When configuring your device to use external clocks, wire a "0" to the Clock Config VI to disable the internal clock.

When you set the scan clock to use an external timebase, you are not replacing the Master Timebase. Rather, you are telling the PCI-MIO-16XE-50 to use the external clock rate instead. The allowable ranges for your external clock signal are still limited by the specifications on your board. Therefore, the high limit of your external clock signal should not exceed the maximum signal rate specified for your device.

Just to review, the sample (scan) and convert (channel) clocks control how the DAQ device's channels are sampled. The sample (scan) clock controls when a scan is initiated; the convert (channel) clock controls when each individual channel is sampled.
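As a rough illustration of that timing relationship (this is not NI code, and all names and numbers below are made up for the sketch):

```python
# Illustrative sketch only -- not NI code. Models when each channel is
# sampled, given a scan clock (initiates each scan) and a convert clock
# (samples the channels within a scan). All names are hypothetical.

def channel_sample_times(scan_rate_hz, n_channels, interchannel_delay_s):
    """Times (s) at which each channel is sampled during the first two scans."""
    scan_period_s = 1.0 / scan_rate_hz
    samples = []
    for scan in range(2):
        scan_start = scan * scan_period_s      # scan (sample) clock fires here
        for ch in range(n_channels):
            # convert (channel) clock samples each channel in turn
            samples.append((scan, ch, scan_start + ch * interchannel_delay_s))
    return samples

# 1 kHz scan rate, 3 channels, 10 us interchannel delay
for scan, ch, t in channel_sample_times(1000, 3, 10e-6):
    print(f"scan {scan}, channel {ch}: t = {t * 1e6:.0f} us")
```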

I believe you are trying to change your sample (scan) clock to a specific external rate using the AI Clock Config VI. When you select a scan rate, LabVIEW requests that NI-DAQ automatically select the convert (channel) clock rate for you.

Traditional NI-DAQ selects the fastest channel clock rate possible. However, to allow for adequate settling time for the amplifier and any unaccounted factors, Traditional NI-DAQ adds an extra 10 microseconds to the interchannel delay (channel clock period). Keep in mind, if the scan rate is too fast for Traditional NI-DAQ to apply the 10 microsecond delay and still sample every channel before the next scan clock, then the delay will not be added, and the signals collected may be distorted. Also, you can manually specify a specific rate for the channel clock, and if you do, the delay will not be added.
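That selection rule can be sketched as simple arithmetic. In this illustrative sketch, the 10 microsecond pad is the figure quoted above, but the board's minimum convert period and the function name are made up:

```python
# Illustrative sketch only -- not NI code. The 10 us pad is the figure
# quoted in the text; the board's minimum convert period is a made-up input.

EXTRA_DELAY_S = 10e-6  # settling-time pad Traditional NI-DAQ adds

def effective_interchannel_delay(scan_rate_hz, n_channels, min_convert_period_s):
    """Interchannel delay after the pad rule described in the text."""
    scan_period_s = 1.0 / scan_rate_hz
    padded = min_convert_period_s + EXTRA_DELAY_S
    if padded * n_channels <= scan_period_s:
        return padded              # room for the pad: extra settling margin
    return min_convert_period_s    # too fast: pad dropped, signals may distort

# 1 kHz scan, 3 channels, 5 us minimum convert period: the pad fits
print(effective_interchannel_delay(1000, 3, 5e-6))
# 60 kHz scan, 3 channels: no room before the next scan, pad dropped
print(effective_interchannel_delay(60000, 3, 5e-6))
```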

Using Traditional NI-DAQ, you can manually set your channel clock rate with the interchannel delay input of the AI Config VI, which calls the AI Clock Config sub-VI to actually configure the channel clock.

NI-DAQmx selects the slowest convert clock rate possible in order to sample all of the channels within one scan. This results in the longest possible interchannel delay and allows more settling time for the amplifier. Using DAQmx, you can manually set your convert (channel) clock rate using the DAQmx Timing property node.
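The DAQmx default described above is also just arithmetic: the slowest convert clock that still samples every channel once per scan is the scan rate times the channel count. A small sketch (illustrative only, not the DAQmx API):

```python
# Illustrative arithmetic only -- not the DAQmx API. The slowest convert
# clock that still samples every channel once per scan is
# scan_rate * n_channels, which maximizes the interchannel delay.

def daqmx_default_convert_rate(scan_rate_hz, n_channels):
    return scan_rate_hz * n_channels

rate_hz = daqmx_default_convert_rate(1000, 3)
print(rate_hz)            # 3000
print(1.0 / rate_hz)      # interchannel delay in seconds (~333 us)
```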

Let me know if you have any further questions or if this does not resolve your issue.

Thanks again and have a great day!

Chad E
Applications Engineer - National Instruments
Message 2 of 15
Hi Chad,
Thanks for your reply; it is very helpful. I have some other questions.
(1)
If I just set the scan clock to use the external timebase (as in the method in this article: http://digital.ni.com/public.nsf/3efedde4322fef19862567740067f3cc/364653410e69dcc286256c3a005ecd83?OpenDocument),
then the scan clock will be based on the external timebase, the channel clock will be automatically selected by NI-DAQ, and the channel clock will still be based on the DAQ board's internal timebase. Right?
(2)
I set the scan clock based on the external timebase because I want a more stable timebase. If I do just that (scan clock on the external timebase, while the channel clock stays on the internal timebase), is it meaningful?

If your answer is no, I will choose to change the internal Master Timebase on the DAQ board by the RTSI method.

(3) Can I set both the scan clock and the channel clock based on the external timebase using AI Clock Config.vi?
I mean, in my program, between AI Config.vi and AI Start.vi, I use one AI Clock Config.vi to set the scan clock, then another AI Clock Config.vi to set the channel clock. That way, I don't need to use the RTSI method?
Message 3 of 15
Hi May Lee,

I have some answers for you.

(1) I've never used SISOURCE before, but if it was found on our website, it should work. Assuming SISOURCE accepts an external clock source, that is correct: the scan clock will be set to the external scan rate, and Traditional NI-DAQ will select the fastest channel clock possible plus a minimal 10 microsecond interchannel delay (to allow settling time for the amplifier and any unaccounted factors). If the scan rate is too fast for Traditional NI-DAQ to apply the 10 microsecond delay and still sample every channel before the next scan clock, then the delay will not be added. Also, if you specify the channel clock rate yourself, the delay is not added.

(2) I'm confused about what you mean by setting the channel clock using the internal timebase. If you use the internal timebase of 20 MHz as your channel clock rate, this sets the interchannel delay at 50 ns. To acquire accurate data, the signal must settle within an accuracy range before the A/D conversion takes place; this is called the settling time. Consult the settling time specifications of your device to see if it can acquire that quickly. Most boards have settling times in the microsecond range.
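To double-check the arithmetic above (illustrative only):

```python
# Quick arithmetic check of the claim above: using the 20 MHz internal
# timebase directly as the channel clock gives a 50 ns interchannel delay,
# far below the microsecond-scale settling times of most boards.

internal_timebase_hz = 20e6
interchannel_delay_s = 1.0 / internal_timebase_hz
print(interchannel_delay_s)  # 5e-08 s, i.e. 50 ns
```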

If you're wondering about settling time issues with your board, consider the following. First, determine the maximum sampling rate and gains your application requires; you must select a DAQ board that settles to within your specifications. If you have a DAQ board already, which I think you do, run the DC or AC settling-time test to determine the settling time for your board. After you know the constraints of your DAQ board, consider the output impedance of your signals and arrange your signals if possible to minimize the swing in voltages between channels. Finally, select cables that are as short as possible with low resistance and capacitance. Using these guidelines, you can resolve all of your settling time concerns and be confident that your acquired data is accurate.

(3) You can set the scan clock and channel clock based on external timebases using the AI Clock Config VI. You can input the timing signals on any PFI pin, or select the I/O connector option, which expects each signal on the PFI pin dedicated to that clock's output.

I've included an example from a PowerPoint presentation below. Let me know if this doesn't resolve the issue,

Chad AE
Applications Engineer - National Instruments
Message 4 of 15
Hi Chad, thanks for the PowerPoint, it is very clear and helpful. Could you check whether my VI program is right? I use LabVIEW 6.1 and a PCI-MIO-16XE board. Following the article (Can I replace my device's timebase with an external source?), my program is:
(1) AI Config.vi: set the channel, device, etc.
(2) Route RTSI clock to board clock.
(3) Route PFI8/GPCTR0_SOURCE to RTSI6.
(4) Use a jumper to connect RTSI6 and RTSI clock.
(5) AI Start, AI Read, AI Clear.
(6) My external 10 MHz source is connected to PFI8 and DGND.

But when I run the program, it gets stuck at AI Read: no data is displayed, and no error is displayed either. When I use NI MAX to test the board, it fails. If I remove the jumper between RTSI6 and RTSI clock, the board passes the MAX test. But I need to connect RTSI6 to RTSI clock; otherwise I would have to physically connect my external source to the RTSI clock, which is difficult. Thanks for your help!
Message 5 of 15
Hi May Lee,

I'm not sure whether you want to change the internal timebase or not. If you want to divide down your external signal, then yes, you want to change the internal timebase to your external timebase. If you just want to use your external signal as the scan clock and have Traditional DAQ calculate the channel clock, there is no reason to change the internal timebase.

I'm assuming you want to divide down the signal. To change the internal timebase to your external signal, use the AI Clock Config VI. Configure the timebase using the "alternate clock rate specifications" input. Change "timebase source" to "source of counter n" and "timebase signal" to 0 to use the source of counter 0 to input the external timebase. Then enter however much you want to divide your signal down by in "timebase divisor". You want the result of this to be your scan clock, so set "which clock" to Scan Clock 1. Place this VI after the AI Config VI, and replace the AI Start VI with the AI Control VI with a start wired to "control code". This way you won't have to route signals.

Let me know if this helps you,

Chad Erickson
Applications Engineer

p.s. If you don't have to divide down the external signal, use the example provided earlier for External Scan Clock.
Message 6 of 15
Hi Chad,
Here I explain my task to make clear what I want:
Collect data on 3 AI channels of a PCI-MIO-16XE-50 board; the scan rate is 1000 scans/s (that means 1000 data points on every AI channel every second, right? And the scan rate is obtained by dividing down a clock, right?). My supervisor said we need a much more stable timebase or clock for this task. We have a very, very stable 10 MHz signal. I found there are probably two ways to do this work.
(1) Using Route Signal.vi to totally replace the 20 MHz master timebase of the DAQ board with my 10 MHz external signal source; then all the scan clocks and channel clocks will be stable enough, and I don't need to worry about AI Clock Config.vi. So I followed the article (Can I replace my device's timebase with an external source?), but got the error I described before.
(2) Using AI Clock Config.vi with the alternate clock specification. I am not clear how to use this VI. First, can I use PFIn as the "timebase signal", so that my 10 MHz signal would be connected to one of the PFIn pins? If, as you said, I use "source of counter 0" as the "timebase signal", to which pin on the board should I connect my 10 MHz signal? Second, can I just use the 10 MHz external signal as the scan clock, with no need to divide it down with the timebase divisor? I think when I define the scan rate (1000 scans/s), it will automatically be divided down from the scan clock, right? Third, if I just use my external 10 MHz signal as the scan clock, Traditional DAQ will calculate the channel clock for me. I use an external signal as the scan clock because I want it to be very, very stable; in this case, is the channel clock as stable as the scan clock? Or does it not matter, and I don't need to worry about it?

I hope it is a bit clearer now what I want to do. Which method do you think I should use for this work? Thanks a lot!
Message 7 of 15
May Lee,

Route Signal will not work to replace the 20 MHz master timebase of the PCI-MIO-16XE-50 with an external timebase. The only signal that can be routed to the Master Timebase of the 16XE-50 is RTSI 7, and the only signal that can be routed to RTSI 7 is the 20 MHz timebase. You may ask why that is: it allows several devices to be referenced to a common reference clock. For example, one device could be the master, and all the other boards in a system could use that board's timebase. This does not apply to your system, as there is no way to route an external signal to the master timebase. So, (1) is not an option.

In option (2), think of it as specifying an external scan clock. You are using your signal, but dividing it down (by 10,000) to get a 1 kHz signal from the 10 MHz signal. To answer your first question, the source of counter 0 is an input on PFI 8, pin 37 on a 68-pin connector. Therefore, instead of using "source" on the block diagram, just use PFI 8.

To answer your second, you will need to divide it down, because you are actually inputting the scan rate.

Third, you can't use the 10MHz signal as the scan clock, because the frequency is too fast for your DAQ board. You can use it as the timebase for the scan clock by allowing the board to divide it down. The channel clock is going to be as fast as possible, with a 10 microsecond delay. In Traditional DAQ, you really don't need to worry about it.

This is the method I suggest. Use the alternative clock specification.

which clock = scan clock 1
clock frequency = -1
alternate clock rate specification
clock period = -1
timebase source = PFI pin, low to high
timebase signal = 8
timebase divisor = 10000 (if your signal is 10MHz, and you want a scan clock of 1kHz)
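The divisor in those settings is just the ratio of the external timebase to the desired scan rate. A quick check (illustrative arithmetic only):

```python
# Quick check of the divisor arithmetic behind the settings above.
external_timebase_hz = 10e6     # the 10 MHz external source
desired_scan_rate_hz = 1000     # the 1 kHz scan clock you want

timebase_divisor = int(external_timebase_hz / desired_scan_rate_hz)
print(timebase_divisor)                          # 10000
print(external_timebase_hz / timebase_divisor)   # 1000.0
```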

Remember to use AI Control instead of AI Start, and as an input to AI Read, make the time limit large enough so that you can acquire all your scans. I've included a .doc file below that shows (in very low resolution) what I simulated.

Thanks again for contacting National Instruments. Let me know if this resolves the issue,

Chad AE
Applications Engineer - National Instruments
Message 8 of 15
Hi Chad,

I'll follow your method. Thanks for your great patience and help.

In your method, why must I use AI Control instead of AI Start?

Also, I am still confused by the article (Can I replace my device's timebase with an external source?) from the NI website. I would like to confirm one more time: can I replace the timebase of my board with an external signal source at a frequency other than 20 MHz, for example an ultra-stable 10 MHz source? The article doesn't mention this point.
Message 9 of 15
Hi May Lee,

AI Control is lower level and will therefore give better performance in your program, but I think you can use AI Start as well.

The article references how to use the external timebase in creating the scan clock. You are correct in realizing the article does not address its title. However, for your board, the only way to replace the master timebase is through RTSI7, and the only signal that can be routed to that RTSI connection is the 20MHz timebase.

I hope this helps,

Chad AE
Applications Engineer - National Instruments
Message 10 of 15