Multifunction DAQ


NI-DAQmx Base does not use differential signals?

Solved!

I recently upgraded NI-DAQmx Base to the latest and greatest version, but now I am getting horrible corruption of my signals.  I have checked with a scope and the signals are correct, but there seems to be horrendous crosstalk between channels.  The only thing I can think of is that the board is not truly doing differential switching beyond the first 8 channels.

 

I am using a 64-channel board, so there are 32 differential channels; I am only using the first 16.  My channel specification is "Dev1/ai0:7,Dev1/ai16:23".  The hardware works, since the default "data logger" application shows correct data, but I am guessing the data logger was built with an earlier version of NI-DAQmx Base.  The problem seems to occur when using more than the first 8 channels.  This VI used to work with the previous version of NI-DAQmx Base.

 

Is this a known problem?

 

Here is the code for initializing the NI-6033 board

 

[Screenshot: DAQ Init Code]
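Since the screenshot does not carry over into text, here is a rough equivalent of that initialization in the NI-DAQmx Base C API. This is a minimal sketch under assumptions (made-up task name, ±10 V range, onboard clock, continuous sampling, and my own error-check macro), not the actual VI:

#include <stdio.h>
#include "NIDAQmxBase.h"

/* Abort-on-error helper; buffer size and reporting style are assumptions. */
#define CHECK(call)                                    \
    do {                                               \
        int32 err_ = (call);                           \
        if (DAQmxFailed(err_)) {                       \
            char msg[2048];                            \
            DAQmxBaseGetExtendedErrorInfo(msg, 2048);  \
            fprintf(stderr, "DAQ error: %s\n", msg);   \
            return err_;                               \
        }                                              \
    } while (0)

int32 configureTask(TaskHandle *task)
{
    CHECK(DAQmxBaseCreateTask("psMonitor", task));

    /* Differential terminal configuration.  On 64-channel E Series boards,
     * ai0:7 pair with ai8:15 and ai16:23 with ai24:31 in DIFF mode, which
     * would explain why the channel string skips ai8:15. */
    CHECK(DAQmxBaseCreateAIVoltageChan(*task,
            "Dev1/ai0:7,Dev1/ai16:23", "",
            DAQmx_Val_Diff, -10.0, 10.0, DAQmx_Val_Volts, NULL));

    /* 2100 S/s per channel, continuous; buffer sized for ~1 s of data. */
    CHECK(DAQmxBaseCfgSampClkTiming(*task, "OnboardClock", 2100.0,
            DAQmx_Val_Rising, DAQmx_Val_ContSamps, 2100));

    CHECK(DAQmxBaseStartTask(*task));
    return 0;
}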
Mac OS X 10.4.11 PPC/ LV 8.5.1 / NIDAQmx Base 3.2.0 / VISA 4.4.0



Message 1 of 8

Hi Scott-

 

To answer your question, no, this is not a problem that I know of.  I looked through the configuration code for the high-channel E Series boards and everything looks to be in order.  What sampling rate do you specify in your test VI?  What do the signals look like from channel to channel, relative to each other?  In other words, are there lots of steps up or down on the signal levels as you scan through the channels?  Do you know the approximate output impedance of your source signals?

 

Thanks-

Tom W
National Instruments
Message 2 of 8

Hi Tom,

I knew you would see this immediately!  It is odd and I haven't had time to trace it down, but I thought I also had this problem with a previous application.  In this application I am reading some DC voltages (0-10 V) and then some incoming AC power phases (two phases of 12.5 kV input power), stepped down through a transformer and then an isolation amplifier.  Then I am reading 8 output voltages on the order of 500 V, also stepped down through an isolation amplifier.  So the first 4 channels are slowly varying voltages up to 7 V, the next 4 are 60 Hz AC voltages from -8 to 8 V with varying phase, and the remaining 8 are slowly changing voltages in the 0-10 V range.  Of course I want to catch fast events when they happen, so I am recording on the millisecond scale.  That is an aggregate of about 33.6 kHz (16 channels × 2100 Hz), which is way below the board's rated 100 kHz and still lower than the 80 kHz recommended for 1 LSB settling time.

 

All the signals should be low impedance, since the isolation amplifier outputs feed directly into the SCB-100, and the SCB-100 has 100 kΩ resistors to ground.  Thus the signals should be low impedance and I shouldn't be seeing channel-to-channel crosstalk.  I am reading 16 channels at a 2100 Hz scan rate and continuously dumping to files, starting a new file every 15 minutes.  A background task deletes old files, so I have about a 24-hour record stored.

 

I haven't made any hardware changes, but I may have updated some of the software (i.e., GPIB, VISA, and DAQ drivers).  It was working in early December, and after I updated it over the holidays it did not work.  I had our electronics shop check the signals at the board, and they claim they are all correct and it is just my bad programming.

 

I will post some examples of the waveforms when I have a moment.

-Scott

 


Message 3 of 8

Tom,

It is a totally different problem and the subject line is misleading.  Something is very, very wrong and I misdiagnosed it.  The problem is in my loop that reads the DAQ: I am only getting 16 data points instead of 210!  In the code snippet above the scan rate is 2100 Hz.  I set it up for a 100 ms read of 210 scans, and wire that 210 in as the "Samples per Channel" input of the DAQmx Base Timing VI in the code above.

 

When I do the read, I use that same 210 (1/10 of the scan rate) to retrieve samples.  I consistently get a 16 × 16 array of data out, which is not 210 × 16, and of course the data looks funky.  Here is the code I use to retrieve the data from the DAQ system.
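(The screenshot does not reproduce here.  For reference, the equivalent read in the NI-DAQmx Base C API would look roughly like the sketch below; this is illustrative only, with an assumed timeout and fill mode, and not the actual VI.  The point is that the driver reports the number of scans actually returned separately, so it can be compared against the 210 requested:)

#include <stdio.h>
#include "NIDAQmxBase.h"

#define NCHANS 16
#define SCANS  210  /* 1/10 of the 2100 Hz scan rate = 100 ms of data */

int readBlock(TaskHandle task, float64 data[NCHANS * SCANS])
{
    int32 sampsRead = 0;
    int32 err = DAQmxBaseReadAnalogF64(task,
            SCANS,                       /* scans requested per channel   */
            5.0,                         /* timeout in seconds (assumed)  */
            DAQmx_Val_GroupByScanNumber, /* interleaved: one scan per row */
            data, NCHANS * SCANS,
            &sampsRead, NULL);
    if (DAQmxFailed(err))
        return -1;
    if (sampsRead != SCANS)   /* the 16-instead-of-210 symptom shows here */
        fprintf(stderr, "asked for %d scans, got %ld\n",
                SCANS, (long)sampsRead);
    return (int)sampsRead;
}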

 

There are also two probes: one shows that I am asking for 210 data points, and the other shows the highest index in the array that is filled.  If I make either index 16, there is no data.

 

The data is passed to a queue so that it can be displayed and written to a data file.  This worked under LabVIEW 8.2.1 and an earlier version of NI-DAQmx Base.  As I said, I was updating systems over the break so that all my machines are on a consistent set of software: OS, LabVIEW, and drivers.
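(In LabVIEW this handoff is a queue between the acquisition loop and a display/logging loop.  For readers following along in text, a generic producer/consumer of the same shape might look like this in C with POSIX threads; the depth, blocking behavior, and drop-on-full policy are all assumptions, not anything from the original VIs:)

#include <pthread.h>
#include <string.h>

#define NCHANS 16
#define SCANS  210
#define QDEPTH 8   /* assumed queue depth */

typedef struct {
    double blocks[QDEPTH][SCANS * NCHANS];  /* ~215 KB: allocate statically */
    int head, tail, count;
    pthread_mutex_t lock;
    pthread_cond_t nonempty;
} BlockQueue;

void queue_init(BlockQueue *q)
{
    memset(q, 0, sizeof *q);
    pthread_mutex_init(&q->lock, NULL);
    pthread_cond_init(&q->nonempty, NULL);
}

/* Producer: called from the DAQ read loop with each 210 x 16 block.
 * Drops the block if the consumer has fallen QDEPTH blocks behind. */
int queue_push(BlockQueue *q, const double *block)
{
    int ok = 0;
    pthread_mutex_lock(&q->lock);
    if (q->count < QDEPTH) {
        memcpy(q->blocks[q->head], block, sizeof q->blocks[0]);
        q->head = (q->head + 1) % QDEPTH;
        q->count++;
        ok = 1;
        pthread_cond_signal(&q->nonempty);
    }
    pthread_mutex_unlock(&q->lock);
    return ok;
}

/* Consumer: blocks until a full block is available; the caller then
 * displays it and appends it to the current data file. */
void queue_pop(BlockQueue *q, double *block)
{
    pthread_mutex_lock(&q->lock);
    while (q->count == 0)
        pthread_cond_wait(&q->nonempty, &q->lock);
    memcpy(block, q->blocks[q->tail], sizeof q->blocks[0]);
    q->tail = (q->tail + 1) % QDEPTH;
    q->count--;
    pthread_mutex_unlock(&q->lock);
}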

 


Message 4 of 8

Stupid Lithium timed out while I was trying to edit the message...  sheesh!

 

Anyway, the data seems to be collected at about 10 Hz, but only as a 16 × 16 array, which works out to roughly 160 samples per second per channel instead of the 2100 requested.

 

[Screenshot: PS Monitor Read code]

 

Here is the code and the two probes showing the data.  If I make either index in the data probe 16, it shows an empty cell in the array.  I think I am going to try reinstalling DAQmx Base.


Message 5 of 8
Solution accepted by topic author sth