Data Acquisition Idea Exchange


There are a few different versions of the NI USB-9211 Thermocouple Module.  The USB-9211 is now a legacy device and the USB-9211A is the upgraded version for higher performance.  The issue I have seen when using the USB-9211A on a Mac is that the DAQmx Base driver that supports the USB-9211 is different from the driver that supports the 9211A, but the device itself does not specify which version of the 9211 the customer has. DAQmx Base 2.1 supports the 9211 and DAQmx Base 3.2 supports the 9211A, but the device will not be recognized if the incorrect driver is downloaded.

 

My suggestion would be to print the "A" after the 9211 on the new devices to make sure that the customer knows which driver to download for their device.

We need a DAQ card that can handle more than 10 V.  I know there's an SC module that does 300 V, but attenuation by 30 in signal conditioning seems a bit much.  For comparison, Agilent has a 100 V input PXI card without the signal conditioning (32 channels, 16-bit, 250 kS/s).

Hello,

 

For those of us who develop using DAQmx all the time, this might seem silly.  Nonetheless, I'm finding that users of my software are repeatedly having a tough time figuring out how to select multiple physical channels for applications that use DAQmx.  Here's what I'm talking about:

[Screenshot: DAQmxChannels.png]

 

Typically a user of my universal logger application wishes to acquire from ai0:7, for example.  They attempt to hold down Shift and select multiple channels, and then conclude that only one channel at a time may be acquired.  For some odd reason, nearly everyone fears the "Browse" option because they don't know what it does.

 

 

While, as a developer, I have no problem whatsoever knowing to "Browse" in order to accomplish this, I was just asked how to do this for literally the fifth time by a user.  Thus, I'm faced with three choices: Keep answering the same question repeatedly, develop my own channel selection interface, or ask if the stock NI interface may be improved.

 

I'm not sure of the best way to improve the interface, but the least painful way to do so might be to simply display the "Browse" dialog on the first click rather than displaying the drop-down menu.

 

Please, everyone, by all means feel free to offer better ideas.  What I do know for certain, though, is that average users around here continually have a tough time with this.

 

Thanks very much,

 

Jim

 

I rarely have to set up hardware for a new analog measurement and always have to puzzle over the difference between RSE and NRSE modes. I think of the inverting input as the reference, so "Non-Referenced Single-Ended" doesn't make sense to me. And, if I run the AISense line to my remote sensor, isn't that a Referenced Single-Ended measurement?

 

Yesterday, I noticed that at least some on-line documentation now refers to GRSE (Ground Referenced Single-Ended); adding that single letter helps a lot. What about adding another single letter and referring to the other mode as RRSE (Remote Referenced Single-Ended)? One letter could save a lot of people a lot of time.

We need a way to query an output task to determine its most recently output value.  Or alternately, a general ability to read back data from an output task's buffer.

 

This one's been discussed lots of times over the years in the forums but I didn't see a related Idea Exchange entry.  Most of the discussion I've seen has related to AO but I see no reason not to support this feature for DO as well.

 

There are many apps where normal behavior is to generate an AO waveform for a long period of time.  Some apps can be interrupted unexpectedly by users or process limit monitoring or safety range checking, etc.  When this happens, the output task will be in a more-or-less random phase of its waveform.  The problem is: how do we *gently* guide that waveform back to a safe default value like 0.0 V?  A pure step function is often not desirable.  We'd like to know where the waveform left off so we can generate a rampdown to 0.  In some apps, the waveform shape isn't directly defined or known by the data acq code.  So how can we ramp down to 0 if we don't know where to start from?  This is just one example of the many cases where it'd be very valuable to be able to determine the most recently updated output value.

 

Approach 1:

  Create a DAQmx property that will report back the current output value(s).  I don't know if/how this fits the architecture of the driver and various hw boards.  If it can be done, I'd ideally want to take an instantaneous snapshot of whatever value(s) is currently held in the DAC.  It would be good to be able to polymorph this function to respond to either an active task or a channel list.

 

Approach 2 (active buffered tasks only):

   We can currently query the property TotalSampPerChanGenerated as long as the task is still active.  But we can't query the task to read back the values stored in the buffer in order to figure out where that last sample put us.  It could be handy to be able to query/read the *output* buffer in a way analogous to what we can specify for input buffers.  I could picture asking DAQmx Read for 1 sample from the output buffer after setting RelativeTo = MostRecentSample, Offset = 0 or -1 (haven't thought through which is the more appropriate choice).  In general, why *not* offer the ability to read back data from our task's output buffers?
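
For illustration, here is a rough sketch (using the DAQmx C API) of how this currently has to be approximated for a continuously regenerating buffered AO task, by keeping the application's own copy of the waveform and indexing it with TotalSampPerChanGenerated.  The task handle, waveform array, and length are placeholders, and this only works when the buffer contents are already known to the application, which is exactly the gap this idea would close:

```c
/* Sketch only: estimate the most recently generated AO value for a
 * continuously regenerating buffered task by combining the driver's
 * TotalSampPerChanGenerated property with the application's own copy of
 * the waveform. "aoTask", "waveform", and "waveformLen" are placeholders. */
#include <NIDAQmx.h>

float64 EstimateLastOutputValue(TaskHandle aoTask,
                                const float64 waveform[], uInt64 waveformLen)
{
    uInt64 totalGenerated = 0;
    /* Total samples per channel generated since the task started. */
    DAQmxGetWriteTotalSampPerChanGenerated(aoTask, &totalGenerated);
    if (totalGenerated == 0)
        return 0.0;   /* nothing generated yet */
    /* Index of the most recently generated sample in the app's waveform copy. */
    return waveform[(totalGenerated - 1) % waveformLen];
}
```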

 

-Kevin P

It would be great if the full DAQmx library supported all NI data acquisition products on Windows, Mac OS X and Linux. The situation right now is too much of a hodge-podge of diverse drivers with too many limitations. There's an old, full DAQmx library that supports older devices on older Linux systems, but it doesn't look like it's been updated for years.  DAQmx Base is available for more current Linux and Mac OS systems, but doesn't support all NI devices (especially newer products).  DAQmx Base is also quite limited, and can't do a number of things the full DAQmx library can.  It's also fairly bloated and slow compared to DAQmx.  While I got my own application working under both Linux and Windows, there are a number of things about the Linux version that just aren't as nice as the Windows version right now.  I've seen complaints in the forums from others who have abandoned their efforts to port their applications from Windows to Mac OS or Linux because they don't see DAQmx Base as solid or "commercial-grade" enough.

 

I'd really like to be able to develop my application and be able to easily port it to any current Windows, Mac or Linux system, and have it support any current NI multi-function DAQ device, with a fast, capable and consistent C/C++ API.
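
To make the wish concrete, here is a minimal sketch of the kind of straightforward, portable DAQmx C code this would allow; the device name "Dev1" is an assumption and error handling is omitted:

```c
/* Sketch only: a minimal finite acquisition with the DAQmx C API, the kind
 * of code one would like to compile unchanged on Windows, Mac OS X and
 * Linux. "Dev1" is an assumed device name; error handling is omitted. */
#include <NIDAQmx.h>
#include <stdio.h>

int main(void)
{
    TaskHandle task = 0;
    float64    data[1000];
    int32      read = 0;

    DAQmxCreateTask("", &task);
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(task, "", 1000.0, DAQmx_Val_Rising,
                          DAQmx_Val_FiniteSamps, 1000);
    DAQmxStartTask(task);
    DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                       data, 1000, &read, NULL);
    printf("Read %d samples, first value %f V\n", (int)read, data[0]);
    DAQmxClearTask(task);
    return 0;
}
```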

 

Anyone else see this as a priority for NI R&D?

Has NI noticed the R&D demand for measuring MEMS sensors with atto-scale capacitance or inductance changes? In past years, we could only achieve these measurements with the MS3110 or the Agilent E4980A, but they are not easy to integrate with other equipment. The operating or measurement frequency of these sensors (for example MEMS microphones, accelerometers, gyroscopes, and tactile sensors) is usually DC to 100 kHz, with aF or aH capacitance or inductance changes.

 

MS3110

http://www.irvine-sensors.com/Chips&Modules.html 

 

On the other hand, which relay type of switch/multiplexer/matrix card is suitable for use with the MS3110 or Agilent E4980A at such low capacitance or inductance?

Consider the case where you have a cDAQ II chassis filled with analog input modules. If I have several modules that are running at the same rate I can use channel expansion to include them in the same task. Since they're in the same task they will have the same timing and triggering and will share the same timing engine. This leaves the other two timing engines available for tasks with different timing constraints. If I want to dynamically change the sampling rate of just one of the modules I would have to stop the task and, hence, the acquisition on the other modules. If the same timing engine could be shared among tasks, each module could have its own task and be independently controlled. I imagine this would involve a lot of changes to the DAQmx task structure but it's something that would come in handy.
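
For reference, here is a rough sketch (DAQmx C API) of channel expansion as it works today, with both modules sharing one task and therefore one timing engine and one rate; the chassis and module names are assumptions:

```c
/* Sketch only: channel expansion as it works today. Both modules are added
 * to one task, so they share one AI timing engine and one sample rate.
 * Chassis/module names ("cDAQ1Mod1", "cDAQ1Mod2") are assumptions and
 * error handling is omitted. */
#include <NIDAQmx.h>

TaskHandle CreateExpandedAITask(void)
{
    TaskHandle task = 0;

    DAQmxCreateTask("", &task);
    DAQmxCreateAIVoltageChan(task, "cDAQ1Mod1/ai0:3, cDAQ1Mod2/ai0:3", "",
                             DAQmx_Val_Cfg_Default, -10.0, 10.0,
                             DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(task, "", 10000.0, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, 10000);
    /* Changing the rate for just one module means stopping this task, which
     * also stops acquisition on the other module. Sharing a timing engine
     * across tasks, as proposed above, would remove that restriction.      */
    return task;
}
```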

Some NI boards have different properties for different channels.  For example, the NI 4432 has IEPE (ICP) power available for channels 0 through 3 but does not have it for channel 4.  Please add IEPE (ICP) power support and AC/DC coupling support information to the DAQmx Physical Channel Node.

Based on this LabVIEW Idea...

 

Support the iPad as an execution target

 

Adding this type of support would allow NI and others to develop applications similar to the Oscium oscope using NI DAQ hardware.

 

I would like to be able to collect a couple of channels of analog input on the iPad.  The Oscium device is nice, but I need a minimum of 2 analog inputs and I would rather have NI hardware:  http://www.oscium.com/

 

Response from corporate:

"We don't currently have anything that would meet the customer's requirement of being able to plug in directly into the iPad for data acquisition.

I don't believe that the iPad supports Silverlight which is a framework developed by Microsoft.  Also, wireless DAQ has to communicate with a host running DAQmx, so the customer would still need a 2nd computer even if using wireless DAQ.

If you want to connect data acquisition hardware (of any form-factor) to a machine running LabVIEW and DAQmx,  then use LabVIEW Web Services to publish the front panel to the web and view/control it from his iPad.

We do have several USB products that will work with Windows-based netbooks that could be an alternative solution if topic is open to a non-Apple platform.  For example, the 5132/5133 are bus-powered digitizers with much higher sample rate, bandwidth, and buffer size compared to the Oscium device.  However, the price is also quite a bit higher."

I recently had a customer suggest expanding support for the USB-8451 to include .NET development environments. Currently the only supported development environments are:

 

LabVIEW™ 8.5, 8.6, LabVIEW 2009 (32-bit), and LabVIEW 2010 (32-bit)
LabWindows™/CVI™ 8.0 (or newer)
Microsoft Visual C/C++ 6.0

 

I just wanted to share this with the community.

 

I know it is possible to listen for multiple DI lines when performing change detection (with a PXI-6511, for example); however, there does not appear to be a built-in way to know which channel caused the event without building your own comparison logic around the DIO reads.

 

I would like to see an additional property, either under 'Timing' or under 'Channel', that would let me query which specific channel/line in the task caused the change-detection event, rather than having to search for it manually in a 1D Boolean array.
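
For context, here is a rough sketch (DAQmx C API) of the manual comparison logic this property would make unnecessary: on each change-detection event the lines are read and diffed against the previous state to find which one(s) changed. Device/line names, the 8-line width, and the setup shown in the trailing comment are assumptions:

```c
/* Sketch only: find which line(s) changed by diffing the current line
 * states against the previous ones inside the change-detection callback.
 * Device/line names and the line count are assumptions; error handling
 * is omitted. */
#include <NIDAQmx.h>
#include <string.h>

#define NUM_LINES 8
static uInt8 prevState[NUM_LINES];

int32 CVICALLBACK OnChangeDetection(TaskHandle task, int32 signalID,
                                    void *callbackData)
{
    uInt8 state[NUM_LINES];
    int32 sampsRead = 0, bytesPerSamp = 0;
    int   i;

    DAQmxReadDigitalLines(task, 1, 1.0, DAQmx_Val_GroupByChannel,
                          state, NUM_LINES, &sampsRead, &bytesPerSamp, NULL);
    for (i = 0; i < NUM_LINES; i++) {
        if (state[i] != prevState[i]) {
            /* line i caused (or contributed to) this event */
        }
    }
    memcpy(prevState, state, sizeof prevState);
    return 0;
}

/* Setup (assumed names):
 *   DAQmxCreateDIChan(task, "PXI1Slot2/port0/line0:7", "", DAQmx_Val_ChanForAllLines);
 *   DAQmxCfgChangeDetectionTiming(task, "PXI1Slot2/port0/line0:7",
 *                                 "PXI1Slot2/port0/line0:7", DAQmx_Val_ContSamps, 1);
 *   DAQmxRegisterSignalEvent(task, DAQmx_Val_ChangeDetectionEvent, 0,
 *                            OnChangeDetection, NULL);
 */
```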


When using TEDS load cells it would be useful to have a built-in tare function.  The null offset function only offsets the electrical value by the initially measured amount.  This essentially shifts the calibration curve horizontally only.  The tare function could also shift the calibration curve vertically, in the load direction.  Since two-point calibrations don't always create a line that goes through zero, a tare function is needed to get to zero.  Please see the attached VIs.  Also, check out my thread on this subject.

 

http://forums.ni.com/t5/Signal-Conditioning/9237-Null-Offset-with-TEDS/td-p/1499954
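
As a minimal illustration of the vertical shift described above, a software tare can be approximated like this, assuming the application already has the scaled (engineering-unit) reading from the TEDS-calibrated channel:

```c
/* Sketch only: a software tare that shifts the calibration curve vertically
 * (in the load direction), as opposed to the horizontal electrical shift the
 * null offset performs. Assumes the scaled reading is already available. */
static double tareOffset = 0.0;

/* Call once with the current scaled reading to define the zero point. */
void SetTare(double currentReading)
{
    tareOffset = currentReading;
}

/* Apply to every subsequent scaled reading. */
double ApplyTare(double reading)
{
    return reading - tareOffset;
}
```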

 

The title pretty much says it all. I would like the ability to either configure a full hardware complement as simulated devices and then switch them over to real devices when the hardware arrives, or go from real devices to simulated devices, without the need to add new, discrete simulated devices to MAX.

 

This would make for much easier offline development and ultimate deployment to real hardware.

Screenshot from the help for number of samples per channel for a Read task:

 


[Screenshot: Extract of help.jpg]

Please add an option for a continuous sample task to read all currently available samples (-1) but with a wait for a minimum number of samples (#min). Behavior of this configuration:

  • #available samples >= #min: read all available samples
  • #available samples < #min: wait until #min samples are available or until timeout

 

This would be very useful in several cases. For example (a sketch of today's workaround follows this list):

  • No extra coding to handle or avoid the error that appears if 0 samples are available.
  • Avoid the use of a wait function in the loop to ensure a minimum number of samples are available.
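
As a sketch of how this has to be emulated today (DAQmx C API), the application polls the available-samples property until the minimum is reached and then reads everything with -1 (DAQmx_Val_Auto); the proposed option would fold this into a single read call. Names and the polling approach are placeholders:

```c
/* Sketch only: emulate "read all available, but wait for at least minSamples"
 * by polling the available-samples property before a -1 (DAQmx_Val_Auto)
 * read. Task handle, buffer, and the busy-wait loop (which should include a
 * short sleep and a timeout check) are placeholders; error handling omitted. */
#include <NIDAQmx.h>

int32 ReadAtLeast(TaskHandle aiTask, uInt32 minSamples,
                  float64 data[], uInt32 dataSize)
{
    uInt32 avail = 0;
    int32  read  = 0;

    do {
        DAQmxGetReadAvailSampPerChan(aiTask, &avail);
        /* (add a short sleep here to avoid spinning, and a timeout check) */
    } while (avail < minSamples);

    /* DAQmx_Val_Auto (-1) reads all samples currently available. */
    DAQmxReadAnalogF64(aiTask, DAQmx_Val_Auto, 10.0, DAQmx_Val_GroupByChannel,
                       data, dataSize, &read, NULL);
    return read;
}
```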

Increase the max clock rate of the USB-8451 from 250 kHz to at least 400 kHz.

I am not an electrical engineer, so I have no idea if there is some reason this has not been implemented in existing versions of the cDAQ chassis.  But there are a whole host of applications where a user wants to do hardware-timed digital output to different channels using DIFFERENT time bases.  It would be nice to have more than one DO timing engine available.  I would love to see that in future versions of the cDAQ chassis.

 

Thanks

Matt

For each device, MAX will use a unique device number.

This is no problem with fixed measurement equipment.

With USB devices this may become a problem.

At a school, a student will work with different combinations of computer and device.

 

If the student wants to use his program with a different device, he will get an error.

This happens even if the device is the same type but has a different serial number.

[Screenshot: New Picture (1).png]

To solve this, the student needs to open all DAQ routines and alter the device number.

Or he needs to convert the DAQ Assistant routine into a VI and replace the constant device number with a routine, as shown in DAQmx device to use.vi.

[Screenshot: DAQmx device to use.png]

This same problem occurs when using NI-IMAQdx devices.

 

Solution:

Make it possible to select a device by type instead of a device by number.

[Screenshot: New Picture.png]
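
As a sketch of how this can be approximated today with the DAQmx C API, the code below walks the system device list and picks the first device whose product type matches; the product-type string and buffer sizes are assumptions:

```c
/* Sketch only: pick a device by product type instead of by its MAX device
 * number. Walks the system device list and returns the first device whose
 * product type matches. The type string passed in (e.g. "USB-6009") and the
 * buffer sizes are assumptions; error handling is omitted. */
#include <NIDAQmx.h>
#include <string.h>

/* Copies the matching device name (e.g. "Dev3") into 'found'.
 * Returns 1 on success, 0 if no device of that type is present. */
int FindDeviceByType(const char *wantedType, char *found, unsigned foundSize)
{
    char  names[1024] = "";
    char  type[256];
    char *dev;

    DAQmxGetSysDevNames(names, sizeof names);          /* "Dev1, Dev2, ..." */
    for (dev = strtok(names, ", "); dev != NULL; dev = strtok(NULL, ", ")) {
        DAQmxGetDevProductType(dev, type, sizeof type);
        if (strcmp(type, wantedType) == 0) {
            strncpy(found, dev, foundSize - 1);
            found[foundSize - 1] = '\0';
            return 1;
        }
    }
    return 0;
}
```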


I recently had a customer create a global virtual channel in Measurement and Automation Explorer (MAX).  They then set the maximum and minimum values for the input range of their signal. 

 

[Screenshot: GlobalVirtualChannel.jpg]

 

[Screenshot: minmax.jpg]

 

My customer wanted to access the +2 and -2 values entered above and display them in LabVIEW.  However, the property nodes for global virtual channels only access the limits of the board.  For example, the customer's board may only be able to handle voltages between +/- 10 Volts.  No matter which property node we chose, all that was returned was the +/- 10 Volt range.  Could we please give customers access to this information?