Data Acquisition Idea Exchange

About Data Acquisition Idea Exchange

Have an idea for new DAQ hardware or DAQ software features?

  1. Browse by label or search in the Data Acquisition Idea Exchange to see if your idea has previously been submitted. If your idea exists, be sure to vote for it by giving it kudos to indicate your approval!
  2. If your idea has not been submitted, click Post New Idea to submit a product idea. Be sure to submit a separate post for each idea.
  3. Watch as the community gives your idea kudos and adds their input.
  4. As NI R&D considers the idea, they will change the idea status.
  5. Give kudos to other ideas that you would like to see implemented!

I often use one DAQ device to test the basic functionality of another device, and I like to be able to do this quickly through test panels.  Unfortunately, MAX does not allow the user to open more than a single test panel at a time.  The current workaround is to launch the test panels outside of MAX (see this KB).

 

It would be nice to have the same functionality when opening test panels in MAX.  Specifically, I would like to be able to do the following with a Test Panel open:

1.  Be able to navigate through MAX to do things like check device pinouts, calibration date, etc.

 

2.  Be able to move and/or resize the original MAX window (it always seems to be blocking other applications that I want to view alongside the Test Panel).

 

3.  Open a test panel for a second (or third...) device.

 

It is good that a workaround already exists, but I think MAX should have this behavior to begin with.

  • Multifunction DAQ

As someone who migrated entire product lines from PLCs to the cFieldPoint platform, and is now in the process of migrating further to the cRIO platform, I am finding some limitations in the cRIO module selection.  One big gap I see is with analog in/out modules.  A set of 2-in / 2-out analog modules offering standardized +/-10 V or 0-20 mA ranges would be very welcome.  There are many times in our products when we need to process just a single analog signal, which with cRIO now requires two slots, with many unused inputs and outputs (which feels like a waste of money and space).

The vast majority of my working life is spent with RIO devices or midrange X Series cards, but I often come across applications where an inexpensive, reliable DAQ device would be handy for low-level tasks: monitoring presence sensors, measuring voltages at moderate precision and slow speed, providing interlocks for material storage bins, etc.

 

Traditionally, you'll see a lot of USB 600X units used for applications like these. However, running on USB has a few associated problems: the unreliability of the Windows USB bus, cable strain relief on USB connectors, mounting of the USB 600X units, and the connection type. Don't get me wrong, you can do a lot with these units, but they're not an ideal, inexpensive solution for production processes.

 

There's a jump between the functionality of these USB units and X (or even M or E, for the vintage crowd) Series cards. The only thing really left in that range is the B Series PCI-6010 card, which has the added benefit of using a 37-pin DSUB connector, but it is a little limited in terms of channel offerings and the like.

 

I'd like to see the B Series range revived to provide products that fit between the PCIe-6320 and the USB 600X devices, with a non-USB connection and preferably a DSUB connector for cost and ease of use. This would give a more reliable, cost-effective offering for simple acquisition tasks in the industrial environment.

Dear NI Idea Exchange,

 

I recently had a service request where the customer wished to use a mass flow meter, communicating over the HART protocol (a massive industrial protocol used worldwide, with over 30 million devices), to send updated values to a cRIO-9074 chassis using an NI 9208 module.

 

They did not know how they could use this protocol with our products. There was only one example online covering this protocol, and it uses low-level VISA functions. There is currently no real support for it.

 

I suggested that if they wished to use this sensor, they would need to convert it to a protocol we do support, for example Modbus or RS-232, using a gateway/converter.

 

Then they could use low-level VISA functions to handle the data communication.
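For illustration, a minimal sketch of that low-level VISA approach in C, assuming the gateway/converter appears as a serial instrument; the resource name, baud rate, and timeout are placeholders rather than details from the original request:

    /* Minimal VISA serial read sketch (assumed resource name and settings). */
    #include <stdio.h>
    #include <visa.h>

    int main(void)
    {
        ViSession rm, instr;
        ViUInt32  retCount;
        ViChar    buf[256];

        if (viOpenDefaultRM(&rm) < VI_SUCCESS) return 1;

        /* "ASRL1::INSTR" is a placeholder for the gateway's COM port. */
        if (viOpen(rm, "ASRL1::INSTR", VI_NULL, VI_NULL, &instr) < VI_SUCCESS) {
            viClose(rm);
            return 1;
        }

        viSetAttribute(instr, VI_ATTR_ASRL_BAUD, 9600);   /* assumed baud rate */
        viSetAttribute(instr, VI_ATTR_TMO_VALUE, 2000);   /* 2 s read timeout  */

        /* Read whatever the gateway forwards from the flow meter. */
        if (viRead(instr, (ViBuf)buf, sizeof(buf) - 1, &retCount) >= VI_SUCCESS) {
            buf[retCount] = '\0';
            printf("Received %u bytes\n", (unsigned)retCount);
        }

        viClose(instr);
        viClose(rm);
        return 0;
    }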

 

They were fine with doing this, but they felt that NI should support this protocol with an off-the-shelf adaptor or module. That is the main point of this idea exchange post.

 

There is presumably a reason why we do not currently provide support for this protocol as we do for PROFIBUS or Modbus.

 

I thought I would pass on this customer feedback as it seemed relevant to NI and our vision. 

 

Regards,

 

Dominic Clarke

 

Applications Engineer

National Instruments UK

I have a data acquisition NI-DAQmx/C++ program where I continuously acquire 5 channels of data at 40 kHz/channel and read them in 0.1-second chunks.  This works perfectly for over 14 hours of continuous acquisition, but at 14 hours, 54 minutes, and 47 seconds the program hangs due to an overflow in the int32 DAQmxInternalAIbuffer_Offset value sent to the DAQmxSetReadOffset() function (at 40 kHz/channel that is about 2.15 billion samples, right at the int32 limit of 2,147,483,647).  In the DAQmxSetReadRelativeTo() function, I set the offset relative to the first sample using DAQmx_Val_FirstSample.  (See "32-bit limitation of the NI-DAQmx int32 DAQmxSetReadOffset() function?")

 

It would be very helpful for the DAQmxSetReadOffset() offset value to be 64 bits rather than the current int32.  This would make the function analogous to DAQmxGetReadTotalSampPerChanAcquired(), which returns a 64-bit value.  I understand that the offset is maintained internally as a 64-bit value, so perhaps this would not be too difficult to do.
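A minimal sketch of where the int32 limit bites, using the existing DAQmx C calls; the DAQmxSetReadOffset64() call in the comment is the proposed function and does not exist today:

    /* Sketch: positioning the read pointer relative to the first sample overflows
       int32 once more than 2^31 - 1 samples have been acquired. */
    #include <NIDAQmx.h>

    void position_read_pointer(TaskHandle task, uInt64 samplesToSkip)
    {
        /* Existing API: offset is interpreted relative to the first acquired sample. */
        DAQmxSetReadRelativeTo(task, DAQmx_Val_FirstSample);

        if (samplesToSkip <= 2147483647ULL) {
            DAQmxSetReadOffset(task, (int32)samplesToSkip);  /* fine below 2^31 samples */
        } else {
            /* At 40 kS/s per channel this point is reached after roughly 14 h 55 min.
               The proposed 64-bit setter would go here:
               DAQmxSetReadOffset64(task, (int64)samplesToSkip);   (hypothetical) */
        }
    }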

 

I hope that National Instruments fixes this limitation in the API, not just for 64-bit Windows but also for 32-bit Windows, because many of us are still using 32-bit compilers and our users are running Windows XP.  Perhaps it could be implemented as a separate 64-bit DAQmxSetReadOffset64() function for 32-bit Windows.

 

Thank you,
Bill Anderson

 

Currently, when streaming analog or digital samples to a DAQ board, the output stays at the level of the last sample received when a buffer underflow occurs. This behavior can be observed on USB X Series multifunction DAQ boards; I have the USB-6363 model. The exact mode is hardware-timed, buffered, continuous, and non-regenerating. The buffer underflow error code is -200290: “The generation has stopped to prevent the regeneration of old samples. Your application was unable to write samples to the background buffer fast enough to prevent old samples from being regenerated.”

 

I would like an option to configure the DAQ hardware to immediately set the analog and digital outputs to a predefined state if a buffer underrun occurs. I would also like an option to immediately assert one of the PFI pins on buffer underrun.

 

I believe this could be accomplished by modifying the X Series firmware and exposing the configuration of this feature in the DAQmx API. If no more samples are available in the buffer, the DAQ board should immediately write the predefined digital states / analog levels to the outputs and indicate the buffer underrun state on the PFI line. Then it should report the error to the PC.

 

Doing this in firmware has certain advantages:

  1. It can be done quickly (possibly within the time of the next missing sample – at 2 MS/s that’s 0.5 µs).
  2. It handles all situations (software lockups, excessive CPU loading by other processes, loss of communication due to bus traffic, interface disconnection, ...).
  3. It does not require any additional hardware (to turn off outputs externally).
  4. Buffer underrun indication on PFI line could provide additional safety measure (it could be used for example to immediately disable external power amplifier connected to DAQ AO). 

Doing this using other methods is just too slow, does not handle all situations, or requires additional external circuitry.

 

Setting the outputs from software once the error occurs is slow (~25 ms, the time of 50,000 samples at 2 MS/s) and does not handle physical disconnection of the interface. The analog output does eventually go to 0 V on the USB-6363 when the USB cable is disconnected, but it takes about half a second.
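For reference, a rough sketch of that software-only fallback in C, assuming a single AO channel and a 0 V safe level; the device and channel names are placeholders:

    /* Software fallback: detect the underflow error and drive a known-safe level.
       This is the path described above as too slow (tens of milliseconds). */
    #include <NIDAQmx.h>

    #define UNDERFLOW_ERR (-200290)

    int write_chunk_or_shutdown(TaskHandle streamTask, const float64 *chunk, int32 nSamps)
    {
        int32 written = 0;
        int32 err = DAQmxWriteAnalogF64(streamTask, nSamps, 0, 10.0,
                                        DAQmx_Val_GroupByChannel, chunk, &written, NULL);
        if (err == UNDERFLOW_ERR) {
            /* Buffer underran: stop streaming and force a static safe output. */
            TaskHandle safeTask = 0;
            DAQmxStopTask(streamTask);
            DAQmxCreateTask("", &safeTask);
            DAQmxCreateAOVoltageChan(safeTask, "Dev1/ao0", "", -10.0, 10.0,
                                     DAQmx_Val_Volts, NULL);         /* placeholder channel */
            DAQmxWriteAnalogScalarF64(safeTask, 1, 10.0, 0.0, NULL); /* safe level: 0 V     */
            DAQmxClearTask(safeTask);
            return -1;
        }
        return err;
    }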

 

Using the watchdog timer would also be too slow. The timer can be set to quite a short time, but from software I would not be able to reset it more often than about every 10 ms. It would also require switching off the analog channels externally with additional circuitry, because the watchdog timer is not available for analog channels.

 

The only viable solution right now is to route the task sample clock to a PFI line and detect when it stops toggling (it does stop after the last sample is generated). Once that is detected, the outputs can be switched off externally. This requires a lot of external circuitry and major development time. If you need the reaction time to be within one or two samples, the pulse detector has to be customized for every sampling rate you might want to use. To make this work for analog output, it would take a RISC microcontroller and analog electronic switches. If you wanted to use an external trigger to start the waveform, the microcontroller would have to turn on the analog switch, look for the beginning of the waveform sample clock, record the initial clock interval as a reference, and finally turn off the switch if no pulse is received within the reference time.

 

I’m actually quite impressed by how well the USB-6363 handles streaming to outputs. It lets me output waveforms with a complexity that regular arbitrary generators with fixed memory and sequencing simply cannot handle. Buffer underflows are quite rare, even at the highest sampling rate. However, to make my system robust and safe, I need a fast, simple, and reliable method of quickly shutting down the outputs, which only a hardware/firmware solution can provide.

 

Thanks,

Sebastian

I continually come to your site looking for the DAQmx Base API manual and have yet to find it.  I eventually have to dig out an old CD to find my copy.

 

How 'bout posting these online so that we can help ourselves out of jams?

 

Thanks,

Jeff

Multiple people have requested a natural way for LabVIEW and SignalExpress to make a rotational speed measurement using a quadrature encoder. An Express VI under "Acquire Signals>>Counter Input>>Rotational Speed" that asks basic quadrature encoder questions and computes the rotational speed would be very useful. It would ask for information such as ticks per revolution and decoding type (x1, x2, x4) needed to compute the rotational speed. In addition, this could then be converted into a shipping example for DAQmx relatively easily. I have had multiple people ask this question and believe that, especially within SignalExpress, this would be very useful.
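For reference, a sketch of the arithmetic such an Express VI would perform, based on standard quadrature encoder math; the function and parameter names are illustrative, not an existing API:

    /* Rotational speed from a quadrature encoder count:
       revolutions = counts / (ticksPerRev * decoding), speed = revolutions / dt. */
    double rotational_speed_rpm(long long countDelta,  /* counter change over the interval */
                                double dtSeconds,      /* measurement interval             */
                                int ticksPerRev,       /* encoder ticks per revolution     */
                                int decoding)          /* 1, 2, or 4 for x1/x2/x4 decoding */
    {
        double countsPerRev = (double)ticksPerRev * (double)decoding;
        double revolutions  = (double)countDelta / countsPerRev;
        return (revolutions / dtSeconds) * 60.0;       /* rev/s -> RPM */
    }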

 

 

(Attached image: Rotation.png)

 

 

It has come up a few times from customers, and I wanted to gauge interest and solicit ideas on how this should work.

 

Currently, with the built-in TDMS logging support, if you want to change to a new file in the middle of logging, you need to stop the task and start again.  For some use cases, this isn't practical (for example, http://forums.ni.com/t5/LabVIEW/Why-the-TDMS-file-is-larger-than-it-should-be/m-p/1176139#M511099).
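For context, a minimal sketch of how the built-in TDMS logging is configured today through the DAQmx C API; the device, rate, and file path are placeholders:

    /* Current behavior: the TDMS file is fixed at configuration time; switching
       files mid-acquisition requires stopping and restarting the task. */
    #include <NIDAQmx.h>

    int start_logged_task(TaskHandle *task)
    {
        DAQmxCreateTask("", task);
        DAQmxCreateAIVoltageChan(*task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL);
        DAQmxCfgSampClkTiming(*task, "", 10000.0, DAQmx_Val_Rising,
                              DAQmx_Val_ContSamps, 1000);

        /* Built-in TDMS logging: the file path is set once, before the task starts. */
        DAQmxConfigureLogging(*task, "C:\\data\\run001.tdms",
                              DAQmx_Val_LogAndRead, "Group", DAQmx_Val_OpenOrCreate);

        return DAQmxStartTask(*task);
    }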

 

The question is: How would you like to specify the "new file" behavior and what are your use cases?

 

For instance, a couple ideas to get the ball rolling:

  1. Add an interval attribute like "Change file after n samples".   We would then auto-increment the file name and change to that file when we have logged "n" samples.
  2. Make the file path attribute changeable at runtime.  We have a file path attribute for logging.  The idea here would be to support changing the file path "on the fly" without stopping and starting the task.  The problem is that this would not suit a use case where you want a specific file size very well.  It also wouldn't be as easy to use as #1, though it would be more flexible.
  3. (Any additional ideas/use cases?)

Thank you for your input!

 

Andy McRorie

NI R&D

It has come up a few times from customers, and I wanted to gauge interest and solicit ideas on how this should work.

 

Currently, with the built-in TDMS logging support, if you want to change to a new file in the middle of logging, you need to stop the task and start again.  For some use cases, this isn't practical (for example, http://forums.ni.com/t5/LabVIEW/Why-the-TDMS-file-is-larger-than-it-should-be/m-p/1176139#M511099).

 

The question is: How would you like to specify the "new file" behavior and what are your use cases?

 

What I'm currently thinking (because it seems the most flexible across different criteria and situations) is to simply allow you to set the file path property while the task is running (via the DAQmx Read property node).  The only downside I can think of with this approach is that you wouldn't know exactly when we change to the new file.  We could guarantee the change within (for example) one second, but you wouldn't be able to specify the exact file size.
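A sketch of what that might look like from the C API side; changing the logging file path while the task is running is exactly the behavior being proposed here, not something currently supported, and the file naming scheme is just an example:

    /* Proposed usage: switch the TDMS file without stopping the task.
       DAQmxSetLoggingFilePath() is the existing setter for the logging file path;
       honoring it on a running task is what this idea requests. */
    #include <NIDAQmx.h>
    #include <stdio.h>

    int rotate_log_file(TaskHandle task, int fileIndex)
    {
        char path[260];
        snprintf(path, sizeof(path), "C:\\data\\run%03d.tdms", fileIndex);
        return DAQmxSetLoggingFilePath(task, path);   /* would take effect on the fly */
    }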

 

Would this be a good solution for you?  Can you think of a better way to specify this behavior?

 

I could use a USB X Series multifunction device with more than 4 analog output (AO) channels.  My current application does not require them to have waveform generation capability.  There are devices available with more than 4 AO channels, but I need AI, DI/DO, and counters as well, and I prefer them all on one device.

 

  • Multifunction DAQ

 

Many users (such as our customers) expect devices in the mid-price segment, like the USB-62xx family, to provide input signal handling appropriate to the selected sampling rate.

 

Proposal:

Include anti-aliasing filters in mid-class hardware, such as the USB-62xx family. As an immediate action, please include at least a warning remark in the user manuals of these devices.

Every application with a variable sampling rate needs appropriate and adaptable input signal filtering.

Many NI DAQ devices (e.g. the USB-62xx family) do not contain anti-aliasing filters corresponding to their sampling frequencies.

Signal components above the Nyquist frequency will be folded down to lower frequencies, producing incorrect spectral information.

Many applications need different sampling frequency settings but use the same external hardware. In these cases, external filtering has to be designed for the highest possible sampling frequency. This leads to unrecoverable errors in the frequency spectrum whenever input signal components do not meet the Nyquist criterion.
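As a quick illustration of the folding effect, a small sketch that computes the apparent (aliased) frequency of an unfiltered input tone; the example numbers are illustrative:

    /* Alias frequency of a tone at fIn when sampled at fs without filtering:
       the tone appears at its distance to the nearest integer multiple of fs. */
    #include <math.h>
    #include <stdio.h>

    double alias_frequency(double fIn, double fs)
    {
        return fabs(fIn - fs * round(fIn / fs));   /* always lands in [0, fs/2] */
    }

    int main(void)
    {
        /* Example: a 60 kHz component sampled at 100 kS/s shows up at 40 kHz. */
        printf("%.1f Hz\n", alias_frequency(60000.0, 100000.0));
        return 0;
    }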

 

Thanks

Klaus

www.rfbeam.ch

 

 

 

There is a need for a quick, low-cost device for controlling and communicating over the popular low-voltage differential signaling (LVDS) standard.  There is a need for transmit-only, receive-only, and transmit/receive USB devices.  The current solution is to buy the NI USB-6501 and build a daughter board with TTL-LVDS ICs on it to interface to digital equipment in the military/defense industry. Lots of time and energy is wasted on designing, building, and cabling boards as an additional interface step, where a simple USB solution could be made available.

I rarely have to set up hardware for a new analog measurement, and I always have to puzzle over the difference between RSE and NRSE modes. I think of the inverting input as the reference, so "Non-Referenced Single-Ended" doesn't make sense to me. And if I run the AISense line to my remote sensor, isn't that a referenced single-ended measurement?

 

Yesterday, I noticed that at least some online documentation now refers to GRSE (Ground-Referenced Single-Ended); adding that single letter helps a lot. What about adding another single letter and referring to the other mode as RRSE (Remote-Referenced Single-Ended)? One letter could save a lot of people a lot of time.

The X Series has support for a watchdog timer that can be used to define the states of all DIO and PFI lines.  It would be useful to me if it could also set the states of the analog outputs, since those outputs are often used to control systems that should enter a defined condition upon a fault.
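For context, a rough sketch of how the existing digital watchdog is armed through the DAQmx C API; the device name, line, and timeout are placeholders, and the idea above asks for an analogous expiration state for analog outputs:

    /* Existing capability: watchdog expiration states for digital lines only. */
    #include <NIDAQmx.h>

    int arm_digital_watchdog(TaskHandle *wdTask)
    {
        const char *lines = "Dev1/port0/line0";   /* placeholder DIO line        */
        int32 expState    = DAQmx_Val_Low;        /* drive low if the app stalls */

        DAQmxCreateWatchdogTimerTaskEx("Dev1", "wd", wdTask, 0.010);  /* 10 ms timeout */
        DAQmxCfgWatchdogDOExpirStates(*wdTask, lines, &expState, 1);
        return DAQmxStartTask(*wdTask);
    }

    /* The application then calls
       DAQmxControlWatchdogTask(task, DAQmx_Val_ResetTimer) periodically;
       the idea above asks for a comparable expiration state for AO channels. */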

 

Steve

 

  • Multifunction DAQ

It would be better to have auto-zero support on all DAQmx devices, or to allow applying an offset value in the read or channel configuration functions.


We have a need for a DAQ card that can handle more than 10 V.  I know there's an SC module that does 300 V, but attenuation by 30 in the signal conditioning seems a bit much.  For comparison, Agilent has a 100 V input PXI card without the signal conditioning (32 channels, 16-bit, 250 kS/s).


A customer of mine was trying to read 8 thermocouple channels simultaneously over the course of a week and then store the data.  She quickly found out that there are memory limitations in SignalExpress: eventually, no matter how she saved or logged the data, she would run out of memory.  My product suggestion is to write code that determines the RAM on a customer's computer, as well as the available hard drive space, and shows the customer (at the start of a task) exactly how long they can acquire signals without filling up their hard drive or seriously draining their resources.  That way, when they start a process, they are aware of what they're working with in terms of space at that instant, rather than when an error pops up three hours later.  Most customers would figure out these needs in advance of any acquisition, but for the customer's convenience this would be a welcome additional feature.  It would also be nice to have a document that explains how this time frame was calculated.  I suggest that when a task is run, a pop-up explains the maximum number of samples that can be acquired and the time that can pass.  The document I mentioned could be available via a hyperlink whose text reads "How did we calculate this number?", which could explain the process in more depth.
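As a back-of-the-envelope illustration of the calculation such a feature would make, a small sketch; the 8-bytes-per-sample figure assumes samples are stored as double-precision values, an assumption made only for illustration:

    /* Estimate how long an acquisition can run before filling the available disk space. */
    #include <stdio.h>

    double max_logging_hours(double freeDiskBytes, int numChannels,
                             double sampleRateHz, double bytesPerSample)
    {
        double bytesPerSecond = numChannels * sampleRateHz * bytesPerSample;
        return freeDiskBytes / bytesPerSecond / 3600.0;
    }

    int main(void)
    {
        /* Example: 8 thermocouple channels at 10 S/s, 100 GB free, 8 bytes/sample. */
        printf("Roughly %.0f hours of logging headroom\n",
               max_logging_hours(100e9, 8, 10.0, 8.0));
        return 0;
    }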

 

I have submitted this idea for SignalExpress at the product suggestion page found at www.ni.com/contact (the "feedback" link in the bottom left of the window).  I figured this would be a good idea for DAQ in general, for any extended signal acquisition.

 

Shawn Shaw

Applications Engineering Department

National Instruments


I'm not sure who reads this, but I'm not seeing any feedback in the forum, so I thought I'd post here.  This may seem like a simple thing, but hopefully my pain will be someone else's gain.  I messed with this off and on for two months before I finally figured it out.  It's probably obvious to those who work with this equipment every day and have EE degrees, but not so much to those of us who do not.

 

Please see http://forums.ni.com/t5/Multifunction-DAQ/Measuring-220VAC-with-CompactDAQ/td-p/1145956 for a description of the issues I was having. 

 

Some will tell me to "search"... well, I did.  And everywhere I looked, all I found was that I needed to measure from the power leg to ground.  The examples either measured only 120 V, which is fine since what you have is one power leg and neutral (which is basically ground), so those numbers come out fine; or they were for 3-phase, which I don't currently use.  But when you try to measure anything with two power legs (basically anything between 208 VAC and 240 VAC), the numbers don't work out right if you measure each power leg to ground and then try to recombine them and plug them into the Power VIs.  Nowhere that I looked did it tell me that I need to measure across both power legs with a single channel.  I only finally tried it because I'd tried pretty much everything else.  I know this is pretty basic stuff, but when everything you give me says one thing... that's what we do.  Just trying to help others.

Thanks,

Chad


As far as I know, DAQmx has provision to generate a counter output or to read digital input triggered by the change detection event.


It would be nice if one were capable of generating a digital output on occurrence of a change detection event (i.e. it could be triggered).

 

Also, digital output should be able to generate waveforms triggered by any of the digital input lines.

 

If possible, the counter counts should be available to the user in LabVIEW after triggering, to synchronise the various digital inputs/outputs to the counter state/count, since some of the actions are count dependent.
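For context, a rough sketch of the change detection support that exists today in the DAQmx C API (sampling digital inputs on change events); the device and line names are placeholders, and the idea above asks for digital output generation to be triggerable in the same way:

    /* Existing capability: digital input sampled on change detection events. */
    #include <NIDAQmx.h>

    int start_change_detection_read(TaskHandle *task)
    {
        const char *lines = "Dev1/port0/line0:7";   /* placeholder DI lines */

        DAQmxCreateTask("", task);
        DAQmxCreateDIChan(*task, lines, "", DAQmx_Val_ChanForAllLines);

        /* Sample whenever a rising or falling edge occurs on any of the lines. */
        DAQmxCfgChangeDetectionTiming(*task, lines, lines,
                                      DAQmx_Val_ContSamps, 1000);
        return DAQmxStartTask(*task);
    }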

  • Multifunction DAQ