Data Acquisition Idea Exchange

About Data Acquisition Idea Exchange

Have an idea for new DAQ hardware or DAQ software features?

  1. Browse by label or search in the Data Acquisition Idea Exchange to see if your idea has previously been submitted. If your idea exists, be sure to vote for it by giving it kudos to indicate your approval!
  2. If your idea has not been submitted, click Post New Idea to submit it. Be sure to submit a separate post for each idea.
  3. Watch as the community gives your idea kudos and adds their input.
  4. As NI R&D considers the idea, they will change the idea status.
  5. Give kudos to other ideas that you would like to see implemented!

Currently, when streaming analog or digital samples to a DAQ board, the output stays at the level of the last sample received when a buffer underflow occurs. This behavior can be observed on USB X Series multifunction DAQ boards; I have the USB-6363 model. The exact mode is hardware-timed, buffered, continuous, and non-regenerating. The buffer underflow error code is -200290: “The generation has stopped to prevent the regeneration of old samples. Your application was unable to write samples to the background buffer fast enough to prevent old samples from being regenerated.”
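For reference, here is a minimal C-API sketch of the mode I'm describing; the device name, rate, and chunk size are placeholders for my actual configuration:

```c
/* Hardware-timed, buffered, continuous, non-regenerating AO streaming. */
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;
    float64    data[1000] = {0};   /* first chunk of waveform samples */
    int32      written = 0;

    DAQmxCreateTask("", &task);
    DAQmxCreateAOVoltageChan(task, "Dev1/ao0", "", -10.0, 10.0,
                             DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(task, "", 2000000.0, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, 1000);
    /* Non-regenerating: stale samples may not be reused, so a late write
     * raises error -200290 instead of silently repeating old data. */
    DAQmxSetWriteRegenMode(task, DAQmx_Val_DoNotAllowRegen);

    DAQmxWriteAnalogF64(task, 1000, 0, 10.0, DAQmx_Val_GroupByChannel,
                        data, &written, NULL);
    DAQmxStartTask(task);
    /* ...keep feeding DAQmxWriteAnalogF64 fast enough, or -200290 fires
     * and the output freezes at the last sample delivered... */
    DAQmxStopTask(task);
    DAQmxClearTask(task);
    return 0;
}
```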

 

I would like to have an option to configure the DAQ hardware to immediately set the analog and digital outputs to a predefined state if a buffer underrun occurs. I would also like an option to immediately set one of the PFI pins on buffer underrun.

 

I believe this could be accomplished by modifying the X Series firmware and exposing the configuration of this feature in the DAQmx API. If no more samples are available in the buffer, the DAQ board should immediately write predefined digital states / analog levels to the outputs and indicate the buffer underrun state on a PFI line. Then it should report the error to the PC.
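To make the request concrete, this is roughly what the additions might look like; both calls below are hypothetical and do not exist in today's DAQmx API:

```c
/* HYPOTHETICAL -- neither function exists in the real DAQmx API; the
 * names are made up by analogy with existing attribute setters. */
#include <NIDAQmx.h>

/* Drive the channel to a predefined safe level the moment the FIFO
 * runs dry (hypothetical). */
int32 DAQmxSetAOUnderflowSafeValue(TaskHandle task, const char channel[],
                                   float64 safeValue);

/* Assert a PFI terminal on underrun so external hardware can react
 * immediately (hypothetical). */
int32 DAQmxExportUnderflowEvent(TaskHandle task, const char terminal[]);
```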

 

Doing this in firmware has certain advantages:

  1. It can be done quickly (possibly within the time of the next missing sample; at 2 MS/s that’s 0.5 µs).
  2. It handles all situations (software lockups, excessive CPU loading by other processes, loss of communication due to bus traffic, interface disconnection, and so on).
  3. It does not require any additional hardware (to turn off the outputs externally).
  4. A buffer underrun indication on a PFI line could provide an additional safety measure (it could be used, for example, to immediately disable an external power amplifier connected to the DAQ's analog outputs).

Doing this using other methods is just too slow, does not handle all situations, or requires additional external circuitry.

 

Setting the outputs from software once the error occurs is slow (~25 ms, the time of 50,000 samples at 2 MS/s) and does not handle physical disconnection of the interface. The analog output does eventually go to 0 V on the USB-6363 when the USB cable is disconnected, but it takes about half a second.
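For comparison, the software-only handling looks roughly like this sketch (single AO channel; names are placeholders). Everything below runs only after the hardware has already been sitting at the stale level:

```c
/* On error -200290, release the streaming task and drive the output to a
 * safe level with a separate software-timed task. The point above stands:
 * by the time this code runs, ~25 ms have already passed. */
#include <NIDAQmx.h>

static void ForceSafeLevel(void)
{
    TaskHandle safeTask = 0;
    float64    safe = 0.0;        /* predefined safe state */
    int32      written = 0;

    DAQmxCreateTask("", &safeTask);
    DAQmxCreateAOVoltageChan(safeTask, "Dev1/ao0", "", -10.0, 10.0,
                             DAQmx_Val_Volts, NULL);
    /* Software-timed single-sample write; autoStart = 1. */
    DAQmxWriteAnalogF64(safeTask, 1, 1, 10.0, DAQmx_Val_GroupByChannel,
                        &safe, &written, NULL);
    DAQmxClearTask(safeTask);
}

void StreamChunk(TaskHandle streamTask, const float64 *data, int32 n)
{
    int32 written = 0;
    if (DAQmxWriteAnalogF64(streamTask, n, 0, 10.0,
                            DAQmx_Val_GroupByChannel, data,
                            &written, NULL) == -200290) {
        DAQmxStopTask(streamTask);
        DAQmxTaskControl(streamTask, DAQmx_Val_Task_Unreserve);
        ForceSafeLevel();         /* far too late for a safety function */
    }
}
```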

 

Using the watchdog timer would also be too slow. The timer can be set to quite a short time, but from software I would not be able to reset it faster than every 10 ms. It would also require switching off the analog channels externally with additional circuitry, because the watchdog timer is not available for analog channels.
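For the digital lines, the existing watchdog looks like this sketch (timeout and names are placeholders; note it does nothing for the analog outputs):

```c
/* X Series watchdog for digital lines only: if the timer is not reset
 * in time, port0 goes to the expiration state (low here). */
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle wd = 0;

    /* Variadic list of lines/expiration-state pairs, NULL-terminated. */
    DAQmxCreateWatchdogTimerTask("Dev1", "wd", &wd, 0.050,
                                 "Dev1/port0/line0:7", DAQmx_Val_Low,
                                 NULL);
    DAQmxStartTask(wd);
    for (;;) {
        /* ...application work; in practice I cannot get back here more
         * often than every ~10 ms, which bounds the reaction time... */
        DAQmxControlWatchdogTask(wd, DAQmx_Val_ResetTimer);
    }
}
```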

 

The only viable solution right now is to route the task's sample clock to a PFI line and detect when it stops toggling; it does in fact stop after the last sample is generated. Once that occurs, the outputs can be switched off externally. This requires a whole lot of external circuitry and major development time. If you need the reaction time to be within one or two sample periods, the pulse detector needs to be customized for every sampling rate you might want to use. To make this work right for analog output, it would take a RISC microcontroller and analog electronic switches. If you wanted to use an external trigger to start the waveform, the microcontroller would have to turn on the analog switch, look for the beginning of the waveform's sample clock, record the initial clock interval as a reference, and finally turn off the switch if no pulse is received within the reference time.
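The only DAQmx call involved in this workaround is the clock export; everything else is external hardware:

```c
#include <NIDAQmx.h>

/* Route the AO task's sample clock to a PFI terminal so the external
 * pulse detector can watch for it to stop toggling ("/Dev1/PFI4" is
 * an example terminal). */
int32 ExportSampleClock(TaskHandle task)
{
    return DAQmxExportSignal(task, DAQmx_Val_SampleClock, "/Dev1/PFI4");
}
```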

 

I'm actually quite impressed by how well the USB-6363 handles streaming to outputs. It lets me output waveforms of a complexity that regular arbitrary generators with fixed memory and sequencing simply cannot handle, and buffer underflow is quite rare even at the highest sampling rate. However, to make my system robust and safe, I need the fast, simple, and reliable method of quickly shutting down the outputs that only a hardware/firmware solution can provide.

 

Thanks,

Sebastian

Hi all,

 

Any series card should have a feature that lists the different parameters it supports, such as voltage, temperature, etc. (maybe through a property node), so that the user can configure only the parameters the card actually supports.

Example: the SCXI-1520 module can be configured for strain, pressure, or voltage, but this information can be learned only from its manual or when a task is created in MAX. In LabVIEW, I can't get this information directly: the software also lets me configure the 1520 for temperature, and I find out that the module doesn't support temperature parameters only once I try to acquire.
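A sketch of what such a query could look like in the C API. Newer DAQmx drivers do expose a device attribute along these lines (DAQmxGetDevAISupportedMeasTypes), so check whether your version already has it; the device name is a placeholder:

```c
#include <stdio.h>
#include <NIDAQmx.h>

int main(void)
{
    int32 types[64];
    int32 i, n;

    /* With a NULL buffer, DAQmx array getters return the needed size. */
    n = DAQmxGetDevAISupportedMeasTypes("Dev1", NULL, 0);
    if (n < 0 || n > 64) return 1;             /* error or too many */
    DAQmxGetDevAISupportedMeasTypes("Dev1", types, (uInt32)n);
    for (i = 0; i < n; i++)                    /* e.g. DAQmx_Val_Voltage */
        printf("supported AI measurement type: %d\n", (int)types[i]);
    return 0;
}
```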

 

So, what do you think? Please share your ideas.

 

Regards,

geeta 

Hi

 

I'd like to see PCI Express versions of existing PCI analog output cards, e.g. the PCI-6713 and PCI-6733.

 

I'm finding it quite difficult (and quite a bit more expensive) to source desktop PCs featuring PCI slots.

 

 

Currently, there is no way to format a .txt file to log data with a timestamp that includes the actual time at which the data was acquired. When creating a "Save to ASCII/LVM" event, the "Time Axis Preference" setting under the File Settings tab has two options: Absolute Time or Relative Time. Relative Time works as expected, logging the time starting from 0 seconds until the data acquisition is complete. However, the Absolute Time setting logs the timestamp as seconds from some arbitrary point in time, usually the Windows system time in seconds.

 

This timestamp is essentially useless without a conversion to the actual time. It would be great if Absolute Time logged the current time as hours:minutes:seconds instead.
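In the meantime the column can be fixed in post-processing, assuming the absolute time is seconds since the LabVIEW epoch (1904-01-01 00:00:00 UTC):

```c
/* Shift a LabVIEW-epoch timestamp to the Unix epoch and format it as
 * hours:minutes:seconds. */
#include <stdio.h>
#include <time.h>

int main(void)
{
    double lvSeconds = 3408714000.0;        /* example value from a .lvm file */
    const double LV_TO_UNIX = 2082844800.0; /* seconds from 1904 to 1970 */
    time_t t = (time_t)(lvSeconds - LV_TO_UNIX);
    char buf[16];

    strftime(buf, sizeof buf, "%H:%M:%S", gmtime(&t));
    printf("%s\n", buf);                    /* prints "17:00:00" (UTC) */
    return 0;
}
```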

I continually come to your site looking for the DAQmx Base API manual and have yet to find it.  I eventually have to dig out an old CD to find my copy.

 

How 'bout posting these online so that we can help ourselves out of jams?

 

Thanks,

Jeff

In dealing with multiple projects and systems that each have different sets of tasks in MAX, I think it would be very handy if you could make virtual folders in the directory-style listing under "NI-DAQmx Tasks". That way you could group tasks by project, or by section of a project, instead of having one long list of task names.

 

Anyone else think this would be helpful? Or might it cause an issue in some way?

 

-pat

The title pretty much says it all. I would like the ability either to configure a full hardware complement as simulated devices and then switch them over to real devices when the hardware arrives, or to go from real devices to simulated devices, without the need to add new, discrete simulated devices in MAX.

 

This would make for much easier offline development and ultimate deployment to real hardware.

When it comes to documenting a measurement, you need to report ALL settings of the device that affect that measurement.

From a core memory dump written as a hex string to an XML document: anything that reveals differences in the settings that affect the measurement would be fine for documentation.

Something like a big property node readout followed by a Format Into String ... but make sure not to miss a property ... and it gets a bit more complicated when it comes to signal routing ...
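As a crude illustration of the idea in the C API: read a handful of attributes and format them into one string. A real dump would have to enumerate every attribute plus the signal routes, which is exactly why built-in support is needed. The channel name is a placeholder:

```c
#include <stdio.h>
#include <NIDAQmx.h>

/* Collect a few task/channel settings into a documentation string. */
int DumpSettings(TaskHandle task, char *out, size_t len)
{
    float64 minV = 0.0, maxV = 0.0, rate = 0.0;

    DAQmxGetAIMin(task, "Dev1/ai0", &minV);
    DAQmxGetAIMax(task, "Dev1/ai0", &maxV);
    DAQmxGetSampClkRate(task, &rate);
    return snprintf(out, len, "Dev1/ai0: %.3f..%.3f V, %.1f S/s\n",
                    minV, maxV, rate);
}
```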

 

A measurement that isn't sufficiently documented is all for naught. 

or

Just think of a nasty auditor ;)

 

It's so easy to make measurements with LabVIEW; please make it just as easy and consistent to document them.

 

Example:

A quick measurement setup with the DAQ Assistant/Express VIs fills gigabytes, but after a certain time the files are useless because nobody knows how they were taken. A simple checkbox could add all this information to the variant attributes of the waveform (or to TDMS, or ...), even if the operator doesn't have a clue about all the settings that affect his measurements.

 

Multiple people have requested a natural way for LabVIEW and SignalExpress to make a rotational speed measurement using a quadrature encoder. An Express VI under "Acquire Signals >> Counter Input >> Rotational Speed" that asks basic quadrature-encoder questions, such as ticks per revolution and decoding type (x1, x2, x4), and computes the rotational speed would be very useful. It could also be converted into a shipping example for DAQmx relatively easily. I have had multiple people ask this question, and I believe that this would be especially useful within SignalExpress.
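Until such an Express VI exists, this is roughly the underlying measurement in the C API (the device, counter, pulses per revolution, and the Windows Sleep call are assumptions):

```c
/* Angular-position task with X4 decoding, sampled twice to estimate
 * rotational speed in RPM. Assumes Windows for Sleep(). */
#include <stdio.h>
#include <windows.h>
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;
    float64    a0 = 0.0, a1 = 0.0;
    const float64 dt = 0.1;               /* seconds between reads */

    DAQmxCreateTask("", &task);
    /* 360 pulses/rev quadrature encoder, X4 decoding, scaled to degrees. */
    DAQmxCreateCIAngEncoderChan(task, "Dev1/ctr0", "", DAQmx_Val_X4,
                                0, 0.0, DAQmx_Val_AHighBHigh,
                                DAQmx_Val_Degrees, 360, 0.0, NULL);
    DAQmxStartTask(task);

    DAQmxReadCounterScalarF64(task, 10.0, &a0, NULL);
    Sleep((DWORD)(dt * 1000));
    DAQmxReadCounterScalarF64(task, 10.0, &a1, NULL);

    printf("rotational speed: %.1f RPM\n", (a1 - a0) / 360.0 / dt * 60.0);
    DAQmxClearTask(task);
    return 0;
}
```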

 

 

[Image: Rotation.png]

 

 

It gets a bit annoying that PXI1Slot2 is listed after PXI1Slot14 when doing an ASCII sort. I (OK, admittedly, my coworker) propose naming conventions that allow a proper ASCII sort, for instance PXI1Slot002 and PXI1Slot014.

I would like to be able to collect a couple of channels of analog input on the iPad. This (http://www.oscium.com/) is nice, but I need a minimum of 2 analog inputs, and I would rather have NI hardware.

 

Response from corporate:

"We don't currently have anything that would meet the customer's requirement of being able to plug in directly into the iPad for data acquisition.

I don't believe that the iPad supports Silverlight which is a framework developed by Microsoft.  Also, wireless DAQ has to communicate with a host running DAQmx, so the customer would still need a 2nd computer even if using wireless DAQ.

If you want to connect data acquisition hardware (of any form-factor) to a machine running LabVIEW and DAQmx,  then use LabVIEW Web Services to publish the front panel to the web and view/control it from his iPad.

We do have several USB products that will work with Windows-based netbooks that could be an alternative solution if topic is open to a non-Apple platform.  For example, the 5132/5133 are bus-powered digitizers with much higher sample rate, bandwidth, and buffer size compared to the Oscium device.  However, the price is also quite a bit higher."

NI provides some 100-pin DAQ devices, e.g. this one for industrial digital I/O:

http://sine.ni.com/nips/cds/view/p/lang/en/nid/13577

 

But why don't you also offer a basic connector block for a reasonable price, especially for industrial applications, where it is common to wire (DIO) signals through DIN-rail-mounted terminal blocks?

 

This connector block should have the following features:

 

- DIN rail mountable

- simple wire connection, ideally with spring terminals

- 100-pin cable connection (http://sine.ni.com/nips/cds/view/p/lang/en/nid/13600)

- relatively small, for installation in a switch cabinet

- no signal conditioning, just clamps

- much cheaper than the currently available SCB-100 block

 

Please see also this related idea:

http://forums.ni.com/t5/Data-Acquisition-Idea-Exchange/Terminal-Block-layouts/idi-p/2160542

 

Regards

A-T-R

 

While running a test I developed, I noticed an odd DAQmx behavior. After a USB-6212 connection was momentarily interrupted, all read and write tasks associated with the device hung until the USB cable was disconnected. It would have been easy to code around if there were a device-connection event and a device-disconnection event that I could use to pause operations and trigger a device reset. Since the devices are plug-and-play, couldn't the DAQmx API simply make the system's hardware connect/disconnect events visible?
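To make the request concrete, something modeled on the existing DAQmxRegisterDoneEvent would do; both registrations below are hypothetical and do not exist in today's API:

```c
/* HYPOTHETICAL -- proposed additions, named by analogy with the real
 * DAQmxRegisterDoneEvent. Neither function exists today. */
#include <NIDAQmx.h>

typedef int32 (*DAQmxDeviceEventCallbackPtr)(const char deviceName[],
                                             void *callbackData);

int32 DAQmxRegisterDeviceConnectedEvent(                 /* hypothetical */
          const char deviceName[],
          DAQmxDeviceEventCallbackPtr callback, void *callbackData);

int32 DAQmxRegisterDeviceDisconnectedEvent(              /* hypothetical */
          const char deviceName[],
          DAQmxDeviceEventCallbackPtr callback, void *callbackData);
```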

It has come up a few times from customers, and I wanted to gauge interest and solicit ideas on how this should work.

 

Currently, with the built-in TDMS logging support, if you want to change to a new file in the middle of logging, you need to stop the task and start again.  For some use cases, this isn't practical (for example, http://forums.ni.com/t5/LabVIEW/Why-the-TDMS-file-is-larger-than-it-should-be/m-p/1176139#M511099).

 

The question is: How would you like to specify the "new file" behavior and what are your use cases?

 

For instance, a couple of ideas to get the ball rolling:

  1. Add an interval attribute like "Change file after n samples". We would then auto-increment the file name and change to that file once we have logged n samples (see the sketch after this list).
  2. Make the file path attribute changeable at runtime. We already have a file path attribute for logging; the idea here would be to support changing it "on the fly" without stopping and starting the task. The downside is that it would not suit a use case where you want a specific file size very well, and it wouldn't be as easy to use as #1, though it would be more flexible.
  3. (Any additional ideas/use cases?)
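A sketch of what idea #1 could look like in the C API. DAQmxConfigureLogging is the real, existing call; DAQmxSetLoggingSampsPerFile is hypothetical here, named by analogy with the existing Logging attributes:

```c
#include <NIDAQmx.h>

int32 ConfigureChunkedLogging(TaskHandle task)
{
    /* Real call: enable built-in TDMS logging for this task. */
    int32 err = DAQmxConfigureLogging(task, "C:\\data\\run.tdms",
                                      DAQmx_Val_LogAndRead, "run",
                                      DAQmx_Val_OpenOrCreate);
    if (err < 0) return err;
    /* Hypothetical attribute: auto-increment to a new file after every
     * n logged samples. */
    return DAQmxSetLoggingSampsPerFile(task, 10000000ULL);
}
```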

Thank you for your input!

 

Andy McRorie

NI R&D


Following up on the above: what I'm currently thinking (because it seems the most flexible across different criteria and situations) is simply to allow you to set the file path property while the task is running (via the DAQmx Read property node). The only downside I can think of with this approach is that you wouldn't know exactly when we change to the new file. We could guarantee the change within (for example) one second, but you wouldn't be able to specify the exact file size.
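Sketched in the C API, the proposal is simply that the existing Logging.FilePath attribute becomes writable while the task runs (the path pattern is an example):

```c
#include <stdio.h>
#include <NIDAQmx.h>

/* Proposed: change the TDMS target while logging continues; DAQmx would
 * switch files within some bounded time rather than at an exact sample. */
void RotateLogFile(TaskHandle task, int index)
{
    char path[256];
    snprintf(path, sizeof path, "C:\\data\\run_%04d.tdms", index);
    DAQmxSetLoggingFilePath(task, path);
}
```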

 

Would this be a good solution for you?  Can you think of a better way to specify this behavior?

 

I could use a USB X Series multifunction device with more than 4 analog output (AO) channels. My current application does not require them to have waveform-generation capability. There are devices available with more than 4 AO channels, but I need AI, DI/DO, and counters as well, and I prefer them all on one device.

 

I use DAQmx a lot for writing .NET-based measurement software.

 

While the API itself is quite decent, the docs are horrible. Accessing them is convoluted at best, requiring the VS help viewer. Almost nothing is available online, and decent examples are quite scarce, which will definitely be an issue for absolute beginners...

 

This definitely deserves some attention!

 

Cheers,

 

Kris

I would like to have a programmable-gain amplifier in the analog output path that I could use to adjust the amplitude of an output signal. In control applications, this would be much better than having to stop a continuous task, reload the data with a new amplitude, and start the task again.

 

Ideally, for some of my applications, it would be nice to generate a basic waveform scaled to ±1 V and then have a property that I can write to while the task is running to set the gain.
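Sketched against the C API: if I'm not mistaken, DAQmx already defines an AO.Gain attribute (DAQmxSetAOGain), but device support is very limited, and what I'm asking for is the ability to write it while the task runs (channel name is a placeholder):

```c
#include <NIDAQmx.h>

/* Scale the running ±1 V base waveform in hardware; e.g. gain = 5.0
 * would yield ±5 V at the connector. */
int32 SetOutputAmplitude(TaskHandle task, float64 gain)
{
    return DAQmxSetAOGain(task, "Dev1/ao0", gain);
}
```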

It would be nice to have a hardware request forum as well so NI could see the demand for producing hardware for us. 

Just a few things I could use or want:

cDAQ counter modules (like a 6602 in cDAQ form) so I can add more counters to my CompactDAQ chassis, and a lower-end cRIO "brick" (think a 6009 morphed with a Single-Board RIO), something under $1k for simple embedded data-logging prototypes. Maybe also support for PIC micros (FPGA is so empowering; more microcontroller programming native to LabVIEW would be a nice addition to our toolbox).

 

Would a separate HW forum be helpful?

Currently, DSA devices that use voltage excitation have no way to provide that excitation to a device under test within test panels. The only method is to create a task in Measurement & Automation Explorer, which takes much more time than a simple test-panel check. This should be a fairly simple addition to the test panels user interface: a box to check if excitation is required, and a control to set the voltage level provided to the DUT. Test panels already offer this for IEPE devices, and it makes sense for voltage excitation to work the same way.
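Today's workaround, sketched in the C API, is to build the task yourself with explicit voltage excitation (names and values are placeholders; 10 V internal excitation on a full-bridge input here):

```c
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;

    DAQmxCreateTask("", &task);
    /* Voltage input with internal voltage excitation supplied to the DUT. */
    DAQmxCreateAIVoltageChanWithExcit(task, "Dev1/ai0", "",
                                      DAQmx_Val_Cfg_Default, -1.0, 1.0,
                                      DAQmx_Val_Volts, DAQmx_Val_FullBridge,
                                      DAQmx_Val_Internal, 10.0,
                                      0 /* excitation not used for scaling */,
                                      NULL);
    DAQmxStartTask(task);
    /* ...read with DAQmxReadAnalogF64, then clean up... */
    DAQmxClearTask(task);
    return 0;
}
```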

The NI-RIO driver package, for example, is 4 GB in its most recent version, which is comparable to the size of a common operating system. In my opinion that is too much if someone needs only a specific driver for a specific piece of NI hardware. I therefore suggest reducing the granularity of the driver packages to more manageable morsels (e.g. 200 MB max).