In my post on the LabVIEW board I asked if it was possible to have control over the DIO of a simulated DAQ device. Unfortunately, it seems this feature is not available: once MAX is closed, the DIOs run through their own sequences.
If there were a non-blocking way to control a simulated DAQ device through MAX, it would permit much simpler prototyping of systems before they are deployed to hardware. For example, if you want to see how a program responds to a value change, you could simply enter it in the non-blocking MAX UI. Or, as in my original case, it would make an executable usable even if you don't have all the necessary hardware.
I think this feature should only be available for simulated devices.
Thanks for reading - and hopefully voting,
Occasionally, I need to create global virtual channels that are used to acquire AC voltage signals. Currently, I just acquire the instantaneous values and take the RMS average in LabVIEW. However, this does not let you calibrate the global virtual channel in MAX (because the acquisition is the instantaneous DC voltage).
It would be nice if custom scales allowed user-customizable LabVIEW plug-ins, such as a point-by-point RMS average, so that I can calibrate an AC voltage channel in MAX.
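For reference, the scaling step itself is simple arithmetic; here is a minimal sketch in plain Python (not tied to any NI API) of the RMS conversion such a plug-in scale would perform on a block of instantaneous samples:

```python
import math

def rms(samples):
    """Root-mean-square of a block of instantaneous voltage samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# A 1 Vpk sine sampled over one full period should give ~0.707 V RMS.
period = [math.sin(2 * math.pi * n / 100) for n in range(100)]
print(round(rms(period), 3))  # -> 0.707
```

The point of the idea is that this conversion would live in the MAX custom scale, so the global virtual channel reports (and can be calibrated in) V RMS rather than instantaneous volts.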
NI provides some 100-pin DAQ devices, e.g. one for industrial digital I/O.
But why not also offer a basic connector block at a reasonable price, especially for industrial applications, where it is common to wire (DIO) signals through DIN-rail-mounted terminal blocks?
This connector block should have the following features:
- DIN rail mountable
- simple wire connection, ideally with spring terminals
- 100-pin cable connection
- relatively small, for installation in a switch cabinet
- no signal conditioning, just clamps
- much cheaper than the currently available SCB-100 block
Please see also this related idea:
Is there any technical reason why this cannot be added to DAQmx? M Series boards still have features that cannot be found on X or S Series, such as analog current input.
Ideally, it would be best to be able to have multidevice tasks for both M and X at the same time.
When it comes to documenting a measurement, you need to report ALL settings of the device that affect that measurement.
From a core-memory dump written as a hex string to an XML document... anything that shows up a difference in the settings that affect the measurement would be fine for documentation.
Something like a big property-node readout followed by a Format Into String... but make sure not to miss a property... and it gets a bit more complicated when it comes to signal routing.
A measurement that isn't sufficiently documented is all for naught.
Just think of a nasty auditor.
It's so easy to make measurements with LabVIEW; please make it just as easy and consistent to document them.
A quick measurement setup with the DAQ Assistant/Express VIs fills gigabytes, but after a certain time the data are useless because nobody knows how they were taken. A simple checkbox could add all this information to the variant of the waveform (or TDMS, or ...), even if the operator doesn't have a clue about all the settings that affect his measurements.
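To make the request concrete, here is a sketch in Python of the kind of settings snapshot I mean, serialized to one flat, diff-friendly string that could ride along with the data. The property names here are purely illustrative, not the real DAQmx property list:

```python
import json

# Hypothetical snapshot of the properties that affect one measurement
# (illustrative names only; a real dump would enumerate every relevant
# DAQmx property, including signal routes).
settings = {
    "device": "Dev1",
    "channel": "ai0",
    "range_V": [-10.0, 10.0],
    "terminal_config": "Differential",
    "sample_rate_Hz": 51200.0,
    "input_coupling": "AC",
}

# One string that can be attached to the waveform variant, a TDMS
# property, etc. Sorted keys make two dumps easy to diff.
doc = json.dumps(settings, sort_keys=True)
print(doc)
```

Comparing two such strings immediately "shows up a difference in the settings", which is exactly what an auditor would ask for.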
I would like to be able to collect a couple of channels of analog input on the iPad. This is nice, but I need a minimum of 2 analog inputs, and I would rather have NI: http://www.oscium.com/
Response from corporate:
"We don't currently have anything that would meet the customer's requirement of being able to plug directly into the iPad for
I don't believe that the iPad supports Silverlight, which is a framework developed by Microsoft. Also, wireless DAQ has to communicate with a host running DAQmx, so the customer would still need a second computer even if using wireless DAQ.
If the customer wants to connect data acquisition hardware (of any form factor) to a machine running LabVIEW and DAQmx, he can use LabVIEW Web Services to publish the front panel to the web and view/control it from his iPad.
We do have several USB products that work with Windows-based netbooks and could be an alternative solution if the topic is open to a non-Apple platform. For example, the 5132/5133 are bus-powered digitizers with much higher sample rate, bandwidth, and buffer size than the Oscium device. However, the price is also quite a bit higher."
Would it be possible to update the export wizard in MAX so that the NI-DAQmx Tasks list under Data Neighborhood is listed in alphabetical order? In the main MAX application the list is in order, so finding tasks that are named with a common prefix is easy. However, in the export wizard you have to scroll and hope you clicked them all.
Certified LabVIEW Developer
Lead Engineer - LabVIEW
I continually come to your site looking for the DAQmx Base API manual and have yet to find it. I eventually have to dig out an old CD to find my copy.
How 'bout posting these online so that we can help ourselves out of jams?
Multiple people have requested a natural way for LabVIEW and SignalExpress to make a rotational speed measurement using a quadrature encoder. An Express VI under "Acquire Signals>>Counter Input>>Rotational Speed" that asks basic quadrature-encoder questions, such as ticks per revolution and decoding type (x1, x2, x4), and computes the rotational speed would be very useful. In addition, this could then be turned into a shipping example for DAQmx relatively easily. I have had multiple people ask this question and believe that, especially within SignalExpress, this would be very useful.
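The arithmetic such an Express VI would hide is straightforward; here is a sketch in Python, assuming the counter reports an edge count over a fixed gate time (the function name and parameters are illustrative):

```python
def rotational_speed_rpm(edge_count, gate_time_s, ticks_per_rev, decoding=4):
    """Rotational speed from a quadrature encoder count.

    decoding: 1, 2, or 4 (x1/x2/x4) - edges counted per encoder line.
    """
    revolutions = edge_count / (ticks_per_rev * decoding)
    return revolutions / gate_time_s * 60.0

# 1024-line encoder, x4 decoding, 4096 edges in 0.1 s -> 600 RPM.
print(rotational_speed_rpm(4096, 0.1, 1024, decoding=4))
```

The Express VI would only need to collect these same inputs (ticks per revolution, decoding type) in its configuration dialog and apply this conversion to the counter reading.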
Any series card should have a feature listing the different parameters it supports, such as voltage or temperature (perhaps through a property node), so that the user can configure only among the supported parameters.
Ex: an SCXI-1520 module can be configured for strain, pressure, or voltage, but this information is only available from its manual or when a task is created in MAX. In LabVIEW I can't get this information directly: the software also lets me configure the 1520 for temperature, and I only find out that the module doesn't support temperature once I try to acquire.
What do you think? Please share your ideas on this.
I could use a USB X Series multifunction device with more than 4 analog output (AO) channels. My current application does not require them to have waveform generation capability. There are devices available with more than 4 AO, but I need AI, DI/DO, and CTR as well, and prefer them all on one device.
The title pretty much says it all. I would like the ability to either configure a full hardware complement as simulated devices and then switch them over to real devices when the hardware arrives, or to go from real devices to simulated devices, without the need to add new, discrete simulated devices in MAX.
This would make for much easier offline development and ultimate deployment to real hardware.
It seems the only indication in MAX that a device is simulated is that, under the Devices and Interfaces section, the tiny glyph to the left of the device name is colored yellow instead of white/transparent. I end up not remembering which color means what. It would be useful to add the text "Simulated" next to the device name. It would also help to distinguish simulated devices by making that glyph green (instead of the current transparent/white) when the device is installed and detected, and red (keeping the existing red X) if it had been detected and assigned a device number but is currently not installed/detected. Then simulated devices being yellow would imply "warning/caution" or "not real". Perhaps also show a help-hint popup ("Detected", "Not Detected", or "Simulated") when the mouse hovers over a device name.
The Analog Input Test Panel in Measurement & Automation Explorer (MAX) provides a quick way to examine a signal and vary acquisition parameters. It would be useful to be able to zoom the time axis and have a cursor display so that, for example, noise level or rise time could be examined in more detail. The time-axis limits can currently be overwritten manually as a way to zoom, but that is cumbersome. Assuming the graph in this test panel is built from a standard NI graph, it should already have zoom and cursor capability, which could thus be easily exposed.
I would like to have a programmable gain amplifier in the analog output path that I can use to adjust the amplitude of an output signal. In control applications, this would be much better than having to stop a continuous task, reload the data with a new amplitude, and restart the task.
Ideally, for some of my applications, it would be nice to be able to generate a basic waveform scaled to +/- 1V and then have a property that I can write to while the task is running to set the gain.
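Until such a gain property exists in hardware, the same effect can only be approximated in software by rescaling and rewriting the buffer, which is exactly the interruption the idea is trying to avoid. A minimal sketch in plain Python (function names are illustrative, not any NI API) of the intended split between a fixed unit waveform and a separately applied gain:

```python
import math

def unit_sine(n_samples):
    """One period of a sine wave scaled to +/-1 V."""
    return [math.sin(2 * math.pi * i / n_samples) for i in range(n_samples)]

def apply_gain(unit_waveform, gain):
    """What a hardware PGA would do for free: scale the unit waveform."""
    return [gain * s for s in unit_waveform]

wave = unit_sine(256)      # generated once, at +/-1 V
scaled = apply_gain(wave, 2.5)
print(max(scaled))         # peak amplitude is now 2.5 V
```

With a writable gain property on the task, only the multiplier would change at run time; the unit waveform in the output buffer would never need to be regenerated or reloaded.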
I suggest NI produce an inexpensive (<$100) USB "stick" with 2 hardware counters for optically isolated measurement of encoders or other high-speed devices. The stick would have a standard connector for easy wiring of differential encoders with A/B/Z lines. The device would enable measuring two separate encoders, or tracking two sections of a shaftless drive line that needs to position-follow. One or two DIO lines would be a bonus. This seems a good fit for the industrial machine market (at the very least). Today you need to buy a multifunction DAQ for several hundred dollars if you want two counters.
Contact me with any further questions.