Occasionally, I need to create global virtual channels that are used to acquire AC voltage signals. Currently, I just acquire the instantaneous values and take the RMS average in LabVIEW. However, this does not let you calibrate the global virtual channel in MAX (because the acquisition is the instantaneous DC voltage).
It would be nice to have the custom scales allow user customizable LabVIEW programming plug-ins, such as RMS average point by point, so that I can calibrate an AC voltage channel in MAX.
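To illustrate the kind of plug-in I mean, here is a minimal sketch of a point-by-point RMS over a sliding window (plain Python for illustration only; the class name and windowing scheme are my own assumptions, not an existing MAX or LabVIEW API):

```python
import math
from collections import deque

class PointByPointRMS:
    """Running RMS over a sliding window, updated one sample at a time."""
    def __init__(self, window_size):
        self.window = deque(maxlen=window_size)

    def add_sample(self, x):
        self.window.append(x)
        return math.sqrt(sum(v * v for v in self.window) / len(self.window))

# A sine of amplitude A has RMS A / sqrt(2); feed in 10 full cycles:
rms = PointByPointRMS(window_size=1000)
for i in range(1000):
    out = rms.add_sample(math.sin(2 * math.pi * 10 * i / 1000))
print(round(out, 3))  # ≈ 0.707
```

A scale plug-in like this would let MAX display (and calibrate against) the RMS value instead of the instantaneous DC voltage.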
We often use the NI CompactDAQ 9234 for sound measurements.
Our standard microphones with IEPE amplifier have a noise level of about 16 dB(A) and a sensitivity of 40...50 mV/Pa.
The noise of the 9234 is about 50 μV rms, corresponding to a sound level of about 32 dB(A). So we can use these microphones only for measurements above 35 dB(A).
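For reference, this is how the input noise voltage converts to an equivalent sound level given the microphone sensitivity; a rough unweighted calculation in Python (A-weighting shifts the broadband result down a couple of dB, toward the 32 dB(A) figure above):

```python
import math

def noise_floor_spl(v_noise_rms, sensitivity_v_per_pa, p_ref=20e-6):
    """Equivalent sound pressure level of an input noise voltage, re 20 uPa."""
    p = v_noise_rms / sensitivity_v_per_pa   # equivalent pressure in Pa
    return 20 * math.log10(p / p_ref)        # dB SPL

# 50 uV rms referred through a 50 mV/Pa microphone:
spl = noise_floor_spl(50e-6, 50e-3)
print(round(spl, 1))  # 34.0 dB unweighted
```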
A better version of the 9234 with two ranges, 5 V and 0.5 V, would be very useful. The noise in the lower range should of course not exceed 5...10 μV (12...18 dB(A)).
Many monitoring systems have only one microphone, so we use only one channel of the 9234.
For these cases, a lower-priced one-channel card would be fine.
A two-channel card would be perfect: two-channel measurement of one microphone signal in both ranges. The sound level program could then measure from 20 dB(A) to 136 dB peak without range switching.
I want to buy a small standalone controller-based NI data acquisition system with the following features:
By standalone I mean that the DAQ system doesn't require a permanent PC connection (i.e. it is programmed once), and it sends the acquired signals to a remote location via an Ethernet interface.
Please suggest a DAQ system with the above-mentioned features.
By default, DAQmx terminal constants/controls only show a subset of what is really available. To see everything, you have to right-click the terminal and select "I/O Name Filtering", then check "Include Advanced Terminals":
I guess this is intended to prevent new users from being overwhelmed. However, what it really does is create a hurdle that prevents them from configuring their device in a more "advanced" manner, since they have no idea that the name filtering box exists.
I am putting "advanced" in quotes because I find the distinction very much arbitrary.
As a more experienced DAQmx user, I change the I/O name filtering literally every time I put down a terminal, without thinking about it (who can keep track of which subset of DAQmx applications is considered "advanced"?). The worst part is trying to explain how to do something to newer users and having to tell them to change the I/O name filtering every single time (or, if you don't, you'll almost certainly get a response back like this).
Why not make the so-called "advanced" terminals show in the drop-down list by default?
With the NI 9234 board you can use 4 IEPE sensors, but you don't have IEPE open/short detection capability.
The NI 9232 board has IEPE open/short detection capability but has only 3 channels.
I think that a board with 4 channels (as 9234) and an IEPE open/short detection capability would be great!
Currently we are using LabWindows/CVI with a 96 bit DIO card (PXI-6509).
What we have found, and NI support confirmed, is that with the following configuration the software needs to be aware of the bit offset when writing to one or more lines on a port:
Virtual Channel    Physical Channel
dataEnable         dev1/port0/line2
Our assumption was that writing a value of 1 to 'dataEnable' using DAQmxWriteDigitalU8() would write to the virtual channel 'dataEnable'. What we found is that this is not the case: we need to write a value of 0x04. However, the bits that are set to zero in the value written to 'dataEnable' have no effect on other lines on the port that are already set. This gives us the impression that the driver knows what bit position we are trying to write to.
So, based on this, why is it not possible, when writing to a virtual channel from LabWindows/CVI, to just do something like this:
Virtual Channel Physical Channel
Line 2 = dataEnable
Line 3 = dataClk
write( dataEnable_Clk, 1) // to set enable line high
write( dataEnable_Clk, 3) // to keep enable line high and raise clk line
write( dataEnable_Clk, 1) // keep enable line high and lower clk line
write( dataEnable_Clk, 0) // lower enable line
** The assumption is that separate lines on another port are used to present the data to the external hardware; they are not shown here. The data would have been set up before the sequence above, then changed and the sequence repeated as needed.
Here I don't have to keep in mind that Enable is on line 2 and Clk is on line 3, or have to set up values of 0x04 for the Enable and 0x08 for the Clk. If I have to do this, I would rather have direct access to each port to just write the values directly. I know there is register-level access I can use, but doing this at a higher level is better.
In our code, when an internal function is called to write data, we would like to just write a value to the virtual channel and not have to figure out the bit alignment to shift the value over for one of the current functions.
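To make the request concrete, here is a rough sketch (in Python, purely illustrative; `VirtualLineGroup` is a hypothetical helper, not part of the DAQmx API) of the mapping I would like the driver to do for me:

```python
class VirtualLineGroup:
    """Map a logical value onto physical port bits, hiding the line offset.

    E.g. lines 2 (dataEnable) and 3 (dataClk): writing 1 sets bit 2,
    writing 3 sets bits 2 and 3, and so on.
    """
    def __init__(self, first_line, num_lines):
        self.shift = first_line
        self.mask = ((1 << num_lines) - 1) << first_line

    def to_port_value(self, logical_value, current_port=0):
        # Shift the logical value into place; leave the other port bits alone.
        return (current_port & ~self.mask) | ((logical_value << self.shift) & self.mask)

grp = VirtualLineGroup(first_line=2, num_lines=2)
print(hex(grp.to_port_value(1)))  # 0x4 -> enable high
print(hex(grp.to_port_value(3)))  # 0xc -> enable high, clk high
```

With something like this inside the driver, the write(dataEnable_Clk, 1) sequence above would work without the caller ever knowing the lines sit at bits 2 and 3.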
Let me know your thoughts.
I need to acquire signals from an incremental encoder, and I have an NI USB-6259 board to do this.
It seems that this hardware has special inputs for position acquisition from incremental encoders.
Looking at the USB-6259 datasheet, it seems I could do the data acquisition using the Counter/Timer inputs.
If that is right, I should send the TTL square waves generated by the encoder to these inputs, but my encoder has sinusoidal outputs with a certain phase between the two signals.
If I need to send TTL signals to my USB-6259, I will have to convert the sinusoidal signals with additional hardware.
Is that right? Has anyone acquired encoder signals before? Suggestions?
Thanks in advance
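In hardware this conversion is usually done with a comparator; to illustrate the idea, here is a minimal software sketch of Schmitt-trigger (hysteresis) logic in Python. The threshold values are arbitrary choices for illustration, not a recommendation:

```python
import math

def schmitt(samples, high=0.1, low=-0.1):
    """Convert a sinusoidal signal to a TTL-like square wave with hysteresis.

    Hysteresis (high > low) prevents noise near zero from causing
    spurious edges, which would corrupt a position count.
    """
    state = 0
    out = []
    for x in samples:
        if state == 0 and x > high:
            state = 1
        elif state == 1 and x < low:
            state = 0
        out.append(state)
    return out

sine = [math.sin(2 * math.pi * i / 100) for i in range(200)]  # two periods
square = schmitt(sine)
print(sum(square))  # roughly half the samples are high
```

The same A/B phase relationship survives the conversion, so quadrature counting still works on the resulting square waves.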
Texas Instruments makes a superb line of 16-bit sigma-delta ADCs with sampling rates up to 10 MHz: the ADS1610, ADS1602, etc.
They sell for about $25 each in modest quantities.
Using these converters would provide much better fidelity than any available products as there is no need for external analog antialiasing filters.
I'll place my order now. I personally need 1, 2, 4, or 8 AIs and 1 or 2 AOs with the same sampling frequency from the same clock. I don't need all those digital I/Os and quadrature decoders.
Just give me analog I/O with sigma-delta converters. When can I place an order?
We use our acquisition software with a variety of hardware configurations. We validate our configurations using simulated hardware, but every time we need to check out a different configuration, we have to delete and create simulated devices. It would be nice to have a better method for switching between different simulated configurations.
I bought an NI USB-6251 BNC, but support explained to me that it has no Linux support out of the box. I will now have to find out how to use it on Linux systems myself (perhaps with help from the forum). It would be a nice feature if it shipped with Linux support.
Measurement & Automation Explorer (MAX)'s Test Panel Analog Input provides a quick method to examine a signal and vary acquisition parameters. It would be useful to be able to zoom the time axis and have a cursor display so that, for example, noise level or rise time could be examined in more detail. The time axis limits can currently be overwritten manually as a way to zoom, but that is cumbersome. Assuming the graph used in this test panel is built from a standard NI graph, it should already have zoom and cursor capability as part of it, so these features should be easy to add.
NI should make sure that the measurement uncertainty specifications for its DAQ hardware are aligned with uncertainty analyses performed according to the ISO "Guide to the Expression of Uncertainty in Measurement" (GUM). See http://www.bipm.org/en/publications/guides/gum.htm
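For reference, the core GUM operation is a root-sum-square combination of independent standard uncertainty components, expanded with a coverage factor. A minimal sketch (the component values here are made up for illustration):

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-square of independent standard uncertainty components (GUM)."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical example: gain-error and offset-error components in microvolts (k=1)
u_c = combined_standard_uncertainty([30.0, 40.0])
U = 2 * u_c  # expanded uncertainty, coverage factor k = 2 (~95 % level)
print(u_c, U)  # 50.0 100.0
```

Specifications stated as standard uncertainties with a declared coverage factor would drop straight into this kind of budget; specifications stated as bare "accuracy" numbers do not.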
Absolute encoders have been around for some time, but NI's motion hardware still supports only incremental encoders. I would like to see support for absolute encoders in NI Motion or NI Soft Motion.
NI supports almost any bus. Why not SSI (synchronous serial interface) ?
Of course, there is always the option to use an R series card and then build an interface. Why not have a low-cost PCI or USB card? Also, perhaps a C-series module, so that we don't have to take up FPGA space?
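For context, decoding an SSI frame is simple: clock in an MSB-first bit stream, then (for most absolute encoders) convert the Gray-coded word to binary. A minimal Python sketch of the decode side (the 13-bit word length is just an example):

```python
def gray_to_binary(gray):
    """Convert a Gray-coded SSI position word to plain binary."""
    mask = gray >> 1
    while mask:
        gray ^= mask
        mask >>= 1
    return gray

def read_ssi_word(bits):
    """Assemble an MSB-first SSI bit stream into an integer word."""
    word = 0
    for b in bits:
        word = (word << 1) | b
    return word

# 13-bit Gray word 0b0000000000110 corresponds to position 4:
pos = gray_to_binary(read_ssi_word([0] * 10 + [1, 1, 0]))
print(pos)  # 4
```

The protocol itself is just a clock output and a data input, which is exactly why a low-cost card or C-series module seems feasible without burning FPGA space on an R series target.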
Hi everyone, I have a new project measuring pressure and temperature. The problem is that I must put everything in a box rated IP 65 (a high protection rating), and the data will be logged at about a 5-minute period; each week I will take the data and load it onto a computer. Can I use NI products to do this? Are there any NI products with large internal memory for storage?
Currently, when streaming analog or digital samples to a DAQ board, the output stays at the level of the last sample received when a buffer underflow occurs. This behavior can be observed on USB X Series Multifunction DAQ boards; I have the USB-6363 model. The exact mode is hardware-timed, buffered, continuous, and non-regenerating. The buffer underflow error code is -200290: “The generation has stopped to prevent the regeneration of old samples. Your application was unable to write samples to the background buffer fast enough to prevent old samples from being regenerated.”
I would like an option to configure the DAQ hardware to immediately set the analog and digital outputs to a predefined state if a buffer underrun occurs. I would also like an option to immediately set one of the PFI pins on buffer underrun.
I believe this could be accomplished by modifying the X Series firmware and exposing the configuration of this feature in the DAQmx API. If no more samples are available in the buffer, the DAQ board should immediately write the predefined digital states / analog levels to the outputs and indicate the buffer underrun state on a PFI line. Then it should report the error to the PC.
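To make the requested behavior concrete, here is a toy Python model of what I would like the firmware to do. Everything here (class and attribute names, the safe-state configuration) is hypothetical, not an existing DAQmx API:

```python
class MockOutputTask:
    """Toy model of a DAQ output task with a firmware-level fail-safe state."""
    def __init__(self, safe_value=0.0):
        self.safe_value = safe_value   # hypothetical configurable underflow state
        self.buffer = []
        self.output = None
        self.underflow_pfi = 0         # hypothetical PFI underflow flag

    def write(self, samples):
        self.buffer.extend(samples)

    def clock_tick(self):
        # On each sample clock, drive the next sample; on underflow,
        # jump to the safe state and assert the PFI line immediately,
        # within one sample period, before reporting the error to the PC.
        if self.buffer:
            self.output = self.buffer.pop(0)
        else:
            self.output = self.safe_value
            self.underflow_pfi = 1

task = MockOutputTask(safe_value=0.0)
task.write([1.0, 2.0])
for _ in range(3):
    task.clock_tick()   # third tick underflows
print(task.output, task.underflow_pfi)  # 0.0 1
```

The key point is that the transition to the safe state happens at sample-clock speed in the device, not at software or USB round-trip speed.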
Doing this in firmware has certain advantages: other methods are just too slow, do not handle all situations, or require additional external circuitry.
Setting the outputs from software once the error occurs is slow (~25 ms, the time of 50,000 samples at 2 MS/s) and does not handle physical disconnection of the interface. The analog output does eventually go to 0 V on the USB-6363 when the USB cable is disconnected, but it takes about half a second.
Using the watchdog timer would also be too slow. The timer can be set to quite a short time, but from software I would not be able to reset it faster than every 10 ms. It would also require switching off the analog channels externally with additional circuitry, because the watchdog timer is not available for analog channels.
The only viable solution right now is to route the task sample clock to a PFI line and detect when it stops toggling (it does stop after the last sample is generated). Once that occurs, the outputs can be switched off externally. This requires a whole lot of external circuitry and major development time. If you need the reaction time to be within one or two samples, the pulse detector needs to be customized for every sampling rate you might want to use. To make this work for analog output, it would take a RISC microcontroller and analog electronic switches. If you wanted to use an external trigger to start the waveform, the microcontroller would have to turn on the analog switch, watch for the beginning of the waveform sample clock, record the initial clock interval as a reference, and finally turn off the switch if no pulse is received within the reference time.
I’m actually quite impressed by how well the USB-6363 handles streaming to outputs. This lets me output waveforms with a complexity that regular arbitrary generators with fixed memory and sequencing simply cannot handle. Buffer underflow, even at the highest sampling rate, is quite rare. However, to make my system robust and safe, I need the fast, simple, and reliable method of quickly shutting down the outputs that only a hardware/firmware solution can provide.
NI provides some 100-pin DAQ devices, e.g. one for industrial digital I/O.
But why don't you also offer a basic connector block for a reasonable price, especially for industrial applications, where it is common to wire (DIO) signals through DIN-rail-mounted terminal blocks?
This connector block should have the following features:
- DIN rail mountable
- simple wire connection, best with spring terminals
- 100 Pin-cable connection
- relatively small for installation in a switch cabinet
- no signal conditioning, just clamps
- much cheaper than the currently available SCB-100 block
Please see also this related idea:
I've been told that in order to change module settings, you have to connect a development machine to the network that has the cRIO hardware. Most systems I build have a deployed application and are located somewhere else (i.e. not connected to my machine). It would be nice if the cRIO module settings could be ported to the cRIO unit without having to connect a development machine and hit "Connect" in the project.
For example, I had a cRIO unit in my office and set it up for a project. They installed it, wired it, and tested it. A few months later we needed to add an NI 9213 (16-channel thermocouple module). It defaults to type J and degrees Celsius. In order to switch it to type K, I had to bring my desktop out to the unit, connect to the network, and redeploy the cRIO. I called support twice and was told this is the only way to deploy the module settings. If I'm wrong, someone please correct me.
I'm thinking it could be another type of application builder or something you could add to an existing application.