Data Acquisition Idea Exchange


1. On page 2-4 of the manual (http://www.ni.com/pdf/manuals/375865a.pdf):

A sketch or a picture would be helpful for understanding the text.

Page 2-4 would be easier to understand with a small picture showing how to connect AI SENSE, for example.

2. A terminal diagram in the manual for each card (PXI, PCI) would also be very helpful.

Alternatively, a sheet with the terminal diagram could be shipped with the devices.

1. On page 2-4 of the manual (http://www.ni.com/pdf/manuals/370489g.pdf):

A sketch or a picture would be helpful for understanding the text.

2. A terminal diagram in the manual for each card (PXI, PCI) would also be very helpful.

Alternatively, a sheet with the terminal diagram could be shipped with the devices.

 

Hi Everyone. I have an old Kistler type 7001 pressure sensor and a type 5007 charge amplifier, and I'm going to use an NI USB-6009 DAQ board to measure in-cylinder pressures of a single-cylinder diesel engine. I'm very new to LabVIEW and to using all these sensors. Can someone kindly tell me how to start? I would be very grateful if someone could post a program (block diagram) to collect the data, convert the voltage to pressure, and write the final output to a text file. Thanks a lot in advance.
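As a starting point, here is a minimal, untested sketch of the acquisition side using the NI-DAQmx C API (a LabVIEW block diagram would use the equivalent DAQmx VIs plus a file-write VI). The device name "Dev1/ai0", the ±10 V range, and the BAR_PER_VOLT factor are assumptions; use the device name shown in MAX and the full-scale setting chosen on the 5007 charge amplifier.

/* Untested sketch: single-channel voltage acquisition on a USB-6009,
   scaled to pressure with an assumed factor and appended to a text file. */
#include <stdio.h>
#include <NIDAQmx.h>

#define BAR_PER_VOLT 10.0   /* assumed scaling, e.g. 10 V full scale = 100 bar on the 5007 */

int main(void)
{
    TaskHandle task = 0;
    float64    data[1000];
    int32      read = 0;
    FILE      *fp = fopen("pressure_log.txt", "a");

    if (fp == NULL)
        return 1;

    DAQmxCreateTask("", &task);
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_RSE,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(task, "", 1000.0, DAQmx_Val_Rising,
                          DAQmx_Val_FiniteSamps, 1000);          /* 1 kS/s, 1000 samples */
    DAQmxStartTask(task);
    DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                       data, 1000, &read, NULL);

    for (int32 i = 0; i < read; i++)
        fprintf(fp, "%f\n", data[i] * BAR_PER_VOLT);             /* volts -> bar */

    fclose(fp);
    DAQmxClearTask(task);
    return 0;
}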

We often use the NI CompactDAQ 9234 for sound measurements.

Our standard microphones with IEPE amplifiers have a noise level of about 16 dB(A) and a sensitivity of 40...50 mV/Pa.

The noise of the 9234 is about 50 μV rms, which corresponds to a sound level of about 32 dB(A), so we can use these microphones only for measurements above 35 dB(A).

A version of the 9234 card with two ranges, 5 V and 0.5 V, would be very useful. The noise in the lower range should of course not exceed 5...10 μV (12...18 dB(A)). The arithmetic behind these figures is sketched below.
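A rough back-of-the-envelope check of those figures, with an assumed 45 mV/Pa sensitivity and no A-weighting (A-weighting lowers the broadband value by a few dB):

/* Untested sketch: convert an rms noise voltage to an equivalent sound level. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double p0   = 20e-6;   /* reference pressure, 20 uPa */
    const double sens = 0.045;   /* assumed microphone sensitivity, 45 mV/Pa */
    const double v_noise[] = { 50e-6, 10e-6, 5e-6 };   /* 9234 today; proposed low range */

    for (int i = 0; i < 3; i++) {
        double p   = v_noise[i] / sens;        /* equivalent pressure in Pa */
        double spl = 20.0 * log10(p / p0);     /* equivalent sound level in dB */
        printf("%4.0f uV noise -> about %4.1f dB\n", v_noise[i] * 1e6, spl);
    }
    return 0;
}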

Many monitoring systems have only one microphone, so we use only one channel of the 9234.

For these cases a lower-priced one-channel card would be fine.

A two-channel card would be perfect: two-channel measurement of one microphone signal in both ranges, so the sound level program could measure from 20 dB(A) up to 136 dB peak without range switching.

 

 

I want to buy a small standalone, controller-based NI data acquisition system with the following features:

  • 16 AIs
  • 4 Counters
  • Ethernet interface

By standalone I mean that the DAQ system should not require a permanent PC connection (i.e. it is programmed once), and that it sends the acquired signals to a remote location via the Ethernet interface.

 

Please suggest a DAQ system with the above-mentioned features.

 

Thanks

 

Regards,

WAQAR

Currently we are using LabWindows/CVI with a 96-bit DIO card (PXI-6509).

 

What we have found, and NI support confirmed, is that with the following configuration the software needs to be aware of the bit offset when writing to one or more lines on a port.

 

Virtual Channel                    Physical Channel

dataEnable                          dev1/port0/line2

 

Our assumption was that writing a value of 1 to 'dataEnable' using DAQmxWriteDigitalU8() would set the virtual channel 'dataEnable'.  What we found is that this is not the case: we need to write a value of 0x04.  However, the bits set to zero in the value written to 'dataEnable' have no effect on other lines on the port that are already set.  This gives us the impression that the driver knows which bit position we are trying to write to.

 

So, based on this, why is it not possible, when I call a write to a virtual channel from LabWindows/CVI, to just do something like this:

 

Virtual Channel                    Physical Channel

dataEnable_Clk                   dev1/port0/line2:3

 

Line 2 = dataEnable

Line 3 = dataClk

 

write( dataEnable_Clk, 1)               // to set enable line high

write( dataEnable_Clk, 3)               // to keep enable line high and raise clk line

write( dataEnable_Clk, 1)               // keep enable line high and lower clk line

write( dataEnable_Clk, 0)               // lower enable line

 

** The assumption is that separate lines on another port are used to present the data to the external hardware; they are not shown here.  The data would have been set up before the sequence above, and then the data changes and the sequence repeats as needed.

 

Here I don't have to keep in mind that Enable is on line 2 and Clk is on line 3, or have to set up values of 0x04 for the Enable and 0x08 for the Clk.  If I have to do this, I would rather have direct access to each port and just write the values directly.  I know there is the register level I could use, but doing this at a higher level is better.

 

In our code, when an internal function is called to write data, we would like to just write a value to the virtual channel and not have to figure out the bit alignment to shift the value over for one of the current functions.
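For reference, one workaround that seems possible with the current API (an untested sketch, reusing the channel and device names from the example above): create both lines as one channel and write with DAQmxWriteDigitalLines(), which takes one byte per line per sample rather than a port-aligned bit mask.

/* Untested sketch: per-line writes without manual bit alignment. */
#include <NIDAQmx.h>

void clock_one_bit(void)
{
    TaskHandle task = 0;
    int32      written = 0;
    /* index 0 = dataEnable (line2), index 1 = dataClk (line3) */
    uInt8 en_hi[2]     = { 1, 0 };
    uInt8 en_hi_clk[2] = { 1, 1 };
    uInt8 all_low[2]   = { 0, 0 };

    DAQmxCreateTask("", &task);
    DAQmxCreateDOChan(task, "dev1/port0/line2:3", "dataEnable_Clk",
                      DAQmx_Val_ChanForAllLines);

    DAQmxWriteDigitalLines(task, 1, 1, 10.0, DAQmx_Val_GroupByChannel,
                           en_hi, &written, NULL);       /* set enable line high   */
    DAQmxWriteDigitalLines(task, 1, 1, 10.0, DAQmx_Val_GroupByChannel,
                           en_hi_clk, &written, NULL);   /* keep enable, raise clk */
    DAQmxWriteDigitalLines(task, 1, 1, 10.0, DAQmx_Val_GroupByChannel,
                           en_hi, &written, NULL);       /* keep enable, lower clk */
    DAQmxWriteDigitalLines(task, 1, 1, 10.0, DAQmx_Val_GroupByChannel,
                           all_low, &written, NULL);     /* lower enable line      */

    DAQmxClearTask(task);
}

It still does not give the plain integer write described above, but it removes the 0x04/0x08 bookkeeping.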

 

Let me know your thoughts.

 

Bob Delsescaux

Hi all,

I need to acquire signals from an incremental encoder, and I have an NI USB-6259 board to do this.

It seems that this hardware has special inputs for position acquisition from incremental encoders.

 

Looking at the USB-6259 datasheet, it seems I could make the acquisition using the counter/timer inputs.

If that is right, I should send the TTL square waves generated by the encoder to these inputs, but my encoder has sinusoidal outputs with a certain phase shift between the two signals.

If I need to send TTL signals to my USB-6259, I will have to convert the sinusoidal signals with additional hardware.

 

Is all of this right? Has anyone acquired encoder signals before? Suggestions?
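Yes, that is right: the counter inputs expect TTL, so the sinusoidal A/B outputs need external conditioning (comparators or an interpolator box) first. Once the signals are TTL, an angular encoder channel on one of the 6259 counters reads position directly. A rough, untested sketch with the NI-DAQmx C API; "Dev1/ctr0" and PULSES_PER_REV are assumptions for your setup:

/* Untested sketch: read shaft angle from a quadrature encoder on a counter input. */
#include <stdio.h>
#include <NIDAQmx.h>

#define PULSES_PER_REV 1024   /* assumed encoder resolution */

int main(void)
{
    TaskHandle task = 0;
    float64    angle_deg = 0.0;

    DAQmxCreateTask("", &task);
    DAQmxCreateCIAngEncoderChan(task, "Dev1/ctr0", "", DAQmx_Val_X4,
                                0, 0.0, DAQmx_Val_AHighBHigh,
                                DAQmx_Val_Degrees, PULSES_PER_REV, 0.0, NULL);
    DAQmxStartTask(task);

    DAQmxReadCounterScalarF64(task, 10.0, &angle_deg, NULL);   /* current angle, on demand */
    printf("angle = %.2f deg\n", angle_deg);

    DAQmxClearTask(task);
    return 0;
}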

Thanks in advance

 

Vito

 

Texas Instruments makes a superb line of 16-bit sigma-delta ADCs with sampling rates up to 10 MHz: the ADS1610, ADS1602, etc.

See  http://www.ti.com/lit/ds/symlink/ads1610.pdf

They sell for about $25 each in modest quantities.

 

Using these converters would provide much better fidelity than any currently available products, as there is no need for external analog anti-aliasing filters.

 

I'll place my order now. I personally need 1, 2, 4, or 8 AIs and 1 or 2 AOs with the same sampling frequencies from the same clock. I don't need all those digital I/Os and quadrature decoders.

 

Just give me analog I/O with sigma-delta converters.  When can I place an order?

 

 

 

We use our acquisition software with a variety of hardware configurations. We validate our configurations using simulated hardware, but every time we need to check out a different configuration, we have to delete and recreate the simulated devices. It would be nice to have a better method for switching between different simulated configurations.

To monitor in real time and automatically debug an existing GPIB device loop from a software application, NI needs to provide a solution that has no impact on the original device loop when a GPIB+ card is plugged in.

 

It would be nice to be able to start the Analog Waveform Editor with a specified file already loaded.

Hi everyone, I have a new project measuring pressure and temperature. The problem is that I must put everything in a box rated IP65 (high protection), and the data will be logged at about a 5-minute interval; after each week I will collect the data and load it onto a computer. Can I use an NI product to do this? Are there any NI products with enough internal memory for storage?

I've been told that in order to change module settings, you have to connect a development machine to the network that has the cRIO hardware. Most systems I build have a deployed application and are located somewhere else (i.e. not connected to my machine). It would be nice if the cRIO module settings could be ported to the cRIO unit without having to connect a development machine and hit "Connect" in the project.

 

For example, I had a cRIO unit in my office and set it up for a project. They installed it, wired it, and tested it. A few months later we needed to add an NI 9213 (16-channel thermocouple module). It defaults to type J and degrees Celsius. In order to switch it to type K, I had to bring my desktop out to the unit, connect to the network, and redeploy the cRIO. I called support twice and was told this is the only way to deploy the module settings. If I'm wrong, someone please correct me.

 

 

I'm thinking it could be another type of application builder, or something you could add to an existing application.

 

 

It would be better to have auto-zero support for all DAQmx devices, or the ability to supply an offset value to the read or channel-configuration functions.
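In the meantime, the closest software workaround I know of is a custom linear scale applied at channel creation, which subtracts a measured offset (untested sketch; the scale name, "Dev1/ai0", and the range are assumptions):

/* Untested sketch: apply a software zero offset through a custom linear scale. */
#include <NIDAQmx.h>

void create_offset_corrected_channel(TaskHandle task, float64 measured_offset_volts)
{
    /* scaled value = 1.0 * raw value - measured offset */
    DAQmxCreateLinScale("ZeroOffsetScale", 1.0, -measured_offset_volts,
                        DAQmx_Val_Volts, "Volts");
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_FromCustomScale,
                             "ZeroOffsetScale");
}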

For test systems that use several DAQ devices and need to associate them with specific hardware tasks in a LabVIEW program, it is desirable to allow the user to specify which DAQ device is to be used with a certain function in the program.  For example, Dev1 may perform sensor measurements while Dev2 performs control tasks.  There can also be situations where the same type of task is performed more than once in the program by more than one DAQ device, again requiring that a specific DAQ device be associated with a specific task.  The embedded serial number can be read from the DAQ hardware and displayed to the user so they can choose and match it with the appropriate task.  It would be easier and less error prone if there were a way to program (via MAX?) a custom serial number/name into the DAQ hardware (in addition to the NI-supplied one) that could then be read and shown to the user.
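For reference, the part that already works today, reading the factory serial number of each device so the operator can match it to a task, looks roughly like this with the NI-DAQmx C API (untested sketch; the device list is an assumption):

/* Untested sketch: list the factory serial numbers of the installed devices. */
#include <stdio.h>
#include <NIDAQmx.h>

int main(void)
{
    const char *devices[] = { "Dev1", "Dev2" };

    for (int i = 0; i < 2; i++) {
        uInt32 serial = 0;
        DAQmxGetDevSerialNum(devices[i], &serial);
        printf("%s  serial 0x%08lX\n", devices[i], (unsigned long)serial);
    }
    return 0;
}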

 

Steve

 

I need to capture 16 ADC channels simultaneously. The 9239 is great, except that when I use four of them, each one starts with a random delay. Also, after some minutes they start drifting and the phase shift becomes very obvious.

 

My signals have to be isolated from each other and from the USB port.  It would be ideal to have the modules connected through a second port, or through the DB9 that already exists, so that they share the same sync signal and clock. Once one of them starts, it kick-starts the rest.
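For comparison, when the modules sit in one CompactDAQ chassis the usual fix is to put all channels into a single task so they share one sample clock and start trigger (untested sketch; the module names are assumptions). Separate USB carriers cannot be synchronized this way, which is exactly what the shared-sync connection proposed above would address.

/* Untested sketch: one task spanning four modules in the same cDAQ chassis. */
#include <NIDAQmx.h>

void start_synchronized_ai(TaskHandle *task)
{
    DAQmxCreateTask("", task);
    DAQmxCreateAIVoltageChan(*task,
        "cDAQ1Mod1/ai0:3, cDAQ1Mod2/ai0:3, cDAQ1Mod3/ai0:3, cDAQ1Mod4/ai0:3",
        "", DAQmx_Val_Cfg_Default, -10.0, 10.0, DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(*task, "", 50000.0, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, 10000);   /* 50 kS/s, continuous */
    DAQmxStartTask(*task);
}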

 

Thanks,

Omid

The current way to programmatically update the GPIB-ENET/100 is to call firmwareupdate.exe and then use the Windows API to tab through the dialog and enter data.  I suggest updating the code so that firmwareupdate.exe can be called from the command prompt with parameters, allowing the process to be automated in a batch file.

 

Thanks!

 

Shawn S.

I think it would be nice if NI could develop a digitizer able to perform USB 3.0 compliance tests, and potentially support PCIe tests as well, since it's a pretty similar physical layer.  Otherwise I'll just go ahead and buy the LeCroy or Agilent solution instead for $100k+.

Not sure if this is the ideal place for this suggestion (maybe MAX suggestions?), but here goes...

 

When dealing with various remotely deployed cRIO hardware configurations, it may be impossible to keep a sample configuration of every type of system we ever sell.  Unfortunately, if upgrades or revisions are made to the base code in our system, remotely deploying to our customers becomes impossible unless we have their exact configuration on hand for the programmers to compile against.  Remote connection to the hardware for this type of operation is also not typically possible.

 

Being able to simulate or emulate a full cRIO system (controller and hardware modules), and then compile the RT code for deployment on that system as if it were physically connected to our development system, would be ideal.  This would allow images to be created and sent to customers for local deployment at their facility.  It would dramatically decrease the "hardware library" requirements on the development end and reduce "on-site upgrade" service trip costs for customers.  It would also make it easier for OEMs like me to justify the move away from PLC-type hardware and towards cRIO, once some of the potentially nightmarish continued-support and update issues involved with basing systems on the cRIO platform are taken away.

I am trying to use LabVIEW to make a stress-strain curve for an experiment I am running; however, my experience with LabVIEW is very limited. I have a load cell that outputs a voltage through a Transducer Techniques DPM-3, and I have a laser extensometer that also outputs a voltage. My question is about the graph: what is the best way to put the strain input on the x-axis and the load input on the y-axis?
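On the data side, one approach is to read both voltages in a single task and log strain/load pairs, which an XY Graph (strain wired to X, load wired to Y) can then plot. An untested sketch with the NI-DAQmx C API; the channel names and the LOAD_PER_VOLT / STRAIN_PER_VOLT factors are assumptions, to be replaced by the DPM-3 and extensometer calibrations:

/* Untested sketch: log strain/load pairs from two analog inputs. */
#include <stdio.h>
#include <NIDAQmx.h>

#define LOAD_PER_VOLT   100.0   /* assumed: load units per volt from the DPM-3    */
#define STRAIN_PER_VOLT 0.01    /* assumed: strain per volt from the extensometer */

int main(void)
{
    TaskHandle task = 0;
    float64    data[2 * 100];   /* 100 samples x 2 channels, interleaved */
    int32      read = 0;
    FILE      *fp = fopen("stress_strain.txt", "w");

    if (fp == NULL)
        return 1;

    DAQmxCreateTask("", &task);
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0:1", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);   /* ai0 = load, ai1 = strain */
    DAQmxCfgSampClkTiming(task, "", 100.0, DAQmx_Val_Rising,
                          DAQmx_Val_FiniteSamps, 100);
    DAQmxStartTask(task);
    DAQmxReadAnalogF64(task, 100, 10.0, DAQmx_Val_GroupByScanNumber,
                       data, 200, &read, NULL);

    for (int32 i = 0; i < read; i++)
        fprintf(fp, "%f\t%f\n", data[2*i + 1] * STRAIN_PER_VOLT,    /* strain (X) */
                                data[2*i]     * LOAD_PER_VOLT);     /* load   (Y) */

    fclose(fp);
    DAQmxClearTask(task);
    return 0;
}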