As far as I'm aware, there is currently no way to export the quadrature conversion clock from an encoder task. It would be great to be able to export that signal for use elsewhere.
For example, you could use this signal to perform an analog acquisition each time your encoder passed a mark, giving you position vs. signal. This would be useful for something like a roughness tester where you drag a stylus across a sample and get roughness vs. position. Or a pressure sensor reading pressure versus crankshaft angle.
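In the meantime, a common workaround is to wire the encoder pulse train to a PFI line and use it as the AI sample clock, giving one sample per increment of position. A rough sketch in the nidaqmx Python API follows; the device, channel, and helper names are mine, not from the original post, and this requires the NI-DAQmx driver and hardware to actually run:

```python
def sample_angles_deg(n_samples, pulses_per_rev):
    """Angle (in degrees) at which each sample was taken,
    assuming one AI sample per encoder pulse."""
    step = 360.0 / pulses_per_rev
    return [i * step for i in range(n_samples)]

def configure_position_sampled_ai(dev="Dev1", pfi="/Dev1/PFI0",
                                  max_rate=10000.0):
    """Acquire one AI sample per encoder pulse by using the pulse train
    on a PFI line as an external sample clock."""
    # nidaqmx is imported here so the pure helper above can run
    # without the NI-DAQmx driver installed.
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    task = nidaqmx.Task()
    task.ai_channels.add_ai_voltage_chan(f"{dev}/ai0")
    # 'rate' is just the fastest expected pulse rate;
    # the actual clock is the signal on the PFI line.
    task.timing.cfg_samp_clk_timing(rate=max_rate, source=pfi,
                                    sample_mode=AcquisitionType.CONTINUOUS)
    return task
```

With a 360-pulse/rev encoder, `sample_angles_deg(len(data), 360)` gives the position axis for a roughness-vs-position or pressure-vs-crank-angle plot.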
Why is there no way to determine Trigger Status in NI-Scope?
No, "Acquisition Status" is NOT the same thing, as Acquisition Status only tells you whether the acquisition is complete. I suppose you could say that if it's not complete then it is still waiting for a trigger, but you could be wrong, especially in the case of a slow acquisition (seconds/div).
Any modern DSO I have used has had a driver VI to determine Trigger Status, returning one of several states:
Ready, Armed, or Waiting (pre-trigger acquisition complete, ready and waiting for trigger)
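Something like the following hypothetical shape is what I'm asking for; the state names are borrowed from bench scopes, and none of this is a real NI-Scope API:

```python
import time
from enum import Enum

class TriggerStatus(Enum):
    """Hypothetical trigger states, mirroring what bench DSOs report."""
    READY = 0      # pre-trigger acquisition not yet complete
    ARMED = 1      # pre-trigger samples acquired, waiting for trigger
    TRIGGERED = 2  # trigger received, acquiring post-trigger samples
    DONE = 3       # acquisition complete

def wait_until_armed(get_status, timeout_s=5.0, poll_s=0.05):
    """Poll a status source until the scope is armed, or time out.
    `get_status` stands in for the missing driver call."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if get_status() == TriggerStatus.ARMED:
            return True
        time.sleep(poll_s)
    return False
```

With a real status query you could, for example, refuse to fire the stimulus until the digitizer reports ARMED, instead of guessing from Acquisition Status.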
Note: this might only be relevant & useful for multiplexing AI devices.
It was an important technique used to investigate channel-to-channel ghosting influence on a multiplexing device. For example, suppose you had a large signal A and a small signal B. You could set up a channel list to read "A,B,B,B,B", and thereby *investigate* the trend of these successive readings of the same channel and use *evidence* to determine when the ghosting influence had dissipated. THIS WAS IMPORTANT!
I recall hearing of others using this capability to get faster sampling on one of their fast-changing channels, with a channel list along the lines of "A,B, C,B, D,B, E,B".
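Both scan-list patterns are trivial to build programmatically; these helper names and the single-letter channel names are mine, for illustration:

```python
def ghosting_list(victim, probe, repeats=4):
    """Scan list that reads `probe` several times in a row after the
    large `victim` signal, so the settling trend of the repeated
    readings can be observed directly (e.g. "A,B,B,B,B")."""
    return ",".join([victim] + [probe] * repeats)

def oversample_list(fast, others):
    """Interleave one fast-changing channel between every slow channel,
    so the fast channel is sampled at a multiple of the others' rate."""
    out = []
    for ch in others:
        out += [ch, fast]
    return ",".join(out)
```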
This feature was supported on multiplexing MIO devices as far back as I can remember working with NI DAQ products. I only recently discovered that support was inexplicably removed from DAQmx sometime in the last few years; it now produces a fatal task error. I tested a few available systems and found that it was still supported under DAQmx 16.0, not supported by DAQmx 18.1, and not yet restored as of DAQmx 20.1. (All verification was done with desktop or PXI X Series devices.)
Why was this ability taken away? Stop being so overzealous in trying to protect us from ourselves! Taking multiple readings of the same channel is a legitimate and sometimes nearly necessary technique. Maybe you lacked the imagination to understand why we'd want this, but guess what? We knew what we were doing. So stop stopping us.
Attempting to add a channel (Ch8) to a task that already uses that channel (because its paired differential channel, Ch0, was assigned earlier) correctly does not add the channel to the task, but it also generates no warning or error. Thus, when attempting to configure a task with two channels, Ch0 differential and Ch8 RSE, the resulting task contains only one channel. This may not be apparent, because channels 0 and 8 appear to be different from each other. A warning would give notification of this fact.
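A warning check along these lines is what I have in mind. The helper names are invented; the pairing rule (ai&lt;n&gt; with ai&lt;n+8&gt; on a 16-channel device) is the usual MIO convention, so verify it for your particular device:

```python
def differential_partner(ch_index, n_single_ended=16):
    """On typical MIO devices, differential channel ai<n> consumes
    ai<n> and ai<n + half> (e.g. ai0 pairs with ai8 on a 16-channel
    device). Returns the partner channel index."""
    half = n_single_ended // 2
    return ch_index + half if ch_index < half else ch_index - half

def check_channel_conflicts(diff_channels, se_channels, n_single_ended=16):
    """Return a warning string for each requested single-ended channel
    that is already consumed as the negative input of a differential
    pair (or requested directly as differential)."""
    used = {differential_partner(c, n_single_ended) for c in diff_channels}
    return [f"ai{c} is already in use by a differential pair"
            for c in se_channels if c in used or c in diff_channels]
```

Running the poster's example (Ch0 differential plus Ch8 RSE) through such a check would flag the conflict instead of silently dropping a channel.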
I'd like to point out an opportunity to make FlexLogger easier to use around shunt calibration. Given the various internal shunt calibration resistors across different hardware, could the software know the appropriate location and resistance value when performing a shunt calibration? Having to manually enter the correct shunt resistor value, for example when switching between our 9236 cards and the strain-based FieldDAQ units, is tedious and a possible source of error.
The use of NI DAQmx Global Virtual Channels is the best way that I have found to configure and manage parameter scaling to provide measurement data directly in engineering units. The software architecture of our data acquisition applications is centered around the use of Global Channels (aka Virtual Channels). Configuring Global Channels using NI MAX (Measurement & Automation Explorer) is convenient for a small number of measurement channels. However, in system configurations with hundreds of channels, to create and manage them effectively, you need to build your own application for this task. We have built custom Global Channel creation VIs in LabVIEW for each type of analog input or sensor type that we use. These VIs are designed to read a configuration table in CSV format, and then loop through creating a Global Channel for each table row of information. This process allows for more flexibility in naming the channels as well as setting different scaling and storage of other channel-specific metadata.
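The table-driven loop described above might look like this in the nidaqmx Python API. This is only a sketch: the CSV column names are invented, and I believe nidaqmx's `Channel.save()` maps to `DAQmxSaveGlobalChan`, but verify that against your installed driver version:

```python
import csv

def load_channel_table(path):
    """Read rows of (name, physical_channel, min, max) from a CSV file
    with those column headers (headers are this sketch's convention)."""
    with open(path, newline="") as f:
        return [(r["name"], r["physical_channel"],
                 float(r["min"]), float(r["max"]))
                for r in csv.DictReader(f)]

def create_global_channels(rows):
    """Create and save one DAQmx global channel per table row."""
    import nidaqmx  # requires the NI-DAQmx driver to be installed
    for name, phys, lo, hi in rows:
        with nidaqmx.Task() as task:
            ch = task.ai_channels.add_ai_voltage_chan(
                phys, name_to_assign_to_channel=name,
                min_val=lo, max_val=hi)
            # Persist as a global (virtual) channel, as NI MAX would.
            ch.save(save_as=name, overwrite_existing_channel=True)
```

A real version would dispatch on a sensor-type column to the appropriate `add_ai_*_chan` call and attach custom scales, one creation VI (or function) per I/O type, as described above.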
We employ multiple networked data acquisition systems with Linux-based NI controllers, and we manage those DAQmx Global Channels across our network. Currently, to create DAQmx Global Channels on these systems, we must directly connect to the NI controller in the LabVIEW Project View and run our suite of Global Channel creation apps on that particular controller using their accompanying, locally stored CSV configuration files. Our networked distributed data acquisition system has now grown to 10 controllers, each with its own set of common I/O. The creation of those Global Channels has become cumbersome, because we have five I/O types to manage and hence five custom LabVIEW applications, each with a specific configuration file, to run on 10 controllers individually.
A potential simplification would be to run our custom LabVIEW applications on a Windows host PC and create the DAQmx Global Channels remotely. That would eliminate the need to directly connect to each controller from the LabVIEW Project View, copy a common set of config files to each controller, and run the same applications individually on each controller. However, the current version of DAQmx makes no provision for remotely creating DAQmx Global Channels. Note that NI MAX is able to create DAQmx Global Channels remotely, but this functionality is not exposed for LabVIEW programmers to use. My idea/suggestion is that NI make this functionality available in LabVIEW.
It would be great if there were a C Series CAN interface module that doesn't need an external power supply. That would make it easier to use and would save setup time, because we wouldn't have to find or prepare an additional power source.
While using an NI USB-6008/6009, I am encountering a strange error whose source I have failed to find. As you can see, there are two types of outputs. When I comment out one of the outputs, the other works fine. However, when I use them both, it gives me the error shown in the picture below. What could be the problem?
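Without the error picture this is only a guess, but one common cause of "one output works, both together fail" on these devices is creating two separate tasks that both reserve the device (typically error -50103, "The specified resource is reserved"). Putting both AO lines in a single task usually avoids that; a Python/nidaqmx sketch, with the device name assumed:

```python
def clamp_to_ao_range(v, lo=0.0, hi=5.0):
    """The USB-6008/6009 analog outputs only span 0-5 V; writing a
    value outside that range raises a DAQmx error, so clamp (or
    validate) first."""
    return min(max(v, lo), hi)

def write_both_outputs(v0, v1, dev="Dev1"):
    """Drive ao0 and ao1 from ONE task; two simultaneous AO tasks on
    the same device can fail with a resource-reservation error."""
    import nidaqmx  # requires the NI-DAQmx driver
    with nidaqmx.Task() as task:
        task.ao_channels.add_ao_voltage_chan(f"{dev}/ao0")
        task.ao_channels.add_ao_voltage_chan(f"{dev}/ao1")
        task.write([clamp_to_ao_range(v0), clamp_to_ao_range(v1)])
```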
There's no option to cancel changes made to a channel in FlexLogger. If I edit or change a series of parameters, but want to revert to the previous settings, there is no option to do so. I then need to re-enter what was there (and hope I don't forget what it was), or mash Ctrl+Z and hope I undo the correct number of steps.
(Also pressing Esc on this window should perform a cancel operation and close the window).
So I have a cRIO with a 9203 mA input module, and sensors etc. that are 4-20 mA. When it came to using the scaling feature of the shared variable, as below, see if you can spot the bug.
So I was thinking in mA all the way from the sensor to the 9203's mA input, but the RAW scale is in amps! (Which is obvious once your colleague points it out!) Consequently I wasn't seeing any signal readings from the cRIO, since 20 mA << 4 A.
Since there is an error if you get Full and Zero swapped, could there also be an error or warning if the scale is outside the range of the module?
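In plain code terms, the fix (and the requested out-of-range warning) amounts to this; the engineering-unit span of 0-100 is just an example value:

```python
ZERO_SCALE_A = 0.004   # 4 mA expressed in AMPS, the 9203's raw unit
FULL_SCALE_A = 0.020   # 20 mA expressed in AMPS

def scale_reading(raw_amps, eu_zero=0.0, eu_full=100.0):
    """Map a raw 9203 reading in AMPS (not mA!) onto engineering
    units, warning when the value falls outside the 4-20 mA range
    (the warning this post asks the scaling feature to provide)."""
    if not (ZERO_SCALE_A <= raw_amps <= FULL_SCALE_A):
        print(f"warning: {raw_amps * 1000:.2f} mA is outside 4-20 mA")
    frac = (raw_amps - ZERO_SCALE_A) / (FULL_SCALE_A - ZERO_SCALE_A)
    return eu_zero + frac * (eu_full - eu_zero)
```

Entering 4 and 20 (milliamps) as the raw Zero and Full scale, as I originally did, makes every real reading sit at well under 0.5% of "full scale", which is why nothing appeared to move.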
DAQmx allows you to register dynamic events, but what about NI-Scope, NI-FGEN, and the rest? If you have an event that can be routed to something in hardware, it should also be possible to register it as a dynamic event in LabVIEW.
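For reference, the DAQmx-side capability that the other drivers should match looks like this in the nidaqmx Python API (task setup omitted; the callback factory is my own helper, but the registration call and the four-argument callback signature are the real nidaqmx interface):

```python
def make_counting_callback(counter):
    """Build a callback with the signature nidaqmx expects for
    every-N-samples events; it records the sample count each time it
    fires and returns 0 to keep the event registered."""
    def callback(task_handle, event_type, n_samples, cb_data):
        counter.append(n_samples)
        return 0
    return callback

def register_daqmx_event(task, n=100, counter=None):
    """Register a software event fired every `n` acquired samples on a
    configured nidaqmx Task; this is the DAQmx facility the post asks
    NI-Scope and NI-FGEN to mirror."""
    cb = make_counting_callback(counter if counter is not None else [])
    task.register_every_n_samples_acquired_into_buffer_event(n, cb)
```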