In LabVIEW we can dynamically run a VI in a few ways:
a) With the Run method, if it's not a running Top Level VI or if the VI is re-entrant.
b) With Call By Reference, even when it's already running as a sub VI.
c) Make a new VI and drop the (running) sub VI on the diagram.
The downside of a) is that we can't always make sub VIs re-entrant but still want to call them by reference. The downside of b) is that we need to know the strict type (connector pane). The downside of c) is that we might end up with a lot of VIs whose only purpose is to act as Top Level VI for the sub VIs, and it doesn't work in executables.
I'd like to propose a method so we can dynamically call a sub VI without knowing its strict type.
Using it, we enable LabVIEW to dynamically run sub VIs while setting/getting their parameters by name.
For sub VIs (already running), this method will act as the Top Level VI. For Top Level VIs it will fail unless the VI is idle.
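To make the proposal concrete, here is a rough text-language analogy of "set/get parameters by name, then run". The class and its methods are hypothetical illustrations of the idea, not actual VI Server calls:

```python
# Hypothetical analogy only: a "VI reference" whose controls are addressed
# by name, so no strict connector-pane type is needed to call it.

class DynamicVI:
    """Stand-in for a VI reference opened without a strict type."""
    def __init__(self, controls):
        self.controls = dict(controls)   # control name -> value

    def set_control(self, name, value):  # analogous to setting a control by name
        self.controls[name] = value

    def run(self):                       # analogous to the Run method
        # toy "diagram": add two inputs, write an indicator
        self.controls["Sum"] = self.controls["A"] + self.controls["B"]

    def get_control(self, name):         # analogous to reading an indicator by name
        return self.controls[name]

vi = DynamicVI({"A": 0, "B": 0, "Sum": 0})
vi.set_control("A", 2)
vi.set_control("B", 3)
vi.run()
print(vi.get_control("Sum"))  # 5
```

The point of the analogy: the caller only needs control names and values, never the strict connector-pane type.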
The "Read Key (Double).vi" does not return proper values when an SI-notation number is present in the INI file. For example, if the number is 1m it returns 1, while the actual value should be 0.001. It would be helpful if the format specifier were provided as another control.
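For reference, the SI-suffix interpretation being asked for could look roughly like this (a hypothetical sketch of the desired behavior, not the actual Read Key (Double).vi implementation):

```python
# Hypothetical sketch: interpret SI-suffixed numbers like "1m" -> 0.001
# when reading a key from an INI file.

SI_PREFIXES = {
    "p": 1e-12, "n": 1e-9, "u": 1e-6, "m": 1e-3,
    "k": 1e3, "M": 1e6, "G": 1e9,
}

def parse_si_double(text: str) -> float:
    """Parse a number with an optional SI suffix, e.g. '1m' -> 0.001."""
    text = text.strip()
    if text and text[-1] in SI_PREFIXES:
        return float(text[:-1]) * SI_PREFIXES[text[-1]]
    return float(text)

print(parse_si_double("1m"))    # 0.001
print(parse_si_double("2.5k"))  # 2500.0
```

A format-specifier control on the VI could select between this interpretation and plain `%f` parsing.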
The request is mostly in the title. Right now it seems as though changing an alarm limit requires stopping the OPC UA server. It seems reasonable to expect that users may wish to adjust alarm limits on the fly -- for my use case, tuning the system during a long commissioning process. It might be many months before a system is fully ready to go, and during that time alarm limits change regularly and we still want to report alarms to operators during this period.
Here's a dumb mistake I think many of us can relate to:
It would be really nice if the VI were just broken in this situation. But I can understand why it's not necessarily simple to mark node *outputs* as required.
But maybe there could be a way for the editor to *hint* that there is a problem here. Maybe the bundle nodes could change color when the output terminal is wired, so you could get a little more obvious feedback if you screwed up like this. The same could go for any other primitives that have a "for type" input (e.g. unflatten from string, variant to data, to more specific class, etc).
Granted, VI Analyzer could report bugs like this, but having a little more immediate feedback would probably be a big win.
(Let me know if this should be cross-posted to the NXG idea exchange, too).
Take for example an enum that is saved as a type def. The enum has many items (let's say words) of varying length.
In order to see all of the elements, inclusive of the widest one, the array can be sized with the right-click option "Size to Widest Element".
If the type def'd enum is edited, and a longer element is added, the array of enum constants will not size to the widest element. This can be frustrating, as dozens (or more) of these arrays scattered about the program are rendered unreadable.
If the user has previously chosen to "Size to Widest Element", this setting should persist. If the user edits the enum, all of the array constants should size to the widest element.
(Example images not shown.)
When a one-dimensional array of any type is showing only a single element, LabVIEW forces a horizontal scrollbar. I couldn't find any documentation or reasoning behind this. It's really annoying and hurts UI design: vertical is the normal scrolling direction for just about everything else, and LabVIEW breaks that convention for some seemingly arbitrary reason.
The Excel-Specific, Excel General sub-Palette of the Report Generation Toolkit contains a useful function, "Excel Get Last Row". This allows a user to add new rows of data to an existing Worksheet by returning a cluster of MS Office Parameters that can be used with the other Excel Report functions (such as Excel Easy Table) to place the new (row-organized) data below existing rows, such as Column Headers, that might be already present in the Template.
I propose that NI add a similar "Excel Get Last Column" that does the same thing, but returns the last column on the WorkSheet. This would be useful when entering data one column at a time, not uncommon when entering multiple channels (columns) of sampled data, where you want the new data to be just to the right of the existing (columnar) data.
I could easily write such a function myself, but so could NI, and if NI did it, everyone who uses the Report Generation Toolkit would have access to such functionality.
This idea is to improve the Quick Drop search window so it returns functions that might be better suited, based on the datatype of the selected function or wire. Let's say I have a 1D array of numerics selected and I want to reverse it. I will select the wire, then invoke QD and type "reverse". But the first item in the list is actually "Reverse String". With a context-aware QD, the search window would see that I have an array of numerics selected and prioritize the Reverse 1D Array function, still including Reverse String but pushing it farther down the list.
This idea can be applied to the basic data types pretty easily (numerics, booleans, arrays, strings, clusters). But we could also use this on selected class wires. A class library can have an associated .mnu file, which is why you can right-click some class wires and get the corresponding subpalette menu. So this idea could also prioritize functions found in the .mnu associated with the class.
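As a sketch of the kind of re-ranking being proposed (the scoring scheme and names here are hypothetical, not Quick Drop internals):

```python
# Hypothetical re-ranking sketch: promote palette entries whose accepted
# datatype matches the selected wire's type, without dropping other matches.

def rank_results(results, selected_type):
    """results: list of (function_name, accepted_type) search hits.
    Returns function names with type-compatible entries moved to the top."""
    return [name for name, accepted in sorted(
        results,
        key=lambda r: r[1] != selected_type  # False (a match) sorts first
    )]

hits = [
    ("Reverse String", "string"),
    ("Reverse 1D Array", "1D array"),
]
print(rank_results(hits, "1D array"))
# ['Reverse 1D Array', 'Reverse String']
```

Because the sort is stable, entries that don't match keep their original order farther down the list, exactly as described above.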
Please let me permanently opt out of this new feature, introduced in LabVIEW 2017, in the options dialog.
Having used LabVIEW for a very long time (since LabVIEW 2.0), I have never wished for such a feature (it got only 27 kudos) - and I even use its "anti-feature", as implemented up to now, constructively to detach objects (pull a control into a structure, connect it to the new target, and press "Ctrl-B").
This new feature, forced onto everybody, would be less annoying if pressing "W" reliably disabled it. However, at least in virtual Windows machines (Parallels) on a Mac, that does not work 50% of the time.
What I propose is functionality built into the XNet API similar to the DAQmx API: a separate subpalette where functions like Start Logging and Stop Logging can be invoked to log the raw data from an XNet interface into a TDMS file. Maybe even some other functions, like forcing a new file, or starting a new file once the current file reaches a certain age or size. On buses that are heavily loaded, reading every frame and then logging it can use a non-trivial amount of resources, and having this built into the API would likely be more efficient.
XNet already has a standard for how to read and write raw frames into a TDMS file that is compatible with DIAdem, and has several shipping examples with LabVIEW to log into this format.
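As a rough illustration of the proposed behavior (all names here are hypothetical; none of this exists in the NI-XNET API today), the size-based file rotation might work like this:

```python
# Hypothetical sketch of the proposed logging behavior: buffer raw frames
# and "rotate" to a new file when a size limit is reached.

class XnetRawLogger:
    """Toy model of the proposed Start/Stop logging with file rotation."""
    def __init__(self, path, max_bytes):
        self.path, self.max_bytes = path, max_bytes
        self.files, self.current = [], []   # completed files, current file
        self.size = 0

    def start_new_file(self):
        """Analogous to the proposed 'force a new file' function."""
        if self.current:
            self.files.append(self.current)
        self.current, self.size = [], 0

    def log_frame(self, frame: bytes):
        """Rotate automatically once the current file would exceed the limit."""
        if self.size + len(frame) > self.max_bytes:
            self.start_new_file()
        self.current.append(frame)
        self.size += len(frame)

log = XnetRawLogger("bus0.tdms", max_bytes=8)
for f in [b"AAAA", b"BBBB", b"CCCC"]:
    log.log_frame(f)
print(len(log.files), len(log.current))  # 1 1
```

Doing this inside the driver, rather than in a user loop that reads every frame, is where the resource savings described above would come from.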
I imported and pasted a graphic (my company logo) onto the front panel of my VI, but apparently there's no simple way to resize it while constraining the aspect ratio (e.g. resizing it proportionally).
I'm not asking to do it programmatically; I'm asking simply about resizing it to fit the rest of the front panel's design.
If I grab the handles and drag, it allows arbitrary resizing on both axes. If I hold down Shift it constrains it to resizing the vertical axis only. I've tried every combination of modifier keys, on both OS X and on Windows, and I can't seem to constrain the aspect ratio.
Most other applications that allow image resizing offer one or more of the following:
Holding down a modifier key (typically Shift) while dragging a corner resize handle forces proportional resizing. The current behavior in LabVIEW makes no sense: holding down Shift forces vertical resizing only, which you could already do by dragging the top/bottom handle (i.e. not a corner handle).
In the Set Width/Height dialog, add a checkbox to Preserve Aspect Ratio. If checked, entering a height or width causes the other dimension to be calculated automatically when the field loses focus (OnBlur).
In the Set Width/Height dialog, support entry by % or by pixels, with the same option to constrain.
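For what it's worth, the math behind a Preserve Aspect Ratio checkbox is trivial; a sketch (hypothetical helper, not any LabVIEW API):

```python
# Hypothetical sketch: given one new dimension, compute the other so the
# image keeps its original aspect ratio (rounded to whole pixels).

def constrain_height(orig_w, orig_h, new_w):
    """Height that preserves orig_w:orig_h when the width becomes new_w."""
    return round(new_w * orig_h / orig_w)

# e.g. a 640x480 logo resized to 320 px wide stays 4:3
print(constrain_height(640, 480, 320))  # 240
```

The same one-liner (with width and height swapped) covers the percent-entry case, since a percentage just scales both original dimensions by the same factor.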
This will help all of us produce more professional-looking applications.
Now that the SSP package is delivered on USB instead of DVDs (good stuff!), I have a minor request: could the USB stick's label include the release/version name?
It might add too much of a cost depending on how you get them customized, but if that is not an issue it would be very practical to be able to see what the USB contains by its label (as we could with the DVDs).
On a side note: many companies have strict regulations on the use of USB sticks, and the need for such regulations has increased with weaknesses like BadUSB. Perhaps NI could state something about how the USB sticks they send out are protected, either in the delivery package or just as a statement on ni.com? That way, people who need to convince their IT departments to allow the NI USB sticks will have something to show (I'm sure you will have to add some legal disclaimers there as well, but that's OK).
After reading Restore High Contrast Icons I procrastinated as long as possible before installing LV2016. When I finally did, I was disappointed by the additional space required for the palettes; all of them! I have been using LabVIEW since 5.0 and switched to the Icon view of the palettes shortly after getting comfortable with the graphics. Now I have to move my mouse further to get to each sub-menu and VI selection. It's a waste of developers' time, and apparently done for no good reason except to make a change; very similar to the washed-out icons.
This extra space needs to be removed or at least an option provided to set the spacing back to the condensed spacing always available.
These images show the relative size of the palettes in LV2016 vs. 2015.
Yes, this might seem trivial, until you think about traversing several palettes to get to your needed VI.
*Random example, if one were doing FTP development they'd pin the menu.
** The original size of the above graphic is 1030 pixels wide; less than 800 for 2015.
Quit messing with what works and has become the standard with regard to options. At least when that ridiculous "default" setting for icons instead of terminals was introduced, we could undo the setting in Options.
It seems that NI has hired some non-G experts to mess up the interface simply so they can enumerate all the "great" improvements they've made. Or, was all the extra space to make sure newbies couldn't miss the folder tab, since connecting the "right arrow" on an icon to it being a sub-folder would be too difficult for children?
Many controls allow you to make scrollbars visible. When a user clicks anywhere within the control, including on the scrollbar, this counts as a Mouse Down. It would be nice if the Mouse Down event indicated whether the click was on the scrollbar or on the actual clickable area of the control, so you could take different actions based on which it was. Of course, you can usually do this manually by checking the boundaries of the control against the coordinates of the click, but it seems like a common enough need that a built-in check would be easier.
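The manual workaround mentioned above boils down to a bounds test; a rough sketch (the coordinates, geometry, and scrollbar width are all assumptions for illustration):

```python
# Hypothetical sketch of the manual workaround: decide whether a Mouse Down
# landed on a vertical scrollbar by comparing click coords to control bounds.

SCROLLBAR_WIDTH = 16  # px, assumed width of the vertical scrollbar

def click_on_scrollbar(click_x, click_y, ctrl_left, ctrl_top, ctrl_w, ctrl_h):
    """True if the click falls inside the right-edge vertical scrollbar."""
    in_control = (ctrl_left <= click_x < ctrl_left + ctrl_w and
                  ctrl_top <= click_y < ctrl_top + ctrl_h)
    return in_control and click_x >= ctrl_left + ctrl_w - SCROLLBAR_WIDTH

# control at (100, 100), 200x150 px; scrollbar occupies x in [284, 300)
print(click_on_scrollbar(290, 120, 100, 100, 200, 150))  # True
print(click_on_scrollbar(150, 120, 100, 100, 200, 150))  # False
```

The idea in the request is to have the event structure hand you this boolean directly, instead of every application hard-coding the scrollbar geometry.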