LabVIEW Idea Exchange

Dominik-E2

Make numeric display format auto-set to what you input

Status: New

Instead of having to open the "Display Format" window every time you want to drop a control/indicator/constant with a special format value (e.g. 2400000000 as 2.4G), have the control/indicator/constant automatically adopt whatever format you type in. For example:

 

[Attached screenshot: idea.PNG]

 

Typing in 2.4G sets the Display Format to "SI notation".

Typing in 2.4E9 sets the Display Format to "Scientific".

Typing in 2400000000 sets the Display Format to "Automatic formatting", etc.
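
(Illustrative only: LabVIEW itself is graphical, so here is a rough Python sketch of the detection the idea implies. The format names simply mirror the Display Format dialog, and the regexes are an assumption about what counts as SI vs. scientific input.)

import re

def display_format_for(text):
    """Guess which Display Format matches the notation the user typed."""
    text = text.strip()
    if re.fullmatch(r'[+-]?\d*\.?\d+[yzafpnumkMGTPEZY]', text):
        return 'SI notation'          # e.g. "2.4G"
    if re.fullmatch(r'[+-]?\d*\.?\d+[eE][+-]?\d+', text):
        return 'Scientific'           # e.g. "2.4E9"
    return 'Automatic formatting'     # e.g. "2400000000"

print(display_format_for('2.4G'))        # SI notation
print(display_format_for('2.4E9'))       # Scientific
print(display_format_for('2400000000'))  # Automatic formatting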

 

 

30 Comments
AristosQueue (NI)
NI Employee (retired)

> Why LV (or you or LV R&D ) think he is smarter than the programmer,

 

Ah, there's the difference in viewpoint. I don't think it has anything to do with "being smarter". It has to do with making it easy to put a value into a more universal format. I don't have any problem if you want to change the format explicitly to something that you think works better in this particular program (your 13 MHz vs 130 MHz example). But the default behavior currently makes it easy to type in whatever format I happen to have and to have that value shift over to a common notation.

 

In other words, I fully support your goal of making it easy to say "yes, I want to switch to this format". But I don't want that to come at the cost of "all of us have our own favorite format and the default behavior shifts us to a common format".

Henrik_Volkers
Trusted Enthusiast

>But the default behavior currently makes it easy to type in whatever format I happen to have and to have that value shift over to a common notation.

 

Common? What is common? In a technical sense, the lowest level that everyone can understand without special knowledge? Why be forced into one format if I already entered a valid one?

(OK, I can change it, but that is annoying. I don't think it is so much special knowledge to understand number formats 😉)

Greetings from Germany
Henrik

LV since v3.1

“ground” is a convenient fantasy

'˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'


AristosQueue (NI)
NI Employee (retired)

> Why be forced to one format if I already entered a valid one?

 

Because you have the pleasure of being able to enter a value in any format and still conform to the coding convention. I see it as a feature -- everyone gets to type in however they want, and the code gets reformatted to a standard regardless of which way you work.

 

> I don't think it is so much special knowledge to understand number formats

 

I know. And if I put on my LV R&D hat, I might even agree with you that this info is so common as to be acceptable. Maybe. I honestly don't know how common I -- as a user of LabVIEW -- am.

KeithTK
Member

The arguments here seem to be generally pro format-as-typed in the case of dropping a new constant and against it when changing values in existing code.

 

How about adapting the formatting to the input, but only the first time? When the value of the new constant is defined (when it loses focus), the display format would be set based on how it was typed. Changing the value of a constant any time after that can keep the current format-coercing behavior, with the properties tab available to manually redefine things.

 

I like this idea since setting the format based on the input is far more intuitive than the settings in the Display Format tab. Defining the value and formatting in a single step is just a bonus. I'd even suggest extending the functionality to allow a unit to be defined during that first input as well, but since units aren't properly supported elsewhere, why should they be here?

 

Changing the radix should be just as easy during the initial drop. While typing the input, typing any of the letters { d x o b p } should not insert the character but instead set the radix to the corresponding type and make it visible. Hitting the same letter again would hide the radix again. A brief tip-strip popup ("notation set to: ...") when the key is hit could give additional visual confirmation in case the little radix indicator is too small to read on modern displays. 🙂
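
(A minimal sketch of the keystroke handling described above, in Python and with hypothetical names. It ignores the detail that "b" and "d" are also valid hex digits, and it leaves open what "p" would map to.)

# Radix letters as proposed above; LabVIEW shows d/x/o/b glyphs for these.
RADIX_KEYS = {'d': 'Decimal', 'x': 'Hexadecimal', 'o': 'Octal', 'b': 'Binary'}

class ConstantEntry:
    def __init__(self):
        self.text = ''
        self.radix = 'Decimal'
        self.radix_visible = False

    def on_key(self, key):
        if key in RADIX_KEYS:
            if self.radix == RADIX_KEYS[key] and self.radix_visible:
                self.radix_visible = False           # same letter again hides the radix
            else:
                self.radix = RADIX_KEYS[key]         # switch radix and show the indicator
                self.radix_visible = True
            print('notation set to: ' + self.radix)  # stand-in for the tip-strip
        else:
            self.text += key                         # any other character is just inserted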

 

Intaris
Proven Zealot

Short version,

 

Edit time (constants, and controls/indicators while not running): YES

Everything else: NO

Jeffrey_Zola
Member

I voted for this idea but came back to look at the comments today.

 

I like the concept of the suggestion, and indeed enter numbers using the SI prefix as a shortcut because I have learned what LabVIEW will do with "10u", which is fewer keystrokes than "10e-6" or "0.00001".

 

Because the users of LabVIEW are specialized in what they do, a feature like this would absolutely need to be user-configurable. One user doesn't immediately recognize the difference between "k", "M", "G", or "T", but another lives and breathes those increments of a thousand and has no problem with using SI prefixes as a default.

 

Does LabVIEW treat these as case-sensitive? If I have a numeric value that corresponds to a time, there is a HUGE difference between 1 ms and 1 Ms!!
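
(For what it is worth, SI prefixes are case-sensitive by definition, which is exactly why the distinction matters. A small Python sketch of a case-sensitive prefix lookup; the factors are the standard SI ones, with "u" standing in for micro, and nothing here is LabVIEW-specific.)

SI_FACTORS = {'f': 1e-15, 'p': 1e-12, 'n': 1e-9, 'u': 1e-6, 'm': 1e-3,
              'k': 1e3, 'M': 1e6, 'G': 1e9, 'T': 1e12}

def parse_si(text):
    """Parse a number with an optional trailing SI prefix, case-sensitively."""
    if text and text[-1] in SI_FACTORS:
        return float(text[:-1]) * SI_FACTORS[text[-1]]
    return float(text)

print(parse_si('10u'))  # ~1e-05, the same value as 10e-6
print(parse_si('1m'))   # 0.001      (milliseconds, if the unit is seconds)
print(parse_si('1M'))   # 1000000.0  (megaseconds -- a factor of 1e9 apart)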

 

It was suggested that this be integrated into the default format for numeric displays. That seems to be a great idea to me. Perhaps a default numeric format (or different defaults for different numeric types?) with an option to change the formatting if specific formats were entered or read.

 

I am still confused that LabVIEW accepts the unit "min" as an equivalent of 1/1000 inch. What is the LabVIEW unit for 1/60 hour?!? 😮

 

Jeff

Jeffrey Zola
X.
Trusted Enthusiast

min is not milli-inch but minute.

Gin however is not the beverage, but giga-inch (about 16,000 miles or 25,400 km).

A Tin is worth a lot of Gin, obviously.

My favorite is the zin. It is much smaller than a pin, in fact much smaller than a fin, but a thousand times larger than a yin.

 

BTW, I voted for the idea.

I can't imagine how AQ had that much free time to explain to us that, since he is not familiar with the G prefix, it should be verboten.

No kidding.

Intaris
Proven Zealot

I still think this idea implemented as a RUN-TIME action is going to be a major PITA.  It'll cause all kinds of weird behaviour which will end up alienating more people than it could ever help.

 

I also fear greatly that if this behaviour becomes DEFAULT then the universe may implode and we'll all be cast into a parallel dimension where we must all code in MS-BASIC. Seriously, IF this gets implemented, we need a setting for controls/indicators to set them to "Adaptive" formatting or something. Everything else would be a major mistake.

AristosQueue (NI)
NI Employee (retired)

X: I didn't say "verboten." I said "by default autoconverted to the same standard across all numerics." I went out of my way to not say "verboten."

 

In many ways, I'd be just as happy if, when you typed in "2.4E+9", LabVIEW converted it to "2.4G". The standard across all numerics for code readability matters a lot more to me than the particular standard used. If you want to override a particular numeric to use a different notation, LV provides that ability, but I would discourage it if you were checking in code that others were expected to read. Maybe it comes from working on a team with 50+ developers or trying to understand a random customer project, but I believe that the value of common code readability trumps any particular personal coding convention, and that applies as much intra-organization as inter-organization. We have, in LabVIEW, code we integrated from other teams, code that uses a different coding convention, and it is significantly harder to read when you're crossing back and forth across module boundaries while debugging. Every little issue -- like changing the notation used for numerics -- piles up.

 

So as a software engineer, I believe LabVIEW is already tuned correctly -- you type in a value, it converts to the standard notation used for all numerics. If you wish to trump that notation on a particular constant/control/indicator, you have that ability. Is it an extra step to go do that? Yes, and to me, that extra effort is a right and proper barrier to usage. It is an acknowledgement from the developer that "yes, I really want to break style here, tossing out the benefits of common code formatting, and I'm conscious of my choice to do this."

 

Again, these are personal opinions. As a member of R&D, I'd like to see usability studies done before deciding whether or not this would be a good change for LabVIEW. This is one of those where the people inconvenienced by the feature are going to be vocal, and the people helped by the feature aren't even aware they're being helped and so are unlikely to voice opposition.

X.
Trusted Enthusiast

I don't think the idea is about converting 2.4E9 to 2.4G.

If you enter 2.4E9, use scientific notation (exponents).

If you enter 2.4G (giga), use SI notation (symbols for multiple of 3 exponents).

 

If I input 2.4E9, there is a reason: it is more legible than 2400000000 (with the latter I may even type the wrong number of zeroes).

There are plenty of requests out there about improving on the latter: 2,400,000,000 would be far more legible.

Unfortunately that is not an option in either Controls or Constants, even if you can cook up your own format string for the former.
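
(Purely as an illustration of the digit-grouping idea, in Python's format mini-language rather than a LabVIEW format string:)

x = 2400000000
print(f'{x:,}')    # 2,400,000,000 -- grouped, as requested above
print(f'{x:.1e}')  # 2.4e+09       -- scientific notation, for comparison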