Actually, neither's the case of interest... The issue's not on the Front Panel, but on the block diagram. Put a float-format (single, double, or extended precision) constant on the block diagram and right-click on it; under "Visible Items", one of the things that can be shown is the "Unit Label." The default value is unitless / pure number. The physical constants under Functions Palette -> Numeric -> Additional Numeric Constants have hard-wired SI units (m/s for c, C for e_0, J/(mol K) for R, etc.). LabVIEW will only let you wire these constants (or their products, ratios, etc.) to a sub-VI if the units work out right. It also won't let you do something like wire e_0 and c into the input terminals of the same "Add" node, because the units don't match.
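For anyone who hasn't run into this, the wiring rule amounts to a dimension check at edit time. Here's a rough analogy in plain Python (not anything LabVIEW-specific; the Quantity class and the numeric values are just my illustration):

```python
# Analogy only: the kind of check LabVIEW's unit system performs when you
# try to wire two quantities into the same "Add" node.  Nothing here is
# LabVIEW API; Quantity and its unit strings are made up for illustration.

class Quantity:
    def __init__(self, value: float, unit: str):
        self.value = value
        self.unit = unit

    def __add__(self, other: "Quantity") -> "Quantity":
        if self.unit != other.unit:
            # LabVIEW reports this as a broken wire rather than a run-time error.
            raise TypeError(f"unit mismatch: {self.unit} vs {other.unit}")
        return Quantity(self.value + other.value, self.unit)

c   = Quantity(2.998e8, "m/s")    # speed of light, SI, like the palette constant
e_0 = Quantity(1.602e-19, "C")    # elementary charge, SI, like the palette constant

total = c + e_0   # TypeError: unit mismatch: m/s vs C
```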
This sort of thing makes sense as a general restriction, but I'd like the option to override it. The closest thing I've found is the "Convert Unit" node (Functions Palette -> Numeric -> Conversion -> Convert Unit), which'll let you map anything to dimensionless or vice versa. However, that node has the added "feature" of converting quantities to SI before it strips off their dimensions. (So if you write a VI that deals with lengths in cm and pipe a 1 cm length through the "Convert Unit" node, it'll come out the other end as 0.01.)
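In other words, the node behaves as if it rescaled to SI base units first and only then dropped the dimension, which is not what I want. A quick analogy in plain Python (the function names and the 0.01 factor are just my illustration of the cm-to-m conversion; none of this is LabVIEW code):

```python
# Analogy only: how "Convert Unit" treats a 1 cm length vs. what I'd like.
# CM_TO_M is the ordinary SI scale factor; the function names are made up.

CM_TO_M = 0.01

def strip_units_convert_unit_style(length_cm: float) -> float:
    """Rescale to the SI base unit (meters) first, then drop the unit."""
    return length_cm * CM_TO_M

def strip_units_i_want(length_cm: float) -> float:
    """Just drop the unit and keep the number I was working with."""
    return length_cm

print(strip_units_convert_unit_style(1.0))  # 0.01 -- what comes out of the node
print(strip_units_i_want(1.0))              # 1.0  -- the pure number I wanted
```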
This can be inconvenient if your handbook / memorized table of constants & formulae is in something other than SI... I have several texts that normalize out Planck's constant, the speed of light, the ionization potential of the hydrogen atom, and so on. Rather than converting everything into units that I don't use, then converting back to units I do use at the end of the calculation, I'd like to treat everything as a pure number and use LabVIEW's numeric constants when I need those SI values.