I have built my software with the region setting English (USA).
Now when I change the region setting to German (Germany), the number format changes and causes errors in my software when the Read From Spreadsheet File VI reads the numbers. For example, 8,542.486 in English (USA) becomes 8.542,486 in German (Germany), and the software only reads 8.542 or 8 after I change the region setting. Is there any way to make LabVIEW, or my software, adapt the reading format automatically? That is, can the software read the text file in any region setting, without me having to change the decimal point (.) to a comma (,) in the file?
Thanks so much for your help.
Is there any way to make LabVIEW, or my software, adapt the reading format automatically? That is, can the software read the text file in any region setting, without me having to change the decimal point (.) to a comma (,) in the file?
Sure, you can program this behaviour!
I usually test the current regional settings at startup of my program and store the result in a global variable (or wherever you prefer to keep such a flag). Depending on this flag I use different format strings, e.g. "%.;%f" vs. "%,;%f"…
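In text form, that startup check can be sketched like this (Python is used for illustration, since LabVIEW code is graphical; the format strings are the LabVIEW localization specifiers quoted above, while the helper and variable names are my own):

```python
import locale

def format_for_separator(decimal_sep):
    """Return the LabVIEW-style format string matching the given decimal
    separator: "%.;%f" forces a decimal point, "%,;%f" a decimal comma."""
    return "%.;%f" if decimal_sep == "." else "%,;%f"

# At startup: adopt the user's regional settings once and keep the result
# in one place (the equivalent of the global flag described above).
try:
    locale.setlocale(locale.LC_NUMERIC, "")
except locale.Error:
    pass  # environment has no usable locale; stay on the default "C" locale
FLOAT_FORMAT = format_for_separator(locale.localeconv()["decimal_point"])
```

The point of doing the check once at startup is that every subsequent read or write can simply use the stored format string instead of querying the OS again.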
Sometimes it may still be easier to swap commas and points in text files when reading them.
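That swap is a one-liner in text form (a Python sketch; the function name is mine):

```python
def swap_decimal_and_grouping(text):
    """Swap every '.' and ',' in the text, so the German-formatted
    "8.542,486" becomes the US-formatted "8,542.486" and vice versa."""
    return text.translate(str.maketrans(".,", ",."))
```

Note that a blanket swap only works if the file contains nothing but numbers; any prose containing ordinary commas or periods would be mangled too.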
(In practice I tell my users to use only the English regional settings. If they use the German settings, the software opens a BIG warning dialog to force them to switch to the English settings…)
Actually, the proper way is to use the correct number format specifier in the Scan From String and Format Into String functions.
Whenever you use any of these functions, you have to ask yourself which number format the user most likely expects at that point.
When you format strings to be sent to a device, the device will most likely ALWAYS require a decimal point. The same goes for strings you receive from that device. Here you put the format specifier %.; in front of the format string to tell LabVIEW that all floating-point numbers are expected in decimal-point format.
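The effect of such a fixed, locale-independent device format can be sketched in text form (a Python illustration of the idea behind %.;, not a LabVIEW API; the function name is hypothetical):

```python
def to_device(value):
    """Serialize a float with a fixed decimal-point format, independent of
    the machine's regional settings. Python's str.format is locale-agnostic,
    which mirrors what the %.; specifier guarantees in LabVIEW."""
    return "{:.3f}".format(value)
```

Because the output never depends on the OS region setting, the device driver behaves identically on an English and on a German machine.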
If you read user files from the local computer (or write them to it), you most likely want to use whatever format the user has configured on their system. Here the format specifier would be %; in front of the format string, to tell LabVIEW to use whatever setting the OS is configured with.
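In text form, a locale-aware scan boils down to parsing with the separators the user's system actually uses (a Python sketch of the idea, not a LabVIEW API; the helper name and parameters are my own):

```python
def parse_number(text, decimal_sep=".", grouping_sep=","):
    """Parse a formatted number using explicit separators: strip the
    grouping separator, normalize the decimal separator, then convert."""
    cleaned = text.replace(grouping_sep, "").replace(decimal_sep, ".")
    return float(cleaned)
```

With the separators taken from the OS (e.g. via the regional settings queried at startup), the same call handles both "8,542.486" on a US system and "8.542,486" on a German one.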
If you read or write a file that only your application uses internally, and that is never meant to be viewed by the end user, you use whatever specific format you prefer.
It gets tricky when you start to go across system boundaries, such as network communication, and that communication is in text format. Here any properly defined protocol either has a means to explicitly specify the number format to be used, or uses a fixed format. Anything else is insane and belongs in the category of badly designed protocols.
So rather than forcing the user to change their system settings to run your software, you need to do a case-by-case study of the different number formats your software needs to handle. It's never arbitrary. Each subsystem uses a specific number format; a device, for example, usually needs decimal points, so you use that format in the device driver you create. There is no need to burden the rest of the software with any considerations about the number format used in the driver, or to force a specific setting on the computer system whenever an application using your driver is executed!