

How to display Pi to 50 digits of precision

I need to display and use the value of Pi to 50 digits of precision. The problem is that LabVIEW keeps truncating the value from 3.1415926535897932384626433832795028841971693993751 to 3.14159265358979323. Is there a way to work with Pi to 50 digits of precision on a 32-bit processor?


Message 1 of 4
As far as I know, 15-20 is the maximum number of decimal digits that you're going to be able to get. There's a page in the online LabVIEW Help that lists the number of decimal digits based on type.
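
That limit comes from the floating-point formats themselves: a 64-bit double carries about 15-17 significant decimal digits, and LabVIEW's extended-precision type roughly 19-20. As a quick illustration (shown in Python here purely for readability, not LabVIEW), asking a double for 50 digits only exposes representation noise:

import math

# math.pi is the 64-bit double closest to the true value of Pi.
print(f"{math.pi:.50f}")
# 3.14159265358979311599796346854418516159057617187500
# Only the first ~16 significant digits match Pi; the rest is the exact
# decimal expansion of the nearest double, not additional digits of Pi.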
Message 2 of 4

Why do you need such a high degree of precision? I can't think of a single real measurement that could need such a precise value. If you think about it, you would need a huge number of bytes to store such a number, and any calculation with it would be slow. The only time I have done such calculations was for school assignments that were purely academic in nature and had no real significance for real-world problem solving.

That said, you could implement your own number format using a string and a simple set of VIs that manipulate the string as a number. This does work (in fact, it was an assignment I did in C many years ago). Essentially you have a cluster with a string containing the digits and a signed integer for the exponent, then make Add, Subtract, Multiply, and Divide VIs that act as an ALU/math processor, as sketched below. It will be very slow but will allow for many digits, thousands of digits of precision. Other than this academic solution, I don't have an answer for handling such a precise number of significant digits.
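
To make the cluster-of-string-and-exponent idea concrete, here is a minimal sketch in Python rather than a LabVIEW diagram (the names BigNum and big_add are illustrative, not from the original post), showing only Add for non-negative values:

from dataclasses import dataclass

@dataclass
class BigNum:
    digits: str    # decimal digits, most significant first, no decimal point
    exponent: int  # value represented = int(digits) * 10**exponent

def big_add(a: BigNum, b: BigNum) -> BigNum:
    """Add two non-negative BigNum values digit by digit."""
    # Align both operands to the smaller exponent by padding with zeros.
    exp = min(a.exponent, b.exponent)
    da = a.digits + "0" * (a.exponent - exp)
    db = b.digits + "0" * (b.exponent - exp)
    # Pad to equal length, then add from the least significant digit with carry.
    width = max(len(da), len(db))
    da, db = da.zfill(width), db.zfill(width)
    out, carry = [], 0
    for x, y in zip(reversed(da), reversed(db)):
        s = int(x) + int(y) + carry
        out.append(str(s % 10))
        carry = s // 10
    if carry:
        out.append(str(carry))
    return BigNum("".join(reversed(out)).lstrip("0") or "0", exp)

# Example: 3.14159 + 2.71828, each stored as six digits with exponent -5.
print(big_add(BigNum("314159", -5), BigNum("271828", -5)))
# BigNum(digits='585987', exponent=-5), i.e. 5.85987

Subtract, Multiply, and Divide would follow the same align-then-operate pattern on the digit strings; in LabVIEW the BigNum record corresponds to the cluster of string and integer described above.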

 

Paul 

Paul Falkenstein
Coleman Technologies Inc.
CLA, CPI, AIA-Vision
LabVIEW 4.0 - 2013, RT, Vision, FPGA
Message 3 of 4

I can't think of a real world application either.

However, there seems to be a problem where some degree of precision is lost when data is moved through multiple computers, and we wanted to know if we could replicate that precision loss using a single PC. The difficulty lies in trying to compare the bit-level input and output from different platforms.
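
As an aside (not from the original thread), one way to compare values bit-for-bit on a single PC is to dump their raw IEEE 754 bit patterns; a small Python sketch of the idea (LabVIEW's Flatten To String can produce equivalent bytes):

import math
import struct

def double_bits(x: float) -> str:
    """Return the 64-bit IEEE 754 pattern of x as big-endian hex."""
    return struct.pack(">d", x).hex()

a = math.pi                # value produced on machine A
b = 3.14159265358979       # value read back on machine B after losing digits
print(double_bits(a))      # 400921fb54442d18
print(double_bits(b))      # same leading hex digits, different low-order bits
print(double_bits(a) == double_bits(b))  # False -> precision was lost in transit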

Thanks for your replies.

Message 4 of 4