https://gizmodo.com/the-new-longest-pi-beats-previous-record-by-12-8-trilli-1847503360
On Monday, a team of Swiss data scientists announced that their supercomputer calculated the mathematical constant pi to a new length of 62.8 trillion digits, extending the constant beyond its previously calculated end by some 12.8 trillion digits. And to think I never even memorized 3.141592653.
According to a statement from the University of Applied Sciences of Graubünden in Switzerland, the research team knew over the weekend that they had achieved the most precise calculation yet of the constant, which describes the ratio of a circle’s circumference to its diameter.
Pi has numerous applications, including in construction and space flight, but as David Harvey, a mathematician at the University of New South Wales in Australia, told The Guardian, “I can’t imagine any real-life physical application where you would need any more than 15 decimal places.” Other computer scientists have said that 39 digits should do, because that specificity gives you the circumference of the observable universe to within the diameter of a single atom.
---
My question: if 39 digits already gives atom-versus-universe precision, how do they perform a 62.8-trillion-digit calculation with any confidence in its accuracy?
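For what it's worth, record pi computations are typically cross-checked with an independent digit-extraction formula: the Bailey–Borwein–Plouffe (BBP) formula can produce hexadecimal digits of pi at an arbitrary position without computing any of the preceding digits, so the tail end of a record run can be spot-checked by a completely different method. Here is a minimal double-precision sketch of BBP digit extraction (real verifications use arbitrary-precision arithmetic and positions in the trillions; this toy version is only reliable for small positions):

```python
def bbp_hex_digit(n):
    """Return the hex digit of pi at position n after the point (n=0 is the first).

    Uses the BBP formula:
      pi = sum_{k>=0} 16^-k * (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6))
    Multiplying by 16^n and taking the fractional part isolates digit n+1,
    and modular exponentiation keeps the head of the series manageable.
    """
    def series(j, n):
        # Head: terms where 16^(n-k) is an integer; reduce mod (8k+j) so only
        # the fractional contribution survives.
        s = 0.0
        for k in range(n + 1):
            denom = 8 * k + j
            s = (s + pow(16, n - k, denom) / denom) % 1.0
        # Tail: terms where 16^(n-k) < 1; a handful suffice in double precision.
        t = 0.0
        for k in range(n + 1, n + 20):
            t += 16.0 ** (n - k) / (8 * k + j)
        return (s + t) % 1.0

    x = (4 * series(1, n) - 2 * series(4, n)
         - series(5, n) - series(6, n)) % 1.0
    return int(x * 16)
```

Since pi in hexadecimal begins 3.243F6A88…, `bbp_hex_digit(0)` through `bbp_hex_digit(3)` should yield 2, 4, 3, 15 (F). The key point is that this check costs roughly O(n log n) per digit and shares no code or intermediate state with the main (typically Chudnovsky-series) computation, so agreement on the final digits is strong evidence the whole run is correct.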