
Planning a world record calculation

by John D. Cook

Before carrying out a big calculation, you want to have an idea of how long various approaches would take. This post illustrates the process by planning an attempt to calculate Apéry's constant

\[ \zeta(3) = \sum_{n=1}^\infty \frac{1}{n^3} \]

to enormous precision. This constant has been computed to many decimal places, in part because it's an open question whether the number has a closed form. Maybe you'd like to set a new record for computing it.

This post was motivated by a question someone asked me in response to this post: what is the advantage of having an approximation for binomial coefficients when we can just calculate them exactly? This post will show these approximations in action.

Apéry's series

First of all, calculating ζ(3) directly from the definition above is not the way to go. The series converges too slowly for that. If you compute the sum of the first N terms, you get about 2 log10 N decimal places. This post explains why. We can do much better, with the number of decimal places achieved being proportional to N rather than log N.
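
You can see the slow convergence numerically. Here's a quick sketch (my illustration, not from the post) using Python's mpmath library: it sums the first N terms of the defining series and counts how many decimal places agree with ζ(3).

    from mpmath import mp, mpf, zeta, log10, fabs

    mp.dps = 50  # working precision in decimal digits

    def digits_from_direct_sum(N):
        """Sum the first N terms of sum(1/n^3) and count correct decimal places."""
        s = sum(mpf(1) / mpf(n)**3 for n in range(1, N + 1))
        return float(-log10(fabs(zeta(3) - s)))

    for N in [100, 1_000, 10_000]:
        # the number of correct digits tracks 2*log10(N)
        print(N, digits_from_direct_sum(N), float(2 * log10(N)))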

A couple of days ago I wrote about a sequence of rapidly converging series for ζ(3), starting with Apéry's series.

\[ \zeta(3) = \frac{5}{2} \sum_{n=1}^\infty \frac{(-1)^{n-1}}{n^3 \binom{2n}{n}} \]

Suppose you want to compute ζ(3) to a million decimal places using this series. How would you do this?

Since this is an alternating series, the truncation error from summing only the first N terms is bounded by the magnitude of the (N+1)st term. So you could sum terms until you get a term less than

\[ 10^{-10^6}. \]
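
On a small scale this stopping rule is easy to try out. Here's a sketch of the idea (my own illustration; the follow-up post does something similar in bc), summing Apéry's series until a term drops below 10^-30 rather than 10^-10^6:

    from mpmath import mp, mpf, binomial, zeta, fabs

    mp.dps = 50                 # working precision
    tol = mpf(10) ** -30        # stop when a term is smaller than this

    total = mpf(0)
    n = 1
    while True:
        term = mpf(-1)**(n - 1) / (mpf(n)**3 * binomial(2*n, n))
        if fabs(term) < tol:
            break               # alternating series: truncation error < next term
        total += term
        n += 1

    estimate = mpf(5) / 2 * total
    print(n - 1, "terms summed")            # a few dozen terms suffice for 30 digits
    print(fabs(estimate - zeta(3)))         # error is on the order of the stopping tolerance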

Great, but how long might that take?

Without some estimate of binomial coefficients, the only way to find out is to start computing a bunch of binomial coefficients. But with a little theory, you can do a back-of-the-envelope estimate.

Estimating how many terms to use

For large n, the binomial coefficient in the denominator of the nth term is approximately 4^n/√(πn). So when you include the n^3 factor, which more than cancels the √(πn), you can see that the denominator of the nth term in the series is greater than 4^n = 2^(2n). In other words, each term gives you at least 2 bits of precision.
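
That is the standard asymptotic estimate for the central binomial coefficient, C(2n, n) ≈ 4^n/√(πn). Here's a quick sanity check (a sketch of my own), comparing logarithms so the large integers never have to fit in a float:

    from math import comb, log, pi

    # Compare log C(2n, n) with the estimate n*log(4) - 0.5*log(pi*n).
    for n in [10, 100, 1_000, 10_000]:
        exact = log(comb(2 * n, n))        # math.log accepts arbitrarily large integers
        approx = n * log(4) - 0.5 * log(pi * n)
        print(n, exact, approx)            # the two agree more closely as n grows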

If you want 1,000,000 digits, that's about 3.3 million bits, since a decimal digit carries log2 10 ≈ 3.32 bits. So to get a million digits of ζ(3) you'd need to sum about 1.7 million terms of Apéry's series. By comparison, summing 1.7 million terms in the definition of ζ(3) would give you between 12 and 13 digits of precision.
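
The back-of-the-envelope arithmetic is easy to script. Here's a hedged sketch (the function name is mine): given a target number of decimal digits and a lower bound on the bits contributed per term, it returns roughly how many terms to sum.

    from math import ceil, log2, log10

    def terms_needed(decimal_digits, bits_per_term):
        """Rough term count: convert digits to bits, divide by bits gained per term."""
        bits = decimal_digits * log2(10)   # one decimal digit is log2(10) ~ 3.32 bits
        return ceil(bits / bits_per_term)

    print(terms_needed(1_000_000, 2))      # Apéry's series: about 1.7 million terms
    print(2 * log10(1_700_000))            # direct summation of that many terms: about 12.5 digits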

More efficient series

Apéry's series is only the first of a sequence of rapidly converging series for ζ(3). The series get more complicated but more efficient as a parameter s increases. Is it worthwhile to use a larger value of s? If you were planning to devote, say, a month of computing resources to try to set a record, it would be a good idea to spend a few hours estimating whether another approach would get more work done in that month.

By combining information here and here you could find out that the series corresponding to s = 2 has terms whose denominator is on the order of 3^(3n). To estimate how many terms you'd need for a million digits of precision, you'd need to solve the equation

\[ 3^{3n} = 10^{10^6}, \]

and by taking logs you find this is 1,047,951. In other words, about 1/3 fewer terms than using Apéry's series. That was a win, so maybe you go on to estimate whether you should go up to s = 3. The key information you need in these calculations is the estimate on binomial coefficients given here.

Incidentally, computing ζ(3) to a million decimal places would have given you the world record in 1997. At the moment, the record is on the order of a trillion digits.

Update: The next post shows how to actually carry out (on a smaller scale) the calculations described here using the Unix utility bc.
