John D. Cook

Link https://www.johndcook.com/blog
Feed http://feeds.feedburner.com/TheEndeavour?format=xml
Updated 2025-12-03 19:48
A Pattern Language
I first heard of the book A Pattern Language sometime in the 1990s. I had left academia, for the first time [1], and was working as a software developer. Although the book is about architecture, software developers were excited about the book because of its analogies to software development patterns. The “Gang of Four” book […]The post A Pattern Language first appeared on John D. Cook.
Email subscription changing
Email subscription for this web site has been provided by Google Feedburner. This service is going away in July. I don’t know yet what I will replace Feedburner with. Do any of you know of an alternative that automatically sends out email when there’s a new post? I use MailChimp to distribute my monthly […]The post Email subscription changing first appeared on John D. Cook.
The base with the largest decibel
This post is an expansion on a comment by Nathan Hannon: My favorite definition of e is “the base whose version of the decibel is the largest”. I hadn’t seen that before. Sounds interesting. What does it mean? First attempt: My first thought was that the statement meant that the function b·log_b(x) is largest […]The post The base with the largest decibel first appeared on John D. Cook.
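A quick numerical sketch of that first reading (my code, not the post's; the post goes on to test whether this interpretation holds). Since b·log_b(x) = (b/ln b)·ln x, the coefficient b/ln b determines how the bases compare at any fixed x:

    from math import e, log

    # Compare b * log_b(x) across bases at a fixed x.
    # b * log_b(x) = (b / ln b) * ln x, so the factor b / ln b decides.
    x = 100
    for b in (2, e, 10):
        print(b, b * log(x, b))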
Relating Rényi entropy and q-log entropy
I’ve written before about Rényi entropy H_q and most recently q-log entropy S_q, two generalizations of Shannon entropy. There are simple equations relating Rényi entropy and q-log entropy if we measure both in nats; the relation is reconstructed below. I mentioned in the post on q-log entropy that there were two possible ways it could be defined. The equation above […]The post Relating Rényi entropy and q-log entropy first appeared on John D. Cook.
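The equation elided from the excerpt can be reconstructed from the standard definitions H_q = log(Σ p_i^q)/(1 − q) and S_q = (Σ p_i^q − 1)/(1 − q); assuming the post uses those conventions, in nats,

    S_q = \frac{e^{(1-q) H_q} - 1}{1-q},
    \qquad
    H_q = \frac{\log\bigl(1 + (1-q) S_q\bigr)}{1-q}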
Generalizing Shannon entropy with q-logs
The most common way to quantify entropy is Shannon entropy. That’s what people usually mean when they say “entropy” without further qualifications. A while back I wrote about Rényi entropy as a generalization of Shannon entropy. This time I’ll explore a different generalization called q-log entropy, a.k.a. Tsallis entropy. The definition of Shannon entropy includes […]The post Generalizing Shannon entropy with q-logs first appeared on John D. Cook.
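For reference, the q-logarithm behind the name is usually defined as follows (the standard convention, which I assume here; the post notes there are two possible definitions), and substituting it into the Shannon formula gives Tsallis entropy:

    \ln_q(x) = \frac{x^{1-q} - 1}{1-q},
    \qquad
    S_q = \sum_i p_i \ln_q(1/p_i) = \frac{\sum_i p_i^q - 1}{1-q}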
Stumbling on an abandoned uranium mine
Last week my family and I made a tour of the five national parks in Utah. In Canyonlands National Park, my son-in-law noticed some grates at the bottom of a hill just a few feet off the road we were walking on. The area was not restricted. We walked over to investigate and found that […]The post Stumbling on an abandoned uranium mine first appeared on John D. Cook.
Decibel to log in new base
Logarithms in all bases are proportional. Specifically log_a(x) = log_b(x) / log_b(a) for any bases a and b. One way to read this is to say that if you already know about logarithms base b, logarithms base a aren’t anything fundamentally new [1]. They’re proportional to logarithms base b, and in fact the proportionality constant […]The post Decibel to log in new base first appeared on John D. Cook.
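A one-line check of the change-of-base identity (my example):

    from math import log

    # log base 2 of 100 from natural logs: log_2(100) = ln(100) / ln(2)
    print(log(100) / log(2))   # 6.6438...
    print(log(100, 2))         # same value via the two-argument form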
Most popular posts this year
Here are the five most popular posts so far this year: Self-reproducing cellular automata, Simple trig approximations, How to mentally calculate logs, Trig functions across programming languages, and Gell-Mann amnesia. The posts above had the most page views. But page views alone don’t necessarily measure what regular readers most enjoy reading. It’s heavily influenced by […]The post Most popular posts this year first appeared on John D. Cook.
Cyclic permutations and trace
The trace of a square matrix is the sum of the elements on its main diagonal. The order in which you multiply matrices matters: in general, matrix multiplication is not commutative. But the trace of a product of matrices may or may not depend on the order of multiplication. Specifically, trace doesn’t change if you […]The post Cyclic permutations and trace first appeared on John D. Cook.
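A minimal numerical illustration (my example, not the post's): the trace of a product is invariant under cyclic permutations of the factors, but not under arbitrary permutations.

    import numpy as np

    rng = np.random.default_rng(0)
    A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))

    # Cyclic permutations of ABC all give the same trace ...
    print(np.trace(A @ B @ C), np.trace(B @ C @ A), np.trace(C @ A @ B))
    # ... but a non-cyclic permutation generally does not.
    print(np.trace(A @ C @ B))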
Trick for 2×2 eigenvalues
3Blue1Brown has a nice new video on how to calculate the eigenvalues of 2×2 matrices. The most common way to find the eigenvalues of a 2×2 matrix A is working straight from the definition, solving det(A – λI) = 0. This is fine when you’re learning what eigenvalues are. But if you’ve already learned all […]The post Trick for 2×2 eigenvalues first appeared on John D. Cook.
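The trick amounts to using the mean m of the two eigenvalues (half the trace) and their product p (the determinant): the eigenvalues are m ± √(m² − p). A sketch of that shortcut:

    import numpy as np

    A = np.array([[2.0, 1.0], [1.0, 2.0]])
    m = np.trace(A) / 2          # mean of the eigenvalues
    p = np.linalg.det(A)         # product of the eigenvalues
    d = np.sqrt(m**2 - p)
    print(m - d, m + d)          # 1.0 3.0
    print(np.linalg.eigvals(A))  # check against the library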
Universal confidence interval
Here’s a way to find a 95% confidence interval for any parameter θ. With probability 0.95, return the real line. With probability 0.05, return the empty set. Clearly 95% of the time this procedure will return an interval that contains θ. This example shows the difference between a confidence interval and a credible interval. A […]The post Universal confidence interval first appeared on John D. Cook.
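A direct simulation of the (deliberately useless) procedure, a sketch in my own code:

    import random

    def universal_ci():
        # With probability 0.95 return the whole real line, else the empty set.
        if random.random() < 0.95:
            return (-float("inf"), float("inf"))
        return None  # the empty set

    theta = 42  # any parameter value whatsoever
    trials = 100_000
    covered = 0
    for _ in range(trials):
        ci = universal_ci()
        if ci is not None and ci[0] < theta < ci[1]:
            covered += 1
    print(covered / trials)  # ≈ 0.95, yet the interval is never informative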
What is a polynomial?
When you leave the comfort of the real numbers, you might be mistaken about what a polynomial is. Suppose you’re looking at polynomials over some finite field. Why would you do that? Numerous practical applications, but that’s a topic for another post. You look in some reference that’s talking about polynomials and you see things […]The post What is a polynomial? first appeared on John D. Cook.
Confidence interval widths
Suppose you do N trials of something that can succeed or fail. After your experiment you want to present a point estimate and a confidence interval. Or if you’re a Bayesian, you want to present a posterior mean and a credible interval. The numerical results hardly differ, though the two interpretations differ. If you got […]The post Confidence interval widths first appeared on John D. Cook.
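As an illustration of the kind of calculation involved (my sketch, not necessarily the post's choice of interval): the textbook Wald interval for a proportion after s successes in N trials.

    from math import sqrt

    def wald_interval(s, N, z=1.96):
        # Approximate 95% interval: p_hat ± z * sqrt(p_hat (1 - p_hat) / N)
        p = s / N
        h = z * sqrt(p * (1 - p) / N)
        return p - h, p + h

    print(wald_interval(81, 100))  # roughly (0.73, 0.89)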
Multicolor reproducing cellular automata
The previous post looked at a cellular automaton introduced by Edward Fredkin. It has only two states: a cell is on or off. At each step, each cell is set to the sum of the states of the neighboring cells mod 2. So a cell is on if it had an odd number of neighbors turned […]The post Multicolor reproducing cellular automata first appeared on John D. Cook.
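A sketch of one update step of the rule as described; I assume the four von Neumann neighbors and wraparound edges, which the excerpt doesn't specify (the same idea works with eight neighbors):

    import numpy as np

    def step(grid):
        # Each cell becomes the sum of its four neighbors mod 2.
        return (np.roll(grid, 1, 0) + np.roll(grid, -1, 0) +
                np.roll(grid, 1, 1) + np.roll(grid, -1, 1)) % 2

    g = np.zeros((16, 16), dtype=int)
    g[8, 8] = 1          # a single on cell
    for _ in range(4):   # the rule is linear mod 2, so after 2**k steps
        g = step(g)      # a single cell becomes four shifted copies
    print(g.sum())       # 4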
Self-reproducing cellular automata
Edward Fredkin is best known these days for the Fredkin gate, a universal reversible circuit. I recently found out that Fredkin was one of the pioneers in cellular automata. His student Edwin Banks describes a cellular automaton introduced by Fredkin [1]. Edward Fredkin of MIT has described the following interesting cellular space. If the states […]The post Self-reproducing cellular automata first appeared on John D. Cook.
Humming St. Christopher
The other day I woke up with a song in my head I hadn’t heard in a long time, the hymn Beneath the Cross of Jesus. The name of the tune is St. Christopher. When I thought about the tune, I realized it has some fairly sophisticated harmony. My memory of the hymns I grew […]The post Humming St. Christopher first appeared on John D. Cook.
Aliquot ratio distribution
The previous post looked at repeatedly applying the function s(n), the sum of the divisors of n less than n. It is an open question whether the sequence s( s( s( … s(n) … ) ) ) always converges or enters a loop. In fact, it’s an open question whether the sequence […]The post Aliquot ratio distribution first appeared on John D. Cook.
The iterated aliquot problem
Let s(n) be the sum of the proper divisors of n, the divisors less than n itself. A number n is called excessive, deficient, or perfect depending on whether s(n) – n is positive, negative, or 0 respectively, as I wrote about a few days ago. The number s(n) is called the aliquot sum of […]The post The iterated aliquot problem first appeared on John D. Cook.
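A simple sketch of s(n) and a few steps of the iteration (my code):

    def s(n):
        # Sum of the proper divisors of n (divisors less than n itself).
        return sum(d for d in range(1, n) if n % d == 0)

    print(s(12))        # 1+2+3+4+6 = 16 > 12, so 12 is excessive
    print(s(28))        # 28, so 28 is perfect
    print(s(s(s(24))))  # three steps of the iterated aliquot sequence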
Cologarithms and Entropy
The term “cologarithm” was once commonly used but has now faded from memory. Here’s a plot of the frequency of the terms cologarithm and colog from Google’s Ngram Viewer. The cologarithm base b is the logarithm base 1/b, or equivalently, the negative of the logarithm base b: colog_b(x) = log_{1/b}(x) = −log_b(x) […]The post Cologarithms and Entropy first appeared on John D. Cook.
Corollary of a well-known fact
When students learn about decimals, they’re told that every fraction either has a terminating decimal expansion or a repeating decimal expansion. The previous post gives a constructive proof of the converse: given a repeating decimal (in any base), it shows how to find the rational number it corresponds to. Maybe you learned this so long […]The post Corollary of a well-known fact first appeared on John D. Cook.
Repeating decimals in any base
My previous post ended with a discussion of repeating binary decimals such as 0.00110011…_2 = 1/5. For this post I’ll explain how calculations like that are done, how to convert a repeating decimal in any base to a fraction. First of all, we only need to consider repeating decimals of the form 0.1_b, 0.01_b, 0.001_b, […]The post Repeating decimals in any base first appeared on John D. Cook.
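A quick check of the example above with Python's Fraction class: a repeating block of k digits after the point equals the block's value divided by b^k − 1.

    from fractions import Fraction

    # 0.00110011..._2 : the repeating block is 0011 (value 3), k = 4 digits.
    print(Fraction(0b0011, 2**4 - 1))  # 3/15 = 1/5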
Chaotic image out of regular slices
Yesterday I wrote about the bits in powers of 3. That post had a low-resolution image, which has its advantages, but here’s a higher resolution image that also has its advantages. The image looks chaotic. I say this in the colloquial sense, not in the technical sense as in period three implies chaos. I just […]The post Chaotic image out of regular slices first appeared on John D. Cook.
Evolution of random number generators
The previous post showed that the bits of prime powers look kinda chaotic. When you plot them, they form a triangular shape because the size of the numbers is growing. The numbers are growing geometrically, so their binary representations are growing linearly. Here’s the plot for powers of 5: You can crop the triangle so […]The post Evolution of random number generators first appeared on John D. Cook.
Powers of 3 in binary
I was thumbing through A New Kind of Science [1] and one of the examples that jumped out at me was looking at the bits in the binary representation of the powers of 3. I wanted to reproduce the image myself and here’s the result. Here a white square represents a 1 and a blue […]The post Powers of 3 in binary first appeared on John D. Cook.
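A sketch that reproduces the underlying bit pattern as text rather than as an image (my code):

    # Print the binary digits of 3**n for n = 0 .. N, padded to equal width.
    N = 20
    width = (3**N).bit_length()
    for n in range(N + 1):
        print(format(3**n, f"0{width}b"))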
Factors of orders of sporadic groups
I was curious about the orders of the sporadic groups, specifically their prime factors, so I made a bar graph to show how often each factor appears. To back up a little bit, simple groups are to groups roughly what prime numbers are to integers. Finite simple groups have been fully classified, and they fall […]The post Factors of orders of sporadic groups first appeared on John D. Cook.
Excessive, deficient, and perfect numbers
I learned recently that weekly excess deaths in the US have dipped into negative territory [1], and wondered whether we should start speaking of deficient deaths by analogy with excessive and deficient numbers. The ancient Greeks defined a number n to be excessive, deficient, or perfect according to whether the sum of the number’s proper […]The post Excessive, deficient, and perfect numbers first appeared on John D. Cook.
Is fast grep faster?
The grep utility searches text files for regular expressions, but it can search for ordinary strings since these strings are a special case of regular expressions. However, if your regular expressions are in fact simply text strings, fgrep may be much faster than grep. Or so I’ve heard. I did some benchmarks to see. Strictly […]The post Is fast grep faster? first appeared on John D. Cook.
Tower of powers and convergence
This post will look at the “tower of powers” and ask what it means, when it converges, and how to compute it. Along the way I’ll tie in two recent posts, including one that should come as a surprise. First of all, the expression is right-associative. That is, it is the limit of x^(x^(x^…)) and […]The post Tower of powers and convergence first appeared on John D. Cook.
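A sketch of computing the tower numerically by fixed-point iteration t ← x^t, which converges for x in [e^(−e), e^(1/e)], roughly 0.066 to 1.445 (my code):

    def tower(x, steps=1000):
        # Iterate t <- x**t; converges for e**-e <= x <= e**(1/e).
        t = 1.0
        for _ in range(steps):
            t = x ** t
        return t

    print(tower(1.2))        # about 1.2577
    print(tower(2 ** 0.5))   # sqrt(2)^sqrt(2)^... = 2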
Lambert W strikes again
I was studying a statistics paper the other day in which the author said to solve t log( 1 + n/t ) = k for t as part of an algorithm. Assume 0 < k < n. Is this well posed? First of all, can this equation be solved for t? Second, if there is […]The post Lambert W strikes again first appeared on John D. Cook.
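The post's closed form involves the Lambert W function; as a sanity check, here is a numerical root-finding sketch for the same equation (scipy assumed available; n and k are arbitrary values satisfying 0 < k < n). Since t log(1 + n/t) → 0 as t → 0+ and → n as t → ∞, a root is easy to bracket:

    from math import log
    from scipy.optimize import brentq

    n, k = 10, 5
    f = lambda t: t * log(1 + n/t) - k
    print(brentq(f, 1e-9, 1e9))  # the t solving t log(1 + n/t) = k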
Comparing files with very long lines
Suppose you need to compare two files with very long lines. Maybe each file is one line. The diff utility will show you which lines differ. But if the files are each one line, this tells you almost nothing. It confirms that there is a difference, but it doesn’t show you where the difference is. […]The post Comparing files with very long lines first appeared on John D. Cook.
New ICD-10 diagnosis code weirdness
In October 2020 there were 619 new codes added to the list of ICD-10 diagnosis codes. Some of these make sense, such as four codes added for COVID: U07.1 COVID-19 Z11.52 Encounter for screening for COVID-19 Z20.822 Contact with and (suspected) exposure to COVID-19 Z86.16 Personal history of COVID-19 There are also 24 new codes […]The post New ICD-10 diagnosis code weirdness first appeared on John D. Cook.
Magic square of squares
Allen William Johnson [1] discovered the following magic square whose entries are all squares. The following Python code verifies that this is a magic square.

    import numpy as np

    M = np.array([
        [ 30**2, 246**2, 172**2,  45**2],
        [ 93**2, 116**2,  66**2, 258**2],
        [126**2, 138**2, 237**2,  44**2],
        [260**2,   3**2,  54**2, 150**2]])

    def verify(M):
        m, n […]

The post Magic square of squares first appeared on John D. Cook.
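The excerpt cuts the code off mid-function; here is a hypothetical completion (my sketch, not necessarily the post's code), continuing with the M and numpy above: check that all row sums, column sums, and both diagonal sums agree.

    # Hypothetical completion of verify, not the post's code:
    def verify(M):
        m, n = M.shape
        assert m == n
        target = M[0, :].sum()
        ok = all(M[i, :].sum() == target for i in range(n))
        ok = ok and all(M[:, j].sum() == target for j in range(n))
        ok = ok and M.trace() == target              # main diagonal
        ok = ok and np.fliplr(M).trace() == target   # anti-diagonal
        return ok

    print(verify(M))  # True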
Martian gravity
There is a lot of talk about Mars right now, and understandably so. The flight of Ingenuity today was awesome. As Daniel Oberhaus pointed out on Twitter, … the atmosphere on the surface of Mars is so thin that it’s the equivalent of flying at ~100k feet on Earth. No rotorcraft, piloted or uncrewed, has […]The post Martian gravity first appeared on John D. Cook.
Hashing phone numbers
A cryptographic hash is also known as a one-way function because given an input x, one can quickly compute the hash h(x), but it is extremely time-consuming to try to recover x if you only know h(x). Even if the hashing algorithm is considered “broken,” it may take an enormous effort to break it. Google […]The post Hashing phone numbers first appeared on John D. Cook.
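A sketch of why hashing doesn't protect low-entropy inputs like phone numbers: an attacker can simply hash every candidate and compare. (Toy example with a shortened digit space; the number and names here are mine.)

    import hashlib

    def h(s):
        return hashlib.sha256(s.encode()).hexdigest()

    # Toy: a 5-digit "phone number" space (10**5 values). Real 10-digit
    # numbers have only ~10**10 values, still within brute-force reach.
    target = h("55512")   # the leaked hash an attacker sees

    for n in range(10**5):
        guess = str(n).zfill(5)
        if h(guess) == target:
            print("recovered:", guess)
            break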
Duodecimal vs Hexadecimal
I ran across the word duodecimal recently, the Latin-derived term for base 12, and realized it’s been years, maybe decades, since I’d heard that term. I’ve used hexadecimal explicitly in 30 different blog posts, and implicitly in more, but this is the first time I’ve used duodecimal. Hexadecimal comes up constantly in applications. I suppose […]The post Duodecimal vs Hexadecimal first appeared on John D. Cook.
Floor, ceiling, bracket
Mathematics notation changes slowly over time, generally for the better. I can’t think of an instance that I think was a step backward. Gauss introduced the notation [x] for the greatest integer less than or equal to x in 1808. The notation was standard until relatively recently, though some authors used the same notation to […]The post Floor, ceiling, bracket first appeared on John D. Cook.
Box in ball in box in high dimension
Start with a unit circle and draw the largest square you can inside the circle and the smallest square you can outside the circle. In geometry lingo these are the inscribed and circumscribed squares. The circle fits inside the square better than the square fits inside the circle. That is, the ratio of the area […]The post Box in ball in box in high dimension first appeared on John D. Cook.
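In two dimensions the numbers are: the unit circle has area π, the inscribed square area 2, and the circumscribed square area 4, so the circle fills π/4 ≈ 0.785 of its square while the square fills only 2/π ≈ 0.637 of its circle. A sketch of the same comparison in dimension n (my code), using the unit-ball volume π^(n/2)/Γ(n/2 + 1):

    from math import pi, gamma, sqrt

    def ball_volume(n):
        # Volume of the unit n-ball: pi^(n/2) / Gamma(n/2 + 1)
        return pi ** (n / 2) / gamma(n / 2 + 1)

    for n in [2, 3, 10]:
        ball = ball_volume(n)
        out_cube = 2 ** n             # circumscribed cube, side 2
        in_cube = (2 / sqrt(n)) ** n  # inscribed cube, side 2/sqrt(n)
        print(n, ball / out_cube, in_cube / ball)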
More on why simple approximations work
A few weeks ago I wrote several blog posts about very simple approximations that are surprisingly accurate. These approximations are valid over a limited range, but with range reduction they can be used over the full range of the functions. In this post I want to look again at one of those approximations and at Padé approximation. It turns out […]The post More on why simple approximations work first appeared on John D. Cook.
Mathematical stability vs numerical stability
Is 0 a stable fixed point of f(x) = 4x (1-x)? If you set x = 0 and compute f(x) you will get exactly 0. Apply f a thousand times and you’ll never get anything but zero. But this does not mean 0 is a stable attractor, and in fact it is not stable. It’s […]The post Mathematical stability vs numerical stability first appeared on John D. Cook.
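A quick numerical illustration of the point (my code): start exactly at 0 and you stay there, but start at a perturbation the size of floating point error and the iterates escape.

    f = lambda x: 4 * x * (1 - x)

    x = 0.0
    for _ in range(1000):
        x = f(x)
    print(x)          # exactly 0.0

    x = 1e-15         # a perturbation on the order of floating point error
    for _ in range(60):
        x = f(x)
    print(x)          # far from 0: the orbit wanders chaotically in [0, 1]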
Spaceship operator in Python
Some programming languages, such as Perl, have an infix operator <=> that returns a three-state comparison. The expression a <=> b evaluates to -1, 0, or 1 depending on whether a < b, a = b, or a > b. You could think of <=> as a concatenation of <, =, and >. The <=> operator […]The post Spaceship operator in Python first appeared on John D. Cook.
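Python has no built-in <=>, but the same three-way result is easy to compute. A common idiom (my sketch, not necessarily the post's):

    def spaceship(a, b):
        # Returns -1, 0, or 1, like Perl's a <=> b.
        return (a > b) - (a < b)

    print(spaceship(3, 5), spaceship(5, 5), spaceship(7, 5))  # -1 0 1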
Real radical roots
The previous post Does chaos imply period 3? ended with looking at a couple cubic polynomials whose roots have period 3 under the mapping f(x) = 4x(1-x). These are 64 x³ – 112 x² + 56 x – 7 and 64 x³ – 96 x² + 36 x – 3. I ended the post by […]The post Real radical roots first appeared on John D. Cook.
Does chaos imply period 3?
Sharkovskii’s theorem says that if a continuous map f from an interval I to itself has a point with period 3, then it has a point with period 5. And if it has a point with period 5, then it has points with period 7, etc. The theorem has a chain of implications that all […]The post Does chaos imply period 3? first appeared on John D. Cook.
Sarkovsky’s theorem
The previous post explained what is meant by period three implies chaos. This post is a follow-on that looks at Sarkovsky’s theorem, which is mostly a generalization of that theorem, but not entirely [1]. First of all, Mr. Sarkovsky is variously known as Sharkovsky, Sharkovskii, etc. As with many Slavic names, his name can be anglicized […]The post Sarkovsky’s theorem first appeared on John D. Cook.
Period three implies chaos
One of the most famous theorems in chaos theory, maybe the most famous, is that “period three implies chaos.” This compact statement comes from the title of a paper [1] by the same name. But what does it mean? This post will look at what the statement means, and the next post will look at […]The post Period three implies chaos first appeared on John D. Cook.
Better approximation for ln, still doable by hand
A while back I presented a very simple algorithm for computing natural logs: log(x) ≈ (2x – 2)/(x + 1) for x between exp(-0.5) and exp(0.5). It’s accurate enough for quick mental estimates. I recently found an approximation by Ronald Doerfler that is a little more complicated but much more accurate: log(x) ≈ 6(x – […]The post Better approximation for ln, still doable by hand first appeared on John D. Cook.
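A quick check of the simpler approximation over its intended range (my code):

    from math import log, exp

    approx = lambda x: (2*x - 2) / (x + 1)

    # Maximum absolute error over [exp(-0.5), exp(0.5)]:
    lo, hi = exp(-0.5), exp(0.5)
    xs = [lo + k * (hi - lo) / 1000 for k in range(1001)]
    print(max(abs(approx(x) - log(x)) for x in xs))  # about 0.01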
Beta distribution with given mean and variance
It occurred to me recently that a problem I solved numerically years ago could be solved analytically, the problem of determining beta distribution parameters so that the distribution has a specified mean and variance. The calculation turns out to be fairly simple. Maybe someone has done it before. Problem statement The beta distribution has two […]The post Beta distribution with given mean and variance first appeared on John D. Cook.
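The moment-matching solution (which I believe is what the post derives, though the excerpt is cut off): with mean μ and variance σ², let ν = μ(1 − μ)/σ² − 1; then a = μν and b = (1 − μ)ν. A sketch:

    def beta_params(mean, var):
        # Solve for beta(a, b) with the given mean and variance.
        # Requires 0 < var < mean * (1 - mean).
        nu = mean * (1 - mean) / var - 1
        return mean * nu, (1 - mean) * nu

    a, b = beta_params(0.25, 0.01)
    # Check: mean = a/(a+b), variance = ab/((a+b)^2 (a+b+1))
    print(a / (a + b), a * b / ((a + b)**2 * (a + b + 1)))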
Close but no cigar
The following equation is almost true. And by almost true, I mean correct to well over 200 decimal places. This sum comes from [1]. Here I will show why the two sides are very nearly equal and why they’re not exactly equal. Let’s explore the numerator of the sum with a little code. >>> from […]The post Close but no cigar first appeared on John D. Cook.
Arithmetic-geometric mean
The previous post made use of both the arithmetic and geometric means. It also showed how both of these means correspond to different points along a continuum of means. This post combines those ideas. Let a and b be two positive numbers. Then the arithmetic and geometric means are defined by A(a, b) = (a […]The post Arithmetic-geometric mean first appeared on John D. Cook.
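The arithmetic-geometric mean iterates the two means until they agree; a sketch (my code):

    from math import sqrt

    def agm(a, b, tol=1e-15):
        # Repeatedly replace (a, b) with their arithmetic and geometric means.
        while abs(a - b) > tol * a:
            a, b = (a + b) / 2, sqrt(a * b)
        return a

    print(agm(1, 2))  # about 1.45679, between the two original means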
Higher roots and r-means
The previous post looked at a simple method of finding square roots that amounts to a special case of Newton’s method, though it is much older than Newton’s method. We can extend Newton’s method to find cube roots and nth roots in general. And when we do, we begin to see a connection to r-means. […]The post Higher roots and r-means first appeared on John D. Cook.
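The extension replaces the plain average with a weighted one: a Newton step for the nth root of x is g ← ((n − 1)g + x/gⁿ⁻¹)/n, a weighted mean of g and x/gⁿ⁻¹. A sketch (my code):

    def nth_root(x, n, g=1.0, steps=30):
        # Newton's method for the nth root of x.
        for _ in range(steps):
            g = ((n - 1) * g + x / g ** (n - 1)) / n
        return g

    print(nth_root(27, 3))   # 3.0
    print(nth_root(32, 5))   # 2.0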
Calculating square roots
Here’s a simple way to estimate the square root of a number x. Take a guess g at the root and compute the average of g and x/g. If you want to compute square roots mentally or with pencil and paper, how accurate can you get with this method? Could you, for example, get within […]The post Calculating square roots first appeared on John D. Cook.
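A sketch of the method described, with a round or two of guess-and-average (my code):

    def sqrt_estimate(x, g, rounds=2):
        # Replace the guess g with the average of g and x/g.
        for _ in range(rounds):
            g = (g + x / g) / 2
        return g

    print(sqrt_estimate(10, 3))  # 3.16228 after two rounds; sqrt(10) = 3.16228...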