**It may well be that there is such a thing as a universal law of
complexity growth, operating at some deep level of reality. If so, complexity
growth would likely be self-similar/scale-invariant, implying, in the general
case, a complexity growth law leading to a future mathematical
singularity. Historic trends suggest that we may not be all that far away from
that point in time.**

Let's start with simple exponential growth (the kind of growth of bacteria in a Petri dish or of money in a bank account with a fixed interest rate). If we write "C" for complexity, then in the case of exponential growth the relative growth rate is constant:

dC/Cdt = const

which leads to the growth law C(t) ~ exp(t), the well-known exponential growth.
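As a quick numerical sanity check (just a sketch, with arbitrary values for the rate constant and initial complexity), one can integrate a constant relative growth rate with a simple Euler scheme and watch the exponential come out:

```python
import math

# Integrate dC/Cdt = const, i.e. dC/dt = k*C, with a simple Euler scheme
# and compare against the closed-form solution C(t) = C0 * exp(k*t).
k, C0 = 0.5, 1.0          # arbitrary rate constant and starting complexity
dt, steps = 1e-4, 20000   # integrate from t = 0 to t = 2

C = C0
for _ in range(steps):
    C += k * C * dt       # Euler step: dC = k * C * dt

t = steps * dt
exact = C0 * math.exp(k * t)
print(C, exact)           # the Euler result approaches exp(k*t) as dt -> 0
```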

Since we don't know how the growth rate should depend on the already existing complexity, in general we would have to write:

dC/Cdt = f(C)

where f(C) is a monotonically growing function of C. What would be an appropriate choice for the function f(C)? Let's discuss two cases:

(1) dC/Cdt ~ log(C) and

(2) dC/Cdt ~ C^m

The growth law that satisfies equation (1) is

(3) C(t) ~ exp(exp(t)), the double exponential growth that Ray Kurzweil seems to talk about in his books most of the time.

For m>0 the growth law of equation (2) is

(4) C(t) ~ (t0-t)^(-1/m), let's call this hyperbolic growth

To convince yourself that these growth laws satisfy the above differential equations, differentiate (3) and (4) and substitute into (1) and (2), respectively. Equation (4) has a mathematical singularity at time t0. For m=0, equation (2) again leads to exponential growth.

Any exponent m>0 would require some kind of "auto-catalytic" process (e.g., bacteria multiplying faster when they have many close neighbors); these would be the "accelerating returns" Kurzweil talks about. What I like about equation (2) is that it is scale invariant (a general property of power laws like C^m), i.e., it behaves the same way on all scales - when there is very little complexity (rocks, microbes) just as when there is a lot of complexity (brains). So I think if one is really after a general law of complexity growth, one has to go with equation (2); equation (1) would be no good for that (I doubt Kurzweil realizes this).
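The "differentiate and substitute" check can also be done numerically - a small sketch (the test points and the value m=0.4 are arbitrary, picked just for illustration):

```python
import math

def rel_rate(C, t, h=1e-6):
    """Numerical relative growth rate dC/Cdt via a central difference."""
    return (C(t + h) - C(t - h)) / (2 * h * C(t))

# Equation (3): C(t) = exp(exp(t))  ->  dC/Cdt should equal log(C) = exp(t)
C3 = lambda t: math.exp(math.exp(t))
print(rel_rate(C3, 0.5), math.exp(0.5))    # both ~1.6487

# Equation (4): C(t) = (t0-t)^(-1/m)  ->  dC/Cdt should equal C^m / m
t0, m = 10.0, 0.4
C4 = lambda t: (t0 - t) ** (-1 / m)
print(rel_rate(C4, 3.0), C4(3.0) ** m / m)  # both ~0.3571
```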

So does hyperbolic growth really happen in nature? Interestingly, hyperbolic growth (called the "coalition growth model" by the authors) was shown to describe human population growth very well up until about 1975 - see the famous 1960 Science paper by Heinz von Foerster, Patricia Mora, and Larry Amiot (vol. 132, pp. 1291-1295). After about 1975 the growth rate seemed to drop.

Another thing I looked into myself is the growth of the annual number of scientific publications between 1500 and 2000 (the age of the printing press, now replaced by the age of Google, for lack of a better name ;-). If one fits a hyperbolic growth model to those numbers, one finds t0 around the year 2020 and 1/m of about 2.5...3.0 or so (my preferred value is 1/m=e ;-). The hyperbolic model actually fits the data quite well, though, as in the case of population growth, the numbers seem to drop off somewhat after about 1985.
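For readers who want to try such a fit themselves, here is a minimal sketch of the procedure. The publication counts are not reproduced here, so the "data" below are synthetic, generated from the model itself with t0=2020 and 1/m=e; the point is only the fitting trick: in log-log form the model is a straight line for the right t0, so one can grid-search over t0 and do a linear fit at each step.

```python
import math

# Synthetic stand-in for annual publication counts (NOT real data):
# generated from C(y) = A * (t0 - y)^(-1/m) with t0 = 2020, 1/m = e.
years = list(range(1600, 2000, 50))
A, t0_true, inv_m = 1e9, 2020.0, math.e
counts = [A * (t0_true - y) ** (-inv_m) for y in years]

# In log-log form:  log C = log A - (1/m) * log(t0 - y),
# so for each candidate t0 fit a line and keep the best residual.
def fit(t0):
    xs = [math.log(t0 - y) for y in years]
    ys = [math.log(c) for c in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    resid = sum((y - my - slope * (x - mx)) ** 2 for x, y in zip(xs, ys))
    return slope, resid

best_t0 = min(range(2005, 2060), key=lambda t0: fit(t0)[1])
slope, _ = fit(best_t0)
print(best_t0, -slope)  # recovers t0 = 2020 and 1/m ~ e
```

With real counts the residual would of course never vanish, and the post-1985 drop-off mentioned above would show up as a systematic deviation at the recent end.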

I also tried to apply hyperbolic growth to biological evolution, using a plot I found in Carl Sagan's book "The Dragons of Eden" (chapter 2, Genes and Brains). Sagan plots, double-logarithmically, the time of first appearance (years before present) against the information content in bits of the most complex organism that existed at the time (genetic for simple organisms, synaptic for brains). In this kind of presentation the hyperbolic law, C(t) ~ (t0-t)^(-e), with t0 approximately corresponding to the present (the value found from the science publications), is a straight line which passes almost exactly through Sagan's data points.

Taken at face value, this seems to say that the same growth law holds for both biological evolution and the production of papers by present-day scientists - the same law for genes and memes. Or is this just some strange coincidence? One other thing of interest is the doubling time of C(t) that can be derived from the growth law:

(5) T2=log(2)/e * (t0-t), i.e., roughly (time to t0)/4

So the complexity doubling time would be about 1 billion years for the first microbes and a couple of years if t0 is a few decades away, i.e., in agreement with the current doubling rate of the annual number of science publications. Or think of the tens to hundreds of thousands of years technology doubling time of a stone-age society, or the perhaps several hundred years doubling time of the technological and cultural complexity of ancient Athens or Rome - all roughly in agreement with the above formula. Amazing!
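These back-of-the-envelope numbers are easy to verify. Solving C(t+T2) = 2*C(t) for C(t) = (t0-t)^(-e) exactly gives T2 = (t0-t)*(1 - 2^(-1/e)); equation (5) is the linearized version of this, with log(2)/e ~ 0.255 ~ 1/4 (here log means the natural logarithm). A quick check, taking the first microbes to be roughly 4 billion years before t0:

```python
import math

# Doubling time for C(t) = (t0-t)^(-e): solve C(t + T2) = 2*C(t).
# Exact solution: T2 = (t0-t) * (1 - 2^(-1/e)) ~ 0.225 * (t0-t);
# equation (5), T2 = log(2)/e * (t0-t) ~ 0.255 * (t0-t), is its
# linearization - both are "roughly (time to t0)/4".
def doubling_time(time_to_t0):
    return time_to_t0 * (1 - 2 ** (-1 / math.e))

print(doubling_time(4e9))    # first microbes, ~4 billion years before t0
print(doubling_time(10.0))   # a decade before t0: a couple of years
print(math.log(2) / math.e)  # approximate prefactor from equation (5)
```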

A discussion of the same growth laws, applied to *Moore's Law*, is
available here.

...and I guess I should mention that the above was heavily inspired by Hans Moravec's "singularity math" (here and here).