Iteration Asymptotics
17 Jun 2021 - Tags: sage, featured
I really like recurrences, and the kind of asymptotic analysis that shows up in combinatorics and computer science. I think I’m drawn to it because it melds something I enjoy (combinatorics and computer science) with something I historically struggle with (analysis).
My usual tool for handling recurrences (particularly for getting asymptotic information about their solutions) is generating functions. They slaughter linear recurrences (which nowadays I just solve with sage), but through functional equations, Lagrange inversion, and complex analysis, they form an extremely sophisticated theory¹. Plus, I would be lying if I didn’t say I was drawn to them because of how cool they are conceptually. I’m a sucker for applying one branch of math to another, so the combination of complex analysis and enumerative combinatorics is irresistible.
Unfortunately, you can’t solve all asymptotics problems with generating functions, and it’s good to have some other tools around as well. Today we’ll be working with the following question:
If $x_{n+1} = f(x_n)$, what can we say about the asymptotics of $x_n$?

where we allow $x_0$ to range over some domain of interest. If $f'(p) = 0$ at the fixed point $p = \lim_n x_n$, the convergence is extremely (superexponentially) fast, while if $|f'(p)| \geq 1$ we may not get convergence at all. These observations tell us we should restrict attention to those systems where $0 < |f'(p)| < 1$.²
As a case study, let’s take a simple problem I found on mse the other day. I can’t actually find the question I saw anymore, or else I’d link it. It looks like it’s been asked a few times now, but none of the versions I can find are recent enough to be the one I saw. Oh well.
Define $x_{n+1} = \frac{x_n + 3}{3x_n + 1}$, where $x_0$ is an arbitrary positive real. Show that $\lim_n x_n$ exists and compute it.
The original problem is fairly routine, and we can solve it using
cobweb diagrams. The asymptotics are more interesting, though.
It turns out we can read the asymptotics right off of $f(x) = \frac{x+3}{3x+1}$:

Notice $f(1) = 1$, so $1$ is a fixed point of our iteration.

Notice $f'(x) = \frac{-8}{(3x+1)^2}$, so $f'(1) = -\frac{1}{2}$ and the fixed point at $1$ is attracting.
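Before justifying any of this, here’s a quick numerical experiment (plain Python rather than sage, and the starting point $x_0 = 3$ is an arbitrary choice of mine) showing both facts in action: the iterates approach $1$, and the ratio of successive errors approaches $f'(1) = -\frac{1}{2}$.

```python
def f(x):
    return (x + 3) / (3*x + 1)

x = 3.0            # arbitrary positive starting point
prev_err = x - 1   # distance to the fixed point at 1
for n in range(1, 11):
    x = f(x)
    err = x - 1
    # err / prev_err should approach f'(1) = -1/2
    print(n, x, err / prev_err)
    prev_err = err
```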
Let’s get started!
Simple Asymptotics
We’ll be following de Bruijn’s Asymptotic Methods in Analysis in both
this section and the next. In the interest of showcasing how to actually
use these tools, I’m going to gloss over a lot of details. You can find
everything made precise in chapter 8.
First, if the fixed point of $f$ is at some $p \neq 0$, we can always translate it to $0$ by working with $f(x+p) - p$ instead. So let’s assume the fixed point is at $0$ and write $f(x) = a_1 x + a_2 x^2 + \cdots$, where $0 < |a_1| < 1$. Near $0$, each application of $f$ multiplies $x_n$ by roughly $a_1$, so we expect $x_n$ to look like $a_1^n$ times a constant depending on where we started.

But now we have our asymptotics! If we formalize all of the hand-waving above, we get the following:

For each $x_0$ sufficiently close to $0$, the limit $\omega = \lim_n \frac{x_n}{a_1^n}$ exists, and $x_n \sim \omega\, a_1^n$.

Since $0 < |a_1| < 1$, this says the $x_n$ converge to $0$ exponentially quickly.
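Here’s what this looks like concretely (a plain Python sketch; the map $f(x) = \frac{x}{2} + \frac{x^2}{4}$, with $a_1 = \frac{1}{2}$, and the starting point are my own choices for illustration): the quotients $x_n / a_1^n$ really do stabilize to a constant $\omega$ depending on $x_0$.

```python
def f(x):
    return x/2 + x**2/4   # fixed point 0, with a1 = f'(0) = 1/2

x = 0.1   # arbitrary small starting point
for n in range(1, 21):
    x = f(x)
    # x_n / a1^n should converge to a constant omega(x0)
    print(n, x / 0.5**n)
```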
Life Pro Tip: If you just need quick-and-dirty asymptotics for an iteration $x_{n+1} = f(x_n)$ converging to a fixed point $p$, read them off of the derivative: $x_n \approx p + C \cdot f'(p)^n$ for some constant $C$ depending on $x_0$.³
What about our concrete example? We know $f'(1) = -\frac{1}{2}$, so $x_n \approx 1 + C \left(-\frac{1}{2}\right)^n$: the iterates approach $1$ exponentially quickly, alternating over and under $1$ as they go.
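We can watch the constant $C$ emerge numerically too (again plain Python, again with an arbitrary starting point of my choosing):

```python
def f(x):
    return (x + 3) / (3*x + 1)

x = 3.0   # arbitrary positive starting point
for n in range(1, 21):
    x = f(x)
    # (x_n - 1) / (-1/2)^n should converge to the constant C
    print(n, (x - 1) / (-0.5)**n)
```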
Asymptotic Expansion
But what if you’re a real masochist, and the exponential convergence above isn’t enough for you? Well don’t worry, because de Bruijn has more to say.
If we want more terms, we need to understand the constant $\omega$, as a function of the starting point $x = x_0$, much better. In the proof that $\lim_n \frac{x_n}{a_1^n}$ exists, one shows that $\omega$ satisfies the functional equation⁴

$$\omega(f(x)) = a_1\, \omega(x), \qquad \omega'(0) = 1$$

and this equation lets us solve for the Taylor coefficients of $\omega$ one at a time. But now we can get a great approximation for $\omega(x_n) = a_1^n\, \omega(x_0)$, which we can then invert to find

$$x_n = \omega^{-1}\left(a_1^n\, \omega(x_0)\right).$$

If we use the first, say, $N$ terms of the power series for $\omega$ and $\omega^{-1}$, we get an asymptotic expansion for $x_n$ whose error shrinks exponentially in $n$.
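Since our running example is a Möbius transformation, this whole pipeline can actually be carried out in closed form; the computation below is my own addition, but it’s a nice way to see the machinery working. Translating the fixed point of $f(x) = \frac{x+3}{3x+1}$ to the origin gives $g(x) = f(x+1) - 1 = \frac{-2x}{3x+4}$, with $a_1 = -\frac{1}{2}$, and you can check that $\omega(x) = \frac{2x}{x+2}$ (normalized so $\omega'(0) = 1$) solves Schröder’s equation exactly:

$$\omega(g(x)) = \frac{2 \cdot \frac{-2x}{3x+4}}{\frac{-2x}{3x+4} + 2} = \frac{-4x}{4x+8} = -\frac{1}{2} \cdot \frac{2x}{x+2} = a_1\, \omega(x)$$

Inverting $y = \frac{2x}{x+2}$ gives $\omega^{-1}(y) = \frac{2y}{2-y}$, so $x_n = \omega^{-1}\left(\left(-\frac{1}{2}\right)^n \omega(x_0)\right)$ holds exactly here, with no error term at all. For maps that aren’t Möbius we won’t be so lucky, and we’ll truncate the series instead.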
How do we actually do this in practice, though? The answer is “with sage”!
This code will take a function $f$ with a fixed point at $0$ satisfying $0 < |f'(0)| < 1$, compute the first $N$ terms of an asymptotic expansion for $x_n$, and then run some statistics to show how well the approximation matches the true orbit.
Since I had to (rather brutally) convert back and forth between symbolic and ordinary power series, sage wasn’t able to keep track of the error term for us. Thankfully, it’s pretty easy to see that the error term is always $O\left(\left|a_1\right|^n x^N\right)$.
"""
This is super sloppy because sage actually has two
different kinds of power series:
- symbolic power series
- "ordinary" power series
the ordinary power series are better in almost every way, except they
don't allow variables inside them! Since we need variables to build the
recurrence that we solve for the next coefficient, this is a problem.
The only way I was able to get this working is by hacking back and
forth between the two types of power series. If anyone has a better way
to do this PLEASE let me know.
"""
def omega(f, x, N):
"""
Compute the first N terms of omega(x)
"""
# this is a symbolic power series
f = f.series(x,N)
a1 = f.coefficient(x,1)
# initialize omega (as a symbolic power series)
o = (0 + x).series(x,N)
d = var('d')
for j in range(2,N+1):
# set up the linear recurrence that defines the jth coefficient.
# if you didn't believe symbolic series are more cumbersome than
# "ordinary" series, hopefully you do now.
#
# this comes from looking at the jth coefficient of the equation
# omega(f(x)) == a1 * omega(x)
eqn = (o + d * x^j).subs(x=f).series(x,N).coefficient(x,j) == a1 * d
o = (o + solve(eqn, d)[0].rhs() * x^j).series(x,N)
# this is a symbolic power series
return o
def iterationAsymptotics(f,N=5):
"""
Compute the first N many terms of an asymptotic expansion for x_n
"""
n = var('n')
# for some reason extracting the coefficient gives us a constant function
# and it only breaks things here? Oh well, we'll evaluate it to make
# sage happy.
a1 = f.series(x,2).coefficient(x,1)()
# we convert o to an ordinary power series, since there's no way
# to do lagrange inversion to a symbolic power series
o = omega(f,x,N).power_series(QQbar)
O = o.reverse() # lagrange inversion
# dirty hack to convert back to a symbolic series
return O.truncate().subs(x=(a1^n * o.truncate())).expand().series(x,N)
def stats(f,n=10,N=5):
"""
Run 1000 tests to see how well the asymptotic expansion
agrees with the expected output.
"""
approx = iterationAsymptotics(f,N).truncate().subs(n=n)
tests = []
for _ in range(1000):
x0 = random()
# compute the exact value of xn
cur = x0
for _ in range(n):
cur = f(cur)
# compute the approximate value of xn
guess = approx.subs(x=x0)
tests += [cur - guess]
avg_diff = mean([abs(t) for t in tests])
max_diff = max([abs(t) for t in tests])
median_diff = median([abs(t) for t in tests])
show("maximum error: ", max_diff)
show("mean error: ", avg_diff)
show("median error: ", median_diff)
show(histogram(tests, bins=50, title="frequency of various signed errors (actual $-$ approximation)"))
show(html("Type in $f$ with fixed point $0$ and $0 < |f'| < 1$"))
def _(f=input_box(-2*x / (3*x + 4), width=20, label="$f$"),
n=input_box(100, width=20, label="$n$"),
N=input_box(3, width=20, label="$N$")):
f(x) = f
a1 = f.series(x,2).coefficient(x,1)()
# we have to show things in this weird way to get things on one line
# it's convenient, though, because it also lets us modify the latex
# to print the error bounds
show(html(f"$$f = {latex(f().series(x,N).power_series(QQbar))}$$"))
show(html(f"$$\\omega = {latex(omega(f,x,N).power_series(QQbar))}$$"))
series = f"x_n = {latex(iterationAsymptotics(f,N).truncate())}"
error = f"O \\left ( \\left ( {latex(abs(a1))} \\right )^n x^{N} \\right )"
show(html("$$" + series + " \\pm " + error + "$$"))
show("How good is this approximation?")
stats(f,n,N)
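If you’d like to sanity-check the `omega` function without sage, here’s a rough equivalent in plain Python with sympy (my own reimplementation, not part of the original cell). It solves $\omega(f(x)) = a_1 \omega(x)$ for the coefficients of $\omega$ one at a time, for the default map $f(x) = \frac{-2x}{3x+4}$:

```python
from sympy import symbols, solve, Rational

x, d = symbols('x d')
f = -2*x / (3*x + 4)
a1 = Rational(-1, 2)   # f'(0)
N = 5

omega = x   # normalized so omega'(0) = 1
for j in range(2, N + 1):
    trial = omega + d * x**j
    # jth coefficient of omega(f(x)), which must equal a1 * d
    lhs = trial.subs(x, f).series(x, 0, j + 1).removeO().expand().coeff(x, j)
    omega = omega + solve(lhs - a1 * d, d)[0] * x**j

print(omega)   # x - x**2/2 + x**3/4 - x**4/8 + x**5/16
```

This matches the power series of $\frac{2x}{x+2}$, the exact solution we found earlier.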
As a nice exercise, you might try to modify the above code to work with functions with a fixed point at some $p \neq 0$. Be careful, though! We get much more numerical precision near $0$ than we do elsewhere, so you’ll probably want to translate the fixed point to $0$, do the computation there, and translate back.
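One possible shape for that exercise (an untested sketch in sage syntax; the name `iteration_asymptotics_at` is hypothetical, not something defined above):

```python
# hypothetical sketch: conjugate by a translation, reuse the machinery at 0
def iteration_asymptotics_at(f, p, N=5):
    # g has its fixed point at 0, with g'(0) = f'(p)
    g(x) = f(x + p) - p
    # in the result, x stands for the translated start x0 - p
    return p + iterationAsymptotics(g, N)
```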
So, in our last moments together, let’s finish up that concrete example in far more detail than anyone ever wanted. The default function that I put in the code above is our function translated to put its fixed point at the origin:

$$g(x) = f(x+1) - 1 = \frac{-2x}{3x+4}$$

For the default $n = 100$, the error term is on the order of $2^{-100} \approx 10^{-30}$, so the actual answer is so close to our approximation that the difference is swamped by ordinary floating point error.
Of course, you should look at the statistics in the output of the code above to see how close we get for various choices of $n$ and $N$.
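And to close the loop on the original, untranslated recurrence (a plain Python check of my own; $x_0 = 0.3$ and the range of $n$ are arbitrary): using the exact $\omega$ from before, the first-order prediction $x_n \approx 1 + \omega(x_0 - 1)\left(-\frac{1}{2}\right)^n$ tracks the true orbit, with error shrinking like $4^{-n}$.

```python
def f(x):
    return (x + 3) / (3*x + 1)

def omega(y):
    return 2*y / (y + 2)   # exact Schroder solution for g(y) = -2y/(3y+4)

x0 = 0.3               # arbitrary positive starting point
w = omega(x0 - 1)      # the constant attached to x0
x = x0
for n in range(1, 16):
    x = f(x)
    approx = 1 + w * (-0.5)**n
    print(n, x, approx, x - approx)   # difference shrinks like 4^{-n}
```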
1. I know the basics, but there’s some real black magic people are able to do by considering what type of singularities your function has. This seems to be outlined in Flajolet and Sedgewick’s Analytic Combinatorics, but every time I’ve tried to read that book I’ve gotten quite lost quite quickly. I want to find some other references at some point, ideally at a slower pace, but if I never do I’ll just have to write up a post about it once I finally muster up the energy to really understand that book.
2. It turns out you can sometimes say things if $|f'(p)| = 1$. But convergence is slow (if you get it at all) and the entire discussion is a bit more delicate. You should see de Bruijn’s Asymptotic Methods in Analysis (chapter 8) for more details.
3. Notice these techniques can’t remove the error term entirely: the expansion is valid for each fixed number of terms $N$, but the series with infinitely many terms is not guaranteed to converge.
4. Which is apparently called Schröder’s equation.