Automatic Asymptotics with Sage
20 Jan 2022 - Tags: sage
Recurrence relations show up all the time in combinatorics and computer science, and even simple objects give recurrences that are difficult or impossible to solve. Thankfully, we can often find a generating function for our objects, and through the power of complex analysis and algebraic geometry, we can use the singularities of the generating function to get good asymptotic estimates. Software like Maple can automatically compute asymptotic expansions for a lot of generating functions using the asympt function… Can Sage?
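To make the "recurrence versus generating function" connection concrete, here's a small plain-Python sketch (no Sage required, and not part of the original post): it extracts the power series coefficients of z / (1 - z - z^2) by formal long division and checks that they satisfy the Fibonacci recurrence.

```python
from fractions import Fraction

def series_coeffs(p, q, n):
    """First n coefficients of p(z)/q(z), with p and q as coefficient lists."""
    p = p + [0] * (n - len(p))
    out = []
    for k in range(n):
        # c_k = (p_k - sum_{j=1}^{k} q_j * c_{k-j}) / q_0
        acc = Fraction(p[k])
        for j in range(1, min(k, len(q) - 1) + 1):
            acc -= Fraction(q[j]) * out[k - j]
        out.append(acc / q[0])
    return out

# z / (1 - z - z^2): numerator [0, 1], denominator [1, -1, -1]
coeffs = series_coeffs([0, 1], [1, -1, -1], 10)

# compare against the recurrence F(n) = F(n-1) + F(n-2)
fib = [0, 1]
while len(fib) < 10:
    fib.append(fib[-1] + fib[-2])

print([int(c) for c in coeffs])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```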
The answer is “yes”, which is why I’m writing this post, but the longer answer is “with some work”. Which is the real reason I’m writing this post. I had to slog through a lot of kind of crummy documentation to get this working, and I want to make sure that the next people looking into this have an easier time.
There are two modules (that I can find) which provide features for computing asymptotics:

- sage.rings.asymptotic.asymptotics_multivariate_generating_functions, and
- sage.rings.asymptotic.asymptotic_ring (home of AsymptoticRing).

These both have pros and cons:
The former is slightly easier to work with, and allows generating functions with multiple variables. The downside is that it only works for functions with polynomial denominators1.
The latter has nicer documentation, works for more general functions, and gives more detailed asymptotic expansions. The downside is that it only works for functions of a single variable, and it’s kind of annoying to use because it doesn’t accept symbolic inputs natively. You also have to manually provide a list of singularities2.
Let’s start with the multivariate case, since it’s more complicated. I’ll give a code dump, then explain what’s going on.
The tl;dr is that it works for generating functions of the form p/q, where the denominator q is a polynomial (the numerator p can be any symbolic expression).
```python
from sage.rings.asymptotic.asymptotics_multivariate_generating_functions \
    import FractionWithFactoredDenominatorRing

def multivar_asy(f, N=5, alpha=None, numeric=0):
    """
    Compute the first N terms of the asymptotic expansion of f.

    Setting numeric = n will give floats with n digits of precision.

    If
        f = sum_{ns} F_{ns} x^{ns}
    for ns = (n1,...,nk) a multi-index, then we compute an
    expansion for F_{r alpha} for alpha = (a1,...,ak) a given
    "direction" and r --> oo.

    By default, we assume alpha = [1,...,1], which reduces to what we
    almost certainly want in the one variable case.
    """
    fn = f.numerator()
    fd = f.denominator()
    R_internal = PolynomialRing(QQ, fd.variables())
    # FFPD is the ring of quotients p/q where p is in SR and q is in
    # R_internal. Rather confusingly, we put the denominator ring before
    # the numerator ring...
    FFPD = FractionWithFactoredDenominatorRing(R_internal, SR)
    # ...but when we make a new element of the ring, we put things in the
    # usual order. For some reason units in the denominator get clobbered,
    # so we manually add the unit to the numerator (this is consistent
    # with the examples in the docs).
    fdFactored = R_internal(fd).factor()
    f = FFPD(fn / fdFactored.unit(), fdFactored)
    # now we choose a "direction" alpha
    if alpha is None:
        alpha = [1] * f.dimension()
    decomp = f.asymptotic_decomposition(alpha)
    result = 0
    for part in decomp:
        # this is brittle, but makes things work
        if part == FFPD(0, []):
            continue
        # We need a minimal critical point for the denominator of part,
        # so let's find the critical points. First, the smooth points:
        I = part.smooth_critical_ideal(alpha)
        smoothSols = solve([SR(v) for v in I.gens()],
                           [SR(v) for v in R_internal.gens()],
                           solution_dict=True)
        # next, the singular points:
        J = part.singular_ideal()
        singSols = solve([SR(v) for v in J.gens()],
                         [SR(v) for v in R_internal.gens()],
                         solution_dict=True)
        s = smoothSols + singSols
        # Remove any varieties of dimension > 0 from the space of solutions,
        # i.e. any solutions involving a parameter. I don't know if this
        # will break things or not, but in my (limited) testing it seems
        # fine. If I were less lazy I would probably take the minimum value
        # across the whole variety? But doing it this way makes things
        # agree with the examples.
        sFiltered = [soln for soln in s
                     if all(v.is_constant() for v in soln.values())]
        # if we didn't find any solutions at all, give up
        if len(sFiltered) == 0:
            if len(s) != 0:
                print("We finally found something where removing the varieties caused problems")
            else:
                print("No critical points were found. Giving up.")
            return None
        # otherwise we get the _minimal_ singularity
        pMin = sFiltered[0]
        for p in sFiltered:
            if sum(xi^2 for xi in p.values()) < sum(yi^2 for yi in pMin.values()):
                pMin = p
        # and finally get the asymptotics
        (a, _, _) = part.asymptotics(pMin, alpha, N, numerical=numeric)
        result += a
    return result
```
This is pretty obviously cribbed from the examples here, but the basic idea is this:
We need the denominator of our generating function to be a factored polynomial, so we build a polynomial ring with the variables in the denominator and factor over that ring.
Then, we put this into FFPD to get an object that the module knows how to deal with.

We choose a multi-index alpha, which fixes the "direction" of the expansion: alpha = (1,...,1) will give us the asymptotics of F_{n,...,n} as n --> oo, alpha = (2,1) will give us the asymptotics of F_{2n,n} as n --> oo, etc. If we view the coefficients F_{ns} as indexed by lattice points, alpha tells us along which ray through the lattice we expand.
Next we look at the critical points of the denominator. That is, the points where its gradient is either undefined or vanishing. The former is the singular case, and the latter is the smooth case.
In the one variable case, we know the asymptotics are controlled by the singularity closest to the origin. In the multivariate case, from skimming these articles, we now have a whole algebraic variety of singularities, and for reasons the asymptotics are now governed by the critical points on this variety. In particular, we're on the hunt for the critical point which is closest to the origin (which we'll call the dominant singularity).
So now, what does the code do? Well, we look for the locations of the minimal smooth point and the minimal singular point. Then we take the smaller of the two and run the asymptotics function provided by the module.
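The "take the smaller one" step is just a minimum over candidate points by distance to the origin. As a standalone illustration (with made-up candidate points, not actual Sage solution dicts):

```python
# Each candidate critical point is a dict {variable: value}; we keep the
# one minimizing the sum of squared coordinates, i.e. closest to the origin.
def closest_to_origin(points):
    return min(points, key=lambda p: sum(v**2 for v in p.values()))

candidates = [{"x": 1.0, "y": 2.0}, {"x": 0.5, "y": 0.5}, {"x": -1.0, "y": 0.0}]
print(closest_to_origin(candidates))  # -> {'x': 0.5, 'y': 0.5}
```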
It seems to work quite well, and is surprisingly fast too. For a simple example, we can try

```python
multivar_asy(z/(1-z-z^2))
```

which outputs4 an asymptotic expansion of the coefficients. Of course, z/(1-z-z^2) is the generating function for the fibonacci numbers, so this recovers their well-known asymptotics.
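Since z/(1-z-z^2) generates the fibonacci numbers, here's a quick plain-Python sanity check (my own addition) of their well-known leading term phi^n / sqrt(5):

```python
import math

# F_n should be phi^n / sqrt(5) up to an exponentially small error
phi = (1 + math.sqrt(5)) / 2
fib = [0, 1]
for _ in range(59):
    fib.append(fib[-1] + fib[-2])

n = 60
approx = phi**n / math.sqrt(5)
rel_err = abs(fib[n] - approx) / fib[n]
print(rel_err)  # tiny: the error term decays like (1/phi)^n
```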
This really shines when it comes to multivariate series, though. For instance, we know that

1/(1 - x(1+y)) = sum_{n,k} binom(n,k) x^n y^k.

So if we're interested in the asymptotics of binom(3n,n), we can run

```python
multivar_asy((1-x*(1+y))^(-1), alpha=[3,1], N=3)
```

which quickly gives the first three terms of the expansion.
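For what it's worth, the leading-order behavior sqrt(3 / (4 pi n)) * (27/4)^n (a Stirling computation of mine, not output copied from the package) can be confirmed numerically in plain Python:

```python
import math

n = 300
exact = math.comb(3 * n, n)
# Stirling's formula gives binom(3n, n) ~ sqrt(3 / (4 pi n)) * (27/4)^n
approx = math.sqrt(3 / (4 * math.pi * n)) * (27 / 4)**n
print(approx / exact)  # -> close to 1, with an O(1/n) correction
```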
But what if we're interested in generating functions whose numerator isn't holomorphic, or whose denominator isn't a polynomial? It's clear that we should be interested in such things, since the famous catalan numbers already have the generating function (1 - sqrt(1 - 4z)) / (2z).
For cases like this (which must be of a single variable), we work with the second asymptotics package here.
This is nice because it returns an asymptotic expansion, which is quite intuitive to work with. The major downside, though, might be surprising:
It only works with (callable) python functions.
This is annoying5, but it really isn't that much of a hassle to rewrite your function using def. We also need to provide the dominant singularities by hand, which is also a bit of work6.
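As an aside, here's one crude way to guess a dominant singularity numerically (my own hack, not what the package does): for a series with positive coefficients, the radius of convergence is the limit of c_n / c_{n+1} when that limit exists. For the catalan numbers the ratio heads to 1/4:

```python
from fractions import Fraction

cat = [Fraction(1)]
for n in range(60):
    # catalan recurrence: C_{n+1} = C_n * 2(2n+1) / (n+2)
    cat.append(cat[-1] * 2 * (2 * n + 1) / (n + 2))

ratio = cat[-2] / cat[-1]
print(float(ratio))  # -> roughly 0.25, suggesting a singularity at z = 1/4
```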
If and when I get around to automatically finding singularities, I’ll update this code, but for now there’s much less boilerplate here than in the previous case:
```python
# interestingly, it seems like it doesn't matter at all
# what asymptotic ring you use
AsyRing = AsymptoticRing('QQ^n * n^QQ * log(n)^QQ', QQ)

def singlevar_asy(f, N=5, sings=None):
    """
    Compute the first N terms of the asymptotic expansion of f.

    sings is a list of dominant singularities.
    TODO: get these automatically somehow?
    """
    return AsyRing.coefficients_of_generating_function(f, sings, precision=N)
```
The documentation for this function is also quite good, so I’ll stick to one example. For the catalan numbers, we’ll have:
```python
def cat(z):
    return (1 - sqrt(1-4*z))/(2*z)
```
This looks like it has singularities at z = 0 and z = 1/4, but the singularity at z = 0 is removable, so the dominant singularity is z = 1/4. So we run

```python
singlevar_asy(cat, sings=[1/4])
```
which outputs an asymptotic expansion in n. A quick massage turns this into the (maybe not so) familiar

4^n / (sqrt(pi) n^(3/2))

which you can find as formula (20) here.
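The well-known catalan asymptotic 4^n / (sqrt(pi) n^(3/2)) is easy to double-check numerically in plain Python:

```python
import math

n = 400
catalan = math.comb(2 * n, n) // (n + 1)
# the leading term of the catalan asymptotics
approx = 4**n / (math.sqrt(math.pi) * n**1.5)
print(approx / catalan)  # -> close to 1, with an O(1/n) correction
```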
Lastly, we can combine these different asymptotic approximations into one function. In my init.sage file, which you can find here, I've defined the function

```python
def asy(f, *args, **kwargs):
    try:
        return multivar_asy(f, *args, **kwargs)
    except:
        return singlevar_asy(f, *args, **kwargs)
```
This is definitely too brittle to give to other people, but it works fine for my purposes. One day I'd like to make it a bit more robust (this is not how try/except statements are supposed to be used), and ideally make it a bit more automatic (in particular following the discussion of singlevar_asy above). Another option might be to try and get it working with the singularity_analysis function, which would require a certain amount of parsing…
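If I ever do harden it, the fix is probably to branch on the input's type instead of catching everything: singlevar_asy wants a plain callable, while multivar_asy wants a symbolic fraction. A rough standalone sketch of that idea (with stub functions standing in for the real ones):

```python
def multivar_asy_stub(f, *args, **kwargs):
    # placeholder for the real multivar_asy
    return ("multivar", f)

def singlevar_asy_stub(f, *args, **kwargs):
    # placeholder for the real singlevar_asy
    return ("singlevar", f)

def asy(f, *args, **kwargs):
    # plain python functions (what singlevar_asy wants) are callable and
    # don't carry a .denominator, unlike symbolic fractions
    if callable(f) and not hasattr(f, "denominator"):
        return singlevar_asy_stub(f, *args, **kwargs)
    return multivar_asy_stub(f, *args, **kwargs)

print(asy(lambda z: z)[0])  # -> singlevar
```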
Anyways, as it stands it's a plenty serviceable substitute for Maple's asympt function, and I'm glad I took the time to code it up.
If anyone has any other ideas for making this more usable, or encounters any problems, definitely let me know!
If not, take care and stay warm!
I’ll see you all in the next one ^_^.
-
You can read about these algorithms and their proofs. There's also Pemantle and Wilson's Analytic Combinatorics in Several Variables or Melczer's An Invitation to Analytic Combinatorics, which both look quite good. Unfortunately I haven't had time to read either. ↩
-
I’m curious if I can automate the search for these singularities… I feel like it shouldn’t be too hard, but I need to think about it. ↩
-
As an aside, I keep reading this as “asymptomatic” out of the corner of my eye, which definitely tells you something about my current mental state… ↩
-
Sorry I’m not making these evaluate inline like I normally do. You’ll have to trust me (or check yourself!) that the computation really is quite fast.
The issue is that I don't want to copy/paste the definition of multivar_asy into each cell. I think that clutters things up. I could use linked sage-cells, but linked cells have to be evaluated in the right order, and aren't allowed to be auto-evaluated. I think that's a bit user-unfriendly, so I've decided to just show the inputs/outputs (which is what really matters in any case). ↩
-
And part of me wants to go in and try to fix it ↩
-
This is something I really think should be automate-able, though, and I’ll probably fight with it some more when I have the time. ↩