Hi everyone!
Two problems were recently added to Library Checker:
- Multipoint Evaluation (Geometric Sequence). Given $$$f(x)$$$, find $$$f(ar^i)$$$ for the given $$$a, r$$$ and $$$0 \leq i < m$$$.
- Polynomial Interpolation (Geometric Sequence). Given $$$f(ar^i) = y_i$$$ for $$$0 \leq i \leq n$$$, find $$$f(x)$$$ of degree at most $$$n$$$.
Note: We can divide or multiply the $$$k$$$-th coefficient of the initial/resulting polynomial by $$$a^k$$$, so we may assume $$$a=1$$$.
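For concreteness, here is a minimal sketch of this reduction, assuming a prime modulus $$$998244353$$$; the helper names `mpow` and `substitute_ax` are my own, not from any particular library:

```cpp
// Reduction to a = 1: evaluating f at a*r^i is the same as evaluating
// g(x) = f(a*x) at r^i, and the k-th coefficient of g is f_k * a^k.
#include <cstdint>
#include <vector>

const int64_t MOD = 998244353; // assuming a prime, NTT-friendly modulus

int64_t mpow(int64_t b, int64_t e) { // b^e mod MOD by binary exponentiation
    int64_t r = 1;
    for (b %= MOD; e; e >>= 1, b = b * b % MOD)
        if (e & 1) r = r * b % MOD;
    return r;
}

// Multiply the k-th coefficient by a^k. For interpolation, apply it with
// a^{-1} = mpow(a, MOD - 2) to the resulting polynomial instead.
std::vector<int64_t> substitute_ax(std::vector<int64_t> f, int64_t a) {
    int64_t ak = 1;
    for (auto &c : f) { c = c * ak % MOD; ak = ak * a % MOD; }
    return f;
}
```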
Today we'll learn how to solve both of them in $$$O(n \log n)$$$.
Evaluation at $$$r^k$$$
Well, the first one is also known as the chirp Z-transform, and it is by now standard (see here or here). It uses the fact that

$$$$$$ik = \binom{i+k}{2} - \binom{i}{2} - \binom{k}{2},$$$$$$

which is also important e.g. for graph generating functions. Using this identity, we can represent

$$$$$$r^{\binom{k}{2}} f(r^k) = \sum\limits_{i=0}^{n} f_i r^{ik + \binom{k}{2}} = \sum\limits_{i=0}^{n} \left(f_i r^{-\binom{i}{2}}\right) r^{\binom{i+k}{2}}.$$$$$$
In other words, the sequence $$$c_k = r^{\binom{k}{2}}f(r^k)$$$ is the cross-correlation of the sequences $$$a_i = f_i r^{-\binom{i}{2}}$$$ and $$$b_j = r^{\binom{j}{2}}$$$.
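To make this concrete, here is a naive $$$O(nm)$$$ sketch of that cross-correlation for sanity-checking, reusing `MOD` and `mpow` from the sketch above; in the actual algorithm, the inner loop becomes a single FFT/NTT multiplication after reversing one of the sequences. The name `chirp_z` is my own:

```cpp
// Evaluate f at r^0, r^1, ..., r^{m-1} via c_k = sum_i a_i * b_{i+k}.
// binom(k,2) is reduced modulo MOD-1 (Fermat's little theorem), since it
// only ever appears as an exponent of r.
std::vector<int64_t> chirp_z(const std::vector<int64_t>& f, int64_t r, int m) {
    int n = (int)f.size();
    auto C2 = [](int64_t k) { return k * (k - 1) / 2 % (MOD - 1); };
    std::vector<int64_t> a(n), b(n + m);
    for (int i = 0; i < n; i++) // a_i = f_i * r^{-binom(i,2)}
        a[i] = f[i] * mpow(mpow(r, C2(i)), MOD - 2) % MOD;
    for (int j = 0; j < n + m; j++) // b_j = r^{binom(j,2)}
        b[j] = mpow(r, C2(j));
    std::vector<int64_t> res(m);
    for (int k = 0; k < m; k++) {
        int64_t c = 0; // c_k = r^{binom(k,2)} * f(r^k)
        for (int i = 0; i < n; i++) c = (c + a[i] * b[i + k]) % MOD;
        res[k] = c * mpow(mpow(r, C2(k)), MOD - 2) % MOD; // divide off r^{binom(k,2)}
    }
    return res;
}
```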
This approach, commonly known as Bluestein's algorithm, allows us to solve the first task in $$$O(n \log n)$$$. But what about the second?
Interpolation at $$$r^k$$$
It gets suspiciously difficult at this point! Naturally, we may try to invert the Bluestein algorithm, but we will fail. Why? Well, if we know $$$f(r^k)$$$, we can recover $$$c_k$$$ directly. We also know the values of $$$b_j$$$. So, it shouldn't be difficult to recover $$$a_i$$$, right? Let's write it as a linear system:

$$$$$$\begin{pmatrix} b_0 & b_1 & \dots & b_n \\ b_1 & b_2 & \dots & b_{n+1} \\ \vdots & \vdots & \ddots & \vdots \\ b_n & b_{n+1} & \dots & b_{2n} \end{pmatrix} \begin{pmatrix} a_0 \\ a_1 \\ \vdots \\ a_n \end{pmatrix} = \begin{pmatrix} c_0 \\ c_1 \\ \vdots \\ c_n \end{pmatrix}.$$$$$$
It would be natural to be able to invert a cross-correlation in a similar manner to how we can invert a regular convolution. Well, it doesn't work this way. Why? Let's look at the system for the convolution of $$$a_i$$$ and $$$b_j$$$ instead of the cross-correlation:

$$$$$$\begin{pmatrix} b_0 & 0 & \dots & 0 \\ b_1 & b_0 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ b_n & b_{n-1} & \dots & b_0 \end{pmatrix} \begin{pmatrix} a_0 \\ a_1 \\ \vdots \\ a_n \end{pmatrix} = \begin{pmatrix} c_0 \\ c_1 \\ \vdots \\ c_n \end{pmatrix}.$$$$$$
Notice anything? Yep. When we invert a convolution, we implicitly find the inverse of a lower-triangular matrix. But with a cross-correlation, this is no longer the case. So, what matrix do we actually work with? Generally, matrices like the one we have for cross-correlation are known as Hankel matrices or, if we reverse the rows, as Toeplitz matrices. From their definition, it seems that inverting a cross-correlation is, in the general case, equivalent to multiplying a vector by the inverse of a general Hankel/Toeplitz matrix.
Okay. How fast can we do so in the general case? Wikipedia points to so-called Levinson recursion, which runs in $$$O(n^2)$$$, and trying to find anything faster doesn't yield anything fruitful enough: the fastest known algorithms seem to all be $$$O(n \log^2 n)$$$ and based on divide and conquer, similar to the general solution for polynomial interpolation. If we want an $$$O(n \log n)$$$ algorithm, we must do something beyond inverting the general cross-correlation from Bluestein's algorithm, and we must use the fact that we're working with a geometric progression.
Lagrange interpolation
Let's recall how we could interpolate a polynomial from its values $$$y_i = f(x_i)$$$ with Lagrange interpolation:

$$$$$$f(x) = \sum\limits_{i=0}^{n} y_i \prod\limits_{j \neq i} \frac{x - x_j}{x_i - x_j}.$$$$$$

For the geometric progression $$$1, z, z^2, \dots, z^n$$$, we have the special case $$$x_i = z^i$$$, which allows us to greatly simplify this expression. First of all, we should notice that now we're able to compute the denominators near all $$$y_i$$$ in $$$O(n)$$$ multiplications by representing them as

$$$$$$\prod\limits_{j \neq i} (z^i - z^j) = z^{ni} \prod\limits_{k=1}^{i} \left(1 - z^{-k}\right) \prod\limits_{k=1}^{n-i} \left(1 - z^k\right).$$$$$$

So, we can compute the denominator near $$$y_i$$$ as $$$z^{ni}$$$, times a prefix product of $$$1-z^{-k}$$$, times a prefix product of $$$1-z^k$$$. Now let $$$y'_i$$$ be $$$y_i$$$, divided by the corresponding denominator. We can find all $$$y'_i$$$ in $$$O(n \log n)$$$ with the decomposition above.
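Here is a sketch of this step, continuing with the helpers above; the name `scale_by_denominators` is mine. Batch inversion would remove the extra $$$\log$$$ coming from the per-element modular inverses, but this is enough to illustrate the decomposition:

```cpp
// y'_i = y_i / (z^{ni} * prod_{k=1..i} (1 - z^{-k}) * prod_{k=1..n-i} (1 - z^k))
std::vector<int64_t> scale_by_denominators(const std::vector<int64_t>& y, int64_t z) {
    int n = (int)y.size() - 1;
    std::vector<int64_t> pneg(n + 1, 1), ppos(n + 1, 1); // the two prefix products
    int64_t zinv = mpow(z, MOD - 2), zp = 1, zn = 1;
    for (int k = 1; k <= n; k++) {
        zp = zp * z % MOD;  zn = zn * zinv % MOD;
        pneg[k] = pneg[k - 1] * ((1 - zn + MOD) % MOD) % MOD; // prod of (1 - z^{-k})
        ppos[k] = ppos[k - 1] * ((1 - zp + MOD) % MOD) % MOD; // prod of (1 - z^{+k})
    }
    std::vector<int64_t> res(n + 1);
    for (int i = 0; i <= n; i++) {
        int64_t den = mpow(z, (int64_t)n * i % (MOD - 1)) * pneg[i] % MOD * ppos[n - i] % MOD;
        res[i] = y[i] * mpow(den, MOD - 2) % MOD; // y'_i = y_i / den
    }
    return res;
}
```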
After that, what remains is to compute the remaining sum, which looks like

$$$$$$f(x) = \sum\limits_{i=0}^{n} y'_i \prod\limits_{j \neq i} (x - z^j).$$$$$$
Partial fraction decomposition
Let's divide each side by $$$Q_n(x) = \prod\limits_{j=0}^n (x-z^j)$$$, then the expression rewrites as

$$$$$$\frac{f(x)}{Q_n(x)} = \sum\limits_{i=0}^{n} \frac{y'_i}{x - z^i}.$$$$$$
Hmm, so, if we could find the right-hand side as a formal power series, we might recover $$$f(x)$$$ by multiplying the RHS by $$$Q_n(x)$$$ and then keeping only the first $$$n+1$$$ coefficients (a quick $$$n=1$$$ sanity check follows the list below). Of course, two questions arise immediately when we think about it:
- How to find the first $$$n+1$$$ coefficients of the right-hand side?
- How to compute $$$Q_n(x)$$$ quickly enough?
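As a quick sanity check, take $$$n = 1$$$ with the points $$$1$$$ and $$$z$$$. The denominators are $$$1-z$$$ and $$$z-1$$$, so

$$$$$$\frac{f(x)}{(x-1)(x-z)} = \frac{y_0/(1-z)}{x-1} + \frac{y_1/(z-1)}{x-z},$$$$$$

and multiplying back by $$$Q_1(x) = (x-1)(x-z)$$$ recovers exactly the two-point Lagrange formula $$$f(x) = y_0\frac{x-z}{1-z} + y_1\frac{x-1}{z-1}$$$.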
How to expand the right-hand side?
To answer both questions, let's consider the polynomial reversal

$$$$$$Q_n^R(x) = x^{n+1} Q_n\left(\frac{1}{x}\right) = \prod\limits_{j=0}^{n} \left(1 - z^j x\right),$$$$$$

and similarly $$$f^R(x) = x^n f\left(\frac{1}{x}\right)$$$, which simply reverses the order of coefficients, as $$$\deg f = n$$$.
With this, the expression above rewrites as

$$$$$$\frac{f^R(x)}{Q_n^R(x)} = \sum\limits_{i=0}^{n} \frac{y'_i}{1 - z^i x}.$$$$$$
That's much better! Now we can expand the right-hand side with $$$\frac{1}{1-x} = 1 + x + x^2 + x^3 + \dots$$$, and get

$$$$$$\frac{f^R(x)}{Q_n^R(x)} = \sum\limits_{i=0}^{n} y'_i \sum\limits_{k=0}^{\infty} z^{ik} x^k = \sum\limits_{k=0}^{\infty} \left( \sum\limits_{j=0}^{n} y'_j z^{jk} \right) x^k.$$$$$$
Huh?! The coefficient near $$$x^i$$$ in this expression is the value of $$$g(x) = \sum\limits_{j=0}^n y'_j x^j$$$ at the point $$$x=z^i$$$!
So, we can find the first $$$n+1$$$ coefficients of the right-hand side via a regular chirp Z-transform in $$$O(n \log n)$$$!
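In terms of the naive sketches above, this step is literally a second call to the same evaluation routine, given the values $$$y_i$$$, the ratio $$$z$$$ and $$$n$$$:

```cpp
// First n+1 coefficients of the right-hand side: g evaluated at 1, z, ..., z^n.
std::vector<int64_t> yprime = scale_by_denominators(y, z);
std::vector<int64_t> rhs = chirp_z(yprime, z, n + 1);
```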
How to compute $$$Q_n(x)$$$ quickly?
And what about $$$Q_n^R(x)$$$? Turns out, we can also find it in $$$O(n \log n)$$$! For this, we notice that

$$$$$$Q_n^R(x) = Q_{t-1}^R(x) \cdot Q_{n-t}^R(z^t x)$$$$$$

for any $$$t$$$. If we know $$$Q_{n-t}^R(x)$$$, finding $$$Q_{n-t}^R(z^t x)$$$ can be done by multiplying the $$$k$$$-th coefficient of $$$Q_{n-t}^R(x)$$$ with $$$z^{tk}$$$. From this, we notice that we can compute $$$Q_n^R(x)$$$ with a process similar to binary exponentiation by picking $$$t = \lceil n/2 \rceil$$$: then $$$Q_{t-1}^R$$$ and $$$Q_{n-t}^R$$$ differ by at most one linear factor, so a single recursive call of half the size suffices. The procedure then works in

$$$$$$T(n) = T(n/2) + O(n \log n) = O(n \log n).$$$$$$
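Below is a sketch of this process, continuing with the helpers above and with a naive quadratic multiplication standing in for NTT; `mul` and `QR` are my names. The `if` branch is exactly the "at most one extra factor" observation that keeps the recursion down to a single call:

```cpp
// Naive polynomial multiplication; the real solution uses NTT here.
std::vector<int64_t> mul(const std::vector<int64_t>& p, const std::vector<int64_t>& q) {
    std::vector<int64_t> r(p.size() + q.size() - 1, 0);
    for (size_t i = 0; i < p.size(); i++)
        for (size_t j = 0; j < q.size(); j++)
            r[i + j] = (r[i + j] + p[i] * q[j]) % MOD;
    return r;
}

// Q_n^R(x) = prod_{j=0..n} (1 - z^j x) via the identity above, t = ceil(n/2).
std::vector<int64_t> QR(int n, int64_t z) {
    if (n == 0) return {1, MOD - 1};             // base case: 1 - x
    int t = (n + 1) / 2;
    std::vector<int64_t> low = QR(t - 1, z);     // Q_{t-1}^R(x), single recursive call
    std::vector<int64_t> high = low;             // Q_{n-t}^R(x): same, or one factor more
    if (n - t == t) high = mul(high, {1, MOD - mpow(z, t)}); // times (1 - z^t x)
    int64_t zt = mpow(z, t), ztk = 1;            // substitute x -> z^t x:
    for (auto &c : high) { c = c * ztk % MOD; ztk = ztk * zt % MOD; }
    return mul(low, high);
}
```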
Altogether, this solves the whole interpolation problem in $$$O(n \log n)$$$. Please refer to my submission on Library Checker for the implementation.
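For completeness, here is how the naive sketches above chain together; this is a quadratic reference for verification only, not the $$$O(n \log n)$$$ solution itself, which replaces every quadratic loop with NTT-based convolution:

```cpp
// Given y_i = f(z^i) for 0 <= i <= n, recover the coefficients of f.
std::vector<int64_t> interpolate(const std::vector<int64_t>& y, int64_t z) {
    int n = (int)y.size() - 1;
    std::vector<int64_t> yp  = scale_by_denominators(y, z); // y'_i
    std::vector<int64_t> rhs = chirp_z(yp, z, n + 1);       // RHS mod x^{n+1}
    std::vector<int64_t> fr  = mul(QR(n, z), rhs);          // f^R(x) = Q_n^R(x) * RHS,
    fr.resize(n + 1);                                       // ... taken mod x^{n+1}
    return {fr.rbegin(), fr.rend()};                        // reverse f^R back to f
}
```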