Hi everyone!
This time I'd like to write about what's widely known as the "Aliens trick" (it got this name after it was popularized by the IOI 2016 problem Aliens). There are already some articles about it here and there, and I'd like to summarize them.
Lagrange multipliers (continuous case)
Let $$$f : \mathbb R^n \to \mathbb R$$$ be the objective function and $$$g : \mathbb R^n \to \mathbb R^c$$$ be the constraint function. The constrained optimization problem

$$$$$$f(x) \to \max, \quad g(x) = 0$$$$$$

in some cases can be reduced to finding stationary points of the Lagrange function

$$$$$$L(x, \lambda) = f(x) - \lambda \cdot g(x).$$$$$$
Here $$$\lambda \cdot g(x)$$$ is the dot product of a variable vector $$$\lambda$$$, called the Lagrange multiplier, and $$$g(x)$$$. If $$$f$$$ and $$$g$$$ are continuously differentiable, for any solution $$$x^*$$$ of the initial problem there exists $$$\lambda^*$$$ such that $$$(x^*, \lambda^*)$$$ is a stationary point of $$$L(x, \lambda)$$$.
Geometrically, the idea behind Lagrange multipliers is that at the extreme point $$$x^*$$$, the tangent space of the constraint surface $$$g(x)=0$$$ should lie completely within the tangent space of the contour surface $$$f(x)=f(x^*)$$$. See the illustration below for the 2D case.
For linear spaces, $$$A \subset B$$$ if and only if $$$B^\bot \subset A^\bot$$$, where $$$A^\bot$$$ is the orthogonal complement of $$$A$$$. The orthogonal complement of a tangent space is the corresponding normal space. For the surface $$$f(x)=f(x^*)$$$, the normal space is the line spanned by the gradient vector $$$\nabla f$$$. And for the surface $$$g(x)=0$$$, the normal space is the linear span of the vectors $$$\nabla g_1 (x), \dots, \nabla g_c(x)$$$.
Thus, for the point $$$x^*$$$ to be a solution of the constrained problem, $$$\nabla f(x^*)$$$ must be a linear combination of $$$\nabla g_1(x^*), \dots, \nabla g_c(x^*)$$$. This, in turn, is exactly the stationary point condition of $$$L(x, \lambda)$$$: the vector $$$\lambda^*$$$ consists of the coefficients of this linear combination.
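As a small sanity check (this worked example is mine, not part of the original exposition), consider maximizing $$$f(x, y) = x + y$$$ subject to $$$g(x, y) = x^2 + y^2 - 1 = 0$$$:

```latex
% Lagrange function for f(x,y) = x + y with constraint x^2 + y^2 = 1:
L(x, y, \lambda) = x + y - \lambda (x^2 + y^2 - 1)
% Stationary point conditions:
\frac{\partial L}{\partial x} = 1 - 2\lambda x = 0, \qquad
\frac{\partial L}{\partial y} = 1 - 2\lambda y = 0, \qquad
\frac{\partial L}{\partial \lambda} = -(x^2 + y^2 - 1) = 0
% The first two equations give x = y = \frac{1}{2\lambda};
% substituting into the constraint:
2 \cdot \frac{1}{4\lambda^2} = 1 \implies \lambda = \pm\frac{1}{\sqrt 2}
% The maximum is attained at x = y = \frac{1}{\sqrt 2}, where
% \nabla f = (1, 1) is indeed proportional to \nabla g = (2x, 2y) = (\sqrt 2, \sqrt 2).
```

Note how $$$\lambda$$$ is exactly the coefficient expressing $$$\nabla f$$$ through $$$\nabla g$$$ at the optimum.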
Example problem
You're given an array $$$a_1, \dots, a_n$$$. You have to split it into $$$k$$$ contiguous segments such that the sum of

$$$$$$(a_l + a_{l+1} + \dots + a_r)^2$$$$$$

over all segments in the partition is maximized.
Let's formally state it as a constrained optimization problem. Let $$$X$$$ be the set of all partitions of $$$1..n$$$ into non-intersecting contiguous segments, and let $$$c(x)$$$ be the number of segments in a partition $$$x \in X$$$:

$$$$$$f(x) = \sum\limits_{[l, r] \in x} (a_l + \dots + a_r)^2 \to \max, \quad g(x) = c(x) - k = 0.$$$$$$

Following the continuous case, consider the function

$$$$$$L_\lambda(x) = f(x) - \lambda \cdot g(x) = \sum\limits_{[l, r] \in x} \left[(a_l + \dots + a_r)^2 - \lambda\right] + \lambda k.$$$$$$
This function is somewhat better because, for a fixed $$$\lambda$$$, the optimum $$$L_\lambda(r)$$$ over all partitions of the prefix $$$1..r$$$ (ignoring the constant term $$$\lambda k$$$) satisfies a simple dynamic programming recurrence:

$$$$$$L_\lambda(r) = \max\limits_{l \le r} \left[L_\lambda(l-1) + (a_l + \dots + a_r)^2 - \lambda\right].$$$$$$
If we introduce the prefix sum array $$$p_m = a_1 + \dots + a_m$$$, it can be rewritten as

$$$$$$L_\lambda(r) = \max\limits_{l \le r} \left[L_\lambda(l-1) + (p_r - p_{l-1})^2\right] - \lambda.$$$$$$
Thus, for this problem, $$$L_\lambda(r)$$$ can be computed in $$$O(n)$$$ for all $$$r$$$: after expanding the square, the inner maximization is over linear functions of $$$p_r$$$, which is the classical convex hull trick setup. Note that $$$\lambda$$$ influences how many segments are actually taken in the optimal answer. With a bigger $$$\lambda$$$, it is beneficial to take a smaller number of segments, and with a smaller $$$\lambda$$$ it is better to take a bigger number of segments. What would be the actual meaning of $$$\lambda$$$ here?
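To make the recurrence concrete, here is a minimal sketch of the fixed-$$$\lambda$$$ computation (the function name is mine; for clarity it uses the straightforward $$$O(n^2)$$$ form of the recurrence rather than the convex hull trick speedup):

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// For a fixed lambda, compute the maximum over all partitions of a[0..n) of
//   sum of (segment sum)^2  -  lambda * (number of segments),
// together with the number of segments in the optimum.
// This is the recurrence L_lambda(r) = max over l <= r of
//   L_lambda(l-1) + (p[r] - p[l-1])^2 - lambda, implemented in O(n^2).
std::pair<double, int> solve_fixed_lambda(const std::vector<int64_t>& a, double lambda) {
    int n = a.size();
    std::vector<int64_t> p(n + 1, 0);  // prefix sums: p[m] = a[0] + ... + a[m-1]
    for (int i = 0; i < n; i++) p[i + 1] = p[i] + a[i];

    std::vector<double> dp(n + 1, 0);
    std::vector<int> cnt(n + 1, 0);    // segments used by the optimum for prefix r
    for (int r = 1; r <= n; r++) {
        dp[r] = -1e18;
        for (int l = 1; l <= r; l++) {  // the last segment of the partition is [l, r]
            double s = double(p[r] - p[l - 1]);
            double cand = dp[l - 1] + s * s - lambda;
            if (cand > dp[r]) {
                dp[r] = cand;
                cnt[r] = cnt[l - 1] + 1;
            }
        }
    }
    return {dp[n], cnt[n]};
}
```

Running it on a small array for increasing $$$\lambda$$$ shows the behavior described above: each extra segment costs $$$\lambda$$$ more, so the optimal segment count only goes down as $$$\lambda$$$ grows.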
Lagrange multipliers (discrete case)
In the previous section we had an assumption that $$$f(x)$$$ and $$$g(x)$$$ are continuously differentiable. How do we transfer the result to the discrete case, where $$$x$$$ ranges over a finite set, as in the example problem above?
References
- The Trick From Aliens — Serbanology
- My Take on Aliens' Trick — Mamnoon Siam's Blog
- Comment on Codeforces by _h_
- Part of the article was once revealed to me in a dream