I saw a blog (thanks to oversolver for finding it!) earlier today asking how to solve the following problem:
Given a linear function $$$ax + b$$$, find the minimum $$$x \geq 0$$$ such that $$$ax + b \in [L, R]\ (\text{mod } M)$$$.
To solve that problem, we can make the following reduction. If $$$\gcd(a, M) > 1$$$, only values congruent to $$$b$$$ modulo the GCD are ever attained, so we restrict $$$[L, R]$$$ to those values and divide everything by the GCD; assume from now on that $$$\gcd(a, M) = 1$$$. The $$$x$$$ for which $$$ax + b \equiv L\ (\text{mod } M)$$$ is $$$(L - b) a^{-1} \bmod M$$$. Denote this value by $$$b_0$$$. Then the minimum $$$x$$$ that reaches $$$L + y$$$ is $$$(a^{-1} y + b_0) \bmod M$$$. This gives us a reduced problem:
Find the minimum value of $$$(ay + b) \bmod M$$$ over $$$0 \leq y \leq k$$$.
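For concreteness, here is a minimal sketch of the reduction in code, assuming $$$\gcd(a, M) = 1$$$ (so the GCD step is skipped), $$$0 \leq a, b < M$$$ and $$$0 \leq L \leq R < M$$$. The names `mod_inverse`, `first_x_in_range` and `min_linear_mod` are mine; `min_linear_mod` solves the reduced problem and is sketched after the algorithm description below.

```cpp
typedef long long ll;

ll min_linear_mod(ll a, ll b, ll M, ll k); // reduced-problem solver, sketched below

// a^{-1} mod M via the extended Euclidean algorithm; assumes gcd(a, M) == 1.
ll mod_inverse(ll a, ll M) {
    ll old_r = a % M, r = M, old_s = 1, s = 0;
    while (r != 0) {
        ll q = old_r / r;
        ll t = old_r - q * r; old_r = r; r = t;
        t = old_s - q * s; old_s = s; s = t;
    }
    return ((old_s % M) + M) % M;
}

// Minimum x >= 0 with (a*x + b) mod M in [L, R].
ll first_x_in_range(ll a, ll b, ll M, ll L, ll R) {
    ll ainv = mod_inverse(a, M);
    ll b0 = (__int128)(((L - b) % M + M) % M) * ainv % M; // the x that lands exactly on L
    // The x that lands on L + y is (ainv * y + b0) mod M; minimize it over 0 <= y <= R - L.
    return min_linear_mod(ainv, b0, M, R - L);
}
```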
This seemed pretty hard, but surprisingly I figured out how to do it in $$$\mathcal{O}(\log M)$$$ time! The algorithm is as follows:
We consider the first value $$$s$$$ in $$$[0, a)$$$ achieved by $$$ay + b$$$. If $$$b < a$$$, it is $$$b$$$ itself; otherwise it is $$$(b - M) \bmod a$$$, reached right after the first wraparound. We check in $$$\mathcal{O}(1)$$$ whether we reach $$$s$$$ for some $$$y \leq k$$$. If we do, set $$$M$$$ to $$$a$$$, $$$a$$$ to $$$(-M) \bmod a$$$ and $$$b$$$ to $$$s$$$ (all computed from the old values of $$$a$$$ and $$$M$$$): the values we hit in $$$[0, a)$$$ start at $$$s$$$ and successively increase by $$$(-M) \bmod a$$$ modulo $$$a$$$, so this is the same kind of problem. We also shrink $$$k$$$ to the number of further times the sequence lands in $$$[0, a)$$$ within the first $$$k$$$ steps, which is again an $$$\mathcal{O}(1)$$$ computation (see the sketch below). Otherwise, we output $$$b$$$: the values in $$$[0, a)$$$ are the only values that are smaller than their predecessor, so if we never attain one, the sequence never decreases and the first value $$$b$$$ is the smallest we ever see.
Due to the way we set $$$a$$$ and $$$M$$$ at every step, we do at most $$$\mathcal{O}(\log M)$$$ steps, and every step takes $$$\mathcal{O}(1)$$$ time.
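Here is a sketch of how I would implement this, assuming all inputs are nonnegative and $$$M \geq 1$$$. The one detail the description above glosses over is the exact formula for the new $$$k$$$: the sequence lands in $$$[0, a)$$$ once per wraparound, and there are $$$\lfloor (ak + b)/M \rfloor$$$ wraparounds for $$$y \leq k$$$, so that is what the code uses (this bookkeeping is my own filling-in, not something spelled out in the original post).

```cpp
typedef long long ll;

// Minimum of (a*y + b) mod M over 0 <= y <= k.
ll min_linear_mod(ll a, ll b, ll M, ll k) {
    a %= M; b %= M;
    while (a > 0) {
        // wraps = number of times a*y + b crosses a multiple of M for 1 <= y <= k;
        // every crossing drops the value into [0, a).
        ll wraps = (ll)(((__int128)a * k + b) / M);
        ll s, newk;
        if (b < a) {
            s = b;                        // b itself lies in [0, a), reached at y = 0
            newk = wraps;                 // plus one hit per later wraparound
        } else {
            ll y0 = (M - b + a - 1) / a;  // first wraparound happens at y0 = ceil((M - b) / a)
            if (y0 > k) return b;         // never wrap: the sequence only increases, b is minimal
            s = (b % a + a - M % a) % a;  // s = (b - M) mod a
            newk = wraps - 1;             // reaching s consumed one wraparound
        }
        // The hits of [0, a) are (s + j * ((-M) mod a)) mod a for j = 0..newk:
        // the same kind of problem with modulus a.
        ll newa = (a - M % a) % a;        // (-M) mod a
        b = s; k = newk; M = a; a = newa;
    }
    return b;                             // a == 0: the remaining values are all equal to b
}
```

I wrote it iteratively rather than recursively so the distinction between the old and new $$$a$$$ and $$$M$$$ stays explicit.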
What I wanted to ask is: is this a known problem, and is there a simpler or perhaps even faster solution to it? The problem seems fairly simple, so I doubt nobody has thought about it before.
EDIT: turns out the other case handled in my original solution was pointless >_>. I removed it, much simplifying the explanation and code :)