idea & solution: xcyle
Consider the case with $$$n=m=k$$$.
Generalize the solution for all $$$n,m,k$$$.
It can be shown that within any $$$k\times k$$$ subgrid, the colors used must be pairwise distinct. Thus, we have a lower bound of $$$\min(n,k)\cdot\min(m,k)$$$ colors. This lower bound is achievable: color the upper-left $$$\min(n,k)\times\min(m,k)$$$ subgrid with distinct colors, and tile the rest of the grid with copies of it.
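The construction above can be sanity-checked with a short brute force (a sketch; the function names are ours): cell $$$(i,j)$$$ gets color $$$(i\bmod\min(n,k))\cdot\min(m,k)+(j\bmod\min(m,k))$$$.

```python
def color(i, j, n, m, k):
    # tile the grid with copies of the upper-left min(n,k) x min(m,k) block
    a, b = min(n, k), min(m, k)
    return (i % a) * b + (j % b)

def every_window_distinct(n, m, k):
    # brute force: every k x k subgrid must use pairwise distinct colors
    grid = [[color(i, j, n, m, k) for j in range(m)] for i in range(n)]
    for i in range(n - k + 1):
        for j in range(m - k + 1):
            cells = [grid[x][y] for x in range(i, i + k)
                                for y in range(j, j + k)]
            if len(set(cells)) != len(cells):
                return False
    return True
```

The total number of colors used is exactly $$$\min(n,k)\cdot\min(m,k)$$$, matching the lower bound.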
Time complexity: $$$O(1)$$$.
idea & solution: xcyle
Find the conditions Bob needs to win.
A general idea is that it is very difficult for Bob to win, so we look for necessary conditions. Observe that for any subarray $$$[A_l,A_{l+1},\cdots,A_r]$$$, the positions of its elements in $$$B$$$ must also form an interval. From here, it is easy to prove by induction that for Bob to win, either $$$A=B$$$ or $$$A=\textrm{rev}(B)$$$ must hold.
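In code, the final check is a single comparison (hypothetical function name; $$$A$$$ and $$$B$$$ are given as lists):

```python
def bob_wins(A, B):
    # Bob wins iff the arrays are identical or exact reverses of each other
    return A == B or A == B[::-1]
```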
Time complexity: $$$O(n)$$$.
The problem can't be that hard, find some simple strategy.
We consider a simple strategy: walk towards the goal in a straight line.
If some circle reaches the goal first, it is obvious that we have no chance of succeeding, no matter what path we take.
Otherwise, it can be proven that we will not pass any circles on our way to the goal.
![](/predownloaded/c9/b5/c9b5e2e8da7c3339115bde64e3aa553c2977191e.png)
Suppose we start at $$$A$$$, our goal is $$$B$$$, and we are intercepted by some circle centered at $$$C$$$ at a point $$$D$$$ on segment $$$AB$$$. Since the circle reaches $$$D$$$ no later than we do, $$$CD\le AD$$$. By the triangle inequality, $$$CB\le CD+DB\le AD+DB=AB$$$, which contradicts the assumption that we reach the goal strictly before circle $$$C$$$ does (i.e., $$$CB>AB$$$). Hence no interception can occur.
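The whole solution then reduces to one comparison of squared distances (a sketch with hypothetical names; we assume, as in the proof above, that the circles grow at the same unit speed we move at, and that a tie counts as being caught, hence the strict inequality):

```python
def can_reach(ax, ay, bx, by, circles):
    # walk straight from (ax, ay) to (bx, by); this succeeds iff every
    # circle center is strictly farther from the goal than the start is
    d2 = (ax - bx) ** 2 + (ay - by) ** 2
    return all((cx - bx) ** 2 + (cy - by) ** 2 > d2 for cx, cy in circles)
```

Squared distances keep everything in integers, so no floating-point issues arise.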
Time complexity: $$$O(n)$$$.
2002D1 — DFS Checker (Easy Version) and 2002D2 — DFS Checker (Hard Version)
idea & solution: xcyle
Try to find some easy checks that can be maintained.
The problem revolves around finding a check for dfs orders that is easy to maintain. We discovered several such checks; a few of them, along with their proofs, are described below. Any one of these checks suffices to tell whether a dfs order is valid.
Check 1: For every $$$u$$$, each of its children $$$v$$$ satisfies $$$[pos_v,pos_v+siz_v-1]\subseteq[pos_u,pos_u+siz_u-1]$$$. We can maintain this check by keeping track of the number of vertices $$$u$$$ that violate this condition, checking each $$$u$$$ using sets.
Proof: We prove by induction. When the children of $$$u$$$ are all leaves, it is easy to see that this check ensures $$$[pos_u,pos_u+siz_u-1]$$$ is a valid dfs order of the subtree of $$$u$$$. Then, we can merge the subtree of $$$u$$$ into a single large node of size $$$siz_u$$$ and repeat the analysis above.
Check 2: First, check that $$$p_1=1$$$. Then, for each pair of adjacent elements $$$p_i,p_{i+1}$$$, $$$fa(p_{i+1})$$$ must be an ancestor of $$$p_i$$$. We can maintain this check by keeping track of the number of positions $$$i$$$ that violate this condition, testing each $$$i$$$ by checking whether $$$p_i$$$ lies in the subtree of $$$fa(p_{i+1})$$$.
Proof: For any vertex $$$u$$$, take any adjacent pair $$$p_i,p_{i+1}$$$ such that $$$p_i$$$ does not belong to the subtree of $$$u$$$ but $$$p_{i+1}$$$ does. It follows that $$$p_{i+1}=u$$$: otherwise $$$fa(p_{i+1})$$$ would lie inside the subtree of $$$u$$$, and since $$$fa(p_{i+1})$$$ is an ancestor of $$$p_i$$$, $$$p_i$$$ would also belong to the subtree of $$$u$$$. From this, we can gather that each subtree is entered at most once (and thus occupies a contiguous interval), and that its first visited node is $$$u$$$, which is sufficient for $$$p$$$ to be a dfs order.
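A static version of check 2 can be sketched as follows (hypothetical names; `parent[v]` is $$$fa(v)$$$, vertex $$$1$$$ is the root, and ancestor tests use Euler entry/exit times):

```python
def euler_times(n, parent):
    # entry/exit times via an iterative dfs; parent[v] for v in 2..n, root is 1
    children = [[] for _ in range(n + 1)]
    for v in range(2, n + 1):
        children[parent[v]].append(v)
    tin, tout, timer = [0] * (n + 1), [0] * (n + 1), 0
    stack = [(1, False)]
    while stack:
        u, done = stack.pop()
        if done:
            tout[u] = timer  # largest entry time inside u's subtree
            continue
        timer += 1
        tin[u] = timer
        stack.append((u, True))
        for v in children[u]:
            stack.append((v, False))
    return tin, tout

def is_ancestor(a, b, tin, tout):
    # a is an ancestor of b (or equal to b)
    return tin[a] <= tin[b] and tout[b] <= tout[a]

def valid_dfs_order(p, parent, tin, tout):
    # check 2: p[0] is the root, and for every adjacent pair,
    # parent(p[i+1]) is an ancestor of p[i]
    if p[0] != 1:
        return False
    return all(is_ancestor(parent[p[i + 1]], p[i], tin, tout)
               for i in range(len(p) - 1))
```

In the actual problem, only the violation counter near each swapped position needs to be updated per query, which gives the stated complexity.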
Time complexity: $$$O((n+q)\log n)/O(n+q)$$$.
Consider an incremental solution.
Use stacks.
The problem asks for the answer for every prefix, which hints at an incremental solution.
To add a new pair to the current prefix, we need to process how the new block merges with the old ones. Thus, we use a structure that stores information about the last block over time.
Namely, we use a stack to keep track of all blocks that became the last. For each block, we keep two values: its color $$$c$$$ and its lifetime $$$l$$$ (the time it takes for the block to disappear).
When inserting a new block, we pop all blocks that would be shadowed by the current one (i.e., blocks with lifetime shorter than the current block's), and merge blocks of the same color. When merging two blocks with lifetimes $$$x$$$ and $$$z$$$, where the maximum lifetime among the blocks between them is $$$y$$$ (note that $$$y\le\min(x,z)$$$ must hold), the merged block has lifetime $$$x+z-y$$$.
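The merge formula is easy to sanity-check against a direct simulation (a brute force for small inputs; it assumes the statement's deletion rule: each second, every element that is first or differs from its left neighbor is deleted simultaneously):

```python
def strength(s):
    # simulate the simultaneous deletions until the array is empty;
    # the answer is the number of seconds elapsed
    t = 0
    while s:
        s = [x for i, x in enumerate(s) if i > 0 and x == s[i - 1]]
        t += 1
    return t
```

For example, two color-$$$1$$$ blocks of lifetimes $$$x=3$$$ and $$$z=3$$$ separated by a block of lifetime $$$y=2$$$ give `strength([1, 1, 1, 2, 2, 1, 1, 1]) == 4`, matching $$$x+z-y=4$$$.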
For more details, please refer to the solution code.
There also exists $$$O(n\log n)$$$ solutions using ordered sets or heaps.
Time complexity: $$$O(n)$$$.
2002F1 — Court Blue (Easy Version)
Primes are powerful.
Prime gaps are small.
We view the problem as a walk on a grid, starting at $$$(1,1)$$$. WLOG, suppose $$$l>f$$$; thus only cells $$$(a,b)$$$ with $$$a<b$$$ need to be considered.
Notice that when $$$n$$$ is large enough, the largest prime $$$p\le n$$$ satisfies $$$2p>n$$$. As such, all cells $$$(p,i)$$$ where $$$i<p$$$ will be unblocked and reachable.
However, we have only bounded one side of the final result. We take this a step further: let $$$q$$$ be the second-largest prime $$$\le n$$$. By the same logic, we may assume $$$2q>n$$$. As such, all cells $$$(i,q)$$$ with $$$p\le i\le n$$$ will be unblocked and reachable.
Thus, we have constructed an area that must contain the optimal solution, with dimensions bounded by $$$n-p$$$ and $$$n-q$$$. We just need to run any brute-force solution (dfs with memoization, or dp) on this area to find the result.
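Locating $$$p$$$ and $$$q$$$ takes a single sieve (a sketch; the function name is ours):

```python
def two_largest_primes(n):
    # sieve of Eratosthenes up to n, then take the two largest primes
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for d in range(2, int(n ** 0.5) + 1):
        if is_prime[d]:
            for mult in range(d * d, n + 1, d):
                is_prime[mult] = False
    primes = [x for x in range(n, 1, -1) if is_prime[x]]
    return primes[0], primes[1]  # p = largest, q = second largest
```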
If we assume the asymptotic of the prime gap is $$$O(P(n))$$$, this yields an $$$O(n\cdot P(n)^2\cdot\log P(n))$$$ solution, where the $$$\log P(n)$$$ factor comes from taking the gcd of two numbers that differ by $$$O(P(n))$$$. This can also be optimized to $$$O(n\cdot P(n)^2)$$$ by preprocessing gcds.
We added the constraint that the $$$n$$$'s are pairwise distinct to avoid forcing participants to write memoization across queries. In fact, under the constraints of the problem, the maximum area is $$$39201$$$, and the sum of the $$$10^3$$$ largest areas is $$$2.36\times10^7$$$.
Time complexity: $$$O(P(n)^2\log P(n))/O(P(n)^2)$$$
2002F2 — Court Blue (Hard Version)
Try generalizing the solution of F1.
Write anything and pray that it will pass because of number theory magic.
We generalize the solution of F1. Let $$$p$$$ be the largest prime $$$\le m$$$ and $$$q$$$ the largest prime $$$\le\min(n,p-1)$$$. The problem is that we might have $$$\gcd(q,i)\neq1$$$ for some $$$p+1\le i\le m$$$, invalidating our previous analysis.
To solve this, we simply choose $$$q$$$ to be the largest integer such that $$$q\le n$$$ and $$$\gcd(q,i)=1$$$ for all $$$p+1\le i\le m$$$. An asymptotic analysis of this solution is as follows:
The length of $$$[p+1,m]$$$ is $$$O(P(m))$$$, and each of these integers has at most $$$O(\log_nm)$$$ prime divisors of magnitude at least $$$n$$$. This means that, if we restricted $$$q$$$ to primes, we would have to skip at most $$$O(P(m)\log_nm)$$$ primes to find the largest valid $$$q$$$. As the density of primes is $$$O(\frac1{P(n)})$$$, this gives $$$n-q=O(P(m)\log_nm\cdot P(n))=O(P(m)^2)$$$; our actual $$$q$$$ (which is not restricted to primes) can only be better. Thus, the total area is $$$O(P(m)^3)$$$, and multiplying by the gcd cost gives an $$$O(P(m)^3\log m)$$$ solution. However, the actual area is much smaller than this bound.
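The choice of $$$q$$$ described above can be sketched directly (hypothetical names; we assume such a $$$q$$$ exists under the problem's constraints):

```python
from math import gcd

def choose_q(n, p, m):
    # largest q <= n with gcd(q, i) == 1 for every i in (p, m]
    q = n
    while any(gcd(q, i) != 1 for i in range(p + 1, m + 1)):
        q -= 1
    return q
```

For instance, with $$$m=10$$$ (so $$$p=7$$$) and $$$n=9$$$, both $$$9$$$ and $$$8$$$ share factors with the range $$$[8,10]$$$, so $$$q=7$$$.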
Under the constraints of the problem, within the data we have been able to generate, the maximum area is $$$39960$$$, and the sum of the $$$10^3$$$ largest areas is $$$3.44\times10^7$$$.
Because we only need to check whether $$$\gcd(x,y)=1$$$, the complexity can actually be optimized to $$$O(P(m)^3)$$$ with some sieves: namely, iterate over the prime divisors $$$d$$$ of the integers in $$$[p,m]$$$ and $$$[q,n]$$$, and mark all cells that have $$$d$$$ as a common divisor.
This solution is far from optimal. We invite you to continue optimizing your solutions and to minimize the number of cells visited in each query :)
Time complexity: $$$O(P(m)^3\log m)$$$
Keep $$$p$$$ the same, set $$$q$$$ to $$$p-L$$$ and only keep reachable cells in $$$[p,n]\times [q,m]$$$. $$$L$$$ is some constant ($$$100$$$ should work).
We found this solution during testing, tried, and failed to hack it.
Keep $$$p$$$ the same, run a dfs from each cell $$$(n,p),(n-1,p),\cdots$$$, prioritizing increasing $$$W_L$$$ over increasing $$$W_F$$$, and stop the first time any cell $$$(x,m)$$$ is reached; take the maximum over all cells visited.
This should not be worse than the intended solution, and actually runs quite fast.
Simply take all cells in $$$[n-L,n]\times [m-L,m]$$$ and mark everything outside as reachable. $$$L=50$$$ works.
We found this solution the day before the round, we don't know how to hack it either.
Do dfs with pruning. Run a dfs starting at $$$(n,m)$$$, returning when the cell is $$$(p,i)$$$ (i.e., obviously reachable because of primes) or when the value of the cell is smaller than the current answer. Add some memoization and it passes.
idea & solution: xcyle
We apologize for unintended solutions passing and for intended solutions failing due to large constants. Brute force runs very fast at $$$n=18$$$, which forced us to increase the constraints.
Meet in the middle with a twist.
Union of sets is hard. Disjoint union of sets is easy.
The problem hints strongly at meet in the middle: on each side, there will be $$$O(2^n)$$$ paths. The twist is in the merging: given two sequences of sets $$$S_1,S_2,\cdots$$$ and $$$T_1,T_2,\cdots$$$, we have to find the maximum $$$\textrm{MEX}(S_i\cup T_j)$$$.
If we enumerate the target MEX $$$P$$$, we only need to check whether there exist two sets such that $$$\{0,1,\cdots,P-1\}\subseteq S_i\cup T_j$$$.
Instead of meeting at $$$n$$$, we perform meet in the middle at the partition into $$$Bn$$$ and $$$(2-B)n$$$. For each $$$S_i$$$, we put all of its subsets $$$S^\prime_i$$$ into a hashmap. When checking MEX $$$P$$$, we iterate over all $$$T_j$$$ and check whether $$$\{0,1,\cdots,P-1\}\setminus T_j$$$ is in the hashmap. This gives an $$$O((2^{2Bn}+2^{(2-B)n})\cdot n)$$$ solution, which is optimal at $$$B=\frac23$$$, where the complexity is $$$O(2^{\frac43n}\cdot n)$$$.
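The hashmap check can be sketched as follows (hypothetical names; sets are bitmasks, and submasks are enumerated with the standard `(m - 1) & s` trick):

```python
def coverable(S, T, P):
    # does some S_i ∪ T_j contain {0, ..., P-1}?  S, T are lists of bitmasks
    full = (1 << P) - 1
    subs = set()
    for s in S:
        s &= full          # bits >= P are irrelevant
        m = s
        while True:        # enumerate all submasks of s
            subs.add(m)
            if m == 0:
                break
            m = (m - 1) & s
    # S_i ∪ T_j ⊇ full  ⟺  full \ T_j is a submask of some S_i
    return any(full & ~t in subs for t in T)
```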
We can substitute enumerating $$$P$$$ with binary search, yielding $$$O(2^{\frac43n}\cdot\log n)$$$, or check whether $$$P$$$ can be increased after each step, yielding $$$O(2^{\frac43n})$$$.
Instead of hashmaps, you can also insert all the $$$S_i$$$'s and $$$T_j$$$'s into a trie, and run a brute-force dfs to search for MEX $$$P$$$. It can be proven that the complexity is also $$$O(2^{\frac43n}\cdot n)$$$.
The intended solution is $$$O(2^{\frac43 n})$$$, however solutions that have slightly higher complexity can pass with good optimizations.
Time complexity: $$$O(2^{\frac43n})$$$.
TBA