Errichto's blog

By Errichto, 3 years ago, In English

This is my 100th CF blog!

This is a list of techniques with $$$O(\sqrt n)$$$ time complexity. Watch the lecture at https://youtu.be/BJhzd_VG61k (it has timestamps)!

  1. Square root decomposition — split the sequence into blocks of fixed size (see the sketch just after this list).
  2. Splitting objects (e.g. vertices) into light and heavy.
  3. Square root decomposition by the time of queries & rebuilding the structure.
  4. Mo's algorithm — processing queries in proper order and updating the answer by erasing/inserting new elements. https://cp-algorithms.com/data_structures/sqrt_decomposition.html
  5. Strings — if the sum of lengths is $$$S$$$ then there are at most $$$\sqrt{S}$$$ distinct lengths.
  6. Birthday paradox & baby-step giant-step. See P4 and P6 here, and see https://cp-algorithms.com/algebra/discrete-log.html.
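
To make technique 1 concrete, here is a minimal sketch (my illustration, not code from the lecture): split the array into blocks of size roughly $$$\sqrt{n}$$$ and store the sum of each block, so a point update is $$$O(1)$$$ and a range-sum query touches at most two partial blocks plus $$$O(\sqrt{n})$$$ whole blocks.

```cpp
// Minimal sketch of square root decomposition: point update, range-sum query.
// Illustration only; names and the block-size choice are arbitrary.
#include <bits/stdc++.h>
using namespace std;

struct SqrtDecomposition {
    int n, B;                       // array size and block size (~sqrt(n))
    vector<long long> a, blockSum;  // values and per-block sums

    SqrtDecomposition(const vector<long long>& v) : n(v.size()), a(v) {
        B = max(1, (int)sqrt(n));
        blockSum.assign((n + B - 1) / B, 0);
        for (int i = 0; i < n; i++) blockSum[i / B] += a[i];
    }
    void update(int i, long long x) {   // set a[i] = x in O(1)
        blockSum[i / B] += x - a[i];
        a[i] = x;
    }
    long long query(int l, int r) {     // sum of a[l..r] in O(sqrt(n))
        long long res = 0;
        while (l <= r && l % B != 0) res += a[l++];                 // left partial block
        while (l + B - 1 <= r) { res += blockSum[l / B]; l += B; }  // whole blocks
        while (l <= r) res += a[l++];                               // right partial block
        return res;
    }
};

int main() {
    SqrtDecomposition sd({1, 2, 3, 4, 5, 6, 7, 8, 9});
    cout << sd.query(2, 6) << '\n';  // 3+4+5+6+7 = 25
    sd.update(4, 10);                // a[4]: 5 -> 10
    cout << sd.query(2, 6) << '\n';  // 3+4+10+6+7 = 30
}
```

The same skeleton works for other block aggregates (a minimum, a count of some property, a small sorted vector per block), as long as a block's summary can be adjusted or rebuilt quickly.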

P1. 398D - Instant Messanger
P2. 220B - Little Elephant and Array
P3. 86D - Powerful array (actually, skip this one because it's boring)
P4. Count triangles in a graph, i.e. cycles of size 3.
P5. Given $$$N$$$ strings with total length $$$S$$$, find a pair where one string is a substring of the other, in $$$O(S \cdot \sqrt S)$$$.

Homework (will be discussed in a second stream soon)
P6. You are given $$$N$$$ positive integers $$$a_1, a_2, \ldots, a_N$$$ and a target value $$$X$$$. Check if there is a subset with sum equal to $$$X$$$, in $$$O(X \cdot \sqrt S)$$$ where $$$S = \sum a_i$$$.
P7. 13E - Holes
P8. 455D - Serega and Fun
P9. You're given a sequence $$$a_1, a_2, \ldots, a_n$$$ ($$$1 \leq a_i \leq n$$$). It describes a functional graph: there is a directed edge from $$$i$$$ to $$$a_i$$$. Handle updates U i x (change $$$a_i$$$ to $$$x$$$) and answer queries Q v — if we start in $$$v$$$ and keep following edges, after how many steps do we get to an already visited node?

And two problems from recent contests: 1580C - Train Maintenance and ABC 219 G Propagation.

Thanks to krismaz for letting me use his list of techniques and problems.

»
3 years ago, # |

Are there any easy problems as well? These look out of reach.

  • »
    »
    3 years ago, # ^ |

    Problems involving sqrt decomposition tend to have a problem rating of >2000, so I highly doubt that there are easy problems for this topic.

»
3 years ago, # |

Knowing Mo's Algorithm finishes P2 and P3.
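
For readers new to the technique, here is a minimal Mo's algorithm skeleton (my illustration, counting distinct values in a range offline; it is not a solution to P2 or P3, and the block-size formula is just a common heuristic):

```cpp
// Minimal Mo's algorithm skeleton: offline queries "how many distinct values
// are in a[l..r]?". Illustration only; adapt add/del and the answer per problem.
#include <bits/stdc++.h>
using namespace std;

int main() {
    vector<int> a = {1, 2, 1, 3, 2, 2, 4};
    vector<pair<int,int>> qs = {{0, 3}, {2, 6}, {1, 4}};   // inclusive [l, r]
    int n = a.size(), q = qs.size();
    int B = max(1, (int)(n / sqrt(q)) + 1);                // block size heuristic

    vector<int> order(q);
    iota(order.begin(), order.end(), 0);
    sort(order.begin(), order.end(), [&](int i, int j) {   // Mo's ordering
        if (qs[i].first / B != qs[j].first / B) return qs[i].first / B < qs[j].first / B;
        return qs[i].second < qs[j].second;
    });

    vector<int> cnt(*max_element(a.begin(), a.end()) + 1, 0);
    vector<int> ans(q);
    int cur = 0, L = 0, R = -1;                            // current window [L, R]
    auto add = [&](int i) { if (cnt[a[i]]++ == 0) cur++; };
    auto del = [&](int i) { if (--cnt[a[i]] == 0) cur--; };

    for (int id : order) {
        int l = qs[id].first, r = qs[id].second;
        while (L > l) add(--L);     // extend left
        while (R < r) add(++R);     // extend right
        while (L < l) del(L++);     // shrink left
        while (R > r) del(R--);     // shrink right
        ans[id] = cur;
    }
    for (int i = 0; i < q; i++) cout << ans[i] << '\n';    // prints 3, 4, 3
}
```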

»
3 years ago, # |

Also see this nice AtCoder problem.

»
3 years ago, # |

Coolest sqrt decomposition I've seen so far: https://codeforces.me/problemset/problem/13/E

»
3 years ago, # |

A few good problems based on heavy set / light set SQRT decomposition:

  1. https://www.hackerrank.com/contests/worldcupsemifinals/challenges/wccity/problem

  2. https://www.codechef.com/problems/KOL15C

A few good problems based on SQRT decomposition on queries:

  1. https://codeforces.me/contest/342/problem/E (Standard Centroid Decomposition Problem, but can also be solved using Query SQRT Decomposition)

  2. https://www.codechef.com/problems/COW3E

A few good problems based on Mo's algorithm:

  1. https://codeforces.me/contest/86/problem/D

  2. https://codeforces.me/problemset/problem/940/F (Mo's with Updates)

  3. https://www.spoj.com/problems/XXXXXXXX/ (Mo's with Updates)

  4. https://codeforces.me/contest/576/problem/C

»
3 years ago, # |

Some really cool problems:

»
3 years ago, # |

plug

A heavy/light example too: 1545F - AquaMoon and Potatoes

»
3 years ago, # |

Strings — if the sum of lengths is $$$S$$$ then there are at most $$$\sqrt{S}$$$ distinct lengths.

I think it is wrong. For example:

if $$$S = 15$$$, the strings can have lengths 1, 2, 3, 4, 5.

There are 5 distinct lengths, but $$$\sqrt{15}$$$ is less than 4.

  • »
    »
    3 years ago, # ^ |

    It's just the order of magnitude. The exact bound is about $$$\sqrt{2S}$$$: with $$$k$$$ distinct lengths the total length is at least $$$1 + 2 + \ldots + k = \frac{k(k+1)}{2} \le S$$$, so $$$k < \sqrt{2S}$$$.

»
3 years ago, # |

A problem on the square root heuristic (not sure if it falls under square root decomposition, but we do decompose into square-root-sized buckets):
https://www.codechef.com/JAN19A/problems/PARRTY
Codeforces (2100 rated): Time to Raid Cowavans
Codeforces (2400 rated): Mr. Kitayuta's Colorful Graph

»
3 years ago, # |

Update: I managed to find the bug and remove the WA (I was overcounting), but now I am getting TLE on test 63. I also tried various block sizes from 300 to 700 and got TLE on various tests. I am sure that my add, delete and update functions work in O(B) time, and I don't know how to remove the TLE now. I also changed the set to an unordered set, but that failed too :(. Errichto, if you can shed some light on how to select B and comment on whether I have applied the heavy/light trick correctly, that would be helpful :) Thanks, and sorry for tagging.

Code
  • »
    »
    3 years ago, # ^ |

    I don't fully understand your code. For example, you sometimes check deg > B and sometimes deg >= B.

    Sets and unordered sets are slow. Change heavy to a vector. Let's see how to handle erasing:

    1) The first two heavy[x].erase(...) calls you can just skip. Whenever you iterate heavy neighbours with for(int x : heavy[u]) in the upd(u) function, remove all $$$x$$$ with too small a degree.

    2) The other two occurrences, heavy[v].erase(u); and heavy[u].erase(v), can be done linearly by iterating the vector. You might want to erase the small-degree elements here as well.
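
    A generic sketch of the lazy-erase pattern suggested above (hypothetical names, not the actual solution): keep heavy neighbours in a plain vector, drop entries whose degree fell below the threshold while iterating, and do the rare explicit erase with a linear scan.

```cpp
// Sketch of lazy deletion from a vector of "heavy" neighbours (hypothetical names).
#include <bits/stdc++.h>
using namespace std;

const int MAXN = 200005;
vector<int> heavy[MAXN];   // heavy[u] = heavy neighbours of u (may contain stale entries)
int deg[MAXN];             // current degree of each vertex
int B = 2;                 // heaviness threshold (problem-dependent)

// Iterate heavy neighbours of u, compacting away stale (now-light) entries.
template <class F>
void forEachHeavy(int u, F process) {
    auto& h = heavy[u];
    int keep = 0;
    for (int x : h) {
        if (deg[x] < B) continue;      // lazy deletion: skip and drop stale entry
        h[keep++] = x;
        process(x);
    }
    h.resize(keep);
}

// Explicit erase when really needed: a linear scan over at most O(n / B) entries.
void eraseHeavy(int u, int v) {
    auto& h = heavy[u];
    for (size_t i = 0; i < h.size(); i++)
        if (h[i] == v) { h[i] = h.back(); h.pop_back(); return; }
}

int main() {
    // Toy usage: vertex 1 has heavy neighbours 2 and 3, then 3 becomes light.
    deg[2] = 3; deg[3] = 3;
    heavy[1] = {2, 3};
    deg[3] = 1;                                          // 3 is no longer heavy
    forEachHeavy(1, [](int x) { cout << x << '\n'; });   // prints only 2
    eraseHeavy(1, 2);                                    // heavy[1] becomes empty
}
```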

    Alternatively, see the author implementation 5926635

  • »
    »
    6 months ago, # ^ |

    The bottleneck is the iteration over the unordered_set. So using an equivalent of LinkedHashMap from Java got my solution accepted.

    See the accepted answer in this Stack Overflow post, but with a vector instead of a list.

    263123733
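
    For completeness, a minimal sketch of that idea (my own code, not the linked submission): a set backed by a dense vector plus a hash map from element to its position, so insert and erase are $$$O(1)$$$ on average and iteration walks contiguous memory instead of hash-table buckets. Unlike Java's LinkedHashSet, this version does not preserve insertion order.

```cpp
// Set with O(1) insert/erase and cache-friendly iteration: elements live in a
// dense vector, and a hash map stores each element's current index.
#include <bits/stdc++.h>
using namespace std;

struct FastIterSet {
    vector<int> elems;             // live elements, arbitrary order
    unordered_map<int, int> pos;   // element -> index in elems

    void insert(int x) {
        if (pos.count(x)) return;            // already present
        pos[x] = elems.size();
        elems.push_back(x);
    }
    void erase(int x) {
        auto it = pos.find(x);
        if (it == pos.end()) return;         // not present
        int i = it->second;
        pos[elems.back()] = i;               // move the last element into slot i
        elems[i] = elems.back();
        elems.pop_back();
        pos.erase(x);
    }
};

int main() {
    FastIterSet s;
    s.insert(3); s.insert(5); s.insert(7);
    s.erase(5);
    for (int x : s.elems) cout << x << ' ';  // iterate the dense vector: prints "3 7"
    cout << '\n';
}
```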

»
3 years ago, # |

Is there a place to submit P9?

»
3 years ago, # |

https://www.codechef.com/problems/GERALD07

Hint
»
3 years ago, # |

Anyone getting TLE on P1? I'm updating light and heavy nodes and getting TLE on test 32.

My submission — 135313235

  • »
    »
    3 years ago, # ^ |

    I don't know if this problem can be accepted with sets. See my discussion with DontLookBack a few comments above.

    • »
      »
      »
      3 years ago, # ^ |

      Thanks! Changing the sets to vectors instantly gave AC. I forgot that the size of the heavy vector would also be small, so deleting would be fast.

»
3 years ago, # |

Is there any place I can submit P4 (and the other problems without an attached link)?

»
3 years ago, # |

A really nice recent problem directly based on this blog: 1619H - Permutation and Queries

»
2 years ago, # |

Errichto, when is the second stream coming out? Also, can you provide hints for P9?

  • »
    »
    2 years ago, # ^ |

    P9 hint: rebuild the graph once every $$$\sqrt{Q}$$$ queries.
    Also, it might help to solve the following problem: given a functional graph and two starting nodes $$$A$$$ and $$$B$$$, find the first node where they can meet (i.e. minimize the sum of distances from $$$A$$$ and $$$B$$$ to the target).

    P9 full solution

    I'm not planning a second lecture, sorry.

    • »
      »
      »
      2 years ago, # ^ |

      Thanks a lot! I think I have the solution, but I am getting a different block size for the best time complexity. Can you please go through the reasoning below once?

      Repeating what I have understood: let us rebuild after every $$$S$$$ queries. Assuming we find LCAs using binary lifting, the total rebuilding takes $$$O(\frac{Q}{S} \cdot N \log N)$$$.

      Now, for the current block of queries there are $$$O(S)$$$ dead ends. Each query of the second type might take $$$O(S \log N)$$$ time to reach the first repeated node, where the $$$S$$$ comes from the number of dead ends and the $$$\log N$$$ from the LCA. Hence, for $$$O(S)$$$ queries of the second type in a block, it will take us $$$O(S \cdot S \cdot \log N)$$$.

      To minimize the total complexity, we should equate $$$S \cdot S \cdot S = Q \cdot N$$$, i.e. take a block size of $$$(QN)^{1/3}$$$.

      • »
        »
        »
        »
        2 years ago, # ^ |

        1) You don't need an LCA for every dead end, just for the one where you enter an already visited connected component (so, an already visited dead end).
        2) There are $$$\frac{Q}{S}$$$ blocks, so the final product is $$$S \cdot S \cdot \frac{Q}{S}$$$ instead of $$$S \cdot S \cdot S$$$.

»
2 years ago, # |

For 1580C — Train Maintenance, I read the editorial but cannot quite understand how handling the trains satisfying $$$x_i + y_i \le \sqrt{m}$$$ can be done in $$$O(\sqrt{m})$$$. If we create an array of length $$$a$$$ for each $$$a \le \sqrt{m}$$$, I agree that updating each array on each day takes $$$O(a)$$$, but summing over the arrays gives $$$1+2+\dots+\sqrt{m} = O(m)$$$. Am I misunderstanding which arrays to create?

Can someone explain in more detail what each array contains and how all arrays get updated (or how information is extracted) on each day?

»
10 months ago, # |

Why am I getting TLE on P2? 241207393

Edit: Solved it by using block size = 700. I also realized that the bounds are tight, so avoid anything unnecessary that might slow down your code.