gen's blog

By gen, 4 years ago, translation, In Russian

» 4 years ago
  • » » 4 years ago

    How do you use so many functions and still avoid TLE?

    • » » » 4 years ago

      The time complexity of your code has little to do with the number of functions one uses. You should look up basic algorithm complexity analysis (say, an intro to Big O notation) to be introduced to analyzing the runtime complexity of a program.

  • » » 4 years ago

    Sorry to disturb you, but could you explain how the LCT in problem C works? I do not understand your solution.

    First I compute the longest suffix of edges that forms a bipartite graph, and then I try to add edges from the prefix and remove edges from the suffix to compute the array $$$Last$$$ (the same as the $$$last[]$$$ in the tutorial). But it doesn't work, because some cycles are ignored (maybe among them there is a cycle of odd length).

    Thank you very much.

    • » » » 4 years ago

      First I concatenate the edges array with itself, doubling its length. For each $$$l\in [0,M-1]$$$ I compute the max $$$r$$$ such that edges $$$l\ldots r$$$ are bipartite. Maintain the maximum spanning tree of edges $$$l\ldots r$$$ (where the weight of an edge is just its index).

      • When you increment $$$l$$$, you remove edge $$$l$$$ only if it is currently present in the spanning tree.
      • When you increment $$$r$$$, you add edge $$$r$$$ to the spanning tree and remove the edge of minimum weight along the path between the two endpoints of edge $$$r$$$ if they were previously connected.

      You should check the editorial for the other problem I linked if what I said about maintaining the MST doesn't make sense. :)
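      The quantity being maintained can be pinned down with a naive quadratic reference: for each $$$l$$$, extend $$$r$$$ while the window of edges stays bipartite (checked here by BFS 2-coloring rather than the MST/LCT machinery, and ignoring the concatenation step). This is only a sketch for checking the definition against small cases, not the intended fast solution:

      ```python
      from collections import deque

      def is_bipartite(n, edges):
          """BFS 2-coloring check: is the graph on vertices 0..n-1
          with the given edge list bipartite?"""
          adj = [[] for _ in range(n)]
          for u, v in edges:
              adj[u].append(v)
              adj[v].append(u)
          color = [-1] * n
          for s in range(n):
              if color[s] != -1:
                  continue
              color[s] = 0
              q = deque([s])
              while q:
                  u = q.popleft()
                  for v in adj[u]:
                      if color[v] == -1:
                          color[v] = 1 - color[u]
                          q.append(v)
                      elif color[v] == color[u]:
                          return False  # odd cycle found
          return True

      def max_bipartite_r(n, edges):
          """For each l, the maximal r such that edges l..r (inclusive)
          form a bipartite graph. Quadratic brute force, for reference."""
          m = len(edges)
          res = []
          for l in range(m):
              r = l - 1
              while r + 1 < m and is_bipartite(n, edges[l:r + 2]):
                  r += 1
              res.append(r)
          return res
      ```

      On a triangle (an odd cycle), no window may contain all three edges, which the brute force confirms.
      
      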

» 4 years ago

How do we maintain DSU for C to check if graph is bipartite?

  • » » 4 years ago

    Initially set all to the same color.

    Let's say we want to add an edge from $$$u$$$ to $$$v$$$. Check whether $$$u$$$ and $$$v$$$ have the same parent (i.e., are already connected). If they are, then make sure their colors are different.

    If they have different parents, say $$$p$$$ and $$$q$$$ respectively, add an edge from $$$q$$$ to $$$p$$$ in the DSU (parent[q] = p). If their colors are the same, put an additional mark on that edge which says to reverse the color.

    When you want to get the color, go through all the edges leading to the parent, and change the color as you go along the edges if you need to reverse the color.
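    A minimal sketch of this scheme, storing one parity bit per node ("is my color flipped relative to my parent?") and walking the parent chain to recover a node's color, as described above (path compression omitted for clarity; names are illustrative):

    ```python
    class BipartiteDSU:
        """DSU with a parity bit per node: flip[u] = 1 iff u's color
        differs from its parent's color."""
        def __init__(self, n):
            self.parent = list(range(n))
            self.flip = [0] * n

        def find(self, u):
            """Return (root, parity of u relative to the root),
            accumulating the flip bits along the path."""
            p = 0
            while self.parent[u] != u:
                p ^= self.flip[u]
                u = self.parent[u]
            return u, p

        def add_edge(self, u, v):
            """Add edge (u, v); return False iff it creates an odd cycle."""
            ru, pu = self.find(u)
            rv, pv = self.find(v)
            if ru == rv:
                return pu != pv  # same component: colors must differ
            # Different components: hang rv under ru, choosing the flip
            # bit so that u and v end up with different colors.
            self.parent[rv] = ru
            self.flip[rv] = pu ^ pv ^ 1
            return True
    ```

    Adding the three edges of a triangle makes the last `add_edge` return `False`, while a 4-cycle is accepted.
    
    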

  • » » 4 years ago

    Look here

» 4 years ago

In problem C in subtask $$$4$$$ there is $$$l_i \leq 500$$$ in editorial but $$$l_i \leq 200$$$ in the problem statement. Which variant is correct?

» 4 years ago

In problem A, subtask 3: 2*sqrt(1000) = 2 * 33 = 66, and 66 > 64, the query limit in that problem. So why is 2*sqrt(n) OK? I know you can apply that solution 3 times and get 3*cbrt(n), which is good enough (it uses about 30 queries), but it is harder to code and has a lot of case handling. UPD: Sorry, sqrt(1000) <= 32.

» 4 years ago

For Joker subtask 5, I think the post should say:

If we choose $$$B = M \div \sqrt{Q}$$$ we get the following runtime:

(rather than $$$B = M \sqrt{Q}$$$)

otherwise the numbers don’t add up. $$$B$$$ is the block size, and there are $$$M \div B$$$ blocks. The total number of operations is $$$M$$$ per block for the right pointer + $$$QB$$$ in total for the left pointer = $$$M^2 \div B + QB$$$ in total. Equating the two terms so that neither dominates the other gives $$$B = M \div \sqrt{Q}$$$.


And to expand slightly on the algorithm: at the start of each block, initialize DSU with all edges to the left of the block’s leftmost edge. Sort queries by $$$r$$$ in descending order. For each query, add the new edges on the right, then add the appropriate edges on the left, answer the query, and immediately roll back all of those left edges, so that once again the left pointer is at the very start of the block. Don’t bother trying to remove individual left edges (and struggling to do it without also rolling back right edges): just remove them all after each query.
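The rollback primitive this relies on can be sketched as a DSU with union by size and no path compression, so unions undo in LIFO order (illustrative names; the per-block query loop around it is as described above):

```python
class RollbackDSU:
    """DSU with union by size and no path compression, so the last k
    unions can be undone -- the building block for per-block offline
    processing with rollbacks."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n
        self.history = []  # stack: merged root, or None for a no-op

    def find(self, u):
        while self.parent[u] != u:
            u = self.parent[u]
        return u

    def union(self, u, v):
        u, v = self.find(u), self.find(v)
        if u == v:
            self.history.append(None)  # record a no-op to keep rollback in sync
            return False
        if self.size[u] < self.size[v]:
            u, v = v, u
        self.parent[v] = u
        self.size[u] += self.size[v]
        self.history.append(v)
        return True

    def rollback(self, k):
        """Undo the last k union calls."""
        for _ in range(k):
            v = self.history.pop()
            if v is None:
                continue
            self.size[self.parent[v]] -= self.size[v]
            self.parent[v] = v
```

After answering a query, `rollback` with the number of left-pointer unions performed restores the DSU to the start of the block, exactly as the paragraph above prescribes.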

  • » » 4 years ago

    Further, I think it should be possible to use path compression to improve the complexity somewhat, although I’m not sure it will help in practice.

    The original reason for the $$$\log N$$$ factor is that path compression is not performed in the DSU because it “is impossible” with rollbacks. Of course, there’s nothing preventing one from implementing rollbacks of path compression, but the question is whether such revertible path compression actually improves efficiency.

    I can’t answer that. But for this particular problem, we can still improve the complexity by using path compression in a part of the solution that doesn’t require rollbacks:

    Notice that the DSU operations corresponding to the right-hand-side edges are not disturbed by any rollbacks. (When a new block starts, we can rebuild the whole DSU from scratch. It is not necessary to roll back the previous block’s right-hand-side operations.) There are $$$M$$$ unions per block from the right-hand side. We can do those unions with path compression, for a total run-time of $$$M \alpha(N)$$$. Assuming revertible path compression is meaningless, we can do the left-hand-side operations without further path compression, at $$$\log N$$$ time per operation. The total time is then $$$M^2 \div B\, \alpha(N) + Q B \log N$$$ (down from $$$(M^2 \div B + Q B) \log N$$$), and equating the two terms to ensure neither dominates the other gives optimal $$$B = M \div \sqrt{Q \frac{\log N}{\alpha(N)}} \approx M \div \sqrt{Q \log N}$$$ with the grand total run-time $$$\mathrm{O}\left(\mathrm{sort}\ Q + M \sqrt{Q\, \alpha(N) \log N}\right)$$$.

    • » » » 4 years ago

      You can do the LHS operations with path compression too, making the complexity $$$M\sqrt Q\alpha(N)$$$.

      Spoiler (71 pts)
      • » » » » 4 years ago

        Ah, the trick is to have the LHS-within-block build a DSU over entire sets from the RHS+previous-LHS-blocks rather than over individual nodes. The RHS sets don’t change while moving the LHS, so the LHS can use a separate DSU tree and can be wiped clean after each query without any granular rollbacks. Nice!
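        One way to sketch this two-level idea: keep the right-hand-side DSU as-is (it may even use path compression, since it is never rolled back), and for each query build a throwaway second-level union-find keyed by the base DSU's roots. Discarding it is the "wipe clean" step, with no granular rollbacks. All names here are illustrative:

        ```python
        class DSU:
            """Plain DSU with path compression (the static RHS part)."""
            def __init__(self, n):
                self.parent = list(range(n))

            def find(self, u):
                while self.parent[u] != u:
                    self.parent[u] = self.parent[self.parent[u]]
                    u = self.parent[u]
                return u

            def union(self, u, v):
                u, v = self.find(u), self.find(v)
                if u != v:
                    self.parent[v] = u

        def connected_with_extra(base, extra_edges, u, v):
            """Answer one query: are u and v connected once the extra
            (left-hand-side) edges are added on top of the base DSU?
            The base structure is treated as read-only; the overlay
            dict is simply dropped when the function returns."""
            top = {}  # second-level parent links, keyed by base roots

            def find(x):
                x = base.find(x)
                while top.get(x, x) != x:
                    x = top[x]
                return x

            for a, b in extra_edges:
                ra, rb = find(a), find(b)
                if ra != rb:
                    top[rb] = ra
            return find(u) == find(v)
        ```

        Because the overlay operates on whole base components rather than individual nodes, the base DSU is untouched between queries.
        
        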

        • » » » » » 4 years ago

          What does LHS mean?

          • » » » » » » 4 years ago

            Left-hand side. Or simply left, as opposed to right. In this case, I’m talking about the disjoint-set union formed by the streets whose indices are smaller than the patrolled streets.

  • » » 4 years ago

    Thanks, fixed!

» 4 years ago

Can someone explain the strategy when $$$k$$$ is odd in problem A? Thanks.

  • » » 4 years ago

    When $$$k$$$ is odd, ask a query of length $$$k/2$$$; the problem size then becomes at most $$$k/2+1$$$, which is acceptable. BTW, the solution misses a key point: let the function $$$begin$$$ give the starting index; then $$$begin(n) = n/2 + 1 - begin(n/2)$$$ when $$$n$$$ is even, and $$$begin(n) = n/2 + 2 - begin(n/2+1)$$$ when $$$n$$$ is odd.
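    This recurrence can be written out directly (integer division throughout). The base case $$$begin(1) = 1$$$ is an assumption on my part, but it is consistent with the concrete values discussed in this thread ($$$begin(3)=2$$$, $$$begin(4)=2$$$, $$$begin(7)=3$$$):

    ```python
    def begin(n):
        # Starting index for the strategy of size n, per the recurrence
        # above. Base case begin(1) = 1 is an assumption.
        if n == 1:
            return 1
        if n % 2 == 0:
            return n // 2 + 1 - begin(n // 2)
        return n // 2 + 2 - begin(n // 2 + 1)
    ```
    
    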

    • » » » 4 years ago

      Could you explain more carefully how to use the strategy for $$$k$$$ using colors $$$[1, k]$$$ to construct a new strategy for $$$2k+1$$$ using colors $$$[1, 2k+1]$$$?

      • » » » » 4 years ago

        It's not using strategy $$$k$$$ alone to construct $$$2k+1$$$; you use strategy $$$k$$$ and strategy $$$k+1$$$ together to construct $$$2k+1$$$.

        For example, the strategy for 3 is 2 -> 3 -> 1 and the strategy for 4 is 2 -> 4 -> (1 or 3). Then strategy 7 is:

        We first calculate the starting point: begin(7) = 5 - 2 = 3. So we do query 3 -> 6 first (6 = 3 + 7/2).

        If res == 0, then the result is in [4, 7], and the problem reduces to strategy 4, which is 6 -> 1 -> (5 or 7).

        If res == 1, then the range is [1, 3], so the problem reduces to strategy 3, which is 6 -> 5 -> 7.

      • » » » » 4 years ago

        You may refer to my solution: 89211392

        • » » » » » 4 years ago

          Got it. But I still have a question: if res == 0, why won't we use a color twice or more?

» 4 years ago

In "the graph exluding...", you missed a 'c' (excluding).