maroonrk's blog

By maroonrk, history, 7 months ago, In English

We will hold AtCoder Regular Contest 177.

The point values will be 300-400-500-700-1000-1000.

We are looking forward to your participation!

  • +79

»
7 months ago, # | 0

Maybe this contest should have been held on another day.

It's awful to have an "Earthquake" problem on exactly 12th May.

  • »
    »
    7 months ago, # ^ | +32

    I feel sorry about it. I'm afraid I didn't know the exact date of that earthquake...

»
7 months ago, # | 0

For the Earthquakes problem, I figured out that we need to solve it by considering connected components, and that the ways to collapse can only be of the form LLL...RRR, so I thought of calculating this using a stack. But I am still confused about how to merge the results for the components; can someone explain it?

  • »
    »
    7 months ago, # ^ | 0

    Let $$$F(i, t)$$$ be the probability of the entire $$$i$$$-th connected component collapsing before or at time $$$t$$$. Of course, at each time $$$t$$$ from $$$1$$$ to $$$N$$$, at most one connected component's probability can change.

    So, you can sweep over the time $$$t$$$ from $$$1$$$ to $$$N$$$ and maintain the product of $$$F(i, t)$$$ over all $$$i$$$. This can be done with a segment tree, or by simply maintaining the product directly (to handle zero values, keep track of the number of zeros so you never divide by zero).
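    Below is a minimal sketch of that "product plus zero counter" idea, assuming values modulo $$$998244353$$$; it is my own illustration, not the editorial's code. `zeros` counts the factors that are currently $$$0$$$, and `prod` stores the product of the nonzero factors, so an update never divides by zero.

```cpp
#include <cstdint>
#include <vector>

constexpr int64_t MOD = 998244353;

// Fast exponentiation; used for modular inverses (MOD is prime).
int64_t pow_mod(int64_t b, int64_t e) {
    int64_t r = 1;
    for (b %= MOD; e > 0; e >>= 1, b = b * b % MOD)
        if (e & 1) r = r * b % MOD;
    return r;
}
int64_t inv_mod(int64_t a) { return pow_mod(a, MOD - 2); }

// Maintains F(1) * F(2) * ... * F(n) under point updates, no segment tree.
struct ZeroAwareProduct {
    std::vector<int64_t> f;   // current value of each factor
    int64_t prod = 1;         // product of the *nonzero* factors only
    int zeros = 0;            // how many factors are exactly 0

    // All factors start at 0: no component has collapsed at time 0.
    explicit ZeroAwareProduct(int n) : f(n, 0), zeros(n) {}

    void update(int i, int64_t x) {  // set factor i to x
        if (f[i] == 0) --zeros; else prod = prod * inv_mod(f[i]) % MOD;
        f[i] = x % MOD;
        if (f[i] == 0) ++zeros; else prod = prod * f[i] % MOD;
    }
    int64_t query() const { return zeros > 0 ? 0 : prod; }
};
```

    Each update costs one modular inverse, i.e. $$$O(\log MOD)$$$, so the whole sweep over $$$t$$$ is $$$O(N \log MOD)$$$.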

»
7 months ago, # | +21

For problem E, we can use another type of sort to reduce the time complexity to $$$O(113TV+113N)$$$, where $$$V$$$ is the maximum possible sum of all the scores.
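If "another type of sort" refers to a non-comparison sort (my assumption; the comment doesn't say), counting sort is the usual way to drop the $$$\log$$$ factor: it sorts $$$N$$$ values bounded by $$$V$$$ in $$$O(N + V)$$$, which is where a $$$V$$$ term like the one above would come from. A minimal sketch:

```cpp
#include <bits/stdc++.h>
using namespace std;

// Counting sort: O(N + V) for N values in the range [0, V].
vector<int> counting_sort(const vector<int>& a, int V) {
    vector<int> cnt(V + 1, 0);
    for (int x : a) ++cnt[x];          // histogram of the values
    vector<int> out;
    out.reserve(a.size());
    for (int v = 0; v <= V; ++v)       // emit each value cnt[v] times
        out.insert(out.end(), cnt[v], v);
    return out;
}
```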

»
7 months ago, # | 0

Today's problems were really enjoyable! (The jump from C to D was higher than I expected, though.)

Unfortunately, I wasted almost an hour trying to solve a misread version of problem D (which turned out to be much harder than the actual problem)... 😅

»
7 months ago, # | 0

I forgot to set my alarm, so I woke up after the contest ended, and lost 39 rating points :(

»
7 months ago, # | Rev. 4 | 0

In problem E, if we let f[3]=2, f[4]=8, f[5]=113, what will the asymptotic behavior of f[n] be like?

By the way
  • »
    »
    6 months ago, # ^ | +20

    I have the result that there are $$$11219$$$ patterns to be considered in the $$$6$$$-problem case. I ran the following C++ program for half an hour. In this case, the maximum point value was $$$30$$$.

    C++ Program

    Computing $$$f_7$$$ seems difficult. Did anyone try it?

»
7 months ago, # | +8

Sad feelings about today's problem D. I failed to solve it because I didn't believe it was possible for the probability of a certain group to be 0 modulo 998244353. This gap in math knowledge made an otherwise correct solution fail two test cases :( But I liked the first 4 problems in general.

  • »
    »
    7 months ago, # ^ | +5

    Me too. I had almost solved problem D during the contest but kept getting 2 WAs, on in19 and in20. If I had solved it, it would have been the first time I entered the top 200 in an ARC.

  • »
    »
    6 months ago, # ^ | 0

    Is this (getting 0 modulo 998244353) really likely to happen by chance, or did they handcraft the two test cases for this bug? Is the probability N/MOD?

    • »
      »
      »
      6 months ago, # ^ | 0

      I guess the problem-setters intentionally made one small and one large test with such values. I wouldn't say it was a bug. It was more like: my (non-existent) math knowledge told me that probabilities would never be zero modulo a large prime number, so I didn't even bother handling that case. That assumption was wrong, obviously. Funny, but it would have taken just a couple more if-statements, yet for some reason I didn't try it during the contest. Painful, but a very important lesson.

      • »
        »
        »
        »
        6 months ago, # ^ | 0

        Math knowledge tells us that a product of non-zeros is non-zero modulo a prime; it guarantees nothing about sums of non-zeros, or about products of arbitrary values. As I mention below, you shouldn't handle this with just a couple of if-statements but with a different representation.

        • »
          »
          »
          »
          »
          6 months ago, # ^ | 0

          Unfortunately, math knowledge doesn't tell me anything if it's simply not there :(

          By a couple of if-statements I meant specifically handling zero sums separately and not letting them into the product. Actually, the same idea is also suggested in the official AtCoder editorial. In my accepted code it literally looks like two-ish if-statements.

    • »
      »
      »
      6 months ago, # ^ | +10

      In this case, where the result is a product of $$$G$$$ sums, and assuming the data is mixed well enough that the sums are uniformly distributed, the probability that none of them is zero is $$$(1-1/MOD)^G \approx 1-G/MOD$$$, so it's not very likely to turn out zero by coincidence (for instance, with $$$G = 10^5$$$ sums the chance of hitting a zero is about $$$10^5/998244353 \approx 0.01\%$$$).

      However, since the result is a product of sums, we should assume that a zero is possible. It doesn't even need extra special effort: the sums are 0 at the start! We can naturally write code that expresses the result as $$$0^k \cdot p$$$ with $$$p \neq 0$$$ and updates $$$k$$$ and $$$p$$$ whenever a sum changes.

      Most of the time, the result is a product of very few variables, so we tend to recompute it by multiplying the variables' new values directly instead of maintaining a running product with modular inverses. In that case the result can still end up zero; it's just not visible as a source of bugs.

      • »
        »
        »
        »
        6 months ago, # ^ | +5

        "It doesn't even need extra special effort, the sums are 0 at the start!"

        Unfortunately, I (and, I think, other people who made this mistake) printed 0 until all poles could actually fall, and only started calculating after that point, when there were no real 0s left (the ones that are 0 without the mod).

        Now I can see why this was just a lazy way of handling this corner case.

        • »
          »
          »
          »
          »
          6 months ago, # ^ | +5

          Well, actually the wrong assumption was that a sum of a non-zero number of fractions $$$\frac{1}{2^{d_{i}}}$$$ modulo a large prime number is never 0. That is why we handled the case of zero addends separately. Not as lazy as it seems :)

          • »
            »
            »
            »
            »
            »
            6 months ago, # ^ | +5

            Same here. It took me a lot of WAs and RTEs to figure out this case. Apparently, people who directly bashed it with data structures didn't have to handle this case separately.

»
7 months ago, # | -17

Why is nobody talking about problem C? It was quite a jump from B to C.

  • »
    »
    6 months ago, # ^ | 0

    Maybe because it only requires observing independence? There is not much to discuss.

  • »
    »
    6 months ago, # ^ | Rev. 2 | 0

    I guess you meant that problem B was easier than usual. Problem C wasn't difficult at all and required only one observation:

    Observation spoiler

    Implementation is quite straightforward and requires only graph theory basics:

    Implementation spoiler

»
6 months ago, # | Rev. 2 | 0

For problem E, I saw another idea from submissions:

With 5 questions, there are 2^5 different results if we don't consider penalties. Now consider the vector rank[2^5] representing the rank distribution of the 2^5 results, where rank[mask] is the final rank (allowing ties) of result mask among all 2^5 different results. If we have all possible rank distributions, then we can just iterate through them on the input and find the minimum total squared error.

Then let's brute-force search for all possible rank distributions. I don't know whether it was derived by trial and error or by some math calculation, but it seems that you can get all possible rank distributions (4672 of them) by trying all score distributions where the score of each question is at most 30. This method passes the time limit.
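Here is a hedged sketch of that brute force (my own illustration, not an actual submission), shrunk to toy parameters because the real search tries 30^5 ≈ 2.4×10^7 score vectors: enumerate every score assignment and collect the distinct rank vectors over the 2^n results.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    // Demo parameters: the comment above uses n = 5 questions and scores
    // up to 30; we use a tiny n and V so the demo finishes instantly.
    const int n = 3, V = 8;
    const int M = 1 << n;                      // number of possible results
    set<vector<int>> distinct;                 // distinct rank vectors seen
    vector<int> score(n, 1);
    while (true) {
        vector<int> total(M, 0), rnk(M, 1);
        for (int mask = 0; mask < M; ++mask)   // total score of each result
            for (int i = 0; i < n; ++i)
                if (mask >> i & 1) total[mask] += score[i];
        for (int mask = 0; mask < M; ++mask)   // rank with ties =
            for (int other = 0; other < M; ++other)  // 1 + #strictly better
                if (total[other] > total[mask]) ++rnk[mask];
        distinct.insert(rnk);
        int i = 0;                             // odometer: next score vector
        while (i < n && ++score[i] > V) score[i++] = 1;
        if (i == n) break;                     // all score vectors tried
    }
    cout << distinct.size() << " distinct rank distributions\n";
}
```

Raising n and V toward 5 and 30 should reproduce the 4672 figure quoted above, though I haven't verified that myself.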

apiad's submission in the live game looks similar to the above method, but with 2 improvements:

  1. Use randomization to get candidate rank distributions. Their randomization generates 4649 unique rank distributions, which, even though it is not all of them (4672), is good enough, since we know from the editorial that only 113 of them are critical.

  2. Reduce the candidate rank distributions. The idea is that, for a rank distribution, if you can merge results with neighboring ranks into the same rank, you should do it, as it won't make the total squared error worse. In the terms of the official editorial, this is the process of reducing the regions (4672 of them) into which the hyperplanes divide the space to the intersections (113 of them) of the hyperplanes.

(I don't feel very confident in my interpretation. Let me know if I got any part of it wrong.)