nchn27's blog

By nchn27, history, 5 years ago, In English

In use, multiset seems more versatile than priority queue due to the fact that you can remove things other than the first element. In C++, when would priority_queue be used instead of multiset?

  • Vote: I like it
  • +1
  • Vote: I do not like it

»
5 years ago, # |
  Vote: I like it +13 Vote: I do not like it

priority_queue is faster, but for competitive programming using multiset is usually better, since it is more convenient, and for problems that aren't very tight it makes no difference.

  • »
    »
    5 years ago, # ^ |
    Rev. 2   Vote: I like it -66 Vote: I do not like it

How is priority_queue faster than a multiset?

Won't deletion through an iterator take constant time in a multiset, whereas pop takes $$$O(\log n)$$$ time in a priority queue?

    • »
      »
      »
      5 years ago, # ^ |
        Vote: I like it +20 Vote: I do not like it

      I don't know the implementations in C++ nor have I benchmarked the structures, but I would definitely expect priority queue to be faster, maybe even a lot faster. Why would you expect otherwise?

The functions of a priority queue are a subset of the functions of a multiset; if the latter were faster, the former would be useless.

      You can implement a priority queue through a heap structure, while for a multiset you need a balanced binary tree structure, which is significantly heavier.

      • »
        »
        »
        »
        5 years ago, # ^ |
          Vote: I like it -56 Vote: I do not like it

Agreed, but won't deletion through an iterator take constant time in a multiset, whereas pop takes $$$O(\log n)$$$ time in a priority queue?

        • »
          »
          »
          »
          »
          5 years ago, # ^ |
            Vote: I like it +16 Vote: I do not like it

It may be so, but that's not a great way to look at it. If you add $$$N$$$ elements and then remove them all, both priority queue and multiset will be $$$O(N \log N)$$$ (and in fact no structure is faster than that, because you'd break the sorting complexity lower bound). Even if multiset is quicker in the deletion part, it likely pays a price for that during the addition part.

It may be interesting to properly compare them, but I'd be quite surprised if multiset turns out to be faster, as my experience has shown that set/map in C++ are incredibly heavy.

        • »
          »
          »
          »
          »
          5 years ago, # ^ |
          Rev. 2   Vote: I like it +8 Vote: I do not like it

As far as I know, multiset is implemented as a balanced binary tree, as Enchom mentioned. Since it should stay 'balanced' after insertion and deletion, it has to do some rebalancing (algorithms vary by implementation, but generally tree rotations). If it didn't do that work, some sequences of insertions and deletions could make the multiset's internal tree unbalanced and make every operation after that very slow (up to $$$O(n)$$$).

To sum up: a priority queue is a heap structure, which takes $$$O(\log n)$$$ time to delete the top element. A multiset is a balanced binary search tree, which takes up to $$$O(\log n)$$$ time to delete anything and then restore balance. The latter can be a lot slower (bigger constant factor).

Currently priority queue is somewhere between 1.5x and 2x faster than multiset with compiler optimizations. (It depends on whether you turned -O2 on, and a bunch of other factors, including your compiler version and OS.) There are some benchmarks on Stack Overflow, which I haven't thoroughly read yet.

Edit: my friend pointed out that my numbers are wrong.

          • »
            »
            »
            »
            »
            »
            5 years ago, # ^ |
            Rev. 2   Vote: I like it +24 Vote: I do not like it

            According to this reference, he is actually right that deletion is amortized constant. I ran a few quick tests and indeed multiset deletion seems faster than priority queue deletion.

            However, as I suspected and mentioned in my other comment, this is at the cost of a very slow addition in multiset compared to priority queue. Since you can't delete more elements than you add, I can't imagine a scenario where you'd prefer multiset only due to the faster deletion.

            • »
              »
              »
              »
              »
              »
              »
              5 years ago, # ^ |
                Vote: I like it +1 Vote: I do not like it

              Oh, I misread the question and considered erasing by value. I was talking about wrong thing then. Thanks for pointing it out.

            • »
              »
              »
              »
              »
              »
              »
              5 years ago, # ^ |
                Vote: I like it +10 Vote: I do not like it

              Right...

Maybe, as gratus907 said, addition in a multiset would be slower due to rebalancing (tree rotations).

Thanks once again!

            • »
              »
              »
              »
              »
              »
              »
              5 years ago, # ^ |
                Vote: I like it -29 Vote: I do not like it

Amortised isn't worst case; the worst case is still $$$O(\log n)$$$.

              • »
                »
                »
                »
                »
                »
                »
                »
                5 years ago, # ^ |
                  Vote: I like it 0 Vote: I do not like it

The worst case may be $$$O(\log n)$$$, but the average time, i.e. the amortized time, is constant.

Check out Enchom's comment on this blog. He ran a few tests and found that deletion in a multiset is faster than popping from a priority queue.

                • »
                  »
                  »
                  »
                  »
                  »
                  »
                  »
                  »
                  5 years ago, # ^ |
                    Vote: I like it -7 Vote: I do not like it

                  Yes, I read the comments before posting, unsurprisingly. It's just that when you don't mention that you're talking about amortised complexity, the assumption is that it's worst case.

There are implementation tricks for faster insert/delete operations in a lot of structures, for example deletion flags on nodes and lazy rebalancing. That's why I'm more interested in a test where insert/delete is mixed with searching, but that comparison is tough because a heap doesn't support searching.

                • »
                  »
                  »
                  »
                  »
                  »
                  »
                  »
                  »
                  4 years ago, # ^ |
                    Vote: I like it -10 Vote: I do not like it

                  "It's just that when you don't mention that you're talking about amortised complexity, the assumption is that it's worst case."

                  This is a widely misunderstood point about amortization, but it is actually completely independent from the concept of "worst case" or "average case". Classical analysis only considers complexity on a per-operation basis, whereas amortized complexity is about analyzing on a per-algorithm basis. There is a nice rigorous definition regarding potential functions here that shows that if you sum up the amortized complexity over a sequence of operations, it will necessarily be an upper bound for the standard complexity summed over that sequence of operations (note that "worst case" is never referenced in this definition).

                  I also don't think it's necessarily correct that people default to not talking about amortized analysis. I actually can't think of a scenario where the amortized complexity is better than non-amortized complexity and we don't default to amortized (std::vector::push_back, for example). Curious to hear if you have contrary examples, though. It's quite late for me so my recall could be bad.

              • »
                »
                »
                »
                »
                »
                »
                »
                5 years ago, # ^ |
                  Vote: I like it 0 Vote: I do not like it

What's your point? Why would we care specifically about worst case complexity in this discussion? It is worst case $$$O(\log N)$$$ to delete one element, but $$$O(1)$$$ amortized. This means if you were to delete all $$$N$$$ elements it'd be $$$O(N)$$$.

                • »
                  »
                  »
                  »
                  »
                  »
                  »
                  »
                  »
                  5 years ago, # ^ |
                    Vote: I like it -35 Vote: I do not like it

                  Why would we care specifically about worst case complexity in this discussion?

                  Because that's normal?

                • »
                  »
                  »
                  »
                  »
                  »
                  »
                  »
                  »
                  5 years ago, # ^ |
                    Vote: I like it 0 Vote: I do not like it

                  Still don't get your point. Is your claim that when mixed with other operations, deletion will no longer be amortized $$$O(1)$$$? If that's true that's interesting, but I don't think it's easy to tell without knowing the specific implementation.

                • »
                  »
                  »
                  »
                  »
                  »
                  »
                  »
                  »
                  5 years ago, # ^ |
                    Vote: I like it 0 Vote: I do not like it

Not a claim, just a hypothesis. I'm wondering about the real implementation and what implementations there could be.

        • »
          »
          »
          »
          »
          5 years ago, # ^ |
            Vote: I like it -22 Vote: I do not like it

No. Deletion of anything in a balanced binary tree requires rebalancing, which is $$$O(\log n)$$$. Whether it's through an iterator is irrelevant.

      • »
        »
        »
        »
        5 years ago, # ^ |
          Vote: I like it +4 Vote: I do not like it

I think pq is faster since it is cache-friendly. A pq is internally stored in contiguous memory (like a vector), while a multiset is not.

    • »
      »
      »
      5 years ago, # ^ |
      Rev. 2   Vote: I like it 0 Vote: I do not like it

      UPD : Deleted

»
5 years ago, # |
  Vote: I like it -53 Vote: I do not like it

The comments on this blog made me re-realize how much I still have to learn to become a Master. XD

  • »
    »
    5 years ago, # ^ |
      Vote: I like it +8 Vote: I do not like it

    This blog post, unfortunately, did not make me a master :(

  • »
    »
    5 years ago, # ^ |
      Vote: I like it +8 Vote: I do not like it

I'm pretty sure there are many masters like me who don't know how multisets or priority queues are implemented...

    • »
      »
      »
      5 years ago, # ^ |
        Vote: I like it -11 Vote: I do not like it

      Just kidding bro, I was just amazed by how much detailed knowledge some people have regarding their languages. :D

    • »
      »
      »
      5 years ago, # ^ |
        Vote: I like it +11 Vote: I do not like it

      grandmasters too

»
5 years ago, # |
  Vote: I like it -92 Vote: I do not like it

Multisets and priority queues are used only in Div. 1 E-F problems and top onsite competitions like ICPC WF, and are not needed to score well in CF rounds.

Personally I have never seen a reasonable problem that would require a priority queue during my CP career, and believe me, I consider myself an experienced competitive programmer.

  • »
    »
    5 years ago, # ^ |
      Vote: I like it +23 Vote: I do not like it

    Too young too simple...

  • »
    »
    5 years ago, # ^ |
      Vote: I like it +6 Vote: I do not like it

20C - Dijkstra? is a classic problem that needs to be solved by Dijkstra's algorithm with heap optimisation.

    • »
      »
      »
      5 years ago, # ^ |
        Vote: I like it -45 Vote: I do not like it

Hmm, that is still from the alpha era, when CF was new and Mike just copied tasks. There is no such problem in recent CF rounds (roughly the last 2 or 3 years).

      • »
        »
        »
        »
        5 years ago, # ^ |
          Vote: I like it +9 Vote: I do not like it

        What about 960B - Minimize the error? It's a Div. 1 + Div. 2 B with difficulty *1500.

        • »
          »
          »
          »
          »
          5 years ago, # ^ |
            Vote: I like it -12 Vote: I do not like it

It can be solved in $$$O(n \cdot (k_1 + k_2))$$$: $$$k_1$$$ times choose $$$i$$$ such that $$$a_i - b_i$$$ is largest and decrease $$$a_i$$$, then do the same for array $$$B$$$. I don't see how a priority queue or multiset can be utilized here.

          • »
            »
            »
            »
            »
            »
            5 years ago, # ^ |
              Vote: I like it +6 Vote: I do not like it

            See the tutorial please.

            "This can be implemented using a priority queue or by sorting the array C and iterating over it."

Although this problem has a solution which doesn't need a priority queue, it seems that the priority queue is well known by most competitive programmers, according to that sentence.

            • »
              »
              »
              »
              »
              »
              »
              5 years ago, # ^ |
                Vote: I like it -16 Vote: I do not like it

              Well, according to my comment above it's not well known. Or do you believe thousands of grey coders actually know multiset and priority_queue?

              • »
                »
                »
                »
                »
                »
                »
                »
                5 years ago, # ^ |
                  Vote: I like it 0 Vote: I do not like it

                "Well, according to my comment above it's not well known." Well, I say something according to the tutorial because the tutorial is reliable. Is what you said reliable?

                "Or do you believe thousands of grey coders actually know multiset and priority_queue?" Of course grey coders don't know them, because they're Newbies. Most of them are amateur competitive programmers, maybe not even competitive programmers. They don't know multiset and priority_queue, but do you think they can solve div. 2 C~D? "Newbies don't know multiset and priority_queue" can't prove that multiset and priority_queue aren't well known.

                • »
                  »
                  »
                  »
                  »
                  »
                  »
                  »
                  »
                  5 years ago, # ^ |
                    Vote: I like it +11 Vote: I do not like it

I will bet most grey coders at least know what it is, lol. They may not know when to use it well, but even many grey coders know fundamental things like that.

  • »
    »
    5 years ago, # ^ |
    Rev. 2   Vote: I like it +14 Vote: I do not like it

And 1314A - Recommendations is only a Div. 1 A using priority_queue. And it's from only four weeks ago.

  • »
    »
    5 years ago, # ^ |
      Vote: I like it +24 Vote: I do not like it

You just don't have the experience to say such things.

  • »
    »
    5 years ago, # ^ |
    Rev. 2   Vote: I like it +16 Vote: I do not like it

And 527C - Glass Carving is a typical multiset problem on Codeforces. It's only a Div. 2 C, which has a similar difficulty to a Div. 2 D today.

  • »
    »
    5 years ago, # ^ |
      Vote: I like it -8 Vote: I do not like it

    Bruh you are like if LanceTheDragonTrainer was a grey coder...

    • »
      »
      »
      5 years ago, # ^ |
        Vote: I like it 0 Vote: I do not like it

Lol. I couldn't understand your English. But what do you think my rating really is?

      • »
        »
        »
        »
        5 years ago, # ^ |
        Rev. 3   Vote: I like it 0 Vote: I do not like it

I assume quite good. You seem to know what you're talking about on a variety of CP topics, and most people (unlike the cyan I responded to) would need to be quite good to have the audacity to make bold statements, say stuff like "and believe me, I consider myself an experienced competitive programmer", or belittle people for having a low color. My guess would be IM or GM.

        • »
          »
          »
          »
          »
          5 years ago, # ^ |
            Vote: I like it +4 Vote: I do not like it

          Wow. You just made my day. If I ever return to Competitive Programming, I will train hard and share my knowledge with interested parties, to repay your praise.

  • »
    »
    4 years ago, # ^ |
      Vote: I like it -21 Vote: I do not like it

    Virgin with 2 inches tool: "size doesn't matter, most girls can't even tell the difference"

»
5 years ago, # |
  Vote: I like it +13 Vote: I do not like it

I don't use multiset, nor do I use priority_queue. I use set<pair<int,int>>, where I store the value in pair.first and the index in pair.second. It automatically becomes a multiset, plus it supports more operations, like searching for a specific element after a particular index, etc. :)

»
5 years ago, # |
  Vote: I like it 0 Vote: I do not like it

Also, a priority queue can be constructed in $$$O(n)$$$.

  • »
    »
    4 years ago, # ^ |
      Vote: I like it +11 Vote: I do not like it

What is the syntax for constructing the priority queue in $$$O(n)$$$ time? If I have $$$n$$$ elements initially and want to construct the priority queue, I have to add every element using push(), and each push takes $$$\log n$$$ time, so the overall complexity is $$$O(n \log n)$$$.

    • »
      »
      »
      4 years ago, # ^ |
        Vote: I like it +1 Vote: I do not like it

      http://www.cplusplus.com/reference/queue/priority_queue/priority_queue/

Use range construction as specified here and it will be linear. You can, say, create a vector v and pass v.begin(), v.end() to the priority queue constructor. In practice for CF problems, you won't really care about the $$$n \log n$$$ vs $$$n$$$ difference, except maybe in some really specific cases on harder problems where the TL is tight.