This submission gives MLE for using vectors of size 5e6. Why is that?
I am using binary search with lazy propagation, so my algorithm runs in $$$O(q \cdot \log(n) \cdot \log(n))$$$. Why should this be a problem?
You are using memory for 11,000,000 32-bit integers.
Each int occupies 4 bytes, so you are using at least 44,000,000 bytes.
44,000,000 bytes / 1024 ≈ 42,969 KB
42,969 KB / 1024 ≈ 41.96 MB
Notice the special memory constraint for this problem (28 MB).
An int takes 4 bytes of memory, so you are using ((5e6 + 5) * 2 + 1e6 + 5) * 4 B = 44,000,060 B ≈ 41.96 MB of memory.
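Written out in one line (assuming, as the numbers above suggest, two vectors of $$$5 \cdot 10^6 + 5$$$ ints and one of $$$10^6 + 5$$$ ints):

$$$\left(2 \cdot (5 \cdot 10^6 + 5) + 10^6 + 5\right) \cdot 4 \text{ B} = 44\,000\,060 \text{ B} \approx 41.96 \text{ MB} > 28 \text{ MB}$$$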
How can one solve this problem using a segment tree, then?
1) Not sure why you have lazy propagation in your code. I believe a normal sum segment tree is sufficient to solve the problem, and if you can solve it without lazy propagation you cut a lot of memory. You can point-update a position when inserting an element (by adding 1 to that position), then query the range from 1 to the value you're binary searching as a range sum (see the sketch after this list).
2) I believe the segment tree array does not need to be of size 5 million; 4 million + 10 (or so) should be large enough, since the tree array is usually taken to be 4 times the size of the underlying array.
3) Someone correct me if I'm wrong, but I think a vector uses more memory than a plain array. I think it's about 2 times more, but I'm not sure.
4) Not really answering your question, but if all of the above fail, you can consider the following options:
4a) There is another type of array segment tree that uses only 2N integers. It is described here: https://codeforces.me/blog/entry/18051
4b) You can use a Fenwick tree, which uses exactly N integers.
I'm not sure how many of these optimizations you have to make for it to work.
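To make 1) and 4b) concrete, here is a minimal sketch of that approach, assuming values fit in $$$[1, 10^6]$$$ and that the query can be phrased as "count elements $$$\le x$$$" plus an outer binary search. The names and the example calls in main are illustrative, not the author's code.

```cpp
#include <bits/stdc++.h>
using namespace std;

const int N = 1000000;   // value range; (N + 1) ints is ~4 MB, well under 28 MB
int bit[N + 1];          // Fenwick (BIT) array, 1-based

// point update: add delta at position pos
void update(int pos, int delta) {
    for (; pos <= N; pos += pos & -pos) bit[pos] += delta;
}

// prefix sum: number of inserted elements with value <= pos
int query(int pos) {
    int res = 0;
    for (; pos > 0; pos -= pos & -pos) res += bit[pos];
    return res;
}

int main() {
    update(5, 1);        // "insert" value 5
    update(7, 1);        // "insert" value 7

    // binary search for the smallest x with at least 2 elements <= x
    // (i.e. the 2nd smallest inserted value), O(log^2 N) per query
    int lo = 1, hi = N;
    while (lo < hi) {
        int mid = (lo + hi) / 2;
        if (query(mid) >= 2) hi = mid;
        else lo = mid + 1;
    }
    cout << lo << '\n';  // prints 7
}
```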
I think the issue with vector using twice as much memory shows up when you keep calling push_back: whenever it runs out of capacity, it allocates a new buffer that is larger by a factor of 2 or so.
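A small sketch of that behaviour (the growth factor is implementation-defined; doubling is what libstdc++ does, so treat the exact numbers as an assumption):

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    // push_back grows the buffer geometrically, so right after a
    // reallocation the vector can hold up to ~2x more than it contains
    vector<int> v;
    size_t last = 0;
    for (int i = 0; i < 5000000; ++i) {
        v.push_back(i);
        if (v.capacity() != last) {
            last = v.capacity();
            cout << "size " << v.size() << " -> capacity " << last << '\n';
        }
    }

    // constructing with the final size (or calling reserve) allocates
    // once, typically exactly the amount requested
    vector<int> w(5000000);
    cout << "w: size " << w.size() << ", capacity " << w.capacity() << '\n';
}
```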
After seeing so many people write that you should use $$$4 \cdot 10^6$$$ integers for a segment tree in this problem (the main reason I could find for this phenomenon is cp-algorithms' segment tree using an array of size $$$4 \cdot N$$$), I felt the need to write this.
Let $$$M$$$ be the smallest power of $$$2$$$ greater than $$$N$$$. Then, if you want the $$$O(n)$$$ initialization with one for loop,
for (int i = M - 1; i >= 1; --i) seg[i] = seg[i * 2] + seg[i * 2 + 1];
you need an array of size $$$2 \cdot M$$$. In the worst case that is $$$4 \cdot N$$$, but if $$$N = 10^5$$$ or $$$2 \cdot 10^5$$$ then $$$M$$$ is about $$$1.3 \cdot N$$$, and if $$$N = 5 \cdot 10^5$$$ or $$$10^6$$$ then $$$M$$$ is about $$$1.05 \cdot N$$$, so in most problems the amount of memory you need isn't much bigger than $$$2 \cdot N$$$. If you don't need the $$$O(n)$$$ initialization with one for loop, an array of size $$$M + N$$$ is enough (which is $$$3 \cdot N$$$ in the worst case, but, as shown above, usually much less).
In my implementation of a segment tree, $$$M$$$ is needed for both update and query, so unless there is some segment tree implementation I don't know of that avoids $$$M$$$ and doesn't use $$$2 \cdot N$$$ memory, I see no reason to use $$$4 \cdot 10^6$$$ integers in a segment tree with $$$N = 10^6$$$.
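For reference, a minimal sketch of exactly that layout, assuming a plain sum tree over $$$N = 10^6$$$ zero-initialized values: $$$M$$$ is the smallest power of two at least $$$N$$$, the leaves live at seg[M..M+N-1], the whole tree is $$$2 \cdot M$$$ ints (about 8.4 MB here), and both update and query use $$$M$$$. The names and the toy calls in main are illustrative, not taken from any real submission.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n = 1000000;
    int M = 1;
    while (M < n) M <<= 1;            // M = 2^20 here, about 1.05 * n
    vector<int> seg(2 * M, 0);        // 2 * M ints, ~8.4 MB

    // O(n) build with one for loop; fill seg[M + i] with a[i] first
    // if there is an initial array (here everything starts at 0)
    for (int i = M - 1; i >= 1; --i) seg[i] = seg[2 * i] + seg[2 * i + 1];

    // point update: a[pos] += delta (0-indexed)
    auto update = [&](int pos, int delta) {
        for (int i = M + pos; i >= 1; i >>= 1) seg[i] += delta;
    };

    // range sum over a[l..r], inclusive, 0-indexed
    auto query = [&](int l, int r) {
        long long res = 0;
        for (l += M, r += M + 1; l < r; l >>= 1, r >>= 1) {
            if (l & 1) res += seg[l++];
            if (r & 1) res += seg[--r];
        }
        return res;
    };

    update(3, 1);
    update(10, 1);
    cout << query(0, 5) << '\n';      // prints 1 (only a[3] is in range)
}
```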
What is the point of problems like this, which force you to optimize such things even when the logic is correct but the way of solving it is different?
I am very curious too, can awoo tell us?
Well, the intended solution is some obscure binary search with a single array of size $$$N$$$ (ask BledDest for details; I got AC with a segtree myself in the contest this round was based on). So it's kinda fair not to let suboptimal implementations of some data structures pass. I believe the memory limit was set like that to prevent pbds solutions ¯\_(ツ)_/¯
fivedemands, bruh.
What is the space complexity of a pbds solution? I got MLE with a pbds solution.
idk, I think it's linear but the constant is too high