It is an array with n elements, and d[i] = how many substrings have XOR value i. What is the optimal algorithm to build this dp? Here "substring" means a[i1], a[i2], ..., not contiguous (i.e. a subsequence).
What is the range of the elements?
10^5
At the beginning you mention substrings, but at the end you say "not contiguous".
By the way, if you're interested in contiguous substrings whose XOR equals a value X:
1) Create a trie with all XOR prefixes of values in array A.
2) Take advantage of the associativity of XOR: for each prefix, search the trie for a value whose XOR with the current prefix equals X.
Inserting all prefixes into the trie is O(N*L), with L the number of bits per value (at most about 32 here), so each insertion and query is effectively constant and the overall complexity is O(N*L) ≈ O(N).
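A minimal sketch of that trie approach, assuming values up to 10^5 (so 17 bits are enough); the `Trie` struct, the `BITS` constant and the example in `main` are my own illustration, not code from this thread:

```cpp
#include <bits/stdc++.h>
using namespace std;

const int BITS = 17;  // enough for values up to 10^5 (assumption)

// Binary trie over prefix XORs; each node counts how many inserted
// values pass through it, so exact-value counts can be read at the leaf.
struct Trie {
    vector<array<int,2>> nxt;
    vector<long long> cnt;
    Trie() : nxt(1, {-1, -1}), cnt(1, 0) {}

    void insert(int v) {
        int u = 0;
        for (int b = BITS - 1; b >= 0; --b) {
            int bit = (v >> b) & 1;
            if (nxt[u][bit] == -1) {
                nxt[u][bit] = (int)nxt.size();
                nxt.push_back({-1, -1});
                cnt.push_back(0);
            }
            u = nxt[u][bit];
            ++cnt[u];
        }
    }
    long long count(int v) const {   // how many inserted values equal v
        int u = 0;
        for (int b = BITS - 1; b >= 0; --b) {
            int bit = (v >> b) & 1;
            if (nxt[u][bit] == -1) return 0;
            u = nxt[u][bit];
        }
        return cnt[u];
    }
};

// Subarray (l, r] has XOR pref[r] ^ pref[l], so for each pref[r]
// we look up how many earlier prefixes equal pref[r] ^ X.
long long countSubarraysWithXor(const vector<int>& a, int X) {
    Trie t;
    t.insert(0);                     // empty prefix
    long long ans = 0;
    int pref = 0;
    for (int v : a) {
        pref ^= v;
        ans += t.count(pref ^ X);
        t.insert(pref);
    }
    return ans;
}

int main() {
    cout << countSubarraysWithXor({1, 2, 3}, 3) << '\n';  // [1,2] and [3] -> 2
}
```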
I know the solution for contiguous substrings, but I want one for non-contiguous. Anyway, thank you!
Take a dp[i][x] which stores the number of substrings starting at index i and having XOR value x;
iterate over the array from the end: dp[i][x] = dp[i+1][x ^ a[i]].
For the non-contiguous case, include or exclude each element: dp[i][x] = dp[i+1][x] + dp[i+1][x ^ a[i]].
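A minimal sketch of that include/exclude dp with a rolling array over XOR values (f[x] = number of subsequences of the processed prefix with XOR x); the example input is an assumption, and for real constraints the counts would have to be taken modulo a prime, since they grow like 2^N:

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    vector<int> a = {1, 2, 3};              // example input (assumption)

    // M = smallest power of two above max(a); XOR of values < M stays < M.
    int M = 1;
    for (int v : a) while (M <= v) M <<= 1;

    // f[x] = number of subsequences (including the empty one) with XOR x.
    vector<long long> f(M, 0);
    f[0] = 1;
    for (int v : a) {
        vector<long long> g(M);
        for (int x = 0; x < M; ++x)
            g[x] = f[x] + f[x ^ v];         // exclude a[i]  +  include a[i]
        f.swap(g);
    }
    f[0] -= 1;                              // drop the empty subsequence

    for (int x = 0; x < M; ++x)             // O(N * M) time, O(M) memory overall
        cout << "d[" << x << "] = " << f[x] << '\n';
}
```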
OK, this solution is O(n^2), but I want O(n log n) or something like that.
Not O(n^2), more like ~O(n * max_a), where max_a is the biggest element in the array. That complexity can be worse or better depending on the constraints. How big are the constraints on the elements and on N?
OK, but it isn't good enough. N <= 10^5 or something like that.
ignore
Can you give me a link to the problem?
There isn't a problem... I thought about this for some time and didn't find a good solution, so I published this post.
hah :D my case is similar link.
Use Gaussian Elimination to find a basis for the space spanned by the N vectors. Let R be its rank.
Add the vector i to the basis, and use Gauss again to check whether the rank increased. If it did, the answer is 0, since i is not in the space spanned by the elements of the original array.
If adding i didn't increase the rank, the answer is 2^(# of free variables); that is, 2^(N - R).
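A sketch of this basis idea, assuming values up to 10^5 (17 bits); the `XorBasis` struct, the modulus and the example input are illustrative choices of mine, and the count is reported modulo a prime because 2^(N - R) does not fit in 64 bits for large N:

```cpp
#include <bits/stdc++.h>
using namespace std;

const int BITS = 17;                  // enough for values up to 10^5 (assumption)
const long long MOD = 1'000'000'007;  // illustrative modulus (assumption)

// Linear basis over GF(2): b[i] holds the basis vector whose highest set bit is i.
struct XorBasis {
    array<int, BITS> b{};
    int rank = 0;

    void insert(int x) {
        for (int i = BITS - 1; i >= 0; --i) {
            if (!((x >> i) & 1)) continue;
            if (!b[i]) { b[i] = x; ++rank; return; }
            x ^= b[i];
        }
    }
    // Can x be written as a XOR of some basis vectors (i.e. of some a[i])?
    bool representable(int x) const {
        for (int i = BITS - 1; i >= 0; --i)
            if ((x >> i) & 1) {
                if (!b[i]) return false;
                x ^= b[i];
            }
        return x == 0;
    }
};

long long pw(long long base, long long e, long long mod) {
    long long r = 1 % mod;
    for (base %= mod; e; e >>= 1, base = base * base % mod)
        if (e & 1) r = r * base % mod;
    return r;
}

int main() {
    vector<int> a = {1, 2, 3};        // example input (assumption)
    XorBasis basis;
    for (int v : a) basis.insert(v);  // R = basis.rank after all insertions

    int x = 3;                        // query: how many subsequences XOR to x?
    // If x is not in the span, the answer is 0; otherwise it is 2^(N - R)
    // (for x == 0 this count includes the empty subsequence).
    long long ans = basis.representable(x)
                        ? pw(2, (long long)a.size() - basis.rank, MOD)
                        : 0;
    cout << ans << '\n';
}
```

For a = {1, 2, 3} and x = 3 this prints 2, matching the dp above (the subsequences {3} and {1, 2}).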