I recently found that the following code generates wrong output.
#include <bits/stdc++.h>

const int N = 105;
std::bitset<N> ok[N][N];
int n, m;

int main() {
    std::cin >> n >> m;
    // Read m intervals and mark positions l..r in ok[l][r].
    for (int i = 1; i <= m; i++) {
        int l, r;
        std::cin >> l >> r;
        for (int j = l; j <= r; j++)
            ok[l][r].set(j);
    }
    // DP: afterwards ok[i][j] is the union of all input intervals
    // [l, r] with i <= l and r <= j.
    for (int i = n; i; i--)
        for (int j = i; j <= n; j++) {
            ok[i][j] = ok[i][j] | ok[i + 1][j] | ok[i][j - 1];
        }
    std::cerr << "! " << ok[2][5][2] << '\n';
    return 0;
}
Compiled with -O3 -mtune=skylake -march=skylake, the code outputs ! 0 when given the following input:
5 5
1 3
2 2
3 3
1 5
3 5
However, if you simulate the code by hand, you will see that the correct answer should be ! 1.
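To make this easy to verify independently of std::bitset, here is a minimal scalar sketch of the same recurrence (my own reference code, not part of the original reproducer; each bitset is expanded into a plain bool dimension, so no vectorized bitset code is involved):

#include <iostream>

const int N = 105;
bool ok[N][N][N]; // ok[i][j][k] mirrors bit k of the bitset ok[i][j]

int main() {
    int n, m;
    std::cin >> n >> m;
    for (int q = 0; q < m; q++) {
        int l, r;
        std::cin >> l >> r;
        for (int j = l; j <= r; j++)
            ok[l][r][j] = true;
    }
    // Same recurrence as the bitset version, one bit at a time.
    for (int i = n; i; i--)
        for (int j = i; j <= n; j++)
            for (int k = 1; k <= n; k++)
                ok[i][j][k] = ok[i][j][k] | ok[i + 1][j][k] | ok[i][j - 1][k];
    std::cerr << "! " << ok[2][5][2] << '\n';
    return 0;
}

On the input above, the intervals [2, 2], [3, 3], and [3, 5] all satisfy l >= 2 and r <= 5, so ok[2][5] covers positions {2, 3, 4, 5}; in particular position 2 is set, giving ! 1.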
Note that the compiler seems to generate a wrong SSE instruction.
Again, I believe this code is UB-free and does not rely on anything implementation-defined.
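If the vectorizer is indeed at fault, a semantically equivalent rewrite of the update that might sidestep it (a sketch on my part; I have not inspected the assembly this variant produces) is to split the three-way OR into two compound assignments:

for (int i = n; i; i--)
    for (int j = i; j <= n; j++) {
        // OR is commutative and idempotent, so two |= steps
        // compute the same ok[i][j] as the single expression.
        ok[i][j] |= ok[i + 1][j];
        ok[i][j] |= ok[i][j - 1];
    }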