_Muhammad's blog

By _Muhammad, history, 6 years ago, In English

In this problem I am trying to find the maximum X1 + X2. Let X be the XOR of all given numbers. I treated each number as a vector over GF(2) and then found a basis. Let's see an example:

5 = 1 0 1

6 = 1 1 0

7 = 1 1 1

After Gaussian elimination, the basis vectors are:

1 0 0

0 1 0

0 0 1
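
For reference, here is a minimal sketch of how such a basis can be built (this is not my linked implementation; the names and the bit width are placeholders). It performs full Gaussian elimination, so inserting 5, 6 and 7 ends up with exactly the three unit vectors shown above.

```cpp
#include <bits/stdc++.h>
using namespace std;

const int BITS = 30;          // assumed bit width; adjust to the problem's limits
long long basis[BITS];        // basis[b] is the reduced basis vector whose pivot bit is b, or 0

// Insert x into the basis, keeping it fully reduced: every pivot bit
// appears in exactly one basis vector.
void insertVector(long long x) {
    for (int b = BITS - 1; b >= 0 && x; --b) {
        if (((x >> b) & 1) == 0) continue;         // pivot candidate is the highest set bit of x
        if (basis[b]) { x ^= basis[b]; continue; } // bit b already has a pivot: eliminate it from x
        for (int j = b - 1; j >= 0; --j)           // clear lower pivot bits of the new vector
            if (((x >> j) & 1) && basis[j]) x ^= basis[j];
        for (int j = b + 1; j < BITS; ++j)         // clear bit b from the higher basis vectors
            if ((basis[j] >> b) & 1) basis[j] ^= x;
        basis[b] = x;
        return;
    }
}

int main() {
    for (long long v : {5LL, 6LL, 7LL}) insertVector(v);
    // prints 4 2 1, i.e. 100, 010, 001 as in the example above
    cout << basis[2] << ' ' << basis[1] << ' ' << basis[0] << '\n';
}
```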

Then I iterated from the most significant bit: if the i-th bit of X is 0, I tried to set that bit to 1 in X1. If that was not possible, I skipped the bit and moved on to the next one. But this process is not giving me the maximum X1 + X2. Here is the implementation. Can anyone tell me what is wrong here? Is my logic wrong, or is the implementation not correct?
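
In code, the greedy step I am describing looks roughly like this (a simplified sketch, not the exact linked implementation; names are placeholders):

```cpp
// Greedy step as described above. `basis` is the reduced XOR basis from the
// previous sketch, `bits` its width, and X is the XOR of all the numbers.
long long buildX1(const long long basis[], int bits, long long X) {
    long long X1 = 0;
    for (int b = bits - 1; b >= 0; --b) {
        if ((X >> b) & 1) continue;               // bit b of X is 1: skipped, it goes to exactly one of X1, X2
        if (((X1 >> b) & 1) == 0 && basis[b])     // bit b of X is 0: try to put 1 in X1
            X1 ^= basis[b];
        // otherwise it is not possible, so the bit stays 0 and I move on
    }
    return X1;                                    // then X2 = X1 ^ X, and the sum is X1 + X2
}
```

On the small example above (X = 5 ^ 6 ^ 7 = 4), this produces X1 = 3 and X2 = 7.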

Please help me, I am stuck here.
