Problem (from Luogu P3175): Initially, you hold the number $$$0$$$. Every second, you randomly select a number from the range $$$[0, 2^n - 1]$$$ and bitwise-OR it (`|` in C/C++, `or` in Pascal) into the number in your hand. Number $$$i$$$ is selected with probability $$$p_i$$$; it is guaranteed that $$$0 \leq p_i \leq 1$$$ and $$$\sum p_i = 1$$$. Find the expected number of seconds until the number in your hand becomes $$$2^n - 1$$$.
Two ways to calculate the expected minimum time until some element of a set $$$S$$$ is first chosen:
The first one is right, and the second one is wrong. With $$$n=2$$$, $$$p=[0.902611887323, 0.012000014846, 0.032707823145, 0.052680274686]$$$, the correct answer is about $$$16.9037009548$$$, but the second algorithm gives $$$20.253655892945$$$. The first one gets AC on P3175, but the second one doesn't. Why?