This might be trivial for most users, but if you are like me and sometimes get confused about when to use mid+1, lef+1, etc., this may help.
To find the index of the rightmost $$$1$$$ in a monotonically decreasing function, for example $$$a=[1,1,1,1,0,0,0]$$$
Define check(i) to return true when a[i] is $$$1$$$, and false otherwise.
int lef = 0, rig = n;
while (lef < rig) {
    int mid = (lef + rig) / 2;
    if (check(mid)) {
        lef = mid + 1;   // mid is valid: the answer is at mid or to the right
    } else {
        rig = mid;       // mid is invalid: the answer is strictly to the left
    }
}
int ans = lef - 1;       // last valid index (or -1 if no element is valid)
Notice that the mid in lef = mid+1 is always valid, so the answer (lef-1) is the last valid mid. Also notice the initial values of lef and rig: rig is set out of bounds (to n), so the loop still works when every element is valid; if no element is valid, lef stays at $$$0$$$ and the answer becomes $$$-1$$$.
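As a quick sanity check, here is a minimal self-contained sketch of this template run on the example array above (the hard-coded array and the main driver are only for illustration, not part of the template):
#include <bits/stdc++.h>
using namespace std;

int a[] = {1, 1, 1, 1, 0, 0, 0};
int n = 7;

bool check(int i) { return a[i] == 1; }

int main() {
    int lef = 0, rig = n;
    while (lef < rig) {
        int mid = (lef + rig) / 2;
        if (check(mid)) lef = mid + 1;
        else rig = mid;
    }
    cout << lef - 1 << "\n";   // prints 3, the index of the rightmost 1
    return 0;
}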
To find the index of the leftmost $$$1$$$ in a monotonically increasing function, for example $$$a=[0,0,0,0,1,1,1]$$$
In a similar way:
int lef = -1, rig = n - 1;
while (lef < rig) {
    int mid = (lef + rig + 1) / 2;   // round up so the loop always makes progress
    if (check(mid)) {
        rig = mid - 1;   // mid is valid: the leftmost valid index is at most mid
    } else {
        lef = mid;       // mid is invalid: the leftmost valid index is after mid
    }
}
int ans = rig + 1;       // first valid index (or n if no element is valid)
Notice the ceiling (the +1) in the definition of mid. This is needed so the loop always makes progress: when rig - lef equals one, rounding down could give mid = lef, and the branch lef = mid would then loop forever. If the array a can be computed explicitly, you can instead do
int ans = lower_bound(a,a+n,1)-a; // get index of leftmost 1
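For instance, a small illustrative sketch (the hard-coded array and driver are assumptions for this example) showing that the loop above and lower_bound agree on the sample array:
#include <bits/stdc++.h>
using namespace std;

int a[] = {0, 0, 0, 0, 1, 1, 1};
int n = 7;

bool check(int i) { return a[i] == 1; }

int main() {
    int lef = -1, rig = n - 1;
    while (lef < rig) {
        int mid = (lef + rig + 1) / 2;
        if (check(mid)) rig = mid - 1;
        else lef = mid;
    }
    cout << rig + 1 << "\n";                       // prints 4
    cout << lower_bound(a, a + n, 1) - a << "\n";  // also prints 4
    return 0;
}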
You can then solve most problems by implementing your own definition of check(). The overall time complexity is the time complexity of your check() function times $$$\log N$$$. Thanks to marvenlee for inspiring this blog.
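As an illustration of plugging in your own check(), here is a hedged sketch on a made-up toy problem, not from this blog: finding the largest x with x*x <= k using the rightmost-valid template above (the value of k and the upper bound 2000001 are arbitrary choices for this example):
#include <bits/stdc++.h>
using namespace std;

long long k;   // we want the largest x with x*x <= k

bool check(long long x) { return x * x <= k; }

int main() {
    k = 1000000000000LL;                // example value, chosen arbitrarily
    long long lef = 0, rig = 2000001;   // rig is one past any possible answer
    while (lef < rig) {
        long long mid = (lef + rig) / 2;
        if (check(mid)) lef = mid + 1;
        else rig = mid;
    }
    cout << lef - 1 << "\n";   // prints 1000000
    return 0;
}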
Usage examples:
There are many other ways of implementing binary search too, most notably the one in pllk's blog.