Suppose we are given two increasing arrays A and B. We need to find max((a[j] - a[i-1]) - (b[j] - b[i-1])), where j >= i.
Can we solve it in better than O(n^2) time complexity? If so, please give me a hint.
You are basically asking for $$$\max((a[j] - b[j]) - (a[i] - b[i]))$$$ where $$$i < j$$$ holds (substituting $$$i - 1 \to i$$$ in your formulation). Let $$$c_i = a_i - b_i$$$. Then it becomes $$$\max(c[j] - pref[j-1])$$$, where $$$pref[i]$$$ is the minimum element of $$$c[0 \dots i]$$$. Since the prefix minima can be maintained in a single pass, this can be solved in linear time.