Suppose we are given two increasing arrays A and B. We need to find max((a[j] - a[i-1]) - (b[j] - b[i-1])), where j >= i.
Can we solve it in better than O(n^2) time complexity? If so, please give me a hint.
You are basically asking for $$$\max((a[j] - b[j]) - (a[i] - b[i]))$$$ where $$$i < j$$$ (just rename the index $$$i-1$$$ to $$$i$$$, so $$$j \ge i$$$ becomes $$$i < j$$$). Let $$$c[i] = a[i] - b[i]$$$. Then the answer is $$$\max(c[j] - pref[j - 1])$$$, where $$$pref[i]$$$ is the minimum element of $$$c[0 \dots i]$$$. This can be solved in linear time.