tl;dr version of the problem: you have a set of numbers and you need to find a subset whose sum lies in the range [l, u], given that u - l is greater than or equal to the difference between the maximum and minimum elements of the set.
Why does the two pointer method work here? Sort the numbers, keep adding elements until the sum goes over the upper bound, then remove elements until the sum fits again (if a solution exists, this always works). Why? What's the logic behind it?
Suppose there exists a subset of size x whose sum is in the range [l, u]. If no such x exists, the answer is -1.
Now consider the sorted list of elements, A. The window of the x smallest elements has sum at most u (it can't exceed the sum of the valid subset), and the window of the x largest elements has sum at least l. Sliding a size-x window one step to the right swaps an element for a larger one, so the sum increases by at most max - min <= u - l per step. Since the sum starts at a value <= u and ends at a value >= l, and never jumps by more than u - l, at some position it must land inside [l, u]. So it is enough to check contiguous windows of the sorted array, which is exactly what the two pointer method does.
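Here is a minimal sketch of that sliding-window idea, assuming positive integer weights, l >= 1, and the guarantee u - l >= max - min from the statement. The function name findSubset and the small main() demo are illustrative only, not the official IOI grader interface.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Returns original indices of a subset of w whose sum lies in [l, u],
// or an empty vector if no such subset exists.
vector<int> findSubset(long long l, long long u, const vector<long long>& w) {
    int n = w.size();
    vector<int> idx(n);
    iota(idx.begin(), idx.end(), 0);
    // sort indices by weight so the kept elements form a window of the sorted order
    sort(idx.begin(), idx.end(), [&](int a, int b) { return w[a] < w[b]; });

    long long sum = 0;
    int left = 0;
    for (int right = 0; right < n; ++right) {
        sum += w[idx[right]];                   // take the next smallest unused element
        while (sum > u) sum -= w[idx[left++]];  // too heavy: drop the smallest taken ones
        if (left <= right && sum >= l)          // sum <= u already holds here
            return vector<int>(idx.begin() + left, idx.begin() + right + 1);
    }
    return {};  // no window of any size fits, so no subset does
}

int main() {
    // usage example: u - l = 2 >= max - min = 2, so the guarantee holds
    vector<long long> w = {6, 8, 8, 7};
    vector<int> ans = findSubset(15, 17, w);
    for (int i : ans) cout << w[i] << ' ';  // e.g. prints "7 8"
    cout << '\n';
}
```

By the window argument above, if any valid subset exists then some contiguous window of the sorted array is valid, and (because the weights are positive) the left pointer never skips past such a window before the right pointer reaches its end, so the loop always finds one.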
problem link ?!
Click
Thanks
Protip: you can find all IOI 2016 solutions at http://ioinformatics.org/locations/ioi16/contest/index.shtml