minimario's blog

By minimario, history, 7 years ago, In English

Hello again :)

Recently I was solving a problem with integers a[i] that involved two operations: a[i] //= 2 and a[i] -= 1. And I read that it is equivalent, when performing these operations, to treat a[i] as a double, replacing a[i] //= 2 with a[i] /= 2, and then taking the floor only at the end.

This seems almost obvious, but it is nevertheless very surprising to me. Can someone prove it?
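Before asking, I tried checking the claim empirically. Below is a quick sketch (my own, not a known solution): apply a random sequence of the two operations to an integer, mirror them with exact rational division (`Fraction` avoids any floating-point concerns), and compare the floor of the real-valued result with the integer result.

```python
# Empirical check: a //= 2 and a -= 1 on an integer vs. exact real
# division b /= 2 and b -= 1, with a single floor at the very end.
import random
from fractions import Fraction
from math import floor

def check(start, ops):
    a = start              # integer version
    b = Fraction(start)    # exact "real" version
    for op in ops:
        if op == "div":
            a //= 2
            b /= 2
        else:              # "sub"
            a -= 1
            b -= 1
    return a == floor(b)   # the claim: flooring at the end matches

random.seed(0)
for _ in range(1000):
    start = random.randint(0, 10**9)
    ops = [random.choice(["div", "sub"]) for _ in range(30)]
    assert check(start, ops)
```

Every random trial passes, which at least suggests the equivalence is real.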


»
7 years ago, # |

Clearly, floor division can only produce a smaller (or equal) final value. Let's call the value we get after some finite (natural) number of real divisions A, and the value we get after the same number of floor divisions B. Note that we can ignore the subtract-1 operations for A and B.

Clearly, A ≥ B. Our task is to prove that A − B < 1 for any finite number of operations we perform (so that ⌊A⌋ = B).

We will prove this by induction. Base case: one division. If the initial number is even, then A = B; otherwise, A − B = 0.5. So the base case clearly holds.

Now, suppose the claim is true after some number of operations, and we apply one more division. Before we apply it, define d = A − B, the difference; by the hypothesis, d < 1. Once we divide, the difference d first gets halved, and then it is either increased by 0.5 if B was odd, or not increased at all.

If it is not increased at all, then the difference obviously stays smaller than 1. If it is increased, then note:

initially, d < 1; after halving, d < 0.5; after adding 0.5, d < 1. So if the claim holds after some number of divisions, it still holds after one more.

I feel like I might have overshot and missed a really simple proof but whatever.
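The induction step above can also be checked mechanically. Here is a small sketch (my own illustration, using 7 as an arbitrary starting value): track A (real division) and B (floor division) together, and verify both the claimed update rule d → d/2 + 0.5·[B odd] and the invariant d < 1.

```python
# Verify the induction step: after each division, the new difference
# equals d/2, plus 0.5 exactly when B was odd, and stays below 1.
from fractions import Fraction

A = Fraction(7)   # real-division value
B = 7             # floor-division value
for _ in range(10):
    bump = Fraction(1, 2) if B % 2 else 0   # +0.5 only when B is odd
    d = A - B
    A /= 2
    B //= 2
    assert A - B == d / 2 + bump            # the claimed update rule
    assert A - B < 1                        # the invariant
```

For example, starting from 7: after one division A = 3.5, B = 3 (d = 0.5); after another, A = 1.75, B = 1 (d = 0.75 = 0.25 + 0.5), exactly as the rule predicts.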

  • »
    »
    7 years ago, # ^ |

    I don't see why you can ignore the subtraction of 1 in your proof, but radoslav11 has come up with an informal proof, which with some care can probably be formalised.

    Consider a number X = Y + Z, where Y is even and Z is 0 or 1. Now, when we do X //= 2, we get X//2 = Y/2. When we do X /= 2, we get X/2 = Y/2 + Z/2. Their difference is Z/2 (either 0 or 1/2).

    Now, we can subtract K from both, and we get X//2 − K = C on the floor side, while X/2 − K = C + Z/2 on the real side (to be precise, C = Y/2 − K, but who cares). Now we do a similar thing: split C into D + E, where D is even and E is 0 or 1, so the real side is D + E + Z/2. When we compare /2 with //2 again, the difference becomes Z/4 + E/2.

    And if you keep going, the difference has the form e1/2 + e2/4 + e3/8 + ..., where each e_i is 0 or 1, which is always less than 1, so taking the floor is okay.
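    This series view can be tested directly. Below is a sketch (my own, not from the thread): record the parity bit "lost" at each floor division, then check that the real-vs-floor difference equals exactly the sum of those bits weighted by powers of 1/2, which is strictly below 1.

    ```python
    # The difference between real and floor division equals the series of
    # lost parity bits: the bit from division j is halved by every later
    # division, so bit j (0-indexed, m divisions total) weighs 1/2^(m-j).
    import random
    from fractions import Fraction

    random.seed(1)
    x = random.randint(0, 10**6)
    ops = [random.choice(["div", "sub"]) for _ in range(20)]

    a = x
    b = Fraction(x)
    bits = []                      # parity lost at each floor division
    for op in ops:
        if op == "div":
            bits.append(a % 2)
            a //= 2
            b /= 2
        else:
            a -= 1
            b -= 1

    m = len(bits)
    series = sum(Fraction(bit, 2 ** (m - i)) for i, bit in enumerate(bits))
    assert b - a == series         # difference is exactly this series
    assert series < 1              # so flooring at the end is safe
    ```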

    • »
      »
      »
      7 years ago, # ^ |

      You can ignore the subtraction of 1 because ⌊A⌋ = B implies ⌊A − 1⌋ = B − 1: since B is an integer, subtracting 1 from both values just shifts everything down by exactly 1.