### Binary search, differently ordered

Shin shared a variation of the binary search problem with me yesterday, which was later discovered to be a variation of an exercise from van Gasteren and Feijen's note *The Binary Search Revisited*, cited by Shin's blog post *A Survey of Binary Search*. The problem: if an array `a_{0}, a_{1}, ..., a_{n-1}` is rotated from an ordered array `a_{i+1}, a_{i+2}, ..., a_{n-1}, a_{0}, a_{1}, ..., a_{i}` for some `i` (so a "dip" is present in the array, namely the difference between `a_{i}` and `a_{i+1}`), can we still perform binary search on it? Here I try to record the path of thought I've taken.

As described by Shin, the general binary search algorithm assumes that an invariant `Φ(i, j)` holds initially for the entire array, i.e., `Φ(0, n-1)` is true, and then the loop pulls `i` and `j` closer and closer, until `j = i + 1`. In the case of standard binary search, `Φ(i, j) := a_{i} ≤ k < a_{j}`, where `k` is the key value to be searched for. Note that `Φ(0, n-1)` must hold initially but the key may well not be in the array. To fix this, we add -∞ and ∞ to both ends of the array. After the loop, we have `a_{i}` equal to `k` if and only if `k` is present in the array.
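As a concrete sketch of this scheme (my own Python rendering, not code from either post; the ±∞ sentinels are kept virtual as indices `-1` and `n` rather than actually added to the array):

```python
def search(a, k):
    """Binary search maintaining the invariant a[i] <= k < a[j],
    where a[-1] = -inf and a[n] = +inf are virtual sentinels."""
    i, j = -1, len(a)          # the invariant holds thanks to the sentinels
    while j - i > 1:
        m = (i + j) // 2       # i < m < j, so the range strictly shrinks
        if a[m] <= k:
            i = m              # a[i] <= k is preserved
        else:
            j = m              # k < a[j] is preserved
    return i >= 0 and a[i] == k
```

After the loop `j = i + 1`, and the key is present exactly when `a[i] = k` (with `i = -1` meaning the key is below every element).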

Since it was hinted that binary search may be applicable, I looked at the original binary search and tried to find similarities, hoping to discover a suitable way of generalisation. In the case `a_{0} < a_{n-1}`, we know `k` is bounded by `a_{0}` and `a_{n-1}`, i.e., `k` is in the interval `[a_{0}, a_{n-1}]`. For the other case `a_{0} ≥ a_{n-1}`, we know `k` can only be no smaller than `a_{0}` or no larger than `a_{n-1}`, i.e., `k` is in `[a_{0}, ∞) ∪ (-∞, a_{n-1}]`. Also observe that performing the original binary search naively is not right. For example, when the median value `a_{m}` is less than `k`, it does not necessarily mean we should assign `m` to `i` --- consider the case when `k` is located at the left of the dip and `a_{m}` at the right. The way of comparison seems a lot more complicated.
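To see concretely why this goes wrong, here is a small illustration (my own, in Python): the standard binary search loop, applied verbatim to a rotated array, discards the half containing the key.

```python
def naive_search(a, k):
    # Standard binary search with virtual sentinels, applied as-is
    # to a possibly rotated array.
    i, j = -1, len(a)
    while j - i > 1:
        m = (i + j) // 2
        if a[m] <= k:
            i = m
        else:
            j = m
    return i >= 0 and a[i] == k

# [4, 5, 1, 2, 3] is [1, 2, 3, 4, 5] rotated; the key 5 lies left of
# the dip. The first median a[2] = 1 is below 5, so the search moves i
# past index 1 and never looks at the left segment again.
print(naive_search([4, 5, 1, 2, 3], 5))   # False, although 5 is present
```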

And then it occurred to me that it would be great if `[a_{0}, ∞) ∪ (-∞, a_{n-1}]` can be viewed in the same way as an ordinary interval, so we may still intuitively see the interval shrink as the loop progresses, like the standard binary search. This required gluing the two sides of the real line to an added point ∞, so a circle is formed. The interval in question is then contiguous on the circle, containing the added point ∞. I was introduced to this concept in the undergraduate algebra course taken in my fourth year: it is called the real projective line, and amounts to the one-point compactification of the real line. The odd comparison rules all suddenly make sense under this view. While in general numbers on the real projective line do not have a natural ordering, in our case we can say informally that the magnitude of a number `x` is the minimum distance we travel counterclockwise on the circle from `a_{0}` to `x`, and that a number is smaller if its magnitude is smaller. This essentially cuts the real projective line at `a_{0}` and forms a closed ray roughly like `[a_{0}, a_{0}^{-})`, `a_{0}^{-}` serving as the new infinity. Walking from `a_{0}` towards the infinity on `[a_{0}, a_{0}^{-})` is equivalent to walking counterclockwise on the real projective line from `a_{0}` and never reaching it again. We can say that the value domain is also rotated: rotating the array indices disrupts orderedness of the array, but we can rotate the value domain correspondingly to make the array ordered again. (I wish I could make this statement more topological! I believe it's something related to the torus.) Comparison under this ordering is simple: if two numbers are on the same side of the dip (which can be determined by comparing them with `a_{0}^{-}`), then perform the usual comparison; otherwise, whichever is on the right side of the dip is larger. The binary search algorithm doesn't have to be altered except for changing the way of comparison. We still have to insert a guard `a_{0}^{-}` as the rightmost element of the array, but there is only one guard instead of two. Notice that this works for binary search on an ordinarily ordered array as well: values smaller than `a_{0}` are greater than all elements in the array under the new ordering, so searching for a value too small simply moves `i` towards `n`, and -∞ is not needed to guard the left end of the array.
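A minimal sketch of the resulting one-pass algorithm (my own Python rendering, assuming a non-empty array): the rotated ordering is encoded as a key function sending each value to a pair, and the guard `a_{0}^{-}` stays virtual at index `n`.

```python
def rotated_search(a, k):
    """One-pass binary search on a rotated sorted, non-empty array."""
    a0 = a[0]

    def key(x):
        # Rotate the value domain: values >= a[0] (left of the dip)
        # come first, values < a[0] (right of the dip) come after;
        # within one side the ordinary order applies.
        return (0, x) if x >= a0 else (1, x)

    # key(a[0]) is the global minimum of the ordering, so the invariant
    # key(a[i]) <= key(k) < key(a[j]) holds with i = 0 and a single
    # virtual guard a_0^- (acting as the top element) at index j = n.
    i, j = 0, len(a)
    while j - i > 1:
        m = (i + j) // 2
        if key(a[m]) <= key(k):
            i = m
        else:
            j = m
    return a[i] == k
```

On an ordinarily ordered array the same code works unchanged: a key smaller than `a[0]` gets the largest possible key value, so `i` simply drifts towards `n`.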

This ordering is my final version, though, which means there were some other versions. For example, I had used an ordering depending on both `a_{0}` and `a_{n-1}`, treating all values from `a_{n-1}` to `a_{0}` (both ends exclusive) as -∞. This resulted in much more complicated case analysis in the definition of the ordering. I had even mistakenly regarded the sign as essential for the comparison, not noticing that topologically 0 does not have a special role on the real projective line. This mistake made me temporarily think that modelling the situation with the real projective line was flawed. But later I discovered there is a cleaner and correct way to utilise the real projective line, which is, well, described above.

However, there is one last serious flaw. If an ordered array is rotated such that `a_{0} = a_{n-1}`, it should be considered legal input but does not count as ascending under the new ordering! I spotted this seemingly unfixable flaw at midnight, which deprived me of sleep for some two hours. And indeed it is not fixable, but it is not a problem with the ordering! Say the two ends of the rotated array have value `v`. If the median value is also `v`, then we have no way to decide whether the key value is in the left segment or the right one --- the key can be in either of them. This observation can even be developed into a full adversary argument, showing that no algorithm can correctly solve the search problem on these arrays in sub-linear time, by arguing that any correct algorithm must examine the middle `n-2` values of the array and therefore takes `Ω(n)` time: given an algorithm `A` and a key `k`, consider the "flat sequences" consisting of `n` copies of a number not equal to `k`. If `A` does not have to look at all `n-2` values in the middle, then (for all but finitely many `n`'s) `A` does not look at some value at index `α_{n}`, which means changing the value at `α_{n}` does not affect the output of `A`; namely, `A` would still say the key is not found, as it would for the flat sequences. Now change the value at `α_{n}` to `k` for every flat sequence of length `n` and feed these inputs to `A`. Its output must be incorrect. Thus binary search, which takes `O(log n)` time, cannot correctly determine whether a value is present in this kind of array, which takes at least `Ω(n)` time. It is interesting to see that a naive-looking equality can dramatically increase a seemingly simple problem's complexity.
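A hand-picked instance makes the ambiguity concrete (my own illustration of the adversary's flat sequences):

```python
v, k = 0, 7

# Both arrays are legal rotations of ordered arrays with equal ends:
# flat is a rotation of itself, and planted is [v, v, v, v, k]
# rotated so that the lone key lands in the middle.
flat    = [v, v, v, v, v]     # the key k is absent
planted = [v, v, k, v, v]     # the key k is present

# An algorithm that never probes index 2 observes identical values on
# both inputs, so it must give the same answer for both --- and one of
# the two answers is necessarily wrong.
assert flat[:2] + flat[3:] == planted[:2] + planted[3:]
assert (k in flat) != (k in planted)
```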

--

It's been quite a while since I last wrote something this long, especially in English...

A sequel to this post has been posted, on whether implementing this new ordering gives a better algorithm than the usual two-pass algorithm in terms of the number of comparisons.

Labels: Algorithms

Oh, I should have mentioned that this problem was originally posed to me by Yu-Han Lyu. He said that it's a problem often given in interviews. A typical response is to first perform a binary search to find the dip, then somehow, perhaps virtually, "rotate" back the array.
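For reference, that typical two-pass answer might be sketched as follows (my own Python sketch, assuming a non-empty array; `find_rotation` is a hypothetical helper name, and the "rotation back" is done here by actually rebuilding the array, though index arithmetic would avoid the copy):

```python
import bisect

def find_rotation(a):
    """Index of the smallest element, i.e., the position just past
    the dip, assuming a rotated array with a[0] > a[-1]."""
    lo, hi = 0, len(a) - 1
    while lo < hi:
        m = (lo + hi) // 2
        if a[m] > a[-1]:       # m is still left of the dip
            lo = m + 1
        else:                  # the dip is at m or to its left
            hi = m
    return lo

def two_pass_search(a, k):
    # Pass 1: find the dip; pass 2: ordinary binary search.
    p = find_rotation(a) if a[0] > a[-1] else 0
    b = a[p:] + a[:p]          # "rotate" the array back
    i = bisect.bisect_left(b, k)
    return i < len(b) and b[i] == k
```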

We would like to have an algorithm that finds the key in one pass. I'm not sure we can do so using fewer comparisons than the two-pass algorithm, though.

You didn't spell out (in an equation) what the ordering is in this article, did you?

> You didn't spell out (in an equation) what the ordering is in this article, did you?

No, I did not. Since I didn't actually do formal calculations, I thought it was not necessary for this post to be formal.

On the other hand it seems interesting to go into the details to see what the resulting algorithm looks like. (And make sure it is correct!)
