# Problem
Suppose we are given an array A[0:n] with the special property that A[0] ≥ A[1] and A[n−2] ≤ A[n−1]. We say that an element A[x] is a local minimum if it is less than or equal to both of its neighbors, or more formally, if A[x−1] ≥ A[x] and A[x] ≤ A[x+1]. Obviously a boundary element cannot be a local minimum, because it has only one neighbor.
For example, there are six local minima in the following array:

We can obviously find a local minimum in O(n) time by scanning through the array. Describe and analyze an algorithm that finds a local minimum in O(log n) time. [Hint: With the given boundary conditions, the array must have at least one local minimum. Why?]
# Algorithm Description
We maintain a search window [i, j), initially [1, n−1) so that the boundary elements are excluded, and recursively halve it until we find a local minimum.
At the midpoint m = ⌊(i + j)/2⌋ of the current window:
- If A[m] ≤ A[m−1] and A[m] ≤ A[m+1] → m is a local minimum, return it.
- Else if A[m−1] < A[m] → slope goes down to the left, so recurse on [i, m).
- Else (A[m+1] < A[m]) → slope goes down to the right, so recurse on [m+1, j).
This is safe because whichever half we recurse on inherits the original boundary conditions (A[0] ≥ A[1] and A[n−2] ≤ A[n−1]) in the following sense: the values step down across its left boundary and step up across its right boundary, so that half is guaranteed to contain a local minimum.
# Pseudocode
```
# input: array A with A[0] ≥ A[1] and A[n−2] ≤ A[n−1]
# search window: [i, j); the initial call is findLocalMin(A, 1, n−1)
# output: index p of a local minimum
fun findLocalMin(A, i, j):
    if j - i == 0:                            # empty window
        return null
    if j - i == 1:                            # one element → local min
        return i
    if j - i == 2:                            # two elements → return smaller
        return (i if A[i] <= A[i+1] else i+1)
    var m = floor((i + j) / 2)                # midpoint
    if A[m] <= A[m-1] and A[m] <= A[m+1]:     # found local min
        return m
    if A[m-1] < A[m]:                         # slope goes down to the LEFT
        return findLocalMin(A, i, m)          # search left half
    else:                                     # slope goes down to the RIGHT
        return findLocalMin(A, m+1, j)        # search right half
```
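For concreteness, here is a minimal runnable Python sketch of the pseudocode above; the function name find_local_min and the demo array at the bottom are my own choices for illustration, not part of the original problem.
```
def find_local_min(A, i, j):
    """Return an index of a local minimum inside the window [i, j).

    Assumes the window invariant A[i-1] >= A[i] and A[j-1] <= A[j];
    the initial call find_local_min(A, 1, len(A) - 1) satisfies it
    because A[0] >= A[1] and A[n-2] <= A[n-1].
    """
    if j - i == 0:                                # empty window
        return None
    if j - i == 1:                                # one element -> local min
        return i
    if j - i == 2:                                # two elements -> the smaller one
        return i if A[i] <= A[i + 1] else i + 1
    m = (i + j) // 2                              # midpoint
    if A[m] <= A[m - 1] and A[m] <= A[m + 1]:     # found a local minimum
        return m
    if A[m - 1] < A[m]:                           # slope goes down to the LEFT
        return find_local_min(A, i, m)
    return find_local_min(A, m + 1, j)            # slope goes down to the RIGHT


if __name__ == "__main__":
    # Both boundary conditions hold: A[0]=9 >= A[1]=7 and A[8]=7 <= A[9]=9.
    A = [9, 7, 6, 2, 3, 5, 4, 8, 7, 9]
    p = find_local_min(A, 1, len(A) - 1)
    print(p, A[p])    # prints: 3 2  (A[2]=6 >= 2 and A[4]=3 >= 2)
```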
# Correctness Proof
Goal: Show that findLocalMin(A, i, j) returns the index of a local minimum inside [i, j) whenever the window satisfies the invariant A[i−1] ≥ A[i] and A[j−1] ≤ A[j]. The initial call findLocalMin(A, 1, n−1) satisfies this invariant because A[0] ≥ A[1] and A[n−2] ≤ A[n−1].
Base cases:
- If there is 1 element (j − i = 1), the invariant gives A[i−1] ≥ A[i] and A[i] ≤ A[i+1], so i is a local minimum.
- If there are 2 elements (j − i = 2), the algorithm returns the index of the smaller one; that element is ≤ its neighbor inside the window by the comparison and ≤ its neighbor just outside the window by the invariant, so it is a local minimum.
Recursive step:
Assume the window has at least 3 elements.
Let m = floor((i + j)/2).
If A[m] ≤ A[m−1] and A[m] ≤ A[m+1], then m is no higher than either neighbor → this is a local minimum → return m.
If A[m−1] < A[m], there is a downhill slope to the left of m.
Walking left from m−1, the values must eventually stop going down, at the latest at the left end of the window, where the invariant A[i−1] ≥ A[i] takes over; the first element where they stop going down is a local minimum.
So a local minimum must lie somewhere in [i, m). Moreover, [i, m) satisfies the invariant (A[i−1] ≥ A[i] is inherited, and A[m−1] < A[m] supplies the condition at its right end), so the algorithm safely searches that half.
Otherwise A[m−1] ≥ A[m], and since m is not a local minimum this forces A[m+1] < A[m]: a downhill slope to the right. The same logic shows that [m+1, j) satisfies the invariant and contains a local minimum.
Each recursive step chooses the half that must contain a local minimum, and makes the window smaller.
Eventually, the window becomes size 1 or 2 (base cases), and the algorithm returns a correct index.
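As an empirical complement to the proof, here is a small randomized sanity check, assuming the find_local_min sketch from the Pseudocode section is in scope; the helper names below are mine.
```
import random

def is_internal_local_min(A, x):
    # Local minimum as defined in the problem, restricted to non-boundary indices.
    return 0 < x < len(A) - 1 and A[x - 1] >= A[x] <= A[x + 1]

def random_valid_array(n):
    # Random values, then patch the ends so A[0] >= A[1] and A[n-2] <= A[n-1].
    A = [random.randint(0, 20) for _ in range(n)]
    A[0] = max(A[0], A[1])
    A[n - 1] = max(A[n - 1], A[n - 2])
    return A

for _ in range(10000):
    A = random_valid_array(random.randint(3, 50))
    p = find_local_min(A, 1, len(A) - 1)          # sketch from the Pseudocode section
    assert is_internal_local_min(A, p), (A, p)
print("all random tests passed")
```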
# Time Complexity
Each call only does a few comparisons → O(1) work per call.
Each step cuts the size of the window about in half.
Recurrence: T(n) = T(n/2) + O(1)
Solution: T(n) = O(log n) time.
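Unrolling that recurrence (writing the O(1) term as a constant c) makes the log n bound explicit; in LaTeX notation:
```
\begin{aligned}
T(n) &= T(n/2) + c \\
     &= T(n/4) + 2c \\
     &\ \ \vdots \\
     &= T(1) + c\log_2 n = O(\log n)
\end{aligned}
```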
Space: O(log n) because of the recursion depth (or O(1) if written iteratively).
# Q: Why does a local minimum always exist?
At the start of the array: A[0] ≥ A[1] → the slope starts out going down (or flat).
At the end of the array: A[n−2] ≤ A[n−1] → the slope ends going up (or flat).
So somewhere in between, the slope must switch from going down to going up.
Concretely, let x ≥ 1 be the first index with A[x] ≤ A[x+1]; such an index exists because x = n−2 works. Then A[x−1] ≥ A[x] as well, either because x = 1 and A[0] ≥ A[1], or because the values were still strictly decreasing just before x. That element A[x] is no higher than either of its neighbors, so it is a local minimum.
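For a concrete check of this existence argument, the following self-contained Python snippet enumerates every length-6 array with values in {0, 1, 2} that satisfies the two boundary conditions and confirms each one contains an internal local minimum (the helper name and the small parameters are arbitrary choices of mine).
```
from itertools import product

def has_internal_local_min(A):
    n = len(A)
    return any(A[x - 1] >= A[x] <= A[x + 1] for x in range(1, n - 1))

n = 6
checked = 0
for A in product(range(3), repeat=n):            # every array of length 6 over {0, 1, 2}
    if A[0] >= A[1] and A[n - 2] <= A[n - 1]:    # the given boundary conditions
        assert has_internal_local_min(A), A
        checked += 1
print(f"checked {checked} arrays; every one has a local minimum")
```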