`(This was originally on my old website, I have resurrected it here.)`

We look at the generalisation of the classic two egg problem to \( e \) eggs and \( f \) floors.

The problem is to find the smallest worst case number of drops needed to find the highest floor from which an egg will not break when dropped. We can re-drop unbroken eggs. If eggs break when dropped from a particular floor then they break when dropped from higher floors, and if they don't break when dropped from a particular floor then they don't break when dropped from lower floors.

The case of \( e = 2 \) and \( f = 100 \) has been used as an interview problem.

Denote the worst case number of drops by \( W(e, f) \).

If we have \( \infty \) eggs, then we can sacrifice as many as we like, so we can do a binary search: drop from the middle floor, then if the egg breaks, recurse on the lower half, otherwise recurse on the upper half. Each drop halves the search space, and since there are \( f + 1 \) possible answers (the highest safe floor is anywhere from \( 0 \) to \( f \)), we have \( W(\infty, f) = \lim_{e\to\infty} W(e, f) = \left\lceil \log_{2}(f + 1) \right\rceil \).
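A minimal sketch of this bound (the helper name `W_inf` is mine, not from the original): drop from the middle and take the worse of the two halves.

```
import math

def W_inf(f):
    # Unlimited eggs: drop from the middle floor; the worst case is the
    # larger of the two halves, plus the one drop we just used.
    if f == 0:
        return 0
    x = (f + 1) // 2                     # middle floor
    return 1 + max(W_inf(x - 1), W_inf(f - x))

print(W_inf(100), math.ceil(math.log2(101)))  # 7 7
```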

Before we start, a good sanity check is that we can't do better than with unlimited eggs; in fact, for every \( f \) we expect \( \left( W(e, f) \right)_{e = 1}^{\infty} \) to be a weakly decreasing sequence, i.e. \( W(1, f) \geq W(2, f) \geq \cdots \geq W(\infty, f) \).

We can write a recurrence relation for the number of drops:

$$ W(e, f) = \begin{cases} \;\;\; 0 & \text{if } f = 0 \text{,} \\ \;\;\; 1 & \text{if } f = 1 \text{,} \\ \;\;\; f & \text{if } e = 1 \text{,} \\ \;\;\; \overset{(a)}{1} + \overset{(b)}{\underset{x\in[1,f]}{\min}} \left\{ \overset{(c)}{\max} \left(\overset{(d)}{W(e - 1, x - 1)}, \overset{(e)}{W(e, f - x)}\right) \right\} & \text{otherwise.} \end{cases} $$

Let's look at the labelled terms:

- \( (a) \) Drop an egg.
- \( (b) \) Choose the floor \( x \) to drop from that gives the best outcome.
- \( (c) \) We want the worst case over the two outcomes.
- \( (d) \) Egg breaks: \( e - 1 \) eggs to search the \( x - 1 \) floors below.
- \( (e) \) Egg doesn't break: \( e \) eggs to search the \( f - x \) floors above.
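For example, with \( e = 2 \) and \( f = 3 \) the recurrence gives (using \( W(2, 2) = 2 \))

$$ W(2, 3) = 1 + \min \left\{ \max\left(W(1, 0), W(2, 2)\right),\; \max\left(W(1, 1), W(2, 1)\right),\; \max\left(W(1, 2), W(2, 0)\right) \right\} = 1 + \min \left\{ 2, 1, 2 \right\} = 2 \text{,} $$

so the best first drop is from floor \( x = 2 \): if the egg breaks we test floor \( 1 \) with the remaining egg, and if it survives we test floor \( 3 \).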

We can implement this recurrence directly, but the naive recursion is far too slow: it throws away values which could be reused, recomputing the same subproblems over and over.

```
def W(e, f):
    if f == 0:
        return 0
    elif f == 1:
        return 1
    elif e == 1:
        return f
    else:
        best = float("inf")
        for x in range(1, f + 1):
            temp = max(W(e - 1, x - 1), W(e, f - x))
            if temp < best:
                best = temp
        return 1 + best
```
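As a sanity check, a compact equivalent of the same recursion (renamed `W_naive` here so it doesn't clobber the definition above) reproduces values small enough to verify by hand:

```
def W_naive(e, f):
    # Same recurrence, condensed; fine for small e and f only.
    if f == 0:
        return 0
    if e == 1:
        return f
    return 1 + min(max(W_naive(e - 1, x - 1), W_naive(e, f - x))
                   for x in range(1, f + 1))

print(W_naive(1, 10), W_naive(2, 10), W_naive(3, 10))  # 10 4 4
```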

We can rearrange and construct the solution bottom up, or we can memoise (remember previous values); both are shown below.

```
# Table based algorithm.
def W_tab(e, f):
    # (e + 1) x (f + 1) array to store results.
    # Initialise each entry to +inf.
    W = []
    for i in range(e + 1):
        W.append([float("inf")] * (f + 1))
    # For all e: W[e][1] = 1, W[e][0] = 0.
    for i in range(1, e + 1):
        W[i][1] = 1
        W[i][0] = 0
    # For all f: W[1][f] = f.
    for j in range(1, f + 1):
        W[1][j] = j
    # Fill the table in increasing e, then increasing f.
    for i in range(2, e + 1):
        for j in range(2, f + 1):
            for x in range(1, j + 1):
                temp = 1 + max(W[i - 1][x - 1], W[i][j - x])
                if temp < W[i][j]:
                    W[i][j] = temp
    return W[e][f]

# Memoised algorithm.
def memoise(f):
    cache = {}
    def memoised(*args):
        if args not in cache:
            cache[args] = f(*args)
        return cache[args]
    return memoised

# Then we decorate the definition of W with memoise.
# @memoise
# def W(e, f):
#     ...
```
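The hand-rolled `memoise` does the same job as `functools.lru_cache` from the standard library; as a sketch, decorating a self-contained copy of the recurrence (named `W_memo` here) recovers the classic interview answer for two eggs and a hundred floors:

```
from functools import lru_cache

@lru_cache(maxsize=None)
def W_memo(e, f):
    # Memoised recurrence: each (e, f) pair is computed only once.
    if f == 0:
        return 0
    if e == 1:
        return f
    return 1 + min(max(W_memo(e - 1, x - 1), W_memo(e, f - x))
                   for x in range(1, f + 1))

print(W_memo(2, 100))  # 14
```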

We can generate data and plot it.

As the number of eggs increases, the optimal strategy converges to a binary search. We can see this happens very quickly: the gain from each extra egg dies off fast.
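Since the plot itself isn't reproduced here, the convergence for \( f = 100 \) can be tabulated directly; a sketch using the bottom-up table (the helper name `drops_row` is mine):

```
def drops_row(max_e, f):
    # Bottom-up table of worst-case drops; returns W(e, f) for e = 1..max_e.
    W = [[0] * (f + 1) for _ in range(max_e + 1)]
    W[1] = list(range(f + 1))            # one egg: linear search
    for e in range(2, max_e + 1):
        for j in range(1, f + 1):
            W[e][j] = 1 + min(max(W[e - 1][x - 1], W[e][j - x])
                              for x in range(1, j + 1))
    return [W[e][f] for e in range(1, max_e + 1)]

print(drops_row(7, 100))  # [100, 14, 9, 8, 7, 7, 7]
```

Five eggs already hit the unlimited-egg bound \( \left\lceil \log_{2}(101) \right\rceil = 7 \) for a hundred floors.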

`TODO: Two ways to find the optimum strategy.`