# CLRS Chapter 3. Growth of Functions

## 3.1. Asymptotic notation

### Exercises

3.1-1. Let $f(n)$ and $g(n)$ be asymptotically nonnegative functions. Using the basic definition of $\Theta$-notation, prove that $\max(f(n), g(n)) = \Theta(f(n) + g(n)).$

Since $\max(f(n),g(n)) \geq f(n)$ and $\max(f(n),g(n)) \geq g(n)$ for all $n \in \mathbb{N}$, we see that

$$\max(f(n),g(n)) \geq \tfrac{1}{2}\left(f(n) + g(n)\right)$$

for all $n \in \mathbb{N}$.

We now choose an integer $N$ such that $f(n) \geq 0$ and $g(n) \geq 0$ for all $n \geq N$. This implies that

$$\max(f(n),g(n)) \leq f(n) + g(n)$$

for all $n \geq N$.

It follows that $\max(f(n),g(n)) = \Theta(f(n)+g(n))$. $\square$

3.1-2. Show that for any real constants $a$ and $b$, where $b > 0$,

$$(n+a)^b = \Theta(n^b).$$

Observe that

$$(n+a)^b = n^b \left(1 + \frac{a}{n}\right)^b.$$

Let $f(n) = (1+a/n)^b$ for each $n$. We fix a positive integer $N$ such that $\vert a/n \vert < 1$ for all $n \geq N$. This, in particular, implies that $1 + a/n > 0$ for all $n \geq N$. Since $\vert a/n \vert$ is decreasing as $n$ increases, we see that

$$0 < 1 - \left\vert \frac{a}{N} \right\vert \leq 1 + \frac{a}{n} \leq 1 + \left\vert \frac{a}{N} \right\vert$$

for all $n \geq N$. Now, $b$ is positive, and so it follows from the above estimates that

$$\left(1 - \left\vert \frac{a}{N} \right\vert\right)^b n^b \leq (n+a)^b \leq \left(1 + \left\vert \frac{a}{N} \right\vert\right)^b n^b$$

for all $n \geq N$. We conclude that $(n+a)^b = \Theta(n^b)$. $\square$

3.1-3. Explain why the statement, “The running time of algorithm $A$ is at least $O(n^2)$,” is meaningless.

The big-$O$ notation can only be used to give upper bounds, not lower bounds. Indeed, every running time whatsoever is "at least" some function in $O(n^2)$—the zero function, for example—so the statement conveys no information. $\square$

3.1-4. Is $2^{n+1} = O(2^n)$? Is $2^{2n} = O(2^n)$?

Since

$$2^{n+1} = 2 \cdot 2^n$$

for all $n$, we see that $2^{n+1} = O(2^n)$.

We now fix an arbitrary constant $C > 0$. There exists a positive integer $N_C$ such that $2^n > C$ for all $n \geq N_C$. It then follows that

$$2^{2n} = 2^n \cdot 2^n > C \cdot 2^n$$

for all $n \geq N_C$. Since the choice of $C$ was arbitrary, we conclude that $2^{2n} \neq O(2^n)$. $\square$
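The divergence of $2^{2n}/2^n$ is easy to witness numerically. A small illustrative sketch (not part of the proof):

```python
# The ratio 2^(n+1)/2^n is the constant 2, while the ratio
# 2^(2n)/2^n = 2^n eventually exceeds any fixed constant C.
def ratio_double(n):
    return 2 ** (n + 1) / 2 ** n

def ratio_square(n):
    return 2 ** (2 * n) / 2 ** n

assert all(ratio_double(n) == 2 for n in range(1, 60))
# For C = 10**6, the ratio 2^(2n)/2^n has already crossed C by n = 20.
assert ratio_square(20) > 10 ** 6
```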

3.1-5. Prove Theorem 3.1.

We recall the statement of Theorem 3.1:

For any two functions $f(n)$ and $g(n)$, we have $f(n) = \Theta(g(n))$ if and only if $f(n) = O(g(n))$ and $f(n) = \Omega(g(n))$.

$(\Rightarrow)$ Suppose that $f(n) = \Theta(g(n))$. Fix positive constants $c_1$, $c_2$, and $n_0$ such that

$$0 \leq c_1 g(n) \leq f(n) \leq c_2 g(n)$$

for all $n \geq n_0$. Since

$$0 \leq f(n) \leq c_2 g(n)$$

for all $n \geq n_0$, we have that $f(n) = O(g(n))$. Similarly,

$$0 \leq c_1 g(n) \leq f(n)$$

for all $n \geq n_0$, and so $f(n) = \Omega(g(n))$.

$(\Leftarrow)$ Suppose that $f(n) = O(g(n))$ and $f(n) = \Omega(g(n))$. The first assumption furnishes positive constants $c_1$ and $n_1$ such that

$$0 \leq f(n) \leq c_1 g(n)$$

for all $n \geq n_1$. The second assumption furnishes positive constants $c_2$ and $n_2$ such that

$$0 \leq c_2 g(n) \leq f(n)$$

for all $n \geq n_2$. It follows that

$$0 \leq c_2 g(n) \leq f(n) \leq c_1 g(n)$$

for all $n \geq \max(n_1,n_2)$, whence $f(n) = \Theta(g(n))$. $\square$

3.1-6. Prove that the running time of an algorithm is $\Theta(g(n))$ if and only if its worst-case running time is $O(g(n))$ and its best-case running time is $\Omega(g(n))$.

The running time of an algorithm is bounded below by its best-case running time and bounded above by its worst-case running time. The claim now follows from Exercise 3.1-5. $\square$

3.1-7. Prove that $o(g(n)) \cap \omega(g(n))$ is the empty set.

Let $f(n) \in o(g(n))$. For each constant $c > 0$, there exists a constant $n_c > 0$ such that

$$0 \leq f(n) < c\, g(n)$$

for all $n \geq n_c$. We fix an arbitrary constant $C > 0$ and observe that, for each positive constant $N$, $n \geq \max(n_C, N)$ implies

$$f(n) < C\, g(n).$$

It follows that $f(n) \notin \omega(g(n))$.

Similarly, we can show that $f(n) \in \omega(g(n))$ implies $f(n) \notin o(g(n))$. It follows that the intersection $o(g(n)) \cap \omega(g(n))$ is empty. $\square$

3.1-8. We can extend our notation to the case of two parameters $n$ and $m$ that can go to infinity independently at different rates. For a given function $g(n,m)$, we denote by $O(g(n,m))$ the set of functions

$$O(g(n,m)) = \{f(n,m) : \text{there exist positive constants } c, n_0, \text{ and } m_0 \text{ such that } 0 \leq f(n,m) \leq c\, g(n,m) \text{ for all } n \geq n_0 \text{ or } m \geq m_0\}.$$

Give corresponding definitions for $\Omega(g(n,m))$ and $\Theta(g(n,m))$.
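One natural answer, mirroring the two-parameter definition of $O(g(n,m))$ above (a suggested sketch, not the text's own):

```latex
\Omega(g(n,m)) = \{ f(n,m) : \text{there exist positive constants } c,\, n_0,\, m_0
  \text{ such that } 0 \le c\, g(n,m) \le f(n,m) \text{ for all } n \ge n_0 \text{ or } m \ge m_0 \},

\Theta(g(n,m)) = \{ f(n,m) : \text{there exist positive constants } c_1,\, c_2,\, n_0,\, m_0
  \text{ such that } 0 \le c_1\, g(n,m) \le f(n,m) \le c_2\, g(n,m)
  \text{ for all } n \ge n_0 \text{ or } m \ge m_0 \}.
```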

## 3.2. Standard notations and common functions

### Exercises

3.2-1. Show that if $f(n)$ and $g(n)$ are monotonically increasing functions, then so are the functions $f(n) + g(n)$ and $f(g(n))$, and if $f(n)$ and $g(n)$ are in addition nonnegative, then $f(n) \cdot g(n)$ is monotonically increasing.

Let $n \leq m$. Since $f(n) \leq f(m)$ and $g(n) \leq g(m)$, we see that

$$f(n) + g(n) \leq f(m) + g(m).$$

If, in addition, $f$ and $g$ are nonnegative, then

$$f(n) \cdot g(n) \leq f(m) \cdot g(n) \leq f(m) \cdot g(m).$$

Finally, $g(n) \leq g(m)$ and the monotonicity of $f$ implies that

$$f(g(n)) \leq f(g(m)),$$

as was to be shown. $\square$

3.2-2. Prove equation (3.16).

3.2-3. Prove equation (3.19). Also prove that $n! = \omega(2^n)$ and $n! = o(n^n)$.

Recall that Stirling’s approximation states that

$$n! = \sqrt{2\pi n}\left(\frac{n}{e}\right)^n \left(1 + \Theta\!\left(\frac{1}{n}\right)\right).$$
This implies that, for each $\epsilon > 0$, there exists a positive integer $N_\epsilon$ such that $n \geq N_\epsilon$ implies

$$(1-\epsilon)\sqrt{2\pi n}\left(\frac{n}{e}\right)^n \leq n! \leq (1+\epsilon)\sqrt{2\pi n}\left(\frac{n}{e}\right)^n.$$
We fix $\epsilon > 0$ and observe that

$$(1-\epsilon)\sqrt{2\pi}\,\sqrt{n}\left(\frac{n}{e}\right)^n \leq n! \leq (1+\epsilon)\sqrt{2\pi}\,\sqrt{n}\left(\frac{n}{e}\right)^n$$

for all $n \geq N_\epsilon$, so that

$$n! = \Theta\!\left(\sqrt{n}\left(\frac{n}{e}\right)^n\right).$$
We fix positive constants $c_1$, $c_2$, and $n_0$ such that

$$c_1 \sqrt{n}\left(\frac{n}{e}\right)^n \leq n! \leq c_2 \sqrt{n}\left(\frac{n}{e}\right)^n$$

for all $n \geq n_0$. Since $\log$ is a monotonically increasing function, we see that

$$\log c_1 + \tfrac{1}{2}\log n + n \log n - n \log e \leq \log n! \leq \log c_2 + \tfrac{1}{2}\log n + n \log n - n \log e$$

for all $n \geq n_0$.

Since $\log$ is a monotonically increasing, unbounded function, there exists a positive constant $n_1$ such that $\frac{1}{2}\log n \geq \log e$ for all $n \geq n_1$. It then follows that

$$n \log n - n \log e \geq n \log n - \tfrac{1}{2}\, n \log n = \tfrac{1}{2}\, n \log n$$

for all $n \geq n_1$. Similarly, there exists a positive constant $n_2$ such that $\log c_1 + \frac{1}{2} \log n \geq 0$ for all $n \geq n_2$, and so

$$\log n! \geq \log c_1 + \tfrac{1}{2}\log n + n \log n - n \log e \geq n \log n - n \log e$$

for all $n \geq n_2$. It follows that

$$\log n! \geq \tfrac{1}{2}\, n \log n$$

for all $n \geq \max(n_0,n_1,n_2)$.

Now, we observe that

$$\log n! \leq \log c_2 + \tfrac{1}{2}\log n + n \log n - n \log e \leq \log c_2 + \tfrac{1}{2}\log n + n \log n.$$

Whenever $n \geq 1$, we have $\frac{1}{2} \log n \leq n \log n$, and so

$$\log n! \leq \log c_2 + 2\, n \log n.$$

Finally, $n \log n$ is a monotonically increasing, unbounded function, and so we can find a positive constant $n_3$ such that $\log c_2 \leq \frac{1}{2} n \log n$ for all $n \geq n_3$. It follows that

$$\log n! \leq \tfrac{5}{2}\, n \log n$$

for all $n \geq \max(n_0, n_3)$.

We now see that

$$\tfrac{1}{2}\, n \log n \leq \log n! \leq \tfrac{5}{2}\, n \log n$$

for all $n \geq \max(n_0,n_1,n_2,n_3)$. We conclude that $\log n! = \Theta(n \log n)$.

We now recall that

$$c_1 \sqrt{n}\left(\frac{n}{e}\right)^n \leq n! \leq c_2 \sqrt{n}\left(\frac{n}{e}\right)^n$$

for all $n \geq n_0$.

Observe that $c_2 e^{-n} n^{1/2} \to 0$ as $n \to \infty$. Therefore, each constant $c > 0$ admits a positive integer $N_c$ such that $n \geq N_c$ implies

$$n! \leq c_2 \sqrt{n}\, \frac{n^n}{e^n} < c\, n^n.$$

It follows that $n! = o(n^n)$.

Observe also that

$$n! \geq c_1 \sqrt{n}\left(\frac{n}{e}\right)^n \geq c_1 \left(\frac{n}{2e}\right)^n 2^n$$

whenever $n \geq \max(n_0, 1)$. Since $(n/2e)^n \to \infty$ as $n \to \infty$, we can find, for each constant $c>0$, a positive integer $N_c$ such that $n \geq N_c$ implies

$$c_1 \left(\frac{n}{2e}\right)^n > c.$$

Therefore,

$$n! > c \cdot 2^n$$

whenever $n \geq \max(N_c, n_0, 1)$. Since the choice of $c$ was arbitrary, it follows that $n! = \omega(2^n)$. $\square$
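The three estimates just proved can be spot-checked numerically with `math.lgamma`, since $\ln\Gamma(n+1) = \ln n!$. An illustrative sketch:

```python
import math

# Empirical check of log(n!) = Theta(n log n), n! = o(n^n), n! = omega(2^n).
def log2_factorial(n):
    # lgamma(n + 1) = ln(n!); divide by ln 2 to convert to base 2.
    return math.lgamma(n + 1) / math.log(2)

# The ratio log(n!)/(n log n) stays within the constants 1/2 and 5/2 above.
for n in [10, 100, 1000, 10**6]:
    ratio = log2_factorial(n) / (n * math.log2(n))
    assert 0.5 <= ratio <= 2.5

# Compare on a log scale to avoid overflow: n! << n^n but n! >> 2^n.
assert log2_factorial(100) - 100 * math.log2(100) < -100   # n! = o(n^n)
assert log2_factorial(100) - 100 > 400                     # n! = omega(2^n)
```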

3.2-4. Is the function $\lceil \log n \rceil !$ polynomially bounded? Is the function $\lceil \log \log n \rceil !$ polynomially bounded?

By Stirling’s approximation (and the continuity of $\log$),

$$\log\left(\lceil \log n \rceil!\right) = \Theta\left(\lceil \log n \rceil \log \lceil \log n \rceil\right) = \Theta(\log n \,\log \log n).$$

Since $n \mapsto 2^n$ is a monotonically increasing function, $\lceil \log n \rceil!$ is polynomially bounded if and only if $\log\left(\lceil \log n \rceil!\right) = O(\log n)$. Now, $\log \log n$ is a monotonically increasing function with limit $\lim_{n \to \infty} \log \log n = \infty$, so $\log n \, \log\log n \neq O(\log n)$, and $\lceil \log n \rceil !$ is not polynomially bounded.

Similarly,

$$\log\left(\lceil \log \log n \rceil!\right) = \Theta(\log \log n \,\log \log \log n).$$

Since $\log\log n \, \log\log\log n = o(\log n)$, it follows that $\log\left(\lceil \log\log n \rceil!\right) = O(\log n)$, and so $\lceil \log \log n \rceil !$ is polynomially bounded. $\square$
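A numerical illustration of the contrast (values chosen for demonstration only):

```python
import math

# ceil(lg n)! is not polynomially bounded: for any fixed k it eventually
# overtakes n^k; illustrated for k = 2 at n = 2**32, where ceil(lg n) = 32.
n = 2 ** 32
assert math.factorial(32) > n ** 2

# ceil(lg lg n)! is polynomially bounded; here its factorial stays below
# n itself even for astronomically large n = 2**exp (so lg n = exp).
for exp in [16, 256, 65536]:
    n = 2 ** exp
    m = math.ceil(math.log2(exp))     # m = ceil(lg lg n)
    assert math.factorial(m) < n
```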

3.2-5. What is asymptotically larger: $\log (\log^* n)$ or $\log^* (\log n)$?

We begin by observing that

$$\log^*(\log n) = \log^* n - 1,$$

since applying $\log^*$ to $\log n$ simply skips the first of the iterated logarithms counted by $\log^* n$. Now,

$$\log^* n = 2^{\log(\log^* n)},$$

so that

$$\log^*(\log n) = 2^{\log(\log^* n)} - 1.$$

It follows that $\log^* (\log n)$ grows exponentially faster than $\log(\log^* n)$; in particular, $\log^*(\log n)$ is the asymptotically larger of the two. $\square$
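A small sketch of the comparison, with $\log^*$ implemented directly:

```python
import math

def log_star(x):
    """Iterated logarithm: number of log2 applications needed to reach <= 1."""
    count = 0
    while x > 1:
        x = math.log2(x)
        count += 1
    return count

# For n = 2**16: log* n = 4 (65536 -> 16 -> 4 -> 2 -> 1).
n = 2 ** 16
assert log_star(n) == 4
# log*(log n) = log* n - 1, while log(log* n) is just log2 of a tiny number.
assert log_star(math.log2(n)) == log_star(n) - 1
assert math.log2(log_star(n)) == 2.0
```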

3.2-6. Show that the golden ratio $\phi$ and its conjugate $\hat{\phi}$ both satisfy the equation $x^2 = x + 1$.

Use the quadratic formula. $\square$

3.2-7. Prove by induction that the $i$th Fibonacci number satisfies the equality

$$F_i = \frac{\phi^i - \hat{\phi}^i}{\sqrt{5}},$$
where $\phi$ is the golden ratio and $\hat{\phi}$ its conjugate.

We first observe that

$$F_0 = 0 = \frac{\phi^0 - \hat{\phi}^0}{\sqrt{5}} \quad \text{and} \quad F_1 = 1 = \frac{\phi - \hat{\phi}}{\sqrt{5}}.$$

We now assume inductively that

$$F_{i-1} = \frac{\phi^{i-1} - \hat{\phi}^{i-1}}{\sqrt{5}} \quad \text{and} \quad F_{i-2} = \frac{\phi^{i-2} - \hat{\phi}^{i-2}}{\sqrt{5}},$$

and observe that

$$F_i = F_{i-1} + F_{i-2} = \frac{\phi^{i-2}(\phi + 1) - \hat{\phi}^{i-2}(\hat{\phi} + 1)}{\sqrt{5}}.$$

Since $\phi + 1 = \phi^2$ and $\hat{\phi} + 1 = \hat{\phi}^2$, it follows that

$$F_i = \frac{\phi^i - \hat{\phi}^i}{\sqrt{5}},$$

as was to be shown. $\square$
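The closed form can be checked directly against the recurrence; note that $\vert \hat{\phi}^i/\sqrt{5} \vert < 1/2$, so $\phi^i/\sqrt{5}$ alone rounds to $F_i$:

```python
import math

phi = (1 + math.sqrt(5)) / 2
phihat = (1 - math.sqrt(5)) / 2

# Build the first 30 Fibonacci numbers from the recurrence.
fib = [0, 1]
for _ in range(28):
    fib.append(fib[-1] + fib[-2])

# The Binet closed form agrees with every one of them.
for i, f in enumerate(fib):
    assert round((phi ** i - phihat ** i) / math.sqrt(5)) == f
    assert round(phi ** i / math.sqrt(5)) == f   # the hat term is negligible
```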

3.2-8. Show that $k \log k = \Theta(n)$ implies $k = \Theta(n / \log n)$.

We find constants $c_1$, $c_2$, and $n_0$ such that $n_0 \geq 2$ and that

$$c_1 n \leq k \log k \leq c_2 n$$

for all $n \geq n_0$. This implies that

$$\frac{c_1 n}{\log k} \leq k \leq \frac{c_2 n}{\log k}$$

for all $n \geq n_0$.

Since $k \log k \geq k$ for all $k \geq 2$, we see that

$$k \leq k \log k$$

for all $k \geq 2$. It follows that

$$k \leq c_2 n, \quad \text{and so} \quad \log k \leq \log c_2 + \log n,$$

for all $n \geq n_0$ and $k \geq 2$. Now, if $k \leq 1$, then $k \log k \leq 0$. This cannot hold if $n \geq \max(n_0, 1/c_1)$, since then $k \log k \geq c_1 n \geq 1$, and so we conclude that

$$k \geq 2$$

whenever $n \geq \max(n_0,1/c_1)$.

Now, $k \geq \log k$ for all $k \geq 1$, and so

$$c_1 n \leq k \log k \leq k^2$$

for all $k \geq 1$. It follows that

$$\log k \geq \tfrac{1}{2}\left(\log c_1 + \log n\right)$$

for all $n \geq n_0$ and $k \geq 1$. Since we know that $k \geq 2$ whenever $n \geq \max(n_0,1/c_1)$, we conclude that

$$\frac{c_1 n}{\log c_2 + \log n} \leq k \leq \frac{2 c_2 n}{\log c_1 + \log n}$$

for all $n \geq \max(n_0,1/c_1)$.

In other words,

$$k = \Theta\!\left(\frac{n}{\log n}\right),$$

as was to be shown. $\square$
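The relationship can be probed numerically: solve $k \log k = n$ for $k$ by binary search and compare against $n/\log n$ (thresholds below are illustrative, not the proof's constants):

```python
import math

def k_for(n):
    """Binary search for k with k * lg(k) ~ n."""
    lo, hi = 2.0, float(n)
    for _ in range(200):
        mid = (lo + hi) / 2
        if mid * math.log2(mid) < n:
            lo = mid
        else:
            hi = mid
    return lo

# The ratio k / (n / lg n) stays within modest constant bounds.
for n in [10**3, 10**6, 10**9]:
    k = k_for(n)
    ratio = k / (n / math.log2(n))
    assert 0.9 < ratio < 2.1
```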

## Problems

### 3-1. Asymptotic behavior of polynomials

Let

$$p(n) = \sum_{i=0}^{d} a_i n^i,$$
where $a_d > 0$, be a degree-$d$ polynomial in $n$, and let $k$ be a constant. Use the definitions of the asymptotic notations to prove the following properties.

a. If $k \geq d$, then $p(n) = O\left(n^k\right)$.

b. If $k \leq d$, then $p(n) = \Omega\left(n^k \right)$.

c. If $k = d$, then $p(n) = \Theta(n^k)$.

d. If $k > d$, then $p(n) = o(n^k)$.

e. If $k < d$, then $p(n) = \omega(n^k)$.

#### Problem 3-1a

Observe that, for each $n \geq 1$,

$$p(n) = \sum_{i=0}^{d} a_i n^i \leq \left(\sum_{i=0}^{d} \vert a_i \vert\right) n^d \leq \left(\sum_{i=0}^{d} \vert a_i \vert\right) n^k.$$

It follows that $p(n) = O(n^k)$. $\square$

#### Problem 3-1b

For each fixed positive integer $p$, we have the limit

$$\lim_{n \to \infty} n^p = \infty.$$

Therefore, each positive constant $C$ has a corresponding positive integer $N_{C,p}$ such that $n \geq N_{C,p}$ implies

$$n^p > C.$$

Taking $p = r-s$, we see that

$$n^r > C\, n^s$$

for all $n \geq N_{C,r-s}$.

We now set

$$N = \max_{0 \leq i \leq d-1} N_{2d \vert a_i \vert / a_d,\; d-i},$$

so that

$$\vert a_i \vert n^i < \frac{a_d}{2d}\, n^d$$

for each $0 \leq i \leq d-1$ and every $n \geq N$. From this, we deduce that

$$p(n) \geq a_d n^d - \sum_{i=0}^{d-1} \vert a_i \vert n^i > a_d n^d - d \cdot \frac{a_d}{2d}\, n^d = \frac{a_d}{2}\, n^d$$

for all $n \geq N$.

Finally, we know that

$$n^{d-k} > \frac{2}{a_d}$$

whenever $n \geq N_{2/a_d,\, d-k}$, and so

$$p(n) > \frac{a_d}{2}\, n^d > n^k$$

whenever $n \geq \max(N_{2/a_d,d-k},N)$. It follows that $p(n) = \Omega(n^k)$. $\square$

#### Problem 3-1c

This follows at once from 3-1a and 3-1b. $\square$

#### Problem 3-1d

As per 3-1b, we define, for each positive integer $p$ and every positive constant $C$, a number $N_{C,p}$ to be a positive integer such that

$$n^p > C$$

for all $n \geq N_{C,p}$.

We fix a positive constant $c$. Observe that, for each $1 \leq i \leq d$,

$$\vert a_i \vert n^i < \frac{c}{2(d+1)}\, n^k$$

whenever $n \geq N_{2(d+1)\vert a_i \vert/c,\, k-i}$. Moreover,

$$\vert a_0 \vert < \frac{c}{2(d+1)}\, n^k$$

whenever $n \geq \vert 2(d+1) a_0 / c \vert^{1/k}$.

Setting

$$N = \max\left(\max_{1 \leq i \leq d} N_{2(d+1)\vert a_i \vert/c,\; k-i},\;\; \vert 2(d+1) a_0/c \vert^{1/k}\right),$$

we see that, for each $n \geq N$,

$$p(n) \leq \sum_{i=0}^{d} \vert a_i \vert n^i < (d+1) \cdot \frac{c}{2(d+1)}\, n^k = \frac{c}{2}\, n^k < c\, n^k.$$

Since the choice of $c > 0$ was arbitrary, it follows that $p(n) = o(n^k)$. $\square$

#### Problem 3-1e

As per 3-1b, we define, for each positive integer $p$ and every positive constant $C$, a number $N_{C,p}$ to be a positive integer such that

$$n^p > C$$

for all $n \geq N_{C,p}$.

As in b, we set

$$N = \max_{0 \leq i \leq d-1} N_{2d \vert a_i \vert / a_d,\; d-i},$$

so that

$$p(n) \geq \frac{a_d}{2}\, n^d$$

for all $n \geq N$. We now fix a constant $c > 0$ and observe that

$$n^{d-k} > \frac{4c}{a_d}$$

whenever $n \geq N_{4c/a_d,\, d-k}$. It follows that

$$p(n) \geq \frac{a_d}{2}\, n^d > 2c\, n^k > c\, n^k$$

whenever $n \geq \max(N,N_{4c/a_d,d-k})$. Since the choice of $c$ was arbitrary, we conclude that $p(n) = \omega(n^k)$. $\square$

### 3-2. Relative asymptotic growths

Indicate, for each pair of expressions $(A,B)$ in the table below, whether $A$ is $O$, $o$, $\Omega$, $\omega$, or $\Theta$ of $B$. Assume that $k \geq 1$, $\epsilon > 0$, and $c > 1$ are constants. Your answer should be in the form of the table with “yes” or “no” written in each box.

| | $A$ | $B$ | $O$ | $o$ | $\Omega$ | $\omega$ | $\Theta$ |
| --- | --- | --- | --- | --- | --- | --- | --- |
| a. | $\log^k n$ | $n^\epsilon$ | yes | yes | no | no | no |
| b. | $n^k$ | $c^n$ | yes | yes | no | no | no |
| c. | $\sqrt{n}$ | $n^{\sin n}$ | no | no | no | no | no |
| d. | $2^n$ | $2^{n/2}$ | no | no | yes | yes | no |
| e. | $n^{\log c}$ | $c^{\log n}$ | yes | no | yes | no | yes |
| f. | $\log(n!)$ | $\log(n^n)$ | yes | no | yes | no | yes |

#### Problem 3-2a

We define $f(x) = \frac{1}{\epsilon} x^\epsilon$ and $g(x) = \ln x$ and note that

$$f'(x) = x^{\epsilon - 1} \quad \text{and} \quad g'(x) = x^{-1}.$$

Since $\epsilon > 0$, we have $\epsilon - 1 > -1$, and so

$$f'(x) = x^{\epsilon - 1} \geq x^{-1} = g'(x)$$

for all $x \geq 1$, meaning $f(x)$ grows faster than $g(x)$ on the interval $x \geq 1$. Since $f(1) > g(1)$, we conclude that

$$\ln x \leq \frac{1}{\epsilon}\, x^\epsilon \tag{$\ast$}$$

for all $x \geq 1$. This, in particular, implies that

$$\ln n \leq \frac{1}{\epsilon}\, n^\epsilon$$

for all integers $n \geq 1$.

We now rescale $(\ast)$ for logarithm base 2, using the change-of-base formula:

$$\log n = (\log e) \ln n \leq \frac{\log e}{\epsilon}\, n^\epsilon \tag{$\ast\ast$}$$

for each $n \geq 1$ and every $\epsilon > 0$.

Applying $(\ast\ast)$ with $\epsilon/i$ in place of $\epsilon$ and raising both sides to the $i$th power shows that

$$\log^i n \leq \left(\frac{i \log e}{\epsilon}\right)^i n^\epsilon$$

for all integers $n \geq 1$. Therefore,

$$\log^i n = O(n^\epsilon)$$

for all $i \geq 1$ and $n \geq 1$, and so it suffices to show that

$$\log n = o(n^\epsilon)$$

for an arbitrary $\epsilon > 0$.

To this end, we fix a positive constant $C$. Since $n^{\epsilon/2} \to \infty$ as $n \to \infty$, we can find a positive constant $N$ such that $n^{\epsilon/2} > \frac{2\log e}{C\epsilon}$ for all $n \geq N$. It follows from $(\ast \ast)$ that

$$\log n \leq \frac{2 \log e}{\epsilon}\, n^{\epsilon/2} < C\, n^\epsilon$$

for all $n \geq N$. Since the choice of $C$ was arbitrary, we conclude that

$$\log n = o(n^\epsilon),$$

as was to be shown. $\square$

#### Problem 3-2b

Observe that $n^k = 2^{\log(n^k)} = 2^{k \log n}$ and $c^n = 2^{\log c^n} = 2^{( \log c)n}$. We fix a positive constant $C$ and invoke a to find a positive integer $N$ such that

$$k \log n < (\log c)\, n - \log C$$

for all $n \geq N$. Since $n \mapsto 2^n$ is monotonically increasing, we conclude that

$$C\, n^k = 2^{\log C + k \log n} < 2^{(\log c)n} = c^n$$

for all $n \geq N$. The choice of $C$ was arbitrary, and so $n^k = o(c^n)$. $\square$
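Even a large power loses to an exponential with base barely above 1; a quick numeric sketch (values illustrative):

```python
# n^k / c^n tends to 0 for k = 10 and c = 1.1: the ratio is enormous for
# moderate n but collapses once the exponential takes over.
k, c = 10, 1.1
ratios = [n ** k / c ** n for n in (100, 1000, 2000)]
assert ratios[2] < ratios[1] < 1e-10
assert ratios[2] < 1e-40
```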

#### Problem 3-2c

We claim that there are increasing sequences of positive integers $(n_i)_{i=1}^\infty$ and $(m_i)_{i=1}^\infty$ such that $\sin n_i > \frac{2}{3}$ and $\sin m_i < -\frac{2}{3}$ for all $i$. (The proof of the claim typically involves either Diophantine approximations or the Weyl equidistribution theorem.) Recall that $n^p = o(n^q)$ whenever $p < q$. For this reason, given any $C > 0$, the inequality

$$n_i^{\sin n_i} \leq C\, n_i^{1/2}$$

cannot hold for all $i$. Similarly, the inequality

$$m_i^{1/2} \leq C\, m_i^{\sin m_i}$$

cannot hold for all $i$. It follows that neither $n^{1/2}$ nor $n^{\sin n}$ dominates the other. $\square$

#### Problem 3-2d

Since $2^n = 2^{n/2} \cdot 2^{n/2}$ and $2^{n/2} \to \infty$ as $n \to \infty$, we see that $2^n = \omega(2^{n/2})$. $\square$

#### Problem 3-2e

Observe that $n^{\log c} = 2^{(\log c) (\log n)}$ and $c^{\log n} = 2^{(\log c) (\log n)}$. Therefore, $n^{\log c} = c^{\log n}$. $\square$
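The identity is easy to confirm numerically (up to floating-point error):

```python
import math

# n^(lg c) and c^(lg n) are the same number: both equal 2^((lg c)(lg n)).
for n in [4, 16, 1024]:
    for c in [2, 8, 10]:
        a = n ** math.log2(c)
        b = c ** math.log2(n)
        assert math.isclose(a, b, rel_tol=1e-9)
```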

#### Problem 3-2f

We have shown in Exercise 3.2-3 that $\log(n!) = \Theta(n \log n)$. Since $\log n^n = n \log n$, it follows that $\log(n!) = \Theta(\log n^n)$. $\square$

### 3-3. Ordering by asymptotic growth rates

a. Rank the following functions by order of growth; that is, find an arrangement $g_1,g_2,\ldots,g_{30}$ of the functions satisfying $g_1 = \Omega(g_2)$, $g_2 = \Omega(g_3)$, $\ldots$, $g_{29} = \Omega(g_{30})$. Partition your list into equivalence classes such that functions $f(n)$ and $g(n)$ are in the same class if and only if $f(n) = \Theta(g(n))$.

b. Give an example of a single nonnegative function $f(n)$ such that for all functions $g_i(n)$ in part (a), $f(n)$ is neither $O(g_i(n))$ nor $\Omega(g_i(n))$.

#### Problem 3-3b

$f(n) = 2^{2^{2^n}}(1 + \sin n)$; note that the factor $1 + \sin n$ keeps $f$ nonnegative while making it oscillate between $0$ and twice the triple exponential. Take a look at Problem 3-2c to see how to prove asymptotics for functions like this one. $\square$

### 3-4. Asymptotic notation properties

Let $f(n)$ and $g(n)$ be asymptotically positive functions. Prove or disprove each of the following conjectures.

a. $f(n) = O(g(n))$ implies $g(n) = O(f(n))$.

b. $f(n) + g(n) = \Theta\left(\min(f(n), g(n))\right)$

c. $f(n) = O(g(n))$ implies $\log(f(n)) = O(\log(g(n))$, where $\log(g(n)) \geq 1$ and $f(n) \geq 1$ for all sufficiently large $n$.

d. $f(n) = O(g(n))$ implies $2^{f(n)} = O(2^{g(n)})$

e. $f(n) = O((f(n))^2)$

f. $f(n) = O(g(n))$ implies $g(n) = \Omega(f(n))$

g. $f(n) = \Theta(f(n/2))$

h. $f(n) + o(f(n)) = \Theta(f(n))$.

#### Problem 3-4a

$f(n) = 1$, $g(n) = n$ is a counterexample. $\square$

#### Problem 3-4b

$f(n) = 1$, $g(n) = n$ is a counterexample. $\square$

#### Problem 3-4c

Fix constants $C > 0$ and $N \geq 1$ such that

$$0 \leq f(n) \leq C\, g(n)$$

for all $n \geq N$. Since $\log$ is monotonically increasing,

$$\log(f(n)) \leq \log C + \log(g(n))$$

for all $n \geq N$ with $f(n) \geq 1$. Now, we find $N'$ such that $n \geq N'$ implies $\log(g(n)) \geq 1$. For such values of $n$, we have

$$\log C \leq \vert \log C \vert \leq \vert \log C \vert \log(g(n)).$$

It now follows that

$$\log(f(n)) \leq \left(\vert \log C \vert + 1\right) \log(g(n))$$

for all sufficiently large $n$, whence $\log\left(f(n)\right) = O\left(\log(g(n))\right)$, as was to be shown. $\square$

#### Problem 3-4d

$f(n) = 4n$ and $g(n) = n$ is a counterexample. $\square$

#### Problem 3-4e

$f(n) = \frac{1}{n}$ is a counterexample. $\square$

#### Problem 3-4f

This follows at once from the definitions of $O$ and $\Omega$. $\square$

#### Problem 3-4g

$f(n) = 2^n$ is a counterexample. To see this, we observe that

$$\frac{f(n)}{f(n/2)} = \frac{2^n}{2^{n/2}} = 2^{n/2} \to \infty$$

as $n \to \infty$, and so $f(n) \neq O(f(n/2))$. (Periodic functions such as $\sin n$ are tempting candidates but are not asymptotically positive.) $\square$

#### Problem 3-4h

We fix $g \in o(f(n))$ and show that $f(n) + g(n) = \Theta(f(n))$.

For notational convenience, we let $N_c$, for an arbitrary constant $c > 0$, denote a positive integer such that

$$\vert g(n) \vert < c\, f(n)$$

for all $n \geq N_c$. Since

$$f(n) + g(n) \leq f(n) + \vert g(n) \vert < 2 f(n)$$

for all $n \geq N_1$, we have that $f(n) + g(n) = O(f(n))$.

We now observe that

$$f(n) + g(n) \geq f(n) - \vert g(n) \vert > \tfrac{1}{2} f(n)$$

for all $n \geq N_{1/2}$. It follows that $f(n) + g(n) = \Omega(f(n))$, whence we conclude that $f(n) + g(n) = \Theta(f(n))$. $\square$

### 3-5. Variations on $O$ and $\Omega$

Some authors define $\Omega$ in a slightly different way than we do; let’s use $\substack{\infty \\ \Omega}$ (read “omega infinity”) for this alternative definition. We say that $f(n) = \substack{\infty \\ \Omega}(g(n))$ if there exists a positive constant $c$ such that $f(n) \geq cg(n) \geq 0$ for infinitely many integers $n$.

a. Show that for any two functions $f(n)$ and $g(n)$ that are asymptotically nonnegative, either $f(n) = O(g(n))$ or $f(n) = \substack{\infty \\ \Omega}(g(n))$ or both, whereas this is not true if we use $\Omega$ in place of $\substack{\infty \\ \Omega}$.

b. Describe the potential advantages and disadvantages of using $\substack{\infty \\ \Omega}$ instead of $\Omega$ to characterize the running times of programs.

Some authors also define $O$ in a slightly different manner; let’s use $O'$ for the alternative definition. We say that $f(n) = O'(g(n))$ if and only if $\vert f(n) \vert = O(g(n))$.

c. What happens to each direction of the “if and only if” in Theorem 3.1 if we substitute $O'$ for $O$ but still use $\Omega$?

Some authors define $\tilde{O}$ (read “soft-oh”) to mean $O$ with logarithmic factors ignored:

$$\tilde{O}(g(n)) = \{f(n) : \text{there exist positive constants } c, k, \text{ and } n_0 \text{ such that } 0 \leq f(n) \leq c\, g(n) \log^k(n) \text{ for all } n \geq n_0\}.$$

d. Define $\tilde{\Omega}$ and $\tilde{\Theta}$ in a similar manner. Prove the corresponding analog to Theorem 3.1.

#### Problem 3-5a

We assume that $f(n) \notin O(g(n))$. This means that each $C > 0$ and every $N > 0$ admits a positive integer $n_{C,N} \geq N$ such that

$$f(n_{C,N}) > C\, g(n_{C,N}) \geq 0.$$

Now, $I = \{n_{1,2^i} : i \geq 1\}$ is an infinite set of integers such that

$$f(n) > g(n) \geq 0$$

for all $n \in I$. It follows that $f(n) = \substack{\infty \\ \Omega}(g(n))$, with $c = 1$.

Setting $f(n) = 1 + \sin n$ and $g(n) = 1 + \cos n$, we see that $f(n) \notin O(g(n)) \cup \Omega(g(n))$. $\square$

#### Problem 3-5b

The introduction of $\substack{\infty \\ \Omega}$ renders all functions asymptotically comparable, which can be useful for making asymptotic statements about a wide class of functions without having to rule out the pathological cases (such as those involving periodic functions like $\sin$). In this manner, the utility of $\substack{\infty \\ \Omega}$ is analogous to that of limit superior and limit inferior, two generalizations of limit that exist for all sequences and functions.

On the other hand, the relaxed lower bound $\substack{\infty \\ \Omega}$ provides may not be all that useful for determining tight, or sometimes even reasonable, lower bounds for a specific function. For example, if

$$f(n) = \begin{cases} n & \text{if $n$ is even}, \\ 0 & \text{if $n$ is odd}, \end{cases} \qquad g(n) = n,$$

then our intuition suggests that $g(n)$ cannot be a good lower bound of $f(n)$. Nevertheless, $f(n) = \substack{\infty \\ \Omega}(g(n))$. $\square$

#### Problem 3-5c

$f(n) = \Omega(g(n))$ forces $f$ to be eventually nonnegative, and so $f(n) \in \Omega(g(n)) \cap O'(g(n))$ implies $f(n) = O(g(n))$. It follows from Theorem 3.1 that $f(n) = \Theta(g(n))$. Conversely, if $f(n) = \Theta(g(n))$, then Theorem 3.1 implies that $f(n) = O(g(n))$ and $f(n) = \Omega(g(n))$. Since $O(g(n)) \subseteq O'(g(n))$, it follows that $f(n) = O'(g(n))$. $\square$

#### Problem 3-5d

We define

$$\tilde{\Omega}(g(n)) = \{f(n) : \text{there exist positive constants } c, k, \text{ and } n_0 \text{ such that } 0 \leq c\, g(n) \log^{-k}(n) \leq f(n) \text{ for all } n \geq n_0\}$$

and

$$\tilde{\Theta}(g(n)) = \{f(n) : \text{there exist positive constants } c_1, c_2, k_1, k_2, \text{ and } n_0 \text{ such that } 0 \leq c_1 g(n) \log^{-k_1}(n) \leq f(n) \leq c_2 g(n) \log^{k_2}(n) \text{ for all } n \geq n_0\}.$$

If $f(n) = \tilde{\Theta}(g(n))$, then we can find positive constants $c_1$, $c_2$, $k_1$, $k_2$, and $n_0$ such that

$$0 \leq c_1 g(n) \log^{-k_1}(n) \leq f(n) \leq c_2 g(n) \log^{k_2}(n)$$

for all $n \geq n_0$. This, in particular, implies that

$$0 \leq c_1 g(n) \log^{-k_1}(n) \leq f(n)$$

for all $n \geq n_0$, and so $f(n) = \tilde{\Omega}(g(n))$. Similarly,

$$0 \leq f(n) \leq c_2 g(n) \log^{k_2}(n)$$

for all $n \geq n_0$, and so $f(n) = \tilde{O}(g(n))$.

Conversely, we assume that $f(n) = \tilde{O}(g(n))$ and $f(n) = \tilde{\Omega}(g(n))$. The $\tilde{\Omega}$-estimate furnishes positive constants $c_1$, $k_1$, and $n_1$ such that

$$0 \leq c_1 g(n) \log^{-k_1}(n) \leq f(n)$$

for all $n \geq n_1$. The $\tilde{O}$-estimate furnishes positive constants $c_2$, $k_2$, and $n_2$ such that

$$0 \leq f(n) \leq c_2 g(n) \log^{k_2}(n)$$

for all $n \geq n_2$. It follows that

$$0 \leq c_1 g(n) \log^{-k_1}(n) \leq f(n) \leq c_2 g(n) \log^{k_2}(n)$$

for all $n \geq \max(n_1,n_2)$, whence $f(n) = \tilde{\Theta}(g(n))$.

We conclude that $f(n) = \tilde{\Theta}(g(n))$ if and only if $f(n) = \tilde{O}(g(n))$ and $f(n) = \tilde{\Omega}(g(n))$. $\square$

### 3-6. Iterated functions

We can apply the iteration operator $\ast$ used in the $\log^*$ function to any monotonically increasing function $f(n)$ over the reals. For a given constant $c \in \mathbb{R}$, we define the iterated function $f_c^*$ by

$$f_c^*(n) = \min\{i \geq 0 : f^{(i)}(n) \leq c\},$$

which need not be well defined in all cases. In other words, the quantity $f_c^*(n)$ is the number of iterated applications of the function $f$ required to reduce its argument down to $c$ or less.

For each of the following functions $f(n)$ and constants $c$, give as tight a bound as possible on $f_c^*(n)$.

| | $f(n)$ | $c$ | $f_c^*(n)$ |
| --- | --- | --- | --- |
| a. | $n - 1$ | $0$ | |
| b. | $\log n$ | $1$ | |
| c. | $n/2$ | $1$ | |
| d. | $n/2$ | $2$ | |
| e. | $\sqrt{n}$ | $2$ | |
| f. | $\sqrt{n}$ | $1$ | |
| g. | $n^{1/3}$ | $2$ | |
| h. | $n/\log n$ | $2$ | |

#### Problem 3-6a

$f^{(i)}(n) = n - i$, and so $f^*_c(n) = n$. $\square$

#### Problem 3-6b

We define $g(n) = 2^n$. For notational convenience, we write $g^{-1}(n) = \log n$ and $g^{-i}(n) = (g^{-1} \circ \cdots \circ g^{-1})(n)$.

For each $n > 1$, there exists a unique positive integer $i$ such that

$$g^{(i-1)}(1) < n \leq g^{(i)}(1).$$

We can find the value of $i$ by computing $g^{(k)}(1)$ for each $k \geq 1$ until we reach the desired interval.

Since $g$ and $g^{-1}$ are monotonically increasing, we see that

$$g^{-i}(n) \leq g^{-i}\left(g^{(i)}(1)\right) = 1$$

and that

$$g^{-(i-1)}(n) > g^{-(i-1)}\left(g^{(i-1)}(1)\right) = 1.$$

It follows that $f_c^*(n) = i$; in other words, $f_1^*(n) = \log^* n$. $\square$

#### Problem 3-6c

$f^{(i)}(n) = n/2^i$, and so $f_c^*(n) = \lceil \log n \rceil$. $\square$

#### Problem 3-6d

$f^{(i)}(n) = n/2^i$, and so $f_c^*(n) = \lceil \log n \rceil - 1$. $\square$

#### Problem 3-6e

$f^{(i)}(n) = n^{1/2^i}$, and so $f_c^*(n) = \lceil \log \log n \rceil$ whenever $n \geq 2$. We have $f_c^*(1) = 0$. $\square$

#### Problem 3-6f

Observe that $f^{(i)}(n) = n^{1/2^i}$. For $n > 1$, this sequence converges to 1 but never attains it: the square-root function is strictly increasing, and so $n > 1$ implies $\sqrt{n} > \sqrt{1} = 1$. By induction, $f^{(i)}(n) > 1$ whenever $n > 1$. It follows that

$$f_1^*(n) = \begin{cases} 0 & \text{if } n \leq 1, \\ \infty & \text{if } n > 1; \end{cases}$$

that is, $f_1^*$ is not well defined for $n > 1$. $\square$

#### Problem 3-6g

$f^{(i)}(n) = n^{1/3^i}$, and so $f_c^*(n) = \lceil \log_3 \log n \rceil$ whenever $n \geq 2$, where $\log_3$ is the logarithm base 3. We have $f_c^*(1) = 0$. $\square$
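The closed forms for parts (c), (d), and (e) can be verified by iterating directly; a small sketch:

```python
import math

def f_star(f, c, n):
    """Smallest i with f applied i times to n giving a value <= c."""
    count = 0
    while n > c:
        n = f(n)
        count += 1
    return count

for n in [2, 7, 16, 100, 1000]:
    # (c) f(n) = n/2, c = 1: ceil(lg n) iterations.
    assert f_star(lambda x: x / 2, 1, n) == math.ceil(math.log2(n))
    # (d) f(n) = n/2, c = 2: one fewer iteration.
    assert f_star(lambda x: x / 2, 2, n) == max(math.ceil(math.log2(n)) - 1, 0)
    # (e) f(n) = sqrt(n), c = 2: ceil(lg lg n) iterations.
    assert f_star(math.sqrt, 2, n) == max(math.ceil(math.log2(math.log2(n))), 0)
```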

#### Problem 3-6h

Each application of $f(n) = n/\log n$ divides its argument by the logarithm of the current value. As long as the argument remains above $\sqrt{n}$, its logarithm lies between $\frac{1}{2}\log n$ and $\log n$, so $i$ iterations shrink the argument by a factor between roughly $\left(\frac{1}{2}\log n\right)^i$ and $(\log n)^i$. Reducing $n$ to $2$ or less therefore takes $\Theta(\log n/\log\log n)$ iterations; that is,

$$f_2^*(n) = \Theta\!\left(\frac{\log n}{\log \log n}\right). \qquad \square$$

## Additional remarks and further results

### Stirling’s approximation and Euler–Maclaurin summation

In CLRS (and in many elementary probability/statistics textbooks), Stirling’s approximation is given as

$$n! = \sqrt{2\pi n}\left(\frac{n}{e}\right)^n \left(1 + \Theta\!\left(\frac{1}{n}\right)\right).$$
A more precise asymptotic expansion can be calculated through the Euler–Maclaurin summation formula:

Euler–Maclaurin summation formula (Sedgewick/Flajolet 4.3). Let $f(x)$ be a function defined on the interval $[1,\infty)$ and suppose that the derivatives $f^{(i)}(x)$ exist and are absolutely integrable for $1 \leq i \leq 2m$, where $m$ is a fixed constant. Then

$$\sum_{k=1}^{n} f(k) = \int_1^n f(x)\,dx + \frac{1}{2} f(n) + C_f + \sum_{k=1}^{m} \frac{B_{2k}}{(2k)!}\, f^{(2k-1)}(n) + R_{2m},$$

where

$$C_f = \lim_{n \to \infty} \left( \sum_{k=1}^{n} f(k) - \int_1^n f(x)\,dx - \frac{1}{2} f(n) - \sum_{k=1}^{m} \frac{B_{2k}}{(2k)!}\, f^{(2k-1)}(n) \right)$$

is a constant dependent on the function $f$, $B_{2k}$ is the $2k$th Bernoulli number, given as the $2k$th coefficient of the Taylor series

$$\frac{x}{e^x - 1} = \sum_{i=0}^{\infty} \frac{B_i}{i!}\, x^i,$$

and $R_{2m}$ is a remainder term satisfying the asymptotic estimate

$$R_{2m} = O\!\left(\int_n^\infty \left\vert f^{(2m)}(x) \right\vert dx \right).$$

For now, we take for granted that, for $f(x) = \ln x$, the constant term of the resulting expansion is

$$\ln \sqrt{2\pi}.$$

By the formula, we have that

$$\ln n! = \sum_{k=1}^{n} \ln k = n \ln n - n + \frac{1}{2}\ln n + \ln \sqrt{2\pi} + \frac{1}{12n} + O\!\left(\frac{1}{n^3}\right).$$

It now follows that

$$n! = \sqrt{2\pi n}\left(\frac{n}{e}\right)^n \left(1 + \frac{1}{12n} + O\!\left(\frac{1}{n^2}\right)\right).$$
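The refinement is visible numerically: the bare Stirling term has relative error on the order of $1/(12n)$, and including the $1/(12n)$ correction shrinks it to $O(1/n^2)$. A quick sketch:

```python
import math

for n in [10, 50, 150]:
    exact = math.factorial(n)
    base = math.sqrt(2 * math.pi * n) * (n / math.e) ** n
    # Leading term alone: relative error roughly 1/(12n).
    assert abs(base / exact - 1) < 1 / (11 * n)
    # With the 1/(12n) correction: relative error well below 1/n^2.
    refined = base * (1 + 1 / (12 * n))
    assert abs(refined / exact - 1) < 1 / n ** 2
```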

### General form of the Euler–Maclaurin summation formula

To generalize the discrete Euler–Maclaurin summation formula, we introduce the Bernoulli polynomials

$$B_m(x), \qquad m = 0, 1, 2, \ldots,$$

which constitute the coefficients of the Taylor series

$$\frac{t e^{xt}}{e^t - 1} = \sum_{m=0}^{\infty} B_m(x)\, \frac{t^m}{m!}.$$

Differentiating the Taylor series term-by-term, we see that

$$B_m'(x) = m\, B_{m-1}(x)$$

for each $m \geq 1$ and every $x \in \mathbb{R}$.

We now observe that integrating a function by parts with Bernoulli polynomial $B_1(x) = x-1/2$ yields

$$\int_0^1 g(x)\,dx = \bigl[B_1(x)\, g(x)\bigr]_0^1 - \int_0^1 B_1(x)\, g'(x)\,dx = \frac{g(0) + g(1)}{2} - \int_0^1 B_1(x)\, g'(x)\,dx.$$

Setting $g(x) = f(x+k)$, we obtain the identity

$$\int_k^{k+1} f(x)\,dx = \frac{f(k) + f(k+1)}{2} - \int_0^1 B_1(x)\, f'(x+k)\,dx.$$

We sum on $[a,b]$ to obtain

$$\int_a^b f(x)\,dx = \sum_{k=a}^{b} f(k) - \frac{f(a) + f(b)}{2} - \sum_{k=a}^{b-1} \int_0^1 B_1(x)\, f'(x+k)\,dx.$$

We have thus obtained the following relationship between a sum and the corresponding integral:

$$\sum_{k=a}^{b} f(k) = \int_a^b f(x)\,dx + \frac{f(a) + f(b)}{2} + \sum_{k=a}^{b-1} \int_0^1 B_1(x)\, f'(x+k)\,dx. \tag{$\ast$}$$

We can furnish a bound on $(\ast)$ by recursively integrating by parts with Bernoulli polynomials. Since $B_{i+1}'(x) = (i+1) B_{i}(x)$, we have that

$$(i+1) \int_0^1 B_i(x)\, g^{(i)}(x)\,dx = \bigl[B_{i+1}(x)\, g^{(i)}(x)\bigr]_0^1 - \int_0^1 B_{i+1}(x)\, g^{(i+1)}(x)\,dx.$$

Dividing through by $(i+1)!$, we obtain a recurrence relation:

$$\frac{1}{i!} \int_0^1 B_i(x)\, g^{(i)}(x)\,dx = \left[\frac{B_{i+1}(x)}{(i+1)!}\, g^{(i)}(x)\right]_0^1 - \frac{1}{(i+1)!} \int_0^1 B_{i+1}(x)\, g^{(i+1)}(x)\,dx.$$

Computing recursively, we obtain:

$$\int_0^1 B_1(x)\, g'(x)\,dx = \sum_{i=2}^{2m} (-1)^i \left[\frac{B_i(x)}{i!}\, g^{(i-1)}(x)\right]_0^1 - \frac{1}{(2m)!} \int_0^1 B_{2m}(x)\, g^{(2m)}(x)\,dx.$$

Since $B_1(x) = x - \frac{1}{2}$ and $B_i(0) = B_i(1) = B_i$ for $i > 1$ with $B_i = 0$ for odd $i > 1$, we see that

$$\int_0^1 B_1(x)\, g'(x)\,dx = \sum_{k=1}^{m} \frac{B_{2k}}{(2k)!} \left(g^{(2k-1)}(1) - g^{(2k-1)}(0)\right) - \frac{1}{(2m)!} \int_0^1 B_{2m}(x)\, g^{(2m)}(x)\,dx.$$

We now substitute $g(x) = f(x+k)$ and sum over $[a,b]$ to obtain

$$\sum_{k=a}^{b} f(k) = \int_a^b f(x)\,dx + \frac{f(a) + f(b)}{2} + \sum_{k=1}^{m} \frac{B_{2k}}{(2k)!} \left(f^{(2k-1)}(b) - f^{(2k-1)}(a)\right) + R_{2m},$$

where

$$R_{2m} = -\frac{1}{(2m)!} \sum_{k=a}^{b-1} \int_0^1 B_{2m}(x)\, f^{(2m)}(x+k)\,dx.$$

The remainder satisfies the estimate

$$\vert R_{2m} \vert \leq \frac{\max_{0 \leq x \leq 1} \vert B_{2m}(x) \vert}{(2m)!} \int_a^b \left\vert f^{(2m)}(x) \right\vert dx = O\!\left(\int_a^b \left\vert f^{(2m)}(x) \right\vert dx \right).$$

### Linear probing and the Ramanujan $Q$-function

An important tool in the theory of hashing (CLRS ch. 11 & TAOCP vol. 3, sec. 6.4) is linear probing, which searches a table with $M$ cells, $N$ of which are occupied, for a given key $K$, inserting the key into the table whenever $K$ is not in the table and $M > N$.

A famous theorem of Knuth (TAOCP vol.3, sec.6.4, Theorem K) gives an asymptotic analysis of the average number of cells linear probing examines, in terms of the Ramanujan $Q$-function

$$Q_r(M,N) = \sum_{k \geq 0} \binom{k+r}{k} \frac{N(N-1)\cdots(N-k+1)}{M^k}.$$

For a successful search—i.e., $K$ is inserted—the average number is

$$C_N = \frac{1}{2}\left(1 + Q_0(M, N-1)\right).$$

For an unsuccessful search—i.e., $K$ is not inserted—the average number is

$$C_N' = \frac{1}{2}\left(1 + Q_1(M, N)\right).$$

Since

$$Q_0(M,N) \leq \sum_{k \geq 0} \left(\frac{N}{M}\right)^k = \frac{1}{1 - N/M} \quad \text{and} \quad Q_1(M,N) \leq \sum_{k \geq 0} (k+1)\left(\frac{N}{M}\right)^k = \frac{1}{(1 - N/M)^2},$$

we see that

$$C_N \leq \frac{1}{2}\left(1 + \frac{1}{1 - (N-1)/M}\right) \quad \text{and} \quad C_N' \leq \frac{1}{2}\left(1 + \frac{1}{(1 - N/M)^2}\right).$$

Setting $\alpha = N/M$, we obtain

$$C_N \approx \frac{1}{2}\left(1 + \frac{1}{1-\alpha}\right) \quad \text{and} \quad C_N' \approx \frac{1}{2}\left(1 + \frac{1}{(1-\alpha)^2}\right),$$

which is a useful estimate for small values of $\alpha$.

At the opposite extreme, we see that $Q_1(M,M-1) = M$, and that

$$Q_0(M,M-1) = Q(M) \sim \sqrt{\frac{\pi M}{2}},$$

where $Q(M) = Q_0(M,M-1)$ is the classical Ramanujan $Q$-function, which yields the following estimate (Sedgewick/Flajolet Theorem 4.8):

$$C_M \sim \frac{1}{2}\sqrt{\frac{\pi M}{2}} = \sqrt{\frac{\pi M}{8}}.$$

Knuth (TAOCP vol.1, sec.1.2.11.3) gives a refined estimate:

$$Q(M) = \sqrt{\frac{\pi M}{2}} - \frac{1}{3} + \frac{1}{12}\sqrt{\frac{\pi}{2M}} - \frac{4}{135 M} + O\!\left(M^{-3/2}\right).$$
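The expansion's first two terms, $\sqrt{\pi M/2} - 1/3$, can be checked against an exact evaluation of $Q(M) = 1 + \frac{M-1}{M} + \frac{(M-1)(M-2)}{M^2} + \cdots$ (a finite sum):

```python
import math

def ramanujan_q(m):
    """Exact Q(m): sum of (m-1)(m-2)...(m-k)/m^k over k = 0, 1, ..., m-1."""
    total, term, k = 0.0, 1.0, 0
    while True:
        total += term
        k += 1
        if k > m - 1:
            break
        term *= (m - k) / m
    return total

for m in [100, 10000]:
    approx = math.sqrt(math.pi * m / 2) - 1 / 3
    # The two-term approximation is already accurate to O(1/sqrt(m)).
    assert abs(ramanujan_q(m) - approx) < 2 / math.sqrt(m)
```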

### Ramanujan distributions

Recall from the previous section that the $Q$-distribution is

$$Q(M,N) = \sum_{k \geq 0} \frac{N(N-1)\cdots(N-k+1)}{M^k} = 1 + \frac{N}{M} + \frac{N(N-1)}{M^2} + \cdots.$$

The term-by-term asymptotic estimate (Sedgewick/Flajolet, Theorem 4.4) is as follows:

$$\frac{N(N-1)\cdots(N-k+1)}{M^k} = \left(\frac{N}{M}\right)^k e^{-k^2/(2N)} \left(1 + O\!\left(\frac{k}{N}\right) + O\!\left(\frac{k^3}{N^2}\right)\right).$$

Closely related to the $Q$-distribution is the $R$-distribution

$$R(M,N) = \sum_{k \geq 0} \frac{N^k}{(M+1)(M+2)\cdots(M+k)} = 1 + \frac{N}{M+1} + \frac{N^2}{(M+1)(M+2)} + \cdots,$$

which, for $N = M$, is also asymptotically equivalent to

$$\sqrt{\frac{\pi M}{2}}$$

as $M \to \infty$. Knuth’s estimate (TAOCP vol.1, sec.1.2.11.3) is

$$R(M,M) = \sqrt{\frac{\pi M}{2}} + \frac{1}{3} + O\!\left(\frac{1}{\sqrt{M}}\right).$$

The term-by-term asymptotic estimate of the $R$-distribution is

$$\frac{N^k}{(M+1)(M+2)\cdots(M+k)} = \left(\frac{N}{M}\right)^k e^{-k^2/(2M)} \left(1 + O\!\left(\frac{k}{M}\right) + O\!\left(\frac{k^3}{M^2}\right)\right)$$

as well.

Many interesting probability distributions can be represented as products of the $Q$-distribution and the $R$-distribution. For example, each even term of the binomial distribution

can be rewritten as follows (Sedgewick/Flajolet, Theorem 4.6):

Stirling’s formula yields

$$\binom{2M}{M} = \frac{(2M)!}{M!\,M!} \sim \frac{4^M}{\sqrt{\pi M}},$$

whence we see that

$$2^{-2M} \binom{2M}{M} \sim \frac{1}{\sqrt{\pi M}}.$$