11. Evaluating algorithms

There are several methods to analyze an algorithm. The following chapters concern themselves with two of them, using the worst-case scenario of InsertionSort as the running example: in the worst case, InsertionSort takes \(\frac{n^2-n}{2}\) comparisons.
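To see where this count comes from, one can instrument InsertionSort with a comparison counter and feed it reverse-sorted input, which is the worst case. A minimal Python sketch (the function name and the counting style are my own, not part of any course material):

    def insertion_sort_comparisons(a):
        """Sort a in place and return the number of key comparisons made."""
        comparisons = 0
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            # Shift every element larger than key one slot to the right.
            while j >= 0:
                comparisons += 1      # one comparison of key against a[j]
                if a[j] <= key:
                    break
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key
        return comparisons

    # Reverse-sorted input forces the maximum number of comparisons.
    for n in (4, 8, 16):
        assert insertion_sort_comparisons(list(range(n, 0, -1))) == (n * n - n) // 2

On an already sorted array the same counter reports only \(n-1\) comparisons, which is why the worst case is the interesting one here.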

Acknowledgments

I’d like to acknowledge the failure that is the German education system, which forces the computer science students in Wuerzburg, Germany to use an idiotic method for analyzing algorithms in their algorithms and data structures class. Or as we say in Germany: “Danke Merkel!” (“Thanks, Merkel!”)

11.1. Big O notation

See also Big O notation on Wikipedia.

Notation: \(f(n) = o(g(n))\)
Name: Small O; Small Oh
Description: \(f\) is dominated by \(g\) asymptotically
Formal definition: \(\forall k>0 \; \exists n_0 \; \forall n>n_0 \; |f(n)| < k\cdot g(n)\)
Limit definition: \(\lim_{n \to \infty} \frac{f(n)}{g(n)} = 0\)

Notation: \(f(n) = O(g(n))\)
Name: Big O; Big Oh; Big Omicron
Description: \(|f|\) is bounded above by \(g\) (up to constant factor) asymptotically
Formal definition: \(\exists k>0 \; \exists n_0 \; \forall n>n_0 \; |f(n)| \leq k\cdot g(n)\)
Limit definition: \(\limsup_{n \to \infty} \frac{\left|f(n)\right|}{g(n)} < \infty\)

Notation: \(f(n) = \Theta(g(n))\)
Name: Big Theta
Description: \(f\) is bounded both above and below by \(g\) asymptotically
Formal definition: \(\exists k_1>0 \; \exists k_2>0 \; \exists n_0 \; \forall n>n_0 \; k_1\cdot g(n) \leq f(n) \leq k_2\cdot g(n)\)
Limit definition: \(f(n) = O(g(n))\) and \(f(n) = \Omega(g(n))\) (Knuth version)

Notation: \(f(n)\sim g(n)\)
Name: On the order of
Description: \(f\) is equal to \(g\) asymptotically
Formal definition: \(\forall \varepsilon>0\;\exists n_0\;\forall n>n_0\;\left|{f(n) \over g(n)}-1\right|<\varepsilon\)
Limit definition: \(\lim_{n \to \infty} {f(n) \over g(n)} = 1\)

Notation: \(f(n) = \Omega(g(n))\)
Name: Big Omega in number theory (Hardy–Littlewood)
Description: \(|f|\) is not dominated by \(g\) asymptotically
Formal definition: \(\exists k>0 \; \forall n_0 \; \exists n>n_0 \; |f(n)| \geq k\cdot g(n)\)
Limit definition: \(\limsup_{n \to \infty} \left|\frac{f(n)}{g(n)}\right| > 0\)

Notation: \(f(n) = \Omega(g(n))\)
Name: Big Omega in complexity theory (Knuth)
Description: \(f\) is bounded below by \(g\) asymptotically
Formal definition: \(\exists k>0 \; \exists n_0 \; \forall n>n_0 \; f(n) \geq k\cdot g(n)\)
Limit definition: \(\liminf_{n \to \infty} \frac{f(n)}{g(n)} > 0\)

Notation: \(f(n) = \omega(g(n))\)
Name: Small Omega
Description: \(f\) dominates \(g\) asymptotically
Formal definition: \(\forall k>0 \; \exists n_0 \; \forall n>n_0 \; |f(n)| > k\cdot |g(n)|\)
Limit definition: \(\lim_{n \to \infty} \left|\frac{f(n)}{g(n)}\right| = \infty\)
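The limit definitions in the table can be explored numerically before doing any formal proof: print \(\frac{f(n)}{g(n)}\) for growing \(n\) and see whether the ratio heads towards \(0\), a constant, or infinity. A rough Python sketch with example functions of my own choosing (this is a heuristic, not a proof):

    import math

    # ratio -> 0 suggests o, ratio -> constant suggests Theta, ratio -> infinity suggests omega
    pairs = [
        ("n*log(n) vs n^2",  lambda n: n * math.log(n), lambda n: n * n),
        ("(n^2-n)/2 vs n^2", lambda n: (n * n - n) / 2, lambda n: n * n),
        ("n^2 vs n*log(n)",  lambda n: n * n,           lambda n: n * math.log(n)),
    ]
    for name, f, g in pairs:
        print(name, [round(f(n) / g(n), 4) for n in (10**2, 10**4, 10**6)])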


11.2. Bakado - the way of the idiot

The way of the idiot is the standard method taught in the course “Algorithmen und Datenstrukturen” at the Julius-Maximilian-University Wuerzburg, as of winter 2018/19. It is so named because it is tedious and analyzing an algorithm with it can take a long time, yet it is universally applicable to any algorithm. Unfortunately it is also the only surefire way to score points in the exam of the “Algorithmen und Datenstrukturen” course.

Let \(f(n)\) be the worst-case run time of some algorithm \(f\). Let \(g(n)\) be the known worst-case run time of some other algorithm, whose runtime class \(\Theta(g)\) is known: \(g(n) \in \Theta(g)\). Assuming that \(f(n)\) is also in \(\Theta(g)\), we now have to prove that \(f(n) \in O(g)\) and \(f(n) \in \Omega(g)\). If both proofs succeed, we have found the holy grail of the real \(\Theta\) housewives: \(\Theta(f) = \Theta(g)\).

\[\begin{split}O(f) = O(g) & \not\rightarrow \Theta(f) = \Theta(g)\\ \Omega(f) = \Omega(g) & \not\rightarrow \Theta(f) = \Theta(g)\\ O(f) = O(g) \wedge \Omega(f) = \Omega(g) & \rightarrow \Theta(f) = \Theta(g)\end{split}\]

Let \(f(n) =\) \(\frac{n^2-n}{2}\) be the worst-case run time of InsertionSort.

We now assume \(\Theta(f) = \Theta(g) = \Theta(n^2)\). We now have to prove that \(f(n) \in O(n^2)\) and \(f(n) \in \Omega(n^2)\).

The strategy is as follows:

@startuml
start
:Use g(n) with known Theta(g) to compare f(n) to;

:try to prove Theta(f) = Theta(g), i.e. f(n) in O(g) and f(n) in Omega(g);
if (proved f in O?) then (yes)
    if (proved f in Omega?) then (yes)
       :Theta(f) = Theta(g);
    else (no)
       end
    endif
else (no)
   end
endif
stop
@enduml

11.2.1. Proving \(f(n) \in O(g)\)

\[O(g) = \{f: \mathbb{N} \rightarrow \mathbb{R} | \textrm{ There exist constants } c > 0 \textrm{ and } n_0 \textrm{ such that for all } n \geq n_0: f(n) \leq c \cdot g(n)\}\]

To prove that \(f(n) \in O(g)\), it is now necessary to find constant values for both \(c\) and \(n_0\) such that the definition above holds.

In the case of InsertionSort with the worst-case runtime of \(\frac{n^2-n}{2}\) we can choose \(c = 1\) and \(n_0 = 0\), because \(\frac{n^2-n}{2} \leq 1 \cdot n^2\) always holds true.
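A throwaway numerical check of this inequality over a finite range (only the algebra proves it for all \(n\), of course):

    # Sanity check: (n^2 - n) / 2 <= 1 * n^2 for every tested n.
    assert all((n * n - n) / 2 <= 1 * n * n for n in range(0, 10_001))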

11.2.2. Proving \(f(n) \in \Omega(g)\)

\[\Omega(g) = \{f: \mathbb{N} \rightarrow \mathbb{R} | \textrm{ There exist constants } c > 0 \textrm{ and } n_0 \textrm{ such that for all } n \geq n_0: f(n) \geq c \cdot g(n)\}\]

To prove that \(f(n) \in \Omega(g)\), it is now necessary to find constant values for both \(c\) and \(n_0\) such that the definition above holds.

In the case of InsertionSort with the worst-case runtime of \(\frac{n^2-n}{2}\) we can choose \(c = \frac{1}{4}\) and \(n_0 = 2\): if we pick \(c = \frac{1}{4}\), we can solve the following inequality for \(n\):

\[\begin{split}\frac{1}{4}\cdot n^2 & \leq \frac{n \cdot (n-1)}{2}\\ \frac{1}{4}\cdot n & \leq \frac{n-1}{2}\\ \frac{1}{2}\cdot n & \leq n-1\\ n & \leq 2n-2\\ 2 & \leq n\\\end{split}\]

This is now our \(n_0\).
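The threshold can also be found empirically; a small sketch that searches for the first \(n\) from which the inequality keeps holding (again only a sanity check, the derivation above is the proof):

    # Search for the smallest n0 such that (1/4) * n^2 <= (n^2 - n) / 2 for all n in [n0, 100).
    def bound_holds(n):
        return 0.25 * n * n <= (n * n - n) / 2

    n0 = next(n for n in range(100) if all(bound_holds(m) for m in range(n, 100)))
    print(n0)  # 2, matching the derivation above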

11.3. Taidanado - the way of the lazy

The way of the lazy is also known as “Gankona tensaido - the way of the stubborn genius”. The method goes by these names because it can be applied to nearly any algorithm and only takes a matter of seconds to apply.

To apply the way of the lazy, one compares the function one wants to analyze, \(f(n)\), to another function \(g(n)\):

\[\begin{split}\lim_{n\to\infty} \frac{f(n)}{g(n)} = c \rightarrow \left\{\begin{array}{ll} f(n) \in o(g) & \text{for } c = 0\\ f(n) \in \Theta(g) & \text{for } 0 < c < \infty\\ f(n) \in \omega(g) & \text{for } c = \infty \end{array}\right.\end{split}\]

In the case of InsertionSort, \(f(n) = \frac{n^2-n}{2}\) and \(g(n) = n^2\): the limit is \(\lim_{n\to\infty} \frac{n^2-n}{2}\cdot\frac{1}{n^2} = \frac{1}{2}\), therefore InsertionSort's worst case is in \(\Theta(n^2)\).
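If you do not feel like doing the algebra by hand, a computer algebra system computes the same limit. A sketch using SymPy (assuming SymPy is available; the variable names are mine):

    import sympy as sp

    n = sp.symbols("n", positive=True)
    f = (n**2 - n) / 2   # worst-case comparisons of InsertionSort
    g = n**2             # comparison function

    print(sp.limit(f / g, n, sp.oo))   # 1/2, i.e. 0 < c < oo, so f(n) is in Theta(n^2)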

11.4. Satori o aita tensaido - the way of the enlightened genius

Taidanado is extremely useful for getting a hint about which constants to choose in the proofs of bakado, and for deciding whether a statement should be proved or disproved in the first place.