
Commonly used notations in the theory of algorithms

To summarise the analysis of an algorithm, there are a few commonly used notations that describe its order of growth.

| notation | provides | example | shorthand for | used to |
| --- | --- | --- | --- | --- |
| Big Theta | asymptotic order of growth | $Θ(N^{2})$ | $\frac{1}{2} N^{2}$, $10 N^{2}$, $5 N^{2} + 22 N \log N + 3 N$ | classify algorithms |
| Big Oh | $Θ(N^{2})$ and smaller | $O(N^{2})$ | $10 N^{2}$, $100 N$, $22 N \log N + 3 N$ | develop upper bounds |
| Big Omega | $Θ(N^{2})$ and larger | $Ω(N^{2})$ | $\frac{1}{2} N^{2}$, $N^{5}$, $N^{3} + 22 N \log N + 3 N$ | develop lower bounds |
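The "shorthand for" column can be checked numerically: every function in the Big Theta row has order of growth $N^{2}$, so doubling $N$ multiplies each of them by roughly 4 for large $N$. A small sketch (the function names `f1`–`f3` are just illustrative labels for the three expressions in the table):

```python
import math

# Three expressions from the Big Theta row, all Theta(N^2):
def f1(n): return 0.5 * n * n                                  # (1/2) N^2
def f2(n): return 10 * n * n                                   # 10 N^2
def f3(n): return 5 * n * n + 22 * n * math.log2(n) + 3 * n    # 5 N^2 + 22 N log N + 3 N

# Doubling N should multiply each value by ~4, since the N^2 term dominates.
for f in (f1, f2, f3):
    ratio = f(2_000_000) / f(1_000_000)
    print(round(ratio, 3))
```

The lower-order terms $22 N \log N + 3 N$ barely move the ratio, which is why Big Theta discards them.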

Let's take our 1-Sum problem and try to do the following:

  1. Establish the "difficulty" of the problem
  2. Develop an "optimal" algorithm
  • 1-Sum: is there a zero in the array?
    • Upper bound: a specific algorithm (e.g. brute force) has running time $O(N)$
    • Lower bound: any algorithm must examine all $N$ entries (an unexamined one might be 0), so $Ω(N)$

Optimal algorithm: the upper and lower bounds match, so brute force is optimal, with running time $Θ(N)$.
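The brute-force 1-Sum algorithm is just a linear scan; a minimal sketch (the name `has_zero` is mine):

```python
def has_zero(a):
    """1-Sum: is there a zero in the array? Brute force, O(N)."""
    for x in a:          # examines each entry once: N checks in the worst case
        if x == 0:
            return True
    return False
```

For example, `has_zero([3, 1, 0, 5])` returns `True`, while `has_zero([3, 1, 5])` returns `False`.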

Let's take our 3-Sum problem and try to do the same:

  1. Establish the "difficulty" of the problem
  2. Develop an "optimal" algorithm
  • Upper bound: the brute-force algorithm has running time $O(N^{3})$
  • Upper bound: an improved algorithm has running time $O(N^{2} \log N)$
  • Lower bound (a proof that no algorithm can do better): any algorithm must examine all $N$ entries to solve 3-Sum, so the optimal algorithm has running time $Ω(N)$
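The two upper bounds can be sketched side by side. A standard way to reach $O(N^{2} \log N)$ is to sort the array and binary-search for the third term; this sketch assumes distinct integer entries (the classic formulation), and the function names are mine:

```python
from bisect import bisect_left

def three_sum_brute(a):
    """Count triples i < j < k with a[i] + a[j] + a[k] == 0: O(N^3)."""
    n, count = len(a), 0
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(j + 1, n):
                if a[i] + a[j] + a[k] == 0:
                    count += 1
    return count

def three_sum_sorted(a):
    """O(N^2 log N): sort once, then binary-search for -(a[i] + a[j]).
    Assumes distinct entries, so each triple is counted exactly once."""
    a = sorted(a)                      # O(N log N)
    n, count = len(a), 0
    for i in range(n):                 # N^2 pairs ...
        for j in range(i + 1, n):
            target = -(a[i] + a[j])
            k = bisect_left(a, target)  # ... each with an O(log N) search
            if k < n and a[k] == target and k > j:
                count += 1
    return count
```

Both versions agree on any input with distinct entries; only the growth rate differs.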

Open Problems

  • We don't know what the optimal algorithm for 3-Sum is: the best known upper bound $O(N^{2} \log N)$ and the lower bound $Ω(N)$ don't match, so there is a gap to close