Data Structures and Algorithms
  1. Introduction to Algorithms Analysis

Big-O, Little-o, Theta, Omega


Last updated 5 years ago


Big-O, little-o, Omega, and Theta are formal notations for stating the growth of the resource needs (running time and storage) of an algorithm. There are four basic notations used when describing resource needs: O(f(n)), o(f(n)), Ω(f(n)), and Θ(f(n)), pronounced big-O, little-o, Omega, and Theta respectively.

Formally:

"T(n) is O(f(n))" iff for some constants c and n₀, T(n) ≤ c·f(n) for all n ≥ n₀

"T(n) is Ω(f(n))" iff for some constants c and n₀, T(n) ≥ c·f(n) for all n ≥ n₀

"T(n) is Θ(f(n))" iff T(n) is O(f(n)) AND T(n) is Ω(f(n))

"T(n) is o(f(n))" iff T(n) is O(f(n)) AND T(n) is NOT Θ(f(n))

Informally:

"T(n) is O(f(n))" basically means that f(n) describes an upper bound for T(n)

"T(n) is Ω(f(n))" basically means that f(n) describes a lower bound for T(n)

"T(n) is Θ(f(n))" basically means that f(n) describes the exact bound for T(n)

"T(n) is o(f(n))" basically means that f(n) is an upper bound for T(n), but a strict one: T(n)'s growth rate is strictly less than that of f(n), so the two can never match

Another way of saying this:

"T(n) is O(f(n))": growth rate of T(n) <= growth rate of f(n)

"T(n) is Ω(f(n))": growth rate of T(n) >= growth rate of f(n)

"T(n) is Θ(f(n))": growth rate of T(n) == growth rate of f(n)

"T(n) is o(f(n))": growth rate of T(n) < growth rate of f(n)

An easy way to think about big-O

The math in big-O analysis can often intimidate students. One of the simplest ways to think about big-O analysis is that it is basically a rating system for your algorithms (like movie ratings). It tells you the kind of resource needs you can expect the algorithm to exhibit as your data gets bigger and bigger. From best (least resource requirements) to worst, the ratings are: O(1), O(log n), O(n), O(n log n), O(n²), O(n³), O(2ⁿ). Think about the graphs in the growth rates section and the way each curve looks; that is the most important thing to understand about algorithm analysis.

What all this means

Let's take a closer look at the formal definition for big-O analysis:

"T(n) is O(f(n))" iff for some constants c and n₀, T(n) ≤ c·f(n) for all n ≥ n₀

The way to read the above statement is as follows:

  • n is the size of the data set.

  • f(n) is a function that is calculated using n as the parameter.

  • O(f(n)) means that the curve described by f(n) is an upper bound for the resource needs of a function.

This means that if we were to draw a graph of the resource needs of a particular algorithm, it would fall under the curve described by f(n). What's more, it doesn't need to be under the exact curve described by f(n); it could be under a constant-scaled curve of f(n). So instead of having to be under the n² curve, it can be under the 10n² curve or the 200n² curve. In fact the multiplier can be anything, as long as it is a constant. A constant is simply a number that does not change with n: as n gets bigger, the constant cannot change. The actual value of the constant does not matter, though.

The other portion of the statement, n ≥ n₀, means that T(n) ≤ c·f(n) does not need to be true for all values of n. As long as you can find a value n₀ for which T(n) ≤ c·f(n) is true, and it never becomes untrue for any n larger than n₀, then you have met the criteria for the statement "T(n) is O(f(n))".

In summary, when we are looking at big-O, we are in essence looking for a description of the growth rate of the resource increase. The exact numbers do not actually matter in the end.