Growth Rates

Algorithms analysis is all about understanding growth rates: as the amount of data gets bigger, how much more resource will my algorithm require? Typically, we describe the resource growth rate of a piece of code as a function of the amount of data. To help understand the implications, this section looks at the graphs of different growth rates, from most efficient to least efficient.

Constant Growth Rate

A constant resource need is one where the resource need does not grow as the amount of data increases. That is, processing 1 piece of data takes the same amount of resource as processing 1 million pieces of data. The graph of such a growth rate is a horizontal line.
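
For example, looking up an element of an array by its index does the same amount of work whether the array holds one element or a million. The short C++ sketch below is added here as an illustration; the function name is made up rather than taken from the course code.

```cpp
#include <cstddef>
#include <vector>

// Constant: the work done does not depend on how many elements the
// vector holds; a single index calculation and memory access is
// performed whether there is 1 element or 1 million.
int getElement(const std::vector<int>& data, std::size_t index) {
    return data[index];   // one array access, regardless of data.size()
}
```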

Logarithmic Growth Rate

A logarithmic growth rate is one where the resource need grows by one unit each time the amount of data is doubled. This effectively means that as the amount of data gets bigger, the curve describing the growth rate gets flatter (closer to horizontal, but never reaching it). The following graph shows what a curve of this nature looks like.
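
As a rough illustration (the function below is a sketch written for this page, not course code), the loop halves the remaining value on every pass, so doubling n adds only one extra iteration. Binary search, analysed later in the course, follows the same pattern.

```cpp
#include <cstddef>

// Logarithmic: each pass halves n, so the loop body runs roughly
// log2(n) times; doubling n adds only one more pass.
std::size_t countHalvings(std::size_t n) {
    std::size_t steps = 0;
    while (n > 1) {
        n /= 2;        // cut the remaining work in half
        ++steps;
    }
    return steps;      // approximately log2 of the original n
}
```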

Linear Growth Rate

A linear growth rate is one where the resource need and the amount of data are directly proportional to each other. That is, the growth rate can be described by a straight line that is not horizontal.
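
A simple illustration is a loop that touches every element exactly once, as in the sketch below (an added example; the function name is invented for illustration).

```cpp
#include <vector>

// Linear: the loop body runs once per element, so doubling the
// number of elements doubles the amount of work.
int sumAll(const std::vector<int>& data) {
    int total = 0;
    for (int value : data) {
        total += value;   // executed data.size() times
    }
    return total;
}
```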

Log Linear

A log linear growth rate is described by a slightly curved line; the curve is more pronounced for lower values than for higher ones.
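
Algorithms such as merge sort, covered later, have a log linear running time. The sketch below (an added illustration, not course code) shows the pattern directly: an outer loop that runs n times, where each pass does a logarithmic amount of work.

```cpp
#include <cstddef>

// Log linear: an outer loop that runs n times, where each pass does
// a logarithmic amount of work, performs roughly n * log2(n) steps
// in total.
std::size_t logLinearSteps(std::size_t n) {
    std::size_t steps = 0;
    for (std::size_t i = 0; i < n; ++i) {
        for (std::size_t m = n; m > 1; m /= 2) {
            ++steps;   // about log2(n) increments per outer pass
        }
    }
    return steps;      // approximately n * log2(n)
}
```
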
Quadratic Growth Rate

A quadratic growth rate is one that can be described by a parabola.
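
A typical source of quadratic growth is a pair of nested loops over the same data, as in the sketch below (an added illustration; the function name is invented).

```cpp
#include <cstddef>
#include <vector>

// Quadratic: two nested loops over the same data perform roughly
// n * n steps, so doubling the number of elements quadruples the work.
std::size_t countEqualPairs(const std::vector<int>& data) {
    std::size_t count = 0;
    for (std::size_t i = 0; i < data.size(); ++i) {
        for (std::size_t j = 0; j < data.size(); ++j) {
            if (data[i] == data[j]) {
                ++count;   // the comparison runs n * n times in total
            }
        }
    }
    return count;
}
```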

Cubic Growth Rate

A cubic growth rate is one that can be described by a cubic function. While its curve may look very similar to the quadratic curve, it grows significantly faster.
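
Three nested loops over the same amount of data give cubic growth, as the added sketch below illustrates (the function name is invented for this example).

```cpp
#include <cstddef>

// Cubic: three nested loops give n * n * n steps, so doubling n
// multiplies the amount of work by eight.
std::size_t cubicSteps(std::size_t n) {
    std::size_t steps = 0;
    for (std::size_t i = 0; i < n; ++i) {
        for (std::size_t j = 0; j < n; ++j) {
            for (std::size_t k = 0; k < n; ++k) {
                ++steps;   // executed n^3 times
            }
        }
    }
    return steps;
}
```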

Exponential Growth Rate

An exponential growth rate is one where each extra unit of data requires a doubling of the resource. As you can see from the graph, the curve starts off looking almost flat but quickly shoots up to near vertical (note that it can never actually be vertical).
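
Visiting every subset of a collection is a classic example of exponential growth, since a collection of n items has 2^n subsets. The sketch below is an added illustration (the function name is invented) and is only practical for small n.

```cpp
#include <cstddef>
#include <vector>

// Exponential: a collection of n items has 2^n subsets, so visiting
// every subset doubles the amount of work each time one more item
// is added. Only practical for small n.
std::size_t countSubsets(const std::vector<int>& items) {
    std::size_t count = 0;
    // Each bit pattern from 0 to 2^n - 1 selects one subset.
    std::size_t total = std::size_t{1} << items.size();
    for (std::size_t mask = 0; mask < total; ++mask) {
        ++count;   // the loop body runs 2^n times
    }
    return count;
}
```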

[Figures: Constant Curve, Logarithmic Curve, Linear Curve, Quadratic Curve, Cubic Curve, Exponential Curve]