Asymptotic Analysis

Asymptotic notation provides a mathematical framework to describe the dominant behavior of an algorithm's resource usage (usually time or space) as the input size approaches infinity. It uses Big-O, Big-Theta, and Big-Omega notation to categorize different bounds on this resource usage, not just the worst case.

Introducing Asymptotic Notation

An asymptotic function is usually defined as a function T(n), where n is the input size of an algorithm.

There are five different notations, each of which can be thought of intuitively as an inequality between a function f and a bound g:

  • f(n) = O(g(n)) ➔ f ≤ g (asymptotically)
  • f(n) = o(g(n)) ➔ f < g
  • f(n) = Θ(g(n)) ➔ f = g
  • f(n) = Ω(g(n)) ➔ f ≥ g
  • f(n) = ω(g(n)) ➔ f > g
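The "upper bound" reading of Big-O can be spot-checked numerically. The sketch below is our own illustration (not from the notes): the Big-O definition says f(n) = O(g(n)) when constants c and n0 exist with f(n) ≤ c·g(n) for all n ≥ n0, and a finite check over a range of n gives evidence for particular witnesses.

```python
# Our own sketch: spot-check the Big-O definition f(n) <= c * g(n)
# for n0 <= n <= limit. A finite check is evidence, not a proof.
def bounded_above(f, g, c, n0, limit=10_000):
    """Check f(n) <= c * g(n) for every n0 <= n <= limit."""
    return all(f(n) <= c * g(n) for n in range(n0, limit + 1))

# Claim: 3n + 10 = O(n), witnessed by c = 4 and n0 = 10,
# since 3n + 10 <= 4n exactly when n >= 10.
print(bounded_above(lambda n: 3 * n + 10, lambda n: n, c=4, n0=10))  # True
```

With the smaller constant c = 3 the check fails for every n, which is why the definition lets us pick both c and n0.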

Worst-case

  • T(n) = maximum time on any input of size n
  • Provides an upper bound on running time
  • This is an absolute guarantee

The worst case running time corresponds to Big-O Notation.

Best-case

  • T(n) = minimum time on any input of size n
  • Provides a lower bound on running time
  • Cheating! A slow algorithm can still run fast on some inputs, so a good best case guarantees little.

The best case running time corresponds to Big-Omega Notation.

Average-case

  • T(n) = expected time over all inputs of size n
  • Very useful, but treat with care: what is “average”?
  • Random (equally likely) inputs
  • Real-life inputs ➔ need to assume a statistical distribution of inputs

The average case running time corresponds to Big-Theta Notation.
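The three cases can be made concrete with linear search. The sketch below is our own (the helper name `linear_search_comparisons` is not from the notes): counting element comparisons shows a best case of 1 (target first), a worst case of n (target absent), and an average of about n/2 over all successful searches.

```python
# Our own illustration of best/worst/average case via linear search.
def linear_search_comparisons(items, target):
    """Return the number of element comparisons linear search performs."""
    comparisons = 0
    for item in items:
        comparisons += 1
        if item == target:
            break
    return comparisons

data = list(range(100))                          # n = 100
best = linear_search_comparisons(data, 0)        # target first: 1 comparison
worst = linear_search_comparisons(data, -1)      # target absent: n comparisons
# Average over every possible successful search: about n/2 comparisons.
average = sum(linear_search_comparisons(data, t) for t in data) / len(data)
print(best, worst, average)  # 1 100 50.5
```

Here the "average" assumes each target position is equally likely, which is exactly the distributional assumption the bullets above warn about.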

Rate of Growth

Hint

Low-order terms in a function are relatively insignificant for large n.

For example:

If f(n) = n² + 100n + 1000, we say that f(n) and n² have the same rate of growth.
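This can be checked numerically. The sketch below is our own, using the hypothetical function f(n) = n² + 100n + 1000: as n grows, the ratio f(n) / n² approaches 1, so the low-order terms stop mattering.

```python
# Our own illustration: low-order terms vanish relative to n^2 for large n.
def f(n):
    return n**2 + 100 * n + 1000

for n in (10, 100, 1000, 100000):
    print(n, f(n) / n**2)
# The ratio falls from 21.0 at n = 10 toward 1.0 for large n.
```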

Common Rates of Growth:

Function    Growth Rate
c           Constant
log n       Logarithmic
(log n)²    Log-squared
n           Linear
n log n     unnamed
n²          Quadratic
n³          Cubic
2ⁿ          Exponential
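Evaluating each growth rate at a single input size makes the ordering in the table tangible. The sketch below is our own illustration at n = 64, chosen so that every row is strictly larger than the one before it.

```python
import math

# Our own illustration: each common growth rate evaluated at n = 64,
# showing constant < log < log-squared < linear < n log n < n^2 < n^3 < 2^n.
n = 64
values = [
    ("constant",    1),
    ("logarithmic", math.log2(n)),       # 6.0
    ("log-squared", math.log2(n) ** 2),  # 36.0
    ("linear",      n),                  # 64
    ("n log n",     n * math.log2(n)),   # 384.0
    ("quadratic",   n ** 2),             # 4096
    ("cubic",       n ** 3),             # 262144
    ("exponential", 2 ** n),             # 18446744073709551616
]
for name, v in values:
    print(f"{name:12} {v}")
```

Note how the exponential entry dwarfs everything else even at this modest n; that gap only widens as n grows.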

Asymptotically Positive Functions

An asymptotically positive function is one that is positive for all sufficiently large n (that is, f(n) > 0 for every n ≥ some n₀).

Asymptotically Negative Functions

An asymptotically negative function is one that is negative for all sufficiently large n (that is, f(n) < 0 for every n ≥ some n₀).
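Both definitions can be illustrated with toy functions of our own choosing: f(n) = n − 10 is negative for small n but positive for every n ≥ 11, so it is asymptotically positive; g(n) = 10 − n is asymptotically negative by the same reasoning.

```python
# Our own toy functions illustrating the two definitions.
def f(n):
    return n - 10      # asymptotically positive: f(n) > 0 for all n >= 11

def g(n):
    return 10 - n      # asymptotically negative: g(n) < 0 for all n >= 11

print(f(5), f(11))                               # -5 1
print(all(f(n) > 0 for n in range(11, 1000)))    # True
print(all(g(n) < 0 for n in range(11, 1000)))    # True
```

The early sign changes do not matter; only the behavior past some threshold n₀ does.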