Motivation:
In the analysis of algorithms, it is common to estimate the running time in the asymptotic sense, that is, to estimate the running time for arbitrarily large inputs.
O-notation.
For a given function g(n), we denote by O(g(n)) the following set of functions.
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }.
If f(n) ∈ O(g(n)), we write f(n) = O(g(n)), and we call g(n) an asymptotic upper bound for f(n).
f(n) = O(g(n)) means that, for large values of n, the function g(n) is an upper bound on f(n), to within a constant factor. In other words, f(n) grows at most as fast as g(n).
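The definition can be sanity-checked numerically. This sketch (not a proof) picks an illustrative function f(n) = 3n^2 + 5 and candidate witnesses c = 4, n0 = 3 — all our own choices — and tests the inequality over a range of n:

```python
# Numeric sanity check (not a proof) that f(n) = 3n^2 + 5 is O(n^2).
# The witnesses c = 4 and n0 = 3 are illustrative choices.

def f(n):
    return 3 * n**2 + 5

def g(n):
    return n**2

c, n0 = 4, 3
# Definition of O: 0 <= f(n) <= c*g(n) must hold for all n >= n0.
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
```

A check like this can only falsify a claimed bound over the tested range; the inequality 3n^2 + 5 ≤ 4n^2 for n ≥ 3 still needs the one-line algebraic argument 5 ≤ n^2.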
Ω-notation.
For a given function g(n), we denote by Ω(g(n)) the following set of functions.
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }.
If f(n) ∈ Ω(g(n)), we write f(n) = Ω(g(n)), and we call g(n) an asymptotic lower bound for f(n).
f(n) = Ω(g(n)) means that, for large values of n, the function g(n) is a lower bound on f(n), to within a constant factor. That is, f(n) grows at least as fast as g(n).
Θ-notation.
For a given function g(n), we denote by Θ(g(n)) the following set of functions.
Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }.
If f(n) ∈ Θ(g(n)), we write f(n) = Θ(g(n)), and we call g(n) an asymptotically tight bound for f(n).
f(n) = Θ(g(n)) means that, for large values of n, the function f(n) is equal to g(n) to within a constant factor. That is, f(n) and g(n) have the same rate of growth.
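A Θ-bound needs both constants at once. This sketch (not a proof) tests the two-sided inequality for the illustrative function f(n) = 3n^2 + 5 with witnesses c1 = 3, c2 = 4, n0 = 3 — again our own choices:

```python
# Numeric sanity check (not a proof) that f(n) = 3n^2 + 5 is Θ(n^2).
# The witnesses c1 = 3, c2 = 4, n0 = 3 are illustrative choices.

def f(n):
    return 3 * n**2 + 5

def g(n):
    return n**2

c1, c2, n0 = 3, 4, 3
# Definition of Θ: 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
```

The lower bound c1 = 3 works for every n (3n^2 ≤ 3n^2 + 5), while the upper bound c2 = 4 needs n ≥ 3, which is why a single n0 covering both sides is taken as 3.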
Why do we need n0?
The purpose of the n ≥ n0 condition is to avoid inconvenient behavior for small values of n.
One example is when f(n) is negative for small n.
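A small sketch of this situation, using the illustrative function f(n) = n^2 - 10n (our own choice): it is negative for 0 < n < 10, so the condition 0 ≤ f(n) only starts to hold once n0 = 10 cuts off the small inputs.

```python
# Why n0 matters: f(n) = n^2 - 10n is in O(n^2), but it is negative
# for 0 < n < 10, so the inequality 0 <= f(n) fails for small n.

def f(n):
    return n**2 - 10 * n

assert f(5) < 0  # the bound 0 <= f(n) fails without an n0 cutoff

c, n0 = 1, 10
# With n0 = 10, the O-definition's inequality holds from n0 onward.
assert all(0 <= f(n) <= c * n**2 for n in range(n0, 10_000))
```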
Asymptotic notation in equations
We defined the equal sign of f(n) = O(g(n)) to mean f(n) ∈ O(g(n)).
Note that here “=” is not symmetric: f(n) = O(g(n)) does not imply O(g(n)) = f(n).
- When asymptotic notation appears only on the right-hand side of an equation (or inequality), it stands for some anonymous function that we do not want to specify.
Example: 2n^3 + 3n^2 + 5 = 2n^3 + Θ(n^2).
Interpretation: There is some function f(n) in Θ(n^2),
namely f(n) = 3n^2 + 5, such that 2n^3 + 3n^2 + 5 = 2n^3 + f(n).
- When asymptotic notation appears (also) on the left of an equation, we mean: No matter how the anonymous functions are chosen on the left, there is a way to choose the anonymous functions on the right to make the statement valid.
Example: 2n^3 + Θ(n^2) = Θ(n^3).
Interpretation: For any choice of f(n) ∈ Θ(n^2),
there is a function g(n) ∈ Θ(n^3) such that 2n^3 + f(n) = g(n).
- When asymptotic notation appears only on the left, the formula is often invalid.
Example: A statement like O(g(n)) = f(n) is false,
because O(g(n)) contains more than one function, and they cannot all be equal to f(n).
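The second bullet's example can also be sanity-checked numerically. This sketch (not a proof) instantiates the anonymous Θ(n^2) function as f(n) = 3n^2 + 5 and tests that 2n^3 + f(n) is squeezed between c1·n^3 and c2·n^3; the witnesses c1 = 2, c2 = 3, n0 = 4 are our own choices:

```python
# Numeric illustration (not a proof) of 2n^3 + Θ(n^2) = Θ(n^3).
# Instantiate the anonymous function on the left as f(n) = 3n^2 + 5;
# then total(n) = 2n^3 + f(n) should be Θ(n^3).

def total(n):
    return 2 * n**3 + 3 * n**2 + 5

c1, c2, n0 = 2, 3, 4
# c1 = 2 works for every n; c2 = 3 needs 3n^2 + 5 <= n^3, i.e. n >= 4.
assert all(c1 * n**3 <= total(n) <= c2 * n**3 for n in range(n0, 5_000))
```

Note that n0 = 4 is genuinely needed here: at n = 3 we have total(3) = 86 > 3 · 27 = 81, so the upper bound fails.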
o-notation.
o-notation denotes an asymptotic upper bound that is not asymptotically tight. For a given function g(n), we denote by o(g(n)) the following set of functions.
o(g(n)) = { f(n) : for any positive constant c, there exists a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }.
f(n) = o(g(n)) means that f(n) grows strictly slower than g(n).
ω-notation.
ω-notation denotes an asymptotic lower bound that is not asymptotically tight. For a given function g(n), we denote by ω(g(n)) the following set of functions.
ω(g(n)) = { f(n) : for any positive constant c, there exists a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 }.
Equivalently, f(n) = ω(g(n)) if and only if lim_{n→∞} f(n)/g(n) = ∞.
f(n) = ω(g(n)) means that f(n) grows strictly faster than g(n).
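One informal way to see "strictly slower" and "strictly faster" is through the ratio f(n)/g(n). This sketch uses the illustrative pair n and n^2 (our own choice): n = o(n^2), so n/n^2 shrinks toward 0, while n^2 = ω(n), so n^2/n grows without bound.

```python
# Ratio view of o and ω (illustrative, not a proof):
#   n   = o(n^2)  ->  n / n^2 tends to 0 as n grows,
#   n^2 = ω(n)    ->  n^2 / n grows without bound.

sample = [10, 100, 1_000, 10_000]
ratios_o = [n / n**2 for n in sample]   # 1/n: shrinks toward 0
ratios_w = [n**2 / n for n in sample]   # n:   grows without bound

assert ratios_o == sorted(ratios_o, reverse=True)  # strictly decreasing
assert ratios_w == sorted(ratios_w)                # strictly increasing
assert ratios_o[-1] < 0.001
assert ratios_w[-1] > 1_000
```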