Basics of Time Complexity – Coding N Concepts
In this tutorial, we'll learn how to calculate the time complexity of a function, with examples.
Time Complexity
Time complexity is generally represented using big-oh notation O.
If the time complexity of a function is O(n), it means the function takes on the order of n units of time to execute.
These are the common types of time complexity you will come across:
Type | Name |
---|---|
O(1) | Constant |
O(log₂n) | Logarithmic |
O(√n) | Root |
O(n) | Linear |
O(n²) | Quadratic |
O(n³) | Cubic |
O(2ⁿ) | Exponential |
O(nⁿ) | Exponential |
Time complexities in increasing order of growth:
1 < log₂n < √n < n < nlog₂n < n² < n³ ... < 2ⁿ < 3ⁿ ... < nⁿ
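You can see these growth rates for yourself by counting loop iterations. Here is a quick Python sketch (helper names are mine, chosen to mirror the loop shapes in the examples below):

```python
def linear_steps(n):
    # mirrors for(int i=0; i<n; i++) — one unit of work per iteration
    steps, i = 0, 0
    while i < n:
        steps += 1
        i += 1
    return steps

def log_steps(n):
    # mirrors for(int i=1; i<n; i=i*2) — i doubles, so ~log2(n) iterations
    steps, i = 0, 1
    while i < n:
        steps += 1
        i *= 2
    return steps

def quadratic_steps(n):
    # two nested linear loops — n*n iterations
    steps = 0
    for _ in range(n):
        for _ in range(n):
            steps += 1
    return steps

n = 256
print(log_steps(n), linear_steps(n), quadratic_steps(n))  # 8 256 65536
```

Even at n = 256 the gap between logarithmic, linear, and quadratic is dramatic; it only widens as n grows.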
Time Complexity Calculation
We are going to understand time complexity with lots of examples:
for loop
Let's look at the time complexity of the for loop with several examples, which are easy to calculate:
Example 1
for(int i=0; i<n; i++){}
The loop runs `n` times,
hence Time Complexity = O(n)
Example 2
for(int i=0; i<n; i=i+2){}
The loop runs `n/2` times, which is still linear,
hence Time Complexity = O(n)
Example 3
The loop runs `n` times,
hence Time Complexity = O(n)
Example 4
for(int i=1; i<n; i++){}
for(int j=1; j<n; j++){}
The two separate loops run `n+n = 2n` times, which is still linear,
hence Time Complexity = O(n)
Example 5
for(int i=0; i<n; i++){
for(int j=0; j<n; j++){}
}
The nested loops run `n×n = n²` times, which is non-linear,
hence Time Complexity = O(n²)
Example 6
for(int i=1; i<n; i=i*2){}
Calculation:
----------------------------
values of `i` in `for` loop
----------------------------
1
2
2²
2³
.
.
2ᵏ
----------------------------
The loop will terminate when:
i ≥ n
2ᵏ ≥ n
k ≥ log₂(n)
so log₂(n) is the total unit of time taken by the loop,
hence Time Complexity = O(log₂(n))
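A quick way to convince yourself: simulate the loop and compare the iteration count against ⌈log₂(n)⌉. A Python sketch (function name is mine):

```python
import math

def doubling_loop_steps(n):
    # mirrors for(int i=1; i<n; i=i*2){}
    steps, i = 0, 1
    while i < n:
        steps += 1
        i *= 2
    return steps

# the iteration count matches ceil(log2(n)) exactly
for n in (2, 100, 1024, 10**6):
    assert doubling_loop_steps(n) == math.ceil(math.log2(n))
```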
Example 7
for(int i=1; i<n; i=i*3){}
Calculation:
Similar to the last example,
value of `i` in the `for` loop is i = 3ᵏ
The loop will terminate when:
i ≥ n
3ᵏ ≥ n
k ≥ log₃(n)
so log₃(n) is the total unit of time taken by the loop,
hence Time Complexity = O(log₃(n))
Example 8
for(int i=n; i>1; i=i/2){}
Calculation:
----------------------------
values of `i` in `for` loop
----------------------------
n
n/2
n/2²
n/2³
.
.
n/2ᵏ
----------------------------
The loop will terminate when:
i ≤ 1
n/2ᵏ ≤ 1
2ᵏ ≥ n
k ≥ log₂(n)
so log₂(n) is the total unit of time taken by the loop,
hence Time Complexity = O(log₂(n))
Example 9
for(int i=0; i<n; i++){
for(int j=1; j<n; j=j*2){}
}
Calculation:
outer loop complexity = O(n)
inner loop complexity = O(log₂(n))
hence Time Complexity = O(nlog₂(n))
Example 10
int p = 0;
for(int i=1; p<=n; i++){
p = p + i;
}
Calculation:
value of `i` and `p` in the `for` loop:
-------------------
i     p
-------------------
1     0+1=1
2     1+2=3
3     1+2+3=6
4     1+2+3+4=10
.     .
.     .
k     1+2+3+...+k = k(k+1)/2
-------------------
The loop will terminate when:
p > n
k(k+1)/2 > n
k² ≈ n    (ignoring constant factors)
k ≈ √n
so √n is the total unit of time taken by the loop,
hence Time Complexity = O(√n)
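We can sanity-check the √n result by running the loop and comparing the step count against √(2n) (the constant factor gets dropped in big-O). A Python sketch:

```python
import math

def triangular_loop_steps(n):
    # mirrors: int p=0; for(int i=1; p<=n; i++){ p = p + i; }
    steps, p, i = 0, 0, 1
    while p <= n:
        p += i
        i += 1
        steps += 1
    return steps

# the step count grows like sqrt(2n), i.e. O(sqrt(n))
assert triangular_loop_steps(5000) == 100               # sqrt(2*5000) = 100
assert abs(triangular_loop_steps(10**6) - math.sqrt(2 * 10**6)) < 2
```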
Example 11
int p = 0;
for(int i=1; i<n; i=i*2){
p++;
}
for(int j=1; j<p; j=j*2){
// statement
}
Calculation:
1. The first `for` loop takes log₂(n) units of time to execute.
At the end of the first loop, p = log₂(n)
2. The second `for` loop takes log₂(p) units of time to execute.
hence Time Complexity = O(log₂(p)) = O(log₂log₂(n))
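Simulating both loops shows the log-log behaviour directly (sketch, with my own variable names):

```python
def log_log_steps(n):
    # first loop: p counts the doublings from 1 up to n
    p, i = 0, 1
    while i < n:
        p += 1
        i *= 2
    # second loop: count the doublings from 1 up to p
    steps, j = 0, 1
    while j < p:
        steps += 1
        j *= 2
    return p, steps

p, steps = log_log_steps(2**32)
assert p == 32       # log2(2^32)
assert steps == 5    # log2(32) = log2(log2(2^32))
```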
while loop
If you understand how to calculate the time complexity of a for loop, then the while loop is a piece of cake.
Example 1
int i=1;
while(i<n){
//statement
i=i*2;
}
Time Complexity = O(log₂(n))
Example 2
int i=n;
while(i>1){
//statement
i=i/2;
}
Time Complexity = O(log₂(n))
Example 3
int i=1;
int k=1;
while(k<n){
//statement
k=k+i;
i++;
}
Time Complexity = O(√n)
Variable Time Complexity
A function does not always take a fixed number of units of time to execute; sometimes it depends on the input values.
Here are some examples where the time complexity is not fixed:
Example 1
method(n, m){
while(n!=m){
if(n>m){
n = n-m;
}else{
m = m-n;
}
}
}
best case time = O(1)
(when n = m)
worst case time = O(n)
(when n is much larger than m, e.g. n=16, m=1)
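This repeated-subtraction pattern is, incidentally, the subtraction-based GCD algorithm. Counting its loop iterations confirms both cases (Python sketch, names are mine):

```python
def subtraction_steps(n, m):
    # mirrors method(n, m) above, counting loop iterations
    steps = 0
    while n != m:
        if n > m:
            n -= m
        else:
            m -= n
        steps += 1
    return steps

assert subtraction_steps(7, 7) == 0     # best case: n = m, O(1)
assert subtraction_steps(16, 1) == 15   # worst case: ~n iterations, O(n)
```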
Example 2
method(n){
if(n<5){
//statement
}else{
for(i=0;i<n;i++){
//statement
}
}
}
best case time = O(1)
(when n < 5)
worst case time = O(n)
(when n ≥ 5)
Recursive Functions
Let's see the time complexity of recursive functions:
Example 1
test(int n){
if(n>0){
//statement
test(n-1);
}
}
// T(n) = T(n-1) + 1 = O(n)
Calculation:
Base case
T(0) = 1
Time taken by nth task is time taken by (n-1)th task plus 1
T(n) = T(n-1) + 1 --(1)
Similarly time taken by (n-1)th task is (n-2)th task plus 1
T(n-1) = T(n-2) + 1 --(2)
T(n-2) = T(n-3) + 1 --(3)
T(n) = T(n-3) + 3 --after substituting (2),(3) in (1)
...
T(n) = T(n-k) + k
Assume the (n-k)th task is the 0th task, i.e. n = k
T(n) = T(0) + n
T(n) = 1 + n ≈ n
hence Time Complexity = O(n)
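Counting the recursive calls confirms the result (sketch):

```python
def calls_linear(n):
    # mirrors test(n): one call at each level from n down to 0
    if n > 0:
        return 1 + calls_linear(n - 1)
    return 1

# n+1 calls in total, i.e. O(n)
assert calls_linear(10) == 11
assert calls_linear(100) == 101
```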
Example 2
test(int n){
if(n>0){
for(int i=0; i<n; i++){
//statement
}
test(n-1);
}
}
// T(n) = T(n-1) + n = O(n²)
Calculation:
Base case
T(0) = 1
Time taken by nth task:-
T(n) = T(n-1) + n
T(n-1) = T(n-2) + n-1
T(n-2) = T(n-3) + n-2
..
..
T(n) = T(n-3) + (n-2) + (n-1) + n
T(n) = T(n-k) + (n-(k-1)) + (n-(k-2)) + ... + (n-1) + n
Assume (n-k)th is 0th task means n=k
T(n) = T(n-n) + (n-n+1) + (n-n+2) + ... + (n-1) + n
T(n) = T(0) + 1 + 2 + 3 + ... + n
T(n) = 1 + n(n+1)/2 ≈ n²
hence Time Complexity = O(n²)
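Counting the inner-loop statements executed across all recursion levels reproduces n(n+1)/2 (sketch):

```python
def statement_count(n):
    # mirrors test(n): the loop does n units of work, then recurses on n-1
    if n > 0:
        return n + statement_count(n - 1)
    return 0

assert statement_count(100) == 100 * 101 // 2   # 5050, i.e. O(n^2)
```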
Example 3
test(int n){
if(n>0){
for(int i=0; i<n; i=i*2){
//statement
}
test(n-1);
}
}
// T(n) = T(n-1) + log₂(n) = O(nlog₂(n))
Calculation:
Base case
T(0) = 1
Time taken by nth task:-
T(n) = T(n-1) + log₂(n)
T(n-1) = T(n-2) + log₂(n-1)
T(n-2) = T(n-3) + log₂(n-2)
..
..
T(n) = T(n-3) + log₂(n-2) + log₂(n-1) + log₂(n)
T(n) = T(n-k) + log₂1 + log₂2 + ... + log₂(n-1) + log₂(n)
Assume the (n-k)th task is the 0th task, i.e. n = k
T(n) = T(0) + log₂(n!)
T(n) = 1 + nlog₂(n) ≈ nlog₂(n)    (since log₂(n!) ≈ nlog₂(n))
hence Time Complexity = O(nlog₂(n))
Example 4
test(int n){
if(n>0){
//statement
test(n-1);
test(n-1);
}
}
// T(n) = 2T(n-1) + 1 = O(2ⁿ)
Calculation:
Base case
T(0) = 1
Time taken by nth task:-
T(n) = 2T(n-1) + 1
T(n-1) = 2T(n-2) + 1
T(n-2) = 2T(n-3) + 1
..
..
T(n) = 2[2[2T(n-3) + 1] + 1] + 1
T(n) = 2³T(n-3) + 2² + 2 + 1
T(n) = 2ᵏT(n-k) + 2ᵏ⁻¹ + 2ᵏ⁻² + .. + 2² + 2 + 1
Assume the (n-k)th task is the 0th task, i.e. n = k
T(n) = 2ⁿT(0) + (2ⁿ-1)
T(n) = 2ⁿ + (2ⁿ-1) ≈ 2ⁿ
hence Time Complexity = O(2ⁿ)
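Counting the calls made by the doubling recursion gives 2ⁿ⁺¹ - 1, i.e. O(2ⁿ) (sketch):

```python
def calls_double(n):
    # mirrors test(n): each level makes two recursive calls
    if n > 0:
        return 1 + calls_double(n - 1) + calls_double(n - 1)
    return 1

assert calls_double(0) == 1
assert calls_double(10) == 2**11 - 1   # 2047 calls for n = 10
```

Adding one more level to n doubles the total work, which is exactly what exponential growth means.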
Example 5
test(int n){
if(n>0){
//statement
test(n-1);
test(n-1);
test(n-1);
}
}
T(n) = 3T(n-1) + 1
Time Complexity = O(3ⁿ)
Example 6
test(int n){
if(n>0){
for(int i=0; i<n; i++){
//statement
}
test(n-1);
test(n-1);
}
}
T(n) = 2T(n-1) + n
Time Complexity = O(n·2ⁿ)
Example 7
test(int n){
if(n>0){
//statement
test(n/2);
}
}
Base case
T(1) = 1
T(n) = T(n/2) + 1
T(n/2) = T(n/2²) + 1
T(n) = [T(n/2²) + 1] + 1
T(n) = T(n/2ᵏ) + k
Assume T(n/2ᵏ) is the last task, i.e.
n/2ᵏ = 1
2ᵏ = n
k = log₂(n)
T(n) = T(1) + log₂(n) = 1 + log₂(n) ≈ log₂(n)
hence Time Complexity = O(log₂(n))
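Counting the calls in the halving recursion shows the logarithmic depth (sketch, using integer division as C's `n/2` would):

```python
def call_depth(n):
    # mirrors test(n): recurse on n/2 until n reaches 0
    if n > 0:
        return 1 + call_depth(n // 2)
    return 0

assert call_depth(1) == 1
assert call_depth(1024) == 11   # log2(1024) + 1, i.e. O(log2(n))
```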
Example 8
test(int n){
if(n>0){
for(int i=0; i<n; i++){
//statement
}
test(n/2);
}
}
Base case
T(1) = 1
T(n) = T(n/2) + n
T(n/2) = T(n/2²) + n/2
T(n) = [T(n/2²) + n/2] + n
T(n) = T(n/2ᵏ) + n/2ᵏ⁻¹ + n/2ᵏ⁻² + .. + n/2² + n/2 + n
Assume T(n/2ᵏ) is the last task, i.e.
n/2ᵏ = 1
2ᵏ = n
k = log₂(n)
T(n) = T(1) + n[1/2ᵏ⁻¹ + 1/2ᵏ⁻² + .. + 1/2 + 1]
T(n) ≤ 1 + 2n ≈ n    (the geometric series in brackets sums to less than 2)
hence Time Complexity = O(n)
Example 9
test(int n){
if(n>0){
//statement
test(n/2);
test(n/2);
}
}
T(n) = 2T(n/2) + 1
T(n) = 2ᵏT(n/2ᵏ) + 2ᵏ⁻¹ + 2ᵏ⁻² + ... + 2 + 1 = 2ᵏT(n/2ᵏ) + (2ᵏ - 1)
Assume n/2ᵏ = 1, i.e. 2ᵏ = n and k = log₂(n)
T(n) = nT(1) + (n - 1) = 2n - 1 ≈ n
hence Time Complexity = O(n)
Example 10
Quick Sort when pivot is middle element:-
quickSort(int[] arr, int low, int high) {
if (low < high){
int pi = partition(arr, low, high); // n
quickSort(arr, low, pi - 1); // T(n/2)
quickSort(arr, pi + 1, high); // T(n/2)
}
}
Base case
T(1) = 1
T(n) = 2T(n/2) + n
T(n/2) = 2T(n/2²) + n/2
T(n) = 2[2T(n/2²) + n/2] + n
T(n) = 2ᵏT(n/2ᵏ) + n + n + ... + n
T(n) = 2ᵏT(n/2ᵏ) + nk
Assume T(n/2ᵏ) is the last task, i.e.
n/2ᵏ = 1
2ᵏ = n
k = log₂(n)
T(n) = nT(1) + nlog₂(n)
T(n) = n + nlog₂(n) ≈ nlog₂(n)
Time Complexity = O(nlog₂(n))
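The recurrence T(n) = 2T(n/2) + n can be evaluated directly; for n a power of two the total is exactly n·(log₂n + 1), i.e. O(nlog₂(n)). A small sketch (assuming n is a power of two so the halves stay exact):

```python
def work(n):
    # total work for T(n) = 2T(n/2) + n with T(1) = 1
    if n <= 1:
        return 1
    return n + 2 * work(n // 2)

# for n = 2^k: work(n) = n * (log2(n) + 1)
assert work(8) == 8 * 4          # 32
assert work(1024) == 1024 * 11   # 11264
```

Note this is quick sort's best case; if the pivot always lands at an end, the recurrence becomes T(n) = T(n-1) + n, which is O(n²) as in Example 2 above.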
Asymptotic Notations
We can represent the function complexity in following ways:-
Symbol | Name | Bound |
---|---|---|
O | big-oh | upper bound |
Ω | big-omega | lower bound |
Θ | big-theta | tight bound |
Example 1
For example, f(n) = 2n + 3
can be represented as
O(n), or any notation of higher order such as O(nlog₂n), O(n²), O(n³) ...
Ω(n), or any notation of lower order such as Ω(√n), Ω(log₂n), Ω(1) ...
Θ(n), and only Θ(n), since this is the tight bound
Ideally you represent the function complexity with the nearest type, so in the above case O(n), Ω(n), and Θ(n) are the best representations.
Example 2
f(n) = 2n² + 3n + 4
1·n² ≤ 2n² + 3n + 4 ≤ 9n²    (for n ≥ 1)
can be represented as O(n²), Ω(n²), or Θ(n²)
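Constants like these can be checked numerically; a one-liner sketch verifying that the bounds 1·n² and 9n² really do sandwich f(n) = 2n² + 3n + 4:

```python
def f(n):
    # f(n) = 2n^2 + 3n + 4 from the example above
    return 2 * n * n + 3 * n + 4

# the bounds n^2 <= f(n) <= 9n^2 hold for every n >= 1
assert all(n * n <= f(n) <= 9 * n * n for n in range(1, 10_000))
```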
Example 3
f(n) = 2n²log₂n + n
n²log₂n ≤ 2n²log₂n + n ≤ 3n²log₂n
can be represented as O(n²log₂n), Ω(n²log₂n), or Θ(n²log₂n)
Example 4
f(n) = n! = n×(n-1)×(n-2)× ... ×2×1
n ≤ n! ≤ nⁿ
can be represented as O(nⁿ) upper bound, Ω(n) lower bound;
it cannot be represented as Θ since there is no common tight bound.
Example 5
f(n) = log(n!) = log(n×(n-1)×(n-2)× ... ×2×1) ≤ log(nⁿ) = nlog(n)
1 ≤ log(n!) ≤ nlog(n)
can be represented as O(nlog(n)) upper bound, Ω(1) lower bound;
it cannot be represented as Θ since there is no common tight bound.
- It is always preferable to represent complexity with big-theta Θ when possible, since it is the most accurate and tight bound.
- Big-oh O is the most popular notation for representing function complexity, and the one you will come across most often.
Note: Do not confuse these notations with best-case, worst-case, or average-case time complexity. Each of those cases can be represented with O, Ω, and Θ notations.
Source: Internet