When dealing with algorithms, we often want to know which algorithm takes more time and which takes less. To do that, we first need a way to measure the time an algorithm takes. The problem is that we cannot measure an algorithm's time in seconds, because algorithms can be implemented in different languages.
If you code a faster algorithm in Python and a slower one in C++, both might end up taking the same amount of time. Even comparing them by coding both in a single language is a bad idea, because an algorithm running on a faster computer will take fewer nanoseconds than the same algorithm on a slower computer. Hence "seconds" are not a useful unit for measuring the time taken by an algorithm.
A Better Way (Time Complexity)
So, what could be a better way? One way to solve this problem is, instead of asking "how many seconds will this algorithm take?", to ask a different question: "how many operations does this algorithm take?"
Example:
Let's consider two simple algorithms:
- An algorithm that pops the last element of an array/list.
- An algorithm that pops the element at index i in the list.
The first algorithm is simple: you just take out the last element of the array, and nothing else needs to be done.
This takes just a single operation, so its time complexity in Big O notation is O(1).
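In Python, popping the last element of a list looks like this; it touches only one slot, regardless of how long the list is:

```python
nums = [10, 20, 30, 40]

# pop() with no argument removes and returns the last element.
# No other element moves, so this is a single O(1) operation.
last = nums.pop()

print(last)   # 40
print(nums)   # [10, 20, 30]
```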
The second algorithm is different. To remove an element at a specific index, you have to shift every element after that index one position to the left.
In the worst case, when the index i = 0, almost the whole array has to be shifted: if there are n elements, the remaining n - 1 of them all move left.
So in the worst case we perform on the order of n shifts, i.e. n operations. This makes the time complexity of this algorithm O(n), where n is the number of elements in the array.
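To make the shifting visible, here is a minimal sketch of how a pop-at-index operation could be implemented by hand. The helper `pop_at` is hypothetical (Python's built-in `list.pop(i)` does this internally); it exists only to show where the n shifts come from:

```python
def pop_at(arr, i):
    """Remove and return arr[i], shifting every later element one
    slot left - O(n) shifts in the worst case (i = 0)."""
    value = arr[i]
    # Each iteration is one shift; for i = 0 this loop runs n - 1 times.
    for j in range(i, len(arr) - 1):
        arr[j] = arr[j + 1]
    arr.pop()  # drop the now-duplicated last slot, O(1)
    return value

nums = [5, 6, 7, 8]
print(pop_at(nums, 0))   # 5  (worst case: every remaining element shifted)
print(nums)              # [6, 7, 8]
```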
Big O notation is the standard way to represent the time complexity of an algorithm: it tells you how the number of operations the algorithm performs grows with the input size n.
Important Time Complexities
Some important time complexities that you will see while solving DSA questions are:
O(1): takes a constant amount of time
O(log n): time increases logarithmically with the number of inputs
O(n): time increases linearly with the inputs
O(n^2): time increases quadratically with the inputs
O(2^n): time increases exponentially with the inputs
O(n!): the algorithm performs a factorial number of operations
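A quick way to get a feel for these growth rates is to plug in a single input size and compare the operation counts. This small sketch evaluates each class at n = 16 (the value 16 is an arbitrary choice for illustration):

```python
import math

n = 16

# Number of operations each complexity class predicts for n inputs.
growth = {
    "O(1)":     1,
    "O(log n)": int(math.log2(n)),
    "O(n)":     n,
    "O(n^2)":   n ** 2,
    "O(2^n)":   2 ** n,
    "O(n!)":    math.factorial(n),
}

for name, ops in growth.items():
    print(f"{name:>8}: {ops}")
```

Even at this modest n, the factorial class is already in the trillions of operations while O(1) is still a single operation, which is why the curves in the graph separate so dramatically.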
The rate at which the time increases for each of these classes of algorithms is shown visually in the graph below.
As you can see, the number of operations for an O(n!) algorithm increases rapidly as the input grows, while it stays constant for O(1). Hence O(n!) is the worst of these time complexities and O(1) is the best.
Note
When we talk about time complexity, we don't mean the exact number of operations an algorithm takes; we mean how the number of operations grows as the number of inputs grows.
For instance, suppose we pop two numbers at specific indices. You might think the time complexity would now be O(2n), but that's not the case: it is still O(n). We don't care whether the algorithm takes n, 2n, or 3n operations; the number of operations grows on the order of n, so the time complexity is O(n), i.e. linear.
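This can be checked numerically. The helper below is hypothetical, written just for this illustration: it counts the worst-case shifts for one pop versus two pops from the front of a list, and shows that the ratio stays near a constant 2 while both counts grow linearly with n:

```python
def shifts_for_pops(n, pops):
    """Approximate worst-case shift count for `pops` successive
    removals at index 0 from a list of n elements."""
    count = 0
    for p in range(pops):
        # After p removals, n - 1 - p elements remain to shift.
        count += n - 1 - p
    return count

for n in (10, 100, 1000):
    one = shifts_for_pops(n, 1)
    two = shifts_for_pops(n, 2)
    print(n, one, two, round(two / one, 3))
```

For every n, two pops cost roughly twice as much as one, but both grow at the same linear rate, which is exactly why the constant factor is dropped and both are written as O(n).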