1️⃣ Big-O Notation Basics
- O(1) - Constant Time → Execution time stays the same regardless of input size.
- O(log n) - Logarithmic Time → Execution time grows logarithmically with input size (e.g., Binary Search).
- O(n) - Linear Time → Execution time grows proportionally with input size.
- O(n log n) - Linearithmic Time → Common in efficient sorting algorithms like Merge Sort.
- O(n²) - Quadratic Time → Common in algorithms with nested loops over the same input.
- O(2ⁿ) - Exponential Time → Roughly doubles with each additional input element; common in naive recursive algorithms.
- O(n!) - Factorial Time → Grows the fastest of the classes listed here (e.g., brute-force permutation generation).
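As a quick illustration, here is a minimal Python sketch of what a few of these growth rates look like in code (function names are my own, purely illustrative):

```python
def get_first(items):          # O(1): one operation, regardless of input size
    return items[0]

def contains(items, target):   # O(n): may scan every element once
    for item in items:
        if item == target:
            return True
    return False

def has_duplicate(items):      # O(n²): nested loops over the same input
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def count_halvings(n):         # O(log n): the input is halved on every step
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps
```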
2️⃣ Common Data Structure Operations
| Operation       | Array | Linked List | Stack/Queue | Hash Table | BST (Balanced) |
|-----------------|-------|-------------|-------------|------------|----------------|
| Access          | O(1)  | O(n)        | O(n)        | N/A        | O(log n)       |
| Search          | O(n)  | O(n)        | O(n)        | O(1)       | O(log n)       |
| Insert (End)    | O(1)  | O(1)        | O(1)        | O(1)       | O(log n)       |
| Insert (Middle) | O(n)  | O(n)        | O(n)        | O(1)       | O(log n)       |
| Delete (End)    | O(1)  | O(1)        | O(1)        | O(1)       | O(log n)       |
| Delete (Middle) | O(n)  | O(n)        | O(n)        | O(1)       | O(log n)       |
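To make the table concrete, here is a rough sketch using Python's built-in structures (list ≈ dynamic array, dict ≈ hash table, collections.deque ≈ queue); the comments note the expected cost of each call:

```python
from collections import deque

arr = [10, 20, 30, 40]
arr[2]               # Access by index: O(1)
arr.append(50)       # Insert at end: amortized O(1)
arr.insert(2, 25)    # Insert in middle: O(n) (elements shift right)
arr.pop()            # Delete at end: O(1)
arr.pop(2)           # Delete in middle: O(n)

table = {"a": 1, "b": 2}
table["c"] = 3       # Insert: O(1) on average
"b" in table         # Search by key: O(1) on average
del table["a"]       # Delete: O(1) on average

queue = deque([1, 2, 3])
queue.append(4)      # Enqueue: O(1)
queue.popleft()      # Dequeue: O(1)
```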
3️⃣ Time Complexity of Sorting Algorithms
| Algorithm      | Best Case  | Average Case | Worst Case |
|----------------|------------|--------------|------------|
| Bubble Sort    | O(n)       | O(n²)        | O(n²)      |
| Insertion Sort | O(n)       | O(n²)        | O(n²)      |
| Selection Sort | O(n²)      | O(n²)        | O(n²)      |
| Merge Sort     | O(n log n) | O(n log n)   | O(n log n) |
| Quick Sort     | O(n log n) | O(n log n)   | O(n²)      |
| Heap Sort      | O(n log n) | O(n log n)   | O(n log n) |
| Counting Sort  | O(n + k)   | O(n + k)     | O(n + k)   |
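For reference, a minimal Merge Sort sketch: the array is halved recursively (about log n levels) and each level does O(n) merging work, which is where O(n log n) comes from.

```python
def merge_sort(items):
    """Sort a list in O(n log n) time using merge sort."""
    if len(items) <= 1:                      # base case: already sorted
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])           # sort the left half
    right = merge_sort(items[mid:])          # sort the right half

    merged = []                              # merge the two sorted halves: O(n)
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```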
4️⃣ Time Complexity of Searching Algorithms
| Algorithm      | Best Case | Average Case | Worst Case |
|----------------|-----------|--------------|------------|
| Linear Search  | O(1)      | O(n)         | O(n)       |
| Binary Search  | O(1)      | O(log n)     | O(log n)   |
| Ternary Search | O(1)      | O(log n)     | O(log n)   |
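As an example of the O(log n) rows, a simple iterative Binary Search (the input must already be sorted):

```python
def binary_search(sorted_items, target):
    """Return the index of target in a sorted list, or -1 if absent. O(log n)."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2              # halve the search range each step
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```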
5️⃣ Time Complexity of Common Recursive Algorithms
| Algorithm           | Time Complexity |
|---------------------|-----------------|
| Fibonacci (Naive)   | O(2ⁿ)           |
| Fibonacci (DP)      | O(n)            |
| Factorial Recursion | O(n)            |
| Tower of Hanoi      | O(2ⁿ)           |
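The Fibonacci rows are the classic illustration: the naive version recomputes the same subproblems exponentially many times, while a bottom-up DP version computes each value once. A small sketch:

```python
def fib_naive(n):
    """O(2ⁿ): each call branches into two more recursive calls."""
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

def fib_dp(n):
    """O(n): build up from the base cases, keeping only the last two values."""
    prev, curr = 0, 1
    for _ in range(n):
        prev, curr = curr, prev + curr
    return prev

print(fib_naive(10), fib_dp(10))  # 55 55
```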
6️⃣ Master Theorem for Recurrence Relations
For recurrence relations of the form T(n) = aT(n/b) + f(n), with a ≥ 1 and b > 1:
- If f(n) = O(n^c) where c < log_b(a) → T(n) = O(n^(log_b(a)))
- If f(n) = O(n^c) where c = log_b(a) → T(n) = O(n^c log n)
- If f(n) = O(n^c) where c > log_b(a) → T(n) = O(f(n))
Examples:
| Recurrence Relation | Time Complexity |
|------------------------|----------------|
| T(n) = 2T(n/2) + O(n) | O(n log n) |
| T(n) = 4T(n/2) + O(n²) | O(n²) |
| T(n) = T(n/2) + O(1) | O(log n) |
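Worked example for the first row (the standard Merge Sort recurrence): in T(n) = 2T(n/2) + O(n) we have a = 2, b = 2 and f(n) = O(n¹), so c = 1 and log_b(a) = log₂(2) = 1. Since c = log_b(a), the second case applies, giving T(n) = O(n¹ log n) = O(n log n).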
💡 Pro Tip: Use this cheat sheet to quickly estimate time complexities during coding interviews and optimizations!