If you're learning data structures and algorithms, you'll often hear terms like O(n) and O(log n). These describe how fast (or slow) an algorithm runs and how much memory it uses. Let's break it down simply so even beginners can understand.
Time complexity tells us how the execution time of an algorithm increases as the input size increases. It helps us compare the efficiency of different algorithms.
Space complexity is the amount of memory an algorithm uses while running. It includes both the input data and any extra memory used (like temporary variables, stacks, etc.).
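As a quick illustration (the function names here are made up for this example), compare a function that keeps only a running total with one that builds a whole new list:

```python
def total(arr):
    s = 0                    # one extra variable -> O(1) extra space
    for x in arr:
        s += x
    return s

def squares(arr):
    result = []              # grows with the input -> O(n) extra space
    for x in arr:
        result.append(x * x)
    return result
```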
O(n) means the algorithm's running time grows in direct proportion to the input size, n. For example, printing every element of an array:
```python
arr = [3, 1, 4, 1, 5]   # example input
n = len(arr)
for i in range(n):      # visits each element once -> O(n) time
    print(arr[i])
```
This is called **linear time**: more data means proportionally more time.
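Linear search (the example algorithm in the comparison table below) works the same way; here is a minimal sketch:

```python
def linear_search(arr, target):
    for i, value in enumerate(arr):   # checks elements one by one -> O(n) time
        if value == target:
            return i
    return -1                         # not found
```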
If the algorithm stores a copy of the array or another structure that depends on input size, it may also use O(n) space.
```python
new_list = arr.copy()   # the copy grows with the input -> O(n) space
```
O(log n) means the algorithm cuts the remaining work in half at every step, which is very efficient. The classic example is binary search on a sorted array:
```python
def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2       # middle of the current range
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1              # discard the left half
        else:
            right = mid - 1             # discard the right half
    return -1                           # target not found
```
The space complexity is usually constant, O(1). If binary search is written recursively, the call stack can take O(log n) space.
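To illustrate that recursive case, here is a minimal sketch (one possible way to write it, not the only one): each call halves the range, so the recursion depth, and therefore the call stack, is about log2(n).

```python
def binary_search_recursive(arr, target, left=0, right=None):
    if right is None:
        right = len(arr) - 1
    if left > right:                  # empty range: target not present
        return -1
    mid = (left + right) // 2
    if arr[mid] == target:
        return mid
    elif arr[mid] < target:           # recurse into the right half
        return binary_search_recursive(arr, target, mid + 1, right)
    else:                             # recurse into the left half
        return binary_search_recursive(arr, target, left, mid - 1)
```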
| Aspect | O(n) | O(log n) |
|---|---|---|
| Type | Linear | Logarithmic |
| Speed | Slower for large inputs | Faster for large inputs |
| Example Algorithm | Linear Search | Binary Search |
| Use Case | Go through each element | Divide and conquer |
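To get a feel for the difference, you can estimate the worst-case number of comparisons for each approach on a million elements (a rough back-of-the-envelope calculation):

```python
import math

n = 1_000_000
print(n)                        # linear search: up to 1,000,000 comparisons
print(math.ceil(math.log2(n)))  # binary search: about 20 comparisons
```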
Start analyzing your code with time and space in mind: it will help you in coding interviews, internships, and real-world development!