
Commit 8f41f8b

Merge pull request #727 from MuraliDharan7/add-time-complexity
added Time Complexity in DSA beginner module
2 parents 44d1b87 + 85dce4b commit 8f41f8b

File tree

1 file changed: +277 −0 lines changed


dsa/beginner/001-Time-Complexity.md

Lines changed: 277 additions & 0 deletions
@@ -0,0 +1,277 @@
---
id: 001-time-complexity
title: Time Complexity
sidebar_label: Time Complexity
tags:
  - dsa
  - data-structures
  - algorithms
  - introduction
  - basics
  - beginner
  - programming
  - time-complexity
  - data structure
  - algorithm
sidebar_position: 2
---

# Introduction to Time Complexity

## What is Time Complexity?

Time complexity is a way to measure how the runtime of an algorithm changes as the size of the input changes. It helps us understand and compare the efficiency of different algorithms.

## Big O Notation

Big O notation is a mathematical notation used to describe the upper bound of an algorithm's runtime. It gives a worst-case estimate of how the running time grows with the input size.
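
Formally (a standard definition, included here only for reference), $f(n) = O(g(n))$ means that beyond some input size $n_0$, $f$ is bounded above by a constant multiple of $g$:

$$
f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 > 0 : \quad 0 \le f(n) \le c \cdot g(n) \ \text{ for all } n \ge n_0
$$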

## Example

Q. Imagine a classroom of 100 students in which you gave your pen to one person. You have to find that pen without knowing to whom you gave it.

Here are some ways to find the pen and what the corresponding $O$ order is.

1. $O(n^2)$: You ask the first person in the class whether they have the pen. You also ask this person about each of the other 99 people in the classroom (whether they have the pen), and so on. This is what we call $O(n^2)$.
2. $O(n)$: Going and asking each student individually is $O(n)$.
3. $O(\log n)$: Divide the class into two groups and ask: "Is the pen on the left side or the right side of the classroom?" Take that group, divide it into two, and ask again. Repeat the process until you are left with the one student who has your pen. This is what is meant by $O(\log n)$.

- The $O(n^2)$ search applies if only one student knows which student the pen is hidden with.
- The $O(n)$ search applies if one student has the pen and only they know it.
- The $O(\log n)$ search applies if all the students know, but will only tell you whether you guessed the correct side.

## Use Cases

The table below is a rough guide (a common competitive-programming rule of thumb) to the worst time complexity that is usually acceptable for a given input size.

| Input Length | Worst Accepted Time Complexity | Usual Type of Solutions |
|--------------|--------------------------------|---------------------------------------------|
| 10-12 | O(N!) | Recursion and backtracking |
| 15-18 | O(2^N * N) | Recursion, backtracking, and bit manipulation |
| 18-22 | O(2^N * N) | Recursion, backtracking, and bit manipulation |
| 30-40 | O(2^(N/2) * N) | Meet in the middle, Divide and Conquer |
| 100 | O(N^4) | Dynamic programming, Constructive |
| 400 | O(N^3) | Dynamic programming, Constructive |
| 2K | O(N^2 * log N) | Dynamic programming, Binary Search, Sorting, Divide and Conquer |
| 10K | O(N^2) | Dynamic programming, Graph, Trees, Constructive |
| 1M | O(N * log N) | Sorting, Binary Search, Divide and Conquer |
| 100M | O(N), O(log N), O(1) | Constructive, Mathematical, Greedy Algorithms |

### Common Time Complexities

#### a) O(1) - Constant Time Complexity

The runtime does not change with the input size.

```python
def get_first_element(arr):
    return arr[0]
```

The time it takes to get the first element is constant, regardless of the size of the array.

#### b) O(log n) - Logarithmic Time Complexity

The runtime grows logarithmically with the input size.

```python
def binary_search(arr, target):
    # Assumes arr is sorted; each iteration halves the search range.
    low = 0
    high = len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1
```

The time it takes to find the target grows logarithmically with the size of the array.

#### c) O(n) - Linear Time Complexity

The runtime grows linearly with the input size.

```python
def print_elements(arr):
    for element in arr:
        print(element)
```

The time it takes to print all elements grows linearly with the size of the array.

#### d) O(n^2) - Quadratic Time Complexity

The runtime grows quadratically with the input size.

```python
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n-i-1):
            if arr[j] > arr[j+1]:
                arr[j], arr[j+1] = arr[j+1], arr[j]
```

The time it takes to sort the array grows quadratically with the size of the array.

#### e) O(2^n) - Exponential Time Complexity

The runtime doubles with each additional element in the input.

```python
def fibonacci(n):
    if n <= 1:
        return n
    else:
        return fibonacci(n-1) + fibonacci(n-2)
```

The time it takes to calculate the nth Fibonacci number roughly doubles each time $n$ increases by 1, because each call spawns two further recursive calls.

#### f) O(n!) - Factorial Time Complexity

The runtime grows factorially with the input size.

```python
def permutations(arr, l, r):
    # Prints every permutation of arr[l..r]; call as permutations(list("abc"), 0, 2).
    if l == r:
        print(''.join(arr))
    else:
        for i in range(l, r+1):
            arr[l], arr[i] = arr[i], arr[l]   # choose
            permutations(arr, l+1, r)
            arr[l], arr[i] = arr[i], arr[l]   # backtrack
```

The time it takes to generate all permutations of the array grows factorially with the size of the array.

## How to Calculate Time Complexity?

1. **Identify the basic operations** in your algorithm (e.g., comparisons, assignments).
2. **Count the number of times these operations are executed** in terms of the input size $N$.
3. **Express this count using Big O notation** to simplify and generalize the complexity (a worked example follows this list).
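
As a quick illustration of these three steps, here is a hypothetical snippet (not part of the original module) with nested loops, where the basic operation is the equality comparison:

```python
def count_equal_pairs(arr):
    # Basic operation: the comparison arr[i] == arr[j].
    n = len(arr)
    count = 0
    for i in range(n):               # outer loop runs N times
        for j in range(i + 1, n):    # inner loop runs N-1, N-2, ..., 1, 0 times
            if arr[i] == arr[j]:
                count += 1
    return count
```

The comparison runs $\frac{N(N-1)}{2}$ times in total; dropping constants and lower-order terms gives a time complexity of $O(N^2)$.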

## Time Complexity of Different Data Structures

Typical costs of the basic operations are listed below (the hash-table figures are average case, and the binary search tree figures assume the tree stays balanced); a short code sketch follows the list.

- **Array**:
  - Access: $O(1)$
  - Search: $O(N)$
  - Insertion: $O(N)$
  - Deletion: $O(N)$

- **Linked List**:
  - Access: $O(N)$
  - Search: $O(N)$
  - Insertion: $O(1)$ (given a reference to the node)
  - Deletion: $O(1)$ (given a reference to the node)

- **Hash Table**:
  - Access: $O(1)$
  - Search: $O(1)$
  - Insertion: $O(1)$
  - Deletion: $O(1)$

- **Binary Search Tree**:
  - Access: $O(\log N)$
  - Search: $O(\log N)$
  - Insertion: $O(\log N)$
  - Deletion: $O(\log N)$
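
As a rough illustration (using Python's built-in `list` and `dict`, which behave like an array and a hash table respectively), the operations above look like this in code:

```python
arr = [10, 20, 30, 40]     # array-like list
x = arr[2]                 # access by index: O(1)
found = 30 in arr          # search: O(N) linear scan
arr.insert(1, 15)          # insertion at an index: O(N), elements shift right
arr.remove(20)             # deletion by value: O(N)

table = {"a": 1, "b": 2}   # hash table (dict)
y = table["a"]             # access/search by key: O(1) on average
table["c"] = 3             # insertion: O(1) on average
del table["b"]             # deletion: O(1) on average
```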

## Analyzing Algorithms with Time Complexity

1. **Linear Search**:
```python
def linear_search(arr, target):
    for i in range(len(arr)):
        if arr[i] == target:
            return i
    return -1
```
Time Complexity: $O(N)$

2. **Binary Search**:
```python
def binary_search(arr, target):
    low = 0
    high = len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1
```
Time Complexity: $O(\log N)$

3. **Bubble Sort**:
```python
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n-i-1):
            if arr[j] > arr[j+1]:
                arr[j], arr[j+1] = arr[j+1], arr[j]
```
Time Complexity: $O(N^2)$

4. **Merge Sort**:
```python
def merge_sort(arr):
    if len(arr) > 1:
        mid = len(arr) // 2
        left_half = arr[:mid]
        right_half = arr[mid:]

        # Sort each half recursively.
        merge_sort(left_half)
        merge_sort(right_half)

        # Merge the two sorted halves back into arr.
        i = j = k = 0

        while i < len(left_half) and j < len(right_half):
            if left_half[i] < right_half[j]:
                arr[k] = left_half[i]
                i += 1
            else:
                arr[k] = right_half[j]
                j += 1
            k += 1

        while i < len(left_half):
            arr[k] = left_half[i]
            i += 1
            k += 1

        while j < len(right_half):
            arr[k] = right_half[j]
            j += 1
            k += 1
```
Time Complexity: $O(N \log N)$
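
One way to see where $O(N \log N)$ comes from: each call splits the array in half and then does a linear amount of merging work, which gives the recurrence

$$
T(N) = 2\,T\!\left(\frac{N}{2}\right) + O(N) \quad\Rightarrow\quad T(N) = O(N \log N),
$$

since the array can only be halved about $\log_2 N$ times and each level of recursion does $O(N)$ work in total.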

5. **Fibonacci (Recursive)**:
```python
def fibonacci(n):
    if n <= 1:
        return n
    else:
        return fibonacci(n-1) + fibonacci(n-2)
```
Time Complexity: $O(2^N)$

## Practical Tips

- **Optimize Your Code**: Aim to reduce the time complexity for better performance, especially for large inputs (see the sketch after this list).
- **Consider Both Time and Space Complexity**: While time complexity is important, space complexity (the amount of memory used) is also crucial.
- **Know Your Problem Domain**: Some problems inherently require higher time complexity. Understanding the problem domain helps in choosing the right algorithm.
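
As one concrete example of reducing time complexity, the recursive Fibonacci shown above runs in $O(2^N)$ time, but caching already-computed results (memoization) brings it down to $O(N)$. A minimal sketch using Python's standard `functools.lru_cache`:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fibonacci(n):
    # Each value of n is computed once and then served from the cache,
    # so the total work is O(N) instead of O(2^N).
    if n <= 1:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)
```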

## Resources to Learn More

1. **Books**:
   - *Introduction to Algorithms* by Cormen, Leiserson, Rivest, and Stein
   - *The Algorithm Design Manual* by Steven S. Skiena
2. **Online Courses**:
   - [Coursera Algorithms Specialization](https://www.coursera.org/specializations/algorithms)
   - [edX Data Structures and Algorithms](https://www.edx.org/course/data-structures-and-algorithms)
3. **Websites**:
   - [GeeksforGeeks Time Complexity](https://www.geeksforgeeks.org/analysis-of-algorithms-set-1-asymptotic-analysis/)
   - [Khan Academy Algorithms](https://www.khanacademy.org/computing/computer-science/algorithms)
4. **Interactive Tools**:
   - [VisuAlgo](https://visualgo.net/en)
   - [Big O Cheat Sheet](https://www.bigocheatsheet.com/)

Understanding time complexity is essential for writing efficient algorithms and is a fundamental skill in computer science and software development. By analyzing and optimizing time complexity, you can ensure your programs run efficiently, even with large inputs.
