The DAA course equips us with tools to design efficient algorithms, analyze their performance, and solve computational problems using different strategies. This comprehensive overview explores the fundamental paradigms and techniques that form the backbone of modern algorithm design.
We started with understanding what algorithms are and how to evaluate their efficiency using recurrence relations and Big-O notation.
Time Complexity: the elementary sorting algorithms run in O(n²), while Linear Search runs in O(n). Use case: Linear Search is still useful in small or unsorted datasets.
Insertion Sort trace on the array [5, 2, 4, 6, 1, 3]:
Pass 1: [2, 5, 4, 6, 1, 3]
Pass 2: [2, 4, 5, 6, 1, 3]
Pass 3: [2, 4, 5, 6, 1, 3]
Pass 4: [1, 2, 4, 5, 6, 3]
Pass 5: [1, 2, 3, 4, 5, 6]
This paradigm breaks a problem into smaller parts, solves them recursively, and merges results. The "divide and conquer" technique leads to efficient algorithms for a wide range of problems.
Binary Search: O(log n); Merge Sort: O(n log n); Quick Sort: average O(n log n), worst O(n²). Real-world: Quick Sort is widely used in standard libraries due to its average-case speed.
Merge Sort guarantees O(n log n) performance in all cases but requires extra memory.
Quick Sort is typically faster in practice due to better cache locality and lower overhead, but it can degrade to O(n²) in the worst case.
Partition example for Quick Sort with array [3, 8, 2, 5, 1, 4, 7, 6] and pivot=4:
After partition: [3, 2, 1, 4, 8, 5, 7, 6]
Make the best choice at each step hoping for an overall optimal solution. While not always guaranteed to find the global optimum, greedy algorithms are efficient for many problems.
Kruskal's MST: O(E log E); Prim's MST: O(E log V); Dijkstra's shortest path: O((V+E) log V). Use case: network design, resource allocation, and scheduling.
Items: [(value=60, weight=10), (value=100, weight=20), (value=120, weight=30)]
Knapsack capacity: 50
Value-to-weight ratios: [6, 5, 4]
Solution: Take all of item 1, all of item 2, and 2/3 of item 3
Total value: 60 + 100 + (120 × 2/3) = 240
Dynamic programming tackles problems with overlapping subproblems and optimal substructure by storing past results (memoization). DP is powerful for optimization problems where brute force would be exponential.
LCS: O(m×n); Matrix Chain Multiplication: O(n³); 0/1 Knapsack: O(n×W); Floyd-Warshall: O(V³). Real-world: used in DNA sequence alignment, text comparison, and route optimization.
DP Table (Partial), columns indexed by "BDCABA":

|    | "" | B | D | C | A | B | A |
|----|----|---|---|---|---|---|---|
| "" | 0  | 0 | 0 | 0 | 0 | 0 | 0 |
| A  | 0  | 0 | 0 | 0 | 1 | 1 | 1 |
| B  | 0  | 1 | 1 | 1 | 1 | 2 | 2 |
| C  | 0  | 1 | 1 | 2 | 2 | 2 | 2 |
Result: The LCS is "BCBA" with length 4
Systematically tries all options and backtracks if a solution fails. Backtracking is essential for solving constraint satisfaction problems and combinatorial optimization.
Time Complexity: O(n!) in the worst case. Key idea: depth-first search + undo invalid decisions when hitting constraints.
. Q . .
. . . Q
Q . . .
. . Q .
This is one of the two valid solutions for the 4-Queens problem.
Backtracking tries placing a queen in each column, then backtracks when attacks are detected.
Graphs model complex systems like networks, cities, and social connections. From traversal to pathfinding and network flow, graph algorithms are foundational in computer science.
BFS/DFS: O(V+E); Dijkstra's: O((V+E) log V); Bellman-Ford: O(V×E); Floyd-Warshall: O(V³). Depth-First Search (DFS): Explores as far as possible along a branch before backtracking. Useful for topological sorting, detecting cycles, and maze generation.
Breadth-First Search (BFS): Explores all neighbors at the current depth before moving to vertices at the next depth level. Ideal for finding shortest paths in unweighted graphs.
For the same graph, DFS and BFS may produce different traversal orders!
Randomized algorithms use random choices to achieve simpler or faster average-case behavior, while complexity theory classifies problems based on their inherent difficulty.
Fun fact: P vs NP is one of the seven Millennium Prize Problems, with a $1 million reward!
Well-known NP-Complete problems include Boolean Satisfiability (SAT), the Traveling Salesman Problem (decision version), Vertex Cover, and Subset Sum.
These problems are believed to require exponential time in the worst case.
String processing algorithms are fundamental for text manipulation, pattern matching, and bioinformatics applications.
Naive matching: O(m×n); KMP: O(m+n); Rabin-Karp: O(m+n) average.
Pattern: "ABABCABAB"
Prefix table: [0,0,1,2,0,1,2,3,4]
The prefix table helps skip unnecessary comparisons by leveraging previously matched characters, making KMP much faster than naive matching for repeated patterns.
Understanding time and space complexity is crucial for algorithm selection. Below is a comparison of major algorithms we've studied.
| Algorithm | Best Case | Average Case | Worst Case | Space | Stable |
|---|---|---|---|---|---|
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No |
| Dijkstra's | - | O((V+E) log V) | O((V+E) log V) | O(V) | - |
| Dynamic Programming | - | Problem-dependent | Problem-dependent | Often O(n²) | - |
Throughout the lab sessions, we coded many algorithms using C, reinforcing concepts with hands-on experience. These included recursive and iterative approaches, visualizing sorting methods, understanding graph traversal via adjacency lists, and implementing complex algorithms from scratch.
Our lab work focused on empirical analysis where we measured actual running times, compared theoretical complexities with practical performance, and optimized implementations for real-world scenarios.