HelloUniversity

Algorithmic Strategies: Greedy Algorithm

Published on: September 1, 2025 by Henson M. Sagorsor



"Algorithms are the heart of computer science." This sentiment, often attributed to Thomas H. Cormen, reminds us why studying them matters. In coding interviews and in production systems alike, a handful of well-known algorithmic strategies come up again and again. One of the most practical approaches you’ll encounter is the greedy algorithm.

The greedy algorithm works on a simple principle: at each step, choose the option that looks best right now. No looking back. No second-guessing. Just the best move at the moment! This makes it an excellent starting point for solving real-world problems where speed and simplicity matter.

You’ll see greedy strategies powering everything from network routing to file compression. Think about Huffman coding, which reduces storage space, or Dijkstra’s algorithm, which finds shortest paths in a graph. Each one is built on the greedy choice property and optimal substructure.

In this lesson, we’ll break down the key characteristics of greedy algorithms, study concrete examples like the activity selection problem, and even explore where greedy strategies fail. By the end, you’ll know not only how to implement them but also when they deliver the most value.



Concept of Greedy Algorithm

At its core, a greedy algorithm makes decisions step by step. At each stage, it selects the option that provides the most immediate benefit without reconsidering earlier choices. Unlike dynamic programming or backtracking, there is no revisiting of past decisions.

This approach can be extremely efficient when applied to problems that fit the model. When a problem aligns with the greedy choice property and optimal substructure, a greedy algorithm doesn’t just give a quick answer—it gives the best one.

Greedy Choice Property

The greedy choice property means that making the best decision at the current step will lead to the overall optimal solution. Each local decision adds up to a globally efficient result. For example, in the activity selection problem, choosing the activity that ends earliest ensures more room for other activities later.

Optimal Substructure

Optimal substructure means that solving smaller parts of the problem optimally will naturally build toward the best solution for the whole problem. Think of it like assembling a puzzle: if each small section is correct, then the entire puzzle forms perfectly. This property makes greedy strategies reliable for problems like Huffman coding and Dijkstra’s algorithm.


1. Activity Selection Problem

The activity selection problem demonstrates how a greedy algorithm works in scheduling. Given a set of activities with start and finish times, the task is to select the maximum number of activities that do not overlap.

Greedy Approach

Always choose the activity that finishes earliest. This keeps as much room as possible for the remaining activities and ensures the maximum number of non-overlapping selections.

Algorithm Steps

  1. Sort activities by their finish times.
  2. Select the first activity (earliest finish).
  3. For each next activity in order:
    • If its start time ≥ finish time of the last selected activity → select it.
    • Otherwise → skip it.

Time Complexity

Sorting takes O(n log n), and the selection loop runs in O(n). Overall complexity: O(n log n).

Step-by-Step Example

Suppose we have the following activities:

Activity   Start Time   Finish Time
A1         1            4
A2         3            5
A3         0            6
A4         5            7
A5         8            9
A6         5            9

Processing Order and Decisions

  • A1 (1–4): First activity → ✅ selected.
  • A2 (3–5): Starts at 3, overlaps with A1 (ends at 4) → ❌ skipped.
  • A3 (0–6): Starts before A1 finishes → ❌ skipped.
  • A4 (5–7): Starts after A1 ends → ✅ selected.
  • A5 (8–9): Starts after A4 ends → ✅ selected.
  • A6 (5–9): Starts at 5, overlaps with A4 (ends at 7) → ❌ skipped.

Final Selected Set: A1, A4, A5
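The steps above can be sketched in a few lines of Python. This is a minimal illustration using the example activities; the function name and tuple layout are choices made here, not part of any library.

```python
# Greedy activity selection: sort by finish time, then keep every
# activity whose start time does not overlap the last one selected.
def select_activities(activities):
    """activities: list of (name, start, finish) tuples."""
    selected = []
    last_finish = float("-inf")
    # Sorting by finish time puts the earliest-ending activity first.
    for name, start, finish in sorted(activities, key=lambda a: a[2]):
        if start >= last_finish:      # compatible with the last selection
            selected.append(name)
            last_finish = finish
    return selected

activities = [("A1", 1, 4), ("A2", 3, 5), ("A3", 0, 6),
              ("A4", 5, 7), ("A5", 8, 9), ("A6", 5, 9)]
print(select_activities(activities))  # → ['A1', 'A4', 'A5']
```

The sort dominates the running time, matching the O(n log n) bound stated above.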


2. Huffman Coding

Huffman coding is a greedy algorithm used for data compression. It creates an optimal prefix code that minimizes the number of bits required to represent characters. More frequent characters get shorter codes, while less frequent ones get longer codes. This is the principle behind many compression formats like ZIP, JPEG, and MP3.

Greedy Approach

The greedy idea is to repeatedly combine the two least frequent symbols into a new node until only one tree remains. This ensures that the least frequent characters are placed deeper in the tree, while frequent characters stay closer to the root with shorter codes.

Algorithm Steps

  1. Create a leaf node for each character and insert them into a min-heap based on frequency.
  2. While the heap has more than one node:
    • Extract the two nodes with the smallest frequency.
    • Create a new internal node with these two as children. The frequency of the new node = sum of the two nodes.
    • Insert the new node back into the heap.
  3. When only one node remains, this is the root of the Huffman tree.
  4. Assign codes by traversing the tree (left = 0, right = 1).

Time Complexity

Building the tree requires heap operations, resulting in O(n log n) complexity, where n is the number of characters.

Step-by-Step Example

Suppose we have characters with the following frequencies:

Character   Frequency
A           5
B           9
C           12
D           13
E           16
F           45

Processing Order and Decisions

  • Take A (5) and B (9) → merge into node AB (14).
  • Now heap = {C:12, D:13, AB:14, E:16, F:45}.
  • Take C (12) and D (13) → merge into node CD (25).
  • Now heap = {AB:14, E:16, CD:25, F:45}.
  • Take AB (14) and E (16) → merge into node ABE (30).
  • Now heap = {CD:25, ABE:30, F:45}.
  • Take CD (25) and ABE (30) → merge into node CDABE (55).
  • Now heap = {F:45, CDABE:55}.
  • Finally, merge F (45) and CDABE (55) → root node (100).

By traversing the final Huffman tree, we assign binary codes:

  • A: 1100
  • B: 1101
  • C: 100
  • D: 101
  • E: 111
  • F: 0

Notice how F, the most frequent character, received the shortest code (0). Less frequent characters like A and B received longer codes. This greedy process ensures the overall number of bits is minimized.
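The merging procedure maps naturally onto a min-heap. Below is one possible Python sketch for the example frequencies; the exact bit patterns can differ with tie-breaking, but the code lengths, and therefore the total compressed size, come out the same.

```python
import heapq

def huffman_codes(freqs):
    """Build Huffman codes for a {symbol: frequency} dict."""
    # Heap entries are (frequency, tie-breaker, symbol-or-subtree);
    # the tie-breaker keeps comparisons from reaching the payload.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # two least frequent nodes
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, counter, (left, right)))
        counter += 1

    codes = {}
    def walk(node, code):
        if isinstance(node, tuple):          # internal node: recurse
            walk(node[0], code + "0")        # left edge = 0
            walk(node[1], code + "1")        # right edge = 1
        else:
            codes[node] = code or "0"        # leaf: record its code
    walk(heap[0][2], "")
    return codes

freqs = {"A": 5, "B": 9, "C": 12, "D": 13, "E": 16, "F": 45}
codes = huffman_codes(freqs)
total_bits = sum(freqs[s] * len(codes[s]) for s in freqs)
print(codes, total_bits)  # total cost is 224 bits for this input
```

F ends up with a 1-bit code and A and B with 4-bit codes, exactly as in the worked example.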


3. Prim's Algorithm (Minimum Spanning Tree)

Prim’s algorithm is a greedy algorithm used to find a minimum spanning tree (MST) for a connected, undirected graph. The MST connects all vertices together with the smallest possible total edge weight, making it widely applicable in network design, clustering, and routing.

Greedy Approach

Start from any vertex, and at each step, choose the smallest edge that connects a vertex inside the MST to a vertex outside. This greedy choice guarantees the MST grows with minimal total weight.

Algorithm Steps

  1. Pick an arbitrary starting vertex and mark it as part of the MST.
  2. Find the edge with the smallest weight that connects a vertex in the MST to a vertex outside.
  3. Add that edge (and the new vertex) to the MST.
  4. Repeat steps 2–3 until all vertices are included.

Time Complexity

Using a priority queue to efficiently select edges, the complexity is O(E log V), where E is the number of edges and V is the number of vertices.

Step-by-Step Example

Consider a graph with vertices {A, B, C, D, E} and the following weighted edges:

Edge   Weight
A–B    2
A–C    3
B–C    1
B–D    4
C–D    5
C–E    6
D–E    7

Processing Order and Decisions

  • Start at A. Possible edges: A–B (2), A–C (3). Smallest = A–B (2) → ✅ select edge A–B.
  • MST now: {A, B}. Next edges: A–C (3), B–C (1), B–D (4). Smallest = B–C (1) → ✅ select edge B–C.
  • MST now: {A, B, C}. Next edges: B–D (4), C–D (5), C–E (6). Smallest = B–D (4) → ✅ select edge B–D.
  • MST now: {A, B, C, D}. Next edges: C–E (6), D–E (7). Smallest = C–E (6) → ✅ select edge C–E.
  • MST now: {A, B, C, D, E}. All vertices included → stop.

Final MST edges: A–B (2), B–C (1), B–D (4), C–E (6)
Total weight = 2 + 1 + 4 + 6 = 13.
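The trace above can be reproduced with a small Python sketch. The adjacency-list shape and function name are choices made for this illustration; a heap of candidate edges stands in for "find the smallest edge leaving the tree."

```python
import heapq

def prim_mst(graph, start):
    """graph: {vertex: [(weight, neighbor), ...]} adjacency list."""
    visited = {start}
    heap = list(graph[start])            # candidate edges leaving the tree
    heapq.heapify(heap)
    mst_edges, total = [], 0
    while heap and len(visited) < len(graph):
        weight, v = heapq.heappop(heap)
        if v in visited:                 # edge leads back into the tree
            continue
        visited.add(v)
        mst_edges.append((weight, v))
        total += weight
        for edge in graph[v]:            # add the new frontier edges
            if edge[1] not in visited:
                heapq.heappush(heap, edge)
    return mst_edges, total

graph = {
    "A": [(2, "B"), (3, "C")],
    "B": [(2, "A"), (1, "C"), (4, "D")],
    "C": [(3, "A"), (1, "B"), (5, "D"), (6, "E")],
    "D": [(4, "B"), (5, "C"), (7, "E")],
    "E": [(6, "C"), (7, "D")],
}
edges, total = prim_mst(graph, "A")
print(total)  # → 13
```

Each edge enters and leaves the heap at most once, giving the O(E log V) behavior noted above.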


4. Dijkstra's Algorithm (Shortest Path)

Dijkstra’s algorithm is a greedy algorithm that finds the shortest path from a source vertex to all other vertices in a weighted graph with non-negative edges. It’s widely used in GPS navigation, network routing, and real-time pathfinding systems.

Greedy Approach

At each step, pick the unvisited vertex with the smallest tentative distance. Once chosen, that vertex’s shortest path is finalized: because all edge weights are non-negative, no later route through an unvisited vertex can be shorter. Each iteration extends the set of finalized shortest paths outward from the source.

Algorithm Steps

  1. Initialize all distances to infinity (∞), except the source, which is 0.
  2. Mark all vertices as unvisited.
  3. Select the unvisited vertex with the smallest distance.
  4. Update the tentative distances of its neighbors.
  5. Mark the current vertex as visited (finalized).
  6. Repeat steps 3–5 until all vertices are visited.

Time Complexity

Using a binary-heap priority queue, the complexity is O(E log V); with a Fibonacci heap it improves to O(E + V log V), where E is the number of edges and V is the number of vertices.

Step-by-Step Example

Consider a graph with vertices {A, B, C, D, E} and the following edges:

Edge   Weight
A–B    4
A–C    2
B–C    1
B–D    5
C–D    8
C–E    10
D–E    2

Processing Order and Decisions

  • Start at A. Distances: A=0, B=∞, C=∞, D=∞, E=∞.
  • Update neighbors → B=4, C=2. Mark A as visited.
  • Pick smallest unvisited (C=2). Update neighbors:
    • B = min(4, 2+1=3) → update to 3.
    • D = min(∞, 2+8=10) → update to 10.
    • E = min(∞, 2+10=12) → update to 12.
  • Pick smallest unvisited (B=3). Update neighbors:
    • D = min(10, 3+5=8) → update to 8.
  • Pick smallest unvisited (D=8). Update neighbors:
    • E = min(12, 8+2=10) → update to 10.
  • Pick smallest unvisited (E=10). No more updates.

Final shortest distances from A: A=0, B=3, C=2, D=8, E=10
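The same relaxation process can be sketched in Python. Rather than an explicit "unvisited" set, this version uses a lazy-deletion heap and skips stale entries, a common implementation choice rather than part of the algorithm's definition.

```python
import heapq

def dijkstra(graph, source):
    """graph: {vertex: [(neighbor, weight), ...]}; weights must be non-negative."""
    dist = {v: float("inf") for v in graph}
    dist[source] = 0
    heap = [(0, source)]                 # (tentative distance, vertex)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:                  # stale entry: u was already finalized
            continue
        for v, w in graph[u]:
            if d + w < dist[v]:          # relaxation step
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

graph = {
    "A": [("B", 4), ("C", 2)],
    "B": [("A", 4), ("C", 1), ("D", 5)],
    "C": [("A", 2), ("B", 1), ("D", 8), ("E", 10)],
    "D": [("B", 5), ("C", 8), ("E", 2)],
    "E": [("C", 10), ("D", 2)],
}
dists = dijkstra(graph, "A")
print(dists)  # → {'A': 0, 'B': 3, 'C': 2, 'D': 8, 'E': 10}
```

The output matches the hand-traced distances above.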


Counterexample: The Coin Change Problem

Not all problems can be solved optimally with a greedy algorithm. A good illustration is the coin change problem, where the goal is to make change for a given amount with the fewest coins possible.

Suppose you have coin denominations {1, 3, 4} and you need to make change for 6. A greedy approach would first pick 4, then two 1s, for a total of 3 coins. But the optimal solution is two 3s, using only 2 coins. This shows that the greedy method does not always guarantee the best solution.
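A short Python comparison makes the failure concrete. The greedy version always grabs the largest coin that fits; the dynamic-programming version computes the true minimum for every sub-amount. Both function names are illustrative choices.

```python
def greedy_change(coins, amount):
    """Greedy: take the largest coin that fits at each step."""
    count = 0
    for c in sorted(coins, reverse=True):
        count += amount // c
        amount %= c
    return count if amount == 0 else None

def optimal_change(coins, amount):
    """Dynamic programming: fewest coins for every sub-amount 0..amount."""
    best = [0] + [float("inf")] * amount
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a:
                best[a] = min(best[a], best[a - c] + 1)
    return best[amount]

coins = [1, 3, 4]
print(greedy_change(coins, 6))   # → 3  (4 + 1 + 1)
print(optimal_change(coins, 6))  # → 2  (3 + 3)
```

For denominations like {1, 5, 10, 25} the two agree, which is why greedy change-making feels natural; {1, 3, 4} shows it is not safe in general.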

Limitations of Greedy Algorithms

  • Not always optimal: Greedy algorithms only work when the problem has the greedy choice property and optimal substructure. Without these, results may be suboptimal.
  • Local vs global optima: They focus on the best local choice, which doesn’t always add up to the best global solution.
  • No backtracking: Once a decision is made, it is never revisited. If the problem requires reevaluating earlier steps, greedy algorithms fail.

These limitations make it important to recognize when a problem is suitable for greedy strategies and when a different approach—like dynamic programming or divide and conquer—is needed.


Mandatory Assessment

All students must complete the assessment for this lesson. Your submission is required for course completion.

Take the Greedy Algorithm Assessment

Don’t miss this! The assessment is required for all students.


Further Reading & Resources

  • Lesson Notes (PDF)
  • Lecture Slides
  • Kleinberg & Tardos, Algorithm Design – Chapter on Greedy Strategies
  • Cormen, Leiserson, Rivest & Stein (CLRS), Introduction to Algorithms
  • GeeksforGeeks – “Greedy Algorithms” section (search for activity selection, Huffman coding, Prim’s, Dijkstra’s)
