When a Triple Loop Burned Weeks and a HashTable Fixed It in Minutes

Some failures don’t originate from bad infrastructure, legacy systems, or incompatible APIs. Sometimes, the most expensive performance bottlenecks stem from a seemingly innocent piece of logic in a scrappy script written without a clear grasp of scale.

This is the story of how a routine Active Directory update spiraled into a costly operational mess—until four lines of code turned it around.


Background

One of the organizations I worked for had deployed Microsoft System Center Service Manager as its ITSM tool. To be fair, it was not a good choice to begin with, and Microsoft later discontinued the product altogether. The organization was large and global, with offices spread across 150+ countries.

To identify users accurately, the tool needed a reliable user-to-country mapping in Active Directory. The implementation was outsourced to a consultant who knew the tool but did not appreciate the importance of complexity and scale. This is where the trouble started.

A Simple Problem at Scale

The task was to update the country code and country name fields in Active Directory for approximately 90,000 users, based on a CSV containing user identifiers and their updated country codes.

Input 1 (user_country.csv)

SamAccountName,CountryCode
jdoe,IN
asmith,CH

This CSV didn't include country names, so the consultant downloaded another CSV from the internet that mapped each country code (IN) to a country name (India).

Input 2 (country_map.csv)

CountryCode,CountryName
IN,India
CH,Switzerland

The consultant wrote a PowerShell script to ingest the two lists, match each user with their country name, and update AD accordingly.
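The exact script is not reproduced here, but the data-loading step would have looked something like this; the file paths and the AD query are assumptions, not the consultant's actual code:

# Illustrative data-loading step (paths and query are assumptions)
Import-Module ActiveDirectory

$userCountryList = Import-Csv -Path 'C:\data\user_country.csv'   # SamAccountName,CountryCode
$countryMapList  = Import-Csv -Path 'C:\data\country_map.csv'    # CountryCode,CountryName

# Pull every user once so the script is not querying AD per CSV row
$allADUsers = Get-ADUser -Filter * -Properties Country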

The result, however, was anything but straightforward. The job didn’t finish in hours. It barely progressed. Servers were running hot, memory usage kept rising, and CPU usage stayed pegged at 100%. The environment wasn't crashing—but it was unusable.


The Sawtooth Memory Usage

When we looked at memory and CPU usage in Task Manager, we saw a classic sawtooth pattern:

Memory (MB)
│
│         /\       /\         /\
│        /  \     /  \       /  \
│       /    \   /    \     /    \
│______/      \_/      \___/      \____ Time →

The sawtooth memory pattern emerged because the script used deeply nested loops that continuously created temporary objects during each iteration.

The inner loops were generating objects faster than the garbage collector could reclaim them. Memory usage would rise steadily until the GC kicked in, briefly reducing consumption, only for it to climb again, each peak higher than the last. This cycle repeated endlessly, producing the characteristic sawtooth graph with a rising baseline.
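You can see this behaviour from PowerShell itself, without a profiler, by churning short-lived objects and sampling the managed heap. This is purely illustrative and not the original script:

# Illustrative only: churn short-lived objects and watch the managed heap rise and fall
1..10 | ForEach-Object {
    $tmp = foreach ($i in 1..50000) { [pscustomobject]@{ Index = $i } }   # temporary objects
    $mb  = [math]::Round([System.GC]::GetTotalMemory($false) / 1MB, 1)    # current allocation, no forced GC
    "Pass {0,2}: ~{1} MB allocated on the managed heap" -f $_, $mb
    $tmp = $null
}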


The Root Cause: Brute Force Disguised as a Script

The original script looked innocent at first glance, but closer inspection revealed a classic anti-pattern: a triple nested loop performing linear comparisons at scale. What worked for small batches became catastrophic with real data.

The root cause was simple but dangerous:

foreach ($adUser in $allADUsers) {                       # ~90,000 AD users
    foreach ($entry in $userCountryList) {               # full scan of ~90,000 CSV rows per user
        if ($entry.SamAccountName -eq $adUser.SamAccountName) {
            foreach ($mapEntry in $countryMapList) {      # another scan for every match
                if ($mapEntry.CountryCode -eq $entry.CountryCode) {
                    Set-ADUser ...
                }
            }
        }
    }
}

Each foreach loop was comparing entries linearly, with no early exit. With roughly 90,000 AD users and 90,000 CSV rows, the user match alone meant on the order of eight billion string comparisons, before the inner pass over the country map and before a single AD update was applied.

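A back-of-the-envelope estimate makes the gap concrete; the ~250 country codes figure is an assumption, while the user counts come from the job itself:

# Rough cost estimate: nested linear scans vs. hash lookups
$adUsers = 90000; $csvRows = 90000; $codes = 250               # ~250 codes is an assumption
$nestedComparisons = [long]$adUsers * $csvRows + [long]$adUsers * $codes
$hashLookups       = [long]$adUsers * 2                        # one lookup per table per user
"Nested loops : {0:N0} comparisons" -f $nestedComparisons      # ~8.1 billion
"Hash tables  : {0:N0} lookups"     -f $hashLookups            # 180,000
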

The Fix: 4 Lines of Code

All we needed was to replace linear searches with indexed lookups using hash tables:

# Index both lists once; -AsString keeps the keys as plain strings so later lookups behave predictably
$userMap    = $userCountryList | Group-Object SamAccountName -AsHashTable -AsString
$countryMap = $countryMapList  | Group-Object CountryCode    -AsHashTable -AsString

Then, instead of blindly looping:

# Two O(1) lookups replace two full scans per user
if ($userMap.Contains($sam)) {
    $code = $userMap[$sam][0].CountryCode        # [0]: Group-Object returns a group per key
    if ($countryMap.Contains($code)) {
        $name = $countryMap[$code][0].CountryName
        Set-ADUser ...
    }
}
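
An equally simple alternative, if you would rather avoid Group-Object and the [0] indexing, is to build plain hashtables directly; this is a sketch over the same two lists:

# Plain hashtables: one object per key, no [0] indexing needed
$userMap = @{}
foreach ($entry in $userCountryList) { $userMap[$entry.SamAccountName] = $entry }

$countryMap = @{}
foreach ($mapEntry in $countryMapList) { $countryMap[$mapEntry.CountryCode] = $mapEntry }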

The new version finished the entire job in minutes.
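
For completeness, the corrected flow end to end would look roughly like this; the AD attribute names (c and co) and the -WhatIf safety switch are my assumptions, not the consultant's exact script:

# End-to-end sketch: index once, then a single pass over AD users
$userMap    = $userCountryList | Group-Object SamAccountName -AsHashTable -AsString
$countryMap = $countryMapList  | Group-Object CountryCode    -AsHashTable -AsString

foreach ($adUser in $allADUsers) {
    $sam = $adUser.SamAccountName
    if (-not $userMap.Contains($sam)) { continue }

    $code = $userMap[$sam][0].CountryCode
    if (-not $countryMap.Contains($code)) { continue }

    $name = $countryMap[$code][0].CountryName
    # 'c' holds the two-letter code, 'co' the display name; adjust to whatever your ITSM expects
    Set-ADUser -Identity $sam -Replace @{ c = $code; co = $name } -WhatIf
}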


The Broader Lesson: Complexity Is Not Optional

This wasn’t a lesson in PowerShell optimization. This was a lesson in computational thinking.

The consultant didn’t lack skill—he lacked foresight. He treated the script like a static set of instructions, not a dynamic system operating at scale. What worked for ten users in a test file was silently killing performance in production.

Every developer, even one writing a “simple” script, must internalize the basics: how input size drives runtime, when a linear scan becomes a bottleneck, and when an indexed lookup should replace a loop.


Retrospective: What Really Went Wrong

We often attribute slowness to external factors: “AD is slow,” “the network is flaky,” “maybe the server is overloaded.” But sometimes, the fault is entirely in the logic layer.

The script didn’t need retries, threads, or better hardware.

It needed basic algorithmic awareness.

This is a blind spot in many IT automation efforts: treating scripting like wiring, not engineering. The result is brittle systems that crack the moment they meet production-scale data.


Final Thought

In the end, this wasn't a heroic rescue—it was a correction. A tiny shift in mindset, from looping over data to indexing it, turned a collapsing script into a 7-minute operation.

It’s easy to underestimate the cost of “just one more loop.”

But in large environments, complexity compounds fast. The most powerful performance improvement doesn’t come from better hardware—it comes from replacing loops with lookups.

And sometimes, it’s the difference between hours of downtime and a job well done.