Continuing the Evolution of the Dev Server and SSG: Eliminating Unnecessary Rebuilds

In previous articles, I detailed how to create a static site generator (SSG) in Node.js and how to implement a simple local dev server with auto-reload via EventStream.
This article solves the problem of false rebuild triggers (the dev server initiating updates despite no actual file changes) with minimal effort. I’ll explain how to fix it using content hashing.

🐒 The Problem

When using chokidar (or other file watchers), you often run into a situation where saving a file triggers a rebuild even though its content is unchanged: the browser tab refreshes, but nothing visible has changed.

This leads to ⏳ slower "edit → build → test" cycles. Resource-heavy operations such as generation, minification, and file-system writes run unnecessarily. Developers get frustrated 😵 by constant rebuilds and page "flashing," losing precious focus.

We waste 🔥 resources on rebuilds that change nothing.

🧐 Adding a Change Filter

Core Idea: On each file change event (change), check if the content has actually changed, not just metadata (e.g., modification time).

A naive solution would be to store every file in memory and compare byte-by-byte. But this risks memory exhaustion.

❗️ Note! If your tools already keep files in memory for compilation, streaming comparison can be even faster: by streaming the content, you can detect a change at the first mismatch (no need to read the entire file from disk) and update the in-memory copy from that point, with no redundant disk reads. This is efficient, but only if your tools work directly in memory, which is orders of magnitude faster than an SSD. 📝 I’ll explore this approach in a separate article.
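As a teaser, here is a minimal sketch of that early-exit idea, assuming the build holds sources in memory as Buffers (differsFromMemory is a hypothetical helper, not code from this series):

import { createReadStream } from "node:fs";

// Hypothetical helper: compare a file on disk against its in-memory copy,
// stopping at the first mismatching chunk instead of reading the whole file.
const differsFromMemory = async (
  filePath: string,
  inMemory: Buffer
): Promise<boolean> => {
  let offset = 0;
  for await (const chunk of createReadStream(filePath)) {
    const buf = chunk as Buffer;
    const slice = inMemory.subarray(offset, offset + buf.length);
    // early exit: a longer file or a differing chunk is a real change
    if (buf.length !== slice.length || !buf.equals(slice)) {
      return true;
    }
    offset += buf.length;
  }
  // a shorter file on disk is also a change
  return offset !== inMemory.length;
};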

Thus, we hash file content using 🔐 SHA-256. Such a signature is quick to compare and compact to store in memory.

⚠️ Hypothetically, two different contents could produce the same hash (a collision), but with SHA-256 this is unrealistic for our use case.

😂 Remember, hashing is like contraception: no 100% guarantee.

Step 1: Pre-Indexing

An index (Map) stores <path → hash> pairs for quick comparison between the new and old states.

const index = new Map<string, string>();

Hash calculation function:

import fs from "node:fs";
import path from "node:path";
import crypto from "node:crypto";

const hashFile = (filePath: string): Promise<string | undefined> =>
  fs.promises
    .readFile(filePath)
    .then((content) =>
      crypto.createHash("sha256").update(content).digest("hex")
    )
    // an unreadable file (e.g., just deleted) is treated as missing
    .catch(() => undefined);
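
If large assets are tracked, the file can also be streamed into the hash instead of being buffered whole; a minimal variant with the same error behavior:

// Streaming variant: hashes the file chunk by chunk, useful for large assets.
const hashFileStreaming = (filePath: string): Promise<string | undefined> =>
  new Promise((resolve) => {
    const hash = crypto.createHash("sha256");
    fs.createReadStream(filePath)
      .on("error", () => resolve(undefined)) // treat unreadable files as missing
      .on("data", (chunk) => hash.update(chunk))
      .on("end", () => resolve(hash.digest("hex")));
  });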

We use glob to find all tracked files up front and compute their hashes:

const createIndex = async (dirs: string[], ignore: RegExp): Promise<void> => {
  console.info("Building initial files index...");
  for (const dir of dirs) {
    // fs.promises.glob is available in Node.js 22+
    const iter = fs.promises.glob(path.join(dir, "**/*"));
    for await (let filePath of iter) {
      // skip excluded files
      if (ignore.test(filePath)) {
        continue;
      }
      // skip directories
      if ((await fs.promises.lstat(filePath)).isDirectory()) {
        continue;
      }
      // normalize keys so they match the watcher's paths
      filePath = filePath.replaceAll("\\", "/");

      // compute and save; skip files that could not be read
      const hash = await hashFile(filePath);
      if (hash) {
        index.set(filePath, hash);
      }
    }
  }
};
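
It is called once at startup, before the watcher begins; the directory names and ignore pattern below are illustrative:

// Illustrative values; adjust to your project layout.
await createIndex(["content", "assets"], /node_modules|\.git|dist/);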

Step 2: Event Handler

Goal: Trigger onChange only when real changes occur in files (new, modified, or deleted).

  1. Start a watcher for all directories. fs.watch events add candidates to a queue; submitWatchEvents schedules updates.
for (const dir of dirs) {
  fs.watch(
    dir,
    {
      persistent: true,
      recursive: true,
    },
    (_, filename) => {
      // filename can be null on some platforms; skip ignored files
      if (filename && !ignore.test(filename)) {
        // normalize the path the same way as the index keys
        const filePath = path.join(dir, filename).replaceAll("\\", "/");
        // track the candidate
        changedFiles.add(filePath);
        // schedule a debounced check
        submitWatchEvents();
      }
    }
  );
}
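The handler above references a shared changedFiles set and a debounce helper, neither of which is a Node.js built-in; a minimal sketch of both:

// Candidate paths accumulated between debounce ticks.
const changedFiles = new Set<string>();

// Minimal debounce: run `fn` once, `ms` milliseconds after the last call.
const debounce = (fn: () => void, ms: number) => {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return () => {
    clearTimeout(timer);
    timer = setTimeout(fn, ms);
  };
};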
  2. A debounced scheduler processes accumulated changes after a 60ms delay.
const submitWatchEvents = debounce(() => {
  // process candidates
  const files = [...changedFiles];
  // reset for future changes
  changedFiles.clear();
  // check each file
  for (const file of files) {
    void checkFileChanged(file, onChange);
  }
}, 60);
  3. For each candidate, compute the new hash. If it differs from the stored value, trigger onChange.
const checkFileChanged = async (
  filePath: string,
  onChange: (event: FileChangeEvent) => void
) => {
  let kind = FileChangeKind.change;
  const prevHash = index.get(filePath);
  const hash = await hashFile(filePath);
  if (hash) {
    // readable file: refresh the index; no previous hash means it's new
    index.set(filePath, hash);
    if (!prevHash) {
      kind = FileChangeKind.add;
    }
  } else {
    // unreadable: the file was deleted, drop it from the index
    index.delete(filePath);
    kind = FileChangeKind.remove;
  }
  // fire only when the content actually differs
  if (prevHash !== hash) {
    console.info(`✨ ${kind}: ${filePath}`);
    onChange({ filePath, kind, hash, prevHash });
  }
};
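
The FileChangeKind and FileChangeEvent types aren’t shown in this article; a plausible shape, inferred from their usage above:

// Inferred declarations; adapt to your own event model.
enum FileChangeKind {
  add = "add",
  change = "change",
  remove = "remove",
}

interface FileChangeEvent {
  filePath: string;
  kind: FileChangeKind;
  hash?: string; // undefined when the file was removed
  prevHash?: string; // undefined when the file is new
}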

💥 Performance

Initial startup takes slightly longer because of indexing, but the overhead on each subsequent change is minimal: one file read and one hash per candidate.

⚡️ Choosing a Hashing Algorithm

SHA-256 is a safe default, but what you hash matters as much as how you hash it: identical bytes are not the only reasonable definition of "unchanged."

πŸ› οΈ Parsing vs. Bytes

Some file changes don’t affect build output (e.g., adding empty lines or formatting). Avoid rebuilds for such cases:

  1. Format files (e.g., using prettier).
  2. Parse or analyze the AST.
  3. Compute hashes only if parsing succeeds.
  4. Pause tracking for files with errors until fixed.
try {
  // normalize formatting so whitespace-only edits hash identically
  // (prettier.format returns a Promise in Prettier 3)
  const formatted = await prettier.format(content, { filepath: filePath });
  // accept the hash only if the source actually compiles
  compileSourceFile(formatted);
  errors.delete(filePath);
  const hash = crypto.createHash("sha256").update(formatted).digest("hex");
  // ... proceed as in checkFileChanged
} catch (err) {
  // pause live-reload for broken files until they are fixed
  errors.set(filePath, err);
}

βš–οΈ Balancing Speed and Accuracy

🏁 Conclusion

  1. 🔍 Indexing: Hash files on startup.
  2. 🛑 Filtering: Ignore unchanged files.
  3. 🧠 Advanced Logic: Parsing and validation for critical files.

🚀 This approach reduces redundant iterations, conserves resources, and lets developers focus on meaningful changes.

🔎 Have you faced similar issues? How does your file-watching system work? Share your experiences!


❤️ Your Feedback Matters!

✨ Ping and follow me on social networks (until I add site comments)!