Continuing the Evolution of the Dev Server and SSG: Eliminating Unnecessary Rebuilds
In previous articles, I detailed how to create a static site generator (SSG) in Node.js and how to implement a simple local dev server with auto-reload via EventStream.
This article solves the problem of false rebuild triggers (when the dev server initiates updates despite no actual file changes) with minimal effort. I'll explain how to fix this using content hashing.
The Problem
When using chokidar (or other file watchers), you often encounter situations where saving a file, even if the content remains unchanged, triggers a rebuild: the browser tab refreshes, but no changes are visible.
This leads to slower "edit → build → test" cycles. Resource-heavy operations like generation, minification, and file system writes are triggered unnecessarily. Developers get frustrated by constant rebuilds and page "flashing," losing precious focus.
We waste resources on rebuilds that change nothing.
Adding a Change Filter
Core Idea: On each file change event (change), check whether the content has actually changed, not just the metadata (e.g., modification time).
A naive solution would be to store every file in memory and compare byte-by-byte. But this risks memory exhaustion.
Note: if your tools already keep files in memory for compilation, streaming comparison can be even faster. By streaming the content, you can detect a change at the first mismatching byte (no need to read the entire file from disk) and update the in-memory copy from that point, with no redundant disk reads. This is efficient, but only when tools work directly in memory, which is orders of magnitude faster than an SSD.
I'll explore this approach in a separate article.
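As a rough illustration of that idea (deferring the full treatment to that article), here is a minimal sketch; hasChangedStreaming is a hypothetical helper that assumes the current copy of the file is already held in memory:

import fs from "node:fs";

// Compare a file on disk against its in-memory copy,
// stopping at the first mismatching chunk.
const hasChangedStreaming = async (
  filePath: string,
  inMemory: Buffer
): Promise<boolean> => {
  let offset = 0;
  for await (const chunk of fs.createReadStream(filePath)) {
    const buf = chunk as Buffer;
    // mismatch, or the file grew past the in-memory copy
    if (
      offset + buf.length > inMemory.length ||
      !buf.equals(inMemory.subarray(offset, offset + buf.length))
    ) {
      return true;
    }
    offset += buf.length;
  }
  // the file shrank if we read fewer bytes than we hold in memory
  return offset !== inMemory.length;
};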
Thus, we hash file content using SHA-256. Such a key or signature is quick to compare and compact to store in memory.
Hypothetically, different content could produce the same hash, but achieving this is unrealistic for our use case.
Remember, hashing is like contraception: no 100% guarantee.
Step 1: Pre-Indexing
An index (Map) stores <path → hash> pairs for quick comparison between the new and old states.
const index = new Map<string, string>();
Hash calculation function:
import fs from "node:fs";
import crypto from "node:crypto";
const hashFile = (filePath: string): Promise<string | undefined> =>
fs.promises
.readFile(filePath)
.then((content) =>
crypto.createHash("sha256").update(content).digest("hex")
)
// ignore errors
.catch(() => undefined);
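A quick sanity check of the helper (the paths are illustrative):

console.info(await hashFile("package.json")); // 64-character hex digest
console.info(await hashFile("no-such-file")); // undefined: read errors are ignored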
Using glob to pre-find tracked files and compute their hashes:
import path from "node:path";

const createIndex = async (dirs: string[], ignore: RegExp): Promise<void> => {
  console.info("Building initial file index...");
  for (const dir of dirs) {
    // requires Node.js 22+; glob patterns use forward slashes, even on Windows
    const iter = fs.promises.glob(path.posix.join(dir, "**/*"));
    for await (let filePath of iter) {
      // ignore excluded files
      if (ignore.test(filePath)) {
        continue;
      }
      // skip directories
      if (fs.lstatSync(filePath).isDirectory()) {
        continue;
      }
      // normalize keys
      filePath = filePath.replaceAll("\\", "/");
      // compute and save (skip files that could not be read)
      const hash = await hashFile(filePath);
      if (hash) {
        index.set(filePath, hash);
      }
    }
  }
};
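With that in place, the index is built once at startup; the directories and ignore pattern below are illustrative:

await createIndex(["content", "templates"], /node_modules|\.git|dist/);
console.info(`Indexed ${index.size} files`);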
Step 2: Event Handler
Goal: Trigger onChange only when real changes occur in files (new, modified, or deleted).
- Start a watcher for all directories. fs.watch events add candidates to a queue; submitWatchEvents schedules updates.
// pending candidates, shared with the scheduler below
const changedFiles = new Set<string>();

for (const dir of dirs) {
  fs.watch(
    dir,
    {
      persistent: true,
      // recursive watch requires macOS/Windows, or Linux with Node.js 20+
      recursive: true,
    },
    (_, filename) => {
      // filter ignored files (filename can be null on some platforms)
      if (filename && !ignore.test(filename)) {
        // normalize path
        const filePath = path.join(dir, filename).replaceAll("\\", "/");
        // track changes
        changedFiles.add(filePath);
        // schedule update
        submitWatchEvents();
      }
    }
  );
}
- A debounced scheduler processes accumulated changes after a 60ms delay.
const submitWatchEvents = debounce(() => {
// process candidates
const files = [...changedFiles];
// reset for future changes
changedFiles.clear();
// check each file
for (const file of files) {
void checkFileChanged(file, onChange);
}
}, 60);
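The debounce helper itself isn't shown here; if you don't already have one from a utility library, a minimal version is enough:

// minimal debounce: each call postpones execution by `delay` milliseconds
const debounce = (fn: () => void, delay: number): (() => void) => {
  let timer: NodeJS.Timeout | undefined;
  return () => {
    clearTimeout(timer);
    timer = setTimeout(fn, delay);
  };
};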
- For each candidate, compute the new hash. If it differs from the stored value, trigger onChange.
const checkFileChanged = async (
filePath: string,
onChange: (event: FileChangeEvent) => void
) => {
let kind = FileChangeKind.change;
const prevHash = index.get(filePath);
const hash = await hashFile(filePath);
if (hash) {
index.set(filePath, hash);
if (!prevHash) {
kind = FileChangeKind.add;
}
} else {
index.delete(filePath);
kind = FileChangeKind.remove;
}
if (prevHash !== hash) {
    console.info(`${kind} change: ${filePath}`);
onChange({ filePath, kind, hash, prevHash });
}
};
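For reference, the FileChangeKind and FileChangeEvent types used above aren't spelled out in the snippets; minimal definitions could look like this:

enum FileChangeKind {
  add = "add",
  change = "change",
  remove = "remove",
}

type FileChangeEvent = {
  filePath: string;
  kind: FileChangeKind;
  // hash is undefined for removals, prevHash is undefined for additions
  hash?: string;
  prevHash?: string;
};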
Performance
Initial startup time increases slightly due to indexing, but the overhead on subsequent iterations is minimal: each change event costs one file read and one hash comparison.
Choosing a Hashing Algorithm
- Built-in crypto.createHash("sha256") is fast and supported out of the box.
- For higher performance, consider supersha or xxhash. Benchmark beforehand (see the sketch after this list).
- Recommendation: Start with SHA-256, then switch to lighter algorithms or WebAssembly modules if bottlenecks arise.
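Before reaching for third-party packages, it's worth measuring the built-in digests on your own hardware. A quick micro-benchmark sketch:

import crypto from "node:crypto";

// hash a 1 MB buffer 100 times with each built-in algorithm
const payload = crypto.randomBytes(1024 * 1024);
for (const algorithm of ["sha256", "sha1", "md5"]) {
  const start = performance.now();
  for (let i = 0; i < 100; i++) {
    crypto.createHash(algorithm).update(payload).digest("hex");
  }
  console.info(`${algorithm}: ${(performance.now() - start).toFixed(1)} ms`);
}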
Parsing vs. Bytes
Some file changes don't affect build output (e.g., adding empty lines or reformatting). Avoid rebuilds for such cases:
- Format files (e.g., using prettier).
- Parse or analyze the AST.
- Compute hashes only if parsing succeeds.
- Pause tracking for files with errors until fixed.
import prettier from "prettier";

// errors: Map<string, unknown> of files currently failing to build
try {
  // Prettier v3: format() is async and infers the parser from the file path
  const formatted = await prettier.format(content, { filepath: filePath });
  compileSourceFile(formatted);
  errors.delete(filePath);
  const hash = crypto.createHash("sha256").update(formatted).digest("hex");
  // ... proceed
} catch (err) {
  // Pause live-reload for broken files until they are fixed
  errors.set(filePath, err);
}
Pros:
- React only to valid changes.
- Avoid building broken code.
Cons:
- Formatting and parsing add overhead.
Balancing Speed and Accuracy
- Debounce reduces redundant events. Adjust the delay for responsiveness.
- Hashing filters unchanged files.
- Parsing/formatting is optional for non-critical files.
- Organize the file structure so a build doesn't have to copy thousands of files.
- Skip hashing large files that are rarely edited manually.
Conclusion
- Indexing: hash files on startup.
- Filtering: ignore unchanged files.
- Advanced logic: parse and validate critical files.
This approach reduces redundant iterations, conserves resources, and lets developers focus on meaningful changes.
Have you faced similar issues? How does your file-watching system work? Share your experiences!
Your Feedback Matters!
Which hashing algorithms have you used? For what tasks?
Have you encountered "false" rebuilds in your projects?
What should I add or clarify in this article?
Ping and follow me on social networks (until I add site comments)!