JavaScript has a problem: it has no canonical implementation of the ECMAScript standard. At least two engine implementations, V8 and JSC, can be considered "standard", alongside many less significant ones. V8 is used on the client by Chromium and its derivatives, and on the server by Node and Deno, while JSC is used by Safari on the client and, more recently, by Bun on the server.
Most other languages have a canonical or standard implementation, and some, like PHP, really only have one. This also means that those languages can afford a reliable standard library that is guaranteed to be stable and well-tested.
In lieu of a standard library, JavaScript puts precious few things into the
language itself, like Date. The rest is covered by web standards,
mostly governed by the WHATWG and the W3C. Anything beyond that is either whatever the different
JavaScript runtimes come up with, or is to be found in userland, on npm.
A concrete example: filesystem access
A prominent example of this problem is filesystem access. Node itself has three ways of accessing the filesystem.
The oldest is the synchronous API, which blocks the event loop until the operation completes:
import fs from "node:fs";
const data = fs.readFileSync("file.txt", "utf8");
console.log(data);
Then came the callback API, which is non-blocking but quickly leads to deeply nested code in complex flows:
import fs from "node:fs";
fs.readFile("file.txt", "utf8", (err, data) => {
  if (err) throw err;
  console.log(data);
});
Finally, a promise-based API was backfilled once the ecosystem matured — added in Node v10, stable from v14:
import { readFile } from "node:fs/promises";
const data = await readFile("file.txt", "utf8");
console.log(data);
These three APIs don't reflect any coherent design intent. They reflect Node's age: each one was the idiomatic approach at the time it was introduced, and all three coexist today.
The other major runtimes made different choices. Deno goes async-only:
const data = await Deno.readTextFile("file.txt");
console.log(data);
Bun takes a lazier approach. Bun.file() returns a handle — it doesn't read
anything until you tell it how:
const file = Bun.file("file.txt");
const data = await file.text();
console.log(data);
You chain .text(), .json(), or .arrayBuffer() depending on what you
actually need. Bun also ships full Node fs compatibility as a fallback, which
is itself a statement about the interoperability problem: compatibility with
Node is now effectively a minimum requirement for any new runtime.
This progression, from sync to callbacks to promises to lazy handles, is a useful lens for the broader problem. When a language has no standard library and multiple competing runtimes, the same operation can end up with four different API styles across three different runtimes, with no guarantee of convergence.
This is the problem rcompat is designed to address.
Enter rcompat
rcompat is an abstraction layer over JavaScript server runtimes. Rather than picking a winner or waiting for consensus that may never come, it gives you a single, stable API that delegates to native runtime calls under the hood.
The filesystem example from above looks like this in rcompat:
import fs from "@rcompat/fs";
const file = fs.ref("file.txt");
const data = await file.text();
console.log(data);
The API will look familiar if you've used Bun; that is intentional. Where a native runtime API is well designed, rcompat adopts it rather than inventing something new. But rcompat is not just another standard layered on top of the others: it doesn't implement the filesystem itself. It delegates.
Runtime keys: the mechanism
rcompat makes use of a feature called runtime keys, supported by all major
runtimes, which allows a package to specify divergent import paths depending on
the environment, via conditions in its imports map. Here is a representative
excerpt from @rcompat/fs's package.json:
"imports": {
"#native": {
"@rcompat/source": "./src/private/native/node.ts",
"bun": "./lib/private/native/bun.js",
"deno": "./lib/private/native/deno.js",
"default": "./lib/private/native/node.js"
}
}
When the code runs in Bun, Bun matches the "bun" condition and loads the file
that delegates to Bun.file() calls directly. Deno does the same with "deno".
Node falls back to "default". The abstraction above the seam is identical;
only what happens below is different.
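As a sketch of what sits below that seam (the minimal `text` interface here is illustrative, not rcompat's actual source), each runtime-specific module exports the same shape, implemented in terms of that runtime's native calls. A Node ("default") variant might look like this:

```javascript
import { writeFileSync } from "node:fs";
import { readFile } from "node:fs/promises";

// The Node variant of the native seam. A hypothetical bun.js would
// expose the same shape delegating to Bun.file(path).text(), and a
// deno.js would delegate to Deno.readTextFile(path).
const native = {
  text: (path) => readFile(path, "utf8"),
};

// Round-trip a file through the interface to show it in action.
writeFileSync("demo.txt", "hello from node");
const data = await native.text("demo.txt");
console.log(data); // "hello from node"
```

The code above the seam imports #native and never needs to know which variant it got; the runtime's condition matching decides that at resolution time.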
This is wholly different from writing Node code and relying on Bun or Deno's Node compatibility layers. Those layers exist for pragmatic reasons, but they
are not optimised — which is precisely why Bun ships a --bun flag to bypass
legacy Node paths and use native implementations instead. rcompat skips the
compatibility layer entirely: you write once, and each runtime executes its own
optimal path.
Forward compatibility
Because rcompat's abstraction sits above the runtime boundary rather than being tied to any one runtime's API surface, it is also forward-compatible. An emerging runtime — call it oden — could support code built on rcompat without having to implement the full Node API. It only needs to satisfy rcompat's interface. The ecosystem of packages built on rcompat becomes runnable on that new runtime on day one.
Well-known symbols
A lesser-used feature of the JavaScript language is well-known symbols. The
language defines a handful of these — Symbol.iterator, Symbol.toPrimitive,
Symbol.replace, and others — as stable hooks that objects can implement to
integrate with built-in behaviour:
class Replaceable {
  constructor(pattern) {
    this.pattern = pattern;
  }

  [Symbol.replace](string, replacement) {
    return string.split(this.pattern).join(replacement);
  }
}
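String.prototype.replace consults this hook: when its search argument is an object implementing Symbol.replace, the call delegates to that method. A runnable version of the example, with a constructor added so the instance carries its pattern:

```javascript
class Replaceable {
  constructor(pattern) {
    this.pattern = pattern;
  }

  // String.prototype.replace calls this when given a Replaceable.
  [Symbol.replace](string, replacement) {
    return string.split(this.pattern).join(replacement);
  }
}

const result = "a-b-c".replace(new Replaceable("-"), "+");
console.log(result); // "a+b+c"
```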
These symbols are effectively eternal. They are part of the language
specification and will not change. @rcompat/symbol extends this idea to the
rcompat ecosystem, exposing well-known symbols that packages and their
consumers can coordinate around:
import symbol from "@rcompat/symbol";
// Implement this to signal you can be validated by an rcompat-aware validation library
class MyPayload {
  #value: string;

  constructor(value: string) {
    this.#value = value;
  }

  [symbol.parse]() {
    // parseable representation of this object
    return this.#value;
  }
}
// Implement this to signal that @rcompat/fs can call [symbol.stream]()
// to retrieve a ReadableStream representation of your object
class MyPayload {
  [symbol.stream]() {
    return new ReadableStream({ /* ... */ });
  }
}
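The point of such symbols is that the consuming library never needs to import your class; it only needs the shared symbol. A sketch of the consumer side, using a locally registered symbol as a stand-in (the registry key here is made up, not @rcompat/symbol's actual key):

```javascript
// Illustrative stand-in for a shared well-known symbol.
const parse = Symbol.for("demo:symbol/parse");

class MyPayload {
  #value;

  constructor(value) {
    this.#value = value;
  }

  [parse]() {
    return this.#value;
  }
}

// A validation library can accept anything implementing the symbol,
// without knowing about MyPayload at all.
function toParseable(x) {
  return typeof x?.[parse] === "function" ? x[parse]() : x;
}

const parsed = toParseable(new MyPayload("42"));
console.log(parsed); // "42"
```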
@rcompat/symbol guarantees the same permanence the language standard does. A
symbol published under this package is a contract, not an implementation detail.
instanceof fragility
instanceof in JavaScript is more fragile than it appears. If two copies of
the same package end up in the dependency tree — a common occurrence with npm,
where semver ranges can resolve to different patch versions across packages —
an object created by one copy will fail an instanceof check against the class
from the other:
// Two copies of the same package in the dependency tree,
// resolved to different patch versions by npm
import { FileRef as FileRefA } from "@rcompat/fs"; // v1.0.0, via dependency A
import { FileRef as FileRefB } from "@rcompat/fs"; // v1.0.1, via dependency B

const file = new FileRefA("file.txt");
console.log(file instanceof FileRefB); // false, even though they are "the same" class
The same problem can arise with built-in types across different JavaScript
realms — separate vm.Context instances in Node, for example — where even
[] instanceof Array can return false.
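The realm case is easy to reproduce in Node with the built-in vm module:

```javascript
import vm from "node:vm";

// An array created in a separate realm is a real array...
const foreign = vm.runInNewContext("[]");
console.log(Array.isArray(foreign)); // true

// ...but it was constructed by that realm's Array, not ours.
console.log(foreign instanceof Array); // false
```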
rcompat addresses this for custom objects with branded is checks. Classes
expose a static is method backed by a versioned Symbol.for brand:
import is from "@rcompat/is";

const brand = Symbol.for("std:fs/FileRef/v0");

class FileRef {
  [brand] = true;

  static is(x: unknown): x is FileRef {
    return is.versioned(x, brand);
  }
}
is.versioned resolves three cases cleanly:
- The brand matches exactly → true
- The value carries no such brand → false
- The value carries a different version of the brand (e.g. "std:fs/FileRef/v1") → throws a descriptive error indicating a version mismatch
This gives @rcompat/fs the ability to publish a new major version that
remains backward-compatible with older FileRef instances — but to explicitly
and loudly break when the interface has changed so fundamentally that silent
interop would be unsafe.
Reusable errors
The same philosophy extends to error handling. @rcompat/error exposes
CodeError, a structured error class that carries a machine-readable code
alongside its message:
import TemplateError from "#TemplateError";
import is from "@rcompat/is";
const brand = Symbol.for("std:error/CodeError/v0");
type Code = string;
export default class CodeError extends TemplateError {
  #code: Code;

  [brand] = true;

  constructor(code: Code, strings: TemplateStringsArray, ...params: unknown[]) {
    super(strings, ...params);
    this.#code = code;
  }

  get code() {
    return this.#code;
  }

  static is(x: unknown): x is CodeError {
    return is.branded(x, brand);
  }

  static matches(x: unknown, code: Code): x is CodeError {
    return CodeError.is(x) && x.code === code;
  }
}
A framework can throw a CodeError with a well-known code, and any layer of
the stack — middleware, error boundaries, logging infrastructure — can catch
and handle it precisely, without resorting to string matching on error messages
or fragile instanceof checks:
try {
  await handleRequest(req);
} catch (err) {
  if (CodeError.matches(err, "fs/not-found")) {
    return respond(404, "Not found");
  }
  throw err;
}
Because CodeError uses the same branded is pattern, it is safe across
package versions and module realms.
The long view
Each piece of rcompat — the runtime abstraction, the conditional exports, the eternal symbols, the branded identity checks, the structured errors — is designed to be self-consistent and composable. Together they sketch out what a genuine standard library for server-side JavaScript could look like: stable interfaces, native performance on each runtime, and no dependency on any one runtime winning.
The long-term ambition of the rcompat authors is to see these patterns
standardised. The std: namespace used in rcompat's brand symbols is not
accidental — it anticipates a future where runtimes formally adopt shared
symbols and interfaces under a common prefix, in the same way that WHATWG's
URL, Fetch, and Streams APIs moved from browser-specific implementations into
shared specifications that all runtimes now implement.
Whether that standardisation happens through WHATWG, a new working group, or gradual de facto convergence, rcompat intends to be there already — a working proof that the coordination problem is solvable, and that JavaScript's fragmented server landscape does not have to be permanent.