I Benchmarked JSON Parsing in Bun, Node, Rust, and Go

I’m just going to start posting about JSON every day. Well ok, maybe not every day, but for the next few days at least. Later this week I’ve committed to writing a guide on getting started with CLIs for non-programmers, so stay tuned for that.

This morning I benchmarked JSON parsing across four runtimes: Bun, Node, Rust, and Go.
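For the JS runtimes, the harness boils down to timing repeated `JSON.parse` calls and dividing bytes by seconds. Here’s a minimal sketch of that idea (the payload here is synthetic and much smaller than the real test files, and the exact iteration counts are placeholders, not my actual setup):

```javascript
// Build a synthetic JSON payload. The real benchmark used large files
// loaded from disk; this stand-in just makes the sketch self-contained.
const payload = JSON.stringify(
  Array.from({ length: 10_000 }, (_, i) => ({
    id: i,
    name: `item-${i}`,
    tags: ["a", "b"],
  }))
);
const bytes = Buffer.byteLength(payload);

// Warm up so the JIT has seen the hot path before we start timing.
for (let i = 0; i < 10; i++) JSON.parse(payload);

// Time repeated parses and report throughput in MB/s.
const iterations = 100;
const start = process.hrtime.bigint();
for (let i = 0; i < iterations; i++) {
  JSON.parse(payload);
}
const elapsedNs = Number(process.hrtime.bigint() - start);
const mbPerSec = (bytes * iterations) / (elapsedNs / 1e9) / (1024 * 1024);
console.log(`${mbPerSec.toFixed(1)} MB/s`);
```

The same script runs unmodified under both Node and Bun, which is what makes the JS-side comparison easy.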

The Results

  • Bun is the overall winner on large files — 307-354 MB/s, beating even Rust’s serde_json for untyped parsing
  • Rust wins on small/nested data (225 MB/s small, 327 MB/s nested) due to low overhead
  • Node is close behind Bun — V8’s JSON.parse is very optimized
  • Go is ~3x slower than the JS runtimes on large payloads (encoding/json is notoriously slow)
  • Memory: Bun reports 0 delta (likely GC reclaims before measurement), Rust’s tracking allocator shows the true heap cost (73-96MB), Go uses 52-65MB

Rust’s numbers were the most honest here, since the tracking allocator catches everything. We should take Bun’s result with a grain of salt, because benchmarking memory in GC’d languages is tricky.
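Here’s a quick sketch of why GC’d memory numbers are slippery: the heap delta you measure depends entirely on whether the collector has run between your two snapshots. (Run this under Node with `--expose-gc` to see the second measurement; the flag and `process.memoryUsage()` are standard Node, but the numbers you get will vary run to run.)

```javascript
// Measure heapUsed before and after a parse, then again after
// forcing a GC. Once the parsed object is unreferenced, the
// post-GC delta can collapse toward zero, which is likely why
// Bun's measured delta came out as 0.
const payload = JSON.stringify(
  Array.from({ length: 50_000 }, (_, i) => ({ id: i }))
);

const before = process.memoryUsage().heapUsed;
let parsed = JSON.parse(payload);
const after = process.memoryUsage().heapUsed;
console.log(`delta while live: ${((after - before) / 1024 / 1024).toFixed(2)} MB`);

parsed = null; // drop the only reference
if (global.gc) {
  global.gc(); // only available with node --expose-gc
  const afterGc = process.memoryUsage().heapUsed;
  console.log(`delta after GC: ${((afterGc - before) / 1024 / 1024).toFixed(2)} MB`);
}
```

A tracking allocator like Rust’s doesn’t have this problem: it counts every allocation as it happens, so the number can’t be erased by a collection.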

The JSON parser in Node’s V8 is the exact same one that ships in Chrome…

Here are the full test results if you want to dig into the numbers yourself.

More JSON content coming soon. You’ve been warned.

/ Programming / Bun / Node / Json / Benchmarks / Rust / Go