JavaScript Core
intermediate
#structuredClone #performance #serialization

Structured Clone vs JSON.stringify: Choosing the Right Tool for Cloning and Passing Data

JSON is convenient but lossy; structured cloning preserves complex graphs and enables zero‑copy transfers. Learn a mental model, decision rules, and a step‑by‑step path from quick JSON hacks to robust, measurable solutions.

October 20, 2025
4 min read
Share this article:

Cloning is deceptively hard. JSON.stringify/parse is familiar and often “good enough,” but it silently drops functions and undefined, flattens Map and Set to empty objects, throws on circular references, and loses prototypes. It also produces strings, not objects. Modern runtimes ship structuredClone, which preserves most built‑ins correctly and unlocks zero‑copy transfers for big data. This article gives you a mental model, decision rules, and a progressive path with measurement so you know what to use and why.

TL;DR

  • Use structuredClone for correctness (handles Map, Set, Date, circular refs, TypedArrays, File/Blob, etc.).
  • For very large ArrayBuffers/TypedArrays, transfer them for zero-copy performance.
  • JSON is still useful for human-readable snapshots and wire formats, but it’s not a correct clone.

Quick comparison

js
// JSON fails: circular refs, Map/Set, class instances, BigInt, etc.
const a = {}; a.self = a;
try { JSON.parse(JSON.stringify(a)) } catch (e) {
  console.log('JSON fails on circular references'); // ✅
}

// structuredClone: correct cyclic support
const b = {}; b.self = b;
const c = structuredClone(b);
console.log(c.self === c); // true

Objects handled by structuredClone:

  • Primitives, plain objects, arrays, Date, RegExp (lastIndex is not preserved), Map, Set, ArrayBuffer, typed arrays, DataView, Blob, File, FileList, ImageData, DOMException, URL, URLSearchParams, and more. Functions and DOM nodes are not cloneable.
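A quick sketch of several of these types surviving a clone with their real types intact:

```js
// Map, Set, Date, and typed arrays come back as real instances, deeply copied.
const original = {
  when: new Date('2025-10-20T00:00:00Z'),
  tags: new Set(['perf', 'clone']),
  scores: new Map([['a', 1]]),
  bytes: new Uint8Array([1, 2, 3]),
};

const copy = structuredClone(original);

console.log(copy.when instanceof Date);        // true
console.log(copy.tags.has('perf'));            // true
console.log(copy.scores.get('a'));             // 1
console.log(copy.bytes instanceof Uint8Array); // true
console.log(copy.bytes !== original.bytes);    // true (a deep copy, not a shared view)
```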

Mental model

  • JSON is a serialization format producing strings based on a restricted subset of JavaScript values.
  • Structured cloning is a graph copy algorithm across realms/threads that preserves identity for many built‑ins and supports cycles.
  • Transferables hand over ownership of their backing memory; after transfer, the sender cannot use it (buffer becomes “detached”).
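To make the first point concrete, here is how a JSON round trip flattens values into that restricted subset:

```js
// JSON keeps only its own subset: strings, numbers, booleans, null, arrays, objects.
const state = {
  when: new Date('2025-10-20T00:00:00Z'),
  scores: new Map([['a', 1]]),
  note: undefined,
};

const roundTripped = JSON.parse(JSON.stringify(state));

console.log(typeof roundTripped.when); // 'string' (Date collapsed to ISO text)
console.log(roundTripped.scores);      // {} (Map contents silently lost)
console.log('note' in roundTripped);   // false (undefined properties dropped)
```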

Transferables: zero-copy speed

js
const buf = new ArrayBuffer(1024 * 1024 * 32); // 32MB
const view = new Uint8Array(buf);
// Fill view...

// Transfer ownership—source becomes detached.
// In Workers, use postMessage(value, [transferList]).
const cloned = structuredClone(view, { transfer: [view.buffer] });
console.log(cloned.byteLength); // 33554432: the clone now owns the memory
console.log(view.byteLength, view.buffer.byteLength); // 0 0: source is detached

  • Transfer avoids copying; useful in Workers, WebTransport, and WebRTC data channels.
  • Post-transfer, the source ArrayBuffer is detached and unusable.

Decision rules

  • In‑memory duplicates for complex graphs → structuredClone.
  • Passing large typed arrays/ArrayBuffers across threads → structuredClone with transfer.
  • Human‑readable persistence/logging or strict schema on the wire → JSON.
  • Legacy environments without structured clone → JSON (or a polyfill) with explicit, known schemas.

Performance notes

  • Small to medium objects: structuredClone competitive with JSON, often faster for complex shapes.
  • Large nested graphs or typed arrays: structuredClone is usually superior; transferring skips the copy entirely, so cost stays constant regardless of buffer size.
  • JSON can be faster for flat, primitive-only data and produces a string (useful for caching/logging).

Progressive path and fallback strategy

Start here and graduate as needs grow:

  1. Prototype with JSON for simple, primitive‑only state or logs.
  2. Move to structuredClone when graphs become complex or you cross worker boundaries.
  3. Add transfer for very large buffers.
  4. For legacy targets, include a vetted polyfill.

js
export function clone(value) {
  if (typeof globalThis.structuredClone === 'function') {
    return globalThis.structuredClone(value);
  }
  // Minimal fallback: handles cycles and common built-ins via a library or a small polyfill.
  // For production, prefer a vetted lib (e.g., @ungap/structured-clone) when legacy support is required.
  return JSON.parse(JSON.stringify(value)); // Last-resort: lossy
}

Pitfalls

  • Functions, DOM nodes, and proxies are not cloneable; they throw DataCloneError.
  • Class instances are cloned as plain objects: own properties survive, the prototype chain does not.
  • RegExp.lastIndex is not preserved; WeakMap/WeakSet cannot be cloned.
  • Transferring detaches source buffers—guard call sites to avoid accidental reuse.
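One way to guard call sites is a small helper. A sketch, assuming a runtime with the ES2024 `ArrayBuffer.prototype.detached` getter and falling back to a weaker `byteLength` check elsewhere:

```js
// Throw early if a buffer has already been transferred away.
// Note: byteLength === 0 is an imperfect fallback (a genuinely empty
// buffer also reports 0), but detached buffers always report 0.
function assertUsable(buffer) {
  const detached = 'detached' in ArrayBuffer.prototype
    ? buffer.detached
    : buffer.byteLength === 0;
  if (detached) {
    throw new TypeError('ArrayBuffer was transferred and is now detached');
  }
  return buffer;
}

const buf = new ArrayBuffer(16);
structuredClone(new Uint8Array(buf), { transfer: [buf] });
// assertUsable(buf); // throws: buf was detached by the transfer above
```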

Worker pattern example

js
// main.js
const worker = new Worker('worker.js', { type: 'module' });
const buf = new Uint8Array(1024 * 1024 * 8).buffer; // 8MB
worker.postMessage({ buf }, [buf]); // transfer

// worker.js
self.onmessage = (e) => {
  const { buf } = e.data; // owns buf now
  const view = new Uint8Array(buf);
  // process...
  // Optionally transfer back:
  postMessage({ result: view.buffer }, [view.buffer]);
};

Measuring results

  • Use performance.now() around clone calls to compare small vs large payloads.
  • In DevTools, watch memory when cloning large graphs repeatedly; look for detached buffers when transferring.
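A minimal harness along those lines (numbers are illustrative and vary by runtime and payload shape):

```js
// Time a cloning strategy over several iterations with performance.now().
function time(fn, iterations = 50) {
  const start = performance.now();
  for (let i = 0; i < iterations; i++) fn();
  return performance.now() - start;
}

const payload = {
  items: Array.from({ length: 10_000 }, (_, i) => ({ i, tag: `t${i}` })),
};

const jsonMs = time(() => JSON.parse(JSON.stringify(payload)));
const cloneMs = time(() => structuredClone(payload));
console.log(`JSON: ${jsonMs.toFixed(1)}ms, structuredClone: ${cloneMs.toFixed(1)}ms`);
```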

When to keep JSON

  • APIs expecting strings, human-auditable logs/snapshots, or storage where readability matters.
  • Ensure you control the schema and accept lossy conversion.
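If JSON is the right tool but you need richer types on the wire, one common pattern is a tagged encoding via replacer/reviver. The `__type` tag below is an illustrative convention, not part of any spec:

```js
// Encode Maps into a tagged JSON shape, and restore them on parse.
const replacer = (key, value) =>
  value instanceof Map ? { __type: 'Map', entries: [...value] } : value;

const reviver = (key, value) =>
  value && value.__type === 'Map' ? new Map(value.entries) : value;

const text = JSON.stringify({ scores: new Map([['a', 1]]) }, replacer);
const back = JSON.parse(text, reviver);
console.log(back.scores.get('a')); // 1
```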

Use structuredClone for correctness by default; reach for JSON when producing human‑readable text or interoperating with APIs where you control the schema.