import ddn.data.json : parseJSON, toJSON, JsonError;
import ddn.var : var;
JsonError err;
var v;
assert(parseJSON(v, `{"a":1, "b":"x"}`, err));
assert(v["a"].as!long == 1);
// Serialize back to JSON
auto jsonText = toJSON(v); // e.g. `{"a":1,"b":"x"}`

ddn.data.json
High-Performance Strict JSON Parser/Writer
This module provides a strictly RFC 8259 compliant JSON parser and writer, optimized for performance. Unlike ddn.data.json5, this module does not support JSON5 extensions such as comments, trailing commas, unquoted keys, single-quoted strings, hex numbers, or special numbers (NaN/Infinity).
Mapping rules to var:
- Objects: map to `var.Type.OBJECT` with `string` keys and `var` values.
- Arrays: map to `var.Type.ARRAY` with elements converted to `var` recursively.
- Strings: map to `var.Type.STRING` (double-quoted only).
- Booleans: map to `var.Type.BOOL`.
- null: maps to `var.Type.NULL`.
- Numbers: integers use `Type.LONG` or `Type.ULONG`; decimals use `Type.DOUBLE`.
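The mapping rules above can be exercised with a small parse; this is a sketch that assumes `var` also provides `as!double`, by analogy with the documented `as!long`:

```d
import ddn.data.json : parseJSON;
import ddn.var : var;

void main()
{
    // One document exercising each JSON value kind.
    var v = parseJSON(`{"s":"hi","n":1,"d":2.5,"t":true,"z":null,"a":[1,2]}`);

    // Integers land in Type.LONG (or Type.ULONG); decimals in Type.DOUBLE.
    assert(v["n"].as!long == 1);
    assert(v["d"].as!double == 2.5); // as!double is assumed here
}
```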
Writer rules:
- Keys are always double-quoted per RFC 8259.
- Values are emitted compactly by default; pretty printing is configurable.
- Deterministic output is achieved by lexicographically sorting object keys by default.
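The writer rules above can be sketched as follows, using the `JsonWriteOptions` fields documented on this page (the exact compact form assumes the default sorted-key, no-whitespace output):

```d
import ddn.data.json : parseJSON, toJSON, JsonWriteOptions;
import ddn.var : var;

void main()
{
    var v = parseJSON(`{"b":2,"a":1}`);

    // Compact by default; keys sorted lexicographically for determinism.
    assert(toJSON(v) == `{"a":1,"b":2}`);

    // Opt in to multi-line pretty printing.
    JsonWriteOptions opts;
    opts.pretty = true;
    opts.indent = "  ";
    auto text = toJSON(v, opts);
}
```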
Examples
See Also
ddn.data.json5 for JSON5 support with extensions.

Types (15)
Policy for handling duplicate keys in JSON objects.
JSON objects with duplicate keys are technically valid per RFC 8259, but their behavior is implementation-defined. This enum controls how the parser handles such cases.
Policy controlling parser and writer behavior.
This struct provides configuration options that affect how JSON is parsed and serialized, allowing customization for different use cases.
- bool preferSignedIntegers: Prefer signed integers when a value fits both signed and unsigned.
- size_t maxDepth: Maximum nesting depth allowed during parsing to prevent runaway recursion.
- JsonDuplicateKeyPolicy duplicateKeyPolicy: How to handle duplicate keys in objects.
- bool sortKeysOnWrite: Writer: sort object keys for deterministic output.
- bool asciiOnlyWrite: Writer: emit only ASCII by escaping non-ASCII code points.
- bool prettyPrint: Writer: enable multi-line pretty printing with indentation.
- string indent: Writer: indentation string for pretty printing (used only when `prettyPrint`).

Options controlling JSON serialization behavior.
This struct provides fine-grained control over how values are serialized to JSON text.
- bool pretty: Pretty-print output with indentation.
- string indent: Indentation string for pretty printing (used only when `pretty`).
- bool asciiOnly: Escape non-ASCII code points as \uXXXX.
- bool sortKeys: Sort object keys for deterministic output.
- size_t maxDepth: Maximum nesting depth allowed during writing to prevent runaway recursion. When exceeded, the writer emits `null` at the offending position.

Reusable scratch buffers for JSON writing.
This structure allows caller-controlled reuse of temporary allocations across repeated serializations, reducing GC pressure for high-throughput scenarios.
Currently used for:
- Deterministic object serialization (`sortKeys == true`) via per-depth key buffers.
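A sketch of reusing one scratch instance across repeated serializations, based on the `toJSON` overload taking a `ref JsonWriteScratch` documented on this page:

```d
import ddn.data.json : parseJSON, toJSON, JsonWriteOptions, JsonWriteScratch;
import ddn.var : var;

void main()
{
    JsonWriteScratch scratch; // reused across calls to reduce GC pressure
    JsonWriteOptions opts;
    opts.sortKeys = true;     // scratch key buffers are used for sorting

    foreach (src; [`{"b":2,"a":1}`, `{"y":0,"x":9}`])
    {
        var v = parseJSON(src);
        // Per-depth key buffers inside `scratch` are cleared and reused.
        auto text = toJSON(v, opts, scratch);
    }
}
```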
- string[][] keyBuffers: Per-depth buffers used to collect and sort object keys.

Token kinds produced by the JSON lexer.
These represent the fundamental syntactic elements of JSON text as defined by RFC 8259.
A token produced by the JSON lexer.
Tokens represent individual syntactic elements extracted from the input. The lexeme field is a slice into the original source, avoiding allocation during tokenization.
- JsonTokenKind kind: The kind of token.
- const(char)[] lexeme: Slice into the source text containing the token's characters.
- size_t line: 1-based line number where the token starts.
- size_t column: 1-based column number where the token starts.
- size_t startIndex: 0-based byte offset where the token starts.
- size_t endIndex: 0-based byte offset where the token ends (exclusive).

High-performance JSON lexer over a UTF-8 buffer.
The lexer tokenizes strict JSON according to RFC 8259:
- Skips ASCII whitespace only (space, tab, newline, carriage return).
- No BOM handling — strict JSON does not allow BOM.
- No comment handling — JSON does not support comments.
- Returns slices for strings and numbers without allocating.
- Tracks line/column for diagnostics.
Performance optimizations:
- Compile-time lookup tables for character classification.
- Single-pass tokenization without backtracking.
- Zero allocations during tokenization (returns slices).
- private const(char)[] _src
- private size_t _i
- private size_t _line
- private size_t _col
- this(const(char)[] input): Construct a lexer over the given input.

Reviver delegate type for transforming parsed values.
The callback is invoked in post-order (children first), similar to JavaScript's JSON.parse(text, reviver).
Parameters
key | Object key or array index (as decimal string). Root is passed as `""`. |
value | Current value. |
Returns
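A sketch of a reviver that doubles every integer leaf. It assumes, as in JavaScript's reviver, that the returned value replaces the parsed one, and that `var` exposes its kind via a `type` property and is constructible from a `long`:

```d
import ddn.data.json : parseJSON, JsonReviver, JsonError;
import ddn.var : var;

void main()
{
    JsonError err;
    var v;

    // Called children-first for every parsed value.
    JsonReviver reviver = (const string key, var value) {
        if (value.type == var.Type.LONG)       // assumed `type` property
            return var(value.as!long * 2);     // assumed var(long) constructor
        return value;                          // pass everything else through
    };

    assert(parseJSON(v, `{"a":1,"b":[2,3]}`, reviver, err));
}
```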
Internal parser state for JSON parsing.
- private JsonLexer _lexer
- private const(char)[] _src
- private JsonPolicy _policy
- private JsonToken _current
- private size_t _depth
- this(const(char)[] input, JsonPolicy policy)

Callbacks for SAX-style JSON parsing.
Any callback may be left null to ignore that event.
Event semantics:
- onObjectStart/onObjectEnd are emitted for `{` / `}`.
- onArrayStart/onArrayEnd are emitted for `[` / `]`.
- onKey is emitted for each object property key, before its value.
- onValue is emitted for scalar values only (null/bool/number/string).
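A minimal sketch of these event semantics, leaving the structural callbacks null (which, per the note above, ignores those events):

```d
import ddn.data.json : parseJsonSax, JsonSaxHandler, JsonError;
import ddn.var : var;

void main()
{
    string[] keys;
    size_t scalars;

    JsonSaxHandler handler;
    handler.onKey   = (const string key) { keys ~= key; };
    handler.onValue = (const var value)  { ++scalars; };
    // onObjectStart/onObjectEnd/onArrayStart/onArrayEnd left null: ignored.

    JsonError err;
    assert(parseJsonSax(`{"a":1,"b":[true,null]}`, handler, err));
    assert(keys == ["a", "b"]);
    assert(scalars == 3); // scalars only: 1, true, null
}
```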
- void delegate() @safe onObjectStart: Called when an object begins (`{`).
- void delegate() @safe onObjectEnd: Called when an object ends (`}`).
- void delegate() @safe onArrayStart: Called when an array begins (`[`).
- void delegate() @safe onArrayEnd: Called when an array ends (`]`).
- void delegate(const string key) @safe onKey: Called for each object key.
- void delegate(const var value) @safe onValue: Called for each scalar value (null, bool, number, string).

Result of a streaming JSON parse attempt.
Incremental/streaming JSON parser that accepts input in chunks.
This API is intended for large inputs or network streams where the complete document may not be available at once.
Notes:
- The parser produces at most one root JSON value.
- After a successful parse (JsonStreamResult.OK), the instance becomes done; call reset() to parse another document.
- When finish() has not been called yet, parse failures that look like "unexpected end of input" are reported as NEED_MORE.
Examples
import ddn.data.json : JsonStreamParser, JsonStreamResult, JsonError;
import ddn.var : var;
JsonStreamParser sp;
sp.push(`{"a":1,`);
var v; JsonError err;
assert(sp.tryParse(v, err) == JsonStreamResult.NEED_MORE);
sp.push(`"b":2}`);
assert(sp.tryParse(v, err) == JsonStreamResult.OK);
assert(v["b"].as!long == 2);

- JsonStreamResult tryParse(out var value, out JsonError err) @safe: Attempt to parse the buffered data as a single JSON root value.
- bool _looksIncomplete(const ref JsonError e) const @safe: Heuristic: detect failures likely caused by an incomplete final chunk.
- this(const JsonPolicy policy): Construct a streaming parser using `policy`.

Replacer delegate type for transforming/filtering values during serialization.
The callback is invoked for each value during JSON serialization, allowing transformation or filtering of values before they are written.
Parameters
key | Object key or array index (as decimal string). Root is passed as `""`. |
value | Current value to be serialized. |
Returns
var.init (null) to omit the key from output (only effective for object properties, not array elements).
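A sketch of a replacer that drops a sensitive property, using the documented `var.init` convention (the replacer's parameter types are assumed to mirror the reviver's):

```d
import ddn.data.json : parseJSON, toJSON, JsonReplacer;
import ddn.var : var;

void main()
{
    var v = parseJSON(`{"name":"x","secret":"hide me"}`);

    // Returning var.init omits the key from the output.
    JsonReplacer replacer = (const string key, var value) {
        if (key == "secret")
            return var.init;   // omit this object property
        return value;          // serialize everything else as-is
    };

    auto text = toJSON(v, replacer);
}
```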
Error codes for JSON parsing and writing operations.
These codes categorize the types of errors that can occur during JSON processing, enabling programmatic error handling.
A non-throwing error report produced by the JSON parser/writer.
This struct captures detailed information about parsing or writing errors, including position information for diagnostics.
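A sketch of inspecting a `JsonError` after a strict-mode failure (a trailing comma, which RFC 8259 forbids); the exact positions reported are the parser's choice, so only coarse checks are shown:

```d
import ddn.data.json : parseJSON, JsonError;
import ddn.var : var;

void main()
{
    var v;
    JsonError err;

    // Trailing commas are rejected by the strict parser.
    assert(!parseJSON(v, `{"a":1,}`, err));

    // Position fields pinpoint the offending input for diagnostics.
    assert(err.line == 1);     // single-line input
    assert(err.column > 0);
    // err.code, err.message, and err.context carry the details.
}
```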
- size_t line: 1-based line number where the error occurred (0 if unknown).
- size_t column: 1-based column number where the error occurred (0 if unknown).
- size_t index: 0-based byte offset into the original input (0 if unknown).
- JsonErrorCode code: Machine-readable error category.
- string message: Human-readable message.
- string context: Excerpt of the source around `index` with a caret line, if available.
- this(size_t line, size_t column, string message, JsonErrorCode code = JsonErrorCode.UNKNOWN, size_t index = 0, string context = ""): Construct an error report.

Functions (33)
- string[] _scratchKeys(ref JsonWriteScratch scratch, const size_t depth) ref @safe: Obtain a cleared scratch key buffer for a given writer recursion `depth`.
- bool decodeJsonString(const(char)[] lexeme, out string decoded, out string errorMsg) @safe: Decode a JSON string literal, processing escape sequences.
- bool _parseDouble(const(char)[] s, out double result) @trusted: Parse a string as a double value.
- bool _parseDoubleStdConv(const(char)[] s, out double result) @safe: Standard library fallback for `_parseDouble`.
- bool parseJsonNumber(const(char)[] lexeme, out var result, out string errorMsg, bool preferSigned = true) @safe: Parse a JSON number literal and return the appropriate `var` type.
- bool parseJSON(out var value, const(char)[] input, out JsonError err, JsonPolicy policy = JsonPolicy.init) @safe: Parse JSON text into a `var` value.
- var parseJSON(const(char)[] input) @safe: Convenience overload that parses JSON and returns the value directly.
- bool parseJSON(out var value, const(char)[] input, JsonReviver reviver, out JsonError err, JsonPolicy policy = JsonPolicy.init) @safe: Parse JSON text and apply `reviver` to each value.
- void _applyReviver(ref var value, const string key, scope JsonReviver reviver) @safe: Apply `reviver` to `value` in post-order traversal.
- bool parseJsonSax(const(char)[] input, ref JsonSaxHandler handler, out JsonError err, const JsonPolicy policy = JsonPolicy.init) @safe: Parse JSON text and emit SAX-style events into `handler`.
- bool isValidJSON(const(char)[] input, out JsonError err, JsonPolicy policy = JsonPolicy.init) @safe: Validate JSON text without building a value tree.
- bool isValidJSON(const(char)[] input) @safe: Convenience overload that validates JSON without returning error details.
- bool _validateValue(const(char)[] src, ref JsonLexer lx, ref JsonToken cur, out JsonError err, const JsonPolicy policy, size_t depth) @safe: Validate a JSON value without building a var tree.
- bool _validateArray(const(char)[] src, ref JsonLexer lx, ref JsonToken cur, out JsonError err, const JsonPolicy policy, size_t depth) @safe: Validate a JSON array structure.
- bool _validateObject(const(char)[] src, ref JsonLexer lx, ref JsonToken cur, out JsonError err, const JsonPolicy policy, size_t depth) @safe: Validate a JSON object structure.
- bool minify(out string output, const(char)[] input, out JsonError err) @safe: Minify JSON text by removing all optional whitespace.
- string minify(const(char)[] input) @safe: Convenience overload that minifies JSON and returns the result directly.
- string toJSON(const var value, JsonWriteOptions opts = JsonWriteOptions.init) @safe: Serialize a `var` value to a JSON string.
- string toJSON(const var value, JsonWriteOptions opts, ref JsonWriteScratch scratch) @safe: Serialize a `var` value to JSON using the provided scratch buffers.
- void writeJSON(W)(ref W output, const var value, JsonWriteOptions opts = JsonWriteOptions.init) @safe: Write a `var` value as JSON to an output range.
- string toJSON(const var value, JsonReplacer replacer, JsonWriteOptions opts = JsonWriteOptions.init) @safe: Serialize a `var` value to JSON with a replacer function.
- var _applyReplacer(const var value, const string key, scope JsonReplacer replacer) @safe: Apply replacer to a value recursively.
- void _writeJsonImpl(W)(const var value, ref W buf, const JsonWriteOptions opts, uint depth, ref JsonWriteScratch scratch) @safe: Internal implementation of JSON writing.
- void _writeDouble(W)(double d, ref W buf) @safe: Write a double value in RFC 8259 compliant format.
- void _writeString(W)(scope const(char)[] s, ref W buf, bool asciiOnly) @safe: Write a string value with proper escaping.
- void _writeArray(W)(const var value, ref W buf, const JsonWriteOptions opts, uint depth, ref JsonWriteScratch scratch) @safe: Write an array value.
- void _writeObject(W)(const var value, ref W buf, const JsonWriteOptions opts, uint depth, ref JsonWriteScratch scratch) @safe: Write an object value.
- void _writeIndent(W)(ref W buf, string indent, uint depth) @safe: Write indentation for pretty printing.
- bool _isUtf8Continuation(char b) @safe pure nothrow @nogc: Check whether a byte is a UTF-8 continuation byte (10xxxxxx pattern).
- size_t _alignUtf8Start(const(char)[] src, size_t pos, size_t minPos) @safe pure nothrow @nogc: Adjust a position backwards to align with a UTF-8 character boundary.
- size_t _alignUtf8End(const(char)[] src, size_t pos, size_t maxPos) @safe pure nothrow @nogc: Adjust a position forwards to align with the end of a UTF-8 character.
- string _jsonContextWindow(const(char)[] src, size_t index, size_t context = 30) @safe: Build a context window for diagnostics.
- JsonError _makeError(const(char)[] src, const size_t line, const size_t column, const size_t index, const JsonErrorCode code, const string msg) @safe: Build a structured error with a context window.

Variables (5)
- bool[256] _jsonWhitespace: Lookup table for JSON whitespace characters.
RFC 8259 defines whitespace as: space (0x20), tab (0x09), newline (0x0A), carriage return (0x0D). This table allows O(1) whitespace detection instead of multiple comparisons.
- bool[256] _jsonDigit: Lookup table for decimal digits (0-9).
- bool[256] _jsonHexDigit: Lookup table for hexadecimal digits (0-9, a-f, A-F).
- ubyte[256] _jsonHexValue: Lookup table for hex digit values (0-15, or 0xFF for invalid).
- char[256] _jsonEscapeChar: Lookup table for valid single-character escape sequences.
Maps an escape character to its decoded value, or 0 for invalid. Valid escapes: `"`, `\`, `/`, `b`, `f`, `n`, `r`, `t`.
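These are the sequences `decodeJsonString` accepts; a sketch of decoding a `\uXXXX` escape, assuming the lexeme excludes the surrounding double quotes:

```d
import ddn.data.json : decodeJsonString;

void main()
{
    string decoded, errorMsg;

    // \u0041 decodes to 'A'; invalid escapes would fail with errorMsg set.
    assert(decodeJsonString(`a\u0041b`, decoded, errorMsg));
    assert(decoded == "aAb");
}
```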