
Online JSON Formatter Guide: Beautify, Minify, Validate, and Handle Large JSON Files

13 min read · by DevToolBox

TL;DR

A JSON formatter transforms compact, single-line JSON into readable, indented text. Use JSON.stringify(obj, null, 2) in JavaScript, json.dumps(obj, indent=2) in Python, or jq on the command line. For instant formatting without installing anything, try our free JSON Formatter tool. This guide covers prettifying, minifying, validating, JSON5/JSONC, streaming large files, editor integrations, CI/CD linting, JSON Schema, and performance benchmarks.

Key Takeaways

  • JSON formatting (pretty-printing) adds indentation and line breaks without changing the data.
  • JSON.stringify(obj, null, 2) is the built-in way to format JSON in JavaScript and TypeScript.
  • Minification strips whitespace to reduce payload size by 10-30% for production APIs.
  • Online formatters provide zero-install formatting, validation, and error detection in the browser. Try it free.
  • CLI tools like jq and python -m json.tool format JSON from pipes and files instantly.
  • JSON5 and JSONC allow comments and trailing commas, but standard JSON does not.
  • For files over 100 MB, use streaming parsers (jq --stream, Python ijson, Go json.Decoder) to avoid memory issues.
  • JSON Schema validation catches structural errors beyond syntax, enforcing types, required fields, and value constraints.

What Is JSON Formatting and Why Does It Matter?

JSON (JavaScript Object Notation) is the universal data interchange format for web APIs, configuration files, databases, and message queues. When a REST API returns a response, the JSON payload is typically minified: all whitespace is stripped to minimize bandwidth. A response like {"users":[{"id":1,"name":"Alice","roles":["admin","dev"],"settings":{"theme":"dark","notifications":true}}]} is perfectly valid but nearly impossible to read.

JSON formatting (also called JSON prettifying, beautifying, or pretty-printing) is the process of adding consistent indentation, line breaks, and spacing to raw JSON text. The semantic content stays identical; only the visual presentation changes. This makes it possible to quickly identify nesting levels, spot missing commas, find mismatched brackets, and verify that the data structure matches your expectations.

Whether you call it a JSON formatter, JSON beautifier, JSON prettifier, or JSON pretty-printer, the operation is the same: parse the JSON string into a data structure, then serialize it back with whitespace and indentation.

→ Try our free JSON Formatter tool now

JSON.stringify with Indent Parameter

JavaScript and TypeScript provide a built-in method for formatting JSON. The JSON.stringify() function accepts three arguments: the value to serialize, an optional replacer function (or array of keys to include), and the number of spaces for indentation.

The third argument controls formatting. Pass 2 for two-space indentation (the most common convention), 4 for four spaces, or "\t" for tab-based indentation. When the space argument is omitted or set to 0, the output is minified.

// Format JSON with 2-space indentation
const data = {
  name: "Alice",
  age: 30,
  roles: ["admin", "developer"],
  settings: { theme: "dark", notifications: true }
};

// 2-space indent (most common)
console.log(JSON.stringify(data, null, 2));

// 4-space indent
console.log(JSON.stringify(data, null, 4));

// Tab indent
console.log(JSON.stringify(data, null, "\t"));

// Custom replacer: redact sensitive fields
const safeStringify = JSON.stringify(data, (key, value) => {
  if (key === "password" || key === "apiKey") return "[REDACTED]";
  return value;
}, 2);

The replacer parameter (second argument) is powerful for filtering output. You can pass an array of property names to include only specific keys, or a function that transforms values during serialization. For example, you might redact sensitive fields like passwords or API keys before logging.
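For instance, the array form of the replacer both filters and orders the output; a minimal sketch:

```javascript
// Array replacer: only the listed keys are kept (at every nesting
// level), and the output follows the array's order.
const user = { password: "hunter2", name: "Alice", age: 30 };
const picked = JSON.stringify(user, ["name", "age"], 2);
console.log(picked);
// The "password" property is omitted entirely.
```

Note that the filter applies recursively: nested objects are also reduced to the listed keys, while array elements are always included.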

Minification vs Prettification Trade-offs

Minification and prettification are two sides of the same coin. Understanding when to use each is essential for building efficient systems.

Prettified JSON is human-readable with indentation and line breaks. Use it during development, debugging, logging (when log volume is not critical), configuration files, documentation, and code reviews. The trade-off is increased byte size: a typical 2-space indented JSON file is 10-30% larger than its minified equivalent.

Minified JSON strips all unnecessary whitespace. Use it in production API responses, database storage, message queues, network transfers, and anywhere bandwidth or storage costs matter. Modern HTTP compression (gzip, Brotli) reduces the difference significantly, but minification still saves parsing time on the client side because the parser has fewer characters to process.

In practice, most systems minify JSON for wire transfer and prettify it on demand for human consumption. Your API should return minified JSON, and your debugging tools should format it on the fly.

// Minify JSON (no whitespace)
const minified = JSON.stringify(data);
// {"name":"Alice","age":30,"roles":["admin","developer"],...}

// Parse and re-format a JSON string
const raw = '{"name":"Alice","age":30}';
const parsed = JSON.parse(raw);
const pretty = JSON.stringify(parsed, null, 2);
/*
{
  "name": "Alice",
  "age": 30
}
*/

Online JSON Formatters vs CLI Tools

There are two main approaches to formatting JSON: browser-based online tools and command-line tools. Each has distinct advantages.

Online formatters require zero installation. Paste your JSON, get formatted output instantly. They typically include syntax validation, error highlighting with line numbers, customizable indentation (2 spaces, 4 spaces, tabs), one-click copy, and minification mode. All processing happens client-side in JavaScript, so your data never leaves the browser.

→ Format JSON in your browser

CLI tools integrate into your existing workflow. The two most popular are jq and Python:

jq is the Swiss Army knife for JSON on the command line. Run jq . file.json to pretty-print with color-coded syntax highlighting. Use jq -c . to minify. Extract specific fields with jq '.users[0].name'. Filter arrays with jq '.items[] | select(.price > 10)'. Install via brew install jq, apt install jq, or choco install jq.

# Pretty-print with jq (color-coded output)
jq . data.json

# Minify with jq
jq -c . data.json

# Extract a specific field
jq '.users[0].name' data.json

# Filter array elements
jq '.items[] | select(.price > 10)' data.json

# Pretty-print with Python (no install needed)
python3 -m json.tool data.json

# Pipe from curl to jq
curl -s https://api.example.com/users | jq .

# Validate JSON (exit code 0 = valid, non-zero = invalid)
jq empty config.json && echo "Valid" || echo "Invalid"

python -m json.tool ships with every Python installation. Run cat file.json | python3 -m json.tool to format and validate in one step. It reports syntax errors with line numbers. No additional installation required.

Other CLI tools worth mentioning: fx (interactive JSON browser, npm install -g fx), gron (makes JSON greppable), and jless (terminal JSON viewer with vim-like navigation).

JSON Validation: Syntax Errors, Trailing Commas, and Comments

Valid JSON must follow strict syntax rules defined in RFC 8259. Common errors that break JSON parsing include:

  • Trailing commas after the last element in an object or array: {"a": 1, "b": 2,}
  • Single quotes instead of double quotes: {'a': 1} is not valid JSON
  • Unquoted keys: {a: 1} must be {"a": 1}
  • Comments: JSON does not support // or /* */ comments
  • Trailing characters after the root value
  • NaN, Infinity, and undefined are not valid JSON values
  • Hexadecimal numbers: 0xFF is not valid JSON

When you paste invalid JSON into a formatter, the tool should report the exact error location with a line number and column. Our JSON Formatter validates your input automatically and highlights errors inline.

// Common JSON syntax errors and their fixes

// ERROR: Trailing comma
// { "a": 1, "b": 2, }   <-- trailing comma after 2
// FIX:
{ "a": 1, "b": 2 }

// ERROR: Single quotes
// { 'name': 'Alice' }
// FIX:
{ "name": "Alice" }

// ERROR: Unquoted keys
// { name: "Alice" }
// FIX:
{ "name": "Alice" }

// ERROR: Comments
// { "debug": true  // enable debug mode }
// FIX: Remove comments or use JSONC/JSON5
{ "debug": true }

// ERROR: JavaScript values
// { "value": undefined, "count": NaN, "max": Infinity }
// FIX:
{ "value": null, "count": 0, "max": 999999 }

JSON5 and JSONC (JSON with Comments)

The strict rules of standard JSON create friction for configuration files where human readability matters. Two extensions address this:

JSONC (JSON with Comments) adds support for single-line (//) and multi-line (/* */) comments plus trailing commas. VS Code uses JSONC for its settings files (settings.json, tsconfig.json). TypeScript configuration files are JSONC by default.

JSON5 is a superset of JSON that adds: single-line and multi-line comments, trailing commas, single-quoted strings, unquoted object keys (if valid identifiers), hexadecimal numbers, leading and trailing decimal points (.5, 5.), positive Infinity, negative Infinity, NaN, and multi-line strings with backslash line continuation.

// JSONC example (VS Code settings, tsconfig.json)
{
  // This is a single-line comment
  "compilerOptions": {
    "target": "ES2022",
    "module": "ESNext",
    "strict": true, // trailing comma allowed in JSONC
  },
  /* Multi-line comment
     explaining the configuration */
  "include": ["src/**/*"],
}

// JSON5 example (more relaxed syntax)
{
  name: 'DevToolBox',       // unquoted keys, single quotes
  version: 2,
  features: [
    'formatting',
    'validation',
    'minification',         // trailing comma
  ],
  config: {
    maxFileSize: 0xFF,      // hexadecimal numbers
    ratio: .5,              // leading decimal point
    debug: true,
  },
}

While these formats are convenient for configuration, they are not interchangeable with standard JSON. API endpoints should always return strict JSON. Use JSON5 or JSONC only for local configuration files that will be parsed by tools that explicitly support them.
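For quick one-off scripts, a naive JSONC-to-JSON pass can be enough. The stripJsonc helper below is a hypothetical sketch under that assumption; real projects should use a dedicated parser (e.g. the jsonc-parser or json5 npm packages), because regex stripping breaks on comment-like sequences inside string values:

```javascript
// Naive JSONC-to-JSON converter (sketch, not production-safe):
// strips // and /* */ comments, then trailing commas, then parses.
// Caveat: a string value containing "//" or "/*" will be corrupted.
function stripJsonc(text) {
  const noComments = text
    .replace(/\/\*[\s\S]*?\*\//g, "") // remove /* ... */ blocks
    .replace(/\/\/.*$/gm, "");        // remove // line comments
  const noTrailingCommas = noComments.replace(/,\s*([}\]])/g, "$1");
  return JSON.parse(noTrailingCommas);
}

const config = stripJsonc('{ /* c */ "a": 1, // note\n "b": [1, 2,], }');
console.log(config); // { a: 1, b: [ 1, 2 ] }
```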

Formatting Large JSON Files: Streaming and Memory

Standard JSON parsers load the entire file into memory before processing. For files under 50 MB, this works fine. For files over 100 MB or multi-gigabyte datasets, you need streaming approaches:

jq --stream: Processes JSON as a stream of path-value pairs without loading the entire document. Ideal for extracting specific fields from huge files: jq --stream 'select(.[0][-1] == "name") | .[1]' huge.json.

Python ijson: An iterative JSON parser that yields items one at a time. Install with pip install ijson. Process millions of records with constant memory usage.

# jq streaming: extract names from a huge array
jq --stream 'select(.[0][-1] == "name") | .[1]' huge.json

# Python ijson: iterate over items with constant memory
import ijson

with open("huge.json", "rb") as f:
    for item in ijson.items(f, "users.item"):
        print(item["name"])

# Node.js stream-json: process large JSON files
const { parser } = require("stream-json");
const { streamArray } = require("stream-json/streamers/StreamArray");
const fs = require("fs");

fs.createReadStream("huge.json")
  .pipe(parser())
  .pipe(streamArray())
  .on("data", ({ value }) => {
    console.log(value.name);
  });

Go encoding/json Decoder: Go standard library provides json.NewDecoder(reader) which reads from an io.Reader and can process tokens incrementally using Token() and More() methods.

Node.js streams: Libraries like stream-json and JSONStream process JSON as Node.js streams, enabling processing of files larger than available memory.

As a rule of thumb, if the JSON file is larger than 25% of your available RAM, use a streaming parser.

JSON Formatting in VS Code, IntelliJ, and Vim

Every major code editor provides built-in or plugin-based JSON formatting.

VS Code: Open any .json file and press Shift+Alt+F (Windows/Linux) or Shift+Option+F (macOS) to format. VS Code auto-detects JSON and provides syntax highlighting, bracket matching, folding, and inline error detection. Install Prettier for consistent formatting across your team. For JSONC files (like tsconfig.json), VS Code uses its built-in formatter that supports comments.

IntelliJ IDEA / WebStorm: Press Ctrl+Alt+L (Windows/Linux) or Cmd+Alt+L (macOS) to reformat. IntelliJ supports JSON Schema mapping, allowing autocomplete and validation against schema definitions. Configure indentation in Settings > Editor > Code Style > JSON.

Vim / Neovim: Format JSON with an external command: :%!jq . formats the entire buffer using jq. Or use :%!python3 -m json.tool. For a plugin-based approach, install vim-json for syntax highlighting or use the built-in LSP with a JSON language server for validation and formatting.

Sublime Text: Open Command Palette (Ctrl+Shift+P) and select "Pretty Print (JSON)". Sublime handles very large JSON files (hundreds of MB) better than most editors.

# VS Code keyboard shortcuts for JSON
# Format document:     Shift+Alt+F (Win/Linux) / Shift+Option+F (Mac)
# Fold all:            Ctrl+K Ctrl+0
# Unfold all:          Ctrl+K Ctrl+J
# Toggle word wrap:    Alt+Z

# Vim: format entire buffer with jq
:%!jq .

# Vim: format with Python
:%!python3 -m json.tool

# Vim: format selected lines with jq
:'<,'>!jq .

JSON Linting in CI/CD Pipelines

Automated JSON validation in your CI/CD pipeline catches configuration errors before they reach production. Here are practical approaches for different environments:

GitHub Actions: Use jq to validate JSON files in a workflow step. A non-zero exit code from jq indicates invalid JSON, which fails the build.

# GitHub Actions: validate all JSON files
name: Validate JSON
on: [push, pull_request]
jobs:
  validate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Validate JSON files
        run: |
          find . -name "*.json" -not -path "./node_modules/*" | while read f; do
            jq empty "$f" || { echo "Invalid JSON: $f"; exit 1; }
          done

# Pre-commit hook (.pre-commit-config.yaml)
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0
    hooks:
      - id: check-json
      - id: pretty-format-json
        args: [--autofix, --indent, "2"]

# Node.js one-liner validation
node -e "JSON.parse(require('fs').readFileSync('config.json','utf8'))"

# Python validation with error details
python3 -c "
import json, sys
try:
    json.load(open(sys.argv[1]))
    print('Valid JSON')
except json.JSONDecodeError as e:
    print(f'Invalid: {e}')
    sys.exit(1)
" config.json

Pre-commit hooks: Use the pre-commit framework with the check-json hook to validate all JSON files on every commit. This catches errors at the earliest possible stage.

ESLint: The eslint-plugin-json package validates JSON files as part of your existing JavaScript/TypeScript linting setup. It catches syntax errors, enforces consistent formatting, and can auto-fix indentation.

Custom scripts: A simple Node.js one-liner validates JSON: node -e "JSON.parse(require('fs').readFileSync('config.json','utf8'))". If the file is invalid, the script throws and the CI step fails.

JSON Schema Validation

JSON Schema goes beyond syntax validation. It defines the expected structure of your JSON data: what properties should exist, what types they should have, which fields are required, value constraints (min, max, pattern, enum), and nested object and array structures.

A JSON Schema is itself a JSON document that describes the shape of other JSON documents. It uses keywords like type, properties, required, items, minLength, maximum, pattern, and enum to express constraints.

// JSON Schema example
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "required": ["name", "email", "age"],
  "properties": {
    "name": {
      "type": "string",
      "minLength": 1,
      "maxLength": 100
    },
    "email": {
      "type": "string",
      "format": "email"
    },
    "age": {
      "type": "integer",
      "minimum": 0,
      "maximum": 150
    },
    "roles": {
      "type": "array",
      "items": { "type": "string", "enum": ["admin", "user", "editor"] },
      "minItems": 1,
      "uniqueItems": true
    }
  },
  "additionalProperties": false
}

// JavaScript: validate with Ajv
import Ajv from "ajv";
import addFormats from "ajv-formats";

const ajv = new Ajv();
addFormats(ajv);

const validate = ajv.compile(schema);
const valid = validate(data);
if (!valid) {
  console.error(validate.errors);
}

# Python: validate with jsonschema
from jsonschema import validate, ValidationError

try:
    validate(instance=data, schema=schema)
    print("Valid")
except ValidationError as e:
    print(f"Invalid: {e.message}")

Popular validation libraries include Ajv (JavaScript, the fastest JSON Schema validator), jsonschema (Python), and gojsonschema (Go). VS Code supports JSON Schema out of the box: map a schema URL to a file pattern in settings, and you get autocomplete, inline validation, and hover documentation.
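In VS Code, that mapping lives in settings.json under the json.schemas setting; a minimal example (the file paths here are illustrative):

```jsonc
// settings.json (user or workspace)
{
  "json.schemas": [
    {
      "fileMatch": ["/config/*.json"],
      "url": "./schemas/config.schema.json"
    }
  ]
}
```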

JSON Schema is widely adopted for API documentation (OpenAPI/Swagger uses JSON Schema for request/response definitions), form generation, configuration validation, and data pipeline contracts.

Performance Benchmarks for Large JSON Files

Performance varies dramatically across JSON parsers and languages. Here are approximate benchmarks for parsing a 100 MB JSON file on modern hardware (M2 MacBook Pro, 16 GB RAM):

See also: JSON Viewer

Parser                      Parse Time    Peak Memory
simdjson (C++/Rust)         ~120 ms       ~200 MB
Go encoding/json            ~800 ms       ~350 MB
Python json (CPython)       ~3,500 ms     ~600 MB
Node.js JSON.parse          ~400 ms       ~400 MB
jq (streaming)              ~2,000 ms     ~50 MB
Python ijson (streaming)    ~5,000 ms     ~30 MB

Key insight: streaming parsers use dramatically less memory at the cost of higher total processing time. For one-shot parsing where you need the full data structure in memory, V8 (Node.js/Chrome) and simdjson are fastest. For memory-constrained environments processing huge files, streaming parsers are the only viable option.

The formatting step itself (serializing back to indented text) typically adds 20-40% to the parse time. For a round trip (parse + format), expect roughly 1.3x the parse time.
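You can get a rough feel for these ratios on your own machine with a quick Node.js micro-benchmark (the synthetic payload and any timings are illustrative, not a rigorous benchmark):

```javascript
// Rough parse vs. format timing on a synthetic payload
const payload = JSON.stringify(
  Array.from({ length: 100_000 }, (_, i) => ({ id: i, name: `user${i}` }))
);

const t0 = performance.now();
const parsed = JSON.parse(payload);             // parse step
const t1 = performance.now();
const pretty = JSON.stringify(parsed, null, 2); // format step
const t2 = performance.now();

console.log(`parse:  ${(t1 - t0).toFixed(1)} ms`);
console.log(`format: ${(t2 - t1).toFixed(1)} ms`);
```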

Code Example: JavaScript / TypeScript

// Format JSON with 2-space indentation
const obj = { name: "Alice", age: 30, roles: ["admin", "dev"] };
const formatted = JSON.stringify(obj, null, 2);
console.log(formatted);

// Sort keys alphabetically (the array replacer also filters:
// only the listed keys are kept, at every nesting level)
const sortedKeys = JSON.stringify(obj, Object.keys(obj).sort(), 2);

// Custom replacer: redact sensitive fields
function safeStringify(data) {
  return JSON.stringify(data, (key, value) => {
    const sensitiveKeys = ["password", "secret", "apiKey", "token"];
    if (sensitiveKeys.includes(key)) return "[REDACTED]";
    return value;
  }, 2);
}

// Minify JSON (no whitespace)
const minified = JSON.stringify(obj);

// Parse and re-format a JSON string
function formatJson(jsonString) {
  try {
    const parsed = JSON.parse(jsonString);
    return { success: true, result: JSON.stringify(parsed, null, 2) };
  } catch (error) {
    return { success: false, error: error.message };
  }
}

// Format JSON in Node.js from a file
const fs = require("fs");
const raw = fs.readFileSync("data.json", "utf8");
const pretty = JSON.stringify(JSON.parse(raw), null, 2);
fs.writeFileSync("data-formatted.json", pretty);

Code Example: Python

import json

# Format JSON with 2-space indentation and sorted keys
data = {"name": "Alice", "age": 30, "roles": ["admin", "dev"]}
formatted = json.dumps(data, indent=2, sort_keys=True, ensure_ascii=False)
print(formatted)

# Read, format, and write a JSON file
with open("data.json", "r", encoding="utf-8") as f:
    data = json.load(f)

with open("data-formatted.json", "w", encoding="utf-8") as f:
    json.dump(data, f, indent=2, ensure_ascii=False)

# Validate JSON from stdin
import sys

def validate_json(filepath):
    try:
        with open(filepath, "r", encoding="utf-8") as f:
            json.load(f)
        print(f"{filepath}: Valid JSON")
        return True
    except json.JSONDecodeError as e:
        print(f"{filepath}: Invalid JSON at line {e.lineno}, col {e.colno}")
        print(f"  Error: {e.msg}")
        return False

# Minify a JSON file
def minify_json(input_path, output_path):
    with open(input_path, "r", encoding="utf-8") as f:
        data = json.load(f)
    with open(output_path, "w", encoding="utf-8") as f:
        json.dump(data, f, separators=(",", ":"), ensure_ascii=False)

# Command-line usage:
# python3 -m json.tool input.json > formatted.json
# python3 -m json.tool --compact input.json > minified.json

Code Example: Go

package main

import (
    "bytes"
    "encoding/json"
    "fmt"
    "os"
)

// Format JSON with indentation
func formatJSON(input []byte) (string, error) {
    var buf bytes.Buffer
    err := json.Indent(&buf, input, "", "  ")
    if err != nil {
        return "", fmt.Errorf("invalid JSON: %w", err)
    }
    return buf.String(), nil
}

// Streaming decoder for large files
func streamLargeJSON(filename string) error {
    f, err := os.Open(filename)
    if err != nil {
        return err
    }
    defer f.Close()

    decoder := json.NewDecoder(f)

    // Read opening bracket
    _, err = decoder.Token()
    if err != nil {
        return err
    }

    // Read array elements one by one
    for decoder.More() {
        var item map[string]interface{}
        if err := decoder.Decode(&item); err != nil {
            return err
        }
        fmt.Printf("Name: %s\n", item["name"])
    }

    return nil
}

func main() {
    raw := []byte(`{"name":"Alice","age":30,"roles":["admin","dev"]}`)
    formatted, err := formatJSON(raw)
    if err != nil {
        fmt.Fprintf(os.Stderr, "Error: %v\n", err)
        os.Exit(1)
    }
    fmt.Println(formatted)
}

Frequently Asked Questions

What is the difference between a JSON formatter and a JSON validator?

A JSON formatter adds indentation and line breaks to make JSON readable. A JSON validator checks whether the JSON conforms to correct syntax rules (RFC 8259). Most online tools combine both: they attempt to parse the input, report errors if invalid, and format the output if valid. Our JSON Formatter tool does both automatically.

Does formatting change the JSON data?

No. Formatting only adds or removes whitespace (spaces, tabs, newlines). The semantic content, including all keys, values, arrays, objects, strings, numbers, booleans, and null values, remains identical. A minified JSON and its prettified version parse to the exact same data structure.
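This round-trip property is easy to check yourself; a quick sketch:

```javascript
// Minified and prettified forms parse to the same structure
const mini = '{"name":"Alice","tags":["a","b"],"active":true}';
const pretty = JSON.stringify(JSON.parse(mini), null, 2);

// Re-minifying the prettified text reproduces the original bytes
const roundTrip = JSON.stringify(JSON.parse(pretty));
console.log(roundTrip === mini); // true
```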

What indent size should I use for JSON?

Two spaces is the most common convention and is used by default in most tools (Prettier, ESLint, jq). Four spaces is also popular and provides slightly better readability for deeply nested structures. Tabs are rare in JSON but supported by JSON.stringify. Choose one and be consistent across your project.

Can JSON have comments?

Standard JSON (RFC 8259) does not support comments. However, JSONC (JSON with Comments) allows // and /* */ comments and is used by VS Code for settings files and TypeScript for tsconfig.json. JSON5 also supports comments plus additional relaxed syntax. If you need comments in configuration, use JSONC or JSON5 with a parser that supports them.

How do I format a large JSON file without running out of memory?

Use a streaming parser. On the command line, jq --stream processes JSON without loading the entire file. In Python, the ijson library parses iteratively with constant memory. In Node.js, the stream-json package processes JSON as a Node stream. In Go, json.NewDecoder reads tokens incrementally from an io.Reader.

What is the fastest way to validate JSON in a CI pipeline?

The simplest approach is a one-liner: in Node.js use node -e "JSON.parse(require('fs').readFileSync('file.json','utf8'))", in Python use python3 -m json.tool file.json > /dev/null, or use jq . file.json > /dev/null. For comprehensive validation against a schema, use Ajv (JavaScript) or jsonschema (Python) in a dedicated validation script.

Is JSON5 compatible with standard JSON?

JSON5 is a superset of JSON: every valid JSON document is also valid JSON5. However, JSON5 features like comments, trailing commas, and unquoted keys are not valid standard JSON. You cannot send JSON5 to an API that expects standard JSON. JSON5 is designed for configuration files parsed by tools that explicitly support it.

How do JSON formatters handle Unicode and special characters?

JSON requires UTF-8 encoding. JSON formatters preserve all Unicode characters as-is. Non-ASCII characters can optionally be escaped as \uXXXX sequences, but this is not required. The ensure_ascii parameter in Python json.dumps controls this behavior. Most modern formatters preserve Unicode characters without escaping them.
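In JavaScript, for example, JSON.stringify leaves non-ASCII characters unescaped (only control characters and lone surrogates get \u escapes):

```javascript
// Non-ASCII characters pass through unescaped
const s = JSON.stringify({ city: "Zürich", note: "日本語" });
console.log(s); // {"city":"Zürich","note":"日本語"}

// Contrast with Python: json.dumps escapes non-ASCII by default;
// pass ensure_ascii=False to keep raw Unicode.
```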

Conclusion

JSON formatting is a fundamental developer skill that improves readability, debugging speed, and collaboration. Whether you use an online tool for quick formatting, jq for command-line workflows, or JSON.stringify for programmatic formatting, the key is to keep JSON minified for machines and prettified for humans. Combine formatting with validation and JSON Schema to catch both syntax and structural errors before they reach production.

Format, validate, and minify your JSON with our free online JSON Formatter.

