
JSON to CSV: 5 Methods for JSON-to-CSV Conversion

11 min read · by DevToolBox

Converting JSON to CSV is one of the most common data transformation tasks developers and analysts face. Whether you need to import API data into a spreadsheet, prepare a dataset for a legacy system, or generate a report that non-technical stakeholders can open in Excel, knowing how to convert JSON to CSV reliably is essential. This guide covers five proven methods: online converter tools, Python with pandas, JavaScript/Node.js, the jq command-line tool, and Excel Power Query. Each approach has different strengths, so choose the one that fits your workflow. If you need a quick conversion right now, try our free online tool.

Convert JSON to CSV instantly with our free online CSV-JSON Converter.

Why Convert JSON to CSV?

JSON (JavaScript Object Notation) is the standard data format for web APIs, NoSQL databases, and modern applications. CSV (Comma-Separated Values) is the universal tabular format supported by spreadsheets, databases, data analysis tools, and legacy systems. Converting between them bridges these two worlds.

Common scenarios for JSON to CSV conversion include: exporting API response data for analysis in Excel or Google Sheets, preparing data imports for SQL databases or CRM systems, generating human-readable reports from application data, migrating data between systems with different format requirements, and feeding data into visualization tools like Tableau or Power BI that prefer CSV input.

The main challenge in JSON to CSV conversion is flattening nested structures. JSON supports arbitrarily nested objects and arrays, while CSV is strictly two-dimensional (rows and columns). Different tools handle this flattening differently, which is why understanding multiple methods is valuable.
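To make the flattening idea concrete, here is a minimal Python sketch (the `flatten` helper is our own illustration, not part of any library) that turns nested objects into the dot-notation keys a CSV header would use:

```python
def flatten(obj, parent_key="", sep="."):
    """Recursively flatten nested dicts into dot-notation keys."""
    flat = {}
    for key, value in obj.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            flat.update(flatten(value, new_key, sep))
        else:
            flat[new_key] = value
    return flat

record = {"user": {"name": "Alice", "address": {"city": "NYC"}}}
print(flatten(record))
# {'user.name': 'Alice', 'user.address.city': 'NYC'}
```

Each flattened dict then maps directly to one CSV row, with the keys as column headers.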

Method 1: Convert JSON to CSV Online (Fastest)

Online converter tools are the fastest way to transform JSON to CSV without writing any code. Our CSV-JSON Converter tool handles the conversion instantly in your browser:

Try our free JSON to CSV / CSV to JSON Converter.

  1. Navigate to the CSV-JSON Converter tool.
  2. Select the JSON to CSV direction.
  3. Paste your JSON data or upload a .json file.
  4. Click Convert. The tool automatically flattens nested objects using dot notation (e.g., address.city becomes a column header).
  5. Copy the CSV output or download it as a .csv file.

Online tools are ideal for one-off conversions, quick data checks, and situations where you cannot install software. They run entirely in your browser, so your data never leaves your machine. For automated or batch conversions, use one of the programmatic methods below.

Method 2: Convert JSON to CSV with Python (pandas)

Python with pandas is the most popular method for converting JSON to CSV in data science and backend workflows. The pandas library provides powerful tools for reading JSON, normalizing nested structures, and exporting to CSV with full control over formatting:

import pandas as pd
import json

# === Method 1: Flat JSON array ===
# Read JSON file directly into a DataFrame
df = pd.read_json('data.json')
df.to_csv('output.csv', index=False)

# === Method 2: Nested JSON with json_normalize ===
with open('data.json', 'r') as f:
    data = json.load(f)

# Flatten nested objects (e.g., user.address.city)
df = pd.json_normalize(data)
df.to_csv('output_flat.csv', index=False)

# === Method 3: Nested JSON with specific record_path ===
# For JSON like: [{"id": 1, "name": "A", "items": [{"score": 90}, {"score": 85}]}]
df = pd.json_normalize(
    data,
    record_path='items',           # path to the array
    meta=['id', 'name'],           # fields from parent
    meta_prefix='parent_',
    sep='.'                        # separator for nested keys
)
df.to_csv('output_nested.csv', index=False)

# === Method 4: Handle missing fields ===
df = pd.json_normalize(data)
df.fillna('', inplace=True)        # replace NaN with empty string
df.to_csv('output_clean.csv', index=False, encoding='utf-8-sig')

The pd.json_normalize() function is the key to handling nested JSON. It recursively flattens nested objects into columns with dot-separated names. For arrays within objects, you may need to use the record_path and meta parameters to specify how to expand them. For simple flat JSON arrays, pd.read_json() works directly without normalization.

If you prefer the standard library without installing pandas, Python's built-in csv and json modules work together:

import json
import csv

# Read JSON
with open('data.json', 'r') as f:
    data = json.load(f)

# Write CSV using built-in modules
if isinstance(data, list) and len(data) > 0:
    keys = data[0].keys()  # column headers from first object

    with open('output.csv', 'w', newline='', encoding='utf-8') as f:
        writer = csv.DictWriter(f, fieldnames=keys)
        writer.writeheader()
        writer.writerows(data)

    print(f"Converted {len(data)} rows to CSV")

The standard library approach is lightweight and works without any external dependencies. It is best for simple, flat JSON arrays where each object has the same keys. For nested JSON with varying structures, pandas json_normalize() is significantly more powerful.
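One weakness of the snippet above is that it takes the headers from the first object only, so fields that appear later are silently dropped. A sketch that instead collects the union of keys first (the `json_list_to_csv` helper is our own; `restval` fills missing fields with empty strings):

```python
import csv
import io

def json_list_to_csv(records):
    """Convert a list of dicts to a CSV string, tolerating missing keys."""
    # Collect the union of keys across all records, preserving first-seen order
    fieldnames = []
    for record in records:
        for key in record:
            if key not in fieldnames:
                fieldnames.append(key)

    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames, restval="")
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

data = [{"name": "Alice", "age": 30}, {"name": "Bob", "city": "London"}]
print(json_list_to_csv(data))
```

Here Bob has no age and Alice has no city, yet both rows keep their columns aligned because every key ends up in the header.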

Method 3: Convert JSON to CSV with JavaScript / Node.js

JavaScript and Node.js are natural choices for JSON to CSV conversion since JSON is a native JavaScript format. Here are approaches for both browser and server environments:

// === Browser: Convert JSON array to CSV string ===
function jsonToCsv(jsonData) {
  if (!Array.isArray(jsonData) || jsonData.length === 0) return '';

  const headers = Object.keys(jsonData[0]);
  const csvRows = [
    headers.join(','),  // header row
    ...jsonData.map(row =>
      headers.map(header => {
        const value = row[header] ?? '';
        // Escape quotes and wrap in quotes if needed
        const str = String(value);
        return str.includes(',') || str.includes('"') || str.includes('\n')
          ? `"${str.replace(/"/g, '""')}"`
          : str;
      }).join(',')
    )
  ];
  return csvRows.join('\n');
}

// Usage
const data = [
  { name: "Alice", age: 30, city: "NYC" },
  { name: "Bob", age: 25, city: "London" }
];
const csv = jsonToCsv(data);
console.log(csv);

// === Node.js: Read JSON file, write CSV file ===
const fs = require('fs');
const data = JSON.parse(fs.readFileSync('data.json', 'utf-8'));
const csv = jsonToCsv(data);
fs.writeFileSync('output.csv', csv, 'utf-8');

// === Using json2csv package (recommended for production) ===
// npm install json2csv
const { parse } = require('json2csv');
const opts = {
  fields: ['name', 'age', 'address.city'],  // dot notation for nested
  flatten: true,
  defaultValue: ''
};
const csvOutput = parse(data, opts);
fs.writeFileSync('output.csv', csvOutput);

For production use, consider the json2csv npm package, which handles edge cases such as quoted fields, nested objects, and custom delimiters. Install it with npm install json2csv and use the parse() function with options for field selection and flattening.

The Node.js approach above reads a JSON file, extracts headers from the first object, and writes each row as comma-separated values. For large files, use a streaming approach with fs.createReadStream() and a JSON streaming parser like JSONStream to avoid loading the entire file into memory.

Method 4: Convert JSON to CSV with jq (Command Line)

jq is a lightweight command-line JSON processor that can transform JSON to CSV with a single command. It is perfect for shell scripts, CI/CD pipelines, and quick terminal conversions:

# === Basic: Flat JSON array to CSV ===
# Input: [{"name":"Alice","age":30},{"name":"Bob","age":25}]
jq -r '(.[0] | keys_unsorted) as $keys |
  ($keys | @csv),
  (.[] | [.[$keys[]]] | @csv)' data.json > output.csv

# === Select specific fields ===
jq -r '["name","age","city"],
  (.[] | [.name, .age, .city]) | @csv' data.json

# === Handle nested objects with dot notation ===
jq -r '["name","street","city"],
  (.[] | [.name, .address.street, .address.city]) | @csv' data.json

# === Handle arrays by joining elements ===
jq -r '["name","tags"],
  (.[] | [.name, (.tags | join(";"))]) | @csv' data.json

# === Pipe from curl (API response to CSV) ===
curl -s "https://api.example.com/users" | \
  jq -r '(.[0] | keys_unsorted) as $k |
  ($k | @csv), (.[] | [.[$k[]]] | @csv)' > users.csv

The jq approach works best with flat or consistently structured JSON arrays. For deeply nested JSON, you may need more complex jq filters to flatten the structure before conversion. The @csv filter handles proper CSV escaping including quoting fields that contain commas or newlines.

For more complex transformations, jq supports variables, conditionals, and recursive descent. You can flatten nested objects with [.key1, .nested.key2, (.array | join(";"))] to handle arrays by joining their elements with a delimiter.

Method 5: Convert JSON to CSV with Excel / Power Query

Excel Power Query provides a visual, no-code approach to importing and transforming JSON data. This method is ideal for business users and analysts who work primarily in Excel:

  1. Open Excel and go to Data > Get Data > From File > From JSON.
  2. Select your .json file. The Power Query Editor opens with a preview of the data.
  3. Click To Table to convert the JSON structure to a tabular format.
  4. Expand nested columns by clicking the expand icon (two arrows) next to column headers. Select which nested fields to include.
  5. Remove or rename columns as needed using the Transform tab.
  6. Click Close & Load to import the data into an Excel worksheet.
  7. Save the file as CSV using File > Save As > CSV (Comma delimited).

Power Query remembers your transformation steps, so if the source JSON file is updated, you can refresh the query to re-import the data with the same transformations applied. Google Sheets also supports JSON import via the =IMPORTDATA() function for CSV URLs or through Google Apps Script for direct JSON parsing.

Handling Nested JSON: Flattening Strategies

The biggest challenge in JSON to CSV conversion is handling nested objects and arrays. CSV is a flat, two-dimensional format, so all nested structures must be flattened. Here are the common strategies:

Dot notation flattening: Nested keys are joined with dots. For example, {"user": {"name": "Alice", "address": {"city": "NYC"}}} becomes columns user.name and user.address.city. This is the default behavior in pandas json_normalize() and most online tools.

Array handling: Arrays can be handled by joining elements with a delimiter (e.g., tags: ["a","b"] becomes a;b), by creating separate rows for each array element (one-to-many expansion), or by creating numbered columns (tags.0, tags.1). The best approach depends on your downstream use case.
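The delimiter-join strategy, for instance, can be sketched in plain Python (the `join_arrays` helper is our own illustration for flat records whose values may be lists):

```python
def join_arrays(record, delimiter=";"):
    """Replace list values with delimiter-joined strings so the record fits one CSV row."""
    return {
        key: delimiter.join(str(v) for v in value) if isinstance(value, list) else value
        for key, value in record.items()
    }

row = {"name": "Alice", "tags": ["admin", "editor"]}
print(join_arrays(row))
# {'name': 'Alice', 'tags': 'admin;editor'}
```

A semicolon works well as the inner delimiter precisely because it cannot be confused with the comma separating CSV columns.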

Selective extraction: For deeply nested JSON, you may want to extract only specific fields rather than flattening everything. This produces cleaner CSV with only the columns you need. Tools like jq and pandas let you select specific paths.

Method Comparison: Which Should You Choose?

Method               | Best For                               | Handles Nesting             | Setup Required
Online Tool          | Quick one-off conversions              | Auto-flattening             | None
Python / pandas      | Data science, automation, large files  | Excellent (json_normalize)  | pip install pandas
JavaScript / Node.js | Web apps, API integrations             | Good (with json2csv)        | npm install json2csv
jq                   | Shell scripts, CI/CD, terminal         | Manual (filters)            | brew/apt install jq
Excel Power Query    | Business users, ad-hoc analysis        | Visual expand               | Excel installed

Tips for Clean JSON to CSV Conversion

Follow these best practices to avoid common pitfalls when converting JSON to CSV:

Inspect your JSON structure first: Before converting, understand whether your JSON is a flat array of objects, contains nested objects, or includes arrays within objects. Use our JSON Viewer tool to explore the structure visually.

Handle missing fields gracefully: Not all JSON objects may have the same keys. Ensure your conversion method handles missing fields by inserting empty values rather than shifting columns. pandas and json2csv handle this automatically.

Watch for special characters: CSV fields containing commas, quotes, or newlines must be properly escaped with double quotes. Most libraries handle this automatically, but verify the output if you are writing custom conversion code.

Choose the right delimiter: While comma is the standard CSV delimiter, some locales use semicolons (especially in Europe where commas are decimal separators). Most tools let you specify the delimiter. Use TSV (tab-separated) if your data contains many commas.
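With Python's csv module, for example, the delimiter is just a keyword argument (a minimal sketch; the sample rows are our own):

```python
import csv
import io

# One value contains a comma, as European decimal numbers often do
rows = [["name", "price"], ["Widget", "1,99"]]

buf = io.StringIO()
writer = csv.writer(buf, delimiter=";")  # semicolon-delimited output
writer.writerows(rows)
print(buf.getvalue())
```

Because the column separator is now a semicolon, the comma inside "1,99" needs no quoting at all.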

Preserve data types: CSV is a text format, so numbers, booleans, and dates lose their type information. If type preservation matters, consider keeping the data in JSON format or adding a header row that documents the expected types.

Frequently Asked Questions

How to convert JSON to CSV online?

Use an online JSON to CSV converter like the DevToolBox CSV-JSON Converter at viadreams.cc/en/tools/csv-json. Paste your JSON data, select JSON to CSV direction, and click Convert. The tool automatically flattens nested objects and generates downloadable CSV output. No installation or signup required, and the conversion runs entirely in your browser for privacy.

Can Excel open JSON files?

Yes, Excel can open JSON files using Power Query. Go to Data > Get Data > From File > From JSON, select your file, and use the Power Query Editor to expand nested structures into columns. You can then save the result as CSV. Alternatively, convert the JSON to CSV first using an online tool or Python script, then open the CSV directly in Excel.

How to convert nested JSON to CSV?

Nested JSON requires flattening before CSV conversion. Use Python pandas with pd.json_normalize() for automatic dot-notation flattening, jq with custom filters to select and flatten specific fields, or an online tool that handles nesting automatically. The key decision is how to handle arrays: join elements with a delimiter, create separate rows, or create numbered columns.

What is the best Python library for JSON to CSV conversion?

pandas is the best Python library for JSON to CSV conversion. Use pd.json_normalize() for nested JSON and pd.read_json() for flat JSON arrays. Both produce a DataFrame that can be exported to CSV with df.to_csv(). For simple cases without dependencies, Python built-in json and csv modules work well together.

Can I convert large JSON files (1GB+) to CSV?

Yes, but you need streaming approaches to avoid memory issues. In Python, use ijson for streaming JSON parsing combined with csv.writer for output. In Node.js, use JSONStream with a writable CSV stream. With jq, the --stream flag enables incremental processing. Online tools typically have size limits and are not suitable for files over 50-100MB.

Converting JSON to CSV is a fundamental data transformation skill. For quick conversions, use our online CSV-JSON converter. For automated workflows, Python with pandas offers the most powerful and flexible approach. Command-line users will appreciate jq for its speed and scriptability. Excel Power Query serves business users who prefer a visual interface. Whichever method you choose, the key is understanding your JSON structure and choosing the right flattening strategy for nested data.

Convert JSON to CSV instantly with our free online tool.

