How to Convert CSV to JSON (and JSON to CSV)

Turning spreadsheet chaos into structured data (and back again)


TL;DR: CSV is the language of spreadsheets; JSON is the language of APIs. Converting between them is easy for simple cases, but watch out for commas inside fields, type inference (zip codes losing leading zeros!), and nested JSON that needs flattening. Use proper parsers — don't just split(',').

Picture this: your product manager emails you an Excel export. Your API needs JSON. Your data analyst wants CSV. You're the translator in the middle. Welcome to every developer's Tuesday.

Let's walk through how to convert between CSV and JSON cleanly, handle the edge cases that trip everyone up, and avoid the traps that lead to 3 AM debugging sessions.

A Quick Look at Both Formats

CSV is beautifully simple — each line is a record, commas separate fields:

name,email,age,city
Alice Johnson,alice@example.com,32,New York
"Smith, Bob",bob@example.com,28,San Francisco
Charlie Brown,charlie@example.com,45,"Portland, OR"

The JSON equivalent is an array of objects:

[
  {
    "name": "Alice Johnson",
    "email": "alice@example.com",
    "age": "32",
    "city": "New York"
  },
  {
    "name": "Smith, Bob",
    "email": "bob@example.com",
    "age": "28",
    "city": "San Francisco"
  },
  {
    "name": "Charlie Brown",
    "email": "charlie@example.com",
    "age": "45",
    "city": "Portland, OR"
  }
]
🔀 Same data, two languages — CSV speaks spreadsheet, JSON speaks web

CSV to JSON in JavaScript (The Right Way)

⚠️

Don't do this: line.split(',') — it breaks the moment a field contains a comma! Bob's name "Smith, Bob" would get split into two separate fields. Always use a proper parser.
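To see the failure concretely, here's what a naive split does to a row like Bob's:

```javascript
// The name field is quoted because it contains a comma —
// naive splitting cuts it in half anyway:
const row = '"Smith, Bob",bob@example.com,28,San Francisco';
console.log(row.split(','));
// → [ '"Smith', ' Bob"', 'bob@example.com', '28', 'San Francisco' ]
```

Four fields in, five fragments out, and the quotes are still glued to the pieces.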

Here's a function that correctly handles quoted fields (fields with commas or quotes inside them). One caveat: it splits the input into lines first, so fields containing embedded newlines need a real CSV library.

function csvToJson(csv) {
  const lines = csv.trim().split(/\r?\n/); // handle both LF and CRLF line endings
  const headers = parseCSVLine(lines[0]);
  const result = [];

  for (let i = 1; i < lines.length; i++) {
    if (!lines[i].trim()) continue;
    const values = parseCSVLine(lines[i]);
    const obj = {};
    headers.forEach((header, index) => {
      obj[header.trim()] = values[index]?.trim() ?? '';
    });
    result.push(obj);
  }
  return result;
}

function parseCSVLine(line) {
  const fields = [];
  let current = '';
  let inQuotes = false;

  for (let i = 0; i < line.length; i++) {
    const char = line[i];
    if (inQuotes) {
      if (char === '"' && line[i + 1] === '"') {
        current += '"';
        i++; // skip escaped quote
      } else if (char === '"') {
        inQuotes = false;
      } else {
        current += char;
      }
    } else {
      if (char === '"') {
        inQuotes = true;
      } else if (char === ',') {
        fields.push(current);
        current = '';
      } else {
        current += char;
      }
    }
  }
  fields.push(current);
  return fields;
}

This parser covers the core of RFC 4180: it handles double-quoted fields and escaped quotes ("") properly. (Newlines inside quoted fields are the one gap, since the input is split into lines first.) It's not the prettiest code, but it works on real-world data.

JSON to CSV in JavaScript

function jsonToCsv(jsonArray) {
  if (!jsonArray.length) return '';
  const headers = Object.keys(jsonArray[0]);

  const escapeField = (value) => {
    const str = String(value ?? '');
    if (str.includes(',') || str.includes('"') || str.includes('\n')) {
      return '"' + str.replace(/"/g, '""') + '"';
    }
    return str;
  };

  const csvRows = [
    headers.map(escapeField).join(','),
    ...jsonArray.map(row =>
      headers.map(h => escapeField(row[h])).join(',')
    )
  ];
  return csvRows.join('\n');
}
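One caveat with the version above: it takes its headers from the first object only, so a later row with extra keys silently loses data. A small sketch (the helper name is ours) that collects the union of keys across all rows instead:

```javascript
// Collect every key that appears in any row, preserving
// first-seen order, so no column is silently dropped.
function collectHeaders(rows) {
  const headers = new Set();
  for (const row of rows) {
    for (const key of Object.keys(row)) headers.add(key);
  }
  return [...headers];
}

collectHeaders([{ name: 'Alice' }, { name: 'Bob', city: 'SF' }]);
// → ['name', 'city']
```

Swap this in for `Object.keys(jsonArray[0])` when your rows aren't guaranteed to share the same shape.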

The Edge Cases That'll Get You

Type Inference: Proceed with Caution

CSV has zero concept of data types — everything's a string. When converting to JSON, you might want to auto-detect numbers and booleans:

function inferType(value) {
  if (value === '') return null;
  if (value === 'true') return true;
  if (value === 'false') return false;
  if (!isNaN(value) && value.trim() !== '') {
    return Number(value);
  }
  return value;
}
😄

Cautionary Tale: Auto-detecting numbers sounds great until zip code 07001 becomes 7001, and suddenly half of New Jersey's mail goes to the void. Phone numbers and IDs should always stay as strings!
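One way to honor that rule is an explicit allowlist: coerce only the columns you've deliberately marked as numeric. A minimal sketch (the column names and helper are just for illustration):

```javascript
// Blanket coercion destroys the leading zero:
console.log(Number('07001')); // → 7001

// Safer: only coerce columns explicitly marked as numeric.
const NUMERIC_COLUMNS = new Set(['age', 'price']);

function coerceField(column, value) {
  if (!NUMERIC_COLUMNS.has(column)) return value; // zip, phone, id stay strings
  if (value === '') return null;
  return Number(value);
}

coerceField('age', '32');    // → 32
coerceField('zip', '07001'); // → '07001'
```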

Nested JSON to Flat CSV

JSON supports nesting; CSV doesn't. You'll need to flatten nested objects first:

// { name: "Alice", address: { city: "NYC", zip: "10001" } }
// becomes: { name: "Alice", "address.city": "NYC", "address.zip": "10001" }

function flatten(obj, prefix = '') {
  const result = {};
  for (const key in obj) {
    const fullKey = prefix ? `${prefix}.${key}` : key;
    if (typeof obj[key] === 'object' && obj[key] !== null && !Array.isArray(obj[key])) {
      Object.assign(result, flatten(obj[key], fullKey));
    } else {
      result[fullKey] = obj[key];
    }
  }
  return result;
}
🏗️ Flattening nested JSON: like turning a house into a blueprint
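Going back the other way — CSV rows to nested JSON — you can rebuild the nesting from the dotted keys. A minimal sketch, assuming dot-separated paths and no colliding keys:

```javascript
// Rebuild nested objects from dotted keys like "address.city".
function unflatten(flat) {
  const result = {};
  for (const [path, value] of Object.entries(flat)) {
    const keys = path.split('.');
    let node = result;
    // Walk (and create) intermediate objects for all but the last key.
    keys.slice(0, -1).forEach(k => {
      node[k] = node[k] ?? {};
      node = node[k];
    });
    node[keys[keys.length - 1]] = value;
  }
  return result;
}

unflatten({ name: 'Alice', 'address.city': 'NYC', 'address.zip': '10001' });
// → { name: 'Alice', address: { city: 'NYC', zip: '10001' } }
```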

Not All "CSV" Uses Commas

European files often use semicolons (because commas serve as decimal separators there). TSV uses tabs. Always check what delimiter your data actually uses:

# Common delimiters:
# Comma:     name,email,age
# Semicolon: name;email;age
# Tab:       name\temail\tage
# Pipe:      name|email|age
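A rough heuristic — far from bulletproof, but handy in quick scripts — is to count each candidate delimiter in the header line and pick the most frequent:

```javascript
// Guess the delimiter by counting candidates in the header line.
function detectDelimiter(headerLine) {
  const candidates = [',', ';', '\t', '|'];
  let best = ',';
  let bestCount = 0;
  for (const delim of candidates) {
    const count = headerLine.split(delim).length - 1;
    if (count > bestCount) {
      best = delim;
      bestCount = count;
    }
  }
  return best;
}

detectDelimiter('name;email;age');   // → ';'
detectDelimiter('name\temail\tage'); // → '\t'
```

It falls over on files where a quoted field contains more of one delimiter than the header does, so treat it as a convenience, not a guarantee.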

The Python Way (Much Easier)

import csv, json

# CSV to JSON — Python makes this embarrassingly simple
with open('data.csv') as f:
    data = list(csv.DictReader(f))
with open('data.json', 'w') as f:
    json.dump(data, f, indent=2)

# JSON to CSV
with open('data.json') as f:
    data = json.load(f)
with open('data.csv', 'w', newline='') as f:
    writer = csv.DictWriter(f, fieldnames=data[0].keys())
    writer.writeheader()
    writer.writerows(data)

Python's csv.DictReader handles most edge cases automatically — quoted fields, different line endings, the works. When in doubt, let Python do the heavy lifting.

Performance for Big Files

💡

Pro Tip: For files over 10,000 rows, don't load everything into memory at once. Use streaming parsers like csv-parse in Node.js. In the browser, process large files in a Web Worker to keep the UI responsive.

// Node.js streaming — handles massive files gracefully
import { createReadStream } from 'fs';
import { parse } from 'csv-parse';

const records = [];
createReadStream('large-file.csv')
  .pipe(parse({ columns: true, skip_empty_lines: true }))
  .on('data', (row) => records.push(row))
  .on('end', () => console.log(`Parsed ${records.length} rows`));
🏎️ Streaming: because loading a 2GB CSV into memory is never the answer

Wrapping Up

Converting between CSV and JSON is one of those "simple until it isn't" tasks. The happy path takes five minutes. Then you discover fields with commas, nested objects, inconsistent delimiters, and zip codes turning into numbers — and suddenly it's midnight. Use battle-tested libraries for production code, always validate your output, and never trust line.split(',').

Try It Yourself

Paste CSV and get JSON, or paste JSON and get CSV — instantly in your browser, no data sent to any server.

Open CSV-JSON Converter →