
CSV to JSON Converter

Convert CSV files to JSON arrays instantly — with header mapping, delimiter options, and type inference.

Last updated: March 25, 2026

Client-Side Processing
Input Data Stays on Device
Instant Local Execution


What is CSV to JSON Converter?

CSV (Comma-Separated Values) is the most common format for tabular data exports — from Excel, Google Sheets, databases, and reporting systems. JSON (JavaScript Object Notation) is the standard format for web APIs, NoSQL databases, and JavaScript applications. Converting between them is one of the most frequent tasks in data engineering, backend development, and analytics workflows.

This tool parses CSV input using the first row as keys (headers) and converts each subsequent row into a JSON object. The result is an array of objects ready to use in JavaScript, import into MongoDB, post to an API, or use as seed data in a database. You can configure the delimiter (comma, semicolon, tab/TSV), choose whether to infer value types (numbers and booleans instead of strings), and download the result as a .json file.

How to Use CSV to JSON Converter

1. Paste CSV data into the input area, or click "Upload CSV" to load a .csv file
2. Confirm the delimiter (auto-detected from the first line: comma, semicolon, or tab)
3. Toggle "Infer Types" if you want numbers and booleans converted from strings automatically
4. Review the JSON output in the right panel — it updates in real time as you edit
5. Click "Copy JSON" or "Download .json" to export the result

Common Use Cases

  • Importing Google Sheets or Excel export data into a MongoDB collection as JSON documents
  • Converting a database export CSV into JSON to seed a development database
  • Transforming API export CSVs into JSON for consumption by a frontend application
  • Converting analytics report exports (Google Analytics, HubSpot, Salesforce) for processing
  • Preparing fixture/test data in JSON format from a spreadsheet of test cases
  • Migrating data between systems that use different formats (CRM to API)
  • Converting TSV (tab-separated) financial report exports into JSON
  • Quickly checking what JSON structure a CSV file would produce before writing a parser

Example Input and Output

Converting a product catalog CSV from an e-commerce export into a JSON array:

CSV input
id,name,price,in_stock
1,Running Shoes,89.99,true
2,Yoga Mat,34.50,true
3,Kettlebell 20kg,65.00,false
JSON output (array of objects)
[
  { "id": 1, "name": "Running Shoes", "price": 89.99, "in_stock": true },
  { "id": 2, "name": "Yoga Mat", "price": 34.50, "in_stock": true },
  { "id": 3, "name": "Kettlebell 20kg", "price": 65.00, "in_stock": false }
]

Privacy First

All CSV parsing runs locally in your browser. Your data never leaves your device — it is not uploaded, transmitted, or stored on our servers. This is especially important for CSVs containing customer or financial data.

Encoding Tip

If your CSV contains accented characters that appear garbled, it may be encoded in Latin-1 (ISO-8859-1) instead of UTF-8. Re-save the file as UTF-8 in Excel (File → Save As → CSV UTF-8 (Comma delimited)) or Google Sheets (File → Download → CSV) before converting.

TSV vs CSV

When you copy cells from Google Sheets, the clipboard contains tab-separated values (TSV). Excel uses comma-separated CSV, but European Windows locales may use semicolons as the list separator. Check your file's first row to confirm the delimiter before converting.

Frequently Asked Questions

What delimiter does this support (comma, semicolon, tab)?
All three are supported: comma (,) for standard CSV, semicolon (;) for European locales (common in Excel exports from EU countries), and tab (\t) for TSV (Tab-Separated Values) files. The delimiter is auto-detected from the first line, or you can manually select it.
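The auto-detection described above can be sketched as a simple frequency heuristic: count each candidate delimiter in the first line and pick the most common one. This is an illustrative assumption about how such detection typically works, not the tool's actual code (the function name is hypothetical).

```javascript
// Hypothetical delimiter auto-detection sketch: the candidate that occurs
// most often in the header line is assumed to be the delimiter.
function detectDelimiter(firstLine) {
  const candidates = [",", ";", "\t"];
  let best = ",";
  let bestCount = -1;
  for (const d of candidates) {
    const count = firstLine.split(d).length - 1; // occurrences of d
    if (count > bestCount) {
      best = d;
      bestCount = count;
    }
  }
  return best;
}
```

A heuristic like this can misfire when a header itself contains the wrong character, which is why a manual delimiter override remains useful.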
What does "Infer Types" mean?
By default, all CSV values are strings. With Infer Types enabled, numeric strings like "42" become the number 42, "3.14" becomes 3.14, "true"/"false" become booleans, and "null"/"" become null. This is useful when importing into MongoDB or using the data in JavaScript where type correctness matters.
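The inference rules above can be sketched as a single function that tests each string value against the number, boolean, and null patterns. This is a minimal illustration of the described behavior, not the tool's actual implementation.

```javascript
// Sketch of the "Infer Types" rules: empty/"null" → null, "true"/"false" →
// booleans, numeric strings → numbers, everything else stays a string.
function inferType(value) {
  if (value === "" || value === "null") return null;
  if (value === "true") return true;
  if (value === "false") return false;
  if (/^-?\d+(\.\d+)?$/.test(value)) return Number(value); // ints and decimals
  return value;
}
```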
What happens to missing values (empty cells)?
Empty cells are converted to empty strings ("") by default, or null if Infer Types is enabled. If a row has fewer columns than the header, the missing keys will be omitted from that object.
Can I convert CSV with quoted fields containing commas?
Yes. The parser handles RFC 4180 compliant CSV — fields enclosed in double quotes (like "Smith, John") are treated as a single value even if they contain commas, newlines, or double-quote pairs ("").
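The quote handling described above boils down to a small state machine: inside quotes, a delimiter is ordinary text and a doubled quote ("") is an escaped literal quote. The following tokenizer is an illustrative sketch of that logic for a single line (it does not handle quoted fields spanning multiple lines), and is not the tool's actual parser.

```javascript
// RFC 4180-style field tokenizer sketch for one CSV line.
// Tracks whether we are inside a quoted field so that delimiters in
// quotes are kept, and "" inside quotes becomes a literal quote.
function splitCsvLine(line, delim = ",") {
  const fields = [];
  let field = "";
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuotes) {
      if (ch === '"') {
        if (line[i + 1] === '"') { field += '"'; i++; } // escaped quote
        else inQuotes = false;                          // closing quote
      } else {
        field += ch; // delimiters inside quotes are plain characters
      }
    } else if (ch === '"') {
      inQuotes = true; // opening quote
    } else if (ch === delim) {
      fields.push(field);
      field = "";
    } else {
      field += ch;
    }
  }
  fields.push(field);
  return fields;
}
```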
What if my CSV has no header row?
Toggle "No Header Row" and the tool will use column0, column1, column2... as automatic key names. You can then manually rename keys in the JSON output.
Is there a file size limit?
There is no server-side limit since parsing runs in your browser. Very large CSV files (50MB+) may be slow to process depending on your device. For very large datasets, consider a command-line tool such as csvkit's csvjson or Miller (mlr) for better performance.

How This Tool Works

The CSV input is split into lines, with the first line parsed as headers. Each subsequent line is tokenized by the selected delimiter, with respect for quoted fields (RFC 4180 compliant quote handling). Each row is mapped to an object using the header array as keys. If type inference is enabled, regex patterns test each value for numeric, boolean, and null patterns before assignment. The resulting array of objects is serialized to JSON with 2-space indentation using JSON.stringify().
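The pipeline described above can be sketched end to end as follows. This is a simplified illustration under two stated assumptions: no field contains an embedded newline, and the naive split stands in for the tool's quote-aware tokenizer. Function names are illustrative, not the tool's actual code.

```javascript
// End-to-end sketch: lines → headers → objects → JSON string.
function csvToJson(csv, delim = ",", inferTypes = true) {
  // Same inference rules the tool describes: null, booleans, numbers.
  const infer = (v) => {
    if (v === "" || v === "null") return null;
    if (v === "true") return true;
    if (v === "false") return false;
    if (/^-?\d+(\.\d+)?$/.test(v)) return Number(v);
    return v;
  };
  const lines = csv.trim().split(/\r?\n/);
  const headers = lines[0].split(delim); // first line supplies the keys
  const rows = lines.slice(1).map((line) => {
    const cells = line.split(delim); // naive split; real parser respects quotes
    const obj = {};
    headers.forEach((h, i) => {
      // Short rows: missing trailing keys are omitted, matching the FAQ.
      if (i < cells.length) obj[h] = inferTypes ? infer(cells[i]) : cells[i];
    });
    return obj;
  });
  return JSON.stringify(rows, null, 2); // 2-space indentation
}
```

Running it on the product-catalog example from earlier reproduces the array of typed objects shown in the output panel.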

Technical Stack

Browser-native JavaScript · RFC 4180 CSV parsing · Real-time conversion · Client-side only