JSON to SQL Converter: Transform JSON Data to INSERT Statements
Learn how to convert JSON arrays and nested objects into SQL CREATE TABLE and INSERT statements for MySQL, PostgreSQL, and SQL Server.
Why Convert JSON to SQL?
JSON is the standard format for API responses, configuration files, and NoSQL database exports. When you need to analyze that data with SQL queries, join it with existing relational data, or migrate from a document database to a relational one, you need to convert JSON to SQL INSERT statements.
Common use cases include importing API data into a reporting database, migrating from MongoDB to PostgreSQL or MySQL, loading seed data for development environments, and archiving JSON logs into a queryable format.
The main challenge is mapping JSON's flexible schema (nested objects, arrays, mixed types) to SQL's rigid table structure. This guide shows you how to handle every case.
JSON to SQL Type Mapping
| JSON Type | MySQL | PostgreSQL | SQL Server |
|---|---|---|---|
| string | VARCHAR / TEXT | VARCHAR / TEXT | NVARCHAR |
| number (integer) | INT / BIGINT | INTEGER / BIGINT | INT / BIGINT |
| number (decimal) | DECIMAL / DOUBLE | NUMERIC / DOUBLE PRECISION | DECIMAL / FLOAT |
| boolean | TINYINT(1) | BOOLEAN | BIT |
| null | NULL | NULL | NULL |
| object (nested) | JSON or flatten | JSONB or flatten | NVARCHAR(MAX) or flatten |
| array of primitives | JSON | JSONB or TEXT[] | NVARCHAR(MAX) |
| array of objects | Separate table | Separate table or JSONB | Separate table |
Common Conversions with Examples
Flat JSON Array to INSERT
The simplest case: a JSON array of flat objects where every value is a primitive.
[
{"name": "Alice", "email": "alice@example.com", "age": 32},
{"name": "Bob", "email": "bob@example.com", "age": 28},
{"name": "Carol", "email": "carol@example.com", "age": 45}
]
-- MySQL / PostgreSQL
CREATE TABLE users (
id SERIAL PRIMARY KEY, -- or INT AUTO_INCREMENT for MySQL
name VARCHAR(100) NOT NULL,
email VARCHAR(255) NOT NULL,
age INTEGER
);
INSERT INTO users (name, email, age) VALUES
('Alice', 'alice@example.com', 32),
('Bob', 'bob@example.com', 28),
('Carol', 'carol@example.com', 45);
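This mapping is easy to script. Below is a minimal Python sketch (the helper names and the assumption that every object shares the same keys are ours, not a standard API) that turns a flat JSON array into a single multi-row INSERT:

```python
import json

def escape_sql_string(value: str) -> str:
    # Double any embedded single quotes, per the SQL standard
    return "'" + value.replace("'", "''") + "'"

def format_value(value):
    if value is None:
        return "NULL"
    if isinstance(value, bool):          # check bool before int: bool subclasses int in Python
        return "TRUE" if value else "FALSE"
    if isinstance(value, (int, float)):
        return str(value)
    return escape_sql_string(str(value))

def json_array_to_insert(json_text: str, table: str) -> str:
    rows = json.loads(json_text)
    columns = list(rows[0].keys())       # assumes every object has the same keys
    values = ",\n".join(
        "(" + ", ".join(format_value(row.get(col)) for col in columns) + ")"
        for row in rows
    )
    return f"INSERT INTO {table} ({', '.join(columns)}) VALUES\n{values};"

data = '[{"name": "Alice", "email": "alice@example.com", "age": 32}]'
print(json_array_to_insert(data, "users"))
```

Note that `format_value` handles the quoting rules described throughout this guide: strings get escaped single quotes, numbers stay bare, and `null` becomes `NULL`.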
Handling Nested Objects
When JSON contains nested objects, you can either flatten them into columns or store them as JSON.
[
{
"name": "Alice",
"address": {"city": "New York", "zip": "10001"}
}
]
Option 1: Flatten
CREATE TABLE users (
id SERIAL PRIMARY KEY,
name VARCHAR(100),
address_city VARCHAR(100),
address_zip VARCHAR(20)
);
INSERT INTO users (name, address_city, address_zip) VALUES
('Alice', 'New York', '10001');
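Flattening can be automated with a small recursive helper. This sketch joins nested keys with underscores (the separator choice is a convention, not a requirement):

```python
def flatten(obj: dict, parent_key: str = "") -> dict:
    """Flatten nested dicts: {"address": {"city": ...}} -> {"address_city": ...}."""
    flat = {}
    for key, value in obj.items():
        new_key = f"{parent_key}_{key}" if parent_key else key
        if isinstance(value, dict):
            flat.update(flatten(value, new_key))   # recurse into nested objects
        else:
            flat[new_key] = value
    return flat

record = {"name": "Alice", "address": {"city": "New York", "zip": "10001"}}
print(flatten(record))
```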
Option 2: Store as JSON column
-- PostgreSQL
CREATE TABLE users (
id SERIAL PRIMARY KEY,
name VARCHAR(100),
address JSONB
);
INSERT INTO users (name, address) VALUES
('Alice', '{"city": "New York", "zip": "10001"}');
-- Query the nested field
SELECT name, address->>'city' AS city FROM users;
Boolean and NULL Values
[
{"product": "Widget", "in_stock": true, "discount": null},
{"product": "Gadget", "in_stock": false, "discount": 15.5}
]
-- PostgreSQL
INSERT INTO products (product, in_stock, discount) VALUES
('Widget', TRUE, NULL),
('Gadget', FALSE, 15.5);
-- MySQL
INSERT INTO products (product, in_stock, discount) VALUES
('Widget', 1, NULL),
('Gadget', 0, 15.5);
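The dialect difference above can be captured in one formatting function. A sketch covering just these two targets (the `dialect` parameter is our own convention):

```python
def format_bool_null(value, dialect: str = "postgresql") -> str:
    # NULL is spelled the same in every dialect
    if value is None:
        return "NULL"
    if isinstance(value, bool):
        if dialect == "mysql":
            return "1" if value else "0"     # MySQL stores booleans as TINYINT(1)
        return "TRUE" if value else "FALSE"  # PostgreSQL has a native BOOLEAN
    return str(value)

print(format_bool_null(True, "mysql"))
print(format_bool_null(False, "postgresql"))
print(format_bool_null(None))
```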
JSON Arrays as Values
[
{"title": "SQL Guide", "tags": ["sql", "database", "tutorial"]},
{"title": "API Design", "tags": ["api", "rest"]}
]
-- PostgreSQL (using JSONB)
CREATE TABLE articles (
id SERIAL PRIMARY KEY,
title VARCHAR(200),
tags JSONB
);
INSERT INTO articles (title, tags) VALUES
('SQL Guide', '["sql", "database", "tutorial"]'),
('API Design', '["api", "rest"]');
-- Query articles with a specific tag
SELECT title FROM articles WHERE tags ? 'sql';
-- MySQL (using JSON)
CREATE TABLE articles (
id INT AUTO_INCREMENT PRIMARY KEY,
title VARCHAR(200),
tags JSON
);
INSERT INTO articles (title, tags) VALUES
('SQL Guide', '["sql", "database", "tutorial"]'),
('API Design', '["api", "rest"]');
-- Query articles with a specific tag
SELECT title FROM articles WHERE JSON_CONTAINS(tags, '"sql"');
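When generating these INSERTs programmatically, serialize the list with `json.dumps` and then escape the result like any other string literal. A minimal sketch:

```python
import json

def json_literal(value) -> str:
    # Serialize to JSON, then quote/escape it as a SQL string literal
    text = json.dumps(value, separators=(", ", ": "))
    return "'" + text.replace("'", "''") + "'"

tags = ["sql", "database", "tutorial"]
print(json_literal(tags))
```

This produces exactly the quoted `'["sql", "database", "tutorial"]'` form used in the INSERT statements above, with any single quotes inside the JSON doubled so the SQL string stays valid.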
Normalizing Arrays of Objects
When JSON contains arrays of objects, the best approach is usually to create a separate table.
[
{
"order_id": 1001,
"customer": "Alice",
"items": [
{"product": "Widget", "qty": 2, "price": 9.99},
{"product": "Gadget", "qty": 1, "price": 24.99}
]
}
]
CREATE TABLE orders (
order_id INT PRIMARY KEY,
customer VARCHAR(100)
);
CREATE TABLE order_items (
id SERIAL PRIMARY KEY, -- or INT AUTO_INCREMENT for MySQL
order_id INT REFERENCES orders(order_id), -- MySQL ignores column-level REFERENCES; use a table-level FOREIGN KEY there
product VARCHAR(100),
qty INT,
price DECIMAL(10,2)
);
INSERT INTO orders (order_id, customer) VALUES (1001, 'Alice');
INSERT INTO order_items (order_id, product, qty, price) VALUES
(1001, 'Widget', 2, 9.99),
(1001, 'Gadget', 1, 24.99);
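Splitting the document into the two row sets can be scripted. A minimal sketch assuming the order/items structure shown above:

```python
def normalize_orders(orders: list) -> tuple:
    """Split order documents into parent rows and child item rows."""
    order_rows, item_rows = [], []
    for order in orders:
        order_rows.append((order["order_id"], order["customer"]))
        for item in order.get("items", []):
            # Carry the parent key into each child row for the foreign key
            item_rows.append((order["order_id"], item["product"], item["qty"], item["price"]))
    return order_rows, item_rows

docs = [{
    "order_id": 1001,
    "customer": "Alice",
    "items": [
        {"product": "Widget", "qty": 2, "price": 9.99},
        {"product": "Gadget", "qty": 1, "price": 24.99},
    ],
}]
orders, items = normalize_orders(docs)
print(orders)
print(items)
```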
Date Strings in JSON
[
{"event": "signup", "timestamp": "2026-03-11T14:30:00Z"},
{"event": "purchase", "timestamp": "2026-03-11T15:45:22Z"}
]
-- PostgreSQL (TIMESTAMPTZ column; the cast parses the ISO 8601 string directly)
INSERT INTO events (event, event_time) VALUES
('signup', '2026-03-11T14:30:00Z'::TIMESTAMPTZ),
('purchase', '2026-03-11T15:45:22Z'::TIMESTAMPTZ);
-- MySQL (DATETIME column; replace the T with a space and drop the Z)
INSERT INTO events (event, event_time) VALUES
('signup', '2026-03-11 14:30:00'),
('purchase', '2026-03-11 15:45:22');
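Converting the ISO 8601 strings to MySQL's DATETIME format can be done with Python's standard library. One caveat worth noting: `datetime.fromisoformat` rejects a trailing `Z` before Python 3.11, so the sketch below normalizes it to an explicit UTC offset first:

```python
from datetime import datetime

def iso_to_mysql(iso_string: str) -> str:
    # fromisoformat() rejects a trailing 'Z' before Python 3.11,
    # so replace it with an explicit UTC offset first
    dt = datetime.fromisoformat(iso_string.replace("Z", "+00:00"))
    return dt.strftime("%Y-%m-%d %H:%M:%S")

print(iso_to_mysql("2026-03-11T14:30:00Z"))
```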
Data Type Mapping
| JSON Pattern | Best SQL Type | Example |
|---|---|---|
| Short strings (<256 chars) | VARCHAR(255) | "alice@example.com" |
| Long strings | TEXT | Article body text |
| Small integers | INT | 42, -7, 0 |
| Large integers | BIGINT | 9007199254740992 |
| Decimals | DECIMAL(10,2) | 99.99, 3.14 |
| true/false | BOOLEAN | true, false |
| ISO date strings | TIMESTAMP | "2026-03-11T14:30:00Z" |
| Date-only strings | DATE | "2026-03-11" |
| UUID strings | UUID (PG) / CHAR(36) | "550e8400-e29b-..." |
| Nested objects/arrays | JSONB / JSON | {"key": "value"} |
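The table above can be approximated in code. A heuristic sketch for inferring PostgreSQL types from a single JSON value (the 256-character and 32-bit thresholds are assumptions, matching the table):

```python
def infer_pg_type(value) -> str:
    """Rough PostgreSQL type inference for a single JSON value."""
    if isinstance(value, bool):              # bool first: it subclasses int in Python
        return "BOOLEAN"
    if isinstance(value, int):
        return "INTEGER" if -2**31 <= value < 2**31 else "BIGINT"
    if isinstance(value, float):
        return "NUMERIC"
    if isinstance(value, (dict, list)):
        return "JSONB"
    if isinstance(value, str):
        return "VARCHAR(255)" if len(value) < 256 else "TEXT"
    return "TEXT"                            # None carries no type info; fall back to TEXT

print(infer_pg_type(42))
print(infer_pg_type(9007199254740992))
print(infer_pg_type({"key": "value"}))
```

In practice a converter should scan every row, not just the first, so that a column holding `1` in row one and `9007199254740992` in row two still gets BIGINT.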
Step-by-Step Conversion Guide
- Validate the JSON. Ensure the input is valid JSON. Use a linter or JSON.parse() to catch syntax errors before conversion.
- Analyze the structure. Identify top-level keys, their data types, which fields are nested objects, and which are arrays. This determines your table schema.
- Design the schema. Map JSON keys to column names (convert camelCase to snake_case if needed). Choose whether to flatten nested objects, store them as JSON columns, or normalize them into separate tables.
- Handle edge cases. Decide how to treat null values, missing keys (some objects may not have all fields), and mixed-type arrays. Make columns nullable where appropriate.
- Generate the SQL. Create the CREATE TABLE statement with proper types and constraints, then generate INSERT statements for each JSON object, escaping string values and converting booleans to the database's format.
- Test and verify. Load the SQL into your database and run validation queries. Check that row counts match and that nested JSON data is queryable.
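The camelCase-to-snake_case renaming mentioned in the schema-design step can be sketched with a regular expression (simple names only; this does not special-case runs of capitals like "HTTPStatus"):

```python
import re

def to_snake_case(name: str) -> str:
    # Insert an underscore before each uppercase letter, then lowercase everything
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

print(to_snake_case("createdAt"))
print(to_snake_case("orderId"))
```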
For related topics, see our CSV to SQL converter and SQL vs NoSQL comparison guide.
Convert Automatically with AI2SQL
Describe the shape of your JSON data in plain English and AI2SQL generates the CREATE TABLE and INSERT statements for your target database. Say "create a table for user records with name, email, address as JSON, and signup date" and get production-ready SQL instantly.
Try AI2SQL free and transform your JSON data into SQL in seconds. Browse all converter tools.
Frequently Asked Questions
How do I convert a JSON array to SQL INSERT statements?
Each object in the JSON array becomes one INSERT row. The object keys become column names and the values become the row data. For example, [{"name": "Alice", "age": 30}] becomes INSERT INTO table_name (name, age) VALUES ('Alice', 30). Strings get single quotes, numbers stay unquoted, null becomes NULL, and booleans map to TRUE/FALSE or 1/0 depending on the database.
How do I handle nested JSON objects when converting to SQL?
You have three options: 1) Flatten the nested object into separate columns (address.city becomes address_city), 2) Store the nested object as a JSON column if your database supports it (MySQL JSON, PostgreSQL JSONB), or 3) Normalize into separate tables with foreign keys. Option 2 is fastest for migration, option 3 is best for querying.
Can I store JSON directly in a SQL database?
Yes. MySQL has a JSON column type, PostgreSQL has JSON and JSONB (binary, indexed), and SQL Server supports JSON functions with NVARCHAR columns. PostgreSQL's JSONB is the most powerful, supporting GIN indexes for fast queries. However, for structured data that you query frequently, normalizing into proper columns and tables gives better performance.