CSV to SQL Converter: Generate INSERT Statements from CSV
Learn how to convert CSV files into SQL CREATE TABLE and INSERT statements for MySQL, PostgreSQL, SQL Server, and SQLite.
Why Convert CSV to SQL?
CSV files are the universal data exchange format. Every spreadsheet, analytics tool, and data export can produce CSV. But to query, join, filter, and aggregate that data efficiently, you need it in a relational database. Converting CSV to SQL lets you take advantage of indexes, joins, transactions, and the full power of SQL.
Common scenarios include importing customer lists from a CRM export, loading product catalogs from suppliers, seeding a development database with test data, and migrating data between systems that only share CSV as a common format.
The conversion involves two steps: creating the table schema (CREATE TABLE) and generating the data insertion statements (INSERT INTO). This guide covers both, with database-specific syntax for MySQL, PostgreSQL, SQL Server, and SQLite.
Key Differences by Database
| Feature | MySQL | PostgreSQL | SQL Server | SQLite |
|---|---|---|---|---|
| Bulk import | LOAD DATA INFILE | COPY ... FROM | BULK INSERT | .import |
| String quoting | Single quotes | Single quotes | Single quotes | Single quotes |
| Escape quotes | \' or '' | '' | '' | '' |
| Auto-increment | AUTO_INCREMENT | SERIAL | IDENTITY(1,1) | AUTOINCREMENT |
| Boolean | TINYINT(1) | BOOLEAN | BIT | INTEGER |
| Datetime type | DATETIME | TIMESTAMP | DATETIME2 | TEXT |
| Multi-row INSERT | Yes | Yes | Yes (max 1000) | Yes (3.7.11+) |
| NULL handling | \N in LOAD DATA | Unquoted empty field in COPY (CSV mode) | Empty field | Empty field |
Common Conversions with Examples
Basic CSV to INSERT
Given this CSV data:
name,email,age,active
Alice Johnson,alice@example.com,32,true
Bob Smith,bob@example.com,28,false
Carol Williams,carol@example.com,45,true
The MySQL INSERT statements would be:
CREATE TABLE users (
id INT AUTO_INCREMENT PRIMARY KEY,
name VARCHAR(100) NOT NULL,
email VARCHAR(255) NOT NULL,
age INT,
active TINYINT(1) DEFAULT 1
);
INSERT INTO users (name, email, age, active) VALUES
('Alice Johnson', 'alice@example.com', 32, 1),
('Bob Smith', 'bob@example.com', 28, 0),
('Carol Williams', 'carol@example.com', 45, 1);
For PostgreSQL:
CREATE TABLE users (
id SERIAL PRIMARY KEY,
name VARCHAR(100) NOT NULL,
email VARCHAR(255) NOT NULL,
age INTEGER,
active BOOLEAN DEFAULT TRUE
);
INSERT INTO users (name, email, age, active) VALUES
('Alice Johnson', 'alice@example.com', 32, TRUE),
('Bob Smith', 'bob@example.com', 28, FALSE),
('Carol Williams', 'carol@example.com', 45, TRUE);
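The row-to-INSERT mapping shown above can be automated with a short script. This is a minimal sketch (the `users` table and column names come from the example; the type handling is deliberately simplified):

```python
import csv
import io

def rows_to_insert(table, csv_text):
    """Convert CSV text into a single multi-row INSERT statement."""
    reader = csv.reader(io.StringIO(csv_text))
    headers = next(reader)
    tuples = []
    for row in reader:
        values = []
        for v in row:
            if v.lower() in ("true", "false"):        # boolean -> 1/0 (MySQL style)
                values.append("1" if v.lower() == "true" else "0")
            elif v.lstrip("-").isdigit():             # integer: emit unquoted
                values.append(v)
            else:                                     # string: escape single quotes
                values.append("'" + v.replace("'", "''") + "'")
        tuples.append("(" + ", ".join(values) + ")")
    cols = ", ".join(headers)
    return f"INSERT INTO {table} ({cols}) VALUES\n" + ",\n".join(tuples) + ";"

csv_data = """name,email,age,active
Alice Johnson,alice@example.com,32,true
Bob Smith,bob@example.com,28,false"""

print(rows_to_insert("users", csv_data))
```

A real converter would also handle NULLs, dates, and decimals, covered in the sections below.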
Handling Special Characters
CSV values containing commas, quotes, or newlines require careful escaping in SQL:
company,address,notes
"O'Brien & Co","123 Main St, Suite 4","Line 1
Line 2"
INSERT INTO companies (company, address, notes) VALUES
('O''Brien & Co', '123 Main St, Suite 4', 'Line 1
Line 2');
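The escaping rule is small enough to capture in one helper. As a sketch: once a value is inside single quotes, only embedded single quotes need treatment (doubling); commas and newlines can pass through unchanged:

```python
def sql_quote(value):
    """Escape a CSV string value for a SQL string literal:
    double any embedded single quotes, then wrap in quotes.
    Commas and newlines need no extra escaping inside the literal."""
    return "'" + value.replace("'", "''") + "'"

print(sql_quote("O'Brien & Co"))           # 'O''Brien & Co'
print(sql_quote("123 Main St, Suite 4"))   # '123 Main St, Suite 4'
```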
Handling NULL Values
Empty CSV fields should become SQL NULL, not empty strings:
name,email,phone
Alice,alice@example.com,555-0100
Bob,bob@example.com,
INSERT INTO contacts (name, email, phone) VALUES
('Alice', 'alice@example.com', '555-0100'),
('Bob', 'bob@example.com', NULL);
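A converter needs to check for the empty field before quoting, since `''` and `NULL` are different values in SQL. A minimal sketch:

```python
def sql_value(field):
    """Map an empty CSV field to the SQL keyword NULL (unquoted);
    quote and escape everything else."""
    if field == "":
        return "NULL"
    return "'" + field.replace("'", "''") + "'"

row = ["Bob", "bob@example.com", ""]
print("(" + ", ".join(sql_value(f) for f in row) + ")")
# ('Bob', 'bob@example.com', NULL)
```

Note that quoting the empty field would silently store an empty string instead, which breaks `IS NULL` queries later.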
Date and Timestamp Columns
order_id,customer,order_date,total
1001,Alice,2026-03-01,149.99
1002,Bob,2026-03-05,89.50
-- MySQL
INSERT INTO orders (order_id, customer, order_date, total) VALUES
(1001, 'Alice', '2026-03-01', 149.99),
(1002, 'Bob', '2026-03-05', 89.50);
-- PostgreSQL (same syntax; the ::DATE cast is optional for ISO-format dates)
INSERT INTO orders (order_id, customer, order_date, total) VALUES
(1001, 'Alice', '2026-03-01'::DATE, 149.99),
(1002, 'Bob', '2026-03-05'::DATE, 89.50);
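Date columns are a common source of silent import errors, so it is worth validating the format before generating SQL. A hedged sketch using the standard library:

```python
from datetime import datetime

def is_iso_date(value):
    """Check that a CSV field matches the YYYY-MM-DD format
    that DATE columns expect in all four databases."""
    try:
        datetime.strptime(value, "%Y-%m-%d")
        return True
    except ValueError:
        return False

print(is_iso_date("2026-03-01"))   # True
print(is_iso_date("03/01/2026"))   # False: US-style dates need conversion first
```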
Bulk Import (Large Files)
For CSV files with thousands of rows, use native bulk import instead of individual INSERT statements:
-- MySQL
LOAD DATA INFILE '/path/to/data.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
-- PostgreSQL
COPY users (name, email, age, active)
FROM '/path/to/data.csv'
WITH (FORMAT csv, HEADER true);
-- SQL Server
BULK INSERT users
FROM '/path/to/data.csv'
WITH (
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
FIRSTROW = 2
);
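When the database server cannot read a file path directly (managed hosting, restricted permissions), batched parameterized inserts are a practical middle ground between one-row INSERTs and native bulk import. A sketch using SQLite's built-in Python driver:

```python
import csv
import io
import sqlite3

# In-memory demo database; a real import would connect to your server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT, age INTEGER)")

csv_text = """name,email,age
Alice,alice@example.com,32
Bob,bob@example.com,28"""

reader = csv.DictReader(io.StringIO(csv_text))
rows = [(r["name"], r["email"], int(r["age"])) for r in reader]

# executemany sends the rows as one batch; placeholders make manual
# quoting/escaping unnecessary and rule out SQL injection.
conn.executemany("INSERT INTO users (name, email, age) VALUES (?, ?, ?)", rows)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # 2
```

The same pattern works with other drivers (psycopg2, mysql-connector), though placeholder syntax varies by library.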
Automatic Type Detection
When converting CSV to a CREATE TABLE statement, scan the data to determine the best SQL type for each column:
| CSV Pattern | Recommended SQL Type |
|---|---|
| All integers (1, 42, -7) | INT or BIGINT |
| Decimal numbers (3.14, 99.99) | DECIMAL(p,s) or FLOAT |
| true/false, yes/no, 1/0 | BOOLEAN or TINYINT(1) |
| YYYY-MM-DD | DATE |
| YYYY-MM-DD HH:MM:SS | DATETIME or TIMESTAMP |
| Everything else | VARCHAR(n) where n = max length |
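The pattern table above translates directly into an inference function. This is a simplified sketch (real converters also sample large files and handle mixed columns):

```python
import re

def infer_sql_type(values):
    """Infer a SQL column type from a column's string values,
    checking patterns in order from most to least specific."""
    non_empty = [v for v in values if v != ""]
    if all(re.fullmatch(r"-?\d+", v) for v in non_empty):
        return "INT"
    if all(re.fullmatch(r"-?\d+\.\d+", v) for v in non_empty):
        return "DECIMAL(10,2)"
    if all(v.lower() in ("true", "false", "yes", "no") for v in non_empty):
        return "BOOLEAN"
    if all(re.fullmatch(r"\d{4}-\d{2}-\d{2}", v) for v in non_empty):
        return "DATE"
    width = max((len(v) for v in non_empty), default=1)
    return f"VARCHAR({width})"

print(infer_sql_type(["1", "42", "-7"]))      # INT
print(infer_sql_type(["3.14", "99.99"]))      # DECIMAL(10,2)
print(infer_sql_type(["2026-03-01"]))         # DATE
```

Note the ordering matters: `1`/`0` columns match the integer pattern first, so a converter that prefers BOOLEAN for those would need to check that case earlier.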
Data Type Mapping by Database
| CSV Data | MySQL | PostgreSQL | SQL Server |
|---|---|---|---|
| Short text (<=255) | VARCHAR(255) | VARCHAR(255) | NVARCHAR(255) |
| Long text | TEXT | TEXT | NVARCHAR(MAX) |
| Integer | INT | INTEGER | INT |
| Large integer | BIGINT | BIGINT | BIGINT |
| Decimal | DECIMAL(10,2) | NUMERIC(10,2) | DECIMAL(10,2) |
| Boolean | TINYINT(1) | BOOLEAN | BIT |
| Date | DATE | DATE | DATE |
| Datetime | DATETIME | TIMESTAMP | DATETIME2 |
| Money | DECIMAL(12,2) | NUMERIC(12,2) | MONEY |
Step-by-Step Conversion Guide
- Inspect the CSV. Open the file and check the delimiter (comma, semicolon, tab), whether values are quoted, and the encoding (UTF-8 is safest for SQL import).
- Design the table schema. Map each CSV column header to a SQL column. Determine the data type by scanning values in that column. Add a primary key column if the CSV does not have one.
- Create the table. Write and run the CREATE TABLE statement with the chosen types and constraints (NOT NULL, DEFAULT, UNIQUE).
- Generate INSERT statements. For small files (under 1000 rows), generate multi-row INSERT statements. For larger files, use the database's bulk import command (LOAD DATA, COPY, BULK INSERT).
- Handle edge cases. Escape single quotes in string values, convert empty fields to NULL, and validate date formats match what the database expects.
- Verify the import. Run SELECT COUNT(*) FROM table_name and compare against the CSV row count. Spot-check a few rows to ensure data integrity.
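The verification step needs the CSV-side row count, which is easy to get wrong if you forget the header line. A small sketch:

```python
import csv
import io

def csv_row_count(csv_text):
    """Count data rows (excluding the header) to compare
    against SELECT COUNT(*) after the import."""
    reader = csv.reader(io.StringIO(csv_text))
    next(reader)  # skip header row
    return sum(1 for _ in reader)

csv_text = """name,email
Alice,alice@example.com
Bob,bob@example.com"""

print(csv_row_count(csv_text))  # 2
```

Using a CSV-aware reader rather than counting newlines matters here: quoted fields can contain embedded newlines, as shown in the special-characters example earlier.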
For related topics, see our guide on Excel to SQL conversion and SQL fundamentals.
Convert Automatically with AI2SQL
Instead of manually writing CREATE TABLE and INSERT statements, describe your CSV structure to AI2SQL and it generates the correct SQL for your target database. Just say "create a table for customer data with name, email, signup date, and subscription tier" and AI2SQL produces the schema with the right types.
Try AI2SQL free and generate your first SQL from a data description in seconds. See also: JSON to SQL converter. Browse all converter tools.
Frequently Asked Questions
How do I convert a CSV file to SQL INSERT statements?
Map each CSV header to a column name and each row to an INSERT statement. For example, a CSV with columns name,email,age becomes INSERT INTO table_name (name, email, age) VALUES ('John', 'john@example.com', 30). You can automate this with AI2SQL or scripting tools. For large files, use bulk import tools like MySQL's LOAD DATA INFILE or PostgreSQL's COPY command.
What SQL data types should I use for CSV columns?
CSV files are untyped, so you need to infer types from the data. Columns with only numbers map to INT or DECIMAL. Date-formatted values map to DATE or DATETIME. Everything else defaults to VARCHAR with a length matching the longest value. Boolean-like columns (true/false, yes/no, 1/0) can use BOOLEAN or TINYINT(1).
How do I handle special characters in CSV to SQL conversion?
Single quotes in string values must be escaped by doubling them: O'Brien becomes O''Brien. Backslashes may need escaping depending on the database. NULL values in CSV (empty fields) should be converted to the SQL keyword NULL without quotes. Use parameterized queries or prepared statements when possible to avoid SQL injection risks.
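To illustrate the parameterized-query advice above, here is a minimal sketch with SQLite's built-in driver: the placeholder carries the embedded quote safely, with no manual doubling and no injection risk:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE companies (name TEXT)")

# The driver handles the embedded single quote; no escaping needed.
conn.execute("INSERT INTO companies (name) VALUES (?)", ("O'Brien & Co",))

print(conn.execute("SELECT name FROM companies").fetchone()[0])  # O'Brien & Co
```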