CSV to SQL Converter
Convert CSV data to SQL CREATE TABLE and INSERT statements. Auto-detects column types with configurable options.
How to Use
- Paste your CSV data into the input panel. The first row must contain column headers.
- Set the table name for your SQL output (default: my_table).
- Choose the output mode: CREATE TABLE + INSERT (both), INSERT only, or CREATE TABLE only.
- Select the CSV delimiter (comma, tab, semicolon, or pipe) and quote character.
- Toggle Infer types to auto-detect column data types or default all columns to TEXT.
- Click Copy to copy the generated SQL to your clipboard.
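For example, pasting this small CSV with the default settings (table name my_table, comma delimiter, type inference on) produces SQL along these lines. The output shown is illustrative; the tool's exact formatting may differ slightly:

id,name,price,in_stock
1,Widget,9.99,true
2,Gadget,24.50,false

-- Generated schema and data (illustrative):
CREATE TABLE my_table (
  id INTEGER,
  name TEXT,
  price REAL,
  in_stock BOOLEAN
);

INSERT INTO my_table (id, name, price, in_stock) VALUES
  (1, 'Widget', 9.99, true),
  (2, 'Gadget', 24.50, false);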
Why Convert CSV to SQL?
CSV (Comma-Separated Values) is the most common format for tabular data exchange. Spreadsheets, data exports, and ETL pipelines all produce CSV files. However, importing CSV into a relational database requires SQL statements — specifically CREATE TABLE to define the schema and INSERT INTO to load the data. Manually writing these statements for large datasets is tedious and error-prone.
This tool automates the entire conversion process. Paste your CSV, configure your options, and get production-ready SQL in seconds. The generated SQL is compatible with MySQL, PostgreSQL, SQLite, SQL Server, and any ANSI SQL-compliant database.
Understanding the Generated SQL
CREATE TABLE Statements
The CREATE TABLE statement defines your database table schema. Each CSV column header becomes a column name, sanitized to contain only letters, numbers, and underscores. When type inference is enabled, the tool examines every value in each column to determine the most appropriate SQL type:
- INTEGER — assigned when all non-empty values are whole numbers (e.g., 42, -7, 0)
- REAL — assigned when values contain decimal points (e.g., 3.14, -0.5)
- BOOLEAN — assigned when all values are true or false (case-insensitive)
- TEXT — the default type for string data and mixed-type columns
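To see these rules together, consider this hypothetical input and the schema it would infer (a sketch; the tool's exact output may differ):

quantity,rating,active,sku
10,4.5,true,A-100
,3.0,FALSE,B-200

CREATE TABLE my_table (
  quantity INTEGER, -- empty value ignored; all non-empty values are whole numbers
  rating REAL,      -- values contain decimal points
  active BOOLEAN,   -- only true/false, case-insensitive
  sku TEXT          -- letters and digits, so the default applies
);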
INSERT Statements
INSERT statements add rows to your table. You can generate them in two formats:
- Multi-row VALUES — a single INSERT INTO ... VALUES statement with multiple value tuples, separated by commas. This is more efficient for bulk loading and is supported by MySQL, PostgreSQL, and SQLite.
- Separate INSERT statements — one INSERT INTO per row. This is more portable across SQL dialects and easier to debug when individual rows fail.
Empty CSV values are converted to NULL, which correctly represents missing data in SQL. String values containing single quotes are escaped using the SQL convention of doubling them ('').
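As a sketch of the two formats for the same three rows (table and column names are illustrative), note the NULL for an empty value and the doubled quote in O''Brien:

-- Multi-row VALUES: one statement, many tuples
INSERT INTO my_table (id, name) VALUES
  (1, 'Alice'),
  (2, 'O''Brien'),
  (3, NULL);

-- Separate INSERT statements: one per row
INSERT INTO my_table (id, name) VALUES (1, 'Alice');
INSERT INTO my_table (id, name) VALUES (2, 'O''Brien');
INSERT INTO my_table (id, name) VALUES (3, NULL);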
Working with CSV Delimiters
While commas are the standard delimiter, real-world CSV files use various separators. European locales often use semicolons because commas serve as decimal separators. Tab-separated values (TSV) are common in bioinformatics and data science exports. Pipe-delimited files appear in legacy systems and mainframe data exports. This tool supports all four common delimiters.
Quoted fields allow the delimiter character to appear within values. For example, with double-quote quoting and comma delimiter, the value "New York, NY" is treated as a single field containing New York, NY. Escaped quotes within quoted fields (doubled quote characters) are handled correctly.
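For example, with a comma delimiter and double-quote quoting, this row (hypothetical values) parses into two fields, and the doubled quotes become a literal quote character in the output:

name,location
"Acme ""West"" LLC","New York, NY"

-- Resulting row (illustrative):
INSERT INTO my_table (name, location) VALUES ('Acme "West" LLC', 'New York, NY');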
Database Import Alternatives
For very large CSV files (millions of rows), using generated INSERT statements may be slow. Most databases provide native bulk-import commands that are significantly faster:
- MySQL — LOAD DATA INFILE '/path/to/file.csv' INTO TABLE my_table
- PostgreSQL — COPY my_table FROM '/path/to/file.csv' WITH CSV HEADER
- SQLite — .import --csv file.csv my_table
- SQL Server — BULK INSERT my_table FROM '/path/to/file.csv'
These native commands bypass the SQL parser and load data directly, making them 10-100x faster than executing individual INSERT statements. Use this tool for small-to-medium datasets, schema exploration, or when you need portable SQL scripts that work across different database systems.
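A common hybrid workflow, sketched here for PostgreSQL with illustrative column names: paste a small sample of the file into this tool to generate the schema, then bulk-load the complete file natively:

-- Schema generated by this tool from a sample of the CSV:
CREATE TABLE my_table (
  id INTEGER,
  name TEXT,
  price REAL
);

-- Native bulk load of the full file (the database server must be able to read this path):
COPY my_table FROM '/path/to/file.csv' WITH CSV HEADER;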
Common Use Cases
- Database seeding — generate SQL scripts to populate development or test databases with sample data from spreadsheets
- Data migration — convert exported CSV data from one system into SQL for import into another relational database
- Schema design — quickly generate a CREATE TABLE statement from a CSV to establish table structure, then refine column types and constraints
- Learning SQL — convert familiar spreadsheet data into SQL to practice queries on real data
SQL Safety and Escaping
The generated SQL escapes all string values by doubling single quotes — the standard SQL escape mechanism. The value O'Brien becomes 'O''Brien' in the output. Column names are sanitized to contain only alphanumeric characters and underscores, preventing SQL injection in the schema definition. While the generated SQL is safe for direct execution, always review the output before running it against a production database.
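As an illustration with hypothetical input, a header of First Name and a value of O'Brien would come through roughly as follows (the exact sanitized identifier depends on the tool's scheme):

CREATE TABLE my_table (
  First_Name TEXT -- space replaced with an underscore
);

INSERT INTO my_table (First_Name) VALUES ('O''Brien'); -- single quote doubled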
Related Tools
View and sort CSV data with the CSV Viewer. Format and beautify SQL queries with the SQL Formatter. Convert JSON arrays to CSV with the JSON to CSV Converter. Convert SQL schemas to NoSQL formats with the SQL to NoSQL Converter. Compare CSV file versions with the Diff Checker. Format JSON data with the JSON Formatter.
Frequently Asked Questions
- What CSV format does this tool expect?
- The tool expects CSV data where the first row contains column headers and each subsequent row contains data values. Headers become SQL column names. Supported delimiters include comma, tab, semicolon, and pipe.
- How does the tool infer column types?
- When "Infer types" is enabled, the tool analyzes all values in each column. If every non-empty value is an integer, the column is typed as INTEGER. If values contain decimals, it uses REAL. If all values are "true" or "false", it uses BOOLEAN. Everything else defaults to TEXT.
- How are empty CSV values handled?
- Empty values in the CSV are converted to NULL in the generated SQL statements. This is the standard SQL approach for representing missing or unknown data.
- What is the difference between single and multi-row INSERT?
- Multi-row INSERT combines all rows into a single INSERT statement with multiple VALUES tuples, which is more efficient for bulk imports. Single INSERT generates a separate INSERT INTO statement for each row, which can be easier to debug and is compatible with older SQL databases that do not support multi-row INSERT.
- How are special characters in values handled?
- Single quotes in data values are escaped by doubling them (e.g., O'Brien becomes O''Brien) following SQL string literal conventions. Column names are sanitized to contain only letters, numbers, and underscores.
- Which SQL dialects are supported?
- The generated SQL uses standard ANSI SQL syntax compatible with MySQL, PostgreSQL, SQLite, SQL Server, and most relational databases. The CREATE TABLE and INSERT statements follow universal conventions.
- Is my data sent to a server?
- No. All CSV parsing and SQL generation happens entirely in your browser. No data is transmitted to any server, making it safe to use with sensitive or proprietary data.
- Can I convert CSV files exported from Excel or Google Sheets?
- Yes. Excel and Google Sheets export CSV files using comma delimiters by default. Simply paste the CSV content or copy it from your spreadsheet. For European locales that use semicolons as delimiters, switch the delimiter option to Semicolon.
Use this tool from AI agents
The CodeTidy MCP Server lets Claude, Cursor, and other AI agents use this tool and 46 others directly. One command: npx @codetidy/mcp