CockroachDB supports importing data from .sql files and from some .csv files.

We're also working to develop more robust ways to import data, such as from PostgreSQL and from backups stored on cloud hosting providers.

Import from SQL File

To import data into your cluster, you can execute batches of INSERT statements stored in a .sql file from the command line.

$ cockroach sql --database=[database name] < statements.sql
For the best performance, group rows into batches so that each INSERT statement includes approximately 500 rows.
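For example, a statements.sql file might batch many rows into a single multi-row INSERT; the accounts table and its columns here are hypothetical:

```sql
-- One multi-row INSERT per batch; around 500 rows per statement performs best.
INSERT INTO accounts (id, balance) VALUES
  (1, 1000.50),
  (2, 250.00),
  (3, 314.15);
```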

Import from CSV

You can import numeric data stored in .csv files by executing a bash script that reads values from the files and uses them in INSERT statements.

To import non-numerical data, convert the .csv file to a .sql file (free conversion software is available online), and then import the .sql file.
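If your text values are simple, a variation of the same shell loop can quote them inline. This is only a minimal sketch: names.csv, the people table, and its columns are hypothetical, and wrapping each field in single quotes breaks on values that themselves contain commas or quotes, which is why a dedicated CSV-to-SQL converter is safer in general.

```shell
# Hypothetical sample data with two text columns.
printf 'Ada,Lovelace\nAlan,Turing\n' > names.csv

# Naive sketch: wrap each field in single quotes and write the
# generated statements to a file for later import.
while IFS="," read first last; do
  echo "INSERT INTO people (first_name, last_name) VALUES ('$first', '$last');"
done < names.csv > people.sql
```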


This template reads 3 columns of numerical data and converts each row into an INSERT statement; you can adapt the variables (a, b, c) to any number of columns.

> \| IFS=","; while read a b c; do echo "INSERT INTO csv VALUES ($a, $b, $c);"; done < test.csv;


In this SQL shell example, use \! to look at the rows in a CSV file before creating a table, and then use \| to insert those rows into the table.

> \! cat test.csv
12, 13, 14
10, 20, 30
> CREATE TABLE csv (x INT, y INT, z INT);

> \| IFS=","; while read a b c; do echo "INSERT INTO csv VALUES ($a, $b, $c);"; done < test.csv;

> SELECT * FROM csv;
+----+----+----+
| x  | y  | z  |
+----+----+----+
| 12 | 13 | 14 |
| 10 | 20 | 30 |
+----+----+----+
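The same loop can also run outside the SQL shell. A sketch, assuming the sample test.csv from the example above and a hypothetical database named test: generate the statements into a file once, then import that file from the command line.

```shell
# Recreate the sample file from the example above.
printf '12, 13, 14\n10, 20, 30\n' > test.csv

IFS=", "   # including the space in IFS strips the padding around each value
while read a b c; do
  echo "INSERT INTO csv VALUES ($a, $b, $c);"
done < test.csv > import.sql

# The generated file can then be imported from the command line
# ("test" is a hypothetical database name):
# cockroach sql --database=test < import.sql
```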
