Migrate from a CockroachCloud Free (beta) Cluster to a CockroachCloud Cluster

This page has instructions for migrating data from a CockroachCloud Free (beta) cluster to a paid CockroachCloud cluster by exporting to CSV and using IMPORT. You may want to migrate to the paid version of CockroachCloud if:

  • You reach (or are close to reaching) the CockroachCloud Free (beta) usage limit of 1 vCPU and 5 GB of storage per cluster.
  • You want to use any of the capabilities that are not yet available in CockroachCloud Free (beta) clusters, like the ability to enable backups or to upload data with the IMPORT command.

The steps below use sample data from the tpcc workload.

Before you start

These instructions assume you already have the following:

Step 1. Export data to a local CSV file

In CockroachCloud Free (beta) clusters, all external service integrations are disabled. This means that if you want to export data, you need to use cockroach sql --execute to query the data you want to export, and then pipe the data to a local file. For example:

$ cockroach sql \
--url 'postgres://<username>:<password>@free-tier.<region>.cockroachlabs.cloud:26257?sslmode=verify-full&sslrootcert=<path/to/certs_dir>/cc-ca.crt&options=--cluster=<cluster_name>' \
--execute "SELECT * FROM tpcc.warehouse" --format=csv > /Users/<username>/<path/to/file>/warehouse.csv

The example command above exports the following data to the local warehouse.csv file:


Repeat this step for each table you want to migrate. For example, let's export one more table (district) from the tpcc database:

$ cockroach sql \
--url 'postgres://<username>:<password>@free-tier.<region>.cockroachlabs.cloud:26257?sslmode=verify-full&sslrootcert=<path/to/certs_dir>/cc-ca.crt&options=--cluster=<cluster_name>' \
--execute "SELECT * FROM tpcc.district" --format=csv > /Users/<username>/<path/to/file>/district.csv

This will create the district.csv file with the following data:


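If you have several tables to migrate, the per-table command can be wrapped in a simple shell loop. This is only a sketch: the connection URL placeholders are the same ones shown in the commands above and must be filled in for your cluster.

```shell
# Hypothetical loop over the two tpcc tables exported above.
# Each iteration runs the same SELECT-to-CSV export as the manual commands.
$ for table in warehouse district; do
    cockroach sql \
      --url 'postgres://<username>:<password>@free-tier.<region>.cockroachlabs.cloud:26257?sslmode=verify-full&sslrootcert=<path/to/certs_dir>/cc-ca.crt&options=--cluster=<cluster_name>' \
      --execute "SELECT * FROM tpcc.${table}" --format=csv > "${table}.csv"
  done
```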
Step 2. Host the files where the CockroachCloud cluster can access them

After you've exported your CockroachCloud Free (beta) cluster data to your local machine, you need to upload the files to a storage location where the paid CockroachCloud cluster can access them. We recommend using cloud storage or userfile.

In this example, we'll use Amazon S3 to host the two files (warehouse.csv and district.csv) created in Step 1.
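For instance, with the AWS CLI installed and credentials configured, the two CSV files could be copied to a bucket like this (the bucket name and local paths are placeholders):

```shell
# Upload the exported CSV files to an S3 bucket so the
# CockroachCloud cluster's IMPORT statement can read them.
$ aws s3 cp /Users/<username>/<path/to/file>/warehouse.csv s3://<bucket_name>/warehouse.csv
$ aws s3 cp /Users/<username>/<path/to/file>/district.csv s3://<bucket_name>/district.csv
```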

Step 3. Import the CSV


For best practices for optimizing import performance in CockroachDB, see Import Performance Best Practices.

  1. Create the database you want to import the tables into. For example:
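    To keep the same database name used in Step 1, this could be:

```sql
> CREATE DATABASE IF NOT EXISTS tpcc;
```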

  2. Write an IMPORT statement that matches the schema of the table data you're importing.


    You can use the SHOW CREATE TABLE statement in the CockroachCloud Free (beta) cluster to view the CREATE statement for the table you're migrating.
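    For example, to view the CREATE statement for the warehouse table, run the following in a SQL shell connected to the CockroachCloud Free (beta) cluster:

```sql
> SHOW CREATE TABLE tpcc.warehouse;
```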


    The column order in your schema must match the column order in the file being imported.

    For example, to import the data from warehouse.csv into a warehouse table, use the following statement:

    > IMPORT TABLE tpcc.warehouse (
        w_id INT8 NOT NULL,
        w_name VARCHAR(10) NULL,
        w_street_1 VARCHAR(20) NULL,
        w_street_2 VARCHAR(20) NULL,
        w_city VARCHAR(20) NULL,
        w_state CHAR(2) NULL,
        w_zip CHAR(9) NULL,
        w_tax DECIMAL(4,4) NULL,
        w_ytd DECIMAL(12,2) NULL,
        CONSTRAINT "primary" PRIMARY KEY (w_id ASC),
        FAMILY "primary" (w_id, w_name, w_street_1, w_street_2, w_city, w_state, w_zip, w_tax, w_ytd)
      ) CSV DATA ('s3://<bucket_name>/warehouse.csv?AWS_ACCESS_KEY_ID=<access_key>&AWS_SECRET_ACCESS_KEY=<secret_key>')
        WITH skip = '1';

    Notice that we used the skip option in the above command. This is because the first line of the CSV file we created in Step 1 is the header row, not actual data to import. For more information about the options available for IMPORT ... CSV, see Import options.

            job_id       |  status   | fraction_completed | rows | index_entries | bytes
      652283814057476097 | succeeded |                  1 |    1 |             0 |    53
    (1 row)

    To import data into an existing table, use IMPORT INTO.
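    As a sketch, importing the same warehouse.csv into an already-created warehouse table would look something like the following (with the same placeholder S3 credentials as above):

```sql
-- Hypothetical IMPORT INTO for an existing tpcc.warehouse table;
-- the column list must match the CSV column order.
> IMPORT INTO tpcc.warehouse (w_id, w_name, w_street_1, w_street_2, w_city, w_state, w_zip, w_tax, w_ytd)
    CSV DATA ('s3://<bucket_name>/warehouse.csv?AWS_ACCESS_KEY_ID=<access_key>&AWS_SECRET_ACCESS_KEY=<secret_key>')
    WITH skip = '1';
```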

  3. Repeat the above for each CSV file you want to import. For example, let's import the second file (district.csv) we created earlier:

    > IMPORT TABLE tpcc.district (
        d_id INT8 NOT NULL,
        d_w_id INT8 NOT NULL,
        d_name VARCHAR(10) NULL,
        d_street_1 VARCHAR(20) NULL,
        d_street_2 VARCHAR(20) NULL,
        d_city VARCHAR(20) NULL,
        d_state CHAR(2) NULL,
        d_zip CHAR(9) NULL,
        d_tax DECIMAL(4,4) NULL,
        d_ytd DECIMAL(12,2) NULL,
        d_next_o_id INT8 NULL,
        CONSTRAINT "primary" PRIMARY KEY (d_w_id ASC, d_id ASC),
        FAMILY "primary" (d_id, d_w_id, d_name, d_street_1, d_street_2, d_city, d_state, d_zip, d_tax, d_ytd, d_next_o_id)
      ) CSV DATA ('s3://<bucket_name>/district.csv?AWS_ACCESS_KEY_ID=<access_key>&AWS_SECRET_ACCESS_KEY=<secret_key>')
        WITH skip = '1';

            job_id       |  status   | fraction_completed | rows | index_entries | bytes
      652285202857820161 | succeeded |                  1 |   10 |             0 |  1017
    (1 row)

  4. (Optional) To verify that the two tables were imported, use SHOW TABLES:

    > SHOW TABLES FROM tpcc;
      schema_name | table_name | type  | owner  | estimated_row_count
      public      | district   | table | lauren |                  10
      public      | warehouse  | table | lauren |                   1
    (2 rows)

Step 4. Add any foreign key relationships

Once all of the tables you want to migrate have been imported into the CockroachCloud cluster, add the foreign key relationships. To do this, use ALTER TABLE ... ADD CONSTRAINT. For example:

> ALTER TABLE tpcc.district ADD CONSTRAINT fk_d_w_id_ref_warehouse FOREIGN KEY (d_w_id) REFERENCES tpcc.warehouse(w_id);

See also
