Migrate from Shapefiles

As of May 16, 2023, CockroachDB v21.2 is no longer supported. For more details, refer to the Release Support Policy.

CockroachDB supports efficiently storing and querying spatial data.

This page has instructions for migrating data from ESRI Shapefiles into CockroachDB using shp2pgsql and IMPORT.


We use shp2pgsql in the example below, but ogr2ogr can also be used. For example:

ogr2ogr -f PGDUMP file.sql -lco LAUNDER=NO -lco DROP_TABLE=OFF file.shp

ogr2ogr version 3.1.0 or higher is required to generate data that can be imported into CockroachDB.

In the example below, we import a tornado data set that is available from the US National Weather Service (NWS).


Please refer to the documentation of your GIS software for instructions on exporting GIS data to Shapefiles.

Before You Begin

To follow along with the example below, you will need the following:

- A running CockroachDB cluster.
- The shp2pgsql tool (shipped with PostGIS).
- Python 3, to serve the converted file over HTTP.
- wget and unzip, to download and extract the data set.

Step 1. Download the Shapefile data

First, download and unzip the tornado data:

wget http://web.archive.org/web/20201018170120/https://www.spc.noaa.gov/gis/svrgis/zipped/1950-2018-torn-initpoint.zip
unzip 1950-2018-torn-initpoint.zip
cd 1950-2018-torn-initpoint/

Step 2. Convert the Shapefile data to SQL

To load the tornado Shapefile into CockroachDB, we must first convert it to SQL using the shp2pgsql tool:

shp2pgsql 1950-2018-torn-initpoint.shp > tornado-points.sql
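Before serving the file, it can help to confirm the conversion produced PostgreSQL-compatible SQL. A quick, optional check (assuming shp2pgsql's default output of a CREATE TABLE statement followed by INSERT statements):

```shell
# Optional sanity check: the generated file should begin with a
# CREATE TABLE statement followed by INSERT statements.
head -n 20 tornado-points.sql

# Count the INSERT statements; this should roughly match the number
# of features in the Shapefile.
grep -c "INSERT INTO" tornado-points.sql
```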

Step 3. Host the files where the cluster can access them

Each node in the CockroachDB cluster needs to have access to the files being imported. There are several ways for the cluster to access the data; for a complete list of the types of storage IMPORT can pull from, see import file locations.
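For example, as an alternative to running a local file server, you could upload the converted file to the cluster's nodelocal storage (the --insecure flag here is an assumption matching this example's cluster setup; adjust for your deployment):

```shell
# Upload tornado-points.sql to the cluster's nodelocal storage. The
# file can then be imported with:
#   IMPORT PGDUMP ('nodelocal://1/tornado-points.sql') ...
cockroach nodelocal upload tornado-points.sql tornado-points.sql --insecure
```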

For local testing, you can start a local file server. The following command starts a file server listening on port 3000; run it from the directory containing tornado-points.sql:

python3 -m http.server 3000
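As a quick check (from another terminal), you can confirm the server is reachable and serving the converted file before running the import:

```shell
# Fetch the first few lines of the hosted file; this should print the
# beginning of the SQL generated by shp2pgsql in Step 2.
curl -s http://localhost:3000/tornado-points.sql | head -n 5
```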

Step 4. Prepare the database

Next, create a tornadoes database to store the data in, and switch to it:

cockroach sql --insecure
CREATE DATABASE tornadoes;
USE tornadoes;

Step 5. Import the SQL

Since the file is being served from a local server and is formatted as PostgreSQL-compatible SQL, we can import the data using the following IMPORT PGDUMP statement:

IMPORT PGDUMP ('http://localhost:3000/tornado-points.sql') WITH ignore_unsupported_statements;
        job_id       |  status   | fraction_completed | rows  | index_entries |  bytes
---------------------+-----------+--------------------+-------+---------------+-----------
  584874782851497985 | succeeded |                  1 | 63645 |             0 | 18287213
(1 row)
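As a follow-up, you can confirm the table was created. Note that the table name is whatever shp2pgsql derived from the Shapefile name, so listing the tables first is the safest check:

```shell
# List the tables created by the import in the tornadoes database.
cockroach sql --insecure --database=tornadoes -e "SHOW TABLES;"
```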

See also
