1 Introduction

In this document, I explain the steps I went through to create a global network of ‘walking routes’: a PGRouting-ready network for calculating the time and distance it would take to ‘walk’ or ‘sail’ between two points on earth.

A PDF version of this document is provided here, https://www.wnvermeulen.com/empires/routing_tutorial.pdf

This project is part of a paper with Pierre-Louis Vézina and Gunes Gokmen on the imperial routes of global trade, published in the Journal of Economic Growth.

I’m an autodidact in GIS methods, and while I had some basic SQL knowledge when I started, I learned a lot about how to do things efficiently within this project. Nevertheless, I’m sure there are still many inefficient parts in my code. When coding, everyone makes a continuous trade-off between (a) quick-and-dirty coding with potentially slow execution and (b) slow and elegant coding with fast execution. Most of the time, I only go for fast solutions when the execution is unbearably long, or when I have epiphanies. With every block of code I learned a bit more, and became better able to implement efficient solutions early on, so the difference between (a) and (b) shrank over time.

Many solutions were adapted from online sources, in particular Stack Exchange. However, I did not keep track of all these sources, so references for the implementations that I used are very scarce. I’m sorry for that, since the online community has been invaluable and people deserve huge credit for offering their solutions publicly. Perhaps my full solution posted here can be a form of payback. My implementation is also quite similar to http://orbis.stanford.edu, and I probably used pieces of code, or strategies, from their implementation.

I did this in PostGIS rather than the seemingly more commonly used Python with ArcGIS or QGIS. I needed routing, and at the time I started this, the most readily available solution seemed to be PGRouting. I use QGIS to create maps and to check that spatial manipulation in PostGIS is doing what I intend, but not to actually do any ‘real’ analysis or manipulation. The only exception, as will be noted below, is that I sometimes have to add or remove individual edges based on their specific location on the globe.

1.1 Game plan

The basic sequence is as follows:

  1. create the database
  2. load relevant external data, e.g. global altitude data
  3. create a globe of dots, clipped by land
  4. connect the dots into a mesh, attaching altitude information to land-based dots
  5. calculate the distance and time-cost of walking, given altitude differences
  6. repeat the above for water, with lower resolution and with costs of traversing water based on winds
  7. calculate optimal routes for current-day cities, each to its nearest cities
  8. combine these routes into a global routing network

There is a lot of housekeeping within, and in between, these steps: removal and creation of indices, removal and recreation of (temporary) tables, etc. As with other types of coding and data projects, over 90% of the code is housekeeping.

One issue with Postgres is that it is not straightforward to use multiple threads for time-consuming operations. For instance, when you have millions of rows and a process on each row takes a few milliseconds, you’ll be waiting a long time. However, when the operation is completely independent across rows, it can easily be split between processes. I do this manually, by starting new threads in new consoles. Therefore, the script below should probably not be run in one go. Each block needs to be called separately and waited for until completion before starting the next. I looked into ways to automate the multi-processing, but found it too much of a hassle to get going, so left it out.
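As a sketch of the split-by-key-range idea (hypothetical helper names, not the setup I actually used — I simply opened several console windows), in Python it would look like this, where each worker would normally shell out to its own psql call:

```python
from concurrent.futures import ThreadPoolExecutor

def pkey_ranges(n_rows, n_workers):
    """Split [0, n_rows) into contiguous half-open ranges, one per worker."""
    step = -(-n_rows // n_workers)  # ceiling division
    return [(lo, min(lo + step, n_rows)) for lo in range(0, n_rows, step)]

def process_range(bounds):
    lo, hi = bounds
    # In practice this would run something like:
    #   psql -U wessel -d empires -c "UPDATE ... WHERE pkey >= lo AND pkey < hi"
    # Here it is a stand-in that just reports how many rows it would cover.
    return hi - lo

with ThreadPoolExecutor(max_workers=4) as pool:
    covered = sum(pool.map(process_range, pkey_ranges(1000000, 4)))

print(covered)  # every row is covered exactly once
```

The point of the range split is that each worker touches a disjoint set of rows, so the updates never conflict.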

2 Prerequisites

You’ll need a working PostgreSQL installation with the PostGIS and PGRouting extensions. I followed this person’s instructions: http://www.kyngchaos.com/software/postgres/

You need a large and fast hard disk for this. I coded everything on a 2014 iMac with 16GB of RAM, but had to buy a 512GB SSD to put the database on. An SSD because otherwise things take far too long, and an additional drive because the global altitude raster in particular is space-intensive. When I interact with the system, e.g. when loading data into the database, I only show the code that I used on my system. A Windows version would probably be similar.

My database runs on my local machine, not online. I use a standard set of username/password, and the database is called ‘empires’ after the main project for which this was done. Any paths to the local file system to load or store data are indicated as <path>, and need to be replaced with something suitable.

This tutorial is not an introduction to PostgreSQL and PostGIS. So, to really understand what’s going on, some prior knowledge is useful. Perhaps the minimum required knowledge is the following:

  • SQL commands end with ;. So when copying, this is how to recognise where commands break over lines.
  • When copying from the PDF version of this tutorial, things can go wrong quickly. While I use indentation for clarity of code, these indents should be spaces, not tabs. At least, that’s my experience.
  • PostgreSQL is not case sensitive. There is an old convention to write all commands and keywords in capitals, e.g. SELECT * FROM cities WHERE capital = TRUE, but I’m too lazy to implement this consistently, and in any decent editor syntax highlighting does more to make code readable. Still, some capitalisation remained, albeit not perfectly.
  • I cut lines to make things more readable. The character for continuation on the next line is \ in both Python and bash.

3 Setting up the database and loading data

Create the database, then log in using the postgres account. In bash:

createdb -U wessel empires
psql -U postgres

then in the postgres console:


Get out of the psql console (ctrl+d) and then log back in using your own user account, in bash:

psql -U wessel -d empires

(in Postgres) Attach the required extensions to the empires database.

CREATE EXTENSION postgis;
CREATE EXTENSION postgis_topology;
CREATE EXTENSION pgrouting;

Given the global scale of the project, we cannot use a projection without significantly distorting distance calculations later. Therefore, we use the standard EPSG:4326 coordinate reference system, which stores all data as long/lat coordinates, and apply functions where necessary that account for the spherical dimension. Take-away: load and save all external data in 4326.
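To see what those spherical functions are doing, here is a rough sketch of a great-circle distance. This is a plain haversine calculation on a sphere, whereas PostGIS’s geography type uses a spheroid by default, so its numbers differ slightly:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lon1, lat1, lon2, lat2, r=6371008.8):
    """Great-circle distance in meters between two long/lat points
    (haversine formula on a sphere with mean radius r)."""
    lon1, lat1, lon2, lat2 = map(radians, (lon1, lat1, lon2, lat2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * r * asin(sqrt(a))

# one degree of longitude is about 111 km at the equator ...
print(haversine_m(0, 0, 1, 0))
# ... but only about half that at 60 degrees north
print(haversine_m(0, 60, 1, 60))
```

This is exactly why a degree is not a unit of distance here: the same longitudinal degree shrinks towards the poles, which the grid construction below has to account for.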

3.1 Global altitude data

Global altitude data was taken from http://www.viewfinderpanoramas.org/Coverage%20map%20viewfinderpanoramas_org3.htm. I used two Python scripts to download all the tiles and load them as a raster table in the database. This is a lot of data: around 16GB in zip files, and in the database the table is (with indices) 25GB.


from lxml import html
import requests
import urllib2
from os import listdir
from os.path import isfile, join

url = ('http://www.viewfinderpanoramas.org/'
       'Coverage%20map%20viewfinderpanoramas_org3.htm')

page = requests.get(url)
tree = html.fromstring(page.text)

links = tree.xpath('/html/body/map/area/@href')

# build in a check for file existence, so the script can be re-run
mypath = '<path>/dem'
files_in_dir = [ f for f in listdir(mypath) if isfile(join(mypath,f)) ]

for link in links:
    filename = link.split('/')[-1]
    if filename not in files_in_dir:
        # these four bundles are skipped
        if filename not in ['01-15.zip', '16-30.zip',
                            '31-45.zip', '46-60.zip']:
            file = urllib2.urlopen(link)
            with open(join(mypath,filename), 'wb') as output:
                while True:
                    data = file.read(65536)
                    if data:
                        output.write(data)
                    else:
                        break

Transferring the tiles to the database:

from sys import stdout

import zipfile
from os import listdir
from os.path import isfile, join, basename
from subprocess import call

def unzip(source_filename, dest_dir):
    with zipfile.ZipFile(source_filename) as zf:
        # the first entry is the containing folder; skip it
        for member in zf.infolist()[1::]:
            filename = basename(member.filename)
            # skip directory entries
            if not filename: continue
            zf.extract(member, dest_dir)

mypath = '<path>/dem/'

zipfiles = sorted(f for f in listdir(mypath)
                  if isfile(join(mypath, f)) and f.endswith('.zip'))

for i, file in enumerate(zipfiles):
    stdout.write(str(i) + ' ' + file + '\n')
    unzip(join(mypath, file), '<path>/temp')
    if i == 0:
        # -p only prepares (creates) the empty table on the first run
        command = ('raster2pgsql -s 4326 -p -M -t 100x100 -N -32768 '
                   '<path>/temp/*.hgt dem3 | psql -U wessel -d empires')
        call(command, shell=True)
    # -a appends the data
    command = ('raster2pgsql -s 4326 -a -t 100x100 -N -32768 '
               '<path>/temp/*.hgt dem3 | psql -U wessel -d empires')
    return_code = call(command, shell=True)
    return_code = call('rm -R <path>/temp/', shell=True)
# vacuum and index afterwards

You could probably do it in one go, but for my development I chose to do it in steps. Keeping the raw altitude data can also become useful when creating tables for location specific maps, although in principle it could also be extracted from the table that is created here.


CREATE INDEX dem3_gist_idx ON dem3 USING GIST (ST_ConvexHull(rast));

3.2 External shapefiles

These shapefiles are obtained from https://www.naturalearthdata.com/downloads/

From bash, load the land mass shapefile, current-day countries, lakes and rivers. (The \ signs are just for clarity and to allow lines to break. Each command starts with shp2pgsql.)

shp2pgsql -I -s 4326 -W "LATIN1" <path>/ne_50m_land/ne_50m_land.shp \
  land | psql -d empires -U wessel
shp2pgsql -I -s 4326 -W "LATIN1" <path>/ne_50m_admin_0_countries.shp \
  countries | psql -d empires -U wessel
shp2pgsql -d -s 4326 -I -W "LATIN1" <path>/ne_10m_lakes.shp  \
  public.lakes | psql -d empires -U wessel
shp2pgsql -c -s 4326 -I -W "LATIN1" \
  <path>/ne_10m_rivers_lake_centerlines.shp  \
  public.rivers | psql -d empires -U wessel

4 The Grid: a globe with dots

Imagine the earth. Now we would like to have dots at equally spaced distances from each other; we’ll call this the grid. This only works when keeping either latitude or longitude constant. Let’s keep latitude (the vertical dimension) constant, and choose the equator (lat=0) to start. Then, given the circumference of the earth (360 degrees, around 40,000 km) and a desired distance between two dots (e.g. 5 km), you can create a string (or ring) of dots at equally spaced distances.

Moving to the next latitude, about 5 km north, we’d need to calculate the number of required dots, given that the distance required to go the full 360 degrees has slightly shrunk. I work this backwards. First I create an index table that holds the latitudes at which the rings of dots need to be created, and I calculate the distance in longitudinal degrees for each latitude. Then I have all the information needed to create the actual grid.
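For intuition, the degrees-per-ring arithmetic can be sketched on a sphere as follows (my own back-of-the-envelope version; the actual table below derives these numbers with PostGIS buffer operations instead):

```python
from math import pi, cos, radians

EARTH_R = 6371008.8  # mean earth radius in meters (spherical approximation)

def long_d(lat_deg, spacing_m=5000):
    """Degrees of longitude spanned by spacing_m meters at a given latitude."""
    meters_per_deg = pi * EARTH_R / 180 * cos(radians(lat_deg))
    return spacing_m / meters_per_deg

def dots_in_ring(lat_deg, spacing_m=5000):
    """Number of spacing_m steps needed to go the full 360 degrees."""
    return int(360 / long_d(lat_deg, spacing_m))

# rings shrink towards the poles: roughly half as many dots at 60 degrees
print(dots_in_ring(0), dots_in_ring(60))
```

At the equator this gives around 8,000 dots per ring at 5 km spacing, halving by 60 degrees, which is the shrinkage the grid_index table encodes in its long_d column.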

drop table if exists grid_index cascade;
create table grid_index (
id serial primary key,
latt real,
latt_d real, -- latitude geography km distance from 0 degrees
latt_d_57 real, -- latitude geography km distance from 57 degrees
long_d real -- how many degrees is 1 (or other defined) km
);
create index grid_index_latt_idx on grid_index(latt);
create index grid_index_latt_d_idx on grid_index(latt_d);
create index grid_index_long_d_idx on grid_index(long_d);

From point 0,0 on earth, looking towards the North pole, this gives me the distances of points that are equally spaced at 5000 meters (5 km). A more precise routing mesh would require setting this distance to something smaller, but that also increases the disk space and computing power required for everything below. I took 5 km as a resolution that allows sufficient detail of global altitude, yet is manageable on my home system. I don’t need to go the full 90 degrees, since there is no land that far north. So I stop at 81 degrees north.

insert into grid_index (latt_d) (
    select y 
    from generate_series(0, st_distance(
        geography(st_point(0,0)), geography(st_point(0,81))
       )::int, 5000) as y
);

Do the same going south. Now I can stop at -57 degrees already, because we do not care about Antarctica.

insert into grid_index (latt_d) (
    select -1*y 
    -- start at 5000 to avoid duplicating the equator row
    from generate_series(5000, st_distance(
        geography(st_point(0,0)), geography(st_point(0,-57))
       )::int, 5000) as y
);

Then, given the distances, I need to get the degrees of these points. I do this by creating a buffer around the origin point with a radius equal to the distance defined above. Given the buffer, I obtain its maximum latitude, which is the number I want. However, buffers don’t behave nicely beyond some extreme distance, so I take a new reference point after 57 degrees.

update grid_index set 
 latt = st_ymax(st_buffer(st_point(0,0)::geography,latt_d)::geometry) 
 where latt_d>0;
update grid_index set 
 latt = st_ymin(st_buffer(st_point(0,0)::geography,-1*latt_d)::geometry) 
 where latt_d<0;

-- update for equator only
update grid_index set latt = 0 where latt_d = 0;

-- remaining distance beyond the 57-degree reference point
update grid_index set latt_d_57 = floor(latt_d-st_distance(
 geography(st_point(0,0)), geography(st_point(0,57)))) 
 where latt_d > st_distance(
 geography(st_point(0,0)), geography(st_point(0,57)));

-- redo the latitudes north of 57 degrees from the new reference point
update grid_index set 
 latt = st_ymax(st_buffer(st_point(0,57)::geography,latt_d_57)::geometry) 
 where latt_d_57 is not null;

-- how many degrees of longitude correspond to 5000 m at each latitude
update grid_index set 
 long_d = st_xmax(st_buffer(st_point(0,latt)::geography,5000)::geometry);

The final preparatory step is to realise that we only need coordinates where there is land. The obvious solution is to go ahead with a full set of coordinates and then clip it against the land geometries. However, this turned out to be a very time-consuming and inefficient procedure, because of the huge number of dots in play. So instead, for each latitude defined above, I create a line across the globe and intersect it with the land geometry. This gives lines that only exist where there is land, and that exactly overlay the rings of dots to be created later. Again this is a sort of index table, where each row holds a combination of a latitude and the longitudinal limits that define the extent (through min and max) of the grid to be created.

-- define min and max long_d for each latt in land, for each polygon. 
drop table if exists long_limits;
create table long_limits as(
 SELECT latt, 
  st_xmin(clipped_geom)::real long_min , 
  st_xmax(clipped_geom)::real long_max
 FROM (
   SELECT box.latt, 
    (ST_Dump(ST_Intersection(box.geom, land.geom))).geom As clipped_geom
   FROM  (
     SELECT 
      latt , 
      st_setsrid(
       st_makeline(st_point(-180, latt),St_point(180, latt)),
       4326) as geom 
     FROM grid_index ) as box
   INNER JOIN land
    ON ST_Intersects(box.geom, land.geom)
  )  As clipped
);
create index long_limits_latt on long_limits(latt);

Now I’m ready to create the rows of x coordinates for each latitude: the grid. This is the first time that I split the code over four processes, based on latitudes, which I execute separately in four open windows of the postgres console. Each process creates its own table, after which the four tables are joined in a view. The grid itself is not my final objective (that is the mesh connecting the grid points), so this was my pragmatic solution to the scale and speed issues.
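The latitude cut-offs in the four WHERE clauses that follow are simply the -57 to 81 degree range split into four equal bands; a quick sketch of that arithmetic:

```python
def latitude_bands(lo=-57.0, hi=81.0, n=4):
    """Split [lo, hi) into n equal latitude bands, one per grid table."""
    step = (hi - lo) / n
    return [(lo + i * step, lo + (i + 1) * step) for i in range(n)]

# grid_1 takes latt < -22.5, grid_2 takes -22.5 <= latt < 12, and so on
print(latitude_bands())
```

So the band boundaries are -57, -22.5, 12, 46.5 and 81 degrees, which is what the expressions (-57 + k*(57+81)/4) below evaluate to.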

First, prepare the four tables.

drop table if exists grid_1;
create table grid_1 (
    latt  real,
    long  real
);
drop table if exists grid_2;
create table grid_2 (
    latt  real,
    long  real
);
drop table if exists grid_3;
create table grid_3 (
    latt  real,
    long  real
);
drop table if exists grid_4;
create table grid_4 (
    latt  real,
    long  real
);

Then, the code for each table is identical, except for the table reference and the limits of the coordinates that it deals with. This will create 134,495,648 points. In order to make the ids unique between the tables, the sequences of the serials are manually started at 1; 200,000,001; 300,000,001; and 400,000,001.

I first create a full ring of dots, after which the relevant ones are filtered out using the long_limits that were defined above.
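As a sketch of that filtering step (hypothetical names, mirroring the SQL join on long_limits below): a ring is a full list of longitudes, and only the ones inside some land interval [long_min, long_max] for that latitude survive.

```python
def filter_ring(longs, land_intervals):
    """Keep only longitudes that fall inside any land interval of this ring."""
    return [x for x in longs
            if any(lo <= x <= hi for lo, hi in land_intervals)]

# e.g. a ring crossing two land masses
ring = [-10.0, -5.0, 0.0, 5.0, 10.0, 15.0]
land = [(-6.0, 1.0), (9.0, 16.0)]
print(filter_ring(ring, land))  # -> [-5.0, 0.0, 10.0, 15.0]
```

The SQL version does the same thing with a join: a grid point survives if a long_limits row with the same latitude brackets its longitude.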

Table 1

insert into grid_1 (latt, long) (
with idx as (
    select * 
    from grid_index 
    where grid_index.latt<(-57 + (57+81)/4) --the latitude limit for Table 1
)
select idx.latt, 
 generate_series((-180*10^5)::int, (180*10^5)::int, 
 floor(idx.long_d*10^5)::int)::real/(10^5) as long 
 from idx); -- this creates the longitudinal coordinates, by going from 
            -- -180 degrees to 180 degrees in steps of long_d. The 
            -- multiplication by 10^5 avoids computer rounding errors 
            -- along the way

-- filter the points above to those that are relevant; 
-- this filters out all coordinates that are in oceans
drop table if exists grid_1_filtered;
create table grid_1_filtered as (
 select grid_1.latt, grid_1.long 
 from grid_1 
 join long_limits 
 on (grid_1.latt=long_limits.latt and 
     grid_1.long >= long_limits.long_min and 
     grid_1.long <= long_limits.long_max)
);

-- here we create the actual geospatial grid points.
alter table grid_1_filtered add column geom GEOMETRY;
update grid_1_filtered set 
  geom = st_setsrid(ST_POINT(long::real, latt::real),4326); 

-- housekeeping
drop table grid_1;
alter table grid_1_filtered rename to grid_1;
create index grid_1_geom on grid_1 using gist(geom);

-- note the adjustment of the pkey sequence, 
-- to allow for unique pkey when the four tables are combined.
create sequence grid_1_pkey_seq start with 1;
alter table grid_1 
 add column pkey INTEGER DEFAULT nextval('grid_1_pkey_seq');
alter sequence grid_1_pkey_seq owned by grid_1.pkey;
create index grid_1_pkey_idx on grid_1(pkey);
alter table grid_1 add constraint grid_1_pkey_chk CHECK(pkey<200000000);

-- adding altitude by looking up the value in dem3 raster table.
alter table grid_1 add column alt real;
update grid_1 set alt = st_value(rast, geom) 
 from dem3 where st_intersects(rast, geom);

This creates the required points starting at 57 degrees south.

A cut out from around Tasmania

Table 2

insert into grid_2 (latt, long) (
with idx as (
    select * from grid_index 
    where grid_index.latt>=(-57 + (57+81)/4) AND 
          grid_index.latt<(-57 + 2*(57+81)/4)
)
select idx.latt, generate_series((-180*10^5)::int, (180*10^5)::int,
 floor(idx.long_d*10^5)::int)::real/(10^5) as long from idx);

drop table if exists grid_2_filtered;
create table grid_2_filtered as (
 select grid_2.latt, grid_2.long 
 from grid_2 join 
  long_limits on 
  (grid_2.latt=long_limits.latt and 
   grid_2.long >= long_limits.long_min and 
   grid_2.long <= long_limits.long_max)
);
alter table grid_2_filtered add column geom GEOMETRY;
update grid_2_filtered 
 set geom = st_setsrid(ST_POINT(long::real, latt::real),4326); 
drop table grid_2;
alter table grid_2_filtered rename to grid_2;
create index grid_2_geom on grid_2 using gist(geom);

create sequence grid_2_pkey_seq start with 200000001;
alter table grid_2 
 add column pkey INTEGER DEFAULT nextval('grid_2_pkey_seq');
alter sequence grid_2_pkey_seq owned by grid_2.pkey;
create index grid_2_pkey_idx on grid_2(pkey);
alter table grid_2 
 add constraint grid_2_pkey_chk CHECK(pkey>200000000 and pkey<300000000);
alter table grid_2 
 add column alt real;
update grid_2 
 set alt = st_value(rast, geom) from dem3 where st_intersects(rast, geom);

Table 3

insert into grid_3 (latt, long) (
with idx as (
 select * from grid_index 
 where grid_index.latt>=(-57 + 2*(57+81)/4) AND 
       grid_index.latt<(-57 + 3*(57+81)/4)
)
select idx.latt, 
 generate_series((-180*10^5)::int, (180*10^5)::int, 
                 floor(idx.long_d*10^5)::int)::real/(10^5) 
                 as long from idx);

drop table if exists grid_3_filtered;
create table grid_3_filtered as (
 select grid_3.latt, grid_3.long from 
  grid_3 join long_limits 
 on (grid_3.latt=long_limits.latt and 
     grid_3.long >= long_limits.long_min and 
     grid_3.long <= long_limits.long_max)
);
alter table grid_3_filtered add column geom GEOMETRY;
update grid_3_filtered 
 set geom = st_setsrid(ST_POINT(long::real, latt::real),4326); 
drop table grid_3;
alter table grid_3_filtered rename to grid_3;
create index grid_3_geom on grid_3 using gist(geom);

create sequence grid_3_pkey_seq start with 300000001;
alter table grid_3 
 add column pkey INTEGER DEFAULT nextval('grid_3_pkey_seq');
alter sequence grid_3_pkey_seq owned by grid_3.pkey;
create index grid_3_pkey_idx on grid_3(pkey);
alter table grid_3 
 add constraint grid_3_pkey_chk CHECK(pkey>300000000 and pkey<400000000);
alter table grid_3 add column alt real;
update grid_3 
 set alt = st_value(rast, geom) from dem3 where st_intersects(rast, geom);

and finally, Table 4

insert into grid_4 (latt, long) (
with idx as (
 select * from grid_index where grid_index.latt>=(-57 + 3*(57+81)/4)
)
select idx.latt, 
  generate_series(
   (-180*10^5)::int, (180*10^5)::int, floor(idx.long_d*10^5)::int
  )::real/(10^5) as long from idx);

drop table if exists grid_4_filtered;
create table grid_4_filtered as (
 select grid_4.latt, grid_4.long 
 from grid_4 
 join long_limits 
 on (grid_4.latt=long_limits.latt and 
     grid_4.long >= long_limits.long_min and 
     grid_4.long <= long_limits.long_max)
);
alter table grid_4_filtered add column geom GEOMETRY;
update grid_4_filtered 
 set geom = st_setsrid(ST_POINT(long::real, latt::real),4326); 
drop table grid_4;
alter table grid_4_filtered rename to grid_4;
create index grid_4_geom on grid_4 using gist(geom);

create sequence grid_4_pkey_seq start with 400000001;
alter table grid_4 
  add column pkey INTEGER DEFAULT nextval('grid_4_pkey_seq');
alter sequence grid_4_pkey_seq owned by grid_4.pkey;
create index grid_4_pkey_idx on grid_4(pkey);
alter table grid_4 add constraint grid_4_pkey_chk CHECK(pkey>400000000);
alter table grid_4 add column alt real;
update grid_4 
 set alt = st_value(rast, geom) 
 from dem3 where st_intersects(rast, geom);

We can check that this is correct by taking two sequential points and calculating their distance.

-- distance in degrees between two points:
select st_distance(p1.geom, p2.geom) dist 
 from grid_4 as p1, grid_4 as p2 
 where p1.pkey=401316438 and p2.pkey=401316439;
-- in meters
select st_distance(geography(p1.geom), geography(p2.geom)) dist 
 from grid_4 as p1, grid_4 as p2 
 where p1.pkey=401316438 and p2.pkey=401316439;

Combine the four tables in a view.

create view grid as (
select * from grid_1
select * from grid_2
select * from grid_3
select * from grid_4);

5 Connect the dots to create a mesh

The idea is to connect each dot with its nearest neighbours. This is fine when keeping latitude constant, but going up or down is trickier: the grid is not rectangular, since we’re dealing with a globe.

First, prepare the mesh table. It’s called o_mesh, since I borrowed the code from somewhere on Stack Exchange.

drop table if exists o_mesh CASCADE;
create table o_mesh (
 "source" int,
 target int,
 direction smallint, -- varchar(2),
 geom geometry,
 dist real
);

The following is repeated for each of the four grid tables, by just replacing grid_1 with grid_2, grid_3 and grid_4 sequentially.

The idea is to connect each pair of neighbouring points together once. This sounds quite straightforward, but what we have is a large set of dots with no information on which dots are closest. Moreover, the dots are only equally spaced longitudinally within a ring at a given latitude, at 5 km. We also know that these rings are spaced 5 km apart in latitude. But the rings do not line up exactly, so the maximum neighbour distance is sqrt(5^2 + 5^2), roughly 7.1 km. That is in km, where we need it in degrees, which varies as we move up the globe (i.e. at different latitudes). Therefore, the maximum allowed distance in degrees must be calculated for each latitude separately. (I guess people who understand spherical calculations better than I do will have a closed-form formula for this. I do not.)

We also want each link only once, so we need to limit links to one quadrant: pure east, pure north, and anything in between. This is implemented at the very end.
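A back-of-the-envelope version of that latitude-dependent threshold, on a sphere (the SQL below instead derives it from neighbouring grid_index rows):

```python
from math import pi, sqrt, cos, radians

EARTH_R = 6371008.8  # mean earth radius in meters
DEG_M = pi * EARTH_R / 180  # meters per degree along a meridian

def dist_limit_degrees(lat_deg, spacing_m=5000):
    """Upper bound, in planar degrees, on the distance between a grid point
    and its farthest (diagonal) neighbour at a given latitude."""
    dlat = spacing_m / DEG_M                            # ~5 km north-south
    dlon = spacing_m / (DEG_M * cos(radians(lat_deg)))  # ~5 km east-west
    return sqrt(dlat ** 2 + dlon ** 2)

# at the equator this is sqrt(2) * 0.045 degrees; it grows towards the poles
print(dist_limit_degrees(0), dist_limit_degrees(60))
```

This is what the dist_limit column in the indeks CTE below captures, so that ST_DWithin can prune candidate neighbours per latitude.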

INSERT INTO o_mesh (source, target, direction, geom, dist)

with indeks as (
 select idx1.latt, 
  idx2.latt+0.000001 as next_latt,
  idx3.latt-0.000001 as prev_latt,
  st_distance(st_point(0,idx1.latt), st_point(idx1.long_d,idx2.latt)) 
    as dist_limit
 from (
    select row_number() over () id1, * 
    from grid_index order by latt
   ) idx1 
  left join (
    select (row_number() over () -1) id2, * 
    from grid_index order by latt
   ) idx2
   on id1=id2 
  left join (
    select (row_number() over () +1) id3, * 
    from grid_index order by latt
   ) idx3
   on id1=id3
)

select 
source.pkey as source, target.pkey as target, (
case
when source.long = target.long AND source.latt < target.latt then 1 --'N'
when source.long < target.long AND source.latt < target.latt then 2 --'NE'
when source.long < target.long AND source.latt = target.latt then 3 --'E'
when source.long < target.long AND source.latt > target.latt then 4 --'SE'
when source.long = target.long AND source.latt > target.latt then 5 --'S'
when source.long > target.long AND source.latt > target.latt then 6 --'SW'
when source.long > target.long AND source.latt = target.latt then 7 --'W'
when source.long > target.long AND source.latt < target.latt then 8 --'NW'
else 9 --'ER'
end ) as direction,

-- the line that represents the connection
st_setsrid(st_makeline(
  source.geom::GEOMETRY, target.geom::GEOMETRY
 ), 4326) AS geom,

-- the distance in meters between the two points
st_distance(geography(source.geom), geography(target.geom)) AS dist

from 
(select 
  -- distance limits should match what was made in the grid.
  grid_1.*, dist_limit, next_latt, prev_latt
 from grid_1 left join 
  indeks on grid_1.latt=indeks.latt order by grid_1.latt 
) AS source 
join 
 (select * FROM grid_1) AS target 
on (
 -- one way links only
  (target.long > source.long  
   AND target.latt between (source.prev_latt) and (source.next_latt)) 
  OR 
  (target.long = source.long
   AND target.latt > source.latt AND target.latt <= next_latt)
 )
and ST_DWithin(target.geom, source.geom, dist_limit);