Placing a WGS84 image on a Web Mercator web map

Why?

Most map images are rendered from WGS84 (EPSG:4326) sources such as GeoTIFF or NetCDF files, which means each pixel of such an image corresponds to a geographic coordinate.

For example, suppose a NetCDF file with longitudes from 110.1 to 120.0 (step 0.1), latitudes from 20.1 to 30.0 (step 0.1), and a value for each grid cell. That gives 100*100 values in the file. If each value is rendered as one pixel, we end up with a PNG image of 100*100 pixels.
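As a quick sanity check, here is the grid from this example in a few lines of numpy, confirming that it yields 100*100 values:

import numpy as np

# Grid from the example: 110.1..120.0 and 20.1..30.0, step 0.1
lons = np.linspace(110.1, 120.0, 100)
lats = np.linspace(20.1, 30.0, 100)

values = np.random.rand(lats.size, lons.size)   # one value per pixel
print(values.shape)                             # (100, 100)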

Now we put this 100*100-pixel PNG on a web map built with Leaflet.js. Leaflet.js gives us a convenient way to display an image on the map: the ImageOverlay. Below the ImageOverlay sits the tiled base layer, the TileLayer. I won't explain tiled layers in detail here, but the essential point is that they use the projected coordinate system known as EPSG:3857, or Web Mercator. You can find the official details on epsg.io; one useful property listed on that page is quoted below.

Area of use: World between 85.06°S and 85.06°N.

So, let me list the differences between these two coordinate systems.

Coordinate System | Alias        | Area of use
EPSG:4326         | WGS84        | between 90°S and 90°N
EPSG:3857         | Web Mercator | between 85.06°S and 85.06°N
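The 85.06° limit is not arbitrary: it is the latitude at which the square Web Mercator map is cut off, and you can derive it by inverting the Mercator formula. A small sketch:

import math

# Web Mercator clips the map to a square, so the maximum projected y equals pi * R.
# Inverting the Mercator formula y = R * ln(tan(pi/4 + lat/2)) at y = pi * R gives:
max_lat = math.degrees(math.atan(math.sinh(math.pi)))
print(max_lat)   # 85.05112877980659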

How?

The 100*100 PNG image we created before is based on WGS84, whose area of use is between 90°S and 90°N. If we use the ImageOverlay to display our image and pass the bounds as [[30.0,110.1],[20.1,120.0]], the image will probably not be placed at the right position. How do we solve this?

It’s so simple!

In EPSG:4326, each pixel represents an equal step of longitude or latitude. In EPSG:3857, by contrast, the latitude scale is different, and that is the root of the problem.

Back to our image: we should scale the Y axis as if the 90.0° range were compressed to 85.06°. The ratio is 90/85.06 = 1.058076652..., so the height of our image should be scaled to 100/(90/85.06), approximately 94 or 95 pixels. Another way is to rescale the latitudes we pass to the ImageOverlay: each latitude becomes lat/(90/85.06), so the bounds become [[28.35333,110.1],[18.99673,120.0]]. With that, we finally see the right result.
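For clarity, here is the same arithmetic as a small Python sketch; the bounds and image height come from the example above.

# Latitude range of the image and its pixel height, from the example above
north, south = 30.0, 20.1
height_px = 100

ratio = 90 / 85.06                      # 1.058076652...
print(round(height_px / ratio))         # scaled height, about 95 pixels

# Scale each latitude before passing the bounds to the ImageOverlay
bounds = [[round(north / ratio, 5), 110.1],
          [round(south / ratio, 5), 120.0]]
print(bounds)                           # [[28.35333, 110.1], [18.99673, 120.0]]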

A beginner's guide to building a map website in React

Preface

As a GIS developer, I switch to the role of a WebGIS front-end developer whenever I need to build an interactive map to present geographic data. In my recent work I have mostly built such maps with the Vue.js framework, because it is prevalent among Chinese developers. Globally, however, React holds the largest share among MVVM-style front-end frameworks.

Get Started with React

If you are a front-end developer who is familiar with frameworks other than React, it should be easy to build a React web application after working through the official tutorial. So, get your hands dirty, and here we go.

React Official Website https://beta.reactjs.org/

Create a single-page WebApp

Create React App is an officially supported way to create single-page React applications. It offers a modern build setup with no configuration.

npx create-react-app map-on-react
cd map-on-react
yarn start

Prepare dependencies

Build the map component

Build the layer component

Compose our components

Result and conclusion

A step-by-step guide to understanding the GeoHash algorithm

Outline

  1. What is the GeoHash Algorithm and why do we need it?
  2. How does GeoHash split the world into a hash array
  3. The calculating procedure
  4. Verify the result

Tutorial

1. What is the GeoHash Algorithm and why do we need it?

Geohash is a geocoding algorithm that encodes a geographic location into a short string. It helps us handle location tasks efficiently, tasks that are common in many applications and services such as LBS (location-based services).

Let's discuss an example: calling a taxi to go somewhere. It is natural to expect that the app service sends a message or request to the nearest driver, who can serve you as soon as possible.

Generally, you report your location, your country, city and geographic coordinates, so the service knows where you are, and the taxi drivers upload their positions in the same way. The backend service can then calculate the distance between you and every driver and contact the nearest one. Quite simple, isn't it?

Now let's get our hands dirty with the process above and check its efficiency. First, we generate 100,001 coordinates (longitude, latitude): one is your position and the other 100,000 are the drivers'. I put the Python script below so you can copy it into your own .ipynb file and follow along.

import numpy as np

# Simulate a position as yours
your_pos = [np.random.uniform(115, 125), np.random.uniform(20, 25)]

# Create 100,000 taxicab positions
size = 100000
drivers_pos = [[np.random.uniform(115, 125), np.random.uniform(20, 25)] for i in range(0, size)]

Now the simulation data is ready for the next step, in which we will find the 10 drivers nearest to you. Let's do it with the code below.

import math

# Define a function to calculate the distance in a simple (planar) way
def simple_distance(lon_a, lat_a, lon_b, lat_b):
    return math.sqrt((lon_a - lon_b)**2 + (lat_a - lat_b)**2)
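The brute-force search itself is then just a matter of computing the distance from your position to every driver and sorting. A minimal sketch, continuing from the variables defined above:

# Brute force: distance from you to every driver, then sort ascending
distances = [
    (simple_distance(your_pos[0], your_pos[1], lon, lat), i)
    for i, (lon, lat) in enumerate(drivers_pos)
]
distances.sort(key=lambda pair: pair[0])

# The 10 nearest drivers: (distance in degrees, index into drivers_pos)
nearest_10 = distances[:10]
print(nearest_10)

A single scan over 100,000 drivers is still quick, but repeating it for every request and every rider is where the cost adds up, which is the kind of cost GeoHash is designed to reduce.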

2. How does GeoHash split the world into a hash array?

3. The calculating procedure

4. Verify the result

References & Documentations

GeoHash Wikipedia

https://en.wikipedia.org/wiki/Geohash

GeoHash Explorer

https://geohash.softeng.co/

A simple way to import raster data into the PostGIS database

Outline

  1. Installing the PostGIS Database
  2. Enabling the raster extension in PostGIS
  3. Preparing or creating the raster data
  4. Knowing about the Raster WKB/WKT format
  5. Creating the importation-SQL file to execute

Tutorial

1. Installing the PostGIS Database

The easiest way to install PostGIS is with Docker, though you can of course install it another way. So, install the Docker engine and the docker-compose plugin on your computer; the link below points to the official tutorial and documentation on installing Docker.

Docker Engine installation overview | Docker Documentation

Once Docker is installed, you can use docker-compose to run the database by writing a docker-compose.yml file. An example configuration is shown below.

version: "3.7"
services:
  pgdb:
    image: "postgis/postgis:15-master"
    container_name: pgdb
    restart: always
    environment:
      POSTGRES_DB: gis_data
      POSTGRES_USER: xyf
      POSTGRES_PASSWORD: xyf
      TZ: Asia/Shanghai
    volumes:
      - "/data/postgis/data/:/var/lib/postgresql/data/"
      - "/data/postgis/tmp/:/tmp"
    ports:
      - "5432:5432"
    networks:
      - "enet"
networks:
  enet:
    name: "enet"
    external: true

Once this is done and the PostGIS database is running well, you can connect to it with a DBMS application such as DBeaver, or a GIS application such as QGIS.

2. Enabling the raster extension in PostGIS

The PostGIS spatial engine ships several extensions. In the default configuration, three of them are already enabled, but unfortunately the raster extension is not enabled by default, so we have to switch it on ourselves.
The first step is to open a shell inside the database container we created before.

sudo docker exec -it pgdb /bin/bash

After entering the container's bash shell, you can connect to the database with psql, the command line tool that Postgres provides. It's pretty easy to do. A quick explanation of the parameters: '-U' is the user and '-d' is the database name; both can be found in the docker-compose configuration file we mentioned before.

psql -U xyf -d gis_data

Once the connection works: congratulations, and on to the next step. The three SQL commands below enable the postgis_raster extension, enable the GDAL drivers, and query the status of the available extensions. The query result is a table with the columns name, default_version and installed_version. Easy, isn't it? Make sure the installed_version column for postgis_raster is not NULL.

CREATE EXTENSION postgis_raster;
SET postgis.gdal_enabled_drivers = 'ENABLE_ALL';
SELECT name, default_version,installed_version
FROM pg_available_extensions WHERE name LIKE 'postgis%' or name LIKE 'address%';

3. Preparing or creating the raster data

Obviously, one key prerequisite for importing raster data into the database is owning some raster data. If you don't have your own, you can create it in QGIS with tools such as Interpolation, export it in GeoTIFF format, and then copy or move the exported GeoTIFF into the path you mapped as a volume in docker-compose.yml before going on to the next step. (If you prefer to script it, see the sketch after the note below.)

Figure: the test raster data

Suggestions: DO NOT MAKE A LARGE RASTER DATA OR YOU WOULD CRY ON IT!
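If you would rather create the test raster with code instead of QGIS, here is a minimal sketch. It assumes the rasterio package is installed; the extent and grid size are arbitrary, and the output path is chosen to match the volume mapping in the docker-compose file above.

import numpy as np
import rasterio
from rasterio.transform import from_origin

# A small 41 x 26 grid of random values over an arbitrary lon/lat extent
width, height = 41, 26
data = np.random.rand(height, width).astype("float32")
transform = from_origin(112.0, 23.75, 0.1, 0.1)   # upper-left corner, pixel size

with rasterio.open(
    "/data/postgis/tmp/test.tif",   # maps to /tmp/test.tif inside the container
    "w", driver="GTiff",
    height=height, width=width, count=1,
    dtype="float32", crs="EPSG:4326", transform=transform,
) as dst:
    dst.write(data, 1)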

4. Knowing about the Raster WKB/WKT format

How does the PostGIS database store raster data? It's a good question. As we know, PostGIS stores vector data in a column of type 'geometry', often named 'geom'. Raster data is stored in a similar way, using a dedicated data type called 'raster', in a column often named 'rast'. When you query some raster data from the database, you will see a HEX result that looks like '01000001CDDB….'; this is the hex-encoded WKB serialization of a raster. Above all, the key to importing raster data into the PostGIS database is converting your own raster data into this raster WKB/WKT format.

You can find the attributes of the raster WKB/WKT format via the link below.
WKTRaster/Documentation01 – PostGIS (osgeo.org)
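To make the format more concrete, here is a small Python sketch that decodes the fixed 61-byte header of a raster WKB value. The sample header bytes below are taken from the SQL file generated in the next section, and the field layout follows the serialized-format page linked above.

import struct

# First 61 bytes (122 hex chars) of the raster value shown in the next section
hex_header = ("0100000100"                        # endianness, version, nBands
              "19CC5D8DE459B93F14F061B29570B9BF"  # scaleX, scaleY
              "D174763238005C403E7799994DC03740"  # ipX, ipY (upper-left corner)
              "00000000000000000000000000000000"  # skewX, skewY
              "E6100000"                          # SRID
              "29001A00")                         # width, height
raw = bytes.fromhex(hex_header)

endian = "<" if raw[0] == 1 else ">"              # 1 = little endian (NDR)
version, n_bands = struct.unpack_from(endian + "HH", raw, 1)
scale_x, scale_y, ip_x, ip_y, skew_x, skew_y = struct.unpack_from(endian + "6d", raw, 5)
srid, width, height = struct.unpack_from(endian + "iHH", raw, 53)

print(version, n_bands)        # 0 1
print(scale_x, scale_y)        # pixel size in degrees (scale_y is negative)
print(ip_x, ip_y)              # upper-left corner coordinates
print(srid, width, height)     # 4326 41 26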

5. Creating the importation-SQL file to execute

It is much more convenient to do the conversion with the tool that PostGIS already provides, named 'raster2pgsql'. You can see its usage by running 'raster2pgsql' without arguments after jumping into the container where the database is running. Then navigate to the path where our test GeoTIFF is waiting for us.

raster2pgsql -s 4326 -I -C /tmp/test.tif nc.test_raster > test.sql

We can specify the SRID of our raster in the process; the example uses '-s 4326'. '-I' creates a spatial (GiST) index on the raster column, and '-C' applies the standard raster constraints. '/tmp/test.tif' is the path of your GeoTIFF file, and 'nc.test_raster' means the schema is 'nc' and the table name is 'test_raster'. Finally, a SQL file named test.sql is created in the directory where you run this command.

BEGIN;
CREATE TABLE "nc"."test_raster" ("rid" serial PRIMARY KEY,"rast" raster);
INSERT INTO "nc"."test_raster" ("rast") VALUES ('010000010019CC5D8DE459B93F14F061B29570B9BFD174763238005C403E7799994DC0374000000000000000000000000000000000E610000029001A004B000000008087C3C0F355F2B1BB1C394014ED2AA4FC3C39409A780778D25E394016FC36C478813940B1FB8EE1B1A339407714E7A8A3C33940FA4674CFBADE3940111B2C9CA4F13940DAE3857478F839402CF180B229EF3940EB724A404CD23940778192020BA0394083A7902BF55......'::raster);
CREATE INDEX ON "nc"."test_raster" USING gist (st_convexhull("rast"));
ANALYZE "nc"."test_raster";
SELECT AddRasterConstraints('nc','test_raster','rast',TRUE,TRUE,TRUE,TRUE,TRUE,TRUE,FALSE,TRUE,TRUE,TRUE,TRUE,TRUE);
END;

You can execute this SQL script with DBeaver, and 'nc.test_raster' will be imported into the database. You can then visualize it with QGIS by connecting to the database and dropping the table into the Layers panel. That's all.
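As a final check, you can also verify the import programmatically. Here is a minimal Python sketch, assuming the psycopg2 package and the connection settings from the docker-compose example above:

import psycopg2

# Connection settings taken from the docker-compose example above
conn = psycopg2.connect(host="localhost", port=5432,
                        dbname="gis_data", user="xyf", password="xyf")

with conn.cursor() as cur:
    # ST_MetaData returns the georeference of each raster tile
    cur.execute("SELECT rid, (ST_MetaData(rast)).* FROM nc.test_raster;")
    for row in cur.fetchall():
        # rid, upperleftx, upperlefty, width, height, scalex, scaley, skewx, skewy, srid, numbands
        print(row)

conn.close()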

Finally

Figure: the imported raster displayed as a layer in QGIS

References & Documentations

raster2pgsql

https://postgis.docs.acugis.com/en/latest/components/raster2pgsql/index.html#documentation

using_raster_dataman

https://postgis.net/docs/using_raster_dataman.html#RT_Raster_Loader

docker_hub_postgis

https://hub.docker.com/r/postgis/postgis

WKTRaster

https://trac.osgeo.org/postgis/wiki/WKTRaster/Documentation01

How to import GeoJSON data into MongoDB

Outline

  1. Get ready for your GeoJSON data
  2. Convert GeoJSON data to JsonArray data for importation
  3. Use MongoDB command line tool to import data
  4. Finally, import the data successfully and find them

Tutorial

1. Get ready for your GeoJSON data

GeoJSON is a format for encoding a variety of geographic data structures. A GeoJSON object may represent a geometry, a feature, or a collection of features. GeoJSON supports the following geometry types: Point, LineString, Polygon, MultiPoint, MultiLineString, MultiPolygon, and GeometryCollection.

The code below shows a simple GeoJSON file containing four point features. You can copy the text, paste it into your text editor, save it with the .geojson suffix, and load the saved file into a GIS application like QGIS or ArcMap.

{
  "type": "FeatureCollection",
  "name": "earthquakes",
  "crs": {
    "type": "name",
    "properties": { "name": "urn:ogc:def:crs:OGC:1.3:CRS84" }
  },
  "features": [
    {
      "type": "Feature",
      "properties": {
        "DateTime": "1970/01/04 17:00:40.20",
        "Latitude": 24.139,
        "Longitude": 102.503,
        "Depth": 31.0,
        "Magnitude": 7.5,
        "MagType": "Ms",
        "NbStations": 90,
        "Gap": null,
        "Distance": null,
        "RMS": 0.0,
        "Source": "NEI",
        "EventID": 1970010440
      },
      "geometry": { "type": "Point", "coordinates": [102.503, 24.139] }
    },
    {
      "type": "Feature",
      "properties": {
        "DateTime": "1970/01/06 05:35:51.80",
        "Latitude": -9.628,
        "Longitude": 151.458,
        "Depth": 8.0,
        "Magnitude": 6.2,
        "MagType": "Ms",
        "NbStations": 85,
        "Gap": null,
        "Distance": null,
        "RMS": 0.0,
        "Source": "NEI",
        "EventID": 1970010640
      },
      "geometry": { "type": "Point", "coordinates": [151.458, -9.628] }
    },
    {
      "type": "Feature",
      "properties": {
        "DateTime": "1970/01/08 17:12:39.10",
        "Latitude": -34.741,
        "Longitude": 178.568,
        "Depth": 179.0,
        "Magnitude": 6.1,
        "MagType": "Mb",
        "NbStations": 59,
        "Gap": null,
        "Distance": null,
        "RMS": 0.0,
        "Source": "NEI",
        "EventID": 1970010840
      },
      "geometry": { "type": "Point", "coordinates": [178.568, -34.741] }
    },
    {
      "type": "Feature",
      "properties": {
        "DateTime": "1970/01/10 12:07:08.60",
        "Latitude": 6.825,
        "Longitude": 126.737,
        "Depth": 73.0,
        "Magnitude": 6.1,
        "MagType": "Mb",
        "NbStations": 91,
        "Gap": null,
        "Distance": null,
        "RMS": 0.0,
        "Source": "NEI",
        "EventID": 1970011040
      },
      "geometry": { "type": "Point", "coordinates": [126.737, 6.825] }
    }
  ]
}

2. Convert GeoJSON data to JsonArray data for importation

The key to importing GeoJSON data into MongoDB is converting the file from GeoJSON into a JSON object or JSON array. There are many ways to do this, but I want to show you the best one, using a tool called jq (the jq official website is linked in the references below).

Once you have downloaded it, you can do the conversion with the command below. Importantly, replace the text between the square brackets with your own values and remove the brackets. Then try it yourself.

jq --compact-output ".features" '[Your GeoJSON File]' > '[GeoJSON File converted]'
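If you prefer to stay in Python rather than install jq, the same extraction can be done with the standard json module. A minimal sketch with placeholder file names:

import json

# Placeholder file names; replace them with your own paths
with open("earthquakes.geojson") as src:
    features = json.load(src)["features"]

# Write the features as a compact JSON array, like `jq --compact-output ".features"`
with open("earthquakes.converted.json", "w") as dst:
    json.dump(features, dst, separators=(",", ":"))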

3. Use the MongoDB command line tool to import data

In this step, first check your environment and make sure the MongoDB client or command line tools are installed properly. Then use the command below, replacing the content between the square brackets as you did in the previous step.

mongoimport --uri '[MongoDB Connection String]' -u '[Your MongoDB User]' --password '[Your Password]' --db '[Database to Import]' -c '[Collection to Import]' --file '[GeoJSON File converted]' --jsonArray

Here is the successful result, showing which MongoDB instance was connected and how many documents were imported.

connected to: mongodb://***********
4 document(s) imported successfully. 0 document(s) failed to import.

4. Finally, import the data successfully and find them

Use a MongoDB client to find or query the data in the collection you have created. Here is the result.

Figure: the query result
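Because the imported documents keep their GeoJSON geometry field, you can also run geospatial queries against them. Below is a minimal pymongo sketch; the connection string, database and collection names are assumptions standing in for the placeholders above. It builds a 2dsphere index and finds earthquakes near a point.

from pymongo import MongoClient, GEOSPHERE

# Hypothetical connection settings; replace with your own
client = MongoClient("mongodb://localhost:27017")
collection = client["gis_data"]["earthquakes"]

# A 2dsphere index on the GeoJSON geometry field enables geospatial queries
collection.create_index([("geometry", GEOSPHERE)])

# Find documents within 500 km of a point (longitude, latitude order)
near_query = {
    "geometry": {
        "$nearSphere": {
            "$geometry": {"type": "Point", "coordinates": [102.5, 24.1]},
            "$maxDistance": 500000,  # meters
        }
    }
}
for doc in collection.find(near_query):
    print(doc["properties"]["EventID"], doc["properties"]["Magnitude"])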

References & Documentations

GeoJSON Documentation

https://geojson.org/geojson-spec

MongoDB geospatial

https://www.mongodb.com/docs/manual/geospatial-queries/#std-label-geospatial-geojson

jq-tools Official Website

https://stedolan.github.io/jq/