MapX is an online platform for managing geospatial data on natural resources, developed by UNEP/GRID-Geneva - a data centre resulting from the partnership between UN Environment Programme, the Swiss Federal Office for the Environment and the University of Geneva.
Field applications of MapX are varied and include chemical management, disaster risk reduction, biodiversity planning, land-use planning, extractive industry, renewable energy and environmental security.
MapX targets a wide community of users that are primarily UN Environment Programme and partners, the Secretariats of Multilateral Environmental Agreements (MEAs) and other UN agencies mandated to collect and use geospatial data in environmental decision-making. Civil society groups, non-governmental organizations, academia and citizens complement this set of users.
MapX was designed in 2014 and has been continuously improved since then through wide international stakeholder consultations.
MapX is fully integrated into the World Environment Situation Room, which is the UNEP data and knowledge platform.
Development servers are launched from within Docker containers to match the production environment as closely as possible. Some commands, tools and configuration are still needed on your local computer.
Mandatory
docker
v20.10+
Optional
node
v16.0+
g++
npm
yq
git
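You can quickly check which of these tools are already available locally, for example:
# Check installed versions (the optional tools may be missing)
docker --version
node --version && npm --version
yq --version
git --version
g++ --version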
Some browsers require you to modify your hosts file to map the custom MapX local "subdomains". It can be as simple as adding these lines to /etc/hosts and restarting your browser, if needed:
127.0.0.1 app.mapx.localhost
127.0.0.1 api.mapx.localhost
127.0.0.1 search.mapx.localhost
127.0.0.1 wsecho.mapx.localhost
127.0.0.1 probe.mapx.localhost
127.0.0.1 apidev.mapx.localhost
127.0.0.1 dev.mapx.localhost
127.0.0.1 geoserver.mapx.localhost
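As a convenience, a small shell snippet such as the following can append any missing entries; this is a minimal sketch that assumes a standard /etc/hosts and requires sudo:
# Append the MapX subdomains above to /etc/hosts if they are not present yet
for sub in app api search wsecho probe apidev dev geoserver; do
  grep -q "$sub.mapx.localhost" /etc/hosts || \
    echo "127.0.0.1 $sub.mapx.localhost" | sudo tee -a /etc/hosts > /dev/null
done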
The included docker-compose.yml allows you to set up a development environment.
Run the following script, which initializes some required directories and copies the default environment variables to ./mapx.dev.env (if missing):
./mapx.dev.init.sh
Finally, launch the mapx stack:
# Pull the latest builds
docker compose pull
# Launch postgres first: on first launch, some tables and roles must be created
docker compose up pg
# Launch other services
docker compose up
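Optionally, check that all services are up before opening the application:
# List running services and their state
docker compose ps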
The application should be available at http://app.mapx.localhost:8880/ (curl -H Host:app.mapx.localhost http://127.0.0.1:8880/).
An admin user is available as admin@localhost, which can be used to log in; get the password by browsing the web mail at http://mail.mapx.localhost:8880/.
./build.sh -v <version>
app
# Build app js code, update docker image
cd /app/
npm run docker
api/express
# Build api js code, update docker image
cd /api/
npm run docker
geoserver
# Update geoserver docker image
cd /geoserver/
./build -a
meili search
# Update meili docker image
cd /meili/
./build -a
Postgis: OperationalError: could not access file "$libdir/postgis-X.X"
Solution: run docker compose exec pg update-postgis.sh
Install all modules listed as dependencies in package.json for the app service, the sdk and the websocket handler ws_handler:
cd ./app
npm install
cd ./app/src/js/sdk/
npm install
cd ./app/src/js/ws_handler/
npm install
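If preferred, the three installs above can be run in one loop; a minimal sketch using the same paths:
# Install dependencies for the app service, the sdk and the websocket handler
for dir in ./app ./app/src/js/sdk ./app/src/js/ws_handler; do
  (cd "$dir" && npm install)
done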
Optionally, if you want to develop submodules such as el or mx_valid, or rebuild the sprites:
cd ./app/src/js/el/
npm install
cd ./app/src/js/is_test/
npm install
# Note: could require a specific version of node
cd ./app/sprites/
npm install
Start a development session for the app service:
- Automatically build all client-side dependencies, including dictionaries and translations (which needs some config, see below):
$ cd ./app
$ npm run dev
- Launch the server from within the running app_dev container. In another terminal window, start the dev server:
docker compose exec app_dev R
> source('run.R')
# OR, as a single line for a non-interactive session:
docker compose exec app_dev Rscript --vanilla run.R
Then, an instance of MapX should be available at http://dev.mapx.localhost:8880/, for which the source code from ./app/ is mounted as /app/ in the container.
Note for auto-translation:
Automatic translation requires a valid Google Cloud config file, whose path should be set on the host (not inside the docker container) in an environment variable named GOOGLE_APPLICATION_CREDENTIALS, accessible from your local node. You can test this with:
$ node -e 'console.log(process.env.GOOGLE_APPLICATION_CREDENTIALS)'
# Should return, as an example :
# > /home/<your name>/.google_cloud_<your service>.json
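If the variable is not set yet, it can be exported in your shell or shell profile; the path below is purely illustrative:
# Example only: point the variable to your own Google Cloud credentials file
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/.google_cloud_myservice.json"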
Set up the environment variables for the api service in mapx.dev.env as follows:
API_HOST=api
API_PORT=3030
API_PORT_DEV=3333
API_PORT_PUBLIC=8880
API_HOST_PUBLIC=api.mapx.localhost
API_HOST_PUBLIC_DEV=apidev.mapx.localhost
Start the Express.js
development server:
$ docker compose up -d
$ docker compose exec api_dev node inspect index.js port=3333
debug> c
The instance should now use the api service at http://apidev.mapx.localhost:8880/, for which the source from ./api/ is mounted as /api/ in the container.
If you want to use the production api service instead of the api_dev one, set up the environment variables in mapx.dev.env as follows:
API_HOST=api
API_PORT=3030
API_PORT_DEV=3030
API_PORT_PUBLIC=8880
API_HOST_PUBLIC=api.mapx.localhost
API_HOST_PUBLIC_DEV=api.mapx.localhost
MapX uses a custom end-to-end testing tool, which relies on the MapX sdk. Test coverage is partial, but should cover most MapX features while also testing the sdk itself, as all tests are written using common sdk async methods.
cd app/src/js/sdk
npm run tests
Run tests within the development container:
docker compose exec api sh
npm run test
docker compose exec routines node inspect routines.js
debug> c
A sample dataset of country boundaries (polygons) is included in this code repo and adds a table named mx_countries to the database. The main purpose of this layer is cropping datasets during export.
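For illustration only, a cropping query of this kind could look like the sketch below; the source table some_layer, the geom columns and the iso3 country code column are assumptions, not the actual MapX schema:
# Hypothetical example: clip a layer to a single country polygon from mx_countries
# (some_layer, geom and iso3 are assumed names, not the actual MapX schema)
docker compose exec pg psql -U {POSTGRES_USER} -c "
  SELECT ST_Intersection(l.geom, c.geom) AS geom
  FROM some_layer l
  JOIN mx_countries c ON ST_Intersects(l.geom, c.geom)
  WHERE c.iso3 = 'CHE';"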
Administrative boundaries generalized by UNEP/GRID-Geneva (2019) based on the Global Administrative Unit Layers (GAUL) dataset (G2015_2014), implemented by FAO within the CountrySTAT and Agricultural Market Information System (AMIS) projects (2015).
Starting with version 1.8.26, a lightweight version of mx_countries
is included.
It was generated with mapshaper using the following parameters:
- import:
- detect line intersections = true
- snap vertices = true
- simplification:
- prevent shape removal = true
- method: Visvalingam / weighted area
- percentage of removable points to retain: 3%
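A roughly equivalent mapshaper command line, given for illustration only (the input and output file names are placeholders, and the line-intersection check has no direct CLI flag):
# Approximate mapshaper equivalent of the settings above (file names are placeholders)
mapshaper -i countries.shp snap \
  -simplify weighted keep-shapes percentage=3% \
  -o countries_simplified.shp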
Once the simplification was done, the data was repaired in mapshaper
and then in QGIS 3.18
using the Fix geometries
tool. All geometries are valid according to GEOS rules.
The generalization was made using Feature Manipulation Engine (FME) with the following settings:
- Algorithm: Douglas (generalize)
- Generalization Tolerance: 0.02
- Preserve Shared Boundaries: Yes
- Shared Boundaries Tolerance: None
- Preserve Path Segment: No
Geometries obtained from FME have been repaired in PostGIS using the ST_MakeValid() function.
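A repair of that kind could be run as follows (sketch only; the geometry column name geom is an assumption):
# Sketch: repair invalid geometries in mx_countries (column name 'geom' is assumed)
docker compose exec pg psql -U {POSTGRES_USER} -c \
  "UPDATE mx_countries SET geom = ST_MakeValid(geom) WHERE NOT ST_IsValid(geom);"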
You are free to modify and/or adapt any Data provided for your own use, reproduction as well as unlimited use within your organization. The Data is licensed and distributed by UNEP/GRID-Geneva. Redistribution to a Third party or reseller is formally prohibited at any stage whatsoever by UNEP/GRID-Geneva.
Due to the generalization process, the administrative boundaries of the countries have been modified. Therefore, this dataset can only be used for unofficial cartographic purposes for global mapping using a scale not higher than 1:25 million. It should not be used in any way as a reference for national boundaries. Territorial information from this dataset does not imply the expression of any opinion whatsoever on the part of UNEP/GRID-Geneva concerning the legal status of any country, territory, city or area, or of its authorities, or concerning the delimitation of its frontiers or boundaries. The Data is being delivered to you “AS IS” and UNEP/GRID-Geneva makes no warranty as to its use or performance.
UNEP/GRID-Geneva cannot be held responsible for a misuse of this file and its consequences.
Procedure to follow if the PostgreSQL passwords need to be updated for security reasons (or any other reason).
- Launch the MapX stack with Docker Compose:
docker compose up
- Once your stack is up, update the PostgreSQL passwords in the environment file:
POSTGRES_PASSWORD
POSTGRES_USER_WRITE_PASSWORD
POSTGRES_USER_READ_PASSWORD
POSTGRES_USER_CUSTOM_PASSWORD
- Connect to PostgreSQL using psql:
docker compose exec pg psql -U {POSTGRES_USER}
- Queries to run in psql to update the passwords. Be careful to respect the order in which the queries are run:
ALTER ROLE {POSTGRES_USER_READ} WITH PASSWORD '{POSTGRES_USER_READ_PASSWORD}';
ALTER ROLE {POSTGRES_USER_WRITE} WITH PASSWORD '{POSTGRES_USER_WRITE_PASSWORD}';
ALTER ROLE {POSTGRES_USER_CUSTOM} WITH PASSWORD '{POSTGRES_USER_CUSTOM_PASSWORD}';
ALTER ROLE {POSTGRES_USER} WITH PASSWORD '{POSTGRES_PASSWORD}';
\q
- Force Compose to stop and recreate all containers to avoid any problems related to the password update:
docker compose up -d --force-recreate
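The same steps can also be scripted; a minimal sketch, assuming mapx.dev.env already contains the new passwords and uses plain KEY=VALUE lines:
# Load the updated variables and apply the new passwords in the documented order
set -a; . ./mapx.dev.env; set +a
docker compose exec -T pg psql -U "$POSTGRES_USER" <<SQL
ALTER ROLE $POSTGRES_USER_READ WITH PASSWORD '$POSTGRES_USER_READ_PASSWORD';
ALTER ROLE $POSTGRES_USER_WRITE WITH PASSWORD '$POSTGRES_USER_WRITE_PASSWORD';
ALTER ROLE $POSTGRES_USER_CUSTOM WITH PASSWORD '$POSTGRES_USER_CUSTOM_PASSWORD';
ALTER ROLE $POSTGRES_USER WITH PASSWORD '$POSTGRES_PASSWORD';
SQL
docker compose up -d --force-recreate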
© 2014-present unepgrid.ch