Note: This repository serves as a reference implementation for interacting with Planetary Computer APIs on Azure. This code supports the production deployment of the Planetary Computer Data Catalog and Explorer applications. This repository is not meant to be reusable in other situations without significant modification, and the repository maintainers cannot provide any support for non-development deployments of this code. Additionally, this application is under constant development, including some significant planned refactors.
That said, it is hoped that components or examples contained here will be helpful for users developing applications using the open-source components also used by the Planetary Computer, or against Planetary Computer APIs themselves. Please review the terms of use.
To file general issues or ask questions, please visit the Planetary Computer repository.
A homepage, data catalog, and visualizations for the Planetary Computer.
- Docker
- docker-compose
The entire development environment is created as part of a multi-container docker-compose setup. To fetch and build the images, run:

```shell
./scripts/update
```
Now start the development server:

```shell
./scripts/server
```

The site should be accessible at http://localhost:4280.
If you want to run just the frontend development server on your host, ensure you have Node 14 installed and run:

```shell
npm install
npm start
```
To build the latest docs or external notebooks, or if any new dependencies have been added, re-run `./scripts/update` (you may need to refresh the app in your browser if the site was running).
There are four main components to the application:
- `etl` - downloads and processes external files to be included in the application build
- `docs` - a sphinx-powered, markdown-based documentation system
- `api` - an Azure Function app that provides a lightweight backend
- `src` - the main React application, bootstrapped from Create React App
First, copy the `.env.sample` file to `.env`, and ensure the configuration values are set.
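As a sketch, a local-development `.env` might look like the following. The values are illustrative placeholders only (the Azure Maps client ID in particular must be retrieved from the Azure Portal):

```
REACT_APP_API_ROOT=https://planetarycomputer-staging.microsoft.com
REACT_APP_IMAGE_API_ROOT=http://localhost:8080/f
REACT_APP_AZMAPS_CLIENT_ID=<your Azure Maps client ID>
```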
| Name | Value | Description |
|---|---|---|
| `REACT_APP_API_ROOT` | https://planetarycomputer-staging.microsoft.com | The root URL for the STAC API: production, staging, or a local instance such as http://localhost:8080/stac. A URL ending in `stac` is a special case that is handled by replacing `stac` with the target service, e.g. `data` or `sas`. |
| `REACT_APP_TILER_ROOT` | Optional | The root URL for the data tiler API, if not hosted from the same domain as the STAC API. |
| `REACT_APP_IMAGE_API_ROOT` | PC APIs `pcfunc` endpoint | The root URL for the image data API for animations. For a local instance, use http://localhost:8080/f. |
| `REACT_APP_AZMAPS_CLIENT_ID` | Retrieve from the Azure Portal | The client ID used to authenticate against Azure Maps. |
| `REACT_APP_ONEDS_TENANT_KEY` | Look up at https://1dswhitelisting.azurewebsites.net/ | Telemetry key (not needed for dev). |
| `REACT_APP_AUTH_URL` | Optional. URL to the root of a pc-session-api instance | Used to enable login. |
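The `stac`-suffix convention for `REACT_APP_API_ROOT` can be sketched as a small helper. This is an illustration of the documented rule only, not the application's actual code; `serviceRoot` is a hypothetical name:

```javascript
// Sketch of the REACT_APP_API_ROOT convention described above: a root URL
// ending in "stac" has that segment swapped for the target service name
// (e.g. "data" or "sas"); any other root is used as-is.
function serviceRoot(apiRoot, service) {
  return apiRoot.endsWith("stac")
    ? apiRoot.replace(/stac$/, service)
    : apiRoot;
}

console.log(serviceRoot("http://localhost:8080/stac", "data")); // → http://localhost:8080/data
```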
Run `./scripts/server --api` to launch a development server with a local Azure Functions host running.
In the local development setup, the Azure Maps token is generated using the local developer identity. Be sure to run `az login` and `az account set --subscription "Planetary Computer"` to ensure the correct token is generated. Your identity will also need the "Azure Maps Search and Render Data Reader" role, which can be assigned with:
```shell
USER_NAME=$(az account show --query user.name -o tsv)
az role assignment create \
  --assignee "$USER_NAME" \
  --role "Azure Maps Search and Render Data Reader" \
  --scope "/subscriptions/9da7523a-cb61-4c3e-b1d4-afa5fc6d2da9/resourceGroups/pc-datacatalog-rg/providers/Microsoft.Maps/accounts/pc-datacatalog-azmaps" \
  --subscription "Planetary Computer"
```
Note: you may need to assign this role via an identity that has JIT admin privileges enabled on the Planetary Computer subscription.
The `REACT_APP_API_ROOT` can be set to a local instance of the Metadata API if you are prototyping changes to collections, e.g., http://localhost:8080/stac. However, as a shortcut, you can also run the `./scripts/mockstac` script in order to locally serve a static JSON file from `/mockstac/collections`. Simply alter the contents of the JSON file as you need, set your `REACT_APP_API_ROOT` value to http://localhost:8866, and restart your dev server.
A simple feature flag system is included in the application. To add a flagged feature during development:
- Wrap the component to be shown or hidden with a `Feature` component, providing a `name`.
- In `/utils/featureFlags.js`, add an object to the list with `name`, `description`, and `active` (`bool`).
- Use the script in `/extra/ff-bookmarklet.js` to toggle features on a running site:
  - Use a JS minifier like https://javascript-minifier.com/ to minimize the file.
  - Add a new bookmark, and paste the minified JS as the URL. You'll likely need to add the `javascript:` label back to the front of the script.
- Toggle feature flags on or off. Be sure that the site works in both states.
- Remove the `Feature` component and the flag entry when the feature is mature enough to be included.
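The flag objects and lookup described above can be sketched as follows. The names are hypothetical and the runtime-override mechanism is an assumption; the actual flag list lives in `/utils/featureFlags.js`:

```javascript
// Sketch of the feature-flag shape described above (hypothetical entries).
const featureFlags = [
  { name: "newExplorer", description: "Redesigned Explorer layout", active: false },
];

// A flag is active if its entry says so, unless a runtime override
// (e.g. one set by the ff-bookmarklet) flips it.
function isFlagActive(name, overrides = {}) {
  const flag = featureFlags.find((f) => f.name === name);
  if (!flag) return false;
  return name in overrides ? Boolean(overrides[name]) : flag.active;
}

console.log(isFlagActive("newExplorer")); // → false
console.log(isFlagActive("newExplorer", { newExplorer: true })); // → true
```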
To debug or extend the small API backend, please read the API README.
The project contains Cypress end-to-end tests and Jest-based unit tests.
To run the Jest-based unit tests and perform linting and code format analysis, run `./scripts/test`.
The project contains a Prettier config file that is used to format code, which should integrate with your editor. Alternatively, you can run `./scripts/format` to format all files. The CI system will check for formatting errors.
If you're on WSL2, be sure to set up your system to run the Cypress GUI: https://wilcovanes.ch/articles/setting-up-the-cypress-gui-in-wsl2-ubuntu-for-windows-10/
You may also need to install Cypress locally on your computer, with `npm install cypress` and `./node_modules/.bin/cypress install`.
- Install Google Chrome in your WSL2 environment (Cypress ships with a Chromium-based Electron browser)
- Run `npm run cypress:open` to run the GUI and debug tests, or
- Run `npm run cypress:run` to run the headless version in the terminal
Both test suites are run from CI.
Note: the Cypress tests currently involve the `sentinel-2-l2a` collection, but the local backend only ships with `naip` out of the box, so the tests will fail against it.
Service | Port |
---|---|
Webpack Dev Server | 3000 |
Functions App Dev Server | 7071 |
Mock STAC API Server | 8866 |
| Name | Description |
|---|---|
| `clean` | Removes intermediate build files from docs and dataset codefiles |
| `format` | Runs black and prettier against Python and JavaScript/TypeScript files |
| `mockstac` | Serves the contents of `/mockstac/collections` at http://localhost:8866 |
| `server` | Runs the frontend development server |
| `test` | Runs unit tests and the linter |
| `update` | Installs dependencies and builds etl and docs content. Use `--devdocs` to develop against a local notebook repo. |
| `npm *` | Runs configured npm commands like `npm add`, `npm lint`, `npm test`, etc. |
There are three Azure Static Web App services enabled, for staging, test, and production. They are configured via the GitHub workflow files. Generally, merging to a deployment branch will initiate a build and deploy with the service framework:

- `develop`: deploys to staging and test (`pc-datacatalog`, `pc-datacatalog-test`)
- `main`: deploys to production (`pc-datacatalog-production`)
Opening a PR against either branch will also create an ephemeral staging environment, and a site link will be added to the PR comment section.
The release process can be managed with git flow, initialized with the default settings. To bring forth a production release, pull local `develop` and `main` to latest, and follow these steps:

1. Start a release:

   ```shell
   git flow release start X.Y.Z
   ```

2. Bump the version number in `package.json` and check it in:

   ```shell
   git status  # check that the staging area is clean
   git add package.json
   git commit -m "X.Y.Z"
   ```

3. Publish the release:

   ```shell
   git flow release publish X.Y.Z
   ```

4. Finish and push the release branch:

   - When prompted, keep the default commit messages
   - Use `X.Y.Z` as the tag message

   ```shell
   git flow release finish -p X.Y.Z
   ```
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.