Transfer reprocessed HLS environmental justice granules and full provider metadata to production #110
Comments
Step 1 completed and verified in the veda-data-store S3 console (first ran with --dryrun).

I accidentally copied over the old/original STAC metadata in addition to the metadata for the reprocessed assets (CRS adjustment).

The following collections were updated:

I hand-picked a few items to check that the hrefs were correctly updated, and they were.

I created a notebook to validate the collections and marked that AC as done, given that they are valid: https://github.com/NASA-IMPACT/veda-data/compare/jt/validation-notebook?expand=1

I re-opened this issue because there is one last step for these reprocessed HLS data: publish to mcp-test and prod.
What
We need to transfer a custom subset of HLS granules from two collections in the staging version of veda-backend to production. These collections include detailed provider metadata and have complex storage paths in S3, so this work must be done by hand: the flat, minimal structure of the dataset pipelines used for many VEDA collections would lose the detailed granule structure and metadata, which are required for multi-band maps and analyses like those used in this notebook: https://nasa-impact.github.io/veda-docs/notebooks/quickstarts/hls-visualization.html
Background: These collections include a small number of tiles that were selected to spotlight before-and-after imagery from two environmental justice stories and to demonstrate how titiler+pgstac can be used to investigate that data in the cloud. Early HLS (Harmonized Landsat and Sentinel-2) data were selected for reprocessing because CRS data were missing from these older files. The data were downloaded from the host DAAC for reprocessing, so we need to copy this set of reprocessed files to veda-data-store and then update the STAC metadata so the hrefs point to the new veda-data-store location (this metadata is much more detailed than our in-house dataset metadata, so we do not want to execute the build-stac pipeline for these collections).
Historical VEDA HLS issues
1. Transfer the files from `s3://covid-eo-data/hlsl30-ej-reprocessed/` to `s3://veda-data-store/hlsl30-ej-reprocessed/` without modifying the detailed key paths. I.e. `s3://covid-eo-data/hlsl30-ej-reprocessed/2017/19QHA/HLS.L30.T19QHA.2017157T144341.v2.0/HLS.L30.T19QHA.2017157T144341.v2.0.B01.tif` will go to `s3://veda-data-store/hlsl30-ej-reprocessed/2017/19QHA/HLS.L30.T19QHA.2017157T144341.v2.0/HLS.L30.T19QHA.2017157T144341.v2.0.B01.tif`. Likewise, transfer `s3://covid-eo-data/hlss30-ej-reprocessed` to `s3://veda-data-store/hlss30-ej-reprocessed`.
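A minimal sketch of the transfer, assuming boto3 and credentials with read access to `covid-eo-data` and write access to `veda-data-store`; the bucket names and prefixes come from this issue, while the function names and dry-run behavior are illustrative:

```python
def dest_uri(key, dst_bucket="veda-data-store"):
    """Destination URI for a source key; the key path is preserved verbatim."""
    return f"s3://{dst_bucket}/{key}"

def copy_prefix(prefix, src_bucket="covid-eo-data",
                dst_bucket="veda-data-store", dry_run=True):
    """Copy every object under `prefix` into dst_bucket without altering keys."""
    import boto3  # imported lazily so dest_uri stays usable without AWS deps
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=src_bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            key = obj["Key"]  # unchanged in the destination bucket
            if dry_run:
                print(f"(dryrun) s3://{src_bucket}/{key} -> {dest_uri(key, dst_bucket)}")
            else:
                s3.copy({"Bucket": src_bucket, "Key": key}, dst_bucket, key)

# copy_prefix("hlsl30-ej-reprocessed/")                  # dry run first, as in step 1
# copy_prefix("hlss30-ej-reprocessed/", dry_run=False)   # then the real copy
```

An equivalent CLI approach is `aws s3 sync s3://covid-eo-data/hlsl30-ej-reprocessed/ s3://veda-data-store/hlsl30-ej-reprocessed/ --dryrun`, which also preserves key paths.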
2. Update the hrefs in STAC item records with a Python script or notebook that can be reused if needed. Load the items in the `hlsl30-ej-reprocessed` collection and iterate over the assets to update the hrefs (`covid-eo-data` will be replaced with `veda-data-store` in every asset href). Publish these updated items. Alternatively, the objects copied in the first step include STAC metadata that would be nice to update and use in this process (for example, `s3://covid-eo-data/hlss30-ej-reprocessed/2017/19QHA/HLS.S30.T19QHA.2017193T150719.v2.0/HLS.S30.T19QHA.2017193T150719.v2.0_stac-ej-reprocessed.json` is a complete STAC item record describing all band assets and thumbnails, but the asset hrefs point to `covid-eo-data`). Do the same for the `hlss30-ej-reprocessed` collection.
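The href rewrite above can be sketched as a pure function over a STAC item dict; the sample item below is illustrative (only the asset href matters here), and publishing the updated items (e.g. via pypgstac) is not shown:

```python
import copy

def rewrite_asset_hrefs(item, old_bucket="covid-eo-data", new_bucket="veda-data-store"):
    """Return a copy of a STAC item dict with every asset href moved to the new bucket."""
    updated = copy.deepcopy(item)
    for asset in updated.get("assets", {}).values():
        if "href" in asset:
            # only the bucket portion of the S3 URI changes; the key path is preserved
            asset["href"] = asset["href"].replace(
                f"s3://{old_bucket}/", f"s3://{new_bucket}/"
            )
    return updated

# illustrative item with a single band asset
item = {
    "id": "HLS.S30.T19QHA.2017193T150719.v2.0",
    "assets": {
        "B01": {
            "href": "s3://covid-eo-data/hlss30-ej-reprocessed/2017/19QHA/"
                    "HLS.S30.T19QHA.2017193T150719.v2.0/"
                    "HLS.S30.T19QHA.2017193T150719.v2.0.B01.tif"
        }
    },
}
print(rewrite_asset_hrefs(item)["assets"]["B01"]["href"])
```

Keeping the function pure (it never mutates its input) makes it easy to spot-check a few items against the copied objects before publishing anything.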
3. Clean up the reprocessed environmental justice story config in veda-data for `hlsl30-ej-reprocessed` and `hlss30-ej-reprocessed` (should already be done).