-
BTW, this is built on top of kerchunk and may be useful too:
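For context, the basic kerchunk pattern everything in this thread builds on looks roughly like this; a minimal sketch, not argopy code, assuming the kerchunk, fsspec and s3fs libraries and anonymous access to the GDAC sandbox:

import fsspec
import xarray as xr
from kerchunk.hdf import SingleHdf5ToZarr

url = "s3://argo-gdac-sandbox/pub/dac/coriolis/6903090/6903090_prof.nc"

# Scan the netcdf4/hdf5 file once and build an in-memory kerchunk reference set
with fsspec.open(url, mode="rb", anon=True) as f:
    refs = SingleHdf5ToZarr(f, url).translate()

# Open the references as a zarr store: only metadata and requested chunks are read
ds = xr.open_dataset(
    "reference://",
    engine="zarr",
    backend_kwargs={
        "consolidated": False,
        "storage_options": {
            "fo": refs,
            "remote_protocol": "s3",
            "remote_options": {"anon": True},
        },
    },
)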
-
I added an updated version of the above in argopy/argopy/stores/kerchunker.py (line 41 at commit 0f01c3c); excerpt:
>>> ncfile = "s3://argo-gdac-sandbox/pub/dac/coriolis/6903090/6903090_prof.nc"
>>> ak = ArgoKerchunker(store='memory')  # Default: keep kerchunk data in memory
>>> ak = ArgoKerchunker(store='local', root='.')  # Local storage in the current folder
>>> ak = ArgoKerchunker(store='local', root='/kerchunk_data_folder')  # Custom local storage folder
>>> ak = ArgoKerchunker(store=fsspec.filesystem('dir', path='s3://.../kerchunk_data_folder/', target_protocol='s3'))  # Remote storage
>>> ak.supported(ncfile)  # Check whether this netcdf file can be kerchunked
>>> ak.translate(ncfile)  # Takes one uri or a list of uris; requires the kerchunk library
>>> ak.to_kerchunk(ncfile)  # Takes one netcdf file uri, returns kerchunk json data (translate, or load from store)
>>> ak.pprint(ncfile)  # Pretty-print the kerchunk data
>>> ak.open_dataset(ncfile)  # Returns an xarray dataset of the netcdf file, using the zarr engine on the kerchunk data
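If the kerchunk data have been written to a store as json, they can also be consumed directly with fsspec's reference filesystem, which is presumably what open_dataset does internally. A sketch, where the json file name and anonymous S3 access are assumptions:

import fsspec
import xarray as xr

# Load a reference set previously written by, e.g., ArgoKerchunker(store='local');
# the file name here is hypothetical
fs = fsspec.filesystem(
    "reference",
    fo="./6903090_prof.json",
    remote_protocol="s3",
    remote_options={"anon": True},
)
ds = xr.open_dataset(fs.get_mapper(""), engine="zarr", consolidated=False)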
-
One more time I came across kerchunk, during ADMT25, but I finally understood what it is about! Thanks @DocOtak!
So I gave it a try and produced the following ArgoKerchunker class (attached below as the "Argo kerchunk helper class" snippet), which can be used as shown above.
I still need to figure out how to properly integrate this into argopy, and whether it is worth it compared to working on a more analysis-ready and cloud-optimized (ARCO) format. For instance, the resulting dataset has variables that are not cast properly. While argopy already provides the argo engine to properly open a netcdf file (i.e. cast all variables correctly), I wonder how to implement such casting lazily with zarr.
Don't hesitate to shoot your opinion below!
[Snippet: Argo kerchunk helper class]
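On the lazy casting question, one possible route is to open the kerchunk-backed store with dask chunks, since astype on a dask-backed variable only records the cast in the task graph and applies it chunk by chunk at compute time. A sketch, where the json file name, variable names, and target dtypes are purely illustrative, not argopy's actual cast rules:

import xarray as xr

# Open the kerchunk-backed dataset lazily; chunks={} yields dask-backed variables
ds = xr.open_dataset(
    "reference://",
    engine="zarr",
    chunks={},
    backend_kwargs={
        "consolidated": False,
        "storage_options": {
            "fo": "./6903090_prof.json",  # hypothetical kerchunk json from the store
            "remote_protocol": "s3",
            "remote_options": {"anon": True},
        },
    },
)

# Illustrative cast rules (not argopy's actual mapping)
cast_rules = {"CYCLE_NUMBER": "int64", "LATITUDE": "float64"}

for name, dtype in cast_rules.items():
    if name in ds:
        # astype on a dask-backed DataArray is lazy: no data is read or converted
        # until .compute()/.load() is called, and the cast then runs per chunk
        ds[name] = ds[name].astype(dtype)

That would keep the casting as part of the lazy pipeline rather than forcing a full read at open time.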