Merge gfdl site fix into main (#329)
* add CM4 diag table to repo (#269)

* add CM4 diag table to repo

* swap diag table docx with trimmed-down text file

* Improve mamba usage in CI

- install mamba (https://github.com/mamba-org/mamba) first, and use it
to speed up installation of all other conda envs, including the one for
synthetic data. Note that currently only the dependency solve is
accelerated (https://github.com/mamba-org/mamba/issues/633), but this is
sufficient for a large improvement over conda.

- Remove the --mamba flag from the conda_env_setup.sh script's
arguments, as this was taken out in NOAA-GFDL/MDTF-diagnostics#166.

* Add temp extremes distshape to CI (#273)

* Initial commit of code from Arielle Catalano

* Correct extension for settings.jsonc

* Add freq=day to varlist for backwards compatibility

* Commit conda env for temp_extremes_distshape

* Commit conda env for temp_extremes_distshape

* Run temp_extremes_distshape in python_base instead of custom env

* adding new POD, documentation, and obs subset code

* Fix case of CF variable names

* Set Matplotlib backend in settings.jsonc

* Propagate gs command-line flags to GFDL branch

* Temporary debugging statements

* Update temp_extremes_distshape settings.jsonc

* Update format of settings.jsonc

* Add missing 'log' argument to GFDL data source attribute classes

* log InitVar must be 1st arg to __post_init__ due to inheritance
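
(For context, a minimal dataclass sketch, not the framework's actual classes: InitVar pseudo-fields are passed to __post_init__ in declaration order, and base-class fields come first, so a log InitVar declared on a base class must be the first argument after self.)

    import dataclasses

    @dataclasses.dataclass
    class Base:
        # InitVar is passed to __post_init__ but not stored as a field
        log: dataclasses.InitVar[object] = None
        name: str = ""

    @dataclasses.dataclass
    class Derived(Base):
        extra: dataclasses.InitVar[object] = None

        def __post_init__(self, log, extra):
            # base-class InitVars precede subclass InitVars, so 'log' comes first
            print("log:", log, "extra:", extra)

    Derived(log="my-logger", extra=42, name="x")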

* Restore missing class attributes to GFDL CMIP6 DataSources

* Convert .pdf documentation to .rst

* Fix obs filenames and templating in html

* Write temp files to WK_DIR, not POD_HOME

temp_extremes_distshapes code was writing temporary parameter .json
files to $POD_HOME, which should be read-only. Change root directory for
writing all these files from $POD_HOME to $WK_DIR.
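
(A minimal sketch of the intended pattern, using the framework-provided $WK_DIR and $POD_HOME environment variables; the file name and contents below are illustrative.)

    import json
    import os

    # POD_HOME holds the POD source and should be treated as read-only;
    # WK_DIR is the per-run working directory, so temporary files go there.
    wk_dir = os.environ["WK_DIR"]
    params = {"example_threshold": 0.5}  # hypothetical contents

    out_path = os.path.join(wk_dir, "TempExtDistShape_params.json")  # illustrative name
    with open(out_path, "w") as f:
        json.dump(params, f)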

* Correct variable names

* Correct method for colorbar tick labels

* Handle case where no contours are labeled

* Correct cartopy longitude set_extents

* Correct Moments_plot colorbar positioning

* Add documentation for set_extent() fix

* add test jsonc files for temp_extremes_distshape CMIP

* added test yaml file just for temp_extremes_distshape

* add call to create the inputdata directory before untarring the obs_data file to the test yaml

* add checks to test yaml

* Specify full paths in ubuntu set3 jsonc

* remove comments from test jsonc files
change CASENAME to match the format output by updated synthetic data generator in the set3 test jsonc files

* update mdtf_tests.yml to include this branch for testing before PR is submitted

* remove this branch from mdtf_tests.yml

* Switch paths back to relative locations in ubuntu set 3 jsonc

* change model name to synthetic in github actions set 3 jsonc files

* changed ls to just show temp_extremes_distshape obs_data directory

* fixed path in ls

* added print statements for path checks

* add print statement to group_relative_links

* removed print statements from output_manager and verify_links

* changed test yml to experimental status and added ls for wkdir after POD runs

* fix typo

* remove some test lines in test_cmip.yml

* change exit status for missing files to 0 for debugging in verify_links.py

* remove debugging in verify_links
comment out pod.deactivate call in verify_links  for debugging

* removed PS from the output figure path in TempExtDistShape_CircComps_usp.py because the file is png, not postscript

* changed ls to wildcard in test_cmip.yml

* dump output log to terminal

* define output figure path as separate variable with os.path.join in TempExtDistShape_ShiftRatio_util.py

* revert changes to path name

* change set3 env to python3

* change cartopy to version 0.19 in env_python3_base.yml

* revert debugging mods to output_manager.py

* remove extraneous checks from mdtf_tests.yml
test yaml with MDTF_base environment for set 3 tests

* replace _MDTF_python3_base with _MDTF_base in mdtf_tests.yml set 3 tests

Co-authored-by: Thomas Jackson <tom.jackson314@gmail.com>
Co-authored-by: tsjackson-noaa <thomas.jackson@noaa.gov>
Co-authored-by: Arielle Catalano <acat2@ggr-ch-412-ac.psu.ds.pdx.edu>

* Indicate which settings are required, in website and CLI help

* Move --large-file flag higher in list of options, so that data_manager options can come last

* Clarify behavior of --site and plug-in settings

* Don't do full startup when only printing CLI help message

* Apply changes to CLI to NOAA_GFDL site

* Correct GFDL default paths to values in default_gfdl.jsonc

* Do more validation on input paths

* Do more input path validation in GFDL-specific code

* Validate CASE_ROOT_DIR before other checks

* Only validate non-empty CASE_ROOT_DIR

Fixes broken CI.

* Remove Tom from codeowners file

* Merge of Tropical Sea Level POD (#271)

* create pod

* main script update

output dir still needs to change to env var

* altimetry observational data preprocessing code

process the daily data from CMEMS to monthly data

* calculate the wind stress curl for model and obs

* unfinished

OMIP model variable is not in the example model

* small function to show used memory

* calculate the gridded area for observational data

* function designed for xr.Dataset

only used the da_linregress func in the main script

* create naming convention for OMIP CESM2

* setting of the config file

* update to relative path

with respect to $CODE_ROOT

* update the file name to MDTF $CASENAME format

* include the documentation of the tool

* change variable name to match fieldlist

* update the jsonc files

still cannot run

* remove unused function and repeating constant

* remove redundant imports and multiply-defined variables

* clear unused function

* change comment

* correct start year and end year to string

* change the OUTPUT_DIR setting

* correct the comma at the end of the list

* fixed the blank output dir

* syntax error corrected

* correct the jsonc format

* add print out

* correct the environment variable

* removed commented part that is not used

* include kwarg for earth radius for flexibility

* update the html format for this particular diag

* correct error and add references

* Update diagnostics/tropical_pacific_sea_level/doc/tropical_pacific_sea_level.rst

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

* Update diagnostics/tropical_pacific_sea_level/doc/tropical_pacific_sea_level.rst

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

* Update diagnostics/tropical_pacific_sea_level/doc/tropical_pacific_sea_level.rst

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

* Update diagnostics/tropical_pacific_sea_level/doc/tropical_pacific_sea_level.rst

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

* updated to f-string format for readability

* setting file works in version 2 (bk)

* add new format for version 3 setting file

* unfinished changes in the main script

* add observational data access info

* use the version2 format for now

* add the observational data info in the docstring

* save the new version3 format

* update preprocessing code

remove the open_mfdataset function
since it requires dask to be installed

* update from main branch

* testing tropical sea level diag

* update to the latest format

* update to the latest format and some fixes

suggested by Tom

* remove the previous version files

* Updates to AVISO processing script

- Implement faster os.scantree() for finding files
- Cleaned up xarray Dataset vs. DataArray objects
- Replaced for-loop with list comprehension
- Save output in 32-bit floats to save data volume
- Carry source variable attributes to output NetCDF file
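
(os.scantree() is not a standard-library function; presumably a recursive helper over os.scandir() along these lines is meant. A sketch, not the script's actual code.)

    import os

    def scantree(path):
        """Recursively yield os.DirEntry objects for regular files under path."""
        for entry in os.scandir(path):
            if entry.is_dir(follow_symlinks=False):
                yield from scantree(entry.path)
            elif entry.is_file():
                yield entry

    # e.g. collect all NetCDF files under a (hypothetical) data directory:
    nc_files = sorted(e.path for e in scantree(".") if e.name.endswith(".nc"))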

* create pod

* main script update

output dir still needs to change to env var

* altimetry observational data preprocessing code

process the daily data from CMEMS to monthly data

* calculate the wind stress curl for model and obs

* unfinished

OMIP model variable is not in the example model

* small function to show used memory

* calculate the gridded area for observational data

* function designed for xr.Dataset

only used the da_linregress func in the main script

* create naming convention for OMIP CESM2

* setting of the config file

* update to relative path

with respect to $CODE_ROOT

* update the file name to MDTF $CASENAME format

* include the documentation of the tool

* change variable name to match fieldlist

* update the jsonc files

still cannot run

* remove unused function and repeating constant

* remove redundant imports and multiply-defined variables

* clear unused function

* change comment

* correct start year and end year to string

* change the OUTPUT_DIR setting

* correct the comma at the end of the list

* fixed the blank output dir

* syntax error corrected

* correct the jsonc format

* add print out

* correct the environment variable

* removed commented part that is not used

* Update diagnostics/tropical_pacific_sea_level/doc/tropical_pacific_sea_level.rst

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

* Update diagnostics/tropical_pacific_sea_level/doc/tropical_pacific_sea_level.rst

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

* Update diagnostics/tropical_pacific_sea_level/doc/tropical_pacific_sea_level.rst

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

* Update diagnostics/tropical_pacific_sea_level/doc/tropical_pacific_sea_level.rst

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

* updated to f-string format for readability

* include kwarg for earth radius for flexibility

* update the html format for this particular diag

* correct error and add references

* setting file works in version 2 (bk)

* add new format for version 3 setting file

* unfinished changes in the main script

* add observational data access info

* use the version2 format for now

* add the observational data info in the docstring

* save the new version3 format

* update preprocessing code

remove the open_mfdataset function
since it requires dask to be installed

* update from main branch

* testing tropical sea level diag

* update to the latest format

* update to the latest format and some fixes

suggested by Tom

* remove the previous version files

* Updates to AVISO processing script

- Implement faster os.scantree() for finding files
- Cleaned up xarray Dataset vs. DataArray objects
- Replaced for-loop with list comprehension
- Save output in 32-bit floats to save data volume
- Carry source variable attributes to output NetCDF file

* Fixed tropical sea level jsonc syntax errors

- removed trailing comma
- removed "dimensions_ordered": true, line

* Updates default_tests.jsonc

- Renamed "CESM2" --> "CESM"
- Removed hard-coded anaconda paths

* Updates to sea level POD settings file

- Simplified settings for debugging
- Renamed nlat/nlon to lat/lon
- Commented out areacello for testing

* Added sea level fields to NCAR and CMIP field lists

- areacello, tauuo, tauvo, and zos

* Separate testing jsonc file for NCAR Synthetic

- Making life easier for testing

* Fix dimension elimination for 2D lat/lon coords

* Make translated var axes check more permissive

- cf-xarray mod needed to understand curvilinear coordinates;
  until that fix is in place, this check will fail for
  grids that have 2-dimensional coordinates
- changed fatal exception to a warning
- wrapped rest of the coordinate metadata checks
  inside an if-block
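
(A sketch of the pattern described in this commit, not the actual xr_parser code: demote the fatal error to a logged warning and gate the remaining metadata checks.)

    import logging

    _log = logging.getLogger(__name__)

    def check_axes(translated_axes, dataset_axes):
        # Hypothetical stand-in for the translated-variable axes check.
        axes_match = set(translated_axes) == set(dataset_axes)
        if not axes_match:
            # previously a fatal exception; warn instead, since 2-D
            # (curvilinear) coordinates are not yet understood by cf-xarray
            _log.warning("axes mismatch: %s vs %s", translated_axes, dataset_axes)
        if axes_match:
            # rest of the coordinate metadata checks go here
            pass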

* Updated tropical sea level pod settings file

* Updated NCAR field list for sea level POD

- now includes tauuo and tauvo

* Updated year range for default case list

* Updated to tropical sea level diagnostic script

- Mostly edits to opening datasets

* Bypass Dask parallelization

- Not working; commented out for now.

* Fixed indexing for tropical sea level curl calc

- indices appear to be missing for central differences
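
(For reference, a NumPy sketch of a central-difference stencil with one-sided endpoints; the POD's actual curl code and variable names differ.)

    import numpy as np

    def ddx_central(f, dx):
        """Central differences in the interior; one-sided at the two endpoints."""
        dfdx = np.empty_like(f, dtype=float)
        dfdx[1:-1] = (f[2:] - f[:-2]) / (2.0 * dx)   # interior: (f[i+1] - f[i-1]) / 2dx
        dfdx[0] = (f[1] - f[0]) / dx                 # forward difference at the start
        dfdx[-1] = (f[-1] - f[-2]) / dx              # backward difference at the end
        return dfdx

    # wind stress curl ~ d(tauy)/dx - d(taux)/dy would apply this along each axis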

* Modified model read statements for tropical sl POD

- More native Xarray logic

* Patches for obs. data time issues

- impacts tropical sea level pod
- addresses mismatch between time axis and data length
- hard-coded date ranges for now; needs to be generalized

* Specify dim names before regional averaging

- impacts tropical sea level pod
- needs to be generalized for all multidimensional coordinate
  grids

* incorporate changes from John

* using CESM2 testing

* used the xr_ufunc with dask disabled

load the dataset before function call

* multiple updates

1. allow xaxis and yaxis name options in pod_env_var
2. allow selecting different obs start year and end year in pod_env_var
3. fix the ylim and xlim settings in the trend plot

* add new pod_env_var for obs year and axis name

* fixed the dimension mismatch in the model calculation

* function calculate wind stress curl for obs

* function calculate cell area in obs

* process predefined obs mean trend and season sig

* linear regression for linear trend calculation

* add the predefined flag for obs to speed up the calculation

* add the predefined obs option in the main.py

* pylint and black format corrected

* add transpose in functions to ensure consistent dim order

* address some LGTM error

* create pod

* main script update

output dir still needs to change to env var

* altimetry observational data preprocessing code

process the daily data from CMEMS to monthly data

* calculate the wind stress curl for model and obs

* unfinished

OMIP model variable is not in the example model

* small function to show used memory

* calculate the gridded area for observational data

* function designed for xr.Dataset

only used the da_linregress func in the main script

* create naming convention for OMIP CESM2

* setting of the config file

* update to relative path

with respect to $CODE_ROOT

* update the file name to MDTF $CASENAME format

* include the documentation of the tool

* change variable name to match fieldlist

* update the jsonc files

still cannot run

* remove unused function and repeating constant

* remove redundant imports and multiply-defined variables

* clear unused function

* change comment

* correct start year and end year to string

* change the OUTPUT_DIR setting

* correct the comma at the end of the list

* fixed the blank output dir

* syntax error corrected

* correct the jsonc format

* add print out

* correct the environment variable

* removed commented part that is not used

* Update diagnostics/tropical_pacific_sea_level/doc/tropical_pacific_sea_level.rst

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

* Update diagnostics/tropical_pacific_sea_level/doc/tropical_pacific_sea_level.rst

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

* Update diagnostics/tropical_pacific_sea_level/doc/tropical_pacific_sea_level.rst

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

* Update diagnostics/tropical_pacific_sea_level/doc/tropical_pacific_sea_level.rst

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

* updated to f-string format for readability

* include kwarg for earth radius for flexibility

* update the html format for this particular diag

* correct error and add references

* setting file works in version 2 (bk)

* add new format for version 3 setting file

* unfinished changes in the main script

* add observational data access info

* use the version2 format for now

* add the observational data info in the docstring

* save the new version3 format

* update preprocessing code

remove the open_mfdataset function
since it requires dask to be installed

* update from main branch

* testing tropical sea level diag

* update to the latest format

* update to the latest format and some fixes

suggested by Tom

* remove the previous version files

* Updates to AVISO processing script

- Implement faster os.scantree() for finding files
- Cleaned up xarray Dataset vs. DataArray objects
- Replaced for-loop with list comprehension
- Save output in 32-bit floats to save data volume
- Carry source variable attributes to output NetCDF file

* Fixed tropical sea level jsonc syntax errors

- removed trailing comma
- removed "dimensions_ordered": true, line

* Updates default_tests.jsonc

- Renamed "CESM2" --> "CESM"
- Removed hard-coded anaconda paths

* Updates to sea level POD settings file

- Simplified settings for debugging
- Renamed nlat/nlon to lat/lon
- Commented out areacello for testing

* Added sea level fields to NCAR and CMIP field lists

- areacello, tauuo, tauvo, and zos

* Separate testing jsonc file for NCAR Synthetic

- Making life easier for testing

* Fix dimension elimination for 2D lat/lon coords

* Make translated var axes check more permissive

- cf-xarray mod needed to understand curvilinear coordinates;
  until that fix is in place, this check will fail for
  grids that have 2-dimensional coordinates
- changed fatal exception to a warning
- wrapped rest of the coordinate metadata checks
  inside an if-block

* Updated tropical sea level pod settings file

* Updated NCAR field list for sea level POD

- now includes tauuo and tauvo

* Updated year range for default case list

* Updated to tropical sea level diagnostic script

- Mostly edits to opening datasets

* Bypass Dask parallelization

- Not working; commented out for now.

* Fixed indexing for tropical sea level curl calc

- indices appear to be missing for central differences

* Modified model read statements for tropical sl POD

- More native Xarray logic

* Patches for obs. data time issues

- impacts tropical sea level pod
- addresses mismatch between time axis and data length
- hard-coded date ranges for now; needs to be generalized

* Specify dim names before regional averaging

- impacts tropical sea level pod
- needs to be generalized for all multidimensional coordinate
  grids

* Revert "Make translated var axes check more permissive"

This reverts commit d68769c6429edff559c821ab3bca8bb2ed3c3c8c.

* fix the missing value caused by model lon range

the model lon range for the synthetic data is -270~90;
the original code did not convert the lon range to
0-360, which causes the model to pick a region that
does not exist in the synthetic data
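
(A sketch of the 0-360 wrap described above, assuming an xarray dataset and a coordinate named "lon"; names are illustrative.)

    import numpy as np

    def to_0360(ds, lon_name="lon"):
        """Wrap an xarray dataset's longitudes into [0, 360) and re-sort them."""
        ds = ds.assign_coords({lon_name: np.mod(ds[lon_name], 360.0)})
        return ds.sortby(lon_name)

    # e.g. a model grid running -270..90 becomes 0..360 before .sel(lon=slice(...))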

* Remove unused imports from createCMIP6CV.py

* Delete default_tests_ncar_synthetic.jsonc

* remove user modifications from default_tests.jsonc

* remove duplicate variable entries from fieldlist_CMIP.jsonc

* remove changes to xr_parser.py

Co-authored-by: chiaweih@climate <chiaweih@email.arizona.edu>
Co-authored-by: Chia-Wei Hsu <chiaweh2@users.noreply.github.com>
Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* Add surface variable support (#266)

* added modifiers.jsonc with entry for atmospheric height

added modifiers att to core.VariableTranslator __init__ method

added modifier att to data_model.DMDependentVariable class and the comments of classes that inherit it

corrected modifier to modifiers

* removed long_name att from modifiers.jsonc

* added check for modifiers to DMDependentVariable __post_init__ method

* change second temp_d index to entry.modifiers, remove invalid comment

* changed argument axes_set to modifiers in translate, from_CF, and from_CF_name

* added modifiers to NoTranslateFieldlist TranslateVarlistEntry, and changed axes_set to modifiers in from_CF, and from_CF_name

* changed modifiers to modifier

* change remaining axes_set arguments to modifier in core.py

* changed modifier.jsonc to modifiers.jsonc in core.py

* revert change in _ndim_to_axes_set name

* added develop branch exit call changes to core.py

* attempt to fix .strip .lower spec

* change self.modifiers to self.modifier

* add modifier att to TREFHT, tas, and t_ref in convention jsonc files

* fixed CMIP fieldlist

* added modifier attribute to temperature variable in example settings.jsonc file

* add core import and redefine VariableTranslator call in data_model.py

* remove space in description

* added modifiers documentation to ref_settings.rst

* add preliminary version of test_variable_translator_bad_modifier

* move the fieldlist convention reads to its own method, and replace MDTFFramework.configure call to variable translator with VariableTranslator.read_convention

* fix the variableTranslator and read_conventions calls

* rework tests that call VariableTranslator to use read_conventions after initialization

* added check that modifier is an empty string to from_CF routine
removed extraneous parentheses from routines in core.py

* refined modifier check to differentiate between 3d and 4d variable entries. Still needs work.

* added variable dimension size and argument num_dims to from_CF

* Changed default modifier arguments back to None in core.py routines
changed from_CF to check if modifier is false before checking lut entries since empty strings and None types will evaluate to False
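
(A small sketch of the falsiness point above; the lookup-table structure shown is hypothetical.)

    def lookup_variable(lut, standard_name, modifier=None):
        # Both None and "" evaluate to False, so a single truthiness test covers
        # "no modifier requested" before consulting modifier-specific entries.
        if not modifier:
            return lut[standard_name]["default"]
        return lut[standard_name][modifier]

    lut = {"air_temperature": {"default": "ta", "atmos_height": "tas"}}
    assert lookup_variable(lut, "air_temperature") == "ta"
    assert lookup_variable(lut, "air_temperature", "") == "ta"
    assert lookup_variable(lut, "air_temperature", "atmos_height") == "tas"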

* fixed changes in from_CF that got overwritten by develop branch sync

* added atmos_height modifier attribute to tas in temp_extremes_distshape settings.jsonc file
removed modifier assignment to frozenset in from_CF because it is a single string rather than a list

* missed adding the file with the frozenset change in the prior commit

* fix typo in data_model.py DMDependentVariable error message
fix spacing for inline comments

* fix issues with modifier test in test_core.py
add a check for a correct modifier entry to modifier test

* revert spacing change to data_manager.py

* revert changes to default_tests.jsonc again

* Update src/core.py

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

* add 3D and 4D wind_speed fields to CMIP fieldlist (#285)

This requires PR #266 to be merged first.

* Add 3D specific humidity to CMIP fieldlist (#286)

* Fix FRE integration and improve documentation (#245)

* Update FRE wrapper (mdtf_gfdl.csh)

Update the FRE wrapper script to use current flags.

Do not load a python env module; instead run the same commands as the
interactive wrapper script to invoke the site-installed conda env before
running the package.

Disable functionality to make a copy of output for hosting on an
internal website, since updates aren't being made to Dora. Retain code
for when this functionality is re-enabled.

* More debugging statements in FRE wrapper

* Pass through args in wrapper; set --convention default to GFDL

* Add more detail on wrapper script to GFDL docs

* Convention defaults for GFDL_PP, LocalFile data sources

Set the default --convention assumed for SampleLocalFileDataSource to
"CMIP", to be consistent with what's written in the documentation.

Set the default --convention assumed for the GFDL_PP data source to
"GFDL", since this is the most common use case (but not the only one).
Mention this in the docs. Take logic to set this default value out of
the mdtf_gfdl.csh wrapper script.

* Better regex in FRE wrapper

* Restore model component-specific options

Allow --component and --chunk_freq flags to be passed to the GFDL_PP
data source to restrict the data query to files with those attributes.

Add functionality to the FRE wrapper script to handle the real use
cases. Pass --multi_component to the wrapper to invoke "frepp mode"
(multiple runs of the package on same data, as it's being output from
postprocessing); pass --component_only to restrict the model component
used.

Update and expand the docs for these settings to explain all this.

* Refine model component-specific logic in FRE flags

Remove the --multi_component/--any_component flag from the mdtf_gfdl.csh
wrapper, and replace it with --run_once.

This is because the sensible default behavior for the wrapper should be
the opposite of what was previously implemented: the default scenario
for when the package is called from FRE is the incremental/online
processing use case ("scenario 3B" in the docs).

Non-default use cases are when the package is invoked only once, after
all needed data is postprocessed (scenario 3A), or when the user wants
to manually restrict the operation to the given model components. These
are invoked via the --run_once or --component_only flags in the wrapper
script.

* Update all references to GFDL site installation

Remove all paths and references to the old GFDL site installation in
oar.gfdl.mdteam and replace with their counterparts in the oar.gfdl.mdtf
role account.

* Fix .rst formatting bug in docs

* More .rst tweaks

* Tweaks to default data source convention specification

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* change 3d wind_speed to sfcWind in CMIP fieldlist (#288)

* change 3D wind_speed to sfcWind in CMIP fieldlist

* change 3D hus to huss in CMIP fieldlist

hus is for 4D specific humidity, while huss is for 3D (near-surface) specific humidity.

* Document internal APIs (#274)

* Initial commit of custom CSS for autodoc (only)

Attempt to tweak spacing of elements in autodoc pages. Styling of rest
of documentation site is unchanged.

* Fix docstring 'Returns:' formatting

* Docstring formatting fixes

* Add manually written doc section to framework TOC

* Tweak autodoc settings to remove unneeded output

* Add more space, grouping between class docstrings

* Improve mamba usage in CI

- install mamba (https://github.com/mamba-org/mamba) first, and use it
to speed up installation of all other conda envs, including the one for
synthetic data. Note that currently only the dependency solve is
accelerated (https://github.com/mamba-org/mamba/issues/633), but this is
sufficient for a large improvement over conda.

- Remove the --mamba flag from the conda_env_setup.sh script's
arguments, as this was taken out in NOAA-GFDL/MDTF-diagnostics#166.

* Add temp extremes distshape to CI (#273)

* Initial commit of code from Arielle Catalano

* Correct extension for settings.jsonc

* Add freq=day to varlist for backwards compatibility

* Commit conda env for temp_extremes_distshape

* Commit conda env for temp_extremes_distshape

* Run temp_extremes_distshape in python_base instead of custom env

* adding new POD, documentation, and obs subset code

* Fix case of CF variable names

* Set Matplotlib backend in settings.jsonc

* Propagate gs command-line flags to GFDL branch

* Temporary debugging statements

* Update temp_extremes_distshape settings.jsonc

* Update format of settings.jsonc

* Add missing 'log' argument to GFDL data source attribute classes

* log InitVar must be 1st arg to __post_init__ due to inheritance

* Restore missing class attributes to GFDL CMIP6 DataSources

* Convert .pdf documentation to .rst

* Fix obs filenames and templating in html

* Write temp files to WK_DIR, not POD_HOME

temp_extremes_distshapes code was writing temporary parameter .json
files to $POD_HOME, which should be read-only. Change root directory for
writing all these files from $POD_HOME to $WK_DIR.

* Correct variable names

* Correct method for colorbar tick labels

* Handle case where no contours are labeled

* Correct cartopy longitude set_extents

* Correct Moments_plot colorbar positioning

* Add documentation for set_extent() fix

* add test jsonc files for temp_extremes_distshape CMIP

* added test yaml file just for temp_extremes_distshape

* add call to create the inputdata directory before untarring the obs_data file to the test yaml

* add checks to test yaml

* Specify full paths in ubuntu set3 jsonc

* remove comments from test jsonc files
change CASENAME to match the format output by updated synthetic data generator in the set3 test jsonc files

* update mdtf_tests.yml to include this branch for testing before PR is submitted

* remove this branch from mdtf_tests.yml

* Switch paths back to relative locations in ubuntu set 3 jsonc

* change model name to synthetic in github actions set 3 jsonc files

* changed ls to just show temp_extremes_distshape obs_data directory

* fixed path in ls

* added print statements for path checks

* add print statement to group_relative_links

* removed print statements from output_manager and verify_links

* changed test yml to experimental status and added ls for wkdir after POD runs

* fix typo

* remove some test lines in test_cmip.yml

* change exit status for missing files to 0 for debugging in verify_links.py

* remove debugging in verify_links
comment out pod.deactivate call in verify_links  for debugging

* removed PS from the output figure path in TempExtDistShape_CircComps_usp.py because the file is png, not postscript

* changed ls to wildcard in test_cmip.yml

* dump output log to terminal

* define output figure path as separate variable with os.path.join in TempExtDistShape_ShiftRatio_util.py

* revert changes to path name

* change set3 env to python3

* change cartopy to version 0.19 in env_python3_base.yml

* revert debugging mods to output_manager.py

* remove extraneous checks from mdtf_tests.yml
test yaml with MDTF_base environment for set 3 tests

* replace _MDTF_python3_base with _MDTF_base in mdtf_tests.yml set 3 tests

Co-authored-by: Thomas Jackson <tom.jackson314@gmail.com>
Co-authored-by: tsjackson-noaa <thomas.jackson@noaa.gov>
Co-authored-by: Arielle Catalano <acat2@ggr-ch-412-ac.psu.ds.pdx.edu>

* Abbreviate logger in function/method signatures

* Set more prominent CSS highlight color

* Expand docstrings for src/util

* Commit manual doc for util subpackage

* Stop sphinx-apidoc from importing unit tests

* Fix mocking of external imports when docs are built

* Sphinx CSS tweaks

Increase item spacing, fix shading, add border for code literal
blockquotes

* Improve coverage of class attributes

Set inherited-members to True to include inherited dataclass fields and
simplify navigating the class hierarchy. Define skip_members_handler() to
remove docstrings of methods we don't want to include (eg from stdlib).
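
(A minimal conf.py sketch of the autodoc-skip-member hook named above; the filtering rule shown is only an example.)

    # in docs/conf.py (sketch)
    def skip_members_handler(app, what, name, obj, skip, options):
        # Returning True tells autodoc to omit the member; here we drop members
        # defined outside the package (e.g. inherited from the stdlib).
        module = getattr(obj, "__module__", "") or ""
        if module and not module.startswith("src"):
            return True
        return skip

    def setup(app):
        app.connect("autodoc-skip-member", skip_members_handler)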

* Update 'supporting modules' docstrings

* Partial commit of updated docstrings for main modules

* Fix documentation .rst syntax errors

* Initial commit of remaining manually-generated internal API docs

* Split up data source docs across multiple pages

Include sphinx links to relevant docs in module summary docstrings.

* More crossreferences in docs

* Fix module docstrings

Sphinx autosummary only includes first sentence of docstring, not first
paragraph.

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>
Co-authored-by: Thomas Jackson <tom.jackson314@gmail.com>
Co-authored-by: Arielle Catalano <acat2@ggr-ch-412-ac.psu.ds.pdx.edu>

* update env_base.yml (#289)

Incorporate common packages required to run mld (and other) PODs

* Revert "update env_base.yml (#289)" (#290)

This reverts commit 60632d3a8828a395ab9b5fb365b63b6604adcc11.

* Change cf_time calendar logic (#291)

* Change cf_time calendar logic

* remove check for cftime calendar in tcoord.values

The cftime calendar attribute is most likely present in other t_coord and ds features that are checked, so remove
unnecessary assumption of calendar att in t_coord.values[0]
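
(A sketch of the revised assumption, not the xr_parser implementation: read the calendar from coordinate metadata, falling back to the first value only if needed.)

    def get_calendar(t_coord):
        """Prefer calendar metadata on the time coordinate; fall back to the values."""
        cal = t_coord.encoding.get("calendar") or t_coord.attrs.get("calendar")
        if cal is None and len(t_coord.values):
            # older logic assumed a cftime object at values[0]; keep only as a fallback
            cal = getattr(t_coord.values[0], "calendar", None)
        return cal or "standard"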

* Add metpy to python3_base yaml (#294)

The metpy package will support the incoming ocn_flux_matrix POD and others that require it

* add test yaml for trop pac sl pod (#281)

* add test yaml for trop pac sl pod

* updated tpsl test yaml

* Fix branch name

* Change mdtf_test_data version to 1.0.4

* change mdtf-test-data version to 1.0.4.post1 in test yaml

* add tropical pacific sea level POD to set3 test yaml

* fix branch name in test yaml

* fix directory location in test data untar stage

* change tauuo and tauvo standard names in settings file to match names in CMIP fieldlist

* fix tauuo and tauvo standard_names in tpsl settings.jsonc

* update mdtf_test_data version in mdtf_tests.yml
add temp_extremes_distshape PODs back to set3 test jsonc files
add trop_pac_sl obs data tarball fetch to mdtf_tests.yml

* remove test_tpsl.yml from repository

* Add base yaml pkgs (#297)

* add NCO to env_NCL_base.yml

* add subprocess to env_base.yml

* fix subprocess module name

* Mld debug (#283)

* Add CC changes

Edit paths temporarily

Tidy up

Adding mld calculation

Working figures

Tweaks

Separate MLD branch works

Add documentation and tidy up html

Edit contact

Add info to header

Fix issues flagged by LGTM

Remove testing configuration file and unused pod env vars

Edited settings.jsonc, broke something

This works, but thetao not added to settings

Problem with settings.jsonc file for POD mixed_layer_depth

Remove comments from settings.jsonc

add depth coordinate to fieldlist_CMIP.jsonc

remove comment lines from mld settings file
change variable coordinate names to i and j to match model data

adjust default_tests.jsonc for testing

add lev entry for ocean depth to fieldlist_CMIP.jsonc

add check for axis att when defining ds_axes in reconcile_scalar_coords

environment variables and path definitions
reorganized and cleaned up some routines and calls
added logic to rename lat and lon to latitude and longitude if present in model data ds
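
(A sketch of the rename logic described above, assuming an xarray Dataset ds.)

    def standardize_latlon_names(ds):
        """Rename lat/lon to latitude/longitude if the short names are present."""
        rename_map = {k: v for k, v in {"lat": "latitude", "lon": "longitude"}.items()
                      if k in ds.coords or k in ds.variables}
        return ds.rename(rename_map) if rename_map else ds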

changed units to match definitions in the CMIP fieldlist and ensure functionality with synthetic data
general cleanup

revert default_tests.jsonc changes

remove commented out definition of ds_axes from xr_parser.py

Update src/xr_parser.py

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

Add density calculation description

Update diagnostics/mixed_layer_depth/mixed_layer_depth.py

Co-authored-by: John Krasting <John.Krasting@noaa.gov>

removed mld environment yaml

* add gsw and xesmf to env_python3_base.yml

* fixed comments for computemean in mixed_layer_depth.py

* remove unused sigma2 computation from computemld; substitute pressure.lev for sigma2.lev in mld

Co-authored-by: lettie_roach <lettie.roach@gmail.com>

* Enso mse (#292)

* Create xx

* Delete xx

* Create xx

* Create xx

* Create xx

* Create xx

* Delete xx

* Create xx

* Add files via upload

* Delete xx

* Add files via upload

* Delete xx

* Add files via upload

* Delete xx

* Create xx

* Create xx

* Create xx

* Add files via upload

* Delete xx

* Add files via upload

* Delete xx

* Add files via upload

* Delete xx

* Create xx

* Create xx

* Create xx

* Add files via upload

* Delete xx

* Add files via upload

* Delete xx

* Add files via upload

* Delete xx

* Create xx

* Create xx

* Add files via upload

* Delete xx

* Add files via upload

* Create xx

* Add files via upload

* Delete xx

* Create xx

* Add files via upload

* Delete xx

* Create xx

* Add files via upload

* Create xx

* Add files via upload

* Delete xx

* Delete xx

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Delete index.html

* Delete xx

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Delete default_test.jsonc

* Delete default_tests.jsonc.bak1

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Remove unused imports from composite.py

* Remove unused time import from enso_mse.py

* remove unused numpy import from mse_var.py

* remove unused numpy import from mse_var.py

* Delete mdtf

* Delete env_ENSO_RWS.yml

* Delete xr_parser.py

* Delete verify_links.py

* Delete date_label.py

* Revert changes to default_tests.jsonc

* Delete get_dimensions.py

* Delete get_lon_lat_plevels_in.py

* Delete get_season.py

* Delete plevs.txt

* Delete read_netcdf_2D.py

* Delete read_netcdf_3D.py

* Delete xx

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Delete README_LEVEL_01.pdf

* Delete README_LEVEL_02.pdf

* Delete README_LEVEL_03.pdf

* Delete README_LEVEL_04.pdf

* Delete ENSO_MSE.pdf

* Add files via upload

* remove whitespace from default_tests.jsonc

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* Feature/ocean flux matrix (New POD) (#293)

* add the necessary function and main script

not yet integrated into the framework

* POD name change and default test json

* add the necessary function and main script

not yet integrated into the framework

* POD name change and default test json

* add metpy package

* testing ocn_surf_flux_diag POD with framework

* surface variables add modifier

* test synthetic data

* update the output format in obs_data

* fix the numpy error due to framework inputs

* include function docstring

* correct the 0.98 factor for salinity effect

* add script description and produce netCDF output

* finish the rst description for the POD

* update the html for POD specific changes

* remove comments and correct the "more" part

* remove the unused function

* remove unused import

* remove unused import variable

* remove testing files

* remove unused print

* remove unused plotting command

* testing file

* update description

* remove obs plot

* change output plot name

* remove testing jsonc

* Update diagnostics/ocn_surf_flux_diag/ocn_surf_flux_diag.html

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* Update diagnostics/ocn_surf_flux_diag/ocn_surf_flux_diag.html

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* Update diagnostics/ocn_surf_flux_diag/ocn_surf_flux_diag.html

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* Update diagnostics/ocn_surf_flux_diag/ocn_surf_flux_diag.html

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* Update diagnostics/ocn_surf_flux_diag/ocn_surf_flux_diag.html

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* Update diagnostics/ocn_surf_flux_diag/ocn_surf_flux_diag.html

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* Enso rws (#223)

* reinitialized files

* new file:   diagnostics/ENSO_MSE/COMPOSITE/COMPOSITE.html
	new file:   diagnostics/ENSO_MSE/COMPOSITE/COMPOSITE.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/COMPOSITE_OBS.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/NCL/plot_composite_all.ncl
	new file:   diagnostics/ENSO_MSE/COMPOSITE/NCL/plot_composite_all_OBS.ncl
	new file:   diagnostics/ENSO_MSE/COMPOSITE/NCL/plot_correlation_all.ncl
	new file:   diagnostics/ENSO_MSE/COMPOSITE/NCL/plot_correlation_all_OBS.ncl
	new file:   diagnostics/ENSO_MSE/COMPOSITE/NCL/plot_regression_all.ncl
	new file:   diagnostics/ENSO_MSE/COMPOSITE/NCL/plot_regression_all_OBS.ncl
	new file:   diagnostics/ENSO_MSE/COMPOSITE/NCL_CONVERT/data_radiation_routine.ncl
	new file:   diagnostics/ENSO_MSE/COMPOSITE/NCL_CONVERT/data_radiation_routine_OBS.ncl
	new file:   diagnostics/ENSO_MSE/COMPOSITE/NCL_CONVERT/data_routine.ncl
	new file:   diagnostics/ENSO_MSE/COMPOSITE/NCL_CONVERT/data_routine_OBS.ncl
	new file:   diagnostics/ENSO_MSE/COMPOSITE/NCL_CONVERT/write_24month_netcdf.ncl
	new file:   diagnostics/ENSO_MSE/COMPOSITE/NCL_CONVERT/write_24month_netcdf_OBS.ncl
	new file:   diagnostics/ENSO_MSE/COMPOSITE/README_LEVEL_01.pdf
	new file:   diagnostics/ENSO_MSE/COMPOSITE/check_input_files.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/check_input_files_OBS.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/get_clima_in.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/get_correlation.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/get_data_in.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/get_data_in_24.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/get_directories.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/get_directories_OBS.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/get_flux_clima.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/get_flux_in.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/get_flux_in_24.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/get_nino_index.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/get_parameters_in.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/get_regression.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/preprocess.py
	new file:   diagnostics/ENSO_MSE/COMPOSITE/write_out.py
	new file:   diagnostics/ENSO_MSE/ENSO_MSE.html
	new file:   diagnostics/ENSO_MSE/ENSO_MSE.pdf
	new file:   diagnostics/ENSO_MSE/ENSO_MSE.py
	new file:   diagnostics/ENSO_MSE/MSE/MSE.html
	new file:   diagnostics/ENSO_MSE/MSE/MSE.py
	new file:   diagnostics/ENSO_MSE/MSE/MSE_OBS.py
	new file:   diagnostics/ENSO_MSE/MSE/NCL/plot_composite_all.ncl
	new file:   diagnostics/ENSO_MSE/MSE/NCL/plot_composite_all_OBS.ncl
	new file:   diagnostics/ENSO_MSE/MSE/README_LEVEL_02.pdf
	new file:   diagnostics/ENSO_MSE/MSE/check_input_files.py
	new file:   diagnostics/ENSO_MSE/MSE/check_input_files_OBS.py
	new file:   diagnostics/ENSO_MSE/MSE/get_clima_in.py
	new file:   diagnostics/ENSO_MSE/MSE/get_data_in.py
	new file:   diagnostics/ENSO_MSE/MSE/get_directories.py
	new file:   diagnostics/ENSO_MSE/MSE/get_directories_OBS.py
	new file:   diagnostics/ENSO_MSE/MSE/get_parameters_in.py
	new file:   diagnostics/ENSO_MSE/MSE/moist_routine_hadv.py
	new file:   diagnostics/ENSO_MSE/MSE/moist_routine_hdiv.py
	new file:   diagnostics/ENSO_MSE/MSE/moist_routine_madv.py
	new file:   diagnostics/ENSO_MSE/MSE/moist_routine_mdiv.py
	new file:   diagnostics/ENSO_MSE/MSE/moist_routine_mse.py
	new file:   diagnostics/ENSO_MSE/MSE/moist_routine_omse.py
	new file:   diagnostics/ENSO_MSE/MSE/moist_routine_tadv.py
	new file:   diagnostics/ENSO_MSE/MSE/write_out_mse.py
	new file:   diagnostics/ENSO_MSE/MSE/write_out_mse_clima.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/MSE_VAR.html
	new file:   diagnostics/ENSO_MSE/MSE_VAR/MSE_VAR.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/MSE_VAR_OBS.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/NCL/plot_bars_composite.ncl
	new file:   diagnostics/ENSO_MSE/MSE_VAR/NCL/plot_bars_composite_OBS.ncl
	new file:   diagnostics/ENSO_MSE/MSE_VAR/NCL_general/plot_bars_composite.ncl
	new file:   diagnostics/ENSO_MSE/MSE_VAR/NCL_general/plot_bars_composite_OBS.ncl
	new file:   diagnostics/ENSO_MSE/MSE_VAR/README_LEVEL_03.pdf
	new file:   diagnostics/ENSO_MSE/MSE_VAR/get_anomaly.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/get_clima_flux_in.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/get_clima_flux_in_OBS.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/get_clima_in.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/get_clima_in_OBS.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/get_data_in.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/get_data_in_OBS.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/get_directories.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/get_directories_OBS.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/get_flux_in.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/get_flux_in_OBS.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/get_lonlat_in_OBS.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/get_parameters_in.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/moist_routine_variance.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/write_out.py
	new file:   diagnostics/ENSO_MSE/MSE_VAR/write_out_general.py
	new file:   diagnostics/ENSO_MSE/README_general.pdf
	new file:   diagnostics/ENSO_MSE/SCATTER/NCL/scatter_01.ncl
	new file:   diagnostics/ENSO_MSE/SCATTER/NCL/scatter_02.ncl
	new file:   diagnostics/ENSO_MSE/SCATTER/NCL/scatter_03.ncl
	new file:   diagnostics/ENSO_MSE/SCATTER/NCL/scatter_04.ncl
	new file:   diagnostics/ENSO_MSE/SCATTER/README_LEVEL_04.pdf
	new file:   diagnostics/ENSO_MSE/SCATTER/SCATTER.html
	new file:   diagnostics/ENSO_MSE/SCATTER/SCATTER.py
	new file:   diagnostics/ENSO_MSE/SCATTER/check_input_files.py
	new file:   diagnostics/ENSO_MSE/SCATTER/get_data_in.py
	new file:   diagnostics/ENSO_MSE/SCATTER/get_scatter_data.py
	new file:   diagnostics/ENSO_MSE/SCATTER/list-models-historical-obs
	new file:   diagnostics/ENSO_MSE/doc/ENSO_MSE.pdf
	new file:   diagnostics/ENSO_MSE/doc/ENSO_MSE.rst
	new file:   diagnostics/ENSO_MSE/doc/ENSO_MSE_fig1.png
	new file:   diagnostics/ENSO_MSE/doc/ENSO_MSE_fig2.png
	new file:   diagnostics/ENSO_MSE/doc/ENSO_MSE_fig3.png
	new file:   diagnostics/ENSO_MSE/doc/ENSO_MSE_fig4.png
	new file:   diagnostics/ENSO_MSE/doc/README_LEVEL_01.pdf
	new file:   diagnostics/ENSO_MSE/doc/README_LEVEL_02.pdf
	new file:   diagnostics/ENSO_MSE/doc/README_LEVEL_03.pdf
	new file:   diagnostics/ENSO_MSE/doc/README_LEVEL_04.pdf
	new file:   diagnostics/ENSO_MSE/doc/README_general.pdf
	new file:   diagnostics/ENSO_MSE/html/index.html
	new file:   diagnostics/ENSO_MSE/html/index_mdtf_03.html
	new file:   diagnostics/ENSO_MSE/html/mdtf_composite.html
	new file:   diagnostics/ENSO_MSE/html/mdtf_diag_banner.png
	new file:   diagnostics/ENSO_MSE/input_data/obs_data/ENSO_MSE
	new file:   diagnostics/ENSO_MSE/mdtf_diag_banner.png
	new file:   diagnostics/ENSO_MSE/settings.jsonc
	new file:   diagnostics/ENSO_MSE/shared/generate_ncl_call.py
	new file:   diagnostics/ENSO_MSE/shared/get_dimensions.py
	new file:   diagnostics/ENSO_MSE/shared/get_lon_lat_plevels_in.py
	new file:   diagnostics/ENSO_MSE/shared/get_season.py
	new file:   diagnostics/ENSO_MSE/shared/gsnColorRange.ncl
	new file:   diagnostics/ENSO_MSE/shared/parameters.txt
	new file:   diagnostics/ENSO_MSE/shared/plevs.txt
	new file:   diagnostics/ENSO_MSE/shared/read_netcdf_2D.py
	new file:   diagnostics/ENSO_MSE/shared/read_netcdf_3D.py
	new file:   diagnostics/ENSO_MSE/shared/rgb/amwg.old.rgb
	new file:   diagnostics/ENSO_MSE/shared/rgb/amwg.png
	new file:   diagnostics/ENSO_MSE/shared/rgb/amwg.rgb
	new file:   diagnostics/ENSO_MSE/shared/rgb/amwg21.rgb
	new file:   diagnostics/ENSO_MSE/shared/rgb/amwg_reverse.rgb
	new file:   diagnostics/ENSO_MSE/shared/rgb/bluered.rgb
	new file:   diagnostics/ENSO_MSE/shared/rgb/blueyellowred.rgb
	new file:   diagnostics/ENSO_MSE/shared/rgb/cloudsim.rgb
	new file:   diagnostics/ENSO_MSE/shared/rgb/corr.rgb
	new file:   diagnostics/ENSO_MSE/shared/rgb/corr2.rgb
	new file:   diagnostics/ENSO_MSE/shared/rgb/diff.rgb
	new file:   diagnostics/ENSO_MSE/shared/rgb/rainbow21.rgb
	new file:   diagnostics/ENSO_MSE/shared/rgb/redyellowblue.rgb
	new file:   diagnostics/ENSO_MSE/shared/rgb/rgb.txt
	new file:   diagnostics/ENSO_MSE/shared/rgb/show_colors.ncl
	new file:   diagnostics/ENSO_MSE/shared/rgb/stress.rgb
	new file:   diagnostics/ENSO_MSE/shared/set_variables_AM4.py
	new file:   diagnostics/ENSO_MSE/shared/set_variables_CESM.py
	new file:   diagnostics/ENSO_MSE/shared/set_variables_CMIP.py
	new file:   diagnostics/ENSO_MSE/shared/util.py

* modified:   diagnostics/ENSO_MSE/settings.jsonc

* Delete diagnostics/ENSO_MSE directory

* Create ENSO_RWS.pdf

* Create ENSO_RWS.pdf

* Create util.py

* Create LEVEL_01.py

* Create LEVEL_02.py

* Create LEVEL_03.py

* Create LEVEL_04.py

* Create data_routine.ncl

* Create plot_betastar_clima.ncl

* Create plot_RWS_composite.ncl

* Create scatter_plot_01.ncl

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Create obs_data

* Delete obs_data

* Create xx

* Create xx

* Delete xx

* Delete xx

* Create xx

* Delete xx

* Create xx

* Delete xx

* Create enso_mse

* Create enso_rws

* Delete enso_mse

* Delete enso_rws

* Create enso_rws

* Delete enso_rws

* Delete util.py

* Create util.py

* Add files via upload

* Delete ENSO_RWS.pdf

* Delete README_LEVEL_01_ENSO_RWS.pdf

* Delete README_LEVEL_02_ENSO_RWS.pdf

* Delete README_LEVEL_03_ENSO_RWS.pdf

* Delete README_LEVEL_04_ENSO_RWS.pdf

* Delete README_general_ENSO_RWS.pdf

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Delete ENSO_RWS.pdf

* Add files via upload

* Create xx

* Delete xx

* Create xx

* Add files via upload

* Add files via upload

* Create x

* Delete x

* Create dummy

* Add files via upload

* Delete dummy

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Create dummy

* Create dummy

* Add files via upload

* Delete dummy

* Add files via upload

* Delete dummy

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Delete LEVEL_01.html

* Delete LEVEL_01.py

* Delete check_input_files.py

* Delete check_input_files_OBS.py

* Delete get_clima_in.py

* Delete get_data_in.py

* Delete get_dims.py

* Delete get_directories.py

* Delete get_directories_OBS.py

* Delete get_flux_clima.py

* Delete get_flux_in.py

* Delete get_nino_index.py

* Delete process_data.py

* Delete write_out.py

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Update mdtf_framework.py

* Update cli.py

* Update cli.py

* Update cmip6.py

* Update env_python3_base.yml

* Update core.py

* Update data_manager.py

* Update data_model.py

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Delete default_tests.jsonc.bak1

* Delete default_tests.jsonc

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Delete default_test.jsonc

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Delete README

* Delete xx

* commit

* Delete mdtf

* Delete env_ENSO_MSE.yml

* Revert changes to default_tests.jsonc

* Delete verify_links.py

* Delete read_netcdf_2D.py

* Delete read_netcdf_3D.py

* Add files via upload

* Add files via upload

* Delete env_ENSO_RWS.yml

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Delete inputdata/obs_data directory

* remove extra space from default_tests.jsonc

* Delete diagnostics/ENSO_MSE directory

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* change developer cheatsheet link

* Feature/ocean flux matrix (merged POD update on rst) (#300)

* add the necessary function and main script

not yet integrated into the framework

* POD name change and default test json

* add the necessary function and main script

not yet integrated into the framework

* POD name change and default test json

* add metpy package

* testing ocn_surf_flux_diag POD with framework

* surface variables add modifier

* test synthetic data

* update the output format in obs_data

* fix the numpy error due to framework inputs

* include function docstring

* correct the 0.98 factor for salinity effect

* add script description and produce netCDF output

* finish the rst description for the POD

* update the html for POD specific changes

* remove comments and correct the "more" part

* remove the unused function

* remove unused import

* remove unused import variable

* remove testing files

* remove unused print

* remove unused plotting command

* testing file

* update description

* remove obs plot

* change output plot name

* remove testing jsonc

* Update diagnostics/ocn_surf_flux_diag/ocn_surf_flux_diag.html

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* Update diagnostics/ocn_surf_flux_diag/ocn_surf_flux_diag.html

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* Update diagnostics/ocn_surf_flux_diag/ocn_surf_flux_diag.html

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* Update diagnostics/ocn_surf_flux_diag/ocn_surf_flux_diag.html

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* Update diagnostics/ocn_surf_flux_diag/ocn_surf_flux_diag.html

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* Update diagnostics/ocn_surf_flux_diag/ocn_surf_flux_diag.html

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* update outdated info in rst

Co-authored-by: Jess <20195932+wrongkindofdoctor@users.noreply.github.com>

* Add pods to ci (#301)

* added checks that plotted values fall within contour levels to prevent synthetic data nans from causing failure
and fixed the figure output path definition in ocn_surf_flux_diag.py
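
(One way to implement such a guard, as a sketch; function and variable names are illustrative.)

    import numpy as np

    def levels_covering(field, levels):
        """Extend the contour levels if finite plotted values fall outside them,
        so nan-heavy synthetic data cannot crash the plot call."""
        arr = np.asarray(field, dtype=float)
        finite = arr[np.isfinite(arr)]
        if finite.size and (finite.min() < min(levels) or finite.max() > max(levels)):
            return np.linspace(finite.min(), finite.max(), len(levels))
        return np.asarray(levels)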

* added mixed layer depth and ocn surf flux matrix pods to test set 3 yamls

* removed the mdtf_test_data version spec from mdtf_tests.yml

* added temporary test workflow file for new pods

* fix name of ocn_surf_flux_diag tarball in test yaml

* convert start and end dates to iso format in mixed_layer_depth.py so that pod works with full range of year integer values

* added test lines to dump detailed output for mld pod to screen for debugging

* turned on experimental setting

* replace cat with printf

* add updated test files

* add continue on error to step, only build required environments

* move log dump to separate step

* fix test workflow file

* add esmpy to python3_base.yml and update xesmf version

* specify pip to install xesmf

* change package order

* update netcdf4, esmpy versions in python3_env yaml
add esmpy 8.2 to python3_env yaml to resolve package dependency issues

* update env_synthetic_data yaml package versions to match python3_base
add ocn_surf_flux_diag test back to test_tpsl to double check env changes
comment out mld log dump step

* add ocn_surf_flux_diag and mld PODs to mdtf_tests.yml
remove test_tpsl from remote repo
uncomment other set3 PODs in test jsonc files

* Fix Explicit_file CLI definition

* Update the CM4 diag table

Add atmos variables to the diag table
Note that this currently generates the variables required for the MJO_suite POD using GFDL standard output;
more variables may be required for the other PODs. It is also preferable to output in CMIP6 format.

* remove --disable-epsv option from curl calls

* Feature/eulerian storm track (#312)

* added feature/eulerian-storm-track by Jeyavinoth Jeyaratnam

* ready to work on eulerian-storm-tracker

* changing settings.json for my eulerian-storm-track

* copying over new code

* working version of eulerian_storm_tracker; needs finer changes

* adding wheeler kiladis to my test run; index.html has a bug when only a single POD is run

* trying out eulerian_storm_track

* eulerian storm track works, but now have to filter out the topography and create zonal means

* create topo file for obs data

* added zonal means

* added labels

* cleaned the eulerian_storm_track.yml file

* updated the pdf

* removed old front_detection code

* removed old front_detection code

* manually edited the yml env file

* updated the documentation

* removed etc_composites code from this branch

* added MDTF_Doc in remove

* removed README.md from main folder

* removed eulerian_readme.md

* deleted remove/MDTF_.pdf

* removed unnecessary files; added rst doc file

* removed default_jj.jsonc

* removed the seam on the plots using add_cyclic_point
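
(For context, a sketch of the cartopy utility referred to: add_cyclic_point appends a duplicate first column at lon=360 so contours meet across the seam.)

    import numpy as np
    from cartopy.util import add_cyclic_point

    # field has shape (nlat, nlon); lons are the 1-D longitudes of its last axis
    lons = np.arange(0.0, 360.0, 2.5)
    lats = np.linspace(-90.0, 90.0, 73)
    field = np.random.rand(lats.size, lons.size)

    # returns the field with one extra column and the matching cyclic longitudes
    field_c, lons_c = add_cyclic_point(field, coord=lons)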

* Re-merged the v3 main branch from NOAA MDTF

* removed import os from plotter.py

* removed unused variables and imports suggested by LGTM

* removed commented out codes in all files

* removed commented code; removed .yml & default_jj.jsonc

* removed mdtf_settings.jsonc

* Feature/precip buoy diag (#115)

* some changes to framework code

* some changes to framework code

* Edits

* included preprocessing

* Early version of working POD

* resolved merge conflict

* included preprocessing

* Early version of working POD

* fixed merge issues

* removed ghost comment

* fixed conda_init.sh

* removed hard coded paths

* fixed issues from lgtm

* except block now does not catch BaseException

* fixed BaseException handling
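
For context, a generic sketch of the distinction being fixed (not the POD's actual code): catching `Exception` lets `KeyboardInterrupt` and `SystemExit` propagate, whereas `BaseException` would swallow them.

```python
def run_pod_step(step):
    try:
        step()
    # Catch Exception, deliberately not BaseException, so that KeyboardInterrupt
    # and SystemExit still propagate to the calling framework.
    except Exception as exc:
        print(f"step failed: {exc}")
        raise
```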

* removed intermediate driver script

* contents of /src now match develop

* removed documentation pdf

* removed unused dependencies: networkx, seaborn, netCDF4

* added package version numbers

* removed nco from dependencies

* modified settings file to include dimensions

* removed path variables from code; now assuming they are specified in settings file
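
A sketch of the convention referred to here; the environment-variable names below are the framework's usual ones but are treated as assumptions for this POD:

```python
import os

# Paths are injected by the framework at runtime rather than hard-coded.
obs_dir = os.environ["OBS_DATA"]    # supporting/observational data
work_dir = os.environ["WK_DIR"]     # POD working/output directory
print(obs_dir, work_dir)
```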

* converted single to double quotes in subprocess.run

* comment in settings file about sigma coords

* one

* updated figures and documentation

* Create index.md

* Delete docs directory

* removed pdf and MDTF-diagnostics directory

Co-authored-by: Fiaz Ahmed <fiaz@jupyter.atmos.ucla.edu>

* Fix Wheeler-Kiladis POD (#326)

* Revert "Update with current development efforts"

* Fix FTP links in README.md

Github.com does not show hyperlinks with the ftp:// protocol, even if
they're explicitly specified. Work around this by writing out the full
URL in the document.

* Fix FTP links in README.md

Fixes NOAA-GFDL/MDTF-diagnostics#7
Github.com does not show hyperlinks with the ftp:// protocol, even if
they're explicitly specified. Work around this by writing out the full
URL in the document so users can find data.

* Merge updates into main branch (#320)

* add CM4 diag table to repo (#269)

* add CM4 diag table to repo

* swap diag table docx with trimmed-down text file

* Improve mamba usage in CI

- install mamba (https://github.com/mamba-org/mamba) first, and use it
to speed up installation of all other conda envs, including the one for
synthetic data. Note that currently only the dependency solve is
accelerated (https://github.com/mamba-org/mamba/issues/633), but this is
sufficient for a large improvement over conda.

- Remove the --mamba flag from the conda_env_setup.sh script's
arguments, as this was taken out in NOAA-GFDL/MDTF-diagnostics#166.

* Add temp extremes distshape to CI (#273)

* Initial commit of code from Arielle Catalano

* Correct extension for settings.jsonc

* Add freq=day to varlist for backwards compatibility

* Commit conda env for temp_extremes_distshape

* Commit conda env for temp_extremes_distshape

* Run temp_extremes_distshape in python_base instead of custom env

* adding new POD, documentation, and obs subset code

* Fix case of CF variable names

* Set Matplotlib backend in settings.jsonc

* Propagate gs command-line flags to GFDL branch

* Temporary debugging statements

* Update temp_extremes_distshape settings.jsonc

* Update format of settings.jsonc

* Add missing 'log' argument to GFDL data source attribute classes

* log InitVar must be 1st arg to __post_init__ due to inheritance

* Restore missing class attributes to GFDL CMIP6 DataSources

* Convert .pdf documentation to .rst

* Fix obs filenames and templating in html

* Write temp files to WK_DIR, not POD_HOME

temp_extremes_distshapes code was writing temporary parameter .json
files to $POD_HOME, which should be read-only. Change root directory for
writing all these files from $POD_HOME to $WK_DIR.

* Correct variable names

* Correct method for colorbar tick labels

* Handle case where no contours are labeled

* Correct cartopy longitude set_extents

* Correct Moments_plot colorbar positioning

* Add documentation for set_extent() fix

* add test jsonc files for temp_extremes_distshape CMIP

* added test yaml file just for temp_extremes_distshape

* add call to create the inputdata directory before untarring the obs_data file to the test yaml

* add checks to test yaml

* Specify full paths in ubuntu set3 jsonc

* remove comments from test jsonc files
change CASENAME to match the format output by updated synthetic data generator in the set3 test jsonc files

* update mdtf_tests.yml to include this branch for testing before PR is submitted

* remove this branch from mdtf_tests.yml

* Switch paths back to relative locations in ubuntu set 3 jsonc

* change model name to synthetic in github actions set 3 jsonc files

* changed ls to just show temp_extremes_distshape obs_data directory

* fixed path in ls

* added print statements for path checks

* add print statement to group_relative_links

* removed print statements from output_manager and verify_links

* changed test yml to experimental status and added ls for wkdir after POD runs

* fix typo

* remove some test lines in test_cmip.yml

* change exit status for missing files to 0 for debugging in verify_links.py

* remove debugging in verify_links
comment out pod.deactivate call in verify_links  for debugging

* removed PS from the output figure path in TempExtDistShape_CircComps_usp.py because the file is PNG, not PostScript

* changed ls to wildcard in test_cmip.yml

* dump output log to terminal

* define the output figure path as a separate variable with os.path.join in TempExtDistShape_ShiftRatio_util.py
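
A sketch of the pattern described (directory layout and file name are hypothetical):

```python
import os

work_dir = os.environ.get("WK_DIR", ".")
figure_path = os.path.join(work_dir, "model", "ShiftRatio_map.png")
# fig.savefig(figure_path)   # pass a single joined path instead of concatenating strings
```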

* revert changes to path name

* change set3 env to python3

* change cartopy to version 0.19 in env_python3_base.yml

* revert debugging mods to output_manager.py

* remove extraneous checks from mdtf_tests.yml
test yaml with MDTF_base environment for set 3 tests

* replace _MDTF_python3_base with _MDTF_base in mdtf_tests.yml set 3 tests

Co-authored-by: Thomas Jackson <tom.jackson314@gmail.com>
Co-authored-by: tsjackson-noaa <thomas.jackson@noaa.gov>
Co-authored-by: Arielle Catalano <acat2@ggr-ch-412-ac.psu.ds.pdx.edu>

* Indicate which settings are required, in website and CLI help

* Move --large-file flag higher in list of options, so that data_manager options can come last

* Clarify behavior of --site and plug-in settings

* Don't do full startup when only printing CLI help message

* Apply changes to CLI to NOAA_GFDL site

* Correct GFDL default paths to values in default_gfdl.jsonc

* Do more validation on input paths

* Do more input path validation in GFDL-specific code

* Validate CASE_ROOT_DIR before other checks

* Only valida…
15 people authored Apr 29, 2022
1 parent 4ebed7b commit f8a6649
Showing 6 changed files with 64 additions and 38 deletions.
14 changes: 7 additions & 7 deletions sites/NOAA_GFDL/default_gfdl.jsonc
@@ -17,19 +17,19 @@
"CASENAME" : "ESM4_historical_D1",
"model" : "GFDL-ESM4",
"convention" : "CMIP",
"FIRSTYR" : 1996,
"LASTYR" : 1999,
"FIRSTYR" : 2000,
"LASTYR" : 2004,
"CASE_ROOT_DIR" : "/archive/oar.gfdl.cmip6/ESM4/DECK/ESM4_historical_D1/gfdl.ncrc4-intel16-prod-openmp/pp",
"pod_list": [
// Optional: PODs to run for this model only (defaults to all)
"Wheeler_Kiladis",
"EOF_500hPa",
"convective_transition_diag",
// "convective_transition_diag",
"MJO_suite",
"MJO_teleconnection",
"MJO_prop_amp",
"precip_diurnal_cycle",
"SM_ET_coupling"
//"MJO_prop_amp",
"precip_diurnal_cycle"
// "SM_ET_coupling"
]
}
],
@@ -77,7 +77,7 @@
// Settings affecting the framework's retrieval of model data.

// Method used to fetch model data.
"data_manager": "GFDL_auto",
"data_manager": "GFDL_PP",

// Time (in seconds) to wait before giving up on transferring a data file to
// the local filesystem. Set to zero to disable.
48 changes: 25 additions & 23 deletions sites/NOAA_GFDL/gfdl.py
@@ -44,7 +44,7 @@ def parse_env_vars(self, cli_obj):
self.global_env_vars['MDTF_TMPDIR'] = gfdl_tmp_dir

def _post_parse_hook(self, cli_obj, config, paths):
### call parent class method
# call parent class method
super(GFDLMDTFFramework, self)._post_parse_hook(cli_obj, config, paths)

self.reset_case_pod_list(cli_obj, config, paths)
@@ -400,18 +400,26 @@ class PPDataSourceAttributes(data_manager.DataSourceAttributesBase):
# date_range: util.DateRange
# CASE_ROOT_DIR: str
# convention: str

convention: str = "GFDL"
CASE_ROOT_DIR: str = ""
component: str = ""
chunk_freq: util.DateFrequency = None

def __post_init__(self):
# chunk_freq: util.DateFrequency = None # THIS IS THE PROBLEM LINE FOPR THE GFDL SITE BUILD!!!

# This method overrides dataclass.mdtf_dataclass._old_post_init.
# _old_post_init has the parms *args, and **kwargs. Excluding these parms
# from the super().__post_init__() call, therefore, caused an error that 1
# positional argument (self) was specified, but 2 were given during the self.atts definition
# in data_manager.DataSourceBase.__init__()
# I resolved the problem (I think) using the example here:
# https://stackoverflow.com/questions/66995998/how-can-i-take-the-variable-from-the-parent-class-constructor-and-use-it-in-the
# after another post stated that an error like this could be caused by class override issues.
def __post_init__(self, *args, **kwargs):
"""Validate user input.
"""
super(PPDataSourceAttributes, self).__post_init__()
super(PPDataSourceAttributes, self).__post_init__(*args, **kwargs)
config = core.ConfigManager()

pass

gfdlppDataManager_any_components_col_spec = data_manager.DataframeQueryColumnSpec(
# Catalog columns whose values must be the same for all variables.
@@ -430,6 +438,7 @@ def __post_init__(self):
)

class GfdlppDataManager(GFDL_GCP_FileDataSourceBase):
# extends GFDL_GCP_FileDataSourceBase
_FileRegexClass = PPTimeseriesDataFile
_DirectoryRegex = pp_dir_regex
_AttributesClass = PPDataSourceAttributes
@@ -454,26 +463,15 @@ def __init__(self, case_dict, parent):
# any_components = True (set to False with --component_only)
config = core.ConfigManager()
self.frepp_mode = config.get('frepp', False)
# if no model component set, consider data from any components
self.any_components = not(self.attrs.component)

@property
def expt_key_cols(self):
"""Catalog columns whose values must be the same for all data used in
this run of the package.
"""
if not self.frepp_mode and not self.any_components:
return ('component', )
else:
return tuple()
self.any_components = config.get('any_components', False)

@property
def pod_expt_key_cols(self):
"""Catalog columns whose values must be the same for each POD, but can
differ for different PODs.
"""
if self.frepp_mode and not self.any_components:
return ('component', )
return 'component'
else:
return tuple()

@@ -486,9 +484,9 @@ def var_expt_key_cols(self):
# of frepp_mode. This is the default behavior when called from the FRE
# wrapper.
if self.any_components:
return ('chunk_freq', 'component')
return 'chunk_freq', 'component'
else:
return ('chunk_freq', )
return 'chunk_freq'

# these have to be supersets of their *_key_cols counterparts; for this use
# case they're all just the same set of attributes.
@@ -582,7 +580,11 @@ class GfdlautoDataManager(object):
ends in "pp", use :class:`GfdlppDataManager`, otherwise use CMIP6 data on
/uda via :class:`Gfdludacmip6DataManager`.
"""
def __new__(cls, case_dict, *args, **kwargs):
# Note, object is explicitly defined as a parameter for Python 2/3
# compatibility reasons; omitting object in Python2 yields "old-style" classes
# All classes are "new-style" in Python3 by default.
# TODO: Since WE DO NOT SUPPORT PYTHON2, remove object parm and verify that it doesn't destroy everything
def __new__(cls, case_dict, parent, *args, **kwargs):
"""Dispatch DataManager instance creation based on the contents of
case_dict."""
config = core.ConfigManager()
@@ -597,7 +599,7 @@ def __new__(cls, case_dict, *args, **kwargs):
_log.debug("%s: Dispatched DataManager to %s.",
cls.__name__, dispatched_cls.__name__)
obj = dispatched_cls.__new__(dispatched_cls)
obj.__init__(case_dict)
obj.__init__(case_dict, parent)
return obj

def __init__(self, *args, **kwargs):
5 changes: 3 additions & 2 deletions sites/NOAA_GFDL/mdtf_gfdl.csh
@@ -28,7 +28,8 @@ set fremodule
set script_path

## set paths to site installation
set REPO_DIR="/home/oar.gfdl.mdtf/mdtfmdtf/MDTF-diagnostics"
set CONDA_ROOT="/home/oar.gfdl.mdtf/miniconda3"
set REPO_DIR="/home/oar.gfdl.mdtf/mdtf/MDTF-diagnostics"
set OBS_DATA_DIR="/home/oar.gfdl.mdtf/mdtf/inputdata/obs_data"
# output is always written to $out_dir; set a path below to GCP a copy of output
# for purposes of serving from a website
@@ -144,7 +145,7 @@ wipetmp
## run the command
echo "mdtf_gfdl.csh: conda activate"
source "${REPO_DIR}/src/conda/conda_init.sh" -q "/home/oar.gfdl.mdtf/miniconda3"
conda activate "${REPO_DIR}/envs/_MDTF_base"
conda activate "${CONDA_ROOT}/envs/_MDTF_base"

echo "mdtf_gfdl.csh: MDTF start"

2 changes: 1 addition & 1 deletion src/data_manager.py
@@ -806,7 +806,7 @@ class DataframeQueryDataSourceBase(DataSourceBase, metaclass=util.MDTFABCMeta):

def __init__(self, case_dict, parent):
super(DataframeQueryDataSourceBase, self).__init__(case_dict, parent)
self.expt_keys = dict() # Object _id -> expt_key tuple
self.expt_keys = dict() # Object _id -> expt_key tuple

@property
@abc.abstractmethod
19 changes: 18 additions & 1 deletion src/data_sources.py
@@ -25,6 +25,8 @@
input_field="remote_path",
match_error_filter=ignore_non_nc_regex
)


@util.regex_dataclass(sample_data_regex)
class SampleDataFile():
"""Dataclass describing catalog entries for sample model data files.
@@ -34,6 +36,7 @@ class SampleDataFile():
variable: str = util.MANDATORY
remote_path: str = util.MANDATORY


@util.mdtf_dataclass
class SampleDataAttributes(dm.DataSourceAttributesBase):
"""Data-source-specific attributes for the DataSource providing sample model
@@ -45,7 +48,7 @@ class SampleDataAttributes(dm.DataSourceAttributesBase):
# date_range: util.DateRange
# CASE_ROOT_DIR: str
# log: dataclasses.InitVar = _log
convention: str = "CMIP" # default value, can be overridden
convention: str = "CMIP" # default value, can be overridden
sample_dataset: str = ""

def _set_case_root_dir(self, log=_log):
@@ -86,11 +89,13 @@ def __post_init__(self, log=_log):
self.sample_dataset, self.CASE_ROOT_DIR)
util.exit_handler(code=1)


sampleLocalFileDataSource_col_spec = dm.DataframeQueryColumnSpec(
# Catalog columns whose values must be the same for all variables.
expt_cols = dm.DataFrameQueryColumnGroup(["sample_dataset"])
)


class SampleLocalFileDataSource(dm.SingleLocalFileDataSource):
"""DataSource for handling POD sample model data stored on a local filesystem.
"""
Expand Down Expand Up @@ -202,6 +207,7 @@ def _post_normalize_hook(self, var, ds):
# translated var itself in addition to setting directly on ds
setattr(var.translation, k, v)


class MetadataRewritePreprocessor(preprocessor.DaskMultiFilePreprocessor):
"""Subclass :class:`~preprocessor.DaskMultiFilePreprocessor` in order to
look up and apply edits to metadata that are stored in
@@ -230,18 +236,22 @@ def _functions(self):
preprocessor.RenameVariablesFunction
)


dummy_regex = util.RegexPattern(
r"""(?P<dummy_group>.*) # match everything; RegexPattern needs >= 1 named groups
""",
input_field="remote_path",
match_error_filter=ignore_non_nc_regex
)


@util.regex_dataclass(dummy_regex)
class GlobbedDataFile():
"""Applies a trivial regex to the paths returned by the glob."""
dummy_group: str = util.MANDATORY
remote_path: str = util.MANDATORY


@util.mdtf_dataclass
class ExplicitFileDataSourceConfigEntry():
glob_id: util.MDTF_ID = None
@@ -291,6 +301,7 @@ def to_file_glob_tuple(self):
}
)


@util.mdtf_dataclass
class ExplicitFileDataAttributes(dm.DataSourceAttributesBase):
# CASENAME: str # fields inherited from dm.DataSourceAttributesBase
@@ -320,11 +331,13 @@ def __post_init__(self, log=_log):
self.convention, core._NO_TRANSLATION_CONVENTION)
self.convention = core._NO_TRANSLATION_CONVENTION


explicitFileDataSource_col_spec = dm.DataframeQueryColumnSpec(
# Catalog columns whose values must be the same for all variables.
expt_cols = dm.DataFrameQueryColumnGroup([])
)


class ExplicitFileDataSource(
dm.OnTheFlyGlobQueryMixin, dm.LocalFetchMixin, dm.DataframeQueryDataSourceBase
):
@@ -405,6 +418,7 @@ def iter_globs(self):

# ----------------------------------------------------------------------------


@util.mdtf_dataclass
class CMIP6DataSourceAttributes(dm.DataSourceAttributesBase):
# CASENAME: str # fields inherited from dm.DataSourceAttributesBase
@@ -498,6 +512,7 @@ def _init_x_from_y(source, dest):
else:
self.CATALOG_DIR = new_root


cmip6LocalFileDataSource_col_spec = dm.DataframeQueryColumnSpec(
# Catalog columns whose values must be the same for all variables.
expt_cols = dm.DataFrameQueryColumnGroup(
@@ -519,6 +534,7 @@
daterange_col = "date_range"
)


class CMIP6ExperimentSelectionMixin():
"""Encapsulate attributes and logic used for CMIP6 experiment disambiguation
so that it can be reused in DataSources with different parents (eg. different
@@ -623,6 +639,7 @@ def resolve_var_expt(self, df, obj):
col_name, df[col_name].iloc[0], obj.name)
return df


class CMIP6LocalFileDataSource(CMIP6ExperimentSelectionMixin, dm.LocalFileDataSource):
"""DataSource for handling model data named following the CMIP6 DRS and
stored on a local filesystem.
14 changes: 10 additions & 4 deletions src/util/dataclass.py
@@ -511,6 +511,8 @@ def _mdtf_dataclass_type_check(self, log):

# declaration to allow calling with and without args: python cookbook 9.6
# https://github.com/dabeaz/python-cookbook/blob/master/src/9/defining_a_decorator_that_takes_an_optional_argument/example.py


def mdtf_dataclass(cls=None, **deco_kwargs):
"""Wrap the Python :py:func:`~dataclasses.dataclass` class decorator to customize
dataclasses to provide rudimentary type checking and conversion. This
@@ -573,24 +575,27 @@ def _dummy_post_init(self, *args, **kwargs): pass
# Do type coercion after dataclass' __init__, but before user __post_init__
# Do type check after __init__ and __post_init__
_old_post_init = cls.__post_init__

@functools.wraps(_old_post_init)
def _new_post_init(self, *args, **kwargs):
if hasattr(self, 'log'):
_post_init_log = self.log # for object hierarchy
_post_init_log = self.log # for object hierarchy
else:
_post_init_log = _log # fallback: use module-level logger
_post_init_log = _log # fallback: use module-level logger
_mdtf_dataclass_type_coercion(self, _post_init_log)
_old_post_init(self, *args, **kwargs)
_mdtf_dataclass_type_check(self, _post_init_log)
type.__setattr__(cls, '__post_init__', _new_post_init)

return cls


def is_regex_dataclass(obj):
"""Returns True if *obj* is a :func:`regex_dataclass`.
"""
return hasattr(obj, '_is_regex_dataclass') and obj._is_regex_dataclass == True


def _regex_dataclass_preprocess_kwargs(self, kwargs):
"""Edit kwargs going to the auto-generated __init__ method of this dataclass.
If any fields are regex_dataclasses, construct and parse their values first.
@@ -711,6 +716,7 @@ def _from_string(cls_, str_, *args):
return cls
return _dataclass_decorator


def dataclass_factory(dataclass_decorator, class_name, *parents, **kwargs):
"""Function that returns a dataclass (ie, a decorated class) whose fields
are the union of the fields in *parents*, which the new dataclass inherits
Expand All @@ -734,7 +740,7 @@ def _to_dataclass(self, cls_, **kwargs_):
return cls_(**new_kwargs)

def _from_dataclasses(cls_, *other_dcs, **kwargs_):
f"""Classmethod to create a new instance of {class_name} from instances
f"""Class method to create a new instance of {class_name} from instances
of its parents, along with any other field values passed in kwargs.
"""
# above docstring gets templated
@@ -754,7 +760,6 @@ def _from_dataclasses(cls_, *other_dcs, **kwargs_):
new_cls = type(class_name, tuple(parents), methods)
return dataclass_decorator(new_cls, **kwargs)

# ----------------------------------------------------

def filter_dataclass(d, dc, init=False):
"""Return a dict of the subset of fields or entries in *d* that correspond to
@@ -798,6 +803,7 @@ def filter_dataclass(d, dc, init=False):
ans.update({f.name: d[f.name] for f in init_fields if f.name in d})
return ans


def coerce_to_dataclass(d, dc, **kwargs):
"""Given a dataclass *dc* (may be the class or an instance of it), and a dict,
dataclass or dataclass instance *d*, return an instance of *dc*\'s class with