BiasAdjustCXX command-line tool for the application of fast and efficient bias corrections in climatic research


The documentation can be found here: https://biasadjustcxx.readthedocs.io/en/latest.

This software is tested using Google's testing framework googletest (https://github.com/google/googletest).

The related project python-cmethods is a Python package that offers far more flexibility in terms of parameters, input data, shapes, and customization. For users with at least some Python experience, it may be the better option: besides a command-line interface, it also exposes an API for applying custom bias corrections.

1. About

The command-line tool BiasAdjustCXX is the subject of a publication that provides insight into its architecture, possible applications, and new scientific questions. This publication, referencing BiasAdjustCXX v1.8.1, was published in the journal SoftwareX in March 2023 and is available at https://doi.org/10.1016/j.softx.2023.101379.

This tool and the provided data structures are designed to help minimize discrepancies between modeled and observed climate data of different time periods. Data from past periods are used to adjust variables from current and future time series so that their distributional properties approximate possible actual values.

Fig 1: Schematic representation of a bias adjustment procedure

In this way, modeled data that, for example, represent values that are on average too cold can be bias-corrected by applying an adjustment procedure. The following figure shows the observed, the modeled, and the bias-corrected values. It is directly visible that the delta-adjusted time series (T^{*DM}_{sim,p}) is much more similar to the observed data (T_{obs,p}) than the raw modeled data (T_{sim,p}).

Fig 2: Temperature per day of year in modeled, observed and bias-adjusted climate data
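
For context, a rough sketch of the additive delta method shown in this example, following Beyer et al. (2020) (the exact formulation used by BiasAdjustCXX is described in the documentation): the observed control-period series is shifted by the simulated long-term mean change between the scenario and the control period,

T^{*DM}_{sim,p}(i) = T_{obs,h}(i) + \mu_m(T_{sim,p}(i)) - \mu_m(T_{sim,h}(i))

where h denotes the control period, p the scenario period, and \mu_m the long-term mean of the respective interval. This is also why the control-period series must have the same length as the series to be adjusted (see the notes below).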

In addition, all of the methods available here have also been implemented in Python as part of the python-cmethods package.

If you have any inquiries, remarks, requests for assistance, ideas, or potential collaborations, you can create an issue at BiasAdjustCXX/issues, use the discussion area at BiasAdjustCXX/discussions, or contact me directly at contact@b-schwertfeger.de.

2. Available bias correction methods

The following bias correction techniques are available:
Scaling-based techniques:
  • Linear Scaling
  • Variance Scaling
  • Delta Method
Distribution-based techniques:
  • Quantile Mapping
  • Quantile Delta Mapping
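
To give an impression of the simplest technique, here is a rough sketch of linear scaling following Teutschbein and Seibert (2012): the scenario values are shifted (additive kind, e.g. for temperature) or scaled (multiplicative kind, e.g. for precipitation) by the long-term (monthly or 31-day interval) mean bias between the observed and the modeled control period,

T^{*LS}_{sim,p}(i) = T_{sim,p}(i) + \mu_m(T_{obs,h}) - \mu_m(T_{sim,h})          (additive)
P^{*LS}_{sim,p}(i) = P_{sim,p}(i) * \mu_m(P_{obs,h}) / \mu_m(P_{sim,h})          (multiplicative)

The other methods refine this idea, for example by additionally adjusting the variance or by mapping the quantiles of the distributions; the exact formulas are given in the documentation.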

All of these methods are intended to be applied to 1-dimensional time-series climate data. The tool also provides the possibility to apply the selected bias correction method to 3-dimensional data sets.
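
For example, a purely 1-dimensional data set (only a time dimension) can be adjusted with a distribution-based method using the --1dim flag described below; the file names, the variable name tas, and the number of quantiles in this sketch are placeholders:

BiasAdjustCXX \
      --ref observations_1d.nc \
      --contr control_1d.nc \
      --scen scenario_1d.nc \
      --output quantile_mapping_1d.nc \
      --method quantile_mapping \
      --kind "+" \
      --quantiles 100 \
      --variable tas \
      --1dim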

General Notes

  • Except for variance scaling, all methods can be applied to stochastic and non-stochastic climate variables. Variance scaling can only be applied to non-stochastic climate variables.
    • Non-stochastic climate variables are those that can be predicted with relative certainty based on factors such as location, elevation, and season. Examples of non-stochastic climate variables include air temperature, air pressure, and solar radiation.
    • Stochastic climate variables, on the other hand, are those that exhibit a high degree of variability and unpredictability, making them difficult to forecast accurately. Precipitation is an example of a stochastic climate variable because it can vary greatly in timing, intensity, and location due to complex atmospheric and meteorological processes.
  • The Delta Method requires that the time series of the control period have the same length as the time series to be adjusted.
  • Examples can be found in the BiasAdjustCXX repository and of course within the Documentation.
  • Speed/Performance tests and comparison to other tools can be found here: tool comparison

Notes regarding the scaling-based techniques

  • All data sets must exclude the 29th of February, and every year must have exactly 365 entries (see the sketch below for one way to remove leap days beforehand). This is not required when using the --no-group flag, which applies the scaling techniques in such a way that the scaling factors are based on the whole time series at once.
  • The --no-group flag makes it possible to apply the BiasAdjustCXX tool to data sets with custom time scales, for example to adjust monthly separated time series individually, matching the techniques described by Teutschbein et al. (2012) and Beyer et al. (2020).
  • The long-term 31-day interval procedures, on the other hand, are customized variations that prevent disproportionately large differences in the long-term mean values at the monthly transitions. This is why the long-term 31-day interval variant is the preferred method and is enabled by default for all scaling-based techniques.
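
One way to remove leap days beforehand is the CDO toolset (Schulzweida, 2022; see the references). This is only a sketch and assumes that CDO is installed and that the example data of this repository is used:

# drop the 29th of February so that every year has 365 entries
cdo delete,month=2,day=29 input_data/scenario.nc input_data/scenario_noleap.nc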

3. Compilation and Installation

Build from source

Since this tool is written in C++, it must be compiled and installed before it can be used. The NetCDF-4 C++ library (netcdf-cxx4), CMake, and a C++ compiler toolchain must be installed to successfully compile and install the BiasAdjustCXX command-line tool (see the documentation for details).

The following code blocks demonstrate how to download, build, and install the BiasAdjustCXX tool from source:

git clone https://github.com/btschwertfeger/BiasAdjustCXX.git
cd BiasAdjustCXX

make build
make install

The tool can be uninstalled using the following command within the project directory:

make uninstall

After the installation, the tool can be executed using the command listed below. This repository also provides example data for testing. See the documentation for more information (https://biasadjustcxx.readthedocs.io/en/latest).

If the netcdf-cxx4 libraries cannot be found, make sure that ncxx4-config is globally executable, since this tool is used to determine the location of these libraries.
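
A quick way to verify this, assuming netcdf-cxx4 is installed:

# ncxx4-config must be on the PATH and able to report the linker flags
which ncxx4-config
ncxx4-config --libs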

# --ref        observations/reference time series of the control period
# --contr      simulated time series of the control period
# --scen       time series to adjust
# --output     output file
# --method     adjustment method
# --kind       kind of adjustment ('+' == 'add' and '*' == 'mult')
# --variable   variable to adjust
# --processes  number of threads
BiasAdjustCXX \
      --ref input_data/observations.nc \
      --contr input_data/control.nc \
      --scen input_data/scenario.nc \
      --output linear_scaling.nc \
      --method linear_scaling \
      --kind "+" \
      --variable tas \
      --processes 4

Docker 🐳

The execution of BiasAdjustCXX is also possible within a Docker container. This is the preferred option when installing NetCDF-4 C++, CMake, or BiasAdjustCXX on the local system is not desired. It also makes the tool easier to access, since Docker containers can run on nearly every operating system.

# --ref        observations/reference time series of the control period
# --contr      simulated time series of the control period
# --scen       time series to adjust
# --output     output file
# --method     adjustment method
# --kind       kind of adjustment ('+' == 'add' and '*' == 'mult')
# --variable   variable to adjust
# --processes  number of threads
docker run -it -v "$(pwd)":/work btschwertfeger/biasadjustcxx:latest BiasAdjustCXX \
    --ref input_data/observations.nc \
    --contr input_data/control.nc \
    --scen input_data/scenario.nc \
    --output linear_scaling.nc \
    --method linear_scaling \
    --kind "+" \
    --variable tas \
    --processes 4

See the Docker Hub registry to access the dev, pinned, and older versions: Dockerhub
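
A specific image can also be pulled explicitly before running the container, for example:

docker pull btschwertfeger/biasadjustcxx:latest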

4. Arguments and Parameters

The following overview lists the available command-line arguments that can be passed to the BiasAdjustCXX tool. Please also have a look at the requirements section below.

--ref, --reference
    path to the observational/reference data set (control period)
--contr, --control
    path to the modeled data set (control period)
--scen, --scenario
    path to the data set that is to be adjusted (scenario period)
-v, --variable
    variable to adjust
-k, --kind
    kind of adjustment - one of: + or add and * or mult
-m, --method
    adjustment method name - one of: linear_scaling, variance_scaling, delta_method, quantile_mapping and quantile_delta_mapping
-q, --quantiles
    [optional] number of quantiles to respect (only required for the distribution-based methods)
--1dim
    [optional] required if the data sets have no spatial dimensions (i.e., only a time dimension)
--no-group
    [optional] disables the adjustment based on 31-day long-term moving windows for the scaling-based methods. Scaling is then performed on the whole data set at once, so it is recommended to separate the input files, for example by month, and to apply this program to every long-term month individually. (only for scaling-based methods)
--max-scaling-factor
    [optional] defines the maximum scaling factor to avoid unrealistic results when adjusting ratio-based variables, for example in regions where heavy rainfall is not present in the modeled data, which would otherwise create disproportionately high scaling factors. (only for multiplicative methods except quantile mapping; default: 10)
-p, --processes
    [optional] number of threads to use (default: 1)
-h, --help
    [optional] displays the usage example, arguments, and hints, then exits the program
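
For example, a multiplicative, distribution-based adjustment of precipitation could combine several of these arguments; the variable name pr, the number of quantiles, and the file names in this sketch are placeholders:

BiasAdjustCXX \
      --ref input_data/observations.nc \
      --contr input_data/control.nc \
      --scen input_data/scenario.nc \
      --output quantile_delta_mapping_pr.nc \
      --method quantile_delta_mapping \
      --kind "*" \
      --quantiles 100 \
      --max-scaling-factor 10 \
      --variable pr \
      --processes 4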

Requirements

See the documentation for more information (https://biasadjustcxx.readthedocs.io/en/latest).

  • The variable of interest must have the same name in all data sets.
  • The dimensions must be named "time", "lat" and "lon" (i.e., time, latitudes and longitudes), in exactly this order, in case the data sets have more than one dimension.
  • Scaling-based techniques executed without the --no-group flag require that the data sets exclude the 29th of February and that every year has exactly 365 entries.
  • For adjusting data using the linear scaling, variance scaling, or delta method together with the --no-group flag, the input files have to be separated by month and the correction has to be applied to each month individually. For example, for 30 years of data to correct, the three input data sets first have to be prepared so that they contain all time steps of all Januaries; the adjustment is then applied to this data set, and the same is repeated for the remaining months (see /examples/example_all_methods.run.sh in the repository and the sketch below).
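
A minimal sketch of that monthly workflow, assuming the CDO toolset (see the references) is installed, the example data of this repository is used, and the variable is named tas; /examples/example_all_methods.run.sh remains the authoritative example:

# split every input data set into long-term months, adjust each month
# individually without grouping, and merge the monthly results again
for month in $(seq 1 12); do
    cdo selmon,${month} input_data/observations.nc obs_${month}.nc
    cdo selmon,${month} input_data/control.nc contr_${month}.nc
    cdo selmon,${month} input_data/scenario.nc scen_${month}.nc

    BiasAdjustCXX \
        --ref obs_${month}.nc \
        --contr contr_${month}.nc \
        --scen scen_${month}.nc \
        --output linear_scaling_${month}.nc \
        --method linear_scaling \
        --kind "+" \
        --variable tas \
        --no-group
done
cdo mergetime linear_scaling_*.nc linear_scaling_full.nc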

5. References

  • Schwertfeger, Benjamin Thomas and Lohmann, Gerrit and Lipskoch, Henrik (2023) "Introduction of the BiasAdjustCXX command-line tool for the application of fast and efficient bias corrections in climatic research", SoftwareX, Volume 22, 101379, ISSN 2352-7110, (https://doi.org/10.1016/j.softx.2023.101379)
  • Schwertfeger, Benjamin Thomas (2022) "The influence of bias corrections on variability, distribution, and correlation of temperatures in comparison to observed and modeled climate data in Europe" (https://epic.awi.de/id/eprint/56689/)
  • Delta Method based on: Beyer, R. and Krapp, M. and Manica, A. (2020) "An empirical evaluation of bias correction methods for palaeoclimate simulations" (https://doi.org/10.5194/cp-16-1493-2020)
  • Linear Scaling and Variance Scaling based on: Teutschbein, Claudia and Seibert, Jan (2012) "Bias correction of regional climate model simulations for hydrological climate-change impact studies: Review and evaluation of different methods" (https://doi.org/10.1016/j.jhydrol.2012.05.052)
  • Quantile Mapping based on: Alex J. Cannon and Stephen R. Sobie and Trevor Q. Murdock (2015) "Bias Correction of GCM Precipitation by Quantile Mapping: How Well Do Methods Preserve Changes in Quantiles and Extremes?" (https://doi.org/10.1175/JCLI-D-14-00754.1)
  • Quantile Delta Mapping based on: Tong, Y., Gao, X., Han, Z. et al. "Bias correction of temperature and precipitation over China for RCM simulations using the QM and QDM methods". Clim Dyn 57, 1425–1443 (2021). (https://doi.org/10.1007/s00382-020-05447-4)
  • Schulzweida, U.: "CDO User Guide", (https://doi.org/10.5281/zenodo.7112925), 2022.
  • This project took advantage of netCDF software developed by UCAR/Unidata (http://doi.org/10.5065/D6H70CW6).