Squashed commit of the following:
commit cbaae05
Author: jakob-fritz <37077134+jakob-fritz@users.noreply.github.com>
Date:   Wed Apr 24 10:34:09 2024 +0200

    Make create_gitlab_ci branch up-to-date before merging into master (Parallel-in-Time#418)

    * first working SDC version (M and Minv)

    * Update playground.py

    * cleaning up

    * Added some hyphens in plots (Parallel-in-Time#389)

    * Removed separate file for GPU Dahlquist implementation (Parallel-in-Time#391)

    Co-authored-by: Thomas <t.baumann@fz-juelich.de>

    * Review (Parallel-in-Time#388)

    * Bug is fixed and added new code

    * new code for the table

    * Edits in markdown file

    * some edits in test

    * Bugs fix

    * Codecov

    * I cleaned up my code and separated classes to make it easier to work with. It is not ready yet; if Codecov fails, I will include more tests.

    * forgot black

    * flake8

    * bug fix

    * Edits codes according to the comments

    * Edited codes according to the comments in the GitHub

    * Defined new function in stability_simulation.py to check stability for
    given points and excluded codecov function that generates a table.

    * small edits for codecov

    * removed no cover

    * NCCL communicators (Parallel-in-Time#392)

    * Added wrapper for MPI communicator to use NCCL under the hood

    * Small fix

    * Moved NCCL communicator wrapper to helpers

    ---------

    Co-authored-by: Thomas <t.baumann@fz-juelich.de>

    * Version bump for new release

    * proper readme and link

    * Started playground for machine learning generated initial guesses for (Parallel-in-Time#394)

    SDC

    * playing with FEniCS

    * blackening

    * Bug fix (Parallel-in-Time#395)

    * readme file changes

    * fixed bugs for stability plots and some edits in README file

    * some edits

    * typo in citation

    * Bump version

    * Bug fix (Parallel-in-Time#396)

    * Clear documentation and some edits in the code

    * forgot black

    * some changes

    * bump version

    * Cosmetic changes (Parallel-in-Time#398)

    * Parallel SDC (Reloaded) project (Parallel-in-Time#397)

    TL: Added efficient diagonal preconditioners and associated project. Coauthored by @caklovicka

    * Generic multi-component mesh  (Parallel-in-Time#400)

    * Generic multicomponent mesh

    * new try

    * Added a test for MultiComponentMesh

    * Test that the type is conserved also after numpy operations

    * Added documentation for how to use `MultiComponentMesh`

    * Changed formatting of the documentation

    * Update ci_pipeline.yml

    * version freak show

    * version freak show II

    * version freak show III

    * version freak show IV

    * Update ci_pipeline.yml

    * version freak show V

    * 2D Brusselator problem (Parallel-in-Time#401)

    * Added 2D Brusselator problem from Hairer-Wanner II. Thanks @grosilho for
    the suggestion!

    * Added forgotten pytest marker

    * Fix brain afk error

    * Added work counter for right hand side evaluations

    * Removed file for running Brusselator from project

    * Retry at removing the file

    * I need to go to git school

    * Datatype `DAEMesh` for DAEs (Parallel-in-Time#384)

    * Added DAE mesh

    * Updated all DAE problems and the SDC-DAE sweeper

    * Updated playgrounds with new DAE datatype

    * Adapted tests

    * Minor changes

    * Black.. :o

    * Added DAEMesh only to semi-explicit DAEs + update for FI-SDC and ProblemDAE.py

    * Black :D

    * Removed unnecessary approx_solution hook + replaced by LogSolution hook

    * Update WSCC9 problem class

    * Removed unnecessary comments

    * Removed test_misc.py

    * Removed registering of newton_tol from child classes

    * Update test_problems.py

    * Rename error hook class for logging global error in differential variable(s)

    * Added MultiComponentMesh - @brownbaerchen + @tlunet + @pancetta Thank you

    * Updated stuff with new version of DAE data type

    * (Hopefully) faster test for WSCC9

    * Test for DAEMesh

    * Renaming

    * ..for DAEMesh.py

    * Bug fix

    * Another bug fix..

    * Preparation for PDAE stuff (?)

    * Changes + adapted first test for PDAE stuff

    * Commented out test_WSCC9_SDC_detection() - too long runtime

    * Minor changes for test_DAEMesh.py

    * Extended test for DAEMesh - credits for @brownbaerchen

    * Test for HookClass_DAE.py

    * Update for DAEMesh + tests

    * 🎉 - speed up test a bit (at least locally..)

    * Forgot to enable other tests again

    * Removed if-else-statements for mesh type

    * View for unknowns in implSysFlatten

    * Fix for RK sweeper - changed nodes in BackwardEuler class (Parallel-in-Time#403)

    * Made aborting the step at growing residual optional (Parallel-in-Time#405)

    * `pySDC`-build-in `LagrangeApproximation` class in `SwitchEstimator` (Parallel-in-Time#406)

    * SE now uses LagrangeApproximation class + removed Lagrange class in SE

    * Removed log message again (not corresponding to PR)

    * version bump

    * Added hook for logging to file (Parallel-in-Time#410)

    * Monodomain project (Parallel-in-Time#407)

    * added some classes from the old explicit_stabilized branch. Mainly, the problem descriptions, datatype classes, and explicit stabilized classes. Tested for IMEX on simple problems

    * added implicit, explicit, and exponential integrators (in electrophysiology aka Rush-Larsen)

    * added exponential imex and mES, added parabolic_system in vec format

    * added new stabilized integrators using multirate, splitting and exponential approaches

    * before adding exponential_runge_kutta as underlying method, instead of the traditional collocation methods

    * added first order exponential runge kutta as underlying collocation method. To be generalized to higher order

    * generalized exponential Runge-Kutta to higher order. Added exponential multirate stabilized method using exponential RK, but must be checked properly

    * fixed a few things

    * optimized a few things

    * renamed project ExplicitStabilized to Monodomain

    * removed deprecated problems

    * fixed some renaming issues

    * did refactoring of code and put in Monodomain_NEW

    * removed old code and renamed new code

    * added finite difference discretization

    * added many things, can't remember

    * old convergence_controller

    * added smooth TTP model for conv test, added DCT for 2D and 3D problems

    * added plot stuff and run scripts

    * fixed controller to original

    * removed explicit stabilized files

    * fixed other files

    * removed obsolete splittings from ionic models

    * removed old sbatch scripts

    * removed mass transfer and sweeper

    * fixed something

    * removed my base transfer

    * removed hook class pde

    * removed FD files

    * fixed some calls to FD stuff

    * removed FEM FEniCSx files

    * renamed FD_Vector to DCT_Vector

    * added hook for output and visualization script

    * removed plot scripts

    * removed run scripts, except convergence

    * removed convergence experiments script

    * fixed TestODE

    * added stability test in run_TestODE

    * added stability test in run_TestODE

    * added stability test in run_TestODE

    * removed obsolete stuff in TestODE

    * removed unneeded stuff from run_MonodomainODE

    * cleaned a bit run_MonodomainODE

    * removed utils/

    * added few comments, cleaned a bit

    * removed schedule from workflow

    * restored tutorial step 7 A which I had modified some time ago

    * run black on monodomain project

    * fixed a formatting thing

    * reformatted everything with black

    * Revert "revert formatted with black"

    This reverts commit 82c82e9.

    * added environment file for monodomain project, started to add stuff in workflow

    * added first test

    * added package tqdm to monodomain environment

    * added new TestODE using DCT_vectors instead of myfloat, moved phi_eval_lists from MonodomainODE to the sweeper

    * deleted old TestODE and myfloat stuff

    * renamed TestODEnew to TestODE

    * cleaned a bit

    * added stability, convergence and iterations tests. Changed a bit other scripts as needed

    * reactivated other tests in workflow

    * removed my tests temporarly

    * added monodomain marker to project pyproject.toml

    * changed files and function names for tests

    * fixed convergence test

    * made one test a bit shorter

    * added test for SDC on HH and fixed missing feature in SDC imex sweeper for monodomain

    * reformatted with correct black options

    * fixed a lint error

    * another lint error

    * adding tests with plot

    * modified convergence test

    * added test iterations in parallel

    * removed plot from tests

    * added plots without writing to file

    * added write to file

    * simplified plot

    * new plot

    * fixed plot in iterations parallel

    * added back all tests and plots

    * cleaned a bit

    * added README

    * fixed readme

    * modified comments in controllers

    * try to compute phi every step

    * removed my controllers; check whether u changed before computing phis

    * enabled postprocessing in pipeline

    * added comments to data_type classes, removed unnecessary methods

    * added comments to hooks

    * added comments to the problem classes

    * added comments to the run scripts

    * added comments to sweepers and transfer classes

    * fixed the readme

    * decommented if in pipeline

    * removed recv_mprobe option

    * changed back some stuff outside of monodomain project

    * same

    * again

    * fixed Thomas hints

    * removed old unneeded move coverage folders

    * fixed previously missed Thomas comments

    * begin change datatype

    * changed run_Monodomain

    * added prints

    * fixed prints

    * mod print

    * mod print

    * mod print

    * mod print

    * reading init val

    * reading init val

    * removed prints

    * removed prints

    * checking longer time

    * checking longer time

    * fixed call phi eval

    * trying 2D

    * trying 2D

    * new_data type passing tests

    * removed coverage folders

    * optimized phi eval lists

    * before changing phi type

    * changed eval phi lists

    * polished a bit

    * before switching indices

    * reformatted phi computation to its transpose

    * before changing Q

    * optimized integral of exp terms

    * changed interface to C++ code

    * moved definition of dtype u f

    * tests passed after code refactoring

    * Generic MPI FFT class (Parallel-in-Time#408)

    * Added generic MPIFFT problem class

    * Fixes

    * Generalized to `xp` in preparation for GPUs

    * Fixes

    * Ported Allen-Cahn to generic MPI FFT implementation

    * Ported Gray-Scott to generic MPI FFT (Parallel-in-Time#412)

    * Ported Gray-Scott to generic MPI FFT class

    * `np` -> `xp`

    * Reverted poor changes

    * Update README.md (Parallel-in-Time#413)

    Added the ExaOcean grant identifier and the "Supported by the European Union - NextGenerationEU." clause that they would like us to display.

    * TIME-X Test Hackathon @ TUD: Test for `SwitchEstimator` (Parallel-in-Time#404)

    * Added piecewise linear interpolation to SwitchEstimator

    * Started with test for SwitchEstimator [WIP]

    * Test to prove sum_restarts when event occurs at boundary

    * Started with test to check adapt_interpolation_info [WIP]

    * Added test for SE.adapt_interpolation_info()

    * Update linear interpolation + logging + changing tolerances

    * Test for linear interpolation + update of other test

    * Correction for finite difference + adaption tolerance

    * Added test for DAE case for SE

    * Choice of FD seems to be important for performance of SE

    * Removed attributes from dummy probs (since the parent classes have it)

    * Test for dummy problems + using functions from battery_model.py

    * Moved standard params for test to function

    * Updated hardcoded solutions for battery models

    * Updated hardcoded solutions for DiscontinuousTestODE

    * Updated docu in SE for FDs

    * Lagrange interpolation works better with backward FD and alpha=0.9

    * Added test for state function + global error

    * Updated LogEvent hooks

    * Updated hardcoded solutions again

    * Adapted test_problems.py

    * Minor changes

    * Updated tests

    * Speed-up test for buck converter

    * Black..

    * Use msg about convergence info in Newton in SE

    * Moved dummy problem to file

    * Speed up loop using mask

    * Removed loop

    * SDC-DAE sweeper for semi-explicit DAEs (Parallel-in-Time#414)

    * Added SI-SDC-DAE sweeper

    * Started with test for SemiImplicitDAE

    * Test for SI-SDC sweeper

    * Clean-up

    * Removed parameter from function

    * Removed test + changed range of loop in SI-sweeper

    ---------

    Co-authored-by: Robert Speck <pancetta@users.noreply.github.com>
    Co-authored-by: Thomas Baumann <39156931+brownbaerchen@users.noreply.github.com>
    Co-authored-by: Thomas <t.baumann@fz-juelich.de>
    Co-authored-by: Ikrom Akramov <96234984+ikrom96git@users.noreply.github.com>
    Co-authored-by: Thibaut Lunet <thibaut.lunet@tuhh.de>
    Co-authored-by: Lisa Wimmer <68507897+lisawim@users.noreply.github.com>
    Co-authored-by: Giacomo Rosilho de Souza <jackrosilho@gmail.com>
    Co-authored-by: Daniel Ruprecht <danielru@users.noreply.github.com>

commit 24cdf05
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Wed Apr 24 09:11:38 2024 +0200

    Split installation and running into two jobs

    One of the two jobs often failed during installation while the other succeeded, which might indicate a race condition. Therefore, installation and usage are split into separate jobs.

commit 488e7a4
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Wed Apr 24 08:34:59 2024 +0200

    ci_pipeline.yml now more similar to upstream

commit 9ab9b63
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Tue Apr 23 15:36:39 2024 +0200

    Reduced diff to master

commit cdb77d4
Author: jakob-fritz <37077134+jakob-fritz@users.noreply.github.com>
Date:   Mon Apr 22 16:46:44 2024 +0200

    WIP: Use HPC for CI (Parallel-in-Time#386)

    Works on Parallel-in-Time#415

    Added sync with Gitlab, now also for pull requests

    ---------

    Co-authored-by: Robert Speck <pancetta@users.noreply.github.com> and Thomas Baumann <brownbaerchen@users.noreply.github.com>

commit fb4b745
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Mon Apr 22 16:02:49 2024 +0200

    Moved development of action into main branch and added version-tag

commit 7de7187
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Mon Apr 22 11:19:53 2024 +0200

    Added triggers for workflows again

commit 5f45785
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Thu Apr 18 14:22:13 2024 +0900

    Updated name of step, as merge is not ff-only anymore

commit 2e9930f
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Wed Apr 17 13:36:44 2024 +0900

    Wrong syntax for if else

commit e33a611
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Wed Apr 17 13:33:40 2024 +0900

    Unshallow repo if needed

commit c7db47a
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Wed Apr 17 13:18:26 2024 +0900

    Add name and email for merge-commit

commit f34de9c
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Wed Apr 17 12:09:20 2024 +0900

    Also allow non-fast-forward merges

commit 82d9233
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Wed Apr 17 11:49:56 2024 +0900

    Don't run mirror on push now (as gitlab-file is incorrect in this branch)

commit d1b7250
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Wed Apr 17 11:49:02 2024 +0900

    Make unshallow before merging to properly compare history

commit 9cfeea3
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Wed Apr 17 11:17:48 2024 +0900

    Changed way to use variables (set locally and later in github_env)

commit 6961ef3
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Tue Apr 16 16:31:51 2024 +0900

    Reverted and changed way to store variable

commit d906604
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Tue Apr 16 16:21:54 2024 +0900

    Redone storing of var again

commit faec097
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Tue Apr 16 14:50:56 2024 +0900

    Corrected querying of a variable

commit cbf0b5d
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Tue Apr 16 13:57:31 2024 +0900

    Added more reporting for better debugging

commit efdaa05
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Tue Apr 16 13:23:08 2024 +0900

    Don't run main CI during development

commit ccd646a
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Tue Apr 16 13:22:44 2024 +0900

    First fetch, to be able to checkout branch

commit 2712998
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Tue Apr 16 12:15:13 2024 +0900

    Don't rerun CI on every push during this development

commit 8a316e2
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Tue Apr 16 12:14:31 2024 +0900

    Moved the check of condition from shell to yaml

commit d347bd3
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Tue Apr 16 11:52:02 2024 +0900

    Try to merge code (from PR) first

    So that merged state is tested in Gitlab-CI

commit bcd64a5
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Mon Feb 5 11:25:43 2024 +0100

    Use specific version of github2lab action

commit 28472dc
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Mon Jan 29 15:10:38 2024 +0100

    Uses newer checkout-action to use new node-version (20)

    Version 16 is deprecated

commit fefe88b
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Mon Jan 29 14:53:18 2024 +0100

    Minor formatting updates in README

    to trigger CI

commit 3de1b56
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Fri Jan 26 16:13:53 2024 +0100

    Formatted md-file to trigger CI

commit ef6a866
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Thu Jan 18 15:48:25 2024 +0100

    Set sha for checkout properly

commit be3aef7
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Mon Jan 15 16:14:41 2024 +0100

    Using default shallow checkout

    Otherwise, other own action complains

commit f38f0e5
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Mon Jan 15 16:11:05 2024 +0100

    Updated ref to use latest code from PR; not merge

    Previously, a version of the code was used that reflected what a merge would look like.
    Now, the code is used as it is in the PR

commit 249741b
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Mon Jan 15 08:47:45 2024 +0100

    Updated workflow for mirroring

commit d8604b7
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Mon Jan 8 16:39:49 2024 +0100

    Try expanding the predefined variable

commit c49accd
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Mon Jan 8 16:37:10 2024 +0100

    Another attempt to get the action to work

commit 5e0118a
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Mon Jan 8 16:28:51 2024 +0100

    Hopefully now, the variable is expanded instead of using the name

commit 832e7e5
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Mon Jan 8 16:07:29 2024 +0100

    Exit instead of return needed

    Because we are exiting the shell, not returning from a function

commit 5a5de4a
Author: Jakob Fritz <j.fritz@fz-juelich.de>
Date:   Mon Jan 8 12:00:55 2024 +0100

    First version of CI to mirror pull_requests to Gitlab

    If someone with write-permissions triggered the workflow
brownbaerchen committed Apr 24, 2024
1 parent fd6ef0b commit a1cb678
Showing 9 changed files with 253 additions and 46 deletions.
37 changes: 0 additions & 37 deletions .github/workflows/ci_pipeline.yml
@@ -37,24 +37,6 @@ jobs:
run: |
flakeheaven lint --benchmark pySDC
# mirror_to_gitlab:

# runs-on: ubuntu-latest

# steps:
# - name: Checkout
# uses: actions/checkout@v1

# - name: Mirror
# uses: jakob-fritz/github2lab_action@main
# env:
# MODE: 'mirror' # Either 'mirror', 'get_status', or 'both'
# GITLAB_TOKEN: ${{ secrets.GITLAB_SECRET_H }}
# FORCE_PUSH: "true"
# GITLAB_HOSTNAME: "codebase.helmholtz.cloud"
# GITLAB_PROJECT_ID: "3525"
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

user_cpu_tests_linux:
runs-on: ubuntu-latest

@@ -215,24 +197,6 @@ jobs:
# run: |
# pytest --continue-on-collection-errors -v --durations=0 pySDC/tests -m ${{ matrix.env }}


# wait_for_gitlab:
# runs-on: ubuntu-latest

# needs:
# - mirror_to_gitlab

# steps:
# - name: Wait
# uses: jakob-fritz/github2lab_action@main
# env:
# MODE: 'get_status' # Either 'mirror', 'get_status', or 'both'
# GITLAB_TOKEN: ${{ secrets.GITLAB_SECRET_H }}
# FORCE_PUSH: "true"
# GITLAB_HOSTNAME: "codebase.helmholtz.cloud"
# GITLAB_PROJECT_ID: "3525"
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

# # - name: Get and prepare artifacts
# # run: |
# # pipeline_id=$(curl --header "PRIVATE-TOKEN: ${{ secrets.GITLAB_SECRET_H }}" --silent "https://gitlab.hzdr.de/api/v4/projects/3525/repository/commits/${{ github.head_ref || github.ref_name }}" | jq '.last_pipeline.id')
@@ -393,4 +357,3 @@ jobs:
# rm -rf data
# unzip artifacts.zip
#

116 changes: 116 additions & 0 deletions .github/workflows/gitlab_ci.yml
@@ -0,0 +1,116 @@
---

name: Mirror to Gitlab to trigger CI

on:
push:
pull_request_target:
schedule:
- cron: '1 5 2 * *'

jobs:
check_permission:
runs-on: ubuntu-latest
if: >-
(github.repository_owner == 'Parallel-in-Time') &&
((github.event_name == 'push') ||
(github.event_name == 'schedule') ||
((github.event_name == 'pull_request_target') &&
(contains(github.event.pull_request.labels.*.name, 'gitlab-mirror'))
)
)
steps:
- name: Query permissions of triggering actor
id: query_permission_triggering_actor
if: github.event_name == 'pull_request_target'
uses: actions-cool/check-user-permission@v2
with:
username: ${{ github.triggering_actor }}
require: 'write'
token: ${{ secrets.GITHUB_TOKEN }}
- name: Interpret the queried result
if: github.event_name == 'pull_request_target'
run: |
echo "Current permission level is ${{ steps.query_permission_triggering_actor.outputs.user-permission }}"
echo "Job originally triggered by ${{ github.actor }}"
echo "Checking permission returned ${{ steps.query_permission_triggering_actor.outputs.require-result }}"
if ${{ steps.query_permission_triggering_actor.outputs.require-result }}
then
echo 'Permissions granted'
exit 0
else
echo 'Not enough permissions. Please ask a member of Parallel-in-Time to rerun the job.'
exit 1
fi
- name: Pass if workflow from push or schedule
if: >-
(github.event_name == 'push') ||
(github.event_name == 'schedule')
run: exit 0
# - name: Fail for other triggers
# if: >-
# (github.event_name != 'push') &&
# (github.event_name != 'schedule') &&
# (github.event_name != 'pull_request_target')
# run: exit 1

mirror_to_gitlab:
runs-on: ubuntu-latest
if: >-
(github.repository_owner == 'Parallel-in-Time') &&
((github.event_name == 'push') ||
(github.event_name == 'schedule') ||
((github.event_name == 'pull_request_target') &&
(contains(github.event.pull_request.labels.*.name, 'gitlab-mirror'))
)
)
needs:
- check_permission
steps:
- name: set proper sha
run: |
echo "${{ github.event_name }}"
if [ "${{ github.event_name }}" == 'push' ] || [ "${{ github.event_name }}" == 'schedule' ]
then
echo "USED_SHA=${{ github.sha }}" >> "$GITHUB_ENV"
fi
if [ "${{ github.event_name }}" == 'pull_request_target' ]
then
echo "USED_SHA=${{ github.event.pull_request.head.sha }}" >> "$GITHUB_ENV"
fi
- name: Checkout
uses: actions/checkout@v4
with:
ref: "${{ env.USED_SHA }}"
persist-credentials: false
- name: check if merge is possible (merge is used for testing)
if: github.event_name == 'pull_request_target'
run: |
if $(git rev-parse --is-shallow-repository); then
git fetch --unshallow
else
git fetch
fi
echo "Checkout of ${{ github.base_ref }}"
git checkout "${{ github.base_ref }}"
echo "Git pull"
git pull
MIRROR_BRANCH="TEMPORARY_MERGE_PR_${{ github.event.number }}"
echo MIRROR_BRANCH="$MIRROR_BRANCH" >> $GITHUB_ENV
echo "Create new branch $MIRROR_BRANCH and check it out"
git checkout -b "$MIRROR_BRANCH"
echo "Setting git committer info, so that merge-commit can be created"
git config user.email "unused@example.com"
git config user.name "Sync bot"
echo "Merge the two parts of the Merge-Request to test the resulting version"
git merge "${{ github.event.pull_request.head.sha }}"
- name: Mirror and wait for Gitlab-CI
uses: jakob-fritz/github2lab_action@v0.7
env:
MODE: 'all' # Either 'mirror', 'get_status', 'get_artifact', or 'all'
GITLAB_TOKEN: ${{ secrets.GITLAB_SECRET }}
FORCE_PUSH: "true"
GITLAB_HOSTNAME: "gitlab.jsc.fz-juelich.de"
GITLAB_PROJECT_ID: "6029"
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
MIRROR_BRANCH: ${{ env.MIRROR_BRANCH }}
79 changes: 79 additions & 0 deletions .gitlab-ci.yml
@@ -1,8 +1,84 @@
---

stages:
- test
- benchmark
- upload


variables:
JUWELS_ACCOUNT: "cstma"


prepare_JUWELS:
stage: benchmark
rules:
- if: $CI_COMMIT_MESSAGE !~ /.*\[CI-no-benchmarks\]/
tags:
- jacamar
- juwels
- login
- shell
script:
- mkdir -p benchmarks
# load the latest Python module (currently 3.11)
- module --force purge
- module load Stages/2024
- module load GCC
- module load OpenMPI
- module load FFTW
- module load mpi4py
- module load SciPy-Stack
- module load CuPy
- pip install -e .
- pip install pytest-benchmark coverage


test_JUWELS:
stage: benchmark
needs:
- prepare_JUWELS
rules:
- if: $CI_COMMIT_MESSAGE !~ /.*\[CI-no-benchmarks\]/
tags:
- jacamar
- juwels
- login
- shell
parallel:
matrix:
- SHELL_SCRIPT: ['benchmark', 'cupy']
artifacts:
when: always
paths:
- coverage_*.dat
- sbatch.err
- sbatch.out
before_script:
- mkdir -p benchmarks
# load the latest Python module (currently 3.11)
- module --force purge
- module load Stages/2024
- module load GCC
- module load OpenMPI
- module load FFTW
- module load mpi4py
- module load SciPy-Stack
- module load CuPy
script:
# - touch benchmarks/output.json
- echo $SYSTEMNAME
- sbatch --wait etc/juwels_${SHELL_SCRIPT}.sh
- touch .coverage.empty
- python -m coverage combine
- mv .coverage coverage_${SHELL_SCRIPT}.dat
after_script:
- echo "Following Errors occurred:"
- cat sbatch.err
- echo "Following was written to stdout:"
- cat sbatch.out


#test_kit:
# image: rcaspart/micromamba-cuda
# stage: benchmark
@@ -64,6 +140,9 @@ stages:
benchmark:
image: mambaorg/micromamba
stage: benchmark
when: manual
tags:
- docker
rules:
- if: $CI_COMMIT_MESSAGE !~ /.*\[CI-no-benchmarks\]/
artifacts:
14 changes: 7 additions & 7 deletions CODE_OF_CONDUCT.md
@@ -61,7 +61,7 @@ representative at an online or offline event.
## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement
reported to the community leaders responsible for enforcement
[here](mailto:r.speck@fz-juelich.de).
All complaints will be reviewed and investigated promptly and fairly.

@@ -118,15 +118,15 @@ the community.

This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 2.0, available at
https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
<https://www.contributor-covenant.org/version/2/0/code_of_conduct.html>.

Community Impact Guidelines were inspired by [Mozilla's code of conduct
enforcement ladder](https://github.com/mozilla/diversity).
enforcement ladder](<https://github.com/mozilla/diversity>).

[homepage]: https://www.contributor-covenant.org
[homepage]: <https://www.contributor-covenant.org>

For answers to common questions about this code of conduct, see the FAQ at
https://www.contributor-covenant.org/faq. Translations are available at
https://www.contributor-covenant.org/translations.
<https://www.contributor-covenant.org/faq>. Translations are available at
<https://www.contributor-covenant.org/translations>.

:arrow_left: [Back to main page](./README.md)
:arrow_left: [Back to main page](./README.md)
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -16,4 +16,4 @@ This follows a specific OOP framework, you can look at the page on [custom imple
5. [Custom Implementations](./docs/contrib/04_custom_implementations.md)
6. [Documenting Code](./docs/contrib/05_documenting_code.md)

:arrow_left: [Back to main page](./README.md)
:arrow_left: [Back to main page](./README.md)
2 changes: 1 addition & 1 deletion README.md
@@ -4,6 +4,7 @@
[![zenodo](https://zenodo.org/badge/26165004.svg)](https://zenodo.org/badge/latestdoi/26165004)
[![fair-software.eu](https://img.shields.io/badge/fair--software.eu-%E2%97%8F%20%20%E2%97%8F%20%20%E2%97%8F%20%20%E2%97%8F%20%20%E2%97%8F-green)](https://fair-software.eu)
[![SQAaaS badge shields.io](https://img.shields.io/badge/sqaaas%20software-silver-lightgrey)](https://api.eu.badgr.io/public/assertions/aS8J0NDTTjCyYP6iVufviQ "SQAaaS silver badge achieved")

# Welcome to pySDC!

The `pySDC` project is a Python implementation of the
@@ -95,7 +96,6 @@ Checkout the [Changelog](./CHANGELOG.md) to see pySDC's evolution since 2016.
Any contribution is dearly welcome ! If you want to take part of this, please take the time to read our [Contribution Guidelines](./CONTRIBUTING.md)
(and don't forget to take a peek at our nice [Code of Conduct](./CODE_OF_CONDUCT.md) :wink:).


## Acknowledgements

This project has received funding from the [European High-Performance
31 changes: 31 additions & 0 deletions docs/contrib/02_continuous_integration.md
@@ -110,6 +110,37 @@ pytest -v pySDC/tests
> pytest -v pySDC/tests/test_nodes.py::test_nodesGeneration[LEGENDRE] # only test_nodesGeneration with LEGENDRE nodes
> ```
## Running CI on HPC from pull requests
By syncing the GitHub repository to a certain Gitlab instance, CI-Jobs can be run on HPC machines. This can be helpful for benchmarks or when running on accelerators that are not available as GitHub runners.
For security and accounting reasons, a few extra steps are needed in order to run the contents of a pull request on HPC:
- The pull request needs to have the tag "gitlab-mirror" assigned to it.
- A person with write-permission for the Parallel-in-Time pySDC repository needs to trigger the workflow. Ask for someone with the required permissions to rerun the workflow if needed.
- The workflow checks if the code can be merged. If this is not the case, the code is not mirrored and the workflow fails. In this case, please merge upstream changes, fix all conflicts, and rerun the workflow.
> :bell: Note that direct pushes to Parallel-in-Time/pySDC will always trigger the HPC pipeline on Gitlab
Regardless of why the Gitlab pipeline was triggered, the following holds true:
- The return-state from Gitlab is transmitted to GitHub (Success/Failure) leading to the same result in GitHub
- Logs from Gitlab are also transferred. The full logs of all jobs can be read from within GitHub. For better overview, these are folded, so unfolding is needed before reading.
- Artifacts from Gitlab jobs are also transferred back to GitHub
- Information such as coverage is transferred to GitHub, but not yet merged across multiple GitHub workflows. Therefore, there is no complete summary of e.g. coverage-reports across all jobs in all workflows.
> :warning: The coverage report from the HPC tests is not yet merged with other reports. The test coverage will not show up on the respective website or in the badge. We are working on this.
### HPC test environments
In order to run tests on GPUs, please use the pytest marker `cupy`.
If you want to create a new HPC test environment, the following steps need to be completed:
- Create a new slurm job-script in `etc/juwels_*.sh`. The name and location of the file is important.
- Adapt `.gitlab-ci.yml` to include the new job-script. For this, add a name in the job "test_JUWELS" in the section `parallel: matrix: SHELL_SCRIPT`. The name there must match the name of the newly created file.
As a starting point it is recommended to copy and adapt an existing file (e.g. `etc/juwels_cupy.sh`).
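A minimal sketch of the two steps above, modeled on the existing `etc/juwels_cupy.sh` (the file name `etc/juwels_mytests.sh` and the pytest marker `mytests` are hypothetical placeholders, not part of the repository):

```shell
#!/bin/bash -x
#SBATCH --account=cstma
#SBATCH --nodes=1
#SBATCH --time=00:10:00
#SBATCH --partition=devel
#SBATCH --output=sbatch.out
#SBATCH --error=sbatch.err

# Run only tests carrying the hypothetical marker "mytests" and collect coverage
srun python -m coverage run -m pytest --continue-on-collection-errors -v pySDC/tests -m "mytests"
```

The matching `.gitlab-ci.yml` change would then extend the matrix of the job "test_JUWELS" to `SHELL_SCRIPT: ['benchmark', 'cupy', 'mytests']`.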
## Code coverage
This stage checks how much of the `pySDC` code is tested by the previous stage. It is based on the [coverage](https://pypi.org/project/coverage/) library and currently applied to the following directories:
9 changes: 9 additions & 0 deletions etc/juwels_benchmark.sh
@@ -0,0 +1,9 @@
#!/bin/bash -x
#SBATCH --account=cstma
#SBATCH --nodes=1
#SBATCH --time=00:10:00
#SBATCH --partition=devel
#SBATCH --output=sbatch.out
#SBATCH --error=sbatch.err

srun python -m pytest --continue-on-collection-errors -v pySDC/tests -m "benchmark" --benchmark-json=benchmarks.json
9 changes: 9 additions & 0 deletions etc/juwels_cupy.sh
@@ -0,0 +1,9 @@
#!/bin/bash -x
#SBATCH --account=cstma
#SBATCH --nodes=1
#SBATCH --time=00:10:00
#SBATCH --partition=develgpus
#SBATCH --output=sbatch.out
#SBATCH --error=sbatch.err

srun python -m coverage run -m pytest --continue-on-collection-errors -v pySDC/tests -m "cupy"
