
Prepare release v4.0 #273

Merged 32 commits on Dec 22, 2023
Commits
7ec31be
Upgrade depthcharge (#160)
bittremieux Apr 6, 2023
59974e3
Fix val step and add unit test (#164)
melihyilmaz Apr 10, 2023
da1d16d
Add validation frequency option (#165)
melihyilmaz Apr 11, 2023
5ba0275
Minor refactoring + issue fix (#166)
bittremieux Apr 14, 2023
868952e
Always use full file paths (#168)
bittremieux Apr 17, 2023
effc955
Only split off known extensions from output filename (#171)
bittremieux Apr 17, 2023
6299bd2
Fix CPU bug, overhaul model runner, and update to lightning >=2.0 (#176)
wfondrie May 10, 2023
ea69360
Only create log directory when checkpointing is enabled (#196)
bittremieux Jun 23, 2023
f74a6ff
Avoid uninformative warnings (#175)
bittremieux Jun 27, 2023
b271dff
Use a single beam by default (#195)
bittremieux Jun 27, 2023
621ebdc
Run linting and tests for the dev branch (#197)
bittremieux Jun 27, 2023
ad48a09
CLI revamp (#184)
wfondrie Jul 18, 2023
8d2593b
Remove Pytorch Lightning logger (#220)
bittremieux Aug 3, 2023
e1e1bb0
Nicely format logged warnings (#223)
bittremieux Aug 3, 2023
3ac0887
Fix validation and checkpointing interval (#224)
bittremieux Aug 3, 2023
514db80
Fix custom residues in config (#229)
bittremieux Aug 16, 2023
bbce330
Added shuffling to trainset dataloaders, no shuffle by default for ot…
cfmelend Aug 17, 2023
51221f9
Merge pull request #234 from cfmelend/shuffle
cfmelend Aug 17, 2023
d9396aa
Force gradient calculation during inference (#231)
VarunAnanth2003 Aug 18, 2023
82be5ae
Document batch shuffling in the changelog
bittremieux Aug 18, 2023
727ead6
Upgrade depthcharge to v0.2.3 (#235)
bittremieux Aug 21, 2023
86630e3
Edits to config file. (#237)
wsnoble Aug 24, 2023
9a44630
Remove unused custom_encoder option (#254)
ishagokhale Oct 19, 2023
3a18fed
Correctly report AA precision and recall during validation (#253)
bittremieux Oct 24, 2023
7721962
Remove gradient calculation during inference (#258)
melihyilmaz Oct 24, 2023
235420f
Issue error for unrecognized/missing config file entry (#257)
ishagokhale Nov 9, 2023
e073415
Check for invalid/missing config entries (#268)
bittremieux Dec 11, 2023
3b688e8
Label smoothing in training (#261)
melihyilmaz Dec 12, 2023
2aed9e5
Use config options and auto-downloaded weights (#246)
melihyilmaz Dec 12, 2023
ad9ba0f
Minor fix to label smoothing (#270)
melihyilmaz Dec 13, 2023
0486de2
Merge dev into main
bittremieux Dec 22, 2023
c862bb5
Update version number in changelog
bittremieux Dec 22, 2023
10 changes: 9 additions & 1 deletion .github/workflows/black.yml → .github/workflows/lint.yml
@@ -1,6 +1,14 @@
 name: Lint

-on: [push, pull_request]
+on:
+  push:
+    branches:
+      - main
+      - dev
+  pull_request:
+    branches:
+      - main
+      - dev

 jobs:
   lint:
33 changes: 33 additions & 0 deletions .github/workflows/screenshots.yml
@@ -0,0 +1,33 @@
+name: Screenshots with rich-codex
+on:
+  pull_request:
+    paths:
+      - "docs/*.md"
+      - "casanovo/casanovo.py"
+  workflow_dispatch:
+
+jobs:
+  rich_codex:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Check out the repo
+        uses: actions/checkout@v4
+        with:
+          ref: ${{ github.head_ref }}
+
+      - name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: "3.10"
+
+      - name: Install your custom tools
+        run: |
+          python -m pip install --upgrade pip
+          pip install .
+
+      - name: Generate terminal images with rich-codex
+        uses: ewels/rich-codex@v1
+        with:
+          timeout: 10
+          commit_changes: "true"
+          clean_img_paths: docs/images/*.svg
10 changes: 7 additions & 3 deletions .github/workflows/tests.yml
@@ -5,16 +5,20 @@ name: tests

 on:
   push:
-    branches: [ main ]
+    branches:
+      - main
+      - dev
   pull_request:
-    branches: [ main ]
+    branches:
+      - main
+      - dev

 jobs:
   build:
     runs-on: ${{ matrix.os }}
     strategy:
       matrix:
-        os: [ubuntu-latest, windows-latest]
+        os: [ubuntu-latest, windows-latest, macos-latest]

     steps:
       - uses: actions/checkout@v2
2 changes: 2 additions & 0 deletions .gitignore
@@ -1,5 +1,7 @@
 # Test stuff:
 test_path/
+lightning_logs/
+envs/

 # Byte-compiled / optimized / DLL files
 __pycache__/
39 changes: 38 additions & 1 deletion CHANGELOG.md
@@ -6,6 +6,42 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),

 ## [Unreleased]

+## [4.0.0] - 2023-12-22
+
+### Added
+
+- Checkpoints include model parameters, allowing for mismatches with the provided configuration file.
+- `accelerator` parameter controls the accelerator (CPU, GPU, etc.) that is used.
+- `devices` parameter controls the number of accelerators used.
+- `val_check_interval` parameter controls the frequency of both validation epochs and model checkpointing during training.
+- `train_label_smoothing` parameter controls the amount of label smoothing applied when calculating the training loss.
+
+### Changed
+
+- The CLI has been overhauled to use subcommands.
+- Upgraded to Lightning >=2.0.
+- Checkpointing is configured to save the top-k models instead of all.
+- Log steps rather than epochs as units of progress during training.
+- Validation performance metrics are logged (and added to tensorboard) at the validation epoch, and training loss is logged at the end of the training epoch, i.e. training and validation metrics are logged asynchronously.
+- Irrelevant warning messages on the console output and in the log file are no longer shown.
+- Nicely format logged warnings.
+- `every_n_train_steps` has been renamed to `val_check_interval` in accordance with the corresponding Pytorch Lightning parameter.
+- Training batches are randomly shuffled.
+- Upgraded to Torch >=2.1.
+
+### Removed
+
+- Remove config option for a custom Pytorch Lightning logger.
+- Remove superfluous `custom_encoder` config option.
+
+### Fixed
+
+- Casanovo runs on CPU and can pass all tests.
+- Correctly refer to input peak files by their full file path.
+- Specifying custom residues to retrain Casanovo is now possible.
+- Upgrade to depthcharge v0.2.3 to fix sinusoidal encoding and for the `PeptideTransformerDecoder` hotfix.
+- Correctly report amino acid precision and recall during validation.
+
 ## [3.5.0] - 2023-08-16

 ### Fixed
@@ -181,7 +217,8 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),

 - Initial Casanovo version.

-[Unreleased]: https://github.com/Noble-Lab/casanovo/compare/v3.5.0...HEAD
+[Unreleased]: https://github.com/Noble-Lab/casanovo/compare/v4.0.0...HEAD
+[4.0.0]: https://github.com/Noble-Lab/casanovo/compare/v3.5.0...v4.0.0
 [3.5.0]: https://github.com/Noble-Lab/casanovo/compare/v3.4.0...v3.5.0
 [3.4.0]: https://github.com/Noble-Lab/casanovo/compare/v3.3.0...v3.4.0
 [3.3.0]: https://github.com/Noble-Lab/casanovo/compare/v3.2.0...v3.3.0
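
The hardware and validation options added in the changelog above (`accelerator`, `devices`, `val_check_interval`) share their names with arguments of the Lightning 2.x `Trainer`. The sketch below only illustrates that mapping under the assumption that the config values are passed through unchanged; it is not Casanovo's actual model runner, and the values are placeholders.

```python
# Minimal sketch, assuming the new config options map directly onto a
# Lightning 2.x Trainer (accelerator, devices, and val_check_interval are
# real Trainer arguments). Not Casanovo's runner code; values are placeholders.
import lightning.pytorch as pl

config = {
    "accelerator": "auto",         # e.g. "cpu", "gpu", or "auto"
    "devices": 1,                  # number of accelerators to use
    "val_check_interval": 50_000,  # validate/checkpoint every N training steps
}

trainer = pl.Trainer(
    accelerator=config["accelerator"],
    devices=config["devices"],
    val_check_interval=config["val_check_interval"],
)
```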
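
Similarly, a `train_label_smoothing` option typically feeds PyTorch's built-in `label_smoothing` argument of the cross-entropy loss used during training, while the validation loss is usually left unsmoothed. The snippet below is an illustrative sketch of that pattern; the wiring and the example value are assumptions, not Casanovo's actual training code.

```python
# Minimal sketch, assuming train_label_smoothing is forwarded to PyTorch's
# CrossEntropyLoss(label_smoothing=...). Illustrative only; not Casanovo's code.
import torch

train_label_smoothing = 0.01  # hypothetical example value

# Smoothed loss for training; an unsmoothed loss is kept for validation metrics.
train_loss_fn = torch.nn.CrossEntropyLoss(label_smoothing=train_label_smoothing)
val_loss_fn = torch.nn.CrossEntropyLoss()

logits = torch.randn(8, 28)           # (batch, token vocabulary)
targets = torch.randint(0, 28, (8,))  # ground-truth token indices
print(train_loss_fn(logits, targets), val_loss_fn(logits, targets))
```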