
Prepare release v4.0 #273

Merged
32 commits merged into main from the release branch on Dec 22, 2023

Conversation

bittremieux
Collaborator

Merge dev into main, resolve version conflicts, and update version numbers in the changelog.

bittremieux and others added 30 commits April 6, 2023 10:16
* Upgrade depthcharge

* Update CHANGELOG.md
* Fix logging and checkpointing bug

* Add option to validate every n steps
* Always use full file paths

Fixes #167.

* Update changelog

* Formatting fix
* Only split off extension if it's mzTab

* Also check for .log extension

* Update output format in help message
* Overhaul runner

* Update linting to only happen once

* Fix linting error

* Specify utf-8 encoding

* Specify utf-8 encoding only for default config

* Skip weights tests for now

* Update skipping API test

* Revert accidental max_epochs change

* msg -> reason for pytest.mark.skip

* Wout's suggestions and more tests

* Remove encoding

* Specify device type when weight loading

* Fix lint

* Capture init params and figure out device automagically

* Add runner tests

* Fix bug and limit saved models

* Support old weights too

* Remove every_n_train_steps from checkpoint

---------

Co-authored-by: melihyilmaz <yilmazmelih97@gmail.com>
Checkpointing during training can be enabled using the `logger` config option. If it is not specified, or during inference, the `lightning_logs/` directory is not created.

Fixes #187.
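The behavior described above can be sketched roughly as follows. This is an illustrative helper, not Casanovo's actual API; the `logger` config key and function name are assumptions, while `logger=False` and `enable_checkpointing=False` are real `pl.Trainer` arguments.

```python
def trainer_kwargs(config: dict, training: bool) -> dict:
    """Build Trainer keyword arguments (hypothetical helper for illustration)."""
    if training and config.get("logger") is not None:
        # Training with a configured logger: enable checkpointing too.
        return {"logger": config["logger"], "enable_checkpointing": True}
    # Inference, or no logger configured: disable both so that no
    # lightning_logs/ directory is created.
    return {"logger": False, "enable_checkpointing": False}
```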
* Avoid detailed fsspec logging

* Replace deprecated `auto_select_gpus`

* Show nice dataloader workers log message

Capture the PyTorch Lightning warning as per https://lightning.ai/docs/pytorch/stable/advanced/speed.html#dataloaders
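One generic way to capture a library warning and surface it through the application logger, a sketch under assumed names rather than the exact Casanovo code:

```python
import logging
import warnings


def call_with_logged_warnings(func, logger: logging.Logger):
    """Run func, re-emitting any warnings it raises as log messages."""
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")  # ensure every warning is recorded
        result = func()
    for w in caught:
        logger.warning("%s", w.message)
    return result
```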

* Overhaul runner

* Update linting to only happen once

* Fix linting error

* Specify utf-8 encoding

* Specify utf-8 encoding only for default config

* Skip weights tests for now

* Update skipping API test

* Revert accidental max_epochs change

* msg -> reason for pytest.mark.skip

* Remove obsolete imports

* Fix type hints

* Wout's suggestions and more tests

* Remove encoding

* Ignore irrelevant PyTorch warnings

* Update changelog

* Remove unused imports

* Undo unnecessary whitespace change

---------

Co-authored-by: William Fondrie <fondriew@gmail.com>
* Ensure correct output with n_beams=1

Fixes #185.

* Use a single beam by default

Fixes #193.

* Add unit test for single beam

---------

Co-authored-by: melihyilmaz <yilmazmelih97@gmail.com>
* Run linting and tests for dev branch

* Fix tests

Limit DepthCharge to v0.2.

* No DepthCharge v0.3
* Overhaul runner

* Update linting to only happen once

* Fix linting error

* Specify utf-8 encoding

* Specify utf-8 encoding only for default config

* Skip weights tests for now

* Update skipping API test

* Revert accidental max_epochs change

* msg -> reason for pytest.mark.skip

* Wout's suggestions and more tests

* Remove encoding

* Specify device type when weight loading

* Fix lint

* Capture init params and figure out device automagically

* Add runner tests

* Fix bug and limit saved models

* WIP

* Support old weights too

* Remove every_n_train_steps from checkpoint

* Implementation done. Need to update tests

* Update tests

* add -h

* Add auto screenshot generation

* Add screenshots workflow [screenshots]

* Add manual dispatch

* Added image

* Updated instructions [screenshots]

* Update screenshot workflow

* Fix bug and screenshot uploads

* Restrict workflow

* fix ref

* Generate new screengrabs with rich-codex

* increase timeout

* bump changelog and increase timeout

* Generate new screengrabs with rich-codex

* Add more tests

* Add option for calculating precision during training

* Restrict depthcharge version

* Log scalars instead of tensors

* Fix typo

* Fix issue Wout found

* Write to stdout instead of stderr

* Minor refactoring

* Separate logger and model initialization

* Generate new screengrabs with rich-codex

* Generate new screengrabs with rich-codex

* Fix test formatting

* Fix edge case

---------

Co-authored-by: melihyilmaz <yilmazmelih97@gmail.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Wout Bittremieux <wout@bittremieux.be>
* Remove PyTorch Lightning logger

Fixes #219.

* Update changelog

* Clarify tb_summarywriter
* Format how warnings are logged

Fixes #222.

* Update changelog

* Fix linting
* Rename every_n_train_steps to val_check_interval

* Disable check_val_every_n_epochs

* Update changelog
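For context on the rename: Lightning's `val_check_interval`, when given an integer, triggers validation every N training batches instead of once per epoch. The toy function below only illustrates which steps would validate; it is not Casanovo code.

```python
def validation_steps(total_steps: int, val_check_interval: int) -> list:
    """Return the 1-based training steps after which validation would run."""
    return [step for step in range(1, total_steps + 1)
            if step % val_check_interval == 0]
```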
* Fix specifying custom residues

* Update changelog
Added shuffling to train dataloader
* Add lines to force gradient calculation.

During inference only.

* Add comments in the code to document the temporary workaround

* Update changelog

---------

Co-authored-by: Varun Ananth <vananth3@nexus2.gs.washington.edu>
Co-authored-by: Wout Bittremieux <wout@bittremieux.be>
Co-authored-by: Wout Bittremieux <bittremieux@users.noreply.github.com>
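The workaround above amounts to re-enabling gradient calculation inside the inference path, since Lightning runs prediction under `torch.no_grad()`. A minimal sketch under that assumption, not the actual Casanovo code:

```python
import torch


def predict_needing_grads(model: torch.nn.Module, batch: torch.Tensor):
    # Prediction normally runs under no_grad; temporarily re-enable
    # gradient tracking for operations that require it.
    with torch.enable_grad():
        return model(batch)
```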
And fix duplicate changelog organization.
* resolves issue #238: remove custom_encoder option

* fixed lint issue

* fixed lint issue

* Revert "fixed lint issue"

This reverts commit bd1366c.

* lint

* lint issue

* Consistently format changelog.

---------

Co-authored-by: Isha Gokhale <igokhale@tyrosine.gs.washington.edu>
Co-authored-by: Wout Bittremieux <wout@bittremieux.be>
Fixes #252.

Co-authored-by: Melih Yilmaz <32707537+melihyilmaz@users.noreply.github.com>
* Remove force_grad in inference

* Upgrade required PyTorch version

* Update CHANGELOG.md

* Update CHANGELOG.md

* Fix typo in torch version

* Specify correct PyTorch version change

---------

Co-authored-by: Wout Bittremieux <bittremieux@users.noreply.github.com>
* Added unit tests for raising exceptions on unrecognized/missing config file entries

* fixed lint issue

* fix lint issue

* fixed failing unit test

* lint issue

* lint

* lint issue

---------

Co-authored-by: Isha Gokhale <igokhale@tyrosine.gs.washington.edu>
* Refactor config checking

- More efficient checking for unknown and missing config entries (no needless conversions/creations of sets and lists).
- Avoid creating temporary files in root directories during unit testing.
- Update outdated test.

* Add unit test fix
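Set-style operations on `dict.keys()` views avoid the needless intermediate sets and lists mentioned above. A rough sketch with illustrative names:

```python
def check_config(config: dict, defaults: dict) -> None:
    """Raise if the config has unknown entries or lacks required ones."""
    unknown = config.keys() - defaults.keys()   # dict views support set ops
    missing = defaults.keys() - config.keys()
    if unknown:
        raise KeyError(f"Unrecognized config entries: {sorted(unknown)}")
    if missing:
        raise KeyError(f"Missing config entries: {sorted(missing)}")
```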
* Add option to change the learning rate scheduler and make it easier to add a new one.

* docs

* tests and formatting

* Add label smoothing

* Modify config file

* Minor fix config.yaml

* Run black

* Lint casanovo.py

* Revert "Merge branch 'add_lr_schedule_options' into label-smoothing"

This reverts commit 5716c7a, reversing
changes made to b044bc6.

* Add unit test

* Fix config test and add changelog

---------

Co-authored-by: Justin Sanders <jsander1@n033.grid.gs.washington.edu>
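Label smoothing mixes the one-hot target with a uniform distribution; passing `label_smoothing` to PyTorch's `CrossEntropyLoss` does this internally. A plain-Python illustration of the target transformation, not the Casanovo implementation:

```python
def smoothed_target(num_classes: int, target: int, eps: float) -> list:
    """One-hot target softened by label smoothing factor eps."""
    uniform = eps / num_classes
    probs = [uniform] * num_classes
    probs[target] += 1.0 - eps  # the remaining mass stays on the true class
    return probs
```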
* Use auto-downloaded weights

* Use config options in model init

* Fix linting

* Fix missing return value

* Override mismatching params with ckpt

* Handle corrupt ckpt

* Add test case for parameter mismatch

* Add Python version to screenshot action

* Generate new screengrabs with rich-codex

* Fix import order

* Minor reformatting

---------

Co-authored-by: William Fondrie <fondriew@gmail.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Wout Bittremieux <wout@bittremieux.be>
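Overriding mismatching parameters with the checkpoint's values can be sketched as below (illustrative names, not the actual Casanovo API): architecture-defining parameters must match the stored weights, so the checkpoint wins for those, while other config values are kept.

```python
def resolve_init_params(config: dict, ckpt_params: dict, arch_keys: frozenset) -> dict:
    """Prefer checkpoint values for architecture params so weights can load."""
    resolved = dict(config)
    for key in arch_keys & ckpt_params.keys():
        resolved[key] = ckpt_params[key]  # checkpoint wins on mismatch
    return resolved
```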
Add option to change label smoothing when initializing a model from a checkpoint file.

codecov bot commented Dec 22, 2023

Codecov Report

Attention: 19 lines in your changes are missing coverage. Please review.

Comparison: base (ea2e875) 81.41% vs. head (c862bb5) 89.54%.
Report is 1 commit behind head on main.

Files                              Patch %   Missing lines
casanovo/denovo/model_runner.py    87.02%    17 ⚠️
casanovo/casanovo.py               98.85%     1 ⚠️
casanovo/denovo/dataloaders.py     85.71%     1 ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #273      +/-   ##
==========================================
+ Coverage   81.41%   89.54%   +8.12%     
==========================================
  Files          11       12       +1     
  Lines         807      918     +111     
==========================================
+ Hits          657      822     +165     
+ Misses        150       96      -54     


@bittremieux bittremieux merged commit 3c2d3f5 into main Dec 22, 2023
7 checks passed
@bittremieux bittremieux deleted the release branch December 22, 2023 13:42
7 participants