Commit

Fix mypy issues (#916)
**Description of PR**
Currently, we are on a fairly old version of mypy. We also use hacks to
get around hera/pydantic compatibility.

This PR fixes various typing issues and bugs that were uncovered as a
result.

---------

Signed-off-by: Sambhav Kothari <skothari44@bloomberg.net>
Signed-off-by: Elliot Gunton <egunton@bloomberg.net>
Co-authored-by: Elliot Gunton <egunton@bloomberg.net>
sambhav and elliotgunton authored Jan 10, 2024
1 parent 422dcf3 commit 6979d06
Showing 74 changed files with 24,173 additions and 21,422 deletions.
6 changes: 3 additions & 3 deletions Makefile
@@ -14,7 +14,7 @@ install: ## Run poetry install with all extras for development
.PHONY: install-3.8
install-3.8: ## Install python3.8 for generating test data
@poetry env use 3.8
@poetry install
@poetry install --all-extras

.PHONY: ci
ci: ## Run all the CI checks
@@ -61,11 +61,11 @@ workflows-models: ## Generate the Workflows models portion of Argo Workflows
--wrap-string-literal \
--disable-appending-item-suffix \
--disable-timestamp \
--use-annotated \
--use-default-kwarg
@find src/hera/workflows/models/ -name '*.py' -exec sed -i.bak 's/from pydantic import Field/from hera.shared._pydantic import Field/' {} +
@find src/hera/workflows/models/ -name '*.bak' -delete
@poetry run python scripts/models.py $(OPENAPI_SPEC_URL) workflows
@poetry run stubgen -o src -p hera.workflows.models && find src/hera/workflows/models -name '__init__.pyi' -delete
@rm $(SPEC_PATH)
@$(MAKE) format

@@ -84,11 +84,11 @@ events-models: ## Generate the Events models portion of Argo Workflows
--wrap-string-literal \
--disable-appending-item-suffix \
--disable-timestamp \
--use-annotated \
--use-default-kwarg
@find src/hera/events/models/ -name '*.py' -exec sed -i.bak 's/from pydantic import Field/from hera.shared._pydantic import Field/' {} +
@find src/hera/events/models/ -name '*.bak' -delete
@poetry run python scripts/models.py $(OPENAPI_SPEC_URL) events
@poetry run stubgen -o src -p hera.events.models && find src/hera/events/models -name '__init__.pyi' -delete
@rm $(SPEC_PATH)
@$(MAKE) format

1 change: 1 addition & 0 deletions docs/examples/workflows-examples.md
@@ -107,6 +107,7 @@ Explore the examples through the side bar!
| [template-defaults](https://github.com/argoproj/argo-workflows/blob/main/examples/template-defaults.yaml) |
| [testvolume](https://github.com/argoproj/argo-workflows/blob/main/examples/testvolume.yaml) |
| [timeouts-step](https://github.com/argoproj/argo-workflows/blob/main/examples/timeouts-step.yaml) |
| [title-and-descriptin-with-markdown](https://github.com/argoproj/argo-workflows/blob/main/examples/title-and-descriptin-with-markdown.yaml) |
| [work-avoidance](https://github.com/argoproj/argo-workflows/blob/main/examples/work-avoidance.yaml) |
| [workflow-count-resourcequota](https://github.com/argoproj/argo-workflows/blob/main/examples/workflow-count-resourcequota.yaml) |
| [workflow-event-binding/event-consumer-workfloweventbinding](https://github.com/argoproj/argo-workflows/blob/main/examples/workflow-event-binding/event-consumer-workfloweventbinding.yaml) |
2 changes: 1 addition & 1 deletion docs/examples/workflows/global_config.md
@@ -41,7 +41,7 @@
entrypoint: whalesay
serviceAccountName: argo-account
templates:
- activeDeadlineSeconds: '100'
- activeDeadlineSeconds: 100
container:
command:
- cowsay
2 changes: 1 addition & 1 deletion docs/examples/workflows/upstream/default_pdb_support.md
@@ -41,7 +41,7 @@ The upstream example can be [found here](https://github.com/argoproj/argo-workfl
spec:
entrypoint: pdbcreate
podDisruptionBudget:
minAvailable: '9999'
minAvailable: 9999
serviceAccountName: default
templates:
- container:
2 changes: 1 addition & 1 deletion docs/examples/workflows/upstream/retry_backoff.md
@@ -47,7 +47,7 @@ spec:
command=["python", "-c"],
args=["import random; import sys; exit_code = random.choice([0, 1, 1]); sys.exit(exit_code)"],
retry_strategy=RetryStrategy(
limit=10,
limit="10",
backoff=m.Backoff(
duration="1",
factor="2",
2 changes: 1 addition & 1 deletion docs/examples/workflows/upstream/retry_container.md
@@ -19,7 +19,7 @@ The upstream example can be [found here](https://github.com/argoproj/argo-workfl
image="python:alpine3.6",
command=["python", "-c"],
args=["import random; import sys; exit_code = random.choice([0, 1, 1]); sys.exit(exit_code)"],
retry_strategy=RetryStrategy(limit=10),
retry_strategy=RetryStrategy(limit="10"),
)
```
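The `limit="10"` changes in these retry examples line up with Argo's `IntOrString`-style fields, which serialize as strings; under the stricter mypy setup, bare ints no longer type-check. A stdlib-only sketch of that kind of coercion (function name hypothetical, not hera's API):

```python
from typing import Union


def to_int_or_string(value: Union[int, str]) -> str:
    """Coerce an IntOrString-style value to its canonical string
    form, the representation the generated models expect."""
    return str(value) if isinstance(value, int) else value


to_int_or_string(10)    # returns "10"
to_int_or_string("10")  # returns "10"
```

Passing the string form directly, as the updated examples now do, avoids relying on any implicit runtime coercion.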

2 changes: 1 addition & 1 deletion docs/examples/workflows/upstream/retry_script.md
@@ -14,7 +14,7 @@ The upstream example can be [found here](https://github.com/argoproj/argo-workfl
from hera.workflows import RetryStrategy, Workflow, script


@script(image="python:alpine3.6", retry_strategy=RetryStrategy(limit=10), add_cwd_to_sys_path=False)
@script(image="python:alpine3.6", retry_strategy=RetryStrategy(limit="10"), add_cwd_to_sys_path=False)
def retry_script():
import random
import sys
2 changes: 0 additions & 2 deletions docs/examples/workflows/volume_mounts_nfs.md
@@ -38,8 +38,6 @@
server="your.nfs.server",
mount_path="/mnt/nfs",
path="/share/nfs",
size="1Gi",
storage_class_name="nfs-client",
)
],
entrypoint="d",
2 changes: 1 addition & 1 deletion examples/workflows/global-config.yaml
@@ -7,7 +7,7 @@ spec:
entrypoint: whalesay
serviceAccountName: argo-account
templates:
- activeDeadlineSeconds: '100'
- activeDeadlineSeconds: 100
container:
command:
- cowsay
2 changes: 1 addition & 1 deletion examples/workflows/upstream/default-pdb-support.yaml
@@ -5,7 +5,7 @@ metadata:
spec:
entrypoint: pdbcreate
podDisruptionBudget:
minAvailable: '9999'
minAvailable: 9999
serviceAccountName: default
templates:
- container:
@@ -27,7 +27,7 @@ spec:
key: path/in/bucket
# Specify the bucket region. Note that if you want Argo to figure out this automatically,
# you can set additional statement policy that allows `s3:GetBucketLocation` action.
# For details, check out: https://argoproj.github.io/argo-workflows/configure-artifact-repository/#configuring-aws-s3
# For details, check out: https://argo-workflows.readthedocs.io/en/latest/configure-artifact-repository/#configuring-aws-s3
region: us-west-2
# accessKeySecret and secretKeySecret are secret selectors.
# It references the k8s secret named 'my-s3-credentials'.
@@ -1,5 +1,5 @@
# This example demonstrates the ability to use intermediate parameter.
# See https://argoproj.github.io/argo-workflows/intermediate-inputs/ for details.
# See https://argo-workflows.readthedocs.io/en/latest/intermediate-inputs/ for details.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
@@ -1,5 +1,5 @@
# this example shows how to use key-only artifacts - introduced in v3.0
# https://argoproj.github.io/argo-workflows/key-only-artifacts/
# https://argo-workflows.readthedocs.io/en/latest/key-only-artifacts/
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
@@ -32,7 +32,7 @@ spec:
bucket: my-bucket
# Specify the bucket region. Note that if you want Argo to figure out this automatically,
# you can set additional statement policy that allows `s3:GetBucketLocation` action.
# For details, check out: https://argoproj.github.io/argo-workflows/configure-artifact-repository/#configuring-aws-s3
# For details, check out: https://argo-workflows.readthedocs.io/en/latest/configure-artifact-repository/#configuring-aws-s3
region: us-west-2

# NOTE: by default, output artifacts are automatically tarred and gzipped before saving.
2 changes: 1 addition & 1 deletion examples/workflows/upstream/retry_backoff.py
@@ -37,7 +37,7 @@
command=["python", "-c"],
args=["import random; import sys; exit_code = random.choice([0, 1, 1]); sys.exit(exit_code)"],
retry_strategy=RetryStrategy(
limit=10,
limit="10",
backoff=m.Backoff(
duration="1",
factor="2",
2 changes: 1 addition & 1 deletion examples/workflows/upstream/retry_container.py
@@ -6,5 +6,5 @@
image="python:alpine3.6",
command=["python", "-c"],
args=["import random; import sys; exit_code = random.choice([0, 1, 1]); sys.exit(exit_code)"],
retry_strategy=RetryStrategy(limit=10),
retry_strategy=RetryStrategy(limit="10"),
)
2 changes: 1 addition & 1 deletion examples/workflows/upstream/retry_script.py
@@ -1,7 +1,7 @@
from hera.workflows import RetryStrategy, Workflow, script


@script(image="python:alpine3.6", retry_strategy=RetryStrategy(limit=10), add_cwd_to_sys_path=False)
@script(image="python:alpine3.6", retry_strategy=RetryStrategy(limit="10"), add_cwd_to_sys_path=False)
def retry_script():
import random
import sys
2 changes: 0 additions & 2 deletions examples/workflows/volume_mounts_nfs.py
@@ -28,8 +28,6 @@ def foo():
server="your.nfs.server",
mount_path="/mnt/nfs",
path="/share/nfs",
size="1Gi",
storage_class_name="nfs-client",
)
],
entrypoint="d",
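Dropping `size` and `storage_class_name` from the NFS volume above is exactly the kind of bug the stricter typing surfaced: an NFS mount has neither field, and the extra keyword arguments previously slipped through unchecked. A stdlib-only illustration of a strict model rejecting unknown fields (class name hypothetical, not hera's actual `NFSVolume`):

```python
from dataclasses import dataclass


@dataclass
class NFSVolumeSketch:
    """Illustrative stand-in for an NFS volume model: NFS mounts
    take no size or storage class, so those fields don't exist."""

    server: str
    mount_path: str
    path: str


# The valid construction, matching the corrected example above.
volume = NFSVolumeSketch(
    server="your.nfs.server", mount_path="/mnt/nfs", path="/share/nfs"
)

try:
    # An unexpected keyword raises TypeError at runtime; mypy flags
    # the same call statically as a [call-arg] error.
    NFSVolumeSketch(server="s", mount_path="/m", path="/p", size="1Gi")  # type: ignore[call-arg]
except TypeError:
    pass
```

A model that silently ignores extra fields hides mistakes like this; a strict one turns them into immediate errors, which is what the upgraded mypy configuration now catches before runtime.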