
Issue with Spectral Extension Workflow: Failure with Custom Date Ranges #209

Closed
click2cloud-tejas opened this issue Nov 28, 2024 · 2 comments
Labels
bug Something isn't working triage Issues still not triaged by team workflows Issues encountered when running workflows

Comments


click2cloud-tejas commented Nov 28, 2024

In which step did you encounter the bug?

Workflow execution

Are you using a local or a remote (AKS) FarmVibes.AI cluster?

Local

Bug description

I am encountering an issue with the Spectral Extension Workflow in FarmVibes.AI. When I run the workflow with a date range other than the one specified in the example notebook, it fails at the compute_onnx task with the following error:

RuntimeError: Failed to run op compute_onnx_from_sequence for input...
InvalidProtobuf: Load model failed: Protobuf parsing failed.

However, when I use the exact date range provided in the workflow notebook, the workflow executes successfully and I obtain the expected output.

Steps to reproduce the problem

1. Modify the date range in the Spectral Extension Workflow notebook to any range other than the one provided in the example notebook.
2. Run the workflow.
3. Observe that the workflow fails at the compute_onnx task.
4. Reset the date range to the original values from the notebook and rerun the workflow.
5. Observe that the workflow completes successfully.
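For reference, a minimal sketch of how the custom inputs are set up, assuming a local FarmVibes.AI cluster and the `vibe_core` Python client. The workflow name, run name, and bounding-box coordinates below are assumptions for illustration, not taken from the notebook:

```python
from datetime import datetime

# Custom date range: any range other than the one in the example
# notebook triggered the compute_onnx failure described above.
time_range = (datetime(2021, 2, 1), datetime(2021, 4, 30))

# Hypothetical bounding box (min_lon, min_lat, max_lon, max_lat);
# the client takes a shapely geometry built from coordinates like these.
bbox = (-88.062, 37.081, -88.057, 37.085)

# Submitting the run requires a live cluster, so that part is shown
# commented out (workflow and run names are assumed, not verified):
# from shapely.geometry import box
# from vibe_core.client import get_default_vibe_client
# client = get_default_vibe_client()
# run = client.run(
#     "farm_ai/agriculture/spectral_extension",  # assumed workflow name
#     "spectral-extension-custom-dates",         # assumed run name
#     geometry=box(*bbox),
#     time_range=time_range,
# )
# run.monitor()
```

Only the date range differs between the failing and the passing run, which points at something date-dependent in the model-loading path rather than at the inputs themselves.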

@click2cloud-tejas click2cloud-tejas added the bug Something isn't working label Nov 28, 2024
@github-actions github-actions bot added triage Issues still not triaged by team workflows Issues encountered when running workflows labels Nov 28, 2024
@renatolfc
Contributor

Our image build pipeline was not pulling the model weights from Git LFS.

The pipeline has been updated to do so, and the images have been rebuilt.

Rebuilding the cluster should fix this issue.
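For anyone else hitting this, a sketch of the rebuild, assuming the standard `farmvibes-ai` CLI; exact command names may vary by release, so treat this as an ops outline rather than verified commands:

```shell
# Make sure Git LFS is available so model weights are fetched
# (assumption: this was the piece missing from the image build).
git lfs install

# Tear down and recreate the local cluster so it pulls the fixed images.
farmvibes-ai local destroy
farmvibes-ai local setup
```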

@click2cloud-tejas
Author

Hi @renatolfc, thanks for your help. The issue is resolved.
