Max tasks across dags stuck at 487 #44293
-
Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise a PR to address this issue, please do so; no need to wait for approval.
-
Converted it to a discussion - this is not a reproducible issue with Airflow but a configuration question (it looks like a limitation of your K8s setup, but it is hard to say; maybe someone will be able to help you).
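If the bottleneck is on the Kubernetes side, as suggested above, a few first checks could look like the following. This is a hedged sketch: the `airflow` namespace is an assumption, and the per-node pod limits are AKS defaults that vary by network plugin.

```bash
# Assumption: Airflow runs in the "airflow" namespace; substitute your own.
# How many worker pods is the Kubernetes executor actually running?
kubectl get pods -n airflow --field-selector=status.phase=Running | wc -l

# Is a ResourceQuota or LimitRange capping the pod count in this namespace?
kubectl get resourcequota,limitrange -n airflow

# Total pod capacity of the cluster: AKS defaults to roughly 30 pods per
# node (Azure CNI) or 110 (kubenet), so node count x max-pods-per-node can
# itself be the ceiling on concurrent tasks.
kubectl describe nodes | grep -i "pods:"
```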
-
Apache Airflow version
Other Airflow 2 version (please specify below)
If "Other Airflow 2 version" selected, which one?
2.8.1
What happened?
I have Airflow running on an AKS cluster with just the Kubernetes executor.
The following configs are set to attain maximum concurrency (a sketch of the equivalent environment overrides follows the list):
Parallelism: 128 (per scheduler)
Number of schedulers: 10
Default pool size: 1200
DB: Postgres
Airflow version: 2.8.1
Platform: Ubuntu
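For reference, a minimal sketch of the same values expressed as environment-variable overrides of the standard `[core]` options; the `max_active_tasks_per_dag` value is an illustrative assumption, not taken from the report above:

```bash
# Standard [core] options as environment variables (Airflow 2.x naming).
# parallelism is enforced per scheduler process, so with 10 schedulers
# this should allow up to 10 x 128 = 1280 running tasks in theory.
export AIRFLOW__CORE__PARALLELISM=128

# Slots in the shared default pool.
export AIRFLOW__CORE__DEFAULT_POOL_TASK_SLOT_COUNT=1200

# Per-DAG cap; the default is 16 and can silently throttle workloads
# where many tasks belong to the same DAG. 1000 is an example value.
export AIRFLOW__CORE__MAX_ACTIVE_TASKS_PER_DAG=1000
```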
We are running big data workloads, and the number of concurrently running tasks can reach 1000.
However, with the above configuration, the number of running tasks across all DAGs is stuck at the magic number 487, and the remaining tasks stay queued.
I have checked the docs for any other parameters that would need to be set, but only the ones above are mentioned.
I would appreciate suggestions from the community to identify the missing configuration; a few diagnostic checks are sketched below.
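For anyone debugging the same symptom, a hedged diagnostic sketch using the standard Airflow CLI and the metadata DB (the psql connection details are placeholders; point them at your own Postgres instance):

```bash
# What does the scheduler actually see? Helm values and airflow.cfg can
# drift apart, so read the effective config back.
airflow config get-value core parallelism
airflow config get-value core default_pool_task_slot_count
airflow config get-value core max_active_tasks_per_dag

# Does default_pool really have 1200 slots, and how many are in use?
airflow pools list

# Count task instances by state in the metadata DB.
# Placeholders: <postgres-host>, plus the "airflow" user/database names.
psql -h <postgres-host> -U airflow -d airflow -c \
  "SELECT state, COUNT(*) FROM task_instance
   WHERE state IN ('running', 'queued') GROUP BY state;"
```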
Thanks
What you think should happen instead?
No response
How to reproduce
This setup is running on Azure Kubernetes Service (AKS).
With the Kubernetes executor, the number of concurrent tasks is stuck at a maximum of 487.
Operating System
Ubuntu
Versions of Apache Airflow Providers
No response
Deployment
Official Apache Airflow Helm Chart
Deployment details
No response
Anything else?
No response
Are you willing to submit PR?
Code of Conduct