
Policy lock exception, HTTP Status: 423 Locked #2151

Open
supratikg opened this issue Dec 3, 2024 · 6 comments

@supratikg

Hello,

We have hundreds of Okta group rules that are updated simultaneously. We first create a group and then a group rule that uses the group in its policy (a minimal sketch follows the error below). The code works fine, but while creating the resources, some of the group rule requests occasionally fail with the following error:

E0000239: Policy lock exception
HTTP Status: 423 Locked
Policy priorities are being reconciled. Please try again later.
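
For reference, a minimal sketch of the pattern (resource names, the expression, and attribute values here are illustrative, not our real configuration):

```hcl
# Illustrative group/rule pair; our real configuration defines hundreds of these.
resource "okta_group" "engineering" {
  name        = "Engineering"
  description = "Managed by Terraform"
}

resource "okta_group_rule" "engineering" {
  name              = "engineering-rule"
  status            = "ACTIVE"
  group_assignments = [okta_group.engineering.id]
  expression_value  = "user.department == \"Engineering\"" # placeholder expression
}
```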

We discussed the problem with Okta, and they mentioned that "the SDK does not have parallelism built into it". Since Terraform runs 10 parallel operations by default, it runs into this problem. Okta suggests this should be handled on the client side, which in this case is the Okta Terraform provider.

The error no longer appears when we run apply with the flag -parallelism=1; however, this considerably increases how long the apply takes.
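
The exact workaround we run, for reference:

```console
# Serializes all resource operations; avoids the 423 but is slow at our scale
terraform apply -parallelism=1
```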

Could you please look into the issue and provide a fix?

Thanks in advance

Terraform Version

Terraform v1.9.5

Affected Resource(s)

  • okta_group_rule

Expected Behavior

No error

Can this be done in the Admin UI?

I don't know

Can this be done in the actual API call?

I don't know

Steps to Reproduce

  1. Create 100+ groups
  2. Create 100+ group rules that use the groups from step 1 in the group rule policy (see the sketch after this list)
  3. terraform apply
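
A hedged reproduction sketch (the count, names, and rule expression are placeholders):

```hcl
# Generates 150 group/group-rule pairs: enough concurrent okta_group_rule
# creations to hit E0000239 at Terraform's default parallelism of 10.
resource "okta_group" "repro" {
  count = 150
  name  = "repro-group-${count.index}"
}

resource "okta_group_rule" "repro" {
  count             = 150
  name              = "repro-rule-${count.index}"
  status            = "ACTIVE"
  group_assignments = [okta_group.repro[count.index].id]
  expression_value  = "user.login != \"\"" # placeholder expression
}
```
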
@duytiennguyen-okta
Contributor

You will have to manage it by specifying -parallelism=<n>. There is no way around that.

@supratikg
Author

@duytiennguyen-okta I think it would be really great if you could document this limitation on the okta_group_rule resource page. That would save a lot of precious time. :-)

Thank you so much

@supratikg
Author

@duytiennguyen-okta I would ask Okta to reconsider the decision not to fix this issue. With only one thread, the deployment takes a very long time to finish, and it will only get slower as more and more resources are managed under Terraform.

Thanks

@exitcode0
Contributor

@duytiennguyen-okta
Should the API return an HTTP 429 here instead of an HTTP 423?
If the API returned a 429, the Okta Terraform provider could reuse its existing API rate-limiting and back-off functionality.

Is the lack of concurrency support on this API documented anywhere? I think this is the first Okta API I've come across where I can't update multiple resources concurrently.

@duytiennguyen-okta
Contributor

duytiennguyen-okta commented Dec 12, 2024

@supratikg One thing you can do is file a support ticket to have your Okta account manager adjust the API rate limits at a granular level for your account. You can see your rate limits in the Admin Console under Reports > Rate Limits. You can also try increasing parallelism to more than 1. I will also need the support ticket to raise the issue with the API team about potentially switching from 423 to 429 and using the backoff strategy that we have.
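
As a sketch only (argument availability depends on your provider version; check the provider docs), the provider exposes rate-limit tuning at the provider block level, though its built-in backoff currently keys off 429 responses rather than 423:

```hcl
# Sketch: provider-level throttling/backoff knobs (429-driven per current behavior);
# org_name and values below are placeholders, not a recommendation.
provider "okta" {
  org_name         = "example-org"
  base_url         = "okta.com"
  backoff          = true # exponential backoff on rate-limit errors
  min_wait_seconds = 30
  max_wait_seconds = 300
  max_retries      = 5
  max_api_capacity = 50 # consume at most 50% of the org's rate-limit budget
}
```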

@duytiennguyen-okta
Contributor

duytiennguyen-okta commented Dec 12, 2024

@exitcode0 It is not that the Okta API does not support concurrency; I think it is hitting rate limits. @supratikg mentioned that he has hundreds of groups, plus group rules on top of that. My expectation is that both fall under the api/v1/groups rate-limit bucket, which is why it returns 423. As for returning 429 versus 423, I will have to speak with the API team about that.
