
feat: allow arbitrary length API tokens #752

Open
wants to merge 4 commits into main

Conversation

kamushadenes

Context

We have developed a soon-to-be-open-source proxy that enforces specific labels in order to provide scoped API access, without exposing the real API token. It was created to give better control over resources inside the same project (API tokens currently lack granularity) and to allow a single project to be used securely, given that it isn't possible to create a project via the API.

One of its operating modes uses a JWT as a virtual, self-validating token, which cannot have a fixed size.

This support is required to make full use of it inside a Kubernetes cluster.

The feature is behind a default-false flag so it shouldn't interfere with current behavior.
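As a rough illustration of the flag-gated approach described above, here is a minimal Go sketch of how such a check in internal/config/config.go could look. The environment variable name ALLOW_ARBITRARY_TOKEN_LENGTH, the readToken helper, and the error message are illustrative assumptions, not the actual diff:

```go
package config

import (
	"fmt"
	"os"
)

// readToken is a hypothetical helper showing a flag-gated token length check.
func readToken() (string, error) {
	token := os.Getenv("HCLOUD_TOKEN")
	if token == "" {
		return "", fmt.Errorf("environment variable \"HCLOUD_TOKEN\" is required")
	}

	// Classic hcloud API tokens are 64 characters long. The default-false
	// flag skips this check so variable-length JWTs issued by the proxy can
	// be passed through unchanged.
	if os.Getenv("ALLOW_ARBITRARY_TOKEN_LENGTH") != "true" && len(token) != 64 {
		return "", fmt.Errorf("entered token is invalid (must be exactly 64 characters long)")
	}

	return token, nil
}
```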

Related

kubernetes/autoscaler#7285
hetznercloud/csi-driver#724

@apricote
Member

Hey @kamushadenes,

I will respond here, but the same applies to the other two PRs:

This sounds very interesting. I am not sure if a flag is necessary, or if we just want to drop the whole validation. I will talk to the team responsible for tokens next week and report back afterwards.

@apricote
Member

apricote commented Oct 7, 2024

Hey @kamushadenes,

we talked about this today internally. We would prefer not to add any additional flags or environment variables to disable the check.

We also do not think that the check is strictly necessary. Instead, we would prefer to change the error when len(token) != 64 into a warning message but still accept the token. This would also help us in the future if we ever change the format of our API tokens.

Do you want to update your PRs to log warnings, or should we work on that?
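A minimal sketch of the warning-only variant suggested here, assuming the klog logger; the readToken helper and messages are illustrative, not the final change:

```go
package config

import (
	"fmt"
	"os"

	"k8s.io/klog/v2"
)

// readToken is a hypothetical helper showing the warning-only variant.
func readToken() (string, error) {
	token := os.Getenv("HCLOUD_TOKEN")
	if token == "" {
		return "", fmt.Errorf("environment variable \"HCLOUD_TOKEN\" is required")
	}

	// Accept the token regardless of length, but warn when it does not match
	// the current 64-character format so unexpected formats remain visible.
	if len(token) != 64 {
		klog.Warningf("unrecognized token format: expected 64 characters, got %d", len(token))
	}

	return token, nil
}
```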

@apricote apricote added the enhancement New feature or request label Oct 7, 2024
@kamushadenes
Author

Hey @apricote, thanks for getting back!

Makes total sense, I'll update the PRs in a couple hours when my day starts.

@kamushadenes kamushadenes changed the title Allow arbitrary length API token behind a flag Allow arbitrary length API token Oct 7, 2024
@kamushadenes
Author

Done!

@apricote apricote changed the title Allow arbitrary length API token feat: allow arbitrary length API tokens Oct 7, 2024
Member

@apricote apricote left a comment


Was about to leave a comment regarding the unit test :D

Do not worry about the e2e tests, they do not work for PRs from forks.


codecov bot commented Oct 7, 2024

Codecov Report

Attention: Patch coverage is 0% with 1 line in your changes missing coverage. Please review.

Project coverage is 71.73%. Comparing base (6b132c1) to head (d989b20).

Files with missing lines     Patch %   Lines
internal/config/config.go    0.00%     1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #752      +/-   ##
==========================================
- Coverage   72.00%   71.73%   -0.27%     
==========================================
  Files          31       30       -1     
  Lines        2650     2473     -177     
==========================================
- Hits         1908     1774     -134     
+ Misses        553      525      -28     
+ Partials      189      174      -15     

