
[python] secure python package software supply chain to comply with partner org requirements #19401

Closed · 7 tasks done
timothytrippel opened this issue Aug 9, 2023 · 0 comments · Fixed by #19531
Assignees: timothytrippel
Labels: Component:CI (Continuous Integration, Azure Pipelines & Co.), Component:Software (Issue related to Software)

timothytrippel commented Aug 9, 2023

Some partner organizations have security requirements for how Python packages are installed. Namely:

  • all Python packages (including transitive dependencies) should be specified in the `python-requirements.txt` file
  • Python packages in `python-requirements.txt` should be pinned to specific versions (already done)
  • Python packages in `python-requirements.txt` should have associated expected hashes
  • when downloading/installing Python packages with `pip install -r python-requirements.txt`, the `--require-hashes` flag should always be passed (see the example below)
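
For illustration, a hash-pinned entry in `python-requirements.txt` looks roughly like the following; the package name, version, URL tag, and hash values are placeholders, not the project's real pins:

```text
# A pinned PyPI package with its expected hashes; pip verifies the downloaded
# sdist/wheel against these before installing.
somepackage==1.2.3 \
    --hash=sha256:<hash-of-sdist> \
    --hash=sha256:<hash-of-wheel>

# A direct HTTPS reference to a GitHub-hosted zip archive can also carry a hash.
fusesoc @ https://github.com/lowRISC/fusesoc/archive/refs/tags/<tag>.zip \
    --hash=sha256:<hash-of-zip-archive>
```

With such a file, `pip install --require-hashes -r python-requirements.txt` refuses to install anything whose hash is missing or does not match.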

To comply with these requirements (needed to continue running regressions), we need to update how our Python packages are declared and maintained in our repo. Specifically, we need to:

Addressed in lowRISC/fusesoc#4:

  • update the lowRISC fork of fusesoc to add a `fallback_version` to the `use_scm_version` field in `setup.py`, so `pip-compile` can generate the hash for this dep (a sketch of this change follows below)
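
A minimal sketch of that change, assuming the fork configures `setuptools_scm` through the `use_scm_version` argument; the fallback version string here is purely illustrative:

```python
# setup.py (sketch, not the actual lowRISC/fusesoc setup.py)
from setuptools import setup

setup(
    name="fusesoc",
    # setuptools_scm normally derives the version from git metadata; the
    # fallback_version is used when that metadata is unavailable, e.g. when
    # pip-compile resolves the package from a plain zip archive download.
    use_scm_version={"fallback_version": "0.0.post0"},
    setup_requires=["setuptools_scm"],
)
```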

Addressed in #19531:

  • add the `pip-tools` package as a new dep, so we can run `pip-compile --generate-hashes python-requirements.in` to auto-generate the `python-requirements.txt` file we check in to this repo, such that it contains:
    • pinned versions, and
    • expected hashes,
    • for all deps (including transitive deps)
  • change the git VCS references to the fusesoc, edalize, and chipwhisperer packages to plain HTTPS URLs pointing at GitHub-hosted zip archives (so `pip-compile` can generate the hashes)
  • move the current `python-requirements.txt` file to `python-requirements.in`
  • autogenerate a `python-requirements.txt` file with `pip-compile --generate-hashes python-requirements.in` and check it into the repo
  • add a CI check to ensure the auto-generated `python-requirements.txt` file checked in to the repo does not go stale
  • update the website documentation on how to add Python deps, i.e.:
    1. update the `python-requirements.in` file to add your dep (pinned to a specific version)
    2. run `pip-compile --generate-hashes python-requirements.in` to autogenerate the new `python-requirements.txt` file
    3. check both files into the repo
    4. to install Python packages securely, run `pip install --require-hashes -r python-requirements.txt` (a consolidated example workflow is shown below)
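
Putting the documentation steps together, the day-to-day workflow would look roughly like this; the commands are the ones listed above, while the staleness check at the end is only one plausible way a CI job could detect a stale checked-in file, not necessarily the check that was implemented:

```sh
# 1. Add the new dependency, pinned to a specific version, to python-requirements.in.
# 2. Regenerate the fully pinned, hash-annotated requirements file.
pip-compile --generate-hashes python-requirements.in
# 3. Check both python-requirements.in and python-requirements.txt into the repo.
# 4. Install with hash verification enforced.
pip install --require-hashes -r python-requirements.txt

# Possible CI staleness check (illustrative): regenerate into a temporary file
# and diff it against the checked-in python-requirements.txt.
pip-compile --generate-hashes --output-file=/tmp/python-requirements.txt python-requirements.in
diff -u python-requirements.txt /tmp/python-requirements.txt
```
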
@timothytrippel timothytrippel added the Component:Software and Component:CI labels Aug 9, 2023
@timothytrippel timothytrippel added this to the Discrete: M2.5.5 milestone Aug 9, 2023
@timothytrippel timothytrippel self-assigned this Aug 9, 2023
timothytrippel added a commit that referenced this issue Sep 14, 2023
To comply with partner organization Python package supply chain
requirements, we add hashes for all Python packages (including all
transitive dependencies). To do so, we:
1. move the existing `python-requirements.txt` file to
   `python-requirements.in`, as this becomes the input to the tool
   (i.e., `pip-compile`) that generates the `python-requirements.txt` file
   we check in,
2. add `pip-tools` as a project dependency, since it contains the
   `pip-compile` tool,
3. add the `importlib-resources` and `pkgutil_resolve_name` dependencies, as
   these do not seem to be pinned by the `jsonschema` package and cause CI
   errors when pinning hashes,
4. change the git VCS references to the fusesoc, edalize, and chipwhisperer
   packages to plain HTTPS URLs pointing at GitHub-hosted zip archives (so
   `pip-compile` can generate the hashes), and
5. autogenerate a `python-requirements.txt` file with `pip-compile
   --generate-hashes python-requirements.in` and check it into the repo.

This partially addresses #19401.

Signed-off-by: Tim Trippel <ttrippel@google.com>
timothytrippel added a commit that referenced this issue Sep 14, 2023
This adds a CI check to ensure the auto-generated `python-requirements.txt`
file checked in to the repo does not go stale. This partially addresses #19401.

Signed-off-by: Tim Trippel <ttrippel@google.com>
timothytrippel added a commit that referenced this issue Sep 14, 2023
To fix #19401, we have a new process for adding Python packages to the
project.

Signed-off-by: Tim Trippel <ttrippel@google.com>
mundaym pushed a commit that referenced this issue Nov 13, 2023
To fix #19401, we have a new process for adding Python packages to the
project.

Signed-off-by: Tim Trippel <ttrippel@google.com>
msfschaffner pushed a commit to msfschaffner/opentitan that referenced this issue Nov 13, 2023
To fix lowRISC#19401, we have a new process for adding Python packages to the
project.

Signed-off-by: Tim Trippel <ttrippel@google.com>