FEATURE: Check and mitigate missing plugin dependencies on upgrade #225

Open
RoboMagus opened this issue May 31, 2022 · 6 comments

@RoboMagus

Is your feature request related to a problem? Please describe.
After the update to Octoprint 1.8.x, some plugins broke because one of their dependencies was no longer present in the Octoprint Image. See this issue on the main Octoprint repo.

Describe the solution you'd like
It is clearly out of scope for Octoprint itself to check for these kinds of breaking changes, since in a regular Python environment dependencies don't simply go missing. It only becomes a problem here because the environment changes when a different / newer Docker image is used.

Manual mitigation is fairly straightforward for those of us savvy enough, but to ensure stability between upgrades a built-in solution would be preferred. E.g. on startup of the Docker container, if the Octoprint version changed or if it is a clean start of an image, run a check on all listed dependencies of installed plugins and, if any have gone missing, install them before starting Octoprint itself.

Important to note here is that we should not just rerun the plugin setups 'just to be sure', as this might have unintended side effects (like reverting local file modifications). The check and mitigation should only touch listed dependencies that are actually missing.
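
Roughly what I have in mind, as a sketch only (nothing like this exists in the image's entrypoint today; the script name is hypothetical, and it assumes PYTHONUSERBASE=/octoprint/plugins as well as an importable "packaging" module):

#!/usr/bin/env python3
# check_plugin_deps.py (hypothetical) -- run before starting Octoprint itself.
import site
import subprocess
import sys
from importlib import metadata

from packaging.requirements import Requirement

def find_missing():
    # Only inspect distributions living on the persistent volume, i.e. the user site
    # that resolves under PYTHONUSERBASE: the installed plugins and their dependencies.
    user_site = site.getusersitepackages()
    missing = []
    for dist in metadata.distributions():
        if not str(dist.locate_file("")).startswith(user_site):
            continue
        for req_str in dist.requires or []:
            req = Requirement(req_str)
            if req.marker and not req.marker.evaluate({"extra": ""}):
                continue  # requirement does not apply to this environment
            try:
                metadata.version(req.name)
            except metadata.PackageNotFoundError:
                missing.append(str(req))
    return missing

if __name__ == "__main__":
    to_install = find_missing()
    if to_install:
        # Install only what is missing; never rerun whole plugin setups.
        subprocess.check_call([sys.executable, "-m", "pip", "install", "--user", *to_install])

With PYTHONUSERBASE still pointing at the volume, the pip install --user call would put the recovered dependencies back onto /octoprint/plugins instead of into the image.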

Describe alternatives you've considered
Manual mitigation whenever this occurs. The frequency should not be that high, but the issue would remain for people not familiar enough with how the plugin environment works in Docker-based installations.

Additional context
Logging, additional info, and input from others can be found in the original issue on the main Octoprint repo.

@LongLiveCHIEF
Member

How are you mounting your volumes with regard to paths? The image is designed so that you should mount only the /octoprint volume, and that means an upgrade (including the python packages that would be installed with the upgrade) should be handled correctly.

The docker image explicitly sets the PYTHONUSERBASE to /octoprint/plugins so that any python modules installed go onto the volume mount instead of the python system path.
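
For illustration, a quick check from inside the container shows where the user site resolves (the exact python version in the paths depends on the image, so treat them as assumptions):

import os
import site

print(os.environ.get("PYTHONUSERBASE"))   # /octoprint/plugins in this image
print(site.getusersitepackages())         # e.g. /octoprint/plugins/lib/python3.8/site-packages
# so "pip install --user <package>" writes onto the volume mount,
# not into /usr/local/lib/python3.8/site-packages inside the image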

I also echo the comments in the thread you opened on the octoprint repo, in that plugins need to explicitly mention dependencies they have as part of their package, and not implicitly rely on a package that may have been on the system path of their development environment or the octoprint package, ensuring they don't break when octoprint upgrades.
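
As an example, a plugin's setup.py should spell its requirements out explicitly (a hypothetical, minimal excerpt; the plugin name is just a placeholder):

from setuptools import setup

setup(
    name="OctoPrint-ExamplePlugin",   # placeholder
    version="0.1.0",
    install_requires=[
        "six",             # listed explicitly, even if the base image currently ships it
        "requests>=2.0",
    ],
)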

@RoboMagus
Author

The docker image explicitly sets the PYTHONUSERBASE to /octoprint/plugins so that any python modules installed go onto the volume mount instead of the python system path.

That is indeed the case for anything installed as a result of installing a plugin. However, it is not the case when a dependency is already satisfied by a module installed as part of the Octoprint environment contained in the Docker image. When the image is then updated, the dependency breaks.

I also echo the comments in the thread you opened on the octoprint repo, in that plugins need to explicitly mention dependencies they have as part of their package, and not implicitly rely on a package that may have been on the system path of their development environment or the octoprint package, ensuring they don't break when octoprint upgrades.

That is also very much true, any dependency of a plugin should be listed, but that is not the issue at hand. The problem occurs even with properly listed dependencies that were satisfied by modules installed in the Octoprint Docker image and that, after an update of the image, are removed (as was the case for six).
This is backed up by looking at a plugin like Octolapse. It lists six as one of its dependencies, but that module was not installed in the persistent volume because the dependency was already satisfied by the module shipped as part of the Octoprint image.
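
A quick way to see this from inside a 1.7 container (illustrative only; the path matches the "Requirement already satisfied" line from the Octolapse Installation.log quoted in the reproduction steps below):

import importlib.util

spec = importlib.util.find_spec("six")
print(spec.origin if spec else "six not installed")
# on the 1.7 image this points into /usr/local/lib/python3.8/site-packages,
# i.e. the image itself, so nothing for six was ever written to the volume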

What I am suggesting is that upon upgrade of the Octoprint Docker image, a check could be performed to verify that all plugin dependencies are (still) satisfied after switching to a newer image and, if not, to install the dependencies that have gone missing.

@RoboMagus
Author

Steps to reproduce:

1. Create bare Octoprint Docker container in compose based on Octoprint 1.7:

octoprint:
  image: octoprint/octoprint:1.7
  container_name: octoprint
  restart: unless-stopped
  privileged: true
  volumes:
    - ./Volumes/octoprint:/octoprint
  ports:
    - 5000:80

2. Bare configuration + Install Octolapse plugin

  • Note the line that says Requirement already satisfied: six in /usr/local/lib/python3.8/site-packages (from Octolapse==0.4.1) (1.16.0) in the Octolapse Installation.log
  • Directory structure does not show modules that are already satisfied by modules installed in the Docker image: Persistent volume Tree

3. Update docker image to 1.8 in compose:

octoprint:
  image: octoprint/octoprint:1.8
  container_name: octoprint
  restart: unless-stopped
  privileged: true
  volumes:
    - ./Volumes/octoprint:/octoprint
  ports:
    - 5000:80

4. Verify the missing module in the Octoprint container log when recreating the container (see the check below).
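
A minimal check run inside the recreated container (e.g. via docker exec -it octoprint python3) also shows that the dependency is gone, even though Octolapse still declares it:

import importlib.util

print(importlib.util.find_spec("six"))
# prints None on the 1.8 image: the volume only ever held the plugin itself,
# so the dependency disappeared together with the old image's site-packages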

@LongLiveCHIEF
Member

That helps a bunch.

What I am suggesting is that upon upgrade of the Octoprint Docker image, a check could be performed to verify that all plugin dependencies are (still) satisfied after switching to a newer image and, if not, to install the dependencies that have gone missing.

Yeah, this could probably be done. I do think that this is a problem unique to docker, in the event that you switch main image versions.

So far, the advice on upgrading octoprint in docker has been murky, because there are a lot of gaps just like this one that I anticipated, but hadn't uncovered.

One thing to keep in mind is that the octoprint docker image has an image version of its own, and generally speaking, as long as you keep the same image version, we prefer you use octoprint's internal upgrade feature to upgrade your version of octoprint within your container.

If you upgrade your container, right now the best method has proven to be to export your config, start the new container, then import.

Right now, those are my advised upgrade paths, as the export/import method will re-run those plugin installs.

I do think, however, that this could be made smoother, and although there is a lot more to it than just running this check, this would be a big part.

@LongLiveCHIEF LongLiveCHIEF changed the title Check and mitigate missing plugin dependencies on upgrade FEATURE: Check and mitigate missing plugin dependencies on upgrade Jun 20, 2022
@LongLiveCHIEF LongLiveCHIEF added this to Planned in Roadmap Jun 20, 2022
@RoboMagus
Author

RoboMagus commented Jun 23, 2022

One thing to keep in mind is that the octoprint docker image has an image version of its own, and generally speaking, as long as you keep the same image version, we prefer you use octoprint's internal upgrade feature to upgrade your version of octoprint within your container.

The whole appeal of running applications / services in Docker containers is the ease of updating through pulling a new base image. This way the complete environment for the application / service is in an exact, known state for each update. Sticking to the initially spun up container and performing updates inside the container should not be advised and isn't considered best practice.

I do agree on the second workaround though, but it's a bit more hands-on, even compared to just updating the base image and fixing whatever breaks.

Happy to tag along in searching for a proper solution, as this is quite an interesting case to solve.

@LongLiveCHIEF
Member

The whole appeal of running applications / services in Docker containers is the ease of updating through pulling a new base image.

The appeal of running applications/services in Docker containers, the reason they were designed, is that an application's dependencies can be isolated from the host operating system, and that a container's external needs can be configured by a sysadmin while the developer can specify those needs but program against local resources.

Sticking to the initially spun up container and performing updates inside the container should not be advised and isn't considered best practice.

This is considered best practice for stateless containers.

Many applications and services do not actually behave this way as containers, or if they appear to do so to the end user, they have internal scripts as part of the container's entrypoint that detect and run migrations.

All that is besides the point though.

Octoprint is not a traditional docker container to begin with, as it has a process supervisor and runs multiple processes. We call this out in the opening statement of the README. The whole point of containers is to provide abstraction from the host... yet Octoprint requires pretty tight host integration, and it's done in such a way that some things are going to have to be workarounds if you want to use containers.

The purpose of the Octoprint docker image isn't to be an easy one-click install and service provider for OctoPrint.

It should only be used when docker is the only answer for the user's needs, and the user is very familiar with docker.

Containers seem to have become synonymous with "easy service access". That's not due to the design of docker; it's due to the efforts of the owners and maintainers of applications who take the effort to make processes like these seem plug-n-play or version-swappable to the end user.
