@cdenneen when we were doing the design for PCT, at least for the early stages, we wanted to be less prescriptive about managing and maintaining code files over the longer lifecycle; we scoped PCT towards laying files and folders down. One of the reasons for this was the incompatibility introduced between the PDK and modulesync, which forced people to rework how they managed their code bases. Our thought was that by not defining an update procedure in PCT - especially as individual values and configurations get exponentially complex - we could leave that functionality to a tool built only to handle it (i.e. modulesync). Does modulesync not meet these needs for you? Or would documenting the use of modulesync with PCT resolve this concern?
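For context, a minimal sketch of what that documentation might show, assuming modulesync's standard configuration-repo layout (a `managed_modules.yml` listing the repos to keep in sync, plus templates under `moduleroot/`); nothing here is PCT-specific:

```shell
# After PCT has laid the files down once, a modulesync configuration repo
# takes over ongoing updates for the repositories listed in managed_modules.yml.
msync update --noop                   # preview what would change, without pushing
msync update -m "sync shared config"  # re-render, commit, and push the changes
```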
There needs to be a way to keep modules, control repositories, etc. in sync with a template.
At some point people will create "artifacts" from these PCTs, but as things evolve the goal will be to update the "upstream" template and then have the "artifacts" pull those updates back in, if they choose.
This comes with some challenges in the current model:
- So you could now have a custom template (or repo, or module) that is a combination of different upstream sources: a base template, maybe a gitlab-ci template, a GH Actions template, etc. You might want to keep all of those in sync.
- You could have `pct update all` that goes through all the upstream pieces and creates an `update_report.txt`.
- You could do `pct update puppetlabs/editorconfig` for a single source.
- You could do a combination of `pct update puppetlabs/puppet-control-repo`, `pct update puppetlabs/editorconfig`, `pct update thirdparty/puppet-fact factname`, `pct update thirdparty/bolt-plan planname` (sketched below).
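As a rough sketch, assuming these `pct update` subcommands existed (they are the proposal above, not current PCT functionality; the template names and report filename are taken from the examples):

```shell
# Refresh every upstream template an artifact was generated from and
# summarise the pending changes in update_report.txt.
pct update all

# Refresh only the files laid down by a single upstream template.
pct update puppetlabs/editorconfig

# Or mix and match several upstream sources in one pass.
pct update puppetlabs/puppet-control-repo
pct update puppetlabs/editorconfig
pct update thirdparty/puppet-fact factname
pct update thirdparty/bolt-plan planname
```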