Hapi pipelines plugin for the Screwdriver API
```javascript
const Hapi = require('@hapi/hapi');
const pipelinesPlugin = require('./');

const init = async () => {
    const server = Hapi.server({ port: 3000 });

    await server.register({
        plugin: pipelinesPlugin,
        options: {}
    });

    await server.start();
    console.log('Server running at:', server.info.uri);
};

init();
```
`page`, `count`, `sort`, `sortBy`, `search`, and `configPipelineId` are optional.

`search` will search for a pipeline with a name containing the search keyword in the `scmRepo` field.
`GET /pipelines?page={pageNumber}&count={countNumber}&configPipelineId={configPipelineId}&search={search}`

Use array format for `ids` to only return pipelines with matching ids:

`GET /pipelines?search={search}&ids[]=12345&ids[]=55555`
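For example, the `ids[]` query can be assembled with Node's built-in `URLSearchParams` (the API host below is a placeholder, not part of this plugin):

```javascript
// Build a pipelines query that filters by multiple ids.
// URLSearchParams percent-encodes each 'ids[]' key as 'ids%5B%5D'.
const params = new URLSearchParams();

params.append('search', 'data-model');
['12345', '55555'].forEach((id) => params.append('ids[]', id));

// Hypothetical API host, shown for illustration only
const url = `https://api.example.com/v4/pipelines?${params.toString()}`;

console.log(url);
```

Servers that parse bracketed query keys will receive `ids` as an array of strings.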
`GET /pipelines/{id}`

Create a pipeline and create a job called `main`:

`POST /pipelines`
Arguments:

* `checkoutUrl` - Source code URL for the application. For a git-based repository, it is typically the SSH endpoint and the branch name, separated by an octothorpe (`#`). Must be unique.
* `rootDir` - *Optional* Root directory where the source code lives. Defaults to empty string.

Example payload:

```json
{
    "checkoutUrl": "git@github.com:screwdriver-cd/data-model.git#master",
    "rootDir": "src/app/component"
}
```
You can update the `checkoutUrl` of a pipeline.

`PUT /pipelines/{id}`

Arguments:

* `checkoutUrl` - Source code URL for the application. For a git-based repository, it is typically the SSH endpoint and the branch name, separated by an octothorpe (`#`). Must be unique.
* `rootDir` - *Optional* Root directory where the source code lives. Defaults to empty string.

Example payload:

```json
{
    "checkoutUrl": "git@github.com:screwdriver-cd/data-model.git#master",
    "rootDir": "src/app/component"
}
```
`DELETE /pipelines/{id}`

- Synchronize the pipeline by looking up the latest `screwdriver.yaml`
- Create, update, or disable jobs if necessary
- Store/update the pipeline `workflowGraph`

`POST /pipelines/{id}/sync`

- Synchronize webhooks for the pipeline
- Add or update webhooks if necessary

`POST /pipelines/{id}/sync/webhooks`

- Synchronize pull requests for the pipeline
- Add or update pull request jobs if necessary

`POST /pipelines/{id}/sync/pullrequests`
`page`, `count`, `sort`, and `prNum` are optional. Only PR events of the specified PR number will be searched when `prNum` is set.

`GET /pipelines/{id}/events?page={pageNumber}&count={countNumber}&sort={sort}&prNum={prNumber}`
`archived` is optional and has a default value of `false`, which makes the endpoint not return archived jobs (e.g. closed pull requests).

`GET /pipelines/{id}/jobs?archived={boolean}`
`GET /pipelines/{id}/admin`

`GET /pipelines/{id}/triggers`

`GET /pipelines/{id}/stages`

`GET /pipelines/{id}/stages?eventId={eventId}`

`GET /pipelines/{id}/secrets`

`GET /pipelines/{id}/metrics`

`GET /pipelines/{id}/metrics?startTime=2019-02-01T12:00:00.000Z`

`GET /pipelines/{id}/metrics?aggregateInterval=week`
Use array format for `downtimeJobs` and `downtimeStatuses`:

`GET /pipelines/{id}/metrics?downtimeJobs[]=123&downtimeJobs[]=456&downtimeStatuses[]=ABORTED`
- Start all child pipelines belonging to this config pipeline at once

`POST /pipelines/{id}/startall`

`POST /pipelines/{id}/token`

`GET /pipelines/{id}/tokens`

`PUT /pipelines/{pipelineId}/tokens/{tokenId}`

`PUT /pipelines/{pipelineId}/tokens/{tokenId}/refresh`

`DELETE /pipelines/{pipelineId}/tokens/{tokenId}`

`DELETE /pipelines/{pipelineId}/tokens`
`GET /pipelines/{id}/jobs/{jobName}/latestBuild`

Can search by build status:

`GET /pipelines/{id}/jobs/{jobName}/latestBuild?status=SUCCESS`
`DELETE /pipelines/{id}/caches?scope={scope}&cacheId={id}`

Path Params:

* `id` - The id of the pipeline

Query Params:

* `scope` - Scope of the cache; supported values are `pipelines|jobs|events`
* `cacheId` - The id of the cache (`pipelineId`/`jobId`/`eventId`, depending on the scope)
```yaml
ecosystem:
    store: 'https://store.screwdriver.cd'
    queue: 'https://queue.screwdriver.cd'
    cache:
        strategy: 's3'
```

Requests are routed to the queue service API if the strategy is `disk`, and to the store API if the strategy is `s3`.
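The routing rule above can be sketched as a small helper. This is illustrative only (the function name is made up; the real API's internals may differ):

```javascript
// Pick the target service for cache requests based on the configured strategy.
// Mirrors the rule above: 'disk' -> queue service API, 's3' -> store API.
const ecosystem = {
    store: 'https://store.screwdriver.cd',
    queue: 'https://queue.screwdriver.cd',
    cache: { strategy: 's3' }
};

function cacheServiceUrl(config) {
    switch (config.cache.strategy) {
        case 'disk':
            return config.queue;
        case 's3':
            return config.store;
        default:
            throw new Error(`Unknown cache strategy: ${config.cache.strategy}`);
    }
}

console.log(cacheServiceUrl(ecosystem)); // 'https://store.screwdriver.cd'
```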
`POST /pipelines/{id}/openPr`
The server supplies factories to plugins in the form of server settings:

```javascript
// handler in pipelinePlugin.js
handler: async (request, h) => {
    const factory = request.server.app.pipelineFactory;
    // ...
}
```
`GET /pipeline/templates`

Can use additional options for sorting, pagination and count:

`GET /pipeline/templates?sort=ascending&sortBy=name&page=1&count=50`

`GET /pipeline/templates/{namespace}/{name}/versions`

Can use additional options for sorting, pagination and count:

`GET /pipeline/templates/{namespace}/{name}/versions?sort=ascending&page=1&count=50`
Creating a template will store the template meta (`name`, `namespace`, `maintainer`, `latestVersion`, `trustedSinceVersion`, `pipelineId`) and template version (`description`, `version`, `config`, `createTime`, `templateId`) into the datastore.

`version` will be auto-bumped. For example, if `mypipelinetemplate@1.0.0` already exists and the version passed in is `1.0.0`, the newly created template will be version `1.0.1`.
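The auto-bump behavior can be sketched as follows. This is a simplified illustration of the rule above (the function name is made up; the API's actual logic may differ):

```javascript
// If the requested version already exists, bump the patch number until
// the version is free; otherwise keep the requested version.
function autoBumpVersion(requested, existingVersions) {
    let [major, minor, patch] = requested.split('.').map(Number);

    while (existingVersions.includes(`${major}.${minor}.${patch}`)) {
        patch += 1;
    }

    return `${major}.${minor}.${patch}`;
}

console.log(autoBumpVersion('1.0.0', ['1.0.0'])); // '1.0.1'
console.log(autoBumpVersion('2.0.0', []));        // '2.0.0'
```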
`POST /pipeline/template`

Arguments: `name`, `namespace`, `version`, `description`, `maintainer`, `config`

* `name` - Name of the template
* `namespace` - Namespace of the template
* `version` - Version of the template
* `description` - Description of the template
* `maintainer` - Maintainer of the template
* `config` - Config of the template. This field is an object that includes `steps`, `image`, and optional `secrets`, `environments`. Similar to what's inside the `pipeline`
Example payload:

```json
{
    "name": "example-template",
    "namespace": "my-namespace",
    "version": "1.3.1",
    "description": "An example template",
    "maintainer": "example@gmail.com",
    "config": {
        "steps": [{
            "echo": "echo hello"
        }]
    }
}
```
Validate a pipeline template and return JSON containing the boolean property `valid`, indicating whether the template is valid.

`POST /pipeline/template/validate`

Arguments: `name`, `namespace`, `version`, `description`, `maintainer`, `config`

* `name` - Name of the template
* `namespace` - Namespace of the template
* `version` - Version of the template
* `description` - Description of the template
* `maintainer` - Maintainer of the template
* `config` - Config of the template. This field is an object that includes `steps`, `image`, and optional `secrets`, `environments`. Similar to what's inside the `pipeline`
Example payload:

```json
{
    "name": "example-template",
    "namespace": "my-namespace",
    "version": "1.3.1",
    "description": "An example template",
    "maintainer": "example@gmail.com",
    "config": {
        "steps": [{
            "echo": "echo hello"
        }]
    }
}
```
`GET /pipeline/template/{namespace}/{name}`

`GET /pipeline/template/{id}`

`GET /pipeline/template/{namespace}/{name}/{versionOrTag}`
Template tags allow fetching a template version by tag. For example, you can tag `mytemplate@1.1.0` as `stable`.
`GET /pipeline/templates/{namespace}/{name}/tags`

Can use additional options for sorting, pagination and count:

`GET /pipeline/templates/{namespace}/{name}/tags?sort=ascending&sortBy=name&page=1&count=50`
If the template tag already exists, it will update the tag with the new version. If the template tag doesn't exist yet, this endpoint will create the tag.

Note: This endpoint is only accessible in `build` scope, and the permission is tied to the pipeline that creates the template.

`PUT /templates/{templateName}/tags/{tagName}` with the following payload:

* `version` - Exact version of the template (ex: `1.1.0`)
Deleting a pipeline template will delete the template and all of its associated tags and versions.

`DELETE /pipeline/templates/{namespace}/{name}`

* `name` - Name of the template

Delete the template version and all of its associated tags. If the deleted version was the latest version, the API will set the `latestVersion` attribute of the templateMeta to the previous version.

`DELETE /pipeline/templates/{namespace}/{name}/versions/{version}`

Arguments: `namespace`, `name`, `version`

* `namespace` - Namespace of the template
* `name` - Name of the template
* `version` - Version of the template
Delete the template tag. This does not delete the template itself.

Note: This endpoint is only accessible in `build` scope, and the permission is tied to the pipeline that creates the template.

`DELETE /pipeline/templates/{namespace}/{name}/tags/{tag}`

Arguments: `namespace`, `name`, `tag`

* `namespace` - Namespace of the template
* `name` - Name of the template
* `tag` - Tag name of the template