This quickstart defines a simple pipeline that creates and signs a release bundle, then distributes it to an Artifactory Edge node. An example Pipelines DSL shows how integrations, resources, and steps combine into an automated workflow. The pipeline is triggered by a change to any of the BuildInfo resources used to create the release bundle.
This pipeline demonstrates the following:
Creating a release bundle from BuildInfo resources
Signing the release bundle
Distributing the release bundle to an Artifactory Edge node
This pipeline picks up where the Pipeline Example: Docker Build and Push left off. While that pipeline built and pushed a Docker image, this pipeline takes the BuildInfo from that example, creates a release bundle, and distributes it to an Edge node.
A successful run of the pipeline in this quickstart looks like this:
Before you Begin
Before trying this quickstart, ensure that you have:
A GitHub account. This is required for forking the example repository.
A JFrog Platform account.
Set up Artifactory as a Docker registry. For more information, see Getting Started with Artifactory as a Docker Registry and Docker Registry.
Installed and configured JFrog Distribution.
Generated, uploaded, and deployed GPG Keys for JFrog Distribution.
At least one Pipelines node pool. This is the set of nodes that all pipeline steps will execute in. For more information, see Managing Pipelines Node Pools.
Successfully run the Docker Build and Push pipeline.
Running This Example
Perform the steps below to run this pipeline:
Sign in to JFrog Platform with your Artifactory credentials.
Go to Application | Pipelines | Integrations to add these integrations:
GitHub Integration: This integration is used to add the Pipeline source.
myGithub is the name of the GitHub integration used in this example.
Distribution Integration: This integration connects your JFrog Pipeline Automation platform to a Distribution instance.
myDist is the name of the Distribution integration used in this example.
Artifactory Integration: This integration is used to authenticate with Artifactory to fetch artifacts, including Docker images, and to maintain build information.
myArtifactory is the name of the Artifactory integration used in this example.
Write down the names of the Artifactory and Distribution integrations as these are required for the next step. Ensure that the names are unique and easy to remember.
Fork GitHub repository
The DSL file is a YAML file that contains the pipeline definitions. The example uses two YAML files, pipelines.yml and values.yml. The pipelines.yml file contains the declarations for all resources and workflow steps. The values.yml file contains the values required by the pipelines.yml file. For a full breakdown of all the resources, pipelines, and steps used in the YAML files, see the pipelines.yml section below.
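The release-management flow described above can be sketched in the Pipelines DSL. This is an illustrative outline only, not the exact contents of the example repository: the resource and step names (dockerBuildInfo, releaseBundle, signedBundle, distRule, demo_bundle) and the hard-coded versions are placeholders for values that the real pipelines.yml reads from values.yml.

```yaml
# Illustrative sketch of a release-management pipelines.yml.
# Names and versions are placeholders; the example repo templatizes
# these via values.yml.
resources:
  - name: dockerBuildInfo            # BuildInfo from the Docker Build and Push example
    type: BuildInfo
    configuration:
      sourceArtifactory: myArtifactory
      buildName: docker_build
      buildNumber: 1
  - name: releaseBundle              # unsigned release bundle
    type: ReleaseBundle
    configuration:
      sourceDistribution: myDist
      name: demo_bundle
      version: v1.0.0
  - name: signedBundle               # same bundle after signing
    type: ReleaseBundle
    configuration:
      sourceDistribution: myDist
      name: demo_bundle
      version: v1.0.0
  - name: distRule                   # which Edge nodes receive the bundle
    type: DistributionRule
    configuration:
      sourceDistribution: myDist
      serviceName: "*"
      siteName: "*"
      cityName: "*"
      countryCodes:
        - "*"

pipelines:
  - name: demo_release_mgmt
    steps:
      - name: bundle
        type: CreateReleaseBundle    # triggered by a change to the BuildInfo
        configuration:
          releaseBundleName: demo_bundle
          releaseBundleVersion: v1.0.0
          dryRun: false
          inputResources:
            - name: dockerBuildInfo
          outputResources:
            - name: releaseBundle
      - name: sign
        type: SignReleaseBundle      # signs with the GPG key deployed to Distribution
        configuration:
          inputResources:
            - name: releaseBundle
          outputResources:
            - name: signedBundle
      - name: distribute
        type: DistributeReleaseBundle
        configuration:
          dryRun: false
          inputResources:
            - name: signedBundle
            - name: distRule
```

Each step consumes the previous step's output resource, which is what chains the three steps into the sequence shown in the run diagram.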
Fork this repository to your account or organization. This is important because you need admin access to any repository used as a Pipeline Source or GitRepo resource; admin access is required to add the webhooks that listen for change events.
The pipelines configuration is available in the values.yml file. If required, edit this file in your fork of this repo and replace the following:
Provide the name of the GitHub integration you added in the previous step.
Provide the name of your Artifactory integration.
Provide the name of your Distribution integration.
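With the integration names filled in, values.yml might look like the following sketch. The key names and fork path here are assumptions for illustration; match them against the actual values.yml in your fork.

```yaml
# Illustrative values.yml sketch -- key names are assumptions,
# check them against the file in your fork.
GitRepo:
  gitProvider: myGithub          # your GitHub integration name
  path: your-org/your-fork       # placeholder fork path
Artifactory:
  integration: myArtifactory     # your Artifactory integration name
Distribution:
  integration: myDist            # your Distribution integration name
```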
Pipeline names are global across your JFrog Pipelines environment, so the names of your pipelines and resources must be unique within it.
Add Pipeline Source
The Pipeline Source represents the Git repo where our Pipelines definition files are stored. A pipeline source connects to the repository through an integration, which was added in the previous step.
In your left navigation bar, go to Administration | Pipelines | Pipeline Sources. Click on Add a Pipeline Source and then choose From YAML. Follow instructions to add a Pipeline Source.
After your Pipeline Source syncs successfully, navigate to Pipelines | My Pipelines in the left navbar to see the newly added pipeline. In this example, demo_release_mgmt is the name of our pipeline.
Click the name of the pipeline. This renders a real-time, interactive diagram of the pipeline and the results of its most recent run.
Execute the pipeline
You can now commit to the repo to trigger your pipeline, or trigger it manually through the UI. The steps in the pipeline execute in sequence. Multiple steps can execute in parallel if the node pool has multiple build nodes available.