This quickstart demonstrates how to define a pipeline that builds a single Docker image, pushes it to Artifactory, and then publishes build info. An example Pipelines DSL is used to show how integrations, resources, and steps combine to construct a simple, automated workflow.
This pipeline demonstrates the following:
Adding a Pipeline Source.
Creating a GitRepo trigger, which triggers a step when the contents of the source control repository change.
Using an Image resource to add a reference to a Docker image to your pipeline.
Using inputResources and inputSteps to set up dependencies between steps and resources.
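In the pipelines.yml DSL, these pieces fit together roughly as sketched below. This is a hedged outline, not the exact example file: the step names (docker_build, docker_push), resource names (gitrepo_dbp, image_dbp), and integration name (art_integration) are illustrative placeholders.

```yaml
pipelines:
  - name: pipeline_dbp
    steps:
      - name: docker_build               # illustrative step name
        type: DockerBuild
        configuration:
          affinityGroup: dbp_group       # keeps build and push on the same node
          dockerFileLocation: .
          dockerFileName: Dockerfile
          dockerImageName: myserver.jfrog.io/docker_local/dbp
          dockerImageTag: latest
          integrations:
            - name: art_integration      # your Artifactory integration
          inputResources:
            - name: gitrepo_dbp          # GitRepo resource: a commit to the
                                         # watched repository triggers this step
      - name: docker_push                # illustrative step name
        type: DockerPush
        configuration:
          affinityGroup: dbp_group
          targetRepository: docker_local
          integrations:
            - name: art_integration
          inputSteps:
            - name: docker_build         # runs only after docker_build succeeds
          outputResources:
            - name: image_dbp            # Image resource, updated with the pushed tag
```

The inputResources entry is what makes a commit to the repository trigger the first step, and inputSteps is what chains docker_push behind docker_build.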
See it Live!
A successful run of the pipeline in this quickstart looks like this:
Before You Begin
Before trying this quickstart, ensure that you have:
A GitHub account. This is required for forking the example repository.
A JFrog Platform account, or self-hosted JFrog Pipelines.
A user account in Artifactory with deploy permissions to at least one binary repository.
A local Docker repository in Artifactory, which sets up Artifactory as a Docker registry.
At least one node pool. This is the set of nodes that all pipeline steps will execute in. For more information, see Managing Pipelines Node Pools.
Run the Pipeline
For a detailed overview of the Docker and Build Push YAML example, see Docker and Build Push Example: pipelines.yml.
Perform the steps below to build and push your Docker image:
Fork the Repository
The DSL file is a YAML file that contains the pipeline definitions. The example uses a single file,
pipelines.yml, which contains the declarations for all resources and workflow steps. For a full breakdown of the resources, pipelines, and steps used in the file, see the pipelines.yml section below.
Fork this repository to your account or organization. This is important since you need admin access to repositories that are used as Pipeline Sources or GitRepo resources, in order to add webhooks to these repositories and listen for change events.
Sign in to Artifactory
Sign in to JFrog Platform with your Artifactory credentials.
Go to Administration | Pipelines | Integrations and add two integrations: a GitHub integration and an Artifactory integration.
Write down the names of both integrations, as they are required in the next step. Ensure that the names are unique and easy to remember.
Update pipelines.yml file
The pipelines configuration is available in the pipelines.yml file. Edit this file in your fork of the Git repo and replace the following:
Provide the name of the GitHub integration you added in the previous step.
Provide the path to your fork of this repository.
Provide the name of the Artifactory integration you added in the previous step.
Provide your Docker image path and name. In this example,
/docker_local is the image path and
dbp is the image name.
Provide the name of your Artifactory integration.
All pipeline definitions are global across JFrog Pipelines within a Project. The names of your pipelines and resources need to be unique within the Project in JFrog Pipelines.
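Concretely, those replacements land in the resource definitions at the top of pipelines.yml. A minimal sketch, assuming resource names like gitrepo_dbp and image_dbp (the names in the actual example file may differ):

```yaml
resources:
  - name: gitrepo_dbp                 # illustrative resource name
    type: GitRepo
    configuration:
      gitProvider: my_github          # <-- name of your GitHub integration
      path: myuser/project-examples   # <-- path to your fork of the repo
  - name: image_dbp                   # illustrative resource name
    type: Image
    configuration:
      registry: art_integration       # <-- name of your Artifactory integration
      sourceRepository: docker_local  # <-- your image path (Docker repository)
      imageName: myserver.jfrog.io/docker_local/dbp   # <-- your image name
      imageTag: latest
```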
Add Pipeline Sources
The Pipeline Source represents the Git repo where our Pipelines definition files are stored. A pipeline source connects to the repository through an integration, which we added in the previous step.
Follow the instructions to add a Pipeline Source and point it to the
pipelines.yml in your fork of the repo. This automatically adds your configuration to the platform, and a pipeline is created based on your YAML. The
pipelines.yml file is parsed, and resources, steps, and pipelines are added as configured.
After your pipeline source syncs successfully, navigate to Pipelines | My Pipelines in the left navbar to view the newly added pipeline. In this example,
pipeline_dbp is the name of our pipeline.
Click the name of the pipeline. This renders a real-time, interactive diagram of the pipeline and the results of its most recent run.
Execute the Pipeline
You can trigger the pipeline by committing a change to your Git repository, or by manually triggering it through the UI. The steps in the pipeline execute in sequence. Multiple steps can execute in parallel if the node pool has multiple build nodes available.
Once the pipeline has completed, a new run is listed.