This example demonstrates how simple pipelines can be defined and executed with JFrog Pipelines. An example Pipelines DSL is used to show how to use integrations, resources, and steps to construct a simple, automated workflow.
For a detailed look at the Hello World YAML, see jfrog-pipelines-hello-world.yml.
This pipeline example demonstrates the following:
Creating a GitHub Integration.
Adding a Pipeline Source.
Creating a GitRepo trigger, which will trigger a step when the contents of the source control repository change.
Using inputResources and inputSteps to set up dependencies between steps and resources.
Using environment variables (e.g., $res_myFirstRepo_commitSha) to extract information from inputResources.
Using run state to pass information to downstream steps of a run.
Using pipeline state to pass information to subsequent runs.
Connecting dependent pipelines through resources (a simplified DSL sketch illustrating these concepts follows this list).
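The snippet below is a minimal sketch of how these pieces fit together in the Pipelines DSL. It is not the actual contents of jfrog-pipelines-hello-world.yml; the step names, resource name, and echoed values are illustrative only.

resources:
  - name: myFirstRepo
    type: GitRepo
    configuration:
      gitProvider: my_github                      # name of your GitHub integration
      path: myuser/jfrog-pipelines-hello-world    # path to your fork

pipelines:
  - name: my_first_pipeline
    steps:
      - name: p1_s1
        type: Bash
        configuration:
          inputResources:
            - name: myFirstRepo                   # a new commit to this repo triggers the step
        execution:
          onExecute:
            # Information about the triggering commit is exposed through environment variables
            - echo "Commit SHA is $res_myFirstRepo_commitSha"
            # Run state: make a value available to downstream steps of this run
            - add_run_variables greeting="hello world"
      - name: p1_s2
        type: Bash
        configuration:
          inputSteps:
            - name: p1_s1                         # runs only after p1_s1 completes successfully
        execution:
          onExecute:
            - echo "Value passed from p1_s1 is $greeting"

A second pipeline can declare the same resource, or a resource produced by an outputResources entry of a step in the first pipeline, as one of its inputResources; this is how dependent pipelines are connected.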
Before you Begin
Before trying this quickstart, ensure that you have:
A GitHub account. This is required for forking the example repository.
A JFrog Platform Cloud account, or self-hosted JFrog Pipelines.
At least one node pool. This is the set of nodes that all pipeline steps will execute in. For more information, see Managing Pipelines Node Pools.
A user account in Artifactory with deploy permissions to at least one binary repository.
Run the Pipeline
Perform the following steps to run this pipeline:
Fork repository
The Pipelines DSL for this example is available in the jfrog-pipelines-hello-world repository in the JFrog GitHub account.
The DSL file is a YAML file that contains the pipeline definitions. This example uses two YAML files:
jfrog-pipelines-hello-world.yml, which contains the declarations for the pipelines in this example
values.yml, which contains the values required for the jfrog-pipelines-hello-world.yml file (a sketch of how the two files relate follows this list)
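The entries in values.yml are consumed by jfrog-pipelines-hello-world.yml, typically through template syntax, as in the sketch below. This fragment is only an illustration of that relationship; the actual key names and structure in the example repository may differ.

resources:
  - name: myFirstRepo
    type: GitRepo
    configuration:
      # Filled in from values.yml when the pipeline source syncs
      gitProvider: {{ .Values.gitProvider }}
      path: {{ .Values.path }}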
For a full breakdown of all the resources, pipelines, and steps used in the YAML file, see the jfrog-pipelines-hello-world.yml section below.
Fork this repository to your account or organization. This is important because you need admin access to any repository used as a Pipeline Source or a GitRepo resource, so that webhooks can be added to it to listen for change events.
Sign in
Sign in to JFrog Platform with your Artifactory credentials.
Add integration
Go to Administration | Pipelines | Integrations to add one integration:
GitHub Integration: This integration is used to add the Pipeline Source, as well as the GitRepo resource configured in values.yml, connecting GitHub to Pipelines. Make a note of the GitHub integration name; it is needed in the next step.
Update values.yml
The pipelines configuration is available in the values.yml file. Edit this file in your fork of this repo and replace the following:
Tag: gitProvider
Description: Provide the name of the GitHub integration you added in the previous step.
Example: gitProvider: my_github

Tag: path
Description: Provide the path to your fork of this repository.
Example: path: myuser/jfrog-pipelines-hello-world
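After both replacements, the relevant part of values.yml in your fork might look like the following. The actual file may contain additional keys, so replace only the two values described above, substituting your own integration name and fork path.

gitProvider: my_github
path: myuser/jfrog-pipelines-hello-world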
Note
All pipeline and resource names are global across your JFrog Pipelines project, so the names of your pipelines, steps, and resources must be unique within the project.
Add pipeline source
The Pipeline Source represents the Git repository where our pipeline definition files are stored. A pipeline source connects to the repository through an integration, which we added in the Add integration step above.
In your left navigation bar, go to Administration | Pipelines | Pipeline Sources. Click Add a Pipeline Source and then choose From YAML. Follow the instructions to add a Pipeline Source. This automatically adds your configuration to the platform, and pipelines are created based on your YAML.
After your pipeline source syncs successfully, navigate to Pipelines | My Pipelines in the left navbar to view the newly added pipelines. In this example, my_first_pipeline and my_second_pipeline are the names of our pipelines.
Click the name of a pipeline. This renders a real-time, interactive diagram of the pipeline and the results of its most recent run.
Execute the pipeline
You can now commit to the repo to trigger your pipeline, or trigger it manually through the UI. The steps in the pipeline execute in sequence.
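For example, if you have cloned your fork locally, an empty commit is enough for the GitRepo resource to detect a change and trigger the pipeline. The remote and branch names below are assumptions about your clone:

git commit --allow-empty -m "Trigger my_first_pipeline"
git push origin main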
Once the pipeline has completed, a new run is listed.