Defining a Pipeline

This page provides a high-level overview of the structure of a pipeline configuration file.

JFrog Pipelines uses Pipelines DSL, its own declarative language based on YAML syntax, to describe workflows. You create pipelines in text files written in Pipelines DSL, referred to as pipeline configs, using any text editor of your choice.

You must store your pipeline config file(s) in a source code repository (for example, GitHub). When Pipelines is configured to use this repository as a pipeline source, the config files are automatically read, and the workflows they define are loaded into Pipelines and run.

Pipeline Config Structure

There are two top-level sections that can be defined in a pipeline config:

  • A resources section that specifies the Resources used by the automated pipeline.

  • A pipelines section that specifies the pipeline execution environment and the Steps to execute.

For ease of illustration, we'll describe how this looks in a single pipeline config file (e.g., pipelines.yml).

Resources Section

Resources provide the information that steps need in order to execute, or they store information generated by a step. For example, a resource might point to a source code repository, a Docker image, or a Helm chart. A list of all supported resources is available in Resources overview.

The basic format of each resources declaration is:

name

A globally unique friendly name for the resource.

type

A predefined string that specifies the type of resource.

For more information, see Resource Types.

configuration

Begins the section of settings required by the resource type. This typically includes the name of the integration that connects the resource to the external service.

Note

Resource definitions are global and can be used by all pipelines in a Project that are in at least one of the same environments. This means that resource names must be unique across all pipeline config files in a Project.

For example, here is a resources section that defines two resources, a GitRepo and a Docker image:

resources:
  - name: my_Git_Repository
    type: GitRepo
    configuration:
      gitProvider: my_GitHub_Integration
      path: ~johndoe/demo
      branches:
        include: master

  - name: my_Docker_Image
    type: Image
    configuration:
      registry: my_Docker_Registry_Integration
      imageName: johndoe/demo_image
      imageTag: latest

Pipelines Section

The pipelines section defines the workflow, consisting of steps and the dependencies between them.

The basic format of each pipelines declaration is:

name

A friendly name for the pipeline, unique within the project.

configuration

An optional section to specify environment variables and/or a runtime image for the pipeline to execute in.

preRun

An optional step. When configured, it always runs at the beginning of a pipeline run, which is useful for performing checks before the rest of the pipeline executes.

Like Pipelines Steps, preRun must include an execution section with at least one block, such as onExecute, onStart, onSuccess, onFailure, or onComplete. In addition, it can include input and output resources.

pipelines: 
  - name: PreAndPostRuns 
    preRun: 
      execution: 
        onExecute: 
          - printenv 
          - echo "Executing Pre Run"

postRun

An optional step. When configured, it always runs at the end of a pipeline run, which is useful for performing checks after the rest of the pipeline completes.

Like Pipelines Steps, postRun must include an execution section with at least one block, such as onExecute, onStart, onSuccess, onFailure, or onComplete. In addition, it can include input and output resources.

pipelines: 
  - name: PreAndPostRuns 
    postRun: 
      execution: 
        onExecute: 
          - printenv 
          - echo "Executing Post Run"

These tags are followed by a steps collection that specifies the steps to execute.

Tip

The name of the pipeline will be available in the environment variable $pipeline_name, which can be used to construct the base name for builds.
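
For example, a simple Bash step can reference this variable directly. This is a minimal sketch, and the pipeline and step names are illustrative:

pipelines:
  - name: demo_pipeline
    steps:
      - name: show_name
        type: Bash
        execution:
          onExecute:
            # $pipeline_name resolves to "demo_pipeline" at run time
            - echo "Running ${pipeline_name}"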

Pipeline Configuration

The optional configuration section can specify an execution environment for all steps in the pipeline. While this configuration can be defined per step, it is often more convenient to define it once at the pipeline level when it is the same for every step.

The basic format of each configuration section is:

environmentVariables

Variables defined here are available for use in every step in the pipeline. These variables are read-only; they cannot be redefined in a step.

The following variables have special behavior if they are set:

  • JFROG_CLI_BUILD_NAME : If set, the pipeline uses this value instead of the default pipeline name for the build info collected.

  • JFROG_CLI_BUILD_NUMBER : If set, the pipeline uses this value instead of the default run number for the build info collected.

  • USE_LOCAL_JFROG_CLI : If set to true, the local JFrog CLI on the host or in the image (depending on runtime configuration) is used instead of the version packaged with JFrog Pipelines. This is not recommended, as native steps may not be able to run with the local JFrog CLI version.

  • JFROG_XRAY_URL : If jfrogCliVersion is set to 2, this variable may be used to specify an Xray URL for use when configuring the JFrog CLI with an Artifactory integration. In most cases, the platform URL will be correct and JFROG_XRAY_URL is not required.
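
For illustration, here is a minimal sketch of a pipeline-level environmentVariables block, assuming the readOnly grouping for pipeline-level variables. The pipeline name and variable values are placeholders:

pipelines:
  - name: my_pipeline
    configuration:
      environmentVariables:
        readOnly:                              # readOnly grouping assumed; these variables cannot be redefined in steps
          JFROG_CLI_BUILD_NAME: my_app_build   # placeholder build name for build info
          LOG_LEVEL: info                      # example variable available to every step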

nodePool

Optionally specify the node pool on which your steps will execute. If not specified, the default node pool is used.

For more information, see Choosing Node Pools.

affinityGroup

Optionally specify an Affinity Group name to indicate that all steps in this pipeline belong to one Affinity Group, which means that all steps run on the same build node.

For more information, see Running multiple steps on the same build node.

runtime

This section allows you to specify the default runtime environment for steps in the pipeline. The options are:

  • Run steps directly on the host machine

  • Run steps inside the node pool's default Docker container or one of its language-specific variants

  • Run steps inside a custom Docker container of your choice

For more information, see Choosing your Runtime Image.
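
As an illustration, the sketch below selects a custom Docker container as the pipeline's default runtime. The image name, tag, and exact field layout shown here are assumptions for illustration; see Choosing your Runtime Image for the full set of options:

pipelines:
  - name: my_pipeline
    configuration:
      runtime:
        type: image
        image:
          custom:
            name: myregistry/my-build-image   # placeholder custom image
            tag: latest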

chronological

If chronological is set to true, a run of the pipeline will not start while another run of the same pipeline is processing. The default is false, which allows runs to execute in parallel when nodes are available.

dependencyMode

Specifies when the pipeline may run relative to other pipelines connected to it by resources. If any of these three settings is true, new runs will not be created for resources updated by other pipelines when there is already a waiting run with the same resources and steps. For example, if a parent pipeline runs twice in succession and the following pipeline has waitOnParentComplete set to true, the following pipeline will run only once. When the pipelines do run, they use the latest resource versions.

The optional settings are:

  • waitOnParentComplete : If true, the pipeline will not start running when a pipeline that outputs a resource that is an input to this pipeline has a waiting or processing run.

  • waitOnParentSuccess : If true, the pipeline will not start running when a pipeline that outputs a resource that is an input to this pipeline has a processing run or the last complete run was not successful.

  • waitOnChildComplete : If true, the pipeline will not start running when a pipeline that has an input resource that is the output of this pipeline has a waiting or processing run unless that child pipeline is waiting for this pipeline to complete.
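
For example, a pipeline that should start only after its parent pipeline has completed successfully might declare the following. This is a minimal sketch, and the pipeline name is a placeholder:

pipelines:
  - name: child_pipeline
    configuration:
      dependencyMode:
        waitOnParentSuccess: true   # do not start while a parent run is processing or its last run failed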

retentionPolicy

Optionally specify whether pipeline run data should be deleted after a specific number of days, and the minimum number of runs whose data should always be kept:

  • maxAgeDays : Specifies the number of days after which pipeline run data is deleted (cannot exceed the system-level setting). Setting this value to 0 means infinite retention.

  • minRuns : Specifies the minimum number of pipeline runs whose data is kept, regardless of their age (cannot exceed the system-level setting).

For more information, see Setting Retention Policy.
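
For instance, to delete run data after 30 days while always keeping the data of the 10 most recent runs (illustrative values, subject to the system-level limits):

pipelines:
  - name: my_pipeline
    configuration:
      retentionPolicy:
        maxAgeDays: 30   # delete run data older than 30 days
        minRuns: 10      # but always keep the last 10 runs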

jfrogCliVersion

Optionally specify either 1 to use JFrog CLI v1 or 2 to use JFrog CLI v2 in the steps in the pipeline. The default is currently v1.

integrations

Specifies Integrations, similar to the integrations section in a step, to be input to all steps in the Pipeline. Integrations listed here cannot also be listed in the integrations section of any step in this Pipeline.

For more information, see Pipelines Integrations.

Note

Pipeline-level declarations for integrations apply to all steps. You cannot exclude individual steps from using those integrations.

inputResources

Specifies Resources, similar to the inputResources section in a step, to be inputs to all steps in the Pipeline. Resources listed here cannot also be listed in the inputResources or outputResources sections of any step in this Pipeline. Each Resource in inputResources should be specified by name and branch (if it comes from a multi-branch Pipeline Source), and the trigger option may be set to control whether updates to the resource trigger steps in the Pipeline.

For more information, see Using Resources.

Note

Pipeline-level declarations for input resources apply to all steps. You cannot exclude individual steps from using those input resources.

outputResources

Specifies Resources, similar to the outputResources section in a step, to be outputs of all steps in the Pipeline. Resources listed here cannot also be listed in the inputResources or outputResources sections of any step in this Pipeline. Each Resource in outputResources should be specified by name and branch (if it comes from a multi-branch Pipeline Source).

For more information, see Using Resources.

Note

Pipeline-level declarations for output resources apply to all steps. You cannot exclude individual steps from using those output resources.

Any step can override the pipeline's default runtime configuration by specifying its own runtime selection.
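
Putting several of these options together, a pipeline-level configuration might look like the following sketch. All names (pipeline, node pool, affinity group, integration, and resource names) are placeholders, and only a trivial step is shown:

pipelines:
  - name: my_pipeline
    configuration:
      nodePool: my_node_pool                  # placeholder node pool name
      affinityGroup: my_affinity_group        # all steps share one build node
      chronological: true                     # queue runs of this pipeline one at a time
      jfrogCliVersion: 2                      # use JFrog CLI v2 in every step
      integrations:
        - name: my_Artifactory_Integration   # placeholder integration name
      inputResources:
        - name: my_Git_Repository            # resource defined earlier on this page
    steps:
      - name: build
        type: Bash
        execution:
          onExecute:
            - echo "Building with the pipeline-level configuration"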

Pipeline Steps

Each named pipeline declares a collection of named step blocks the pipeline will execute.

The basic format of each step declaration is:

name

A friendly name for the step that may be referenced in other steps. Step names must be unique within the same pipeline.

type

A predefined string that specifies the type of step.

For more information, see Pipelines Steps.

configuration

Begins the section of settings required by the step type. This may include:

  • Environment variables local to the step

  • Any runtime configuration for the step

  • Any triggering input steps or resources

  • Any resources output by the step

  • Any integrations used by the step

  • All settings required by the step type

execution

Specifies the actions to perform for each execution phase of the step.

For example, here is a simple sequence of two steps. Each uses the generic Bash step to output text to the console:

    steps:
    - name: step_1
      type: Bash
      configuration:
        inputResources:
          - name: my_Git_Repository     # Trigger execution on code commit
      execution:
        onExecute:
          - echo "Hello World!"
 
    - name: step_2
      type: Bash
      configuration:
        inputSteps:
          - name: step_1               # Execute this step after the prior step
      execution:
        onExecute:
          - echo "Goodbye World!"

Pipeline Config File Strategies

A pipeline config file can have one or more pipelines defined in it, but the definition of a single pipeline cannot be fragmented across multiple files. Pipeline config filenames can take any form you choose, although the convention for a single file is pipelines.yml.

Some things to note about pipelines:

  • You can have as many pipeline config files as you want. For example, our customers manage config in the following different ways:

    • Maintain a central DevOps repository and keep all pipeline config files for all projects in that repository.

    • Keep pipeline config files that build each microservice with the source code for that microservice.

    • Split pipeline steps and resources into separate config files (for example, pipelines.steps.yml and pipelines.resources.yml, respectively).