Pipelines Steps


A Step is a unit of execution in a pipeline. It is triggered by some event and uses resources to perform an action as part of the pipeline.

Steps take Inputs in the form of Integrations or Resources, execute tasks that perform the necessary operations, and then produce a result as Outputs. These Outputs can in turn become Inputs to other steps, forming a dependency-based, event-driven pipeline.


A trigger for a step is some external event, or combination of events, such as:

  • Completion of another step

  • A change to a resource, such as a commit to a source code repository or the generation of a new Docker image

Step Types


Artifactory Integration will be deprecated soon!

Artifactory Integration will be deprecated, as announced in Artifactory version 7.47.10, so we recommend migrating from Artifactory Integration to JFrog Platform Access Token Integration. Refer to Artifactory Integration.

For all of the following steps except the Helm steps, wherever an Artifactory integration is called for, you can use either an Artifactory integration or a JFrog Platform Access Token integration.


Helm Steps

Please note that Platform Access Token Integration does not support Helm steps. If a Helm step uses an Artifactory Integration, do not migrate it.

See it Live

Click here to see some of these steps in action.

Generic and Native Steps

Steps are categorized as one of the following:

  • Generic step: A generic step is for general-purpose execution. The Bash and PowerShell steps, which execute any series of shell commands you specify, are the sole generic steps for Linux and Windows runtimes, respectively.

  • Native Steps: A native step performs a specific set of actions as an encapsulated unit. Native steps inherently know what shell commands to execute to perform their action. With native steps, you can create complex workflows that push, publish, and promote your builds in Artifactory using simple step definitions.

    All native steps derive from the base generic step definition for their respective runtime environment, so all steps have the same base tag:entity structure in a pipeline config. Each native step defines additional tags that are specific to its function.

Generic Steps


The Bash step is a generic step type that executes any shell command. This general-purpose step can be used to perform any action that can be scripted, even with tools and services that haven't been integrated with JFrog Pipelines, while still taking full advantage of what the step lifecycle offers. It is the most versatile of the step types.
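As a minimal sketch, a Bash step might look like the following (the pipeline and step names are placeholders):

```yaml
pipelines:
  - name: demo_pipeline            # placeholder pipeline name
    steps:
      - name: say_hello            # placeholder step name
        type: Bash
        execution:
          onExecute:
            - echo "Hello from a Bash step"
```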


The Matrix step commences multiple parallel build processes across multiple containers, with different settings for each.


The PostMatrix generic step may be used to perform post-execution tasks following a Matrix step.


The PowerShell step is a generic step type that executes PowerShell commands. PowerShell steps run only on Windows node pools and are the counterpart of the Bash step on other node pools. As a general-purpose step that can execute any action that can be scripted, even with tools and services that haven't been integrated with JFrog Pipelines, it can be used to perform actions where complete control is required.


The PreMatrix generic step may be used to prepare a build environment for execution of a Matrix step.

Native Steps


The CreateReleaseBundle native step produces a Release Bundle for distribution to an Artifactory Edge Node. The step can create a signed or an unsigned release bundle.


The DistributeReleaseBundle native step triggers the distribution of a Release Bundle to an Artifactory Edge Node. This step requires a signed release bundle and one or more distribution rules to successfully execute.


The DockerBuild native step performs a build to produce a Docker image from a Dockerfile in a Git source repository.


The DockerPush native step pushes a Docker Image to a Docker registry.
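As a sketch of how the DockerBuild and DockerPush steps pair up, assuming a hypothetical GitRepo resource `app_repo`, an Artifactory integration `art_int`, and an Image resource `app_image`:

```yaml
pipelines:
  - name: docker_pipeline
    steps:
      - name: build_image
        type: DockerBuild
        configuration:
          affinityGroup: docker_group      # share a node with the push step
          dockerFileLocation: .
          dockerFileName: Dockerfile
          dockerImageName: example.jfrog.io/app   # placeholder image name
          dockerImageTag: ${run_number}
          inputResources:
            - name: app_repo               # hypothetical GitRepo resource
          integrations:
            - name: art_int                # hypothetical Artifactory integration
      - name: push_image
        type: DockerPush
        configuration:
          affinityGroup: docker_group      # must match the DockerBuild group
          targetRepository: docker-local   # placeholder Artifactory repository
          inputSteps:
            - name: build_image
          outputResources:
            - name: app_image              # hypothetical Image resource
```

The shared affinityGroup is what makes the locally built image visible to the push step, since both steps then run on the same node.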


The GoBuild native step performs a build from Go (GoLang) source.


The GoPublishBinary native step publishes Go (GoLang) binaries to Artifactory.


The GoPublishModule native step publishes Go (GoLang) modules to Artifactory. This step should be used in conjunction with the GoBuild step.


The GradleBuild native step performs a Gradle build on files in a Git repository. Optionally, it can also publish build information to Artifactory.


The HelmBlueGreenCleanup step uninstalls an Idle release previously deployed by a HelmBlueGreenDeploy step.


The HelmBlueGreenDeploy step implements a Blue/Green strategy to deploy a Docker image to a Kubernetes cluster using a Helm chart.


The HelmBlueGreenRoleSwitch step flips the roles played by the Helm releases deployed through a HelmBlueGreenDeploy step.


The HelmDeploy step deploys a Docker image to a Kubernetes cluster using a Helm chart.


The HelmPublish step publishes a Helm chart and associated build info from a location in a Git repo to a Helm repository in Artifactory.


The Jenkins native step transfers execution to a Jenkins pipeline.


The MvnBuild native step performs a Maven project build on files in a Git repository. Optionally, it can also publish build information to Artifactory.


The NpmBuild native step builds npm package source. This step automatically runs npm install on the source in a Git repository.


The NpmPublish step publishes an npm package to the registry in Artifactory following an NpmBuild step.


The PromoteBuild native step promotes a BuildInfo and moves or copies the related artifacts from one Artifactory repository to another.


The PublishBuildInfo step publishes BuildInfo to Artifactory. BuildInfo provides a manifest for the build and includes metadata about the modules, dependencies, and environment variables.


The SignReleaseBundle native step signs a Release Bundle in preparation for distributing it to Edge nodes.


The XrayScan native step triggers a scan by JFrog Xray for security vulnerabilities and license compliance. If a watch has been created that covers the selected build, Xray scans the indexed build artifacts.

Step Definition

Steps are defined in a pipeline config under the steps tag, as shown below.

  pipelines:
    - name: pipe1
      configuration:
        <optional configuration settings>
      steps:
        <collection of steps of any type>

All steps are composed of these top-level tags:

name

An alphanumeric string (underscores are permitted) that identifies the step. This is the name that will be used when the step is assigned as an input to other steps. The name should be chosen to accurately describe what the step does, e.g. prov_test_env to represent a job that provisions a test environment. Names of steps must be unique within a pipeline.



type

A predefined step type.


You can change the type of a step, such as converting a Bash step into a DockerBuild step. After changing the step's type, however, make sure to adjust its settings to align with the new type's configuration.



configuration

Specifies all configuration settings for the step's execution environment. These settings may include:

  • Environment variable definitions

  • Runtime the step will execute in

  • Integrations

  • Input and output resources

  • Timeout duration

  • Queuing priority

  • Settings for native steps

While most configuration tags are optional, in practice you will at least need to define the integrations, steps, and/or resources that trigger the execution of the step. Many native steps will also require some settings in this tag section.

The configuration section is optional for the generic steps and required for most native steps.


execution

Specifies the actions to perform for each execution phase of the step. These phases may include:

  • onStart: commands to run before the base set

  • onExecute: the base set of commands

  • onSuccess: commands to run after successful execution of the base set

  • onFailure: commands to run after failed execution of the base set

  • onComplete: commands to run on any completion (e.g., for cleanup)


Step Configuration

When coding your steps, no matter what their function, these are some of the configuration section settings you will likely need to consider.

Environment Variables

You may define or override a set of environmentVariables that will be available only for the duration of the execution of the step.

If the following variables are set, they will be used:

  • JFROG_CLI_BUILD_NAME: If set, the pipeline uses this value instead of the default pipeline name for the build info collected.

  • JFROG_CLI_BUILD_NUMBER: If set, the pipeline uses this value instead of the default run number for the build info collected.

  • USE_LOCAL_JFROG_CLI: If set as true, the local JFrog CLI on the host or in the image (depending on runtime configuration) is used instead of the version packaged with JFrog Pipelines. This is not recommended and native steps may not be able to run with the local JFrog CLI version.
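For example, a step could override the build name and number used for the collected build info (the values here are placeholders):

```yaml
steps:
  - name: build_step
    type: Bash
    configuration:
      environmentVariables:
        JFROG_CLI_BUILD_NAME: my_custom_build   # replaces the default pipeline name
        JFROG_CLI_BUILD_NUMBER: "42"            # replaces the default run number
    execution:
      onExecute:
        - echo "building as $JFROG_CLI_BUILD_NAME #$JFROG_CLI_BUILD_NUMBER"
```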


Runtime

You may choose a runtime for the step to execute in.

Node Pool Assignment

You can designate a particular nodePool that you want your steps to draw execution nodes from. You may wish to do this to ensure your steps execute in a particular architecture/OS combination, and/or draw from a node pool that provides enough execution nodes to process multiple steps in your pipeline at once.

If you don't specify a nodePool then your steps will run in the node pool that has been set as the default.
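A step can be pinned to a named pool as in this sketch (the pool and step names are hypothetical):

```yaml
steps:
  - name: win_build
    type: PowerShell
    configuration:
      nodePool: windows_pool        # hypothetical Windows node pool
    execution:
      onExecute:
        - Write-Output "running on a Windows node"
```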

Affinity Group

You can bind steps together in a named affinityGroup to ensure that they execute on the same node. This helps when several steps need access to files or artifacts that are created on a node. For example, DockerBuild and DockerPush steps must be assigned to the same affinityGroup so that the image built by DockerBuild is available for DockerPush.
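As a sketch, two Bash steps bound to the same node through a shared affinityGroup (the group name and file path are placeholders):

```yaml
steps:
  - name: produce_file
    type: Bash
    configuration:
      affinityGroup: shared_node    # hypothetical group name
    execution:
      onExecute:
        - echo "data" > /tmp/build_artifact.txt
  - name: consume_file
    type: Bash
    configuration:
      affinityGroup: shared_node    # same group, so same node: the file is visible
      inputSteps:
        - name: produce_file
    execution:
      onExecute:
        - cat /tmp/build_artifact.txt
```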

Queuing Priority

You can assign a queuing priority to a step to ensure it gains appropriately ordered access to a build node when there are parallel steps in a pipeline or multiple pipelines are executing.

Timeout Duration

You can set a time limit for the step to complete execution, using the timeoutSeconds tag. If the step does not complete in the given number of seconds, the step will be forced to a completion state of failed.
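A sketch combining a queuing priority with a time limit (the values and command are illustrative, and the priority semantics depend on your Pipelines version):

```yaml
steps:
  - name: long_task
    type: Bash
    configuration:
      priority: 100              # queuing priority for node assignment
      timeoutSeconds: 1800       # fail the step if it runs longer than 30 minutes
    execution:
      onExecute:
        - ./run_tests.sh         # placeholder command
```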


Integrations

Your native step may require an integration to be declared, or you may wish to issue commands in your step that reference integrations, such as sending notifications to Slack. If so, you must declare the integrations in your step.


Inputs

When defining the first step in your pipeline, you will likely need to define triggering inputResources (such as a GitRepo or a Webhook) that will automatically commence a new run of the pipeline. Resources may be from any pipeline source. If the resource is part of a multi-branch pipeline source and from a different branch than the step, the branch must be specified for the resource in inputResources.

You should define the remainder of your steps so they execute in an interdependent sequence. This means that each step is configured so that its execution will be triggered by the successful completion of a prior, prerequisite step (or steps). In this way, step 1's completion will trigger the execution of step 2, completion of step 2 triggers execution of step 3, and so on until all steps in the pipeline are executed.

To do this, each step will need to define the inputSteps that trigger it in order to build the desired execution sequence.
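The dependency chain described above can be sketched as follows (the resource and step names are placeholders):

```yaml
pipelines:
  - name: chained_pipeline
    steps:
      - name: step_1
        type: Bash
        configuration:
          inputResources:
            - name: app_repo      # hypothetical GitRepo; a commit triggers the run
        execution:
          onExecute:
            - echo "build"
      - name: step_2
        type: Bash
        configuration:
          inputSteps:
            - name: step_1        # runs when step_1 completes successfully
        execution:
          onExecute:
            - echo "test"
      - name: step_3
        type: Bash
        configuration:
          inputSteps:
            - name: step_2
        execution:
          onExecute:
            - echo "deploy"
```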


Outputs

Your step will likely produce changes to a resource as a result of performing its task. You must declare these as outputResources. If the updated resource is part of a multi-branch pipeline source and from a different branch than the step, the branch must be specified for the resource in outputResources.
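For instance, a step that publishes build info might declare the BuildInfo resource it updates as an output (the step and resource names are hypothetical):

```yaml
steps:
  - name: publish_info
    type: PublishBuildInfo
    configuration:
      inputSteps:
        - name: build_step        # hypothetical upstream build step
      outputResources:
        - name: my_build_info     # hypothetical BuildInfo resource updated by this step
```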

Step Execution

Your automated pipeline issues shell commands to the execution node to perform the work. These are specified in each step's execution section during the specified execution stage of that step.

For example, in the generic Bash step:

        . . .
        execution:
          onStart:
            - echo "prepare for main execution"
          onExecute:
            - echo "executing task command 1"
            - echo "executing task command 2"
          onSuccess:
            - echo "Job well done!"
          onFailure:
            - echo "uh oh, something went wrong"
          onComplete:
            - echo "Cleaning up some stuff"


You may not declare an onExecute section in a native step as they provide their own onExecute to perform their native function.

In addition to the conventional shell commands, you can also make use of built-in Utility Functions to perform actions through integrations or preserve state information.

You can also leverage environment variables that are available during pipeline execution to flexibly automate any of your step execution actions.