The TriggerPipeline step enables one step in a parent pipeline to "embed" another pipeline and wait for the resulting run to complete before returning control to the parent pipeline's step. The parent pipeline then continues its run with access to additional information about the embedded run.
For example, in the image below, Step 2 in the parent pipeline X triggers the embedded pipeline Y. When pipeline Y is successful, it returns control to Step 2 in pipeline X, which triggers Step 3 and Step 4 in the pipeline.
Note
If the embedded run ends with a bad status (failure, timed out, cancelled), the TriggerPipeline step will end with a failure status.
Tip
Node Usage: The step that triggers the embedded pipeline occupies a dedicated node while processing, polling, and waiting for status. Keep this in mind when designing your embedded pipeline: depending on how often you use the step, your nodepool could run out of capacity to execute the embedded pipeline itself, leaving the run stuck. If you plan to make heavy use of this step type, consider assigning your TriggerPipeline steps to a separate nodepool to restrict the overall node capacity this step type can consume. Using a separate nodepool ensures that the pipeline does not go into a deadlock.
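As a sketch, a step can be pinned to a dedicated node pool with the `nodePool` configuration tag. The pool name `trigger_pool` below is an assumption; substitute any existing node pool that you have reserved for TriggerPipeline steps:

```yaml
pipelines:
  - name: parent_pipeline
    steps:
      - name: trigger_child
        type: TriggerPipeline
        configuration:
          nodePool: trigger_pool   # hypothetical pool reserved for TriggerPipeline steps
          pipelineName: child_pipeline
          stepName: first_step
          integrations:
            - name: myPlatformToken
```

With this layout, even if every node in `trigger_pool` is busy waiting, the default pool remains free to run the embedded pipelines themselves.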
Wait time: TriggerPipeline steps are subject to timeout just like any other step. Keep this in mind when choosing an embedded pipeline: if the embedded pipeline takes longer than the system timeout limit, your TriggerPipeline step will likely time out while waiting for the child to complete. One measure taken to mitigate this is that the TriggerPipeline step's timeout timer is reset when the embedded run moves from a "waiting" status to a "processing" status.
Multiple TriggerPipeline steps: There is no limit to how many TriggerPipeline steps you can include in your pipelines, but keep in mind that it can be easy to find yourself waiting on multiple tiers of embedded runs. Before using this step type, consider if your scenario is better suited to simply connecting two pipelines via resource.
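For comparison, here is a minimal sketch of connecting two pipelines through a resource instead, using a PropertyBag resource (all names below are illustrative). The downstream step lists the resource as a triggering input, so it starts when the upstream step updates the resource, with no node held waiting in between:

```yaml
resources:
  - name: handoff_bag          # hypothetical PropertyBag linking the two pipelines
    type: PropertyBag
    configuration:
      version: "0"

pipelines:
  - name: upstream_pipeline
    steps:
      - name: build_it
        type: Bash
        configuration:
          outputResources:
            - name: handoff_bag
        execution:
          onExecute:
            - write_output handoff_bag "version=${run_number}"

  - name: downstream_pipeline
    steps:
      - name: consume_it
        type: Bash
        configuration:
          inputResources:
            - name: handoff_bag   # a new version of this resource triggers the step
        execution:
          onExecute:
            - echo "Triggered by an update to handoff_bag"
```

The trade-off: resource chaining is asynchronous, so the upstream pipeline gets no status back from the downstream run, whereas TriggerPipeline waits and surfaces the embedded run's status.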
Embedding Pipelines
To create and run embedded pipelines, you will need:
At least two build nodes
A JFrog Platform Access Token integration
The TriggerPipeline native step added to your pipeline YAML
YAML
The YAML for embedded pipelines supports the following parameters:
Note
Each of these fields supports placeholders like {{gitBranch}} or any ${environment_variable}.
| Parameter | Description | Required/Optional |
|---|---|---|
| pipelineName | Name of the embedded pipeline that needs to be triggered. | Required |
| stepName | Name of the step in the embedded pipeline that needs to be triggered. | Required |
| projectKey | If the step you want to trigger belongs to a Project, it is recommended to use the projectKey to ensure the correct step is triggered. | Optional |
| branchName | If the step you want to trigger belongs to a multibranch pipeline, this parameter is necessary to distinguish which branch you want to trigger. | Required if target is multibranch |
TriggerPipeline
```yaml
pipelines:
  - name: <string>
    steps:
      - name: <string>
        type: TriggerPipeline
        configuration:                  # inherits all the tags from bash
          pipelineName: <string>        # required
          stepName: <string>            # required
          branchName: <string>          # optional. required if target is multibranch.
          projectKey: <string>          # optional. recommended if target belongs to a project.
          integrations:
            - name: <JFrog Platform Token integration>   # required
        execution:
          onStart:
            - echo "Preparing for work..."
            - set_trigger_payload stepVariables "test=true"
            - set_trigger_payload pipelineVariables "notify=true" "version=5.4.3"
            - export pipelines_poll_interval_seconds=30   # defaults to 10
          onSuccess:
            - echo "Done!"
          onFailure:
            - echo "Something went wrong"
          onComplete:
            - echo "Cleaning up some stuff"
```
Example
Here is an example showing embedded pipelines. In this example, top_pipeline is the parent pipeline and scanner_pipeline is the child pipeline.
```yaml
pipelines:
  - name: top_pipeline
    steps:
      - name: scan_controller
        type: TriggerPipeline
        configuration:
          pipelineName: scanner_pipeline
          stepName: scan_it
          integrations:
            - name: myPlatformToken
          environmentVariables:
            scan_target:
              default: "hello-world"
              allowCustom: true
              values:
                - "vault"
                - "redis"
                - "postgresql"
                - "hello-world"
        execution:
          onStart:
            - set_trigger_payload pipelineVariables "scan_target=${scan_target}"
            - set_trigger_payload stepVariables "notify=email" "uploadReport=true"
          onComplete:
            - echo "Final status is $nested_run_status"

  - name: scanner_pipeline
    steps:
      - name: scan_it
        type: Bash
        execution:
          onExecute:
            - echo "Image to scan is $scan_target."
            - echo "Triggered by parent step at $parent_step_url"
```
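The $nested_run_status variable read in the parent's onComplete block reports the embedded run's final status. A sketch of branching on it inside the parent step's execution block (the exact status string "success" is an assumption based on typical run statuses):

```yaml
        execution:
          onComplete:
            - |
              if [ "$nested_run_status" = "success" ]; then
                echo "Embedded run passed"
              else
                echo "Embedded run ended with status: $nested_run_status"
              fi
```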