A Pipeline Source represents a source control repository (such as GitHub or Bitbucket) where Pipelines definition files can be found. A pipeline source connects to the repository through an integration. After a pipeline source is added, Pipelines automatically loads all config files from the repository that match the specified filename filter.
Note
Administrator users can create, view, edit and sync pipeline sources. Non-administrator users can only view and sync pipeline sources.
Administering Pipeline Sources
Only users with administrator privileges can manage pipeline sources.
Requirements
To successfully add a pipeline source, ensure the following:
The credentials used in the source control integration should have admin access to the specified source control repository. Admin privileges are used to add a webhook that allows Pipelines to be notified of changes in the repository.
Pipelines 1.26.0 and lower: The repository path is valid. To determine the exact path to specify, inspect the git clone URL of the repository in your source control system and copy the value.
Here are some examples:
GitHub / GitHub Enterprise: https://github.com/myuser/basic-pipeline.git - use myuser/basic-pipeline
Bitbucket Server (Private repository): https://git.mycompany.com/scm/project-id/repo-name.git - use project-id/repo-name
Bitbucket Server (Individual repository): https://git.mycompany.com/scm/~myuser/myfirstrepo.git - use ~myuser/myfirstrepo
Bitbucket: https://username@bitbucket.org/teamspace/test-repo.git - use teamspace/test-repo
GitLab: https://gitlab.com/user1/repo1.git - use user1/repo1
The branch name is valid and the credentials used in the Git integration for the pipeline source have access to the branch.
Add an integration for the source control system where your pipeline file repository is (or will be) located. The integration can be for any supported source control provider, such as GitHub, GitHub Enterprise, Bitbucket, Bitbucket Server, or GitLab.
Adding a Pipeline Source (1.31.0 and higher)
To add a source control repository as a Pipeline Source:
Go to Administration | Pipelines | Pipeline Sources.
In the resulting Pipeline Sources display, click Add Pipelines Source and click From YAML.
In the Add YAML Pipeline Source page, click one of the following:
Single Branch
Multi Branch
Click to select the protocol type to be used for cloning the repository when the pipeline source is synced:
SSH
HTTPS
Complete the Single/Multi Branch form:
Field | Description |
---|---|
Name | Enter a unique name for the pipeline source. |
SCM Provider Integration | Click the SCM Provider Integration field and select your source control integration from the dropdown list. Only integrations that are compatible for use as a Pipeline Source will be included in the dropdown list. |
Repository Full Name | Based on the information you have provided for the selected integration, such as the API token, the relevant repositories are listed in the Repository Full Name field. Select the path of the repository where your pipeline files are stored. If the name of the repository is not auto-fetched, enter the full name of the repository manually. |
Branch (Single Branch only) | Based on the SCM provider and Repository Full Name you have provided, all the available branches are automatically fetched (for example, main). Select the required branch. If the name of the branch is not auto-fetched, enter it manually. |
Exclude (Multi Branch only) | Specify the Exclude Branch Pattern as a regular expression for the matching branch names to exclude (see the example patterns after this table). |
Include (Multi Branch only) | Specify the Include Branch Pattern as a regular expression for the matching branch names to include (see the example patterns after this table). |
Sync all branches toggle (Multi Branch only) | When a new multi branch pipeline source is added, only the default branch is synced. To sync all the branches that match the Exclude/Include patterns, click the Sync all branches toggle. If no Exclude/Include pattern is provided, all branches are synced. In addition, when a new commit is pushed to a matching branch, that particular branch is synced. |
Folder Name | Provide the name of the directory where the YAML config files are available. See the directory structure guidance below. |
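For example, with a Multi Branch source you might sync only the main and release branches while skipping feature branches. The patterns below are illustrative placeholders, not values from this guide; adjust them to your own branching scheme:

```
Include Branch Pattern: ^(main|release-.*)$
Exclude Branch Pattern: ^feature-.*$
```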
To use Folder Name, place your YAML files in a directory named .jfrog-pipelines in the SCM repository. This directory can be at the root or one level below the root. Directories beyond this level are not supported. The following directory structures are possible for monorepos:
- root
------ .jfrog-pipelines
Or
- root
---- .jfrog-pipelines
-------- service1
-------- service2
Or
- root
---- service1
--------- .jfrog-pipelines
---- service2
--------- .jfrog-pipelines
The .jfrog-pipelines directory can contain any number of YAML files. To parse all YAML files in the root directory, enter ".". However, other pipeline sources cannot then point to this repository.
You can create multiple pipeline sources pointing to the same SCM repository as long as the directory names are different (and none of them point to root).
Recommended Directory Structure for using Folder Name
Here are some examples of the supported directory structure:
YAML Files Location | Folder Name Path |
---|---|
YAML files are in the .jfrog-pipelines root directory | Enter . (dot) or .jfrog-pipelines in the Folder Name field to fetch all the YAML files. |
YAML files are in the .jfrog-pipelines/project1 directory | Enter .jfrog-pipelines/project1 in the Folder Name field to fetch the YAML files in the .jfrog-pipelines/project1 directory. |
YAML files are in the following directories: .jfrog-pipelines, .jfrog-pipelines/project1, .jfrog-pipelines/project2 | Enter . (dot) or .jfrog-pipelines in the Folder Name field to fetch all the YAML files in all directories. Alternatively, enter .jfrog-pipelines/project1 in the Folder Name field to fetch the YAML files in the .jfrog-pipelines/project1 directory, or enter .jfrog-pipelines/project2 to fetch the YAML files in the .jfrog-pipelines/project2 directory. |
YAML files are in the following directories: .jfrog-pipelines, service1/.jfrog-pipelines, service2/.jfrog-pipelines | Enter . (dot) or .jfrog-pipelines in the Folder Name field to fetch the YAML files in the .jfrog-pipelines directory; YAML files in the service1/.jfrog-pipelines and service2/.jfrog-pipelines directories will not be included. Alternatively, enter service1/.jfrog-pipelines in the Folder Name field to fetch the YAML files in the service1/.jfrog-pipelines directory, or enter service2/.jfrog-pipelines to fetch the YAML files in the service2/.jfrog-pipelines directory. |

If you have a monorepo with multiple services within a single repo, a directory structure such as the following is recommended:
Existing Directory Structure:
- Root
---- Build/ci
---- Service1
-------- Build/ci/pipe.yaml
---- Service2
-------- Build/ci/pipe.yaml

Recommended Directory Structure:
- Root
---- .jfrog-pipelines
-------- Service1/pipe.yaml
-------- Service2/pipe.yaml
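In the recommended layout, each per-service config file (for example, .jfrog-pipelines/Service1/pipe.yaml) is a regular Pipelines DSL file. A minimal sketch is shown below; the resource name, repository path, and gitProvider integration name are hypothetical placeholders:

```yaml
resources:
  - name: service1_repo              # hypothetical resource name
    type: GitRepo
    configuration:
      path: myorg/my-monorepo        # hypothetical repository full name
      gitProvider: myGithub          # hypothetical SCM integration name
      branches:
        include: main

pipelines:
  - name: service1_pipeline
    steps:
      - name: build_service1
        type: Bash
        configuration:
          inputResources:
            - name: service1_repo    # run this step when the repo changes
        execution:
          onExecute:
            - echo "Building Service1"
```

Because each service keeps its own file under .jfrog-pipelines, a single pipeline source whose Folder Name is .jfrog-pipelines picks up all of them.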
Local Artifactory Token Permissions
When a pipeline source is added to a custom Project, all pipelines associated with this pipeline source have access to a local Artifactory integration by default. This integration enables you to connect to your local Artifactory instance to push artifacts, without having to explicitly create an Artifactory integration. However, unlike the custom Artifactory integration, which can be used to connect to any Artifactory instance, the local Artifactory integration can only connect to your local Artifactory instance. In addition, while the custom Artifactory integration uses an API key for authentication and permissions, the local Artifactory integration uses a scoped token.
Important
Local Artifactory integration has the following limitations:
It can only be used in custom Projects. The default Project does not support this integration.
It can only be used with repositories created within custom Projects.
It can only be used in Bash steps.
Permissions
Read permission is enabled by default for all local Artifactory integrations. When required, you can add the following additional permissions for your pipeline source. All pipelines associated with that pipeline source will then use these permissions:
Read: Download artifacts (available by default)
Write: Upload or update artifacts
Delete: Delete artifacts
Manage: Change the permission settings for other users
Usage in Pipelines YAML
The local Artifactory integration can be added as an input to any Bash step. To do this, use {{ .jfrog-pipelines.localArtifactory }} as a placeholder for the integration. It can be used at both the pipeline level and the step level.

When you add this integration directly to a Bash step, the JFROG_LOCAL_ARTIFACTORY_TOKEN environment variable is automatically made available at runtime. This environment variable creates a scoped token with read permissions. When the pipeline YAML is added as a pipeline source, the permissions for the token can be elevated to Write, Delete, and/or Manage. If the pipeline source includes more than one pipeline, the new permissions will be applicable for all the pipelines in the source.

```yaml
resources:
  - name: myRepo_native
    type: GitRepo
    configuration:
      path: jfrog/jfrog_pipelines
      gitProvider: myGithub
      branches:
        include: master

pipelines:
  - name: myPipeline_native
    steps:
      - name: first_step
        type: Bash
        configuration:
          integrations:
            - name: {{.jfrog-pipelines.localArtifactory}}
        execution:
          onExecute:
            - echo "first_step"
            - jfrog config add jftest-$step_id --url https://pipelines.jfrog.io --access-token ${JFROG_LOCAL_ARTIFACTORY_TOKEN}
            - touch test.txt
            - echo "hello" > test.txt
            - jfrog rt upload test.txt demo-sampleRepo/test.txt --server-id jftest-$step_id
      - name: second_step
        type: Bash
        configuration:
          inputSteps:
            - name: first_step
        execution:
          onExecute:
            - echo "second_step"
            - echo "$JFROG_LOCAL_ARTIFACTORY_TOKEN"
```
Validating YAML
Before adding a pipeline source, you have the option of validating your pipelines, resources, and values YAMLs and then committing them to the SCM. This enables you to get instant feedback on pipeline sync errors. The YAML validator checks your YAML for both semantic and syntactic errors.
To validate your YAML:
Click Validate YAML.
Enter or paste the YAML contents in the Pipelines YAML and Resources YAML fields. If you have a values YAML, enter its contents in the Values YAML field. For more information about the Pipelines DSL, see Defining a Pipeline.
As the YAML content is entered, it is validated and syntactic errors, if any, are displayed.
After fixing all the syntactic errors, click Validate to validate the YAML for semantic errors.
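As a quick illustration, and assuming the standard {{ .Values.* }} templating for values files, you might paste content such as the following into the Pipelines YAML and Values YAML fields. All names and values here are hypothetical:

```yaml
# Pipelines YAML (illustrative)
pipelines:
  - name: validator_demo
    steps:
      - name: say_hello
        type: Bash
        execution:
          onExecute:
            - echo "Hello, {{ .Values.greetingTarget }}"
```

```yaml
# Values YAML (illustrative)
greetingTarget: world
```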
Viewing Pipeline Sources
To view the list of pipeline sources already added to Pipelines, go to Administration | Pipelines | Pipeline Sources or Application | Pipelines | Pipeline Sources.
The page displays the list of pipeline sources that are available to you according to the permission targets defined in the JFrog Platform. Your user account must be granted permissions for a pipeline source for it to be shown.
Each row of the pipeline sources list includes the following:
Property | Description |
---|---|
Name | The name of the pipeline source |
Git Repository | The source repository path of the pipeline source |
Project | The Project that the pipeline source belongs to |
Latest Status | The success/failure status of the last sync |
Config File Filter | The filter string for the pipeline config files |
Last Sync | The time and date of the last sync |
Changed By | Name of the user who made the last update to the pipeline source |
Context | The commit SHA that triggered the last sync |
Logs | Click the Logs link to view the log from the last sync. Use this to diagnose a failure to sync a pipeline source. |
If the pipeline source is a multi branch source, the row presents aggregate information for all branches, and can be expanded/collapsed to show the sync status of each branch.
Syncing a Pipeline Source
When any of the pipeline config files have changed, you will need to sync the pipeline source to reload them:
Go to Administration | Pipelines | Pipeline Sources and click the Actions button located at the far right.
Click Sync.