How to migrate from Bamboo to Bitbucket Pipelines

CI/CD offerings at Atlassian

Atlassian currently offers two primary continuous integration and delivery (CI/CD) options: Bitbucket Pipelines (built into Bitbucket Cloud) and Bamboo Data Center. As Bamboo Data Center approaches its end of life on March 28, 2029, we want to help you migrate to Pipelines and maintain smooth, reliable workflows.

Bitbucket Pipelines is built into Bitbucket Cloud as its native CI/CD, so when migrating from Bamboo to Pipelines, there is no need to install or manage a separate CI/CD product. Pipelines supports both Atlassian-managed ephemeral cloud runners for on-demand scalability and self-hosted runners (similar to Bamboo agents) for workloads requiring firewall-restricted environments. Additionally, you’ll be able to seamlessly extend your workflows using Pipes, while managing your pipelines at scale to meet your compliance needs. Learn more about the benefits of migrating to Bitbucket Pipelines.

To save your team time, Atlassian provides automated tooling to assist with this transition. This includes the Bitbucket Cloud Migration Assistant (BCMA), a tool to securely transfer your source code, commit history, and pull request metadata to the cloud, as well as the Bitbucket Pipelines Importer, an automated migration tool designed to translate your Bamboo YAML specs (and Jenkins configuration files) directly into Bitbucket Pipelines YAML files. The importer is currently in Beta for Bamboo migrations.

Migration best practices

To ensure a smooth transition from Bamboo Data Center to Bitbucket Cloud, we recommend this structured 6-step migration path.

The 6-step migration path

  1. Pre-migration discovery & planning: Conduct a structured inventory of your Bamboo environment to understand the complexity and effort required. Use the Bitbucket Pipelines Importer's audit command to scan your footprint, identify unsupported tasks, and generate a clear readiness report, and its forecast command to project your future consumption costs based on historical build times and optimal runner sizes.

  2. Repository migration (if required): Bitbucket Pipelines requires repositories to be hosted in Bitbucket Cloud. If your source code is hosted on Bitbucket Data Center, use the Bitbucket Cloud Migration Assistant to securely transfer your code, commit history, and pull request metadata to the cloud. For other source code management tools, use Bitbucket Cloud’s repository importer feature.

  3. Generate a baseline: Run the Bitbucket Pipelines Importer's migrate command to automatically translate your existing CI/CD platform setups into Bitbucket Pipelines YAML. This will output a baseline bitbucket-pipelines.yml file for your project or individual plans.

  4. Refine and resolve: You should plan for manual follow-up on every migrated plan, even when the importer reports success. Review the generated YAML and resolve any placeholder comments (#TODO) left by the tool. Apply manual updates for elements the tool cannot automate, such as deployments, secrets (password variables), and self-hosted runners. Use the technical patterns detailed in Sample conversion section below as your reference guide for translating custom Bamboo logic into Bitbucket-native YAML.

  5. Test the execution: Commit the refined bitbucket-pipelines.yml file to a feature branch. Run the pipeline to verify that the build, plugin integrations, and test suites pass successfully in the Bitbucket Cloud environment.

  6. Cutover and decommission: Once the pipeline execution is verified, merge the configuration into your main branch. Enable your new CI/CD workflows in Bitbucket and safely decommission the legacy Bamboo plan.
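The tool-driven steps above (steps 1 and 3) boil down to two importer invocations. A condensed sketch using the flags documented later in this guide; replace <url>, <access token>, and <project-key> with your own values, and note that the /files output paths are illustrative:

```shell
# Step 1: audit your Bamboo footprint and generate a readiness report
docker run -it -v ${PWD}:/files --rm atlassian/bitbucket-pipelines-importer \
  audit bamboo --output-dir /files/audit --bamboo-instance-url <url> \
  --bamboo-access-token <access token>

# Step 3: generate baseline bitbucket-pipelines.yml files for a whole project
docker run -it -v ${PWD}:/files --rm atlassian/bitbucket-pipelines-importer \
  migrate bamboo --project-key <project-key> --output-dir /files/output \
  --bamboo-instance-url <url> --bamboo-access-token <access token>
```

Both commands require a running Docker daemon and network access to your Bamboo instance.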

💡 Pro tip 1: Phased migration strategy

If your builds rely on mission‑critical workflows that can’t be recreated easily or moved without downtime, we recommend a phased or hybrid approach. Use the atlassian/bamboo-trigger-build pipe to begin moving code and simple steps to Bitbucket Pipelines while keeping mission-critical steps running in Bamboo temporarily. This lets you progress migration work without risking service interruptions.

💡Pro tip 2: Eliminate boilerplate with Bitbucket Pipes

Don’t forget that you can enhance your CI/CD workflows and minimize boilerplate code with our powerful CI/CD integrations, known as Pipes. Choose from our curated list of 100+ Pipes, or create and share your own custom solutions within your organization.

💡 Pro tip 3: Get familiar with Bitbucket Pipelines features to get the most value from your migration.

Explore the available features, including caches, artifacts, deployments/environments, parallel steps, and secured variables (including OIDC). Use the docs and examples to speed up adoption.
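As a sketch of how several of these features fit together in a single bitbucket-pipelines.yml (the step names, cache choice, and deploy script are illustrative, not prescribed):

```yaml
image: atlassian/default-image:5

pipelines:
  default:
    - parallel:
        # Parallel steps run concurrently on separate runners
        - step:
            name: Build
            caches:
              - node            # reuse downloaded dependencies across runs
            script:
              - npm ci && npm run build
            artifacts:
              - dist/**         # pass build output to later steps
        - step:
            name: Test
            script:
              - npm test
    - step:
        name: Deploy to staging
        deployment: staging     # tracked environment with its own secured variables
        oidc: true              # request an OIDC token instead of storing cloud keys
        script:
          - ./deploy.sh dist/   # illustrative deploy script
```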

Bitbucket Pipelines Importer

The Bamboo migration tooling is currently in Beta.

To migrate from Bamboo Data Center (DC) to Bitbucket Pipelines, you’ll need to translate each of your Bamboo build plans into the YAML format that Bitbucket Pipelines can understand. To save time converting from one format to another, we’ve created an automated migration tool that translates most Bamboo YAML specs into Bitbucket Pipelines configuration files.

In the following sections, we’ll cover the topics you need to navigate the migration process successfully.

Limitations

The automated migration tool only supports Bamboo YAML specifications. Java Specs are not supported.

The Bitbucket Pipelines Importer is designed to migrate from Bamboo Data Center to Bitbucket Cloud, but because Bamboo Data Center allows for extensive customization through third-party Marketplace apps (also referred to as plugins or tasks), no automated tool can reliably translate every possible configuration. Below are known limitations of the tool:

  • The tool supports only exported YAML specs. Scripted pipelines (Java Specs) and any settings or configurations that Bamboo does not support exporting to YAML will not be migrated.

  • When the tool encounters a trigger with a condition, the trigger is migrated without it, and the original condition is surfaced as a comment for your reference. Conditions are never recreated automatically; you will need to add them back to the appropriate trigger(s) manually.

  • Options to explicitly clean up build workspaces are omitted, as Bitbucket's ephemeral cloud and self-hosted runners handle this automatically.

  • If an entire Bamboo plan is disabled, it is safely migrated as a no-op (no operation) action in Pipelines.

  • Disabled tasks are not included in the plan export when using the Bamboo API and are excluded from the migration by default.

  • Customized server settings for artifact storage are streamlined into the standard Bitbucket Pipelines artifact construct to share files between steps.

  • Unrecognized third-party apps will not be translated and will require manual conversion. We do, however, continually update the importer to automatically translate the most common Bamboo actions. View the current list of supported plugins.
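For example, a trigger whose condition restricted builds to a particular branch can be re-created manually as a branch-specific pipeline. A sketch (not importer output; the branch name is illustrative):

```yaml
pipelines:
  branches:
    # Re-creates a Bamboo trigger condition such as "branch matches ^main$"
    main:
      - step:
          name: Build on main only
          script:
            - echo "Runs only for pushes to main"
```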

To provide clear visibility into what was and wasn’t migrated, the importer surfaces limitations as comments in the generated YAML and in the tool’s logs, so you can quickly review and address any manual follow-ups.

Installation

Below are two ways to install and run the importer:

Docker Image (Recommended)

A self-contained, ready-to-run image that requires minimal setup. It automatically translates all natively supported Bamboo constructs and common plugins out of the box. Most users should choose the Docker image for simplicity and ease of use.

Prerequisites

  • Docker: Ensure Docker is installed and running on your system.

  1. Pull the latest beta Docker image: This step is required only if you have not pulled the importer’s Docker image already or want to ensure you have the latest version.

    docker pull atlassian/bitbucket-pipelines-importer:latest
  2. Run the following command to test that the Docker image is working correctly and to view the available options.

    docker run -it --rm atlassian/bitbucket-pipelines-importer --help

Source Code (Advanced)

If your Bamboo instance relies heavily on unsupported tasks, we recommend automating and customizing the migration to fit your needs. You can do this by downloading the importer’s source code and extending its logic to support your specific Bamboo apps. Please refer to README.md to build the migration tool from the source code.

Audit command

Before executing a large-scale migration, you can use the Bitbucket Pipelines Importer's built-in audit command to get a clearer picture of your entire CI footprint. The tool connects to your Bamboo instance, downloads your Bamboo build plans, and generates a comprehensive readiness report.

    export MOUNT_PATH=$PWD
    docker run -it -e MOUNT_PATH=$MOUNT_PATH -v ${PWD}:/files --rm atlassian/bitbucket-pipelines-importer \
      audit bamboo --output-dir /path/to/output --bamboo-instance-url <url> \
      --bamboo-access-token <access token>

To run the audit, forecast, and migrate commands, your Bamboo access token must have View, View configuration, and Edit Plans permissions. While the importer is strictly read-only and never modifies your plans, the Edit Plans permission is a technical requirement of the Bamboo REST API to export the data.

Details on how Docker volume mounts work

For users unfamiliar with how Docker volume mounts work, ${PWD}:/files mounts your current local directory to the /files directory inside the Docker container, allowing the CLI to read from and write to your host machine. Because of this mapping, any input or output file paths in your command must be prefixed with /files/ (e.g., /files/bamboo.yaml) so the container can locate them.
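As an illustration of the mapping (the file name bamboo.yaml is a placeholder for your exported spec):

```shell
# Host file ./bamboo.yaml is visible inside the container as /files/bamboo.yaml
docker run -it -v ${PWD}:/files --rm atlassian/bitbucket-pipelines-importer \
  migrate bamboo -i /files/bamboo.yaml -o /files/bitbucket-pipelines.yml
# The result lands back on the host as ./bitbucket-pipelines.yml
```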

Understanding the Audit Summary (audit_summary.md)

The audit_summary.md file sits at the root of your output and provides a high-level breakdown of your migration readiness: automated conversion rates, unsupported plugins, and the specific elements you will need to configure manually.

It contains the following sections:

  • Audit info: Displays the timestamp when the audit was generated, the target Bamboo base URL, the version of the Importer CLI used, and the overall migration score.

  • Pipelines: Shows the total number of evaluated plans. It categorizes them into percentages of Migrated (fully automated migration), Needs Review (needs manual fixes), Unsupported (cannot be automated), or Failed (parsing errors).

  • Tasks: Displays the counts and percentages of supported and unsupported Bamboo tasks. It also lists each specific task type (like maven or script) and how many times it appeared across the audited plans.

  • Triggers: Outlines the count and percentage of supported versus unsupported triggers, detailing the specific trigger types (like cron or bitbucket-server-trigger) encountered.

  • Migrated plans: A directory mapping showing the name of each Bamboo plan and its relative path to the newly migrated pipelines.yml file.

Forecast command

Moving from Bamboo Data Center to Bitbucket Pipelines introduces consumption-based billing based on build minutes and runner sizes. To help estimate these costs, the Bitbucket Pipelines Importer includes a forecast command that analyzes your historical build times and Bamboo agent specs, then maps your workloads to optimal Bitbucket Cloud runners. It generates a report with details about expected billed minutes and runner options to help you understand costs and performance trade-offs.

Forecast a single Bamboo Project

Use the following command to forecast Bitbucket Pipelines costs for a specific project for Q1 2026.

    export MOUNT_PATH=$PWD
    docker run -it -e MOUNT_PATH=$MOUNT_PATH -v ${PWD}:/files --rm atlassian/bitbucket-pipelines-importer \
      forecast bamboo --bamboo-instance-url <url> --project-key <project-key> \
      --from-date 2026-01-01 --to-date 2026-03-31 \
      --agent-config cpu=4,memory=8 \
      --output-dir /path/to/output \
      --bamboo-access-token <access token>

Understanding the Forecast output

The forecast command generates a comprehensive report summarizing your predicted Bitbucket Pipelines usage based on your historical build data for the specified time period.

The tool evaluates agent requirements using a mix of pre-configured mappings and configuration fallbacks:

  • For Elastic Agents, the tool uses mapped EC2 instance types.

  • For other agents, the tool relies on the manual agent configuration provided as input via the --agent-config flag.

The output is organized into the following key sections:

  • Forecast Summary: Details the exact date range evaluated, the Bamboo instance analyzed, and the total number of plans, jobs, and builds processed. It also highlights the overall estimated pipelines cost when using the automatically recommended "Best Match" runner.

  • Execution Info: Breaks down your historical build durations in minutes, providing metrics such as total, average, P50, P90, minimum, and maximum execution times for the specified period. The actual minutes are then used to calculate the cost.

  • Cost for Cloud Runners (Trade-off Matrix): Provides a side-by-side comparison of all available Bitbucket runner sizes (from 1X up to 32X). For each runner, it displays the estimated cost, average execution time, cost differences compared to the Best Match runner, and how much faster or slower the builds will be compared to your existing Bamboo performance.

Bitbucket Pipelines is a fully managed service. The estimate can't be accurately compared to your current Bamboo infrastructure costs without accounting for your total cost of ownership (TCO), including licensing, infrastructure, and maintenance costs.

Migration command

Once you have audited your footprint, it is time to execute the migration. The Bitbucket Pipelines Importer's migrate command translates your existing CI/CD platform setups into Bitbucket Pipelines YAML.

The tool offers flexible usage modes to fit your workflow. You can perform a direct, local conversion of an exported Bamboo plan file, or connect to your Bamboo server to migrate entire projects or specific individual plans at scale.

Migrate a single Bamboo plan

Option A: Local file conversion

For local file conversions, only the input (-i) and output (-o) file paths are needed.

    export MOUNT_PATH=$PWD
    docker run -it -e MOUNT_PATH=$MOUNT_PATH -v ${PWD}:/files --rm atlassian/bitbucket-pipelines-importer \
      migrate bamboo \
      -i /files/BambooYamlfile \
      -o /files/bitbucket-pipelines.yml

Option B: Fetch from the server

Alternatively, you can let the tool fetch the plan directly from your Bamboo server using the --plan-key flag.

    export MOUNT_PATH=$PWD
    docker run -it -e MOUNT_PATH=$MOUNT_PATH -v ${PWD}:/files --rm atlassian/bitbucket-pipelines-importer \
      migrate bamboo \
      --plan-key <plan-key> \
      --output-dir /path/to/output \
      --bamboo-instance-url <url> \
      --bamboo-access-token <access token>

Migrate an entire Bamboo project

To migrate all plans within a specific project at once, use the --project-key flag. In this example, the server credentials are passed directly as flags rather than relying on the configure profile.

    export MOUNT_PATH=$PWD
    docker run -it -e MOUNT_PATH=$MOUNT_PATH -v ${PWD}:/files --rm atlassian/bitbucket-pipelines-importer \
      migrate bamboo \
      --project-key <project-key> \
      --output-dir /path/to/output \
      --bamboo-instance-url <url> \
      --bamboo-access-token <access token>

It’s important to review and validate the Bitbucket Pipelines files generated, as some custom extensions will require additional manual configuration. Use our syntax examples.

Sample conversion

The following examples illustrate the automated conversion logic and provide a syntax comparison to help you understand the transition from Bamboo to Bitbucket Pipelines.

Check the output directory for the generated YAML file.

Running a Hello World build

The default pipeline block runs on every push to any branch not matched by a more specific rule. No separate trigger configuration is needed.

Image: The image field replaces Bamboo's agent capability matching. Instead of tagging agents and adding requirements to plans, you simply name the Docker image your step runs in. Use any public Docker Hub image, or your own private image by adding a credentials block.

Parallel Jobs: Bamboo's Stage feature runs multiple Jobs in parallel within a stage. In Pipelines the parallel keyword groups steps that run concurrently on separate cloud runners.

Bamboo YAML Spec

Pipelines YAML

    ---
    version: 2
    plan:
      project-key: PROJ
      key: PLAN
      name: Example Plan
    stages:
      - Stage 1:
          jobs:
            - Build and Test
            - Lint
            - Security scan
    Build and Test:
      tasks:
        - script:
            - echo "Your build and test goes here..."
    Lint:
      tasks:
        - script:
            - echo "Your linting goes here..."
    Security scan:
      tasks:
        - script:
            - echo "Your security scan goes here..."
    image: atlassian/default-image:5
    pipelines:
      default:
        - parallel:
            - step:
                name: 'Build and Test'
                script:
                  - echo "Your build and test goes here..."
            - step:
                name: 'Lint'
                script:
                  - echo "Your linting goes here..."
            - step:
                name: 'Security scan'
                script:
                  - echo "Your security scan goes here..."
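The private-image option mentioned above is configured with a credentials block on the image. A sketch, assuming the image name and the two secured repository variables are yours to define:

```yaml
image:
  name: my-dockerhub-account/private-build-image:1.0
  username: $DOCKER_HUB_USERNAME   # secured repository variable (illustrative name)
  password: $DOCKER_HUB_PASSWORD   # secured repository variable (illustrative name)

pipelines:
  default:
    - step:
        script:
          - echo "This step runs inside the private image"
```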

Variables

Bitbucket Pipelines categorizes variables in the following way:

  • Default variables: Pipelines provides a set of default variables that are available for builds and can be used in scripts. For example, $BITBUCKET_BUILD_NUMBER.

  • Workspace variables: Variables specified for a workspace can be accessed from all repositories that belong to the workspace.

  • Repository variables: Pipelines variables added at the repository level can be used by any user who has write access in the repository.

  • Deployment variables: You can also define variables so that they can only be used in a specific deployment environment.

It’s also worth mentioning that Bitbucket Pipelines supports OIDC (OpenID Connect) to allow accessing third-party and internal applications securely without storing credentials.
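A step opts in with oidc: true, after which Pipelines exposes an identity token in $BITBUCKET_STEP_OIDC_TOKEN that a cloud provider can be configured to trust. A sketch, with an illustrative AWS role ARN:

```yaml
pipelines:
  default:
    - step:
        name: Deploy without stored cloud keys
        oidc: true    # request an OIDC identity token for this step
        script:
          # Exchange the token for short-lived AWS credentials (role ARN is illustrative)
          - export AWS_ROLE_ARN=arn:aws:iam::123456789012:role/pipelines-deploy
          - export AWS_WEB_IDENTITY_TOKEN_FILE=$(pwd)/web-identity-token
          - echo $BITBUCKET_STEP_OIDC_TOKEN > $(pwd)/web-identity-token
          - aws s3 ls   # the AWS CLI picks up the web identity credentials
```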

Bamboo categorizes variables based on their scope as follows:

  • Build-specific variables: Evaluated dynamically at build time from Bamboo properties or plugins. For example ${bamboo.planKey}.

  • Deployment variables: Available during project deployment.

  • System variables: Apply across the entire Bamboo instance, inheriting values from system or environment variables.

  • Global variables: Defined at the instance level with static values for every plan.

  • Project variables: Defined for specific projects, capable of overriding global variables.

  • Plan variables: Defined for specific plans, capable of overriding global and project variables, and can be manually overridden during builds.

With these fundamental differences in how variables are set, here are some examples of how to migrate a Bamboo plan to Bitbucket Pipelines.

Mapping default system variables

The migration tool automatically maps some of Bamboo's built-in system variables to their corresponding Bitbucket Pipelines environment variables. Refer to the supported mapping table below for details.

Bamboo YAML Spec

Pipelines YAML

    ---
    version: 2
    plan:
      project-key: PROJ
      key: PLAN
      name: Example Plan
    stages:
      - Stage 1:
          jobs:
            - Example Job
    Example Job:
      tasks:
        - script:
            - echo "Running ${bamboo.buildNumber} on ${bamboo.planRepository.repositoryUrl}"
    image: atlassian/default-image:5
    pipelines:
      default:
        - step:
            name: Example
            script:
              - echo "Running $BITBUCKET_BUILD_NUMBER on $BITBUCKET_GIT_HTTP_ORIGIN"

Bamboo to Bitbucket Pipelines system variable mapping

The following conversions are automatically handled by the migration tool.

Bamboo Variable                                  Bitbucket Pipelines Equivalent
${bamboo.buildNumber}                            $BITBUCKET_BUILD_NUMBER
${bamboo.repository.revision.number}             $BITBUCKET_COMMIT
${bamboo.planRepository.branchName}              $BITBUCKET_BRANCH
${bamboo.buildResultKey}                         $BITBUCKET_PIPELINE_UUID
${bamboo.deploy.environment}                     $BITBUCKET_DEPLOYMENT_ENVIRONMENT
${bamboo.agentWorkingDirectory}                  $BITBUCKET_CLONE_DIR
${bamboo.buildTimeStamp}                         $BITBUCKET_STEP_TRIGGERER_UUID (no direct equivalent)
${bamboo.repository.git.repositoryUrl}           $BITBUCKET_GIT_HTTP_ORIGIN
${bamboo.repository.pr.key}                      $BITBUCKET_PR_ID
${bamboo.repository.pr.targetBranch}             $BITBUCKET_PR_DESTINATION_BRANCH
${bamboo.planRepository.repositoryUrl}           $BITBUCKET_GIT_HTTP_ORIGIN
${bamboo.TagBuildTriggerReason.tagName}          $BITBUCKET_TAG
${bamboo.ManualBuildTriggerReason.userName}      $BITBUCKET_STEP_TRIGGERER_UUID
${bamboo.planKey}                                $BITBUCKET_REPO_SLUG
${bamboo.buildResultsUrl}                        Construct from $BITBUCKET_WORKSPACE, $BITBUCKET_REPO_SLUG, $BITBUCKET_BUILD_NUMBER
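The last row has no single equivalent variable; a sketch of constructing the equivalent of ${bamboo.buildResultsUrl} in a step script (the stub values stand in for the variables Pipelines injects at runtime):

```shell
# Stub values for illustration; in a real step these are provided by Pipelines
BITBUCKET_WORKSPACE="myteam"
BITBUCKET_REPO_SLUG="myrepo"
BITBUCKET_BUILD_NUMBER="42"

# Recreate Bamboo's ${bamboo.buildResultsUrl} from the three Pipelines variables
RESULT_URL="https://bitbucket.org/${BITBUCKET_WORKSPACE}/${BITBUCKET_REPO_SLUG}/pipelines/results/${BITBUCKET_BUILD_NUMBER}"
echo "$RESULT_URL"
```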

For additional details on how to manage variables and secrets in Bitbucket Pipelines, refer to Variables and secrets | Bitbucket Cloud | Atlassian Support.

Handling credentials

To replace Bamboo's Password Variables, Bitbucket Pipelines uses Secured variables. These are configured in the Bitbucket UI (via Workspace, Repository, or Deployment settings). Marking a variable as "Secured" encrypts the value and ensures it remains hidden in your execution logs.

The pipeline configuration below shows how to pass these secure credentials into your steps as standard environment variables.

Bamboo YAML Spec

Pipelines YAML

    ---
    version: 2
    plan:
      project-key: PROJ
      key: PLAN
      name: Example Plan
    stages:
      - Stage 1:
          jobs:
            - Example Job
    Example Job:
      tasks:
        - script:
            description: Upload file using a secret credential
            scripts:
              # Bamboo injects secured "Password" variables using the ${bamboo.*} syntax.
              - curl -X POST -u "username:${bamboo.bitbucket_access_token}" -F "files=@package.json" "https://api.bitbucket.org/2.0/repositories/workspace/repo/downloads"
    image: atlassian/default-image:5
    pipelines:
      default:
        - step:
            name: Example Job
            script:
              - "# Task description - Upload file using a secret credential"
              - curl -X POST -u "username:$BITBUCKET_ACCESS_TOKEN" -F "files=@package.json" "https://api.bitbucket.org/2.0/repositories/workspace/repo/downloads"

The above code can be further simplified using the atlassian/bitbucket-upload-file Pipe.

    image: atlassian/default-image:5
    pipelines:
      default:
        - step:
            name: Example
            script:
              - pipe: atlassian/bitbucket-upload-file:0.7.1
                variables:
                  BITBUCKET_ACCESS_TOKEN: $BITBUCKET_ACCESS_TOKEN
                  FILENAME: 'package.json'

Manual configuration steps

Automation handles the standard mapping, but certain Bamboo features do not have a 1:1 equivalent in Bitbucket Pipelines. When the tool encounters a Bamboo app or custom plugin it does not recognize, it safely skips the step to prevent the conversion from failing. It clearly marks the command that was not migrated with “TODO” in the resulting Bitbucket Pipelines file so you can review it manually.

Example of the message you will receive for an unsupported task:

    script:
      - "# TODO: any-task is not yet supported for conversion."
      - "# {plugin-key=<plugin_key>, configuration={<plugin_configuration>}, conditions=[<task_conditions>], description=<task_description>}"

You will need to manually replace these placeholders using standard Linux commands or official Bitbucket Pipes, utilizing our syntax examples to help guide your revisions.

Adapting Bamboo Features to Bitbucket Pipelines

In addition to unsupported tasks, a few core CI/CD concepts are structured differently in Bitbucket Pipelines. Use the examples below to translate your Bamboo configurations for timeouts, schedules, artifacts, and branching.

Set timeout

Bamboo YAML

Pipelines YAML

Based on your settings, Bamboo can determine if a build is hanging or timed out. You can override these settings for individual plans in the executable configuration of each plan. Build monitoring is enabled by default.

In Bitbucket Pipelines, you can establish a global default using the max-time setting in the options block, or define a specific timeout limit for individual steps to override that default.

    image: atlassian/default-image:5
    pipelines:
      default:
        - step:
            name: Example
            script:
              - echo 'Hello World'
            max-time: 60  # Timeout in minutes
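The same limit can also be set once for the whole file via the options block, which individual steps may still override:

```yaml
options:
  max-time: 30        # default timeout, in minutes, for every step
pipelines:
  default:
    - step:
        name: Example
        script:
          - echo 'Hello World'
        max-time: 60    # step-level override of the global default
```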

Cron expressions

While Bamboo relies on inline cron expressions within your YAML spec to schedule builds, Bitbucket Pipelines scheduled builds are configured in the user interface.

Bamboo YAML

Pipelines YAML

    ---
    version: 2
    # short syntax
    triggers:
      - cron: 0 * * * ? *

You can easily set up and manage your build schedules, and specify the frequency and timing of builds, using a simple UI.

Build, test and deploy - Using artifacts & pipes

In Bitbucket, the artifacts keyword is used to pass outputs to downstream steps within the same pipeline run. For deployments, you can modernize your workflows by unifying your build and deploy stages in that same bitbucket-pipelines.yml file, replacing static cloud credentials with secure OIDC authentication, and managing environments and manual approvals directly within Bitbucket's Repository settings.

Bamboo YAML

Pipelines YAML

    ---
    version: 2
    plan:
      project-key: PROJ
      key: PLAN
      name: Example Plan
    stages:
      - Lint and Test:
          - Lint
      - Build Artifact:
          - Build Artifact
      - Publish Image:
          - Publish Image
    Lint:
      tasks:
        - script:
            - npm install
            - npm lint
      docker:
        image: node:alpine
    Build Artifact:
      tasks:
        - script:
            - ${bamboo_capability_system_builder_npm} install
            - ${bamboo_capability_system_builder_npm} run build
        - script:
            - cp -r content build/
            - cp release/* build/
      requirements:
        - node
      artifacts:
        - name: release
          pattern: build/**
    Deploy Image:
      tasks:
        - artifact-download:
            source-plan: PROJ-PLAN
            artifacts:
              - name: release
    image: atlassian/default-image:5
    pipelines:
      default:
        - step:
            name: Lint and Test
            caches:
              - node
            script:
              - npm install
              - npm lint
              - npm test
        - step:
            name: Build Artifact
            script:
              - npm install
              - npm run build
              - cp -r content build/
              - cp release/* build/
              - zip -r build.zip build/
            artifacts:
              - build.zip
        - step:
            name: Deploy Image
            services:
              - docker
            script:
              - pipe: atlassian/aws-code-deploy:1.5.0
                variables:
                  AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                  AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                  AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                  S3_BUCKET: $S3_BUCKET
                  COMMAND: "upload"
                  APPLICATION_NAME: "my-application"
                  ZIP_FILE: "build.zip"

Branch-specific Pipelines

The automated conversion tool does not currently support translating branch-specific execution logic. In Bamboo, running different tasks based on the branch name is typically handled using Task Conditions mapped to the planRepository.branchName variable. In Bitbucket Pipelines, this structural logic is much flatter and is handled natively using the branches and default keywords.

You will need to manually map your Bamboo conditional tasks into distinct Bitbucket branch pipelines, as shown in the comparison below:

Bamboo YAML

Pipelines YAML

    version: 2
    plan:
      project-key: PROJ
      key: MAVENBUILD
      name: Multi-Branch Maven Build
    stages:
      - Build Stage:
          jobs:
            - Maven Build Job
    Maven Build Job:
      docker:
        image: maven:3.9-eclipse-temurin-17
      tasks:
        - checkout:
            force-clean-build: true
        # Runs for feature branches (anything NOT main or release/*)
        - script:
            interpreter: SHELL
            scripts:
              - mvn test
            conditions:
              - variable:
                  matches:
                    planRepository.branchName: '^(?!main$|release/).*'
        # Runs ONLY for the main branch
        - script:
            interpreter: SHELL
            scripts:
              - mvn clean verify -P integration-tests
            conditions:
              - variable:
                  matches:
                    planRepository.branchName: '^main$'
        # Runs ONLY for release branches
        - script:
            interpreter: SHELL
            scripts:
              - mvn clean package -DskipTests
              - mvn deploy -P release
            conditions:
              - variable:
                  matches:
                    planRepository.branchName: '^release$'
    image: maven:3.9-eclipse-temurin-17
    pipelines:
      default:
        # Replaces the negative regex condition (feature branches)
        - step:
            name: Unit Tests (feature branches)
            caches:
              - maven
            script:
              - mvn test
      branches:
        main:
          # Replaces the '^main$' condition
          - step:
              name: Full Build and Integration Tests
              caches:
                - maven
              script:
                - mvn clean verify -P integration-tests
        'release/*':
          # Replaces the '^release/.*' condition
          - step:
              name: Build Release Candidate
              script:
                - mvn clean package -DskipTests
                - mvn deploy -P release

Pull request gating (automated checks before merge)

In Bamboo, enforcing a passing build before merging a pull request required configuration across two separate systems: setting up the Pull Request plan branch in Bamboo and enforcing merge checks in Bitbucket Server. In Bitbucket Cloud, this workflow is natively consolidated into a single platform. You simply define your PR-specific testing tasks (like linting and unit tests) inside a pull-requests: block within your pipeline YAML file, and then enforce it by enabling the Require passing pipeline merge check directly in your Repository settings under Branch permissions.

Bamboo YAML

Pipelines YAML

    ---
    version: 2
    plan:
      project-key: PROJ
      key: PRVALIDATION
      name: Pull Request Validation
    branches:
      create: for-pull-request
    stages:
      - Validation Stage:
          jobs:
            - PR Checks Job
    PR Checks Job:
      docker:
        image: "node:20"
      artifacts:
        - name: Coverage Report
          pattern: coverage/**
          required: false
      tasks:
        - checkout:
            force-clean-build: true
        - script:
            description: PR Validation - Lint
            interpreter: SHELL
            scripts:
              - |
                npm ci
                npm run lint
            conditions:
              - variable:
                  matches:
                    repository.pr.targetBranch: '^(?!main$).*'
        - script:
            description: PR Validation - Unit Tests
            interpreter: SHELL
            scripts:
              - npm test -- --coverage
            conditions:
              - variable:
                  matches:
                    repository.pr.targetBranch: '^(?!main$).*'
        - script:
            description: Full Test Suite
            interpreter: SHELL
            scripts:
              - |
                npm ci
                npm run lint
                npm test -- --coverage
                npm run test:integration
            conditions:
              - variable:
                  matches:
                    repository.pr.targetBranch: '^main$'
    # Global image definition replaces the Bamboo Docker requirement
    image: node:20
    pipelines:
      pull-requests:
        # 1. Condition: Target branch is 'main'
        main:
          - step:
              name: Full Test Suite
              script:
                - npm ci
                - npm run lint
                - npm test -- --coverage
                - npm run test:integration
              artifacts:
                - coverage/**
        # 2. Condition: Target branch is anything else ('^(?!main$).*')
        '**':
          - step:
              name: PR Validation (Lint & Unit Tests)
              script:
                - npm ci
                - npm run lint
                - npm test -- --coverage
              artifacts:
                - coverage/**

Recreating cross-plan triggers (Parent-child pipelines)

Because Bamboo operates on a centralized server, it natively supports automated parent-child build dependencies. In Bitbucket Pipelines' repository-centric model, you recreate these cross-project relationships manually using API triggers. To achieve this, configure your parent repository to execute the atlassian/trigger-pipeline pipe upon a successful build, using a securely stored Workspace Access Token to authenticate the request. Finally, in your downstream child repository, define a custom pipeline block specifically designed to listen for and execute upon receiving this external trigger.

Bamboo YAML

Pipelines YAML

    ---
    version: 2
    plan:
      project-key: PROJ
      key: PARENT
      name: Parent Build
    # The parent natively triggers the child plan upon success
    dependencies:
      child-plans:
        - PROJ-CHILD

In Bitbucket, you replace the native dependency block with a step that executes the trigger Pipe.

    image: atlassian/default-image:5
    pipelines:
      default:
        - step:
            name: Build and Test Parent
            script:
              - npm ci
              - npm test
        # Explicitly trigger the downstream repository using the Atlassian Pipe
        - step:
            name: Trigger Downstream Child
            script:
              - pipe: atlassian/trigger-pipeline:5.5.0
                variables:
                  BITBUCKET_USERNAME: $BITBUCKET_USERNAME
                  BITBUCKET_APP_PASSWORD: $PIPELINE_TRIGGER_TOKEN
                  REPOSITORY: 'child-microservice-repo'
                  PIPELINE_PATTERN: 'run-integration-tests'

The child repository listens for the trigger from the parent using a custom pipeline block.

    image: atlassian/default-image:5
    pipelines:
      custom:
        # This matches the PIPELINE_PATTERN defined in the parent's trigger pipe
        run-integration-tests:
          - step:
              name: Run Integration Tests
              script:
                - echo "Triggered successfully by the parent repository!"
                - npm run test:integration

Notifications - Slack and email on build events

Bamboo’s Notification settings are not migrated. You must recreate them in Pipelines using after-script and pipes.

To recreate your notification workflows, utilize the after-script block, which executes regardless of the step's final outcome. Within this block, you can integrate communication tools using Bitbucket Pipes or custom scripts, leveraging the $BITBUCKET_EXIT_CODE variable to conditionally trigger alerts based on whether the build succeeded (an exit code of 0) or failed (a non-zero exit code).

Bamboo YAML

Pipelines YAML

Bamboo manages build notifications (such as Slack or email alerts) through UI-based plan configurations.

    image: atlassian/default-image:5
    pipelines:
      default:
        - step:
            name: Build and Notify
            script:
              - npm run build
            after-script:
              - pipe: atlassian/slack-notify:2.0.0
                variables:
                  WEBHOOK_URL: $SLACK_WEBHOOK_URL  # Your secured Slack Incoming Webhook
                  MESSAGE: "Pipeline execution completed with exit code: $BITBUCKET_EXIT_CODE"
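To alert only on failure, a custom script in the after-script block can gate the call on $BITBUCKET_EXIT_CODE. A sketch using a plain Slack incoming-webhook call rather than the pipe (the webhook URL is assumed to be a secured variable):

```yaml
image: atlassian/default-image:5
pipelines:
  default:
    - step:
        name: Build and notify on failure
        script:
          - npm run build
        after-script:
          # $BITBUCKET_EXIT_CODE is 0 on success and non-zero on failure
          - |
            if [ "$BITBUCKET_EXIT_CODE" != "0" ]; then
              curl -X POST -H 'Content-type: application/json' \
                --data "{\"text\":\"Build $BITBUCKET_BUILD_NUMBER failed\"}" \
                "$SLACK_WEBHOOK_URL"
            fi
```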

Trigger and Wait for a Bamboo Build

The following configuration uses a Bitbucket Pipe to trigger a specific remote Bamboo plan and intentionally pauses the pipeline's execution until that Bamboo build finishes completely. Using this code supports a hybrid migration approach, which minimizes disruption and allows your team to gradually adapt before making a full cutover.

    script:
      - pipe: atlassian/bamboo-trigger-build:0.6.0
        variables:
          BAMBOO_ENDPOINT: 'http://your-bamboo.net:8085'
          PROJECT_KEYWORD: 'TEST'
          PLAN_KEYWORD: 'TEST'
          # Use a secured repository variable for the token
          TOKEN: $BAMBOO_ACCESS_TOKEN
          WAIT: 'true'
          # Make sure to set a sufficient WAIT time for your workflows.
          WAIT_INTERVAL: '15'


Still need help?

The Atlassian Community is here for you.