sfp - flxbl package manager

Install sfp

Depending on your edition, please select the appropriate guide from the links below:

  • Install sfp community edition

  • Install sfp-pro

Pre-Requisites

The following list of software is the minimum required to get started with sfp. We assume that you are familiar with the Salesforce CLI and comfortable using VS Code. To keep the material simple, we also assume you already have access to a Salesforce sandbox in which you can build and install artifacts.

On your workstation, install the following:

  • git

  • NPM

  • VS Code

  • sf CLI

Configuring installation behaviour of a package

sfp provides various features to alter the installation behaviour of a package. These behaviours have to be applied as an additional property of a package at build time. The following section details each of the parameters that are available.

sfp is built on the concept of immutable artifacts; hence, any properties that control the installation aspects of a package need to be applied during the build command. The installation behaviour of a package cannot be controlled dynamically. If you need to alter or add a behaviour, please build a new version of the artifact.
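For instance, an installation behaviour such as alwaysDeploy (detailed later in this section) is declared on the package descriptor in sfdx-project.json before running build:

// Installation behaviour declared at build time
{
  "packageDirectories": [
    {
      "path": "src-env-specific-pre",
      "package": "env-specific-pre",
      "versionNumber": "4.7.0.NEXT",
      "alwaysDeploy": true
    },
     ...
   ]
}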


Overview

sfp is a purpose-built CLI tool for modular Salesforce development and release management. sfp streamlines and automates the build, test, and deployment processes of Salesforce metadata, code, and data. It extends sf cli functionalities, focusing on artifact-driven development to support #flxbl Salesforce project development.

sfp is available in two editions:

  • sfp (Community Edition): Open-source CLI with core build, deploy, and orchestration capabilities

  • sfp-pro: Enterprise edition with additional features including a centralized server, environment management, authentication services, and team collaboration tools

Key Aspects of sfp:

  • Built with codified process: sfp is derived from extensive experience in modular Salesforce implementations. By embracing the #FLXBL framework, it streamlines the process of creating a well-architected, composable Salesforce Org, eliminating time-consuming efforts usually spent on re-inventing fundamental processes.

  • Artifact-Centric Approach: sfp packages Salesforce code and metadata into artifacts with deployment details, ensuring consistent deployments and simplified version management across environments.

  • Best-in-Class Mono Repo Support: Offers robust support for mono repositories, facilitating streamlined development, integration, and collaboration.

Commands

sfp incorporates a suite of commands to aid in your end-to-end development cycle for Salesforce. Starting with the core commands, you can perform basic workflows to build and deploy artifacts across environments through the command line. As you get comfortable with the core commands, you can utilize more advanced commands and flags in your CI/CD platform to drive a complete release process, leveraging release definitions, changelogs, metrics, and more.

sfp is constantly evolving and driven by the passionate community that has embraced our work methods. Over the years, we have introduced utility commands to solve pain points specific to the Salesforce Platform. The commands have been successfully tested and used on large-scale enterprise implementations.

Below is a high-level snapshot of the main command categories in sfp.

  • Core

  • Advanced

  • Server (sfp-pro)

  • Utilities

Commands across these categories include build, quickbuild, validate, install, release, publish, releasedefinition, artifacts, changelog, impact, pool, prepare, apextests, dependency, profile, org, flow, repo, metrics, init and auth; sfp-pro adds server-oriented commands such as environment, webhook, repository, review-envs, key-value and doc-store, along with server lifecycle commands like start / stop / status and health / logs / scale.

Docker Images

sfp docker images are published to the flxbl-io GitHub Packages registry.

You can utilize the flxbl-io sfp images by pulling them:

docker pull ghcr.io/flxbl-io/sfp:latest

You can also pin to a specific version of the docker image by using a published version number. To preview the latest images, visit the release candidate page and update your container image reference. For example:

default:
  image: ghcr.io/flxbl-io/sfp-rc:<version-number>

or

default:
  image: ghcr.io/flxbl-io/sfp-rc:<sha>

Domains

sfp is a natural fit for organisations that use Salesforce in a large enterprise setting, as it is purpose-built for modular Salesforce development. Often these organisations have teams dedicated to a particular business function; think of a team that works on features related to billing while another team works on features related to service.

Packages in an org organised by domains

The diagram illustrates how packages are organised into categories for various business units. These categories are referred to as 'domains' within the context of the sfp cli. Each domain can contain further sub-domains, and each domain consists of one or more packages.

sfp cli utilizes a 'Release Config' to organise packages into domains. You can read more about creating a release config in the next section.

Packages

Packages are containers used to group related metadata together. A package contains components such as objects, fields, Apex classes, flows and more, allowing these elements to be easily installed, upgraded and managed as a single unit. Packages in the context of sfp are not limited to second-generation packaging (2GP); sfp supports various types of packages, which you can read about in the following sections.

Defining a package in repository
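A package is declared as an entry under packageDirectories in sfdx-project.json. A minimal sketch (the path and package name are illustrative):

{
  "packageDirectories": [
    {
      "path": "./src/frameworks/feature-mgmt",
      "package": "feature-mgmt",
      "versionNumber": "1.0.0.NEXT"
    }
  ]
}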

SF CLI vs. SFP

Salesforce CLI

The Salesforce CLI is a command-line interface that simplifies development and build automation when working with your Salesforce org. There are numerous commands that the sf cli provides natively; they are beyond the scope of this site and can be found on the official Salesforce Documentation Site.

From a build, test, and deployment perspective, the following diagram depicts the bare minimum commands necessary to get up and running in setting up your sf project, retrieving and deploying code to the target environments.

Artifacts

An artifact is a key concept within sfp. An artifact is a point-in-time snapshot of a version of a package, as declared in sfdx-project.json. The snapshot contains the source code of a package directory, additional metadata about the particular version, a changelog and other details. An artifact for a 2GP package would also contain details such as the Subscriber Package Version ID.

In the context of sfp, packages are represented in your version control, and an artifact is the output of the build command when operated on a package.

Artifacts provide an abstraction over version control, as they detach version control from the point of releasing into a Salesforce org. Almost all commands in sfp operate on an artifact or generate an artifact.

In the context of sfp, an artifact represents a more enhanced concept compared to a Salesforce package. While it is based on a Salesforce package or specific package types introduced by sfp, an artifact in sfp includes additional attributes and metadata that describe the package version, dependencies, installation behavior, and other context-specific information relevant to the CI/CD process. Artifacts in sfp are designed to be more than just a bundle of code and metadata; they encapsulate the package along with its CI/CD lifecycle information, making them more aligned with DevOps practices. sfp's artifacts are built to be compatible with npm-supported registries; most CI/CD providers offer an npm-compatible registry to host these packages/artifacts. See, for instance, the GitHub Packages npm registry documentation: https://docs.github.com/en/packages/working-with-a-github-packages-registry/working-with-the-npm-registry

Applying attributes of an artifact

All the attributes that are configured additionally on a package in your project are recorded within the artifact. When the artifact is installed to the target org, steps are undertaken in the sequence shown in the diagram below. Each of the configured attributes belongs to one of the categories in the sequence and is executed accordingly.

Overview

sfp cli features an intuitive command to build artifacts for all the packages in your project directory. The build command automatically detects the type of each package and builds an artifact for it individually.

By default, the behaviour of sfp's build command is to build a new version of all packages in your project directory and create their associated artifacts.

sfp's build command is also equipped with the ability to selectively build only the packages that have changed. Read more on how sfp determines whether a package needs to be built in the subsequent sections.
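For example, to build artifacts for every package, or only for the packages that have changed:

sfp build -v <devhub_name> --branch <value>
sfp build -v <devhub_name> --branch <value> --diffcheck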

Controlling aspects of the build command

sfp provides mechanisms to control aspects of the build command; the following sections detail how to configure these mechanisms in your sfdx-project.json.

Building an artifact for package individually

An artifact can be built individually for a package; sfp will determine the type of the package, and the corresponding APIs will be invoked.
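Given a sample sfdx-project.json:

// Sample sfdx-project.json
{
  "packageDirectories": [
    {
      "path": "./src-env-specific-pre",
      "package": "src-env-specific-pre",
      "versionNumber": "1.0.0.NEXT"
    },
    {
      "path": "./src/frameworks/feature-mgmt",
      "package": "feature-mgmt",
      "versionNumber": "1.0.0.NEXT"
    }
  ]
}

a single package can be built using the -p flag:

// Build a single package
sfp build -v devhub --branch=main -p feature-mgmt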

Ignoring packages from being built
Building a collection of packages together
Selective ignoring of components from being built
Use of multiple config files in the build command
Series of steps undertaken while installing an artifact

Package vs Artifacts

In Salesforce, a package is a container that groups together metadata and code in order to facilitate the deployment and distribution of customizations and apps. Salesforce supports different types of packages, such as:

  • Unlocked Packages: These are more modular and flexible than traditional managed packages, allowing developers to group and release customizations independently. They are version-controlled, upgradeable, and can include dependencies on other packages.

  • Org-Dependent Unlocked Packages: Similar to unlocked packages but with a dependency on the metadata of the target org, making them less portable but useful for specific org customizations.

  • Managed Packages: Managed packages are a type of Salesforce package primarily used by ISVs (Independent Software Vendors) for distributing and selling applications on the Salesforce AppExchange. They are fully encapsulated, which means the underlying code and metadata are not accessible or editable by the installing organization. Managed packages support versioning, dependency management, and can enforce licensing. They are ideal for creating applications that need to be securely distributed and updated across multiple Salesforce orgs without exposing proprietary code.

sfp augments the above formal Salesforce package types with additional package types, such as the ones below:

  • Source Packages: These are not formal packages in Salesforce but refer to a collection of metadata and code retrieved from a Salesforce org or version control that can be deployed to an org but aren't versioned or managed as a single entity.

  • Diff Packages: These are not a formal type of Salesforce package but refer to packages created by determining the differences (diff) between two sets of metadata, often used for deploying specific changes.

In the context of sfp, an artifact represents a more enhanced concept compared to a Salesforce package. While it is based on a Salesforce package (of any type mentioned above), an artifact in sfp includes additional attributes and metadata that describe the package version, dependencies, installation behavior, and other context-specific information relevant to the CI/CD process. Artifacts in sfp are designed to be more than just a bundle of code and metadata; they encapsulate the package along with its CI/CD lifecycle information, making them more aligned with DevOps practices.

Key differences between Salesforce packages and sfp artifacts include:

  • Versioning and Dependencies: While Salesforce packages support versioning, sfp artifacts enrich this with detailed dependency tracking, ensuring that the CI/CD pipeline respects the order of package installations based on dependencies.

  • Installation Behavior: Artifacts in sfp carry additional metadata that defines custom installation behaviors, such as pre- and post-installation scripts or conditional installation steps, which are not inherently part of Salesforce packages.

  • CI/CD Integration: Artifacts in sfp are specifically designed to fit into a CI/CD pipeline, with support for storage in an artifact registry, version tracking, and release management, which are essential for automated deployments but outside the scope of Salesforce packages themselves.

Beyond these differences, sfp provides:

  • Support for Multiple Package Types: sfp accommodates various Salesforce package types with streamlined commands, enabling modular development, independent versioning, and flexible deployment strategies.

  • Orchestration Across the Entire Lifecycle: sfp provides an extensive set of functionality across the entire lifecycle of your Salesforce development.

  • End-to-End Observability: sfp is built with comprehensive metrics emitted on every command, providing unparalleled visibility into your ALM process.

  • Centralized Server (sfp-pro): A backend server that provides environment management, authentication, webhooks, and API access for team-based workflows.



    Always deploy a package

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| alwaysDeploy | boolean | Deploys an artifact of the package, even if it's installed already in the org. The artifact has to be present in the artifact directory for this particular option to work. | unlocked, org-dependent unlocked, source, diff |

To ensure that an artifact of a package is always deployed, irrespective of whether the same version of the artifact has previously been deployed to the org, you can utilize alwaysDeploy as a property added to your package:

    {
      "packageDirectories": [
        {
          "path": "src-env-specific-pre",
          "package": "env-specific-pre",
          "versionDescription": "Environment related settings",
          "versionNumber": "4.7.0.NEXT",
          "alwaysDeploy":true
        },
         ...
       ]
    }

    Skip Coverage Validation

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| skipCoverageValidation | boolean | Skip apex test coverage validation of a package | unlocked, org-dependent unlocked, source, diff |

sfp during validation checks the apex test coverage of a package, depending on the package type. This is beneficial so that you don't run into coverage issues while deploying into higher environments or building an unlocked package.

However, there could be situations where the test coverage calculation is flaky; sfp provides you with an option to turn the coverage validation off.

// Demonstrating how to use skipCoverageValidation
    {
      "packageDirectories": [
        {
          "path": "core-crm",
          "package": "core-crm",
          "versionDescription": "Package containing core schema and classes",
          "versionNumber": "4.7.0.NEXT",
          "skipCoverageValidation": true
        },
         ...
       ]
    }

    Defining a domain

A domain is defined by a release configuration. In order to define a domain, you need to create a new release config YAML file in your repository.

A simple release config can be defined as shown below:

# Sample release config <business_domain.yaml>
    releaseName: <business_domain>   # --> The name of the domain
    pool: <sandbox/scratch org pools>
    excludeAllPackageDependencies: true
    includeOnlyArtifacts:   # --> Insert packages
      - <pkg1>
    releasedefinitionProperties:
      promotePackagesBeforeDeploymentToOrg: prod

The resulting file can be stored in your repository and utilized by commands such as build, release, etc.

    Skip Testing

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| skipTesting | boolean | Skip triggering of apex tests while validating or deploying | unlocked, org-dependent unlocked, source, diff |

One can utilize this attribute on a package definition in sfdx-project.json to skip testing while a change is validated. Use this option with caution, as the package (depending on the package behaviour) might still trigger tests while being deployed to production.

// Demonstrating how to use skipTesting
    {
      "packageDirectories": [
        {
          "path": "core-crm",
          "package": "core-crm",
          "versionDescription": "Package containing core schema and classes",
          "versionNumber": "4.7.0.NEXT",
          "skipTesting": true
        },
         ...
       ]
    }

    Ignoring packages from being built

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| ignoreOnStage | array | Ignore a package from being processed by a particular stage | unlocked, org-dependent unlocked, source, diff |

Using the ignoreOnStage: ["build"] property on a package causes that package to be skipped by the build command. Similarly, you can use ignoreOnStage: ["quickbuild"] to skip packages in the quickbuild stage.

    {
      "packageDirectories": [
        {
          "path": "./src-env-specific-pre",
          "package": "src-env-specific-pre",
          "versionNumber": "1.0.0.NEXT",
          "ignoreOnStage": [
           "build"
          ]
        },
        {
          "path": "./src/frameworks/feature-mgmt",
          "package": "feature-mgmt",
          "versionNumber": "1.0.0.NEXT"
        }
      ]
    }

    Selective ignoring of components from being built

Sometimes, due to certain platform errors, some metadata components need to be ignored during build (especially for unlocked packages) while still being required for other commands like validate. sfp offers an easy mechanism that allows you to switch .forceignore files depending on the operation.

Add this entry to your sfdx-project.json and, as in the example below, mention the paths to the different files that need to be used for different stages:

{
  "packageDirectories": [
    {
      "path": "core",
      "package": "core-package",
      "versionName": "Core 1.0",
      "versionNumber": "1.0.0.NEXT",
      "default": true
    }
  ],
  "plugins": {
    "sfp": {
      "ignoreFiles": {
        "build": "forceignores/.buildignore",
        "validate": ".forceignore"
      }
    }
  }
}

    Overview

    Given a directory of artifacts and a target org, the deploy command will deploy the artifacts to the target org according to the sequence defined in the project configuration file.

// Command to install a set of artifacts to devhub
    sfp install -u devhub --artifactdir artifacts

    Sequence of Activities

The install command runs through the following steps:

    • Reads all the sfp artifacts provided through the artifact directory

    • Unzips the artifacts and finds the latest sfdx-project.json.ori to determine the deployment order; if this particular file is not found, it utilizes the sfdx-project.json in the repo

    • Reads the installed packages in the target org utilizing the records in SfpowerscriptsArtifacts2__c

    • Installs each artifact from the provided artifact directory to the target org based on the deployment order, respecting the attributes configured for each package

    Building a domain

Assume you have a domain 'sales', as defined by the release config sales.yaml shown in the example here.

    # release-config for sales
    releaseName: sales
    pool: sales-pool
    excludeAllPackageDependencies: true 
    includeOnlyArtifacts:
      - sales-ui
      - sales-channels
    releasedefinitionProperties:
      skipIfAlreadyInstalled: true
      usePackageDiffDeploymentMode: true
      promotePackagesBeforeDeploymentToOrg: prod
      changelog:
        workItemFilters:
          -  (AKG|GIK)-[0-9]{2,5}
        workItemUrl: https://example.atlassian.net/browse
        limit: 30

In order to build the artifacts of the packages defined by the above release config, you would use the build command with the flags as described here.

    sfp build --releaseconfig sales.yaml -v devhub --branch main 

If you require building only the packages that have changed since the last published packages, you would add the additional diffcheck flag.

    sfp build --releaseconfig sales.yaml -v devhub --branch main  --diffcheck

diffcheck will work accurately only if the build command is able to access the latest tag in the repository. In certain CI systems, if the command is operated on a repository where only the head commit is checked out, diffcheck will result in building artifacts for all packages within the domain.

    SFP

sfp is based on the Open CLI Framework (oclif) for building a command line interface (CLI) in Node.js. Instead of being a typical Salesforce CLI plugin, sfp is standalone and leverages the same core libraries and APIs as @salesforce/cli. sfp releases are independently managed, and as the core npm libraries stabilize, we will update them as needed to ensure no breaking changes are introduced.

    The diagram below depicts the basic flow of the development and test process, building artifacts, and deploying to target environments.

Once you have mastered the basic workflow, you can progress to publishing artifacts to an NPM repository that will store immutable versions of the metadata and code used to drive the release of your packages across Salesforce environments.

    sfp Develop, Test, and Release Workflow

    References

    The list below is a curated list of core sf cli and Salesforce DX developer guides for your reference.

  • SF CLI

    • Salesforce CLI Setup Guide

    • Salesforce CLI Release Notes

    • Salesforce CLI Status Page

    • Salesforce DX Developer Guide

    • Developing sf Plugins

    • @salesforce/cli NPM Repository

  • sfp

    • @flxblio/sfp NPM Repository

    • GitHub Repository

Artifacts built by sfp follow a naming convention of the form <name_of_the_package>_sfpowerscripts_artifact_<Major>.<Minor>.<Patch>-<BuildNumber>. One can use any of the npm commands to interact with sfp artifacts.


    Install sfp community edition

A. Install sfp on your local machine

    npm i -g @flxbl-io/sfp

    B. Check sfp version
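sfp --version
@flxbl-io/sfp/37.0.0 darwin-arm64 node-v20.3.1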

sfp requires node-gyp for its dependencies. If you are facing issues with node-gyp during installation, please follow the instructions at https://www.npmjs.com/package/node-gyp

    C. Validate Installation

    Build & Install an Artifact

In the earlier section, we looked at how to configure the project directory for sfp. Now let's walk through some core commands.

    A. Build an artifact for a package

The build command will generate a zipped artifact file for each package you have defined in the sfdx-project.json file. The artifact file contains the metadata and source code at the point of build creation, ready for you to install.

    1. Open up a terminal within your Salesforce project directory and enter the following command:
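// Build artifacts for all packages (assuming the DevHub alias from the earlier authentication step)
sfp build -v <devhub_alias> --branch <branch>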

You will see logs with the details of your package creation; for instance, here is a sample output:

2. A new "artifacts" folder will be generated within your source project, containing a zipped artifact file for each package defined in your sfdx-project.json file. The artifact files follow a naming convention with "_sfpowerscripts_artifact_" between the package name and version number, e.g. package-name_sfpowerscripts_artifact_1.0.0-1.zip

    B. Install the artifact to a target org

Ensure you have authenticated your Salesforce CLI to an org first. If you haven't, please find the instructions here.

1. Once you have authenticated to the sandbox, you can execute the installation command as below:
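// Install the built artifacts to the target org (the alias is illustrative)
sfp install -u <target_org_alias> --artifactdir artifacts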

2. Navigate to your target org and confirm that the package is now installed with the expected changes from your artifact. In the example above, a new custom field has been added to the Account standard object.

Depending on the type of packages within the package directory, the installation could result in failures. Please fix the issues in your code and repeat until you get a successful installation. If your package doesn't have sufficient test coverage, you may need to run all the tests in the org to get your package installed. Refer to the material here.

    Org-Dependent Unlocked Packages

Org-dependent unlocked packages, a variation of unlocked packages, allow you to create packages that depend on unpackaged metadata in the target org. Org-dependent packages are very useful in the context of orgs that have lots of metadata and struggle with understanding dependencies while building an unlocked package.

Org-dependent packages significantly enhance the efficiency of #flxbl projects that are already on scratch-org-based development. By allowing installation on a clean slate, these packages validate dependencies upfront, thereby reducing the additional validation time often required by unlocked packages.

    Org-Dependent Unlocked Package and Test Coverage

    Org-dependent unlocked packages bypass the test coverage requirements, enabling installation in production without standard validation. This differs significantly from metadata deployments, where each Apex class deployed must meet a 75% coverage threshold or rely on the org's overall test coverage. While beneficial for large, established orgs, this approach should be used cautiously.

    To address this, sfp incorporates a default test coverage for org-dependent unlocked packages during the validation process. To disable this test coverage check during validation, additional attributes must be added to the package directory in the sfdx-project.json file.
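// Org-dependent unlocked package with additional attributes
{
  "packageDirectories": [
    {
      "path": "src/core/core-crm",
      "package": "core-crm",
      "versionNumber": "4.7.0.NEXT",
      "skipTesting": true,
      "skipCoverageValidation": true
    },
     ...
   ]
}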


    Congratulations!

Good work! If you made it past the getting started guide with minimal errors and questions, you are well on your way to introducing sfp into your Salesforce delivery workflow.

    Let's summarize what you have done:

    1. Setup pre-requisite software on your workstation and have access to a Salesforce Org.

    2. Installed the latest sfp cli.

    3. Configured your source project and added additional properties required for sfp cli to generate artifacts.

4. Built artifact(s) locally to be used for deployment.

    5. Installed artifact(s) to target org.

    This is just the tip of the iceberg for the full features sfp can provide for you and your team. Please continue to read further and experiment.

For any comments or recommendations for sfp, please join our Slack Community. If you are adventurous, contribute!

    PermissionSet Assignment

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| assignPermSetsPreDeployment | array | Apply permsets to the deployment user before installing an artifact | unlocked, org-dependent unlocked, source, diff |
| assignPermSetsPostDeployment | array | Apply permsets to the deployment user after installing an artifact | unlocked, org-dependent unlocked, source, diff |

Occasionally you might need to assign a permission set to the deployment user to successfully install the package, run apex tests or run functional tests. sfp provides an easy mechanism to assign permission sets either before or after the installation of an artifact. assignPermSetsPreDeployment assumes the permission sets are already deployed in the target org and proceeds to assign them to the deployment user.

assignPermSetsPostDeployment can be used to assign permission sets that are introduced by the artifact to the target org, for testing or any other automation usage.
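A sketch of both attributes on a package descriptor; the permission set names are illustrative:

// Demonstrating how to use the assignPermSets* attributes (permission set names are illustrative)
{
  "packageDirectories": [
    {
      "path": "core-crm",
      "package": "core-crm",
      "versionNumber": "4.7.0.NEXT",
      "assignPermSetsPreDeployment": ["Deployment_User_Access"],
      "assignPermSetsPostDeployment": ["Core_CRM_User"]
    },
     ...
   ]
}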

    Entitlement Deployment Helper

    Have you ever encountered this error when deploying an updated Entitlement Process to your org?

    Deployment Issue with Entitlement Process in Salesforce

    It's important to note that Salesforce prevents the deployment of an Entitlement Process that is currently in use, irrespective of its activation status in the org. This limitation holds true even for inactive processes, potentially impacting deployment strategies.

    To create a new version of an Entitlement Process, navigate to the Entitlement Process Detail page in Salesforce and perform the creation. If you then use sf cli to retrieve this new version and attempt to deploy it into another org, you're likely to encounter errors. Deploying it as a new version can bypass these issues, but this requires enabling versioning in the target org first.

    The culprit behind this is an attribute in the Entitlement Process metadata file: versionMaster.

    According to the Salesforce documentation, versionMaster ‘identifies the sequence of versions to which this entitlement process belongs and it can be any value as long as it is identical among all versions of the entitlement process.’ An important factor here about versionMaster is: versionMaster is org specific. When you deploy a completely new Entitlement Process to an org with a randomly defined versionMaster, Salesforce will generate an ID for the Entitlement Process and map it to this specific versionMaster.

Deploying updated Entitlement Processes from one Salesforce org to another can often lead to deployment errors due to discrepancies in the versionMaster attribute. sfp's entitlement helper enables you to automatically align the versionMaster by pulling its value from the target org and updating the deploying metadata file accordingly. This ensures that your deployment process is consistent and error-free.

    1. Enable Versioning: First and foremost, activate versioning in the target org to manage various versions of the Entitlement Process.

    2. Create or Retrieve Metadata File:

      • Create a new version of the Entitlement Process as a metadata file, which can be done via the Salesforce UI and retrieved with the Salesforce CLI (sf cli).

    3. Ensure Metadata Accuracy

Automation around the entitlement helper can be disabled globally by using these attributes in your sfdx-project.json.

    Test Synchronously

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| testSynchronous | boolean | Ensure all tests of the package are triggered synchronously | unlocked, org-dependent unlocked, source, diff |

sfp by default attempts to trigger all the apex tests in a package in asynchronous mode during validation. This ensures you have highly performant apex test classes that provide fast feedback.

During installation of packages, test cases are triggered synchronously for source and diff packages. Unlocked and org-dependent unlocked packages do not trigger any apex tests during installation.

If you have inherited an existing code base or your unit tests have lots of DML statements, you will encounter test failures when the tests in a package are triggered asynchronously. You can instruct sfp to trigger the tests of the package in synchronous mode by setting testSynchronous to true.
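// Demonstrating how to use testSynchronous
{
  "packageDirectories": [
    {
      "path": "core-crm",
      "package": "core-crm",
      "versionDescription": "Package containing core schema and classes",
      "versionNumber": "4.7.0.NEXT",
      "testSynchronous": true
    },
     ...
   ]
}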

    Controlling Aspects of Installation

    Skip artifacts if they are already installed

// Command to deploy a set of artifacts to devhub
sfp install -u devhub --artifactdir artifacts --skipifalreadyinstalled

By using the skipifalreadyinstalled option with the install command, you can prevent the reinstallation of an artifact that is already present in the target organization.

Install artifacts to an org baselined on another org

The --baselineorg parameter allows you to specify the alias or username of an org against which to check whether the incoming package versions have already been installed, and to form a deployment plan. This overrides the default behaviour, which is to compare against the deployment target org. This optional feature helps ensure that every org in the path to production receives the same installation.
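// Command to deploy with a baseline org
sfp install -u qa \
           --artifactdir artifacts \
           --skipifalreadyinstalled --baselineorg devhub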

    Skip Install on Certain Orgs

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| skipDeployOnOrgs | array | Skips installation of an artifact on a target org | org-dependent unlocked, unlocked, data, source |

When sfp cli encounters the attribute skipDeployOnOrgs on a package, the generated artifact is checked during installation against the alias or username passed to the installation command. If the username or alias matches, the artifact installation is skipped.

// Demonstrating how to use skipDeployOnOrgs
    {
      "packageDirectories": [
        {
          "path": "core-crm",
          "package": "core-crm",
          "versionDescription": "Package containing core schema and classes",
          "versionNumber": "4.7.0.NEXT",
          "skipDeployOnOrgs":[
           "qa",
           "[email protected]
          ]
        },
         ...
       ]
    }

In the above example, if the alias passed to the installation command is qa, the artifact for core-crm will be skipped during installation. The same also applies to the username of an org.

    Reconciling Profiles

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| reconcileProfiles | boolean | Reconcile profiles to only apply permissions to objects, fields and features that are present in the target org | source, diff |

In order to prevent deployment errors in target Salesforce orgs, setting reconcileProfiles to true ensures extra steps are taken so that the profile metadata is cleansed of any attributes missing in the target org prior to deployment.

During the reconcile process, you can provide the following flags to the command:

    • Folder - path to the folder which contains the profiles to be reconciled; if the project contains multiple package directories, provide a comma-separated list

    • Profile List - list of profiles to be reconciled. If omitted, all profile components will be reconciled.

    • Source Only - set this flag to reconcile profiles only against components available in the project. Ignored permissions can be configured in the sfdx-project.json file.

    • Destination Folder - the destination folder for reconciled profiles; if omitted, existing profiles will be reconciled and rewritten in their current location

For more details on the sfp profile reconcile command, click here.

For a more detailed understanding of the ProfileReconcile library, visit the GitHub repository "sfprofiles" for details on the source.

    State management for Flows

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| enableFlowActivation | boolean | Enable Flows automatically in Production | source, diff, unlocked |

While installing a source/diff package, flows by default are deployed as 'Inactive' in a production org. One can deploy a flow as 'Active' using the steps mentioned here; however, this requires the flow to meet test coverage requirements.

Making a flow inactive is also convoluted; see the detailed article provided by Gearset.

sfp has built-in automation that attempts to align the status of a particular flow version with what is kept inside the package. It automatically activates the latest version of a Flow in the target org (assuming the package has the latest version). If an earlier version of the package is deployed, it skips the activation.

At the same time, if the package contains a Flow with status Obsolete/InvalidDraft/Draft, all versions of the flow will be rendered inactive.

This feature is enabled by default. If you would like to disable it, set enableFlowActivation to false on your package descriptor:
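// Demonstrating a package with flow activation disabled
{
  "packageDirectories": [
    {
      "path": "src/order-management",
      "package": "order-management",
      "versionNumber": "2.0.10.NEXT",
      "enableFlowActivation": false
    },
     ...
   ]
}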

    Always sync a package during validation

|  | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | ❌ |
| From | November 25 |  |

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| alwaysSync | boolean | During validation, automatically includes this package when any other package in the same domain is impacted. Useful for config/settings packages that must stay synchronized with their domain. | unlocked, org-dependent unlocked, source, data |

    To ensure a package is always included during validation when its domain is impacted, add alwaysSync as a property to your package descriptor:
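// Demonstrating alwaysSync (the package name and path are illustrative)
{
  "packageDirectories": [
    {
      "path": "src/frameworks/framework-config",
      "package": "framework-config",
      "versionNumber": "1.0.0.NEXT",
      "alwaysSync": true
    },
     ...
   ]
}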

    When framework-core is changed and validation is triggered, framework-config will automatically be included because it belongs to the same domain and has alwaysSync: true.

    This also works across domains when using dependencyOn in release configs. If Domain B has a dependencyOn referencing a package from Domain A, and that package changes, all alwaysSync packages in Domain A will be included.

    Install sfp-pro

    Step 1: Download the Installer

    1. Go to

    2. Select the latest release (or a specific version you need)

    Configure Your Project

    A. Ensure you have enabled and authenticated to DevHub

Ensure you have enabled DevHub in your production org or created a developer org.

    B. Authenticate your Salesforce CLI to DevHub

    sfp-pro

    SFP-Pro provides Docker images through our self-hosted Gitea registry at source.flxbl.io. These pre-built images are maintained and updated regularly with the latest features and security patches.

    Prerequisites

1. Access to source.flxbl.io (Gitea server)

2. Docker installed on your machine

3. Registry credentials from your welcome email

    Overview

sfp is a purpose-built CLI tool used predominantly in modular Salesforce projects.

    A project utilizing sfp implements the following concepts.

    Domains

In an sfp-powered Salesforce project, "Domains" represent distinct business capabilities. A project can encompass multiple domains, and each domain may include various sub-domains. Domains are not explicitly declared within sfp but are conceptually organized through "Release Configs".

    Supported package types

sfp cli supports operations on various types of packages within your repository. A short summary comparing the different package types is provided below.

The comparison covers the following package types: Unlocked Packages, Org-Dependent Unlocked Packages, Source Packages, Data Packages and Diff Packages.

    Identifying types of a package

This section details how sfp analyzes and classifies different package types using the information in sfdx-project.json.

    • Unlocked Packages are identified if a matching alias with a package version ID is found and verified through the DevHub. For example, the package named "Expense-Manager-Util" is found to be an Unlocked package upon correlation with its alias "Expense Manager - Util" and subsequent verification.

    • Source Packages are assumed if no matching alias is found in packageAliases. These packages are typically used for source code that is not meant to be packaged and released as a managed or unlocked package.

    Setup Salesforce Org

To fully leverage the capabilities of sfp, a few additional steps need to be configured in your Salesforce orgs. Please follow the steps below.

    1. Enable Dev Hub

To enable modular package development, the following configurations need to be turned on in order to create scratch orgs and unlocked packages.

Enable Dev Hub in your Salesforce org so you can create and manage scratch orgs and second-generation packages. Scratch orgs are disposable Salesforce orgs that support development and testing.

    Overview

The validate command helps you test a change (deployability, apex tests, coverage) made to your configuration or code against a target org. This command is typically triggered as part of your Pull Request (PR) or merge process to ensure the correctness of configuration/code before it is merged into your main branch.

    sfp validates a change by deploying the changed packages into the target org. This is different from 'check only' deployment in other CI/CD solutions.

validate can either utilise a scratch org from a tagged pool prepared earlier using the prepare command, or use a target org for its purpose.

The validate pool / validate org commands run the following checks, with options to enable additional features such as dependency and impact analysis:
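  • Checks accuracy of metadata by deploying the metadata to an org

  • Triggers Apex tests

  • Validates Apex test coverage of each package (default: 75%)

  • Toggle between different modes for validation:

    • Thorough (Default) - Comprehensive validation with full deployments and all tests

    • Individual - Validates changed packages individually, ideal for PRs

  • Automatically differentiates between:

    • Packages to synchronize (upstream changes)

    • Packages to validate (PR changes)

  • [optional] AI-Assisted Error Analysis - intelligent error analysis and fix suggestions (learn more)

  • [optional] Limit validation scope using release configurations (--releaseconfig)

  • [optional] Validate dependencies between packages for changed components

  • [optional] Disable diff check while validating (--diffcheck)

  • [optional] Disable parallel testing of apex tests (--disableparalleltesting)

  • [optional] Skip test execution entirely (--skipTesting)

  • [optional] Execute custom validation scripts for setup/cleanup workflows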

Determining whether an artifact needs to be built

sfp's build commands process the packages in the order mentioned in your sfdx-project.json. The commands also read the dependencies property and, when triggered, wait until all of a package's dependencies are resolved before triggering the equivalent package creation command (depending on the type of the package) and generating an artifact in the process.

sfp's build command is equipped with diffcheck functionality, which is enabled when one utilizes the diffcheck flag. A comparison (using git diff) is made between the latest source code and the previous version of the package published by the 'publish' command. If any difference is detected in the package directory, package version or scratch org definition file (applies to unlocked packages only), then the package will be created; otherwise, it is skipped. For example, consider the following packages in sfdx-project.json along with their dependencies:

Scenario 1: Build All

    1. Trigger creation of artifact for package A

    Limiting artifacts to be built

Artifacts to be built can be limited through various mechanisms. This section covers the techniques for limiting which artifacts are built.

    Limiting artifacts by domain

Artifacts to be built can be restricted during a build process in sfp by utilizing specific configurations. Consider the example of a release config provided earlier.

You can pass the path of the config file to the build command to limit the artifacts being built, as in the sample below:
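sfp build --releaseconfig <path-to-release-config.yaml> -v devhub --branch main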

    Project structure

    Projects that utilise sfp predominantly follow a mono-repo structure similar to the picture shown above. Each repository has a "src" folder that holds one or more packages that map to your sfdx-project.json file.

The different folders in the structure are explained below:

    1. core-crm: A folder to house all the core model of your org which is shared with all other domains.

    2. frameworks: This folder houses multiple packages which are basically utilities/technical frameworks such as Triggers, Logging and Error Handling, Dependency Injection etc.
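3. sales: An example of a domain in your org. Under this particular domain, multiple packages that belong to the domain are included.

4. src-access-mgmt: This package is typically deployed second to last in the deployment order and is used to store profiles, permission sets, and permission set groups that are applied across the org. Permission sets and permission set groups particular to a domain should be in their respective package directory.

5. src-env-specific: An aliasified package which carries metadata for each particular stage (environment) of your path to production. Some examples include named credentials, remote site settings, web links, custom metadata, custom settings, etc. src-env-specific should be added to .forceignore files and should not be deployed to a scratch org.

6. src-temp: This folder is marked as the default folder in sfdx-project.json. This is the landing folder for all metadata, and this particular folder doesn't get deployed anywhere other than a developer's scratch org. It is used to decide where new metadata should be placed.

7. src-ui: Should include page layouts, flexipages and Lightning/Classic apps, unless we are sure these will only reference the components of a single domain package and its dependencies. In general, custom UI components such as LWC, Aura and Visualforce should be included in a relevant domain package.

8. runbooks: This folder stores markdown files required for each release and/or sandbox refresh, to ensure all manual steps are accounted for and version controlled. As releases are completed to production, each release runbook can be archived, as the manual steps should typically no longer be required. Sandbox refresh runbooks should be managed according to the type of sandbox, depending on whether they contain data or only metadata.

9. scripts: This optional folder stores commonly used Apex or SOQL scripts that need to be version controlled and referenced by multiple team members.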

    Publish Artifact

The publish command pushes artifacts created by the build command to an npm registry, and also provides functionality to tag a version of the artifact in your git repository.
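A typical invocation might look like the sketch below; the flags shown are commonly used options and the scope is illustrative, so verify against sfp publish --help for your version:

sfp publish --artifactdir artifacts --npm --scope @myorg --gittag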

    Publishing to an NPM-Compatible Private Registry

    To publish packages to your private registry, you'll need to configure your credentials accordingly. Follow the guidelines provided by your registry's service to ensure a smooth setup.

    1. Locate Documentation: Jump into your private registry provider's documentation or help center.

    Updating Picklist


Salesforce inherently does not support the deployment of picklist values as part of unlocked package upgrades. This limitation has led to the need for workarounds, traditionally solved by either replicating the picklist in the source package or manually adding the changes in the target organization. Both approaches add a burden of extra maintenance work. This issue has been documented and recognised as a known problem, which can be reviewed in detail here.

    To ensure picklist consistency between local environments and the target Salesforce org, the following pre-deployment steps outline the process for managing picklists within unlocked packages:

    Optimized Installation


    sfp cli optimally deploys artifacts to the target organisation by reducing the time spent on running Apex tests where possible. This section explains how this optimisation works for different package types.


    Packages

A package (synonymous with a module) is a container that groups related metadata together. A package contains components such as objects, fields, Apex classes, flows and more, allowing these elements to be easily installed, upgraded and managed as a single unit. A package is defined as a directory in your project repository and is declared by an entry in sfdx-project.json.

    Artifacts

An artifact represents a versioned snapshot of a package at a specific point in time. It includes the source code from the package directory (as specified in sfdx-project.json), along with metadata about the version, changelogs, and other relevant details. Artifacts are the deployable units in the sfp framework, ensuring consistency and traceability across the development lifecycle. When sfp is integrated into a Salesforce project, it centralizes around the following key processes:

    Building a domain

'Building' refers to the creation of an artifact from a package. Utilizing the build command, sfp facilitates the generation of artifacts for each package in your repository. This process encapsulates the package's source code and metadata into a versioned artifact ready for installation.

    Publishing a domain

Publishing a domain involves pushing the artifacts generated by the build command into an artifact repository. This is the storage area from which artifacts are fetched for releases, rollbacks, etc.

    Releasing a domain

Releasing a domain involves promoting and installing a collection of artifacts to a higher org such as production, and generating an associated changelog for the domain. This process is driven by the release command along with a release definition.



    Dependency management

sfp features commands and functionality that help in dealing with the complexity of defining dependencies for unlocked packages. These functionalities are designed considering aspects of a #flxbl project, such as the predominant use of mono repositories in non-ISV scenarios.

    Package dependencies are defined in the sfdx-project.json (Project Manifest). More information on defining package dependencies can be found in the Salesforce docs.
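A sample manifest along these lines; the IDs in packageAliases are illustrative placeholders:

{
  "packageDirectories": [
    {
      "path": "util",
      "package": "Expense Manager - Util",
      "default": true,
      "versionNumber": "1.0.0.NEXT"
    },
    {
      "path": "exp-core",
      "package": "Expense Manager",
      "versionNumber": "3.2.0.NEXT",
      "dependencies": [
        {
          "package": "Expense Manager - Util",
          "versionNumber": "1.0.0.LATEST"
        },
        {
          "package": "TriggerFramework",
          "versionNumber": "1.7.0.LATEST"
        },
        {
          "package": "External Apex Library - 1.0.0.4"
        }
      ]
    },
    {
      "path": "exp-core-config",
      "package": "Expense Manager Org Config",
      "versionNumber": "1.0.0.NEXT"
    }
  ],
  "packageAliases": {
    "Expense Manager - Util": "0HoB00000004CFpKAM",
    "TriggerFramework": "0HoB00000004CFuKAM",
    "External Apex Library - 1.0.0.4": "04tB0000000IB1EIAW",
    "Expense Manager": "0HoB00000004CFzKAM"
  }
}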

    Let's unpack the concepts utilizing the above example:

    • There are two unlocked packages and one source package

      • Expense Manager - Util is an unlocked package in your DevHub, identifiable by 0H in the packageAlias

    • Expense Manager - another unlocked package, which is dependent on 'Expense Manager - Util', 'TriggerFramework' and 'External Apex Library - 1.0.0.4'

    • Expense Manager Org Config - a source package; let's assume it has some reports and dashboards

    • External Apex Library is an external dependency, it could be a managed package or any unlocked package released on a different Dev Hub. All external package dependencies must be defined with a 04t ID, which can be determined from the installation URL from AppExchange or by contacting your vendor.

  • Source packages, org-dependent unlocked packages and data packages have an implied dependency as defined by the order of installation; that is, it is assumed that any dependent metadata is already available in the target org before the installation of components within the package

Use of multiple config files in build command

The configFile flag in the build command specifies the features and settings of the scratch org used to validate your unlocked package. It is optional; if not passed in, the default is none.

    Typically, all packages in the same repo share the same scratch org definition. Therefore, you pass in the definition that you are using to build your scratch org pool and use the same to build your unlocked package.

    sfp build --configFile <path-to-config-file> -v <devhub>

    However, there is an option to use multiple definitions.

For example, if you have two packages, where package 1 is dependent on SharedActivities and package 2 is not: you would pass a scratch org definition file to package 1 via scratchOrgDefFilePaths in sfdx-project.json, while package 2 would use the default definition file.

    {
      "packageDirectories": [
        {
          "path": "package1",
          "default": true,
        },
        {
          "path": "package2",
          "default": false
        }
      ],
      "plugins": {
        "sfp":{
          "scratchOrgDefFilePaths":{
            "enableMultiDefinitionFiles": true,
            "packages": {
              "package1":"scratchOrgDef/package1-def.json"
            }
          }
        }
      }
    }
    Ensure you have authenticated your local development environment to your DevHub. If you are not familiar with the process, you can follow the instructions provided by Salesforce here.

    C. Add additional attributes to your Salesforce DX Project

sfp cli only works with a Salesforce DX project in source format, as described here. If your project is not in source format, you will need to convert it using the options provided by the Salesforce CLI.

    1. Navigate to your sfdx-project.json file and locate the packageDirectories property.

In the above example, the package directories are:

    • force-app

    • unpackaged

    • utils

2. Add the following additional attributes to the sfdx-project.json and save.

    • package

    • versionNumber
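For example, using the force-app directory from above (the version number is illustrative):

{
  "packageDirectories": [
    {
      "path": "force-app",
      "package": "force-app",
      "versionNumber": "1.0.0.NEXT",
      "default": true
    },
     ...
   ]
}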

That's the minimal configuration required to run sfp on a project.

    Move on to the next chapter to execute sfp commands in this directory.

    For detailed configurations on this sfdx-project.json schema for sfp, click here.

  • An enabled DevHub

  • Docker installed on your machine

  • Registry credentials from your welcome email

Accessing the Images

    1. Login to the Gitea registry:

2. Pull the desired image:

    The version numbers can be found at https://source.flxbl.io/flxbl/-/packages/container/sfp-pro/

3. (Optional) Tag for your registry:

    Best Practices

    1. Use specific version tags in production

    2. Cache images in your private registry for better performance

    3. Implement proper access controls in your registry

    4. Document image versions used in your pipelines

    Building Docker Images

    If you need to build the images yourself, you can access the source code from source.flxbl.io and follow these instructions:

    Prerequisites

    • Docker with BuildKit support

    • GitHub Personal Access Token with packages:read permissions

    • Node.js (for local development)

    Building the Base Image (sfp-pro-lite)

    Building the Image with SF CLI Bundled (sfp-pro)

    Build Arguments

    The following build arguments are supported:

    • NODE_MAJOR: Node.js major version (default: 22)

    • SFP_VERSION: Version of SFP Pro to build

    • GIT_COMMIT: Git commit hash for versioning

    • SF_COMMIT_ID: Salesforce commit ID

    Support

    For issues or questions about Docker images, please contact flxbl support through your designated support channels.

The presence of an additional type attribute within a package directory further informs sfp of the specific nature of the package, for instance "data" for data packages or "diff" for diff packages.

    The sfdx-project.json file outlines various specifications for Salesforce DX projects, including the definition and management of different types of Salesforce packages. From the sample provided, sfp (Salesforce Package Builder) analyzes the "package" attribute within each packageDirectories entry, correlating with packageAliases to identify package IDs, thereby determining the package's type as 2GP (Second Generation Packaging).

    Consider the following sfdx-project.json

Understanding Package Type Determination using the sample sfdx-project.json

    The sfdx-project.json sample can be used to determine how sfp processes and categorizes packages within a Salesforce DX project. The determination process for each package type, based on the attributes defined in the packageDirectories, unfolds as follows:

    • Unlocked Packages: For a package to be identified as an Unlocked package, sfp looks for a correlation between the package name defined in packageDirectories and an alias within packageAliases. In the provided example, the "Expense-Manager-Util" under the util path is matched with its alias "Expense Manager - Util", subsequently confirmed through the DevHub with its package version ID, categorizing it as an Unlocked package.

    • Source Packages: If a package does not have a corresponding alias in packageAliases, it is treated as a Source package. These packages are typically utilized for organizing source code not intended for release. For instance, packages specified in paths like "exp-core-config" and "expense-manager-test-data" would default to Source packages if no matching aliases are found.

    • Specialized Package Types: The explicit declaration of a type attribute within a package directory allows for the differentiation into more specialized package types. For example, the specification of "type": "data" explicitly marks a package as a Data package, targeting specific use cases different from typical code packages.

    Limiting artifacts by packages

Artifacts to be built can be limited to only certain packages. This can be very helpful if you want to build an artifact for only one package, or a few, while testing locally.

    Limiting artifacts by ignoring packages

Packages can be ignored by the build command by utilising an additional descriptor on the package in the project manifest (sfdx-project.json)

In the sample project manifest shown later in this section, src-env-specific-pre will be ignored when the build command is invoked

    # release-config for sales
    releaseName: sales
    pool: sales-pool
    excludeAllPackageDependencies: true 
    includeOnlyArtifacts:
      - sales-ui
      - sales-channels
    releasedefinitionProperties:
      skipIfAlreadyInstalled: true
      usePackageDiffDeploymentMode: true
      promotePackagesBeforeDeploymentToOrg: prod
      changelog:
        workItemFilters:
          -  (AKG|GIK)-[0-9]{2,5}
        workItemUrl: https://example.atlassian.net/browse
        limit: 30
  • Credentials Setup: Find the section dedicated to setting up publishing credentials or authentication.

  • Follow Steps: Adhere to the detailed steps provided by your registry to configure your system for publishing.

When working with sfp and managing artifacts, it is required to use a private NPM registry. This allows for more control over package distribution, increased security, and custom package management. Here are some popular NPM-compatible private registries, including instructions on how to configure NPM to use them:

    GitHub Packages

    GitHub Packages acts as a hosting service for npm packages, allowing for seamless integration with existing GitHub workflows. For configuration details, visit GitHub's guide on configuring NPM for use with GitHub Packages.

    GitLab NPM Registry

    GitLab offers a fully integrated npm registry within its continuous integration/continuous deployment (CI/CD) pipelines. To configure NPM to interact with GitLab’s registry, refer to GitLab's NPM registry documentation.

    Azure Artifacts

Azure Artifacts provides an npm feed that's compatible with NPM, enabling package hosting, sharing, and version control. For setup details, refer to the Azure Artifacts documentation.

    JFrog Artifactory

    JFrog Artifactory offers a robust solution for managing npm packages, with features supporting binary repository management. For setting up Artifactory to work with NPM, refer to JFrog’s npm Registry documentation.

    MyGet

    MyGet provides package management support for npm, among other package managers, facilitating the hosting and management of private npm packages. For specifics on utilizing NPM with MyGet, check out MyGet’s NPM support documentation.

Each of these registries offers its own advantages, and the choice between them should be based on your project's needs and existing infrastructure.

Utilising publish command

• Follow the instructions for your npm registry to generate a .npmrc file with the correct URL and access token (which has the permission to publish into your registry).

• Utilize the parameters in sfp publish and provide the npmrc file along with activating npm
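For instance, a typical invocation might look like this sketch; the registry flags are documented later in this guide, while the artifact directory flag and the scope value are assumptions to adjust for your setup:

sfp publish --artifactdir artifacts \
            --npm \
            --npmrcpath .npmrc \
            --scope @myorg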

    Tagging an artifact

sfp's publish command provides you with various options to tag the published artifacts in version control. Please note that these tags are utilized by sfp's build command to determine which packages are to be built when used with the diffcheck flag.
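For example, a sketch that creates and pushes annotated tags while publishing (combine with the registry flags shown above as needed):

sfp publish --artifactdir artifacts --npm --scope @myorg --gittag --pushgittag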

    1. Retrieval and Comparison: Initially, picklists defined within the packages marked for deployment will be retrieved from the target org. These retrieved values are then compared with the local versions of the picklists.

    2. Update on Change Detection: Should any discrepancies be identified between the local and retrieved picklists, the differing values will be updated in the target org using the Salesforce Tooling API.

    3. Handling New Picklists: During the retrieval phase, picklists that are present locally but absent in the target org are identified as newly created. These new picklists are deployed using the standard deployment process, as Salesforce permits the deployment of new picklists from unlocked packages.

    4. Picklist Enabled Package Identification: Throughout the build process, the sfp cli examines the contents of each unlocked package artifact. A package is marked as 'picklist enabled' if it contains one or more picklist(s).

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| enablePicklist | boolean | Enable picklist upgrade for unlocked packages | unlocked, org-dependent unlocked |
| Package Type | Apex Test Execution during installation | Coverage Requirement |
| --- | --- | --- |
| Unlocked Package | No | 75% for the contents of a package, validated during build |
| Org-Dependent Unlocked Package | No | No |
| Source Package | Yes | Yes; either each class needs individual 75% coverage, or the entire org's Apex test coverage is used |
| Diff Package | Yes | Yes; either each class needs individual 75% coverage, or the entire org's coverage is used |
| Data Package | No | No |

To meet Salesforce deployment requirements for source packages, each Apex class within the source package must achieve a minimum of 75% test coverage, verified individually for each class. It is imperative that the test classes responsible for this coverage are included within the same package.

During the artifact preparation phase, sfp cli will automatically identify Apex test classes included within the package. These identified test classes will then be utilised at the time of installation to verify that the test coverage requirement is met for each Apex class, ensuring compliance with Salesforce's deployment standards. In circumstances where individual coverage cannot be achieved by the Apex classes within a package, you can disable the 'optimized deployment' feature using the attribute mentioned below.

This option is only applicable to source/diff packages. Disabling it will trigger all local tests in the org and can lead to considerable delays.

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| isOptimizedDeployment | boolean | Detects test classes in a source package automatically and utilises them to deploy the provided package | source, diff |

    sfp commands
    
    Command                         Summary
     ─────────────────────────────── ─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
     apextests trigger               Triggers Apex unit test in an org. Supports test level RunAllTestsInPackage, which optionally allows validation of individual class code coverage
     artifacts fetch                 Fetch sfp artifacts from a NPM compatible registry using a release definition file
     artifacts promote               Promotes artifacts predominantly for unlocked packages with code coverage greater than 75%
     artifacts query                 Fetch details about artifacts installed in a target org
     build                           Build artifact(s) of your packages in the current project
     ...
    sfp build -v <DevHubAlias/DevHubUsername>
    sfp install -o <TargetOrgAlias/TargetOrgUsername>
     {
          "path": "src/health-cloud",
          "package": "health-cloud",
          "versionName": "health-cloud-1.0",
          "versionNumber": "1.0.0.0",
          "assignPermSetsPreDeployment": [
            "HealthCloudFoundation",
            "HealthCloudSocialDeterminants",
            "HealthCloudAppointmentManagement",
            "HealthCloudVideoCalls",
            "HealthCloudUtilizationManagement",
            "HealthCloudMemberServices",
            "HealthCloudAdmin",
            "HealthCloudApi",
            "HealthCloudLimited",
            "HealthCloudPermissionSetLicense",
            "HealthCloudStandard",
            "HealthcareAssociatedInfectionDiseaseGroupAccess"
          ]
        },
    // Sample package 
    
    {
      "packageDirectories": [
          {    
          "path": "packages/access-mgmt",
          "package": "access-mgmt",
          "default": false,
          "versionName": "access-mgmt",
          "versionNumber": "1.0.0.0",
          "reconcileProfiles": "true"
    }
  ]
}
    
    {
      "packageDirectories": [
        {
          "path": "src/frameworks/framework-core",
          "package": "framework-core",
          "versionNumber": "1.0.0.NEXT"
        },
        {
          "path": "src/frameworks/framework-config",
          "package": "framework-config",
          "versionNumber": "1.0.0.NEXT",
          "alwaysSync": true
        }
      ]
    }
    // An example where validate is utilized against a pool 
    // of earlier prepared scratch orgs with label review
     
    sfp validate pool  -p review \
                       -v devhub  \
                       --diffcheck
    // An example where validate is utilized against a target org 
    sfp validate org -o ci \
                     -v devhub  \
                     --installdeps
    // An example with branch references for intelligent synchronization
    sfp validate org -o ci \
                     -v devhub \
                     --ref feature-branch \
                     --baseRef main
    // An example with release config for domain-limited validation
    sfp validate org -o ci \
                     -v devhub \
                     --releaseconfig config/release-sales.yml
    {
      "packageDirectories": [
        {
          "path": "util",
          "package": "Expense-Manager-Util",
          "versionName": "Winter ‘25",
          "versionDescription": "Welcome to Winter 2025 Release of Expense Manager Util Package",
          "versionNumber": "4.7.0.NEXT"
        },
        {
          "path": "exp-core",
          "default": false,
          "package": "ExpenseManager",
          "versionName": "v 3.2",
          "versionDescription": "Winter 2025 Release",
          "versionNumber": "3.2.0.NEXT",
          "dependencies": [
            {
              "package": "ExpenseManager-Util",
              "versionNumber": "4.7.0.LATEST"
            },
              {
              "package": "TriggerFramework",
              "versionNumber": "1.7.0.LATEST"
            },
            {
              "package": "External Apex Library - 1.0.0.4"
            }
          ]
        },
        {
          "path": "src/exp-core-config",
          "package": "Expense-Manager-Org-Config",
          "type" : "source",
          "versionName": "Winter ‘25",
          "versionDescription": "Welcome to Winter 2025 Release of Expense Manager Util Package",
          "versionNumber": "4.7.0.NEXT"
    }
      ],
      "sourceApiVersion": "47.0",
      "packageAliases": {
        "TriggerFramework": "0HoB00000004RFpLAM",
        "Expense Manager - Util": "0HoB00000004CFpKAM",
        "External Apex [email protected]": "04tB0000000IB1EIAW",
        "Expense Manager": "0HoB00000004CFuKAM"
      }
    }
    { 
    "packageDirectories" : [ 
        { "path": "force-app", "default": true}, 
        { "path" : "unpackaged" }, 
        { "path" : "utils" } 
      ],
    "namespace": "", 
    "sfdcLoginUrl" : "https://login.salesforce.com", 
    "sourceApiVersion": "60.0"
    }
    {
       "packageDirectories" : [ 
        {
           "package": "force-app",
           "versionNumber": "1.0.0.NEXT",
           "path": "force-app",
           "default": true
         }, 
        { "path" : "unpackaged" },  // You can repeat the same for additonal directories
        { "path" : "utils" }  // You can repeat the same for additonal directories
       ],
    "namespace": "", 
    "sfdcLoginUrl" : "https://login.salesforce.com", 
    "sourceApiVersion": "60.0"
    }
    docker login source.flxbl.io -u your-username
    # For base sfp-pro image
    docker pull source.flxbl.io/sfp-pro-lite:version
    
    # For sfp-pro with SF CLI
    docker pull source.flxbl.io/sfp-pro:version
    # Tag for your registry
    docker tag source.flxbl.io/sfp-pro:version your-registry/sfp-pro:version
    
    # Push to your registry
    docker push your-registry/sfp-pro:version
    # Create a file containing your GITEA token
    echo "YOUR_GITEA_TOKEN" > .npmrc.token
    
    # Build the base sfp-pro image (without SF CLI)
    docker buildx build \
      --secret id=npm_token,src=.npmrc.token \
      --build-arg NODE_MAJOR=22 \
      --file dockerfiles/sfp-pro-lite.Dockerfile \
      --tag sfp-pro-lite:local .
    
    # Remove the token file
    rm .npmrc.token
    # Create a file containing your GITEA token
    echo "YOUR_GITEA_TOKEN" > .npmrc.token
    
    # Build the sfp-pro image with SF CLI bundled
    docker buildx build \
      --secret id=npm_token,src=.npmrc.token \
      --build-arg NODE_MAJOR=22 \
      --file dockerfiles/sfp-pro.Dockerfile \
      --tag sfp-pro:local .
    
    # Remove the token file
    rm .npmrc.token
    // Sample sfdx-project.json
    {
      "packageDirectories": [
        {
          "path": "util",
          "default": true,
          "package": "Expense-Manager-Util",
          "versionName": "Winter ‘20",
          "versionDescription": "Welcome to Winter 2020 Release of Expense Manager Util Package",
          "versionNumber": "4.7.0.NEXT"
        },
        {
          "path": "exp-core",
          "default": false,
          "package": "ExpenseManager",
          "versionName": "v 3.2",
          "versionDescription": "Winter 2020 Release",
          "versionNumber": "3.2.0.NEXT",
          "dependencies": [
            {
              "package": "ExpenseManager-Util",
              "versionNumber": "4.7.0.LATEST"
            },
              {
              "package": "TriggerFramework",
              "versionNumber": "1.7.0.LATEST"
            },
            {
              "package": "External Apex Library - 1.0.0.4"
            }
          ]
        },
        {
          "path": "exp-core-config",
          "package": "expense-manager-config",
          "versionNumber": "1.0.1.NEXT",
          "versionDescription": "This source package extends expense manager unlocked package",
        },
        {
          "path": "expense-manager-test-data",
          "package": "exppense-manager-test-data",
          "type":"data",
          "versionNumber": "1.0.1.NEXT",
          "versionDescription": "This source package extends expense manager unlocked package",
        },
      ],
      "sourceApiVersion": "47.0",
      "packageAliases": {
        "TriggerFramework": "0HoB00000004RFpLAM",
        "Expense Manager - Util": "0HoB00000004CFpKAM",
        "External Apex [email protected]": "04tB0000000IB1EIAW",
        "Expense Manager": "0HoB00000004CFuKAM"
      }
    }
    // Limit build by release config
    sfp build -v devhub --branch=main --domain config/sales.yaml
    // Limit build by certain packages
    sfp build -v devhub --branch=main -p sales-ui,sales-channels
    // Sample sfdx project json 
    {
      "packageDirectories": [
        {
          "path": "./src-env-specific-pre",
          "package": "src-env-specific-pre",
          "versionNumber": "1.0.0.0",
          "ignoreOnStage": [
            "build"
          ]
        },
        {
          "path": "./src/frameworks/feature-mgmt",
          "package": "feature-mgmt",
          "versionNumber": "1.0.0.NEXT"
        }
      ]
    }
      --npm                             Upload artifacts to a pre-authenticated private npm registry
      --npmrcpath=<value>               Path to .npmrc file used for authentication to registry. If left blank, defaults to home
                                        directory
      --npmtag=<value>                  Add an optional distribution tag to NPM packages. If not provided, the 'latest' tag is set
                                        to the published version.
      --scope=<value>                   (required for NPM) User or Organisation scope of the NPM package
      --gittag                          Tag the current commit ID with an annotated tag containing the package name and version -
                                        does not push tag
      --gittagage=<value>               Specifies the number of days,for a tag to be retained,any tags older the provided number
                                        will be deleted
      --gittaglimit=<value>             Specifies the minimum number of  tags to be retained for a package
      --pushgittag                      Pushes the git tags created by this command to the repo, ensure you have access to the repo
// Demonstrating how to disable picklist upgrade
    {
      "packageDirectories": [
        {
          "path": "core-crm",
          "package": "core-crm",
          "versionDescription": "Package containing core schema and classes",
          "versionNumber": "4.7.0.NEXT",
          "enablePicklist": false
        },
         ...
       ]
    }
    // Demonstrating how to disable optimized deployment
    {
      "packageDirectories": [
        {
          "path": "core-crm",
          "package": "core-crm",
          "versionDescription": "Package containing core schema and classes",
          "versionNumber": "4.7.0.NEXT",
          "isOptimizedDeployment": false
        },
         ...
       ]
    }
When creating a new version of an Entitlement Process, check the following:
    • API Name: Validate that the API name within the metadata file accurately reflects the new version.

    • Version Number: Update the versionNumber in your metadata file to represent the new version clearly.

    • Default Version: Confirm that only one version of the Entitlement Process is set as Default to avoid deployment conflicts.

    Download the appropriate installer for your platform:

| Platform | Installer | Filename Example |
| --- | --- | --- |
| Windows | MSI | sfp-pro-49.3.0-windows-x64.msi |
| macOS | DMG | sfp-pro-49.3.0-darwin-universal.dmg |
| Linux (Debian/Ubuntu) | DEB | sfp-pro_49.3.0_linux_amd64.deb |
| Linux (RHEL/Fedora) | RPM | sfp-pro_49.3.0_linux_amd64.rpm |

    Step 2: Install

    Windows

    macOS

    Linux (Debian/Ubuntu)

    Linux (RHEL/Fedora/CentOS)

    Step 3: Verify Installation

    Updating sfp-pro

    Download and run the latest installer - it will automatically upgrade your existing installation.

    Uninstalling

    Windows

    Use "Add or Remove Programs" in Control Panel

    macOS

    Linux (Debian/Ubuntu)

    Linux (RHEL/Fedora/CentOS)

Installers are available at: https://source.flxbl.io/flxbl/sfp-pro/releases

| | Unlocked | Org-Dependent Unlocked | Source | Data | Diff |
| --- | --- | --- | --- | --- | --- |
| Dependency Declaration | Yes | Yes (supported by sfp) | Yes | Yes | Yes (supported by sfp) |
| Requires dependency to be resolved during creation | Yes | No | No | N/A | No |
| Supported Metadata Types | Unlocked Package Section in Metadata Coverage Report | Unlocked Package Section in Metadata Coverage Report | Metadata API Section in Metadata Coverage Report | N/A | Metadata API Section in Metadata Coverage Report |
| Code Coverage Requirement | Package should have 75% code coverage or more | Not enforced by Salesforce; sfp by default checks for 75% code coverage | Each Apex class should have coverage of 75% or above for optimal deployment, otherwise the entire coverage of the org will be utilized for deployment | N/A | Each Apex class that's part of the delta between the current version and the baseline needs a test class and requires coverage of 75% |
| Component Lifecycle | Automated | Automated | Explicit; utilize destructiveManifest or manual deletion | N/A | Explicit; utilize destructiveManifest or manual deletion |
| Component Lock | Yes, only one package can own the component | Yes, only one package can own the component | No | N/A | No |
| Version Management | Salesforce-enforced versioning; promotion required to deploy to prod | Salesforce-enforced versioning; promotion required to deploy to prod | sfp-enforced versioning | sfp-enforced versioning | sfp-enforced versioning |
| Dependency Validation | Occurs during package creation | Occurs during package installation | Occurs during package installation | N/A | Occurs during package installation |

1. Navigate to the Setup menu

2. Go to Development > Dev Hub

3. Toggle the button to on for Enable Dev Hub

4. Enable Unlocked Packages and Second-Generation Managed Packages

    2. Install sfpowerscripts-artifact Unlocked Package

    The sfpowerscripts-artifact package is a lightweight unlocked package consisting of a custom setting SfpowerscriptsArtifact2__c that is used to keep a record of the artifacts that have been installed in the org. This enables package installation, using sfp, to be skipped if the same artifact version already exists in the target org.

    Once the command completes, confirm the unlocked package has been installed.

    1. Navigate to the Setup menu

    2. Go to Apps > Packaging > Installed Packages

    3. Confirm the package sfpowerscripts-artifact is listed in the "Installed Packages"


Ensure that you install the sfpowerscripts-artifact unlocked package in all the target orgs that you intend to deploy to using sfp.

    If refreshing from Production with sfpowerscripts-artifact already installed, you do not need to install again to your sandboxes.

• Once A is completed, trigger creation of artifacts for packages B & C, using the version of A created in step 1

  • Once C is completed, trigger creation of package D

Scenario 2: Build with diffCheck enabled on a package with no dependencies

    In this scenario, where only a single package has changed and diffCheck is enabled, the build command will only trigger the creation of Package B

Scenario 3: Build with diffCheck enabled on changes in multiple packages

In this scenario, where there are changes in multiple packages, say B & C, the build command will trigger the creation of artifacts for these packages in parallel, as their dependency, package A, has not changed (and is hence fulfilled). Please note that even though there is a change in C, package D will not be built unless there is an explicit change to the version number (major.minor.patch) of package D


    Migrating from sfp community edition to sfp pro edition

    Breaking Changes: sfp community edition to sfp-pro

| Command | Must Work | Notes |
| --- | --- | --- |
| sfp build (alias: orchestrator:build) | ✅ | Pro adds --commit flag |
| sfp install | ✅ | Core functionality unchanged |
| sfp publish | ✅ | |
| sfp release | ✅ | Core functionality unchanged |

    Critical: Update Your Pipelines

    Most orchestrator: command aliases have been removed in sfp-pro. You must update your pipelines:

| Old Command (sfp-comm) | New Command (sfp-pro) |
| --- | --- |
| sfp orchestrator:build | sfp build |
| sfp orchestrator:deploy | sfp install |
| sfp orchestrator:release | sfp release |
| sfp orchestrator:validate | sfp validate |
| sfp orchestrator:quickbuild | sfp quickbuild |

    New/Changed in sfp-pro:

Exceptions: sfp orchestrator:publish still works in sfp-pro, and both editions support sfp orchestrator:build as an alias for sfp build.

    • --commit flag added to build command for specifying commit hash

    • Server flags added to most commands (--server-url, --api-key)

• Additional commands in pro: server:*, dev:*, project:*, sandbox:*, config:*

    ✅ What Stays the Same

    • Core command functionality is unchanged

    • All flags from sfp-comm still work

    • Command outputs remain compatible

    Migration Notes:

    🆕 New in sfp-pro

    • All core orchestrator commands (build, install, publish, release) work identically

    • Existing pipelines using these commands will continue to work

    • New flags are optional and backward compatible

    Transitive Dependency Resolution

This feature is activated by default whenever build/quickbuild runs, even in implicit scenarios such as validate, prepare, etc., which might result in building packages.

    Let's consider the following sfdx-project.json to explain how this feature works.

The above project manifest (sfdx-project.json) describes three packages: sfdc-logging, feature-mgmt, and core-crm. Each package is defined with dependencies as described below.

| Package | Incorrectly Defined Dependencies |
| --- | --- |
| sfdc-logging | None |
| feature-mgmt | sfdc-logging |
| core-crm | feature-mgmt |

As you might have noticed, this is an incorrect representation: per the definition of an unlocked package, the package 'core-crm' should explicitly define all its dependencies, as described below.

| Package | Correctly Defined Dependencies |
| --- | --- |
| sfdc-logging | None |
| feature-mgmt | sfdc-logging |
| core-crm | sfdc-logging, feature-mgmt |

To successfully create a version of core-crm, both sfdc-logging and feature-mgmt should be defined as explicit dependencies in the sfdx-project.json.

As the number of packages grows in your project, developers often accidentally miss declaring dependencies, or the sfdx-project.json becomes unwieldy due to the large amount of repetition of dependencies between packages. This condition often results in the build stage failing with missing dependencies.

sfp features transitive dependency resolution, which can autofill the dependencies of a package by inferring them from the sfdx-project.json, so the above package descriptor of core-crm will be resolved correctly.
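Using the sample manifest shown later in this section, the autofilled descriptor for core-crm would look roughly like this:

{
    "path": "./src/core-crm",
    "package": "core-crm",
    "versionNumber": "1.0.4.NEXT",
    "dependencies": [
        { "package": "sfdc-logging", "versionNumber": "1.0.2.LATEST" },
        { "package": "feature-mgmt", "versionNumber": "1.0.6.LATEST" }
    ]
}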

    Please note, in the current iteration, it will autofill dependency information from the current sfdx-project.json and doesn't consider variations among previous versions.

    For dependencies outside of the sfdx-project.json, one could define an externalDependencyMap as shown below
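A minimal sketch of such a map; the shape follows the External Dependencies Configuration section later in this document, and the package names here are illustrative:

"plugins": {
    "sfp": {
        "externalDependencyMap": {
            "tech-framework@2.0.0.38": [
                {
                    "package": "sfdc-framework"
                }
            ]
        }
    }
}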

    If you need to disable this feature and have stringent dependency management rules, utilize the following in your sfdx-project.json
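A sketch of disabling the resolver; treat the exact key name as an assumption to verify against your sfp version:

"plugins": {
    "sfp": {
        "disableTransitiveDependencyResolver": true
    }
}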

An external dependency is a package that is not defined within the current repository's sfdx-project.json. Managed packages and unlocked packages built from other repositories fall into the 'external dependency' bucket. The IDs of external packages have to be defined explicitly in the packageAliases section.

    Development Environment

    In a Flxbl project powered by sfp, development follows a structured, iterative workflow designed for team collaboration and continuous delivery. This section guides you through the complete development lifecycle - from setting up your environment to submitting code for review.

    Prerequisites

    DevHub Access is Required for sfp development:

    • Building any type of package (source, unlocked, data)

    • Creating and managing scratch orgs

    • Resolving package dependencies

Ensure your Salesforce user has DevHub access by following Salesforce's DevHub setup guide.

    The Development Flow

    Development in an sfp project follows these key principles:

    1. Isolated Environments: Each developer works in their own sandbox or scratch org

    2. Source-Driven Development: All changes are tracked in version control

    3. Iterative Cycles: Developers continuously pull, modify, and push changes

    4. Automated Validation: Pull requests trigger comprehensive validation

    How Developers Work

    Developers in an sfp project typically follow this pattern:

    Starting a Sprint or Feature

    • Fetch a fresh environment from a pre-prepared pool

      • Scratch orgs: Can be fetched per story for maximum isolation

      • Sandboxes: Usually fetched at sprint start for longer-running work

• Pull the latest metadata to sync with the org

• Create a feature branch for version control

    Daily Development Cycle

    • Pull changes from the org to stay synchronized

    • Make modifications to metadata and code

    • Push changes back to the org for testing

    • Run tests locally to validate functionality

    Completing Work

• Build artifacts to ensure packages are valid

• Create a pull request with your changes

    • Automated validation runs via CI/CD pipeline

    • Review environment is created for acceptance testing

    • Merge after approval and successful validation

    Environment Management

    sfp provides powerful commands for environment management through pools:

    Local Scratch Org Pools (Community Edition)

    Pre-prepared scratch orgs managed locally via DevHub:

    Server-Managed Pools (sfp-pro)

    Centralized pool management for both scratch orgs and sandboxes:

    This pooling approach means developers spend less time waiting for environment creation and more time coding.
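As an illustration, fetching an org from a pool typically looks like the sketch below; the flag names are indicative, so check sfp pool fetch --help for your edition:

sfp pool fetch -t dev-pool -v devhub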

    Source Synchronization

    The foundation of sfp development is bidirectional source synchronization:

    Pull Command

    Retrieves changes from your org to local source:

    • Automatically handles text replacement reversals

    • Detects conflicts and provides resolution options

    • Suggests new replacements for hardcoded values

    Push Command

    Deploys local changes to your org:

    • Applies environment-specific text replacements

    • Handles destructive changes automatically

    • Supports various test execution levels
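A rough sketch of this cycle; the command names are assumed from the section titles above, so consult the command reference of your sfp edition:

sfp pull -o my-sandbox
sfp push -o my-sandbox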

    What's Next?

For a complete walkthrough of the development workflow with detailed commands and examples, see Development Workflow.

For specific development tasks, see Pull Changes from your org and Push Changes to your org.

For managing environments, see Pools.

    Dependency Management

    sfp provides powerful commands to manage and understand package dependencies in your Salesforce projects. These tools help you maintain clean dependency declarations, troubleshoot dependency issues, and optimize your build processes.

    Available Commands

| Command | Description |
| --- | --- |
| sfp dependency:expand | Add all transitive dependencies to make them explicit |
| sfp dependency:shrink | Remove redundant transitive dependencies for cleaner configuration |
| sfp dependency:explain | Analyze and understand package dependencies |

    Understanding Dependencies

    In Salesforce DX projects, packages can depend on other packages. These dependencies come in two forms:

    • Direct Dependencies: Dependencies explicitly declared in a package's configuration

    • Transitive Dependencies: Dependencies of your dependencies (indirect dependencies)

    For example, if Package A depends on Package B, and Package B depends on Package C, then:

    • Package A has a direct dependency on Package B

    • Package A has a transitive dependency on Package C

    Dependency Management Workflow

    A typical dependency management workflow involves:

1. Development Phase: Use sfp dependency:expand to make all dependencies explicit during development, helping identify potential issues early. Expanded dependencies also help build tools understand the complete dependency graph for optimized builds

    2. Analysis: Use sfp dependency:explain to understand dependency relationships and identify unnecessary dependencies

    3. Cleanup: Use sfp dependency:shrink before committing to maintain minimal, clean dependency declarations

    External Dependencies

    External dependencies are packages from outside your project, typically:

    • Managed packages from AppExchange

    • Packages from other Dev Hub organizations

    • Third-party components

    These are configured in your sfdx-project.json under plugins.sfp.externalDependencyMap:

    Best Practices

    1. Keep dependencies minimal: Only declare direct dependencies in your source control

    2. Use expand for analysis: Temporarily expand dependencies to understand the full graph

    3. Validate regularly: Run dependency commands in CI/CD to catch issues early

4. Document external dependencies: Clearly document why each external dependency is needed

5. Version carefully: Use specific versions for production, .LATEST for development

    Common Issues and Solutions

    Circular Dependencies

    If packages depend on each other in a circular manner, the dependency commands will report an error. Refactor your packages to break the circular dependency.

    Missing Dependencies

    If a package references components from another package without declaring the dependency, add the missing dependency to the package's configuration.

    Version Conflicts

    When different packages require different versions of the same dependency, sfp will use the highest compatible version. Consider standardizing versions across your project.

    See Also

• Package Dependencies - Conceptual overview of dependencies

• Transitive Dependency Resolution - How sfp resolves dependencies

    AI-Assisted Error Analysis

| | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | ❌ |
| From | October 25 | |

    sfp provides intelligent AI-assisted error analysis to help developers quickly understand and resolve validation failures. When enabled through the errorAnalysis configuration in ai-assist.yaml, the system automatically analyzes error patterns and provides actionable insights.

AI-assisted error analysis requires:

1. The errorAnalysis.enabled flag set to true in config/ai-assist.yaml

2. A configured LLM provider (OpenAI, Anthropic, etc.)

See Configuring LLM Providers for setup instructions.

    Quick Setup

    1. Create config/ai-assist.yaml in your project root:

    2. Add the minimal configuration:

    3. Set your LLM provider credentials (e.g., in CI/CD secrets)
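A minimal sketch of the configuration from step 2, showing only the errorAnalysis.enabled flag documented in this guide:

# config/ai-assist.yaml
errorAnalysis:
  enabled: true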

    That's it! AI error analysis will automatically activate during validation failures.

    How It Works

    1. Change Significance Analysis

    Before triggering AI analysis, sfp evaluates if changes are significant enough to warrant review:

    • Metadata Type Detection: Uses Salesforce ComponentSet for accurate identification

    • Smart Thresholds: Different thresholds per file type (Apex: 3 lines, Flows: 1 line, LWC: 10 lines)

    • Automatic Exclusions: Skips non-critical metadata (CustomLabels, StaticResources, Translations)

    2. Error Analysis

    When validation fails and changes are significant, AI provides:

    • Root Cause Analysis: Understanding why the error occurred

    • Quick Fix Suggestions: Immediate actions to resolve issues

    • Related Components: Other files that might be involved

    • Documentation Links: References to relevant Salesforce docs

    Configuration

    Configure AI assistance through config/ai-assist.yaml:

    Usage

    AI error analysis is automatically enabled when:

    1. A config/ai-assist.yaml file exists in your project

    2. The errorAnalysis.enabled flag is set to true

    3. Valid LLM provider credentials are available

    No additional CLI flags are required - sfp automatically detects and uses the configuration.

    Limiting Validation by Domain

    Validation processes often need to synchronize the provided organization by installing packages that differ from those already installed. This task can become particularly time-consuming for large projects with hundreds of packages, especially in monorepo setups with multiple independent domains.

    Using Release Configurations

    To streamline the validation process and focus it on specific domains, you can use release configurations with the --releaseconfig flag. This approach limits the scope of validation to only the packages defined in your release configuration, significantly enhancing efficiency and reducing validation time.

    Basic Usage
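A sketch of such an invocation, combining the flags shown earlier in this guide:

sfp validate org -o ci \
                 -v devhub \
                 --releaseconfig release-domain-sales.yml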

    In this example, validation is limited to packages defined in the release-domain-sales.yml configuration file. Only packages that:

    1. Are listed in the release configuration AND

    2. Have changes compared to what's installed in the org

    will be validated.

    Multiple Domain Configurations

    For projects with multiple independent domains, you can specify multiple release configurations:
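One hedged sketch is simply running one validation per domain configuration (verify whether your sfp version also accepts a comma-separated list for --releaseconfig; the file names here are illustrative):

sfp validate org -o ci -v devhub --releaseconfig config/release-sales.yml
sfp validate org -o ci -v devhub --releaseconfig config/release-service.yml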

    Benefits of Domain-Limited Validation

    1. Faster Feedback: Validate only the relevant packages for your team's domain

    2. Reduced Dependencies: Avoid failures from unrelated packages in other domains

    3. Parallel Development: Multiple teams can work independently without blocking each other

    4. Optimized CI/CD: Shorter validation times mean more efficient pipeline execution

    Example Release Configuration

    Combining with Other Options

    With Diff Check

    With Individual Mode

    With Branch References
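Hedged sketches of these combinations follow; --diffcheck, --ref and --baseRef appear earlier in this guide, while the individual mode flag is an assumption to verify against sfp validate --help:

# With diff check
sfp validate org -o ci -v devhub --releaseconfig config/release-sales.yml --diffcheck

# With individual mode (flag name assumed)
sfp validate org -o ci -v devhub --releaseconfig config/release-sales.yml --mode=individual

# With branch references
sfp validate org -o ci -v devhub --releaseconfig config/release-sales.yml --ref feature-branch --baseRef main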

    Best Practices

    1. Organize by Domain: Create separate release configurations for each logical domain

    2. Keep Configurations Updated: Regularly review and update package lists in release configs

    3. Use in CI/CD: Automate domain-specific validation in your pipeline

    4. Document Dependencies: Clearly document cross-domain dependencies in your configurations

    Note: The deprecated --mode=thorough-release-config and --mode=ff-release-config have been replaced by using the standard modes with the --releaseconfig flag. This provides the same functionality with a simpler, more consistent interface.

    Diff Package

A diff package is a variant of 'source package', where the contents of the package contain only the components that have changed. This package is generated by sfpowerscripts by computing a git diff of the current commit ID against a baseline set in the Dev Hub org.

A diff package mimics a change set model, where only changes are contained in the artifact. As such, this package is always incremental: it deploys only the changes relative to the baseline to the target org. Unless both the previous version and the current version have exactly the same components, a diff package can never be rolled back, as the impact on the org is unpredictable. It is always recommended to roll forward a diff package by fixing or creating a change that counters the unwanted impact on the target orgs.

Diff packages don't work with scratch orgs; they should be used with sandboxes only.

    A diff package is the least consistent package among the various package types available within sfpowerscripts, and should only be used for transitioning to a modular development model as prescribed by flxbl

The below example demonstrates an sfdx-project.json where the package unpackaged is a diff package. You can mark a diff package with the type 'diff'. All other attributes applicable to source packages are applicable to diff packages.
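A sketch of such a manifest (path and package names are illustrative):

{
  "packageDirectories": [
    {
      "path": "unpackaged",
      "package": "unpackaged",
      "type": "diff",
      "versionNumber": "1.0.0.NEXT"
    }
  ]
}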

A manual entry should be made in the SfpowerscriptsArtifact2__c custom setting with the name of the package, the baseline commit ID, and the version. Subsequent deployments will automatically reset the baseline in the Dev Hub when the package gets deployed.

    Building a collection of packages together

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| buildCollection | array | Build a collection of packages together, even if only one package among the collection is changed | unlocked, org-dependent unlocked, source, diff |

In certain scenarios, it is necessary to build a new version of a package whenever any package in a specified collection undergoes a change. This can be accomplished by utilizing the buildCollection attribute in the sfdx-project.json file. Below is an example illustrating how to define a collection of packages that should be built together.
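A sketch, under the assumption that each member of the collection lists the collection's package names (the names here are illustrative):

{
  "packageDirectories": [
    {
      "path": "src/sales-ui",
      "package": "sales-ui",
      "versionNumber": "1.0.0.NEXT",
      "buildCollection": ["sales-ui", "sales-channels"]
    },
    {
      "path": "src/sales-channels",
      "package": "sales-channels",
      "versionNumber": "1.0.0.NEXT",
      "buildCollection": ["sales-ui", "sales-channels"]
    }
  ]
}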

    Field History & Feed Tracking

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| enableFHT | boolean | Enable field history tracking for fields | unlocked, org-dependent unlocked |
| enableFT | boolean | Enable feed tracking for fields | unlocked, org-dependent unlocked |

Salesforce has a strict limit on the number of fields that can be tracked. This limit can be increased by raising a request with your Salesforce Account Executive (AE). However, it would be problematic if an unlocked package were deployed to orgs that do not have the limit increased, so don't be surprised when you are working on a flxbl project and find that the deployment of field history tracking from unlocked packages is disabled by Salesforce.

    One workaround is to keep a copy of all the fields that need to be tracked in a separate source package (field-history-tracking or similar) and deploy it as one of the last packages with the ‘alwaysDeploy’ option.

However, this specific package has to be carefully aligned with the original source/unlocked packages to which the fields originally belong. As the number of tracked fields increases in large projects, this package becomes larger and more difficult to maintain. In addition, since the project often does not own the metadata definition of fields from managed packages, it doesn't make much sense to carry the metadata only for field history tracking purposes.

To resolve this, sfp features the ability to automate the deployment of field history tracking for both unlocked packages and managed packages without having to maintain additional packages.

Two mechanisms are implemented to ensure that all the fields that need field history tracking enabled are properly deployed. Specifically, a YAML file that contains all the tracked fields is added to the package directory.

    During deployment, the YAML file is examined and the fields to be set are stored in an internal representation inside the deployment artifact. Meanwhile, the components to be deployed are analyzed and the ones with ‘trackHistory’ on are added to the same artifact. This acts as a double assurance that any field that is missed in the YAML files due to human error will also be picked up. After the package installation, all the fields in the artifact are retrieved from the target org and, for those fields, the trackHistory is turned on before they are redeployed to the org.

    In this way, the deployment of field history tracking is completely automated. One now has a declarative way of defining field history tracking (as well as feed tracking) without having to store the metadata in the repo. The only maintenance effort left for developers is to manage the YAML file.
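As a purely illustrative sketch, such a YAML file might enumerate object/field pairs as below; the actual file name and schema are defined by sfp, so verify against the sfp documentation before use:

# history-tracking.yml (hypothetical shape)
- objectName: Account
  fieldName: Industry
  trackHistory: true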

    Data Packages

Data packages are a sfpowerscripts construct that utilises SFDMU to create a versioned artifact of Salesforce object records in CSV format, which can be deployed to a Salesforce org using the sfpowerscripts package installation command.

The Data Package offers a seamless method of integrating Salesforce data into your CI/CD pipelines, and is primarily intended for record-based configuration of managed packages such as CPQ, Vlocity (Salesforce Industries), and nCino.

    Data packages are a wrapper around SFDMU that provide a few key benefits:

• Ability to skip the package if already installed: By keeping a record of the version of the package installed in the target org (with the support of an unlocked package), sfpowerscripts can skip installation of a data package if the same version is already installed in the org

    Destructive Changes

    sfp handles destructive changes according to the type of package. Here is a rundown on how the behaviour is according to various package types and modes

    Unlocked Packages

Salesforce handles destructive changes in unlocked packages / org-dependent unlocked packages as part of the package upgrade process. From the Salesforce documentation (https://developer.salesforce.com/docs/atlas.en-us.sfdx_dev.meta/sfdx_dev/sfdx_dev_unlocked_pkg_install_pkg_upgrade.htm):

Metadata that was removed in the new package version is also removed from the target org as part of the upgrade. Removed metadata is metadata not included in the current package version install, but present in the previous package version installed in the target org. If metadata is removed before the upgrade occurs, the upgrade proceeds normally. Some examples where metadata is deprecated and not deleted are:

• User-entered data in custom objects and fields is deprecated and not deleted. Admins can export such data if necessary.

• An object such as an Apex class is deprecated and not deleted if it's referenced in a Lightning component that is part of the package.

sfp utilizes mixed mode while installing unlocked packages to the target org, so any metadata that can be deleted is removed from the target org. If a component is deprecated, it has to be manually removed. Components that are hard deleted upon a version upgrade are found here.

    Shrink Dependencies

    sfp-pro
    sfp (community)

    The sfp dependency:shrink command optimizes your project's dependency declarations by removing redundant transitive dependencies from your sfdx-project.json. This results in a cleaner project configuration with only the necessary direct dependencies declared for each package.

    Explain Dependencies

    sfp-pro
    sfp (community)

    The sfp dependency:explain command helps you understand the dependencies between packages in your project. It can analyze either a specific package's dependencies or all package dependencies in the project, showing both direct and transitive dependencies.

    Pre/Post Deployment Script

In some situations, you might need to execute a pre/post deployment script to manipulate data before or after it is deployed to the org.

    String Replacements


    AI Assisted Insight Report

    sfp-pro
    sfp (community)

    The AI-powered report functionality generates comprehensive analysis reports for your Salesforce projects using advanced language models. This feature provides deep insights into code quality, architecture, and best practices specific to the Flxbl framework.

        "plugins": {
            "sfp": {
              "disableEntitlementFilter": true //disable entitlement filtering
              }
            }
    # Double-click the .msi file or run:
    msiexec /i sfp-pro-*.msi
    # 1. Open the DMG file
    # 2. Drag sfp-pro.app to your Applications folder
    # 3. Run the installer script from the DMG:
    sudo bash /Volumes/sfp-pro-*/install-cli.sh
    # Or if you've already unmounted the DMG:
    sudo /Applications/sfp-pro.app/Contents/Resources/install-cli.sh
    sudo dpkg -i sfp-pro_*.deb
    sudo rpm -i sfp-pro_*.rpm
    # or
    sudo yum install sfp-pro_*.rpm
    sfp --version
    # Example output: @flxbl-io/sfp/49.3.0 linux-x64 node-v22.0.0
    sudo rm -rf /Applications/sfp-pro.app
    sudo rm -f /usr/local/bin/sfp
    sudo dpkg -r sfp-pro
    sudo rpm -e sfp-pro
    # or
    sudo yum remove sfp-pro
    sf package install --package 04t1P000000ka9mQAA -o <your_org_alias> --security-type=AdminsOnly --wait=120
    
    Waiting 120 minutes for package install to complete.... done
    Successfully installed package [04t1P000000ka9mQAA]
    {
        "packageDirectories": [
           {
                "path": "./src/frameworks/sfdc-logging",
                "package": "sfdc-logging",
                "versionName": "Version 1.0.2",
                "versionNumber": "1.0.2.NEXT"
            },
            {
                "path": "./src/frameworks/feature-mgmt",
                "package": "feature-mgmt",
                "versionName": "Version 1.0.6",
                "versionNumber": "1.0.6.NEXT",
                "dependencies": [
                    {
                        "package": "sfdc-logging",
                        "versionNumber": "1.0.2.LATEST"
                    }
                ]
            },
            {
                "path": "./src/core-crm",
                "package": "core-crm",
                "versionName": "Version 1.0.4",
                "versionNumber": "1.0.4.NEXT",
                "dependencies": [
                    {
                        "package": "feature-mgmt",
                        "versionNumber": "1.0.6.LATEST"
                    }
                ]
            }
        ],
        "namespace": "",
        "sfdcLoginUrl": "https://login.salesforce.com",
        "sourceApiVersion": "60.0",
        "packageAliases": {
            "feature-mgmt": "0Ho5f000000GmkrCAC",
            "sfdc-logging": "0Ho5f000000GmerCAC",
            "core-crm": "0Ho5f000000Amz7CAC"
        },
        "plugins": {}
    }






    Build Optimization: Expanded dependencies help build tools understand the complete dependency graph for optimized builds

    Version carefully: Use specific versions for production, .LATEST for development

    sfp dependency:shrink

    Remove redundant transitive dependencies for cleaner configuration

    sfp dependency:explain

    Analyze and understand package dependencies

    Package Dependencies
    Transitive Dependency Resolution
    sfp dependency:expand


    Source Packages

Source packages support destructive changes using a folder structure to demarcate components that need to be deleted. One can make use of pre-destructive and post-destructive folders to mark components that need to be deleted.

The package installation is a single deployment transaction, with the components that are part of pre/post deployment applied along with the destructive operations specified in the folder structure. This allows one to refactor the current code so that the destructive changes succeed, as deletion is often only allowed if no existing components in the org reference the component being deleted.

    {% hint style="info" %} Destructive Changes support for source package is currently available only in sfp (pro) version. {% endhint %}


    {% hint style="info" %}

    Things to look out for

    • Test destructive changes in your review environment thoroughly before merging your changes

    • You will need to understand the dependency implications while dealing with destructive changes, especially the follow on effects of a deletion in other packages, It is recommended you do a compile all of all apex classes (https://salesforce.stackexchange.com/a/149955 & https://salesforce.stackexchange.com/a/391614) to detect any errors on apex classes or triggers

    • After the version of package is installed across all the target orgs, you would need to merge another change which would remove the post-destructive or pre-destructive folders. You do not need to rush through this , as sfp ignores any warning associated with missing components in the org {% endhint %}


    Data Packages

    Data packages utilize sfdmu under the hood, and one can utilize any of the below approaches to remove data records.

    Approach 1: Combined Upsert and Delete Operations

    One effective method involves configuring SFDMU to perform both upsert and delete operations in sequence for the same object. This approach ensures comprehensive data management—updating and inserting relevant records first, followed by removing outdated entries based on specific conditions.

    Upsert Operation: Updates or inserts records based on a defined external ID, aligning the Salesforce org with new or updated data from a source file.

    Delete Operation: Deletes specific records that meet certain criteria, such as being marked as inactive, to ensure the org only contains relevant and active data.
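A sketch of an SFDMU export.json for this approach; the object and field names are illustrative assumptions:

{
  "objects": [
    {
      "query": "SELECT Id, Name, IsActive__c FROM PricingRule__c",
      "operation": "Upsert",
      "externalId": "Name"
    },
    {
      "query": "SELECT Id FROM PricingRule__c WHERE IsActive__c = false",
      "operation": "Delete"
    }
  ]
}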

    Approach 2: Utilizing deleteOldData

    Another approach involves using the deleteOldData parameter. This parameter is particularly useful when needing to clean up old data that no longer matches the current dataset in the source before inserting or updating new records.

    • Delete Old Data: Before performing data insertion or updates, SFDMU can be configured to remove all existing records that no longer match the new dataset criteria, thus simplifying the maintenance of data freshness and relevance in the target org
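A sketch using deleteOldData, again with illustrative object names; records in the target that no longer match the source dataset are removed before the upsert:

{
  "objects": [
    {
      "query": "SELECT Id, Name FROM PricingRule__c",
      "operation": "Upsert",
      "externalId": "Name",
      "deleteOldData": true
    }
  ]
}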

    Usage
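A sketch of a typical invocation, using the flags documented below:

sfp dependency:shrink -v devhub --overwrite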

    Flags

| Flag | Description | Required |
| --- | --- | --- |
| -o, --overwrite | Overwrites the existing sfdx-project.json file with the shrunk configuration | No |
| -v, --targetdevhubusername | Username or alias of the target Dev Hub org | Yes |
| --loglevel | Logging level (trace, debug, info, warn, error, fatal) | No |

    Behavior

    • Without --overwrite, creates a new file at ./project-config/sfdx-project.min.json

    • With --overwrite, backs up existing sfdx-project.json to ./project-config/sfdx-project.json.bak and overwrites the original

    • Removes transitive dependencies that are already covered by direct dependencies

    • Preserves external dependency mappings

    External Dependencies Configuration

    External dependencies can be configured in your sfdx-project.json file using the plugins.sfp section. This is particularly useful for managed packages or packages from other Dev Hubs that your project depends on.

    Configuration Format

    Example

    Notes

    • External dependencies must be defined with their 04t IDs (subscriber package version IDs)

    • The versionNumber can use .LATEST to automatically use the latest version that matches the major.minor.patch pattern

    • External dependencies are preserved during both shrink and expand operations

    • These dependencies are automatically included when calculating transitive dependencies

|  | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | ✅ |

    Usage

    Flags

| Flag | Description | Required |
| --- | --- | --- |
| -p, --package | Name of the package to analyze dependencies for | No |
| --json | Format output as JSON | No |
| --loglevel | Logging level (trace, debug, info, warn, error, fatal) | No |

    Understanding the Output

    When analyzing dependencies, the command provides information about:

    • Direct dependencies: Dependencies explicitly declared in the package's configuration

    • Transitive dependencies: Dependencies that are required by your direct dependencies

    • For transitive dependencies, the command shows which packages contribute to requiring that dependency

    JSON Output Structure

    When using the --json flag, the command returns data in the following structure:

    Examples

    Analyze all package dependencies:

    Analyze dependencies for a specific package:

    Get dependencies in JSON format:

    Notes

    • The command requires a valid SFDX project configuration

    • Dependencies are resolved based on the information in your sfdx-project.json file

    • Transitive dependency resolution helps identify indirect dependencies that might not be immediately obvious from the project configuration

|  | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | ✅ |
| From | November 24 | December 24 |

sfp allows you to provide a path to a shell script (Mac/Unix) or a batch script (Windows).

    The scripts are called with the following parameters. In your script you can refer to the parameters using positional parameters.

Please note scripts are copied into the artifacts and are not executed from version control. sfp only copies the script mentioned by this parameter and does not copy any additional files or dependencies. Please ensure pre/post deployment scripts are self-contained or able to download their own dependencies.

| Position | Value |
| --- | --- |
| 1 | Name of the package |
| 2 | Username of the target org where the package is being deployed |
| 3 | Alias of the target org where the package is being deployed |
| 4 | Path to the working directory that has the contents of the package |
| 5 | Path to the package directory. Combine parameters 4 and 5 to find the absolute path to the contents of the package |

Please note the script has to be completely self-contained and must not depend on files in version control, as scripts are executed within the context of an artifact.

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| preDeploymentScript | string | Run an executable script before deploying an artifact. Users need to provide a path to the script file | unlocked, org-dependent unlocked, source, diff |
| postDeploymentScript | string | Run an executable script after deploying a package. Users need to provide a path to the script file | unlocked, org-dependent unlocked, source, diff |

    String replacements are available only for Source Packages and require sfp-pro (available from September 2025)

    String replacements enable automatic substitution of placeholder values with environment-specific values during package installation, similar to how aliasfy packages work with folder structures.

    How It Works

    During build, when a package contains a preDeploy/replacements.yml file:

    1. The configuration is analyzed and validated

    2. The replacement patterns are embedded in the artifact

    3. During installation, placeholders are replaced based on the target org

    For example, if you have placeholders like %%API_ENDPOINT%% in your code, they get replaced with the appropriate value for the target environment during installation.

    Configuration

    Place your replacements.yml file in the package's preDeploy directory:

    The replacement configuration will be automatically detected and processed during build.

    Package Type Requirements

    String replacements are only supported for source packages. If you attempt to use replacements with other package types (unlocked, org-dependent unlocked, or data), the build will fail with an error message indicating that you must either remove the replacements.yml file or change the package type to 'source'.

    Example Configuration

    The replacements.yml file in the preDeploy folder is automatically detected and processed during build.

    When to Use String Replacements vs Aliasfy Packages

    String replacements and aliasfy packages are complementary features for managing environment-specific configurations:

    Use String Replacements For:

    • Configuration values that change between environments (API endpoints, email addresses, feature flags)

    • Reducing duplication when only specific values differ in otherwise identical files

    • Maintaining single source for business logic while varying configuration

    • Code-based metadata (Apex classes, Lightning components) with environment-specific values

    Use Aliasfy Packages For:

    • Structural differences between environments (different fields, objects, workflows)

    • Completely different metadata per environment (unique page layouts, permission sets)

    • Declarative configurations that vary significantly by environment

    • Binary metadata that cannot be text-replaced (images, static resources)

    Migration Strategy

    If you're currently using aliasfy packages with duplicated files just for value changes, consider migrating to string replacements:

    1. Identify candidates: Find files duplicated solely for configuration value changes

    2. Consolidate files: Create single source file with placeholder patterns

    3. Create replacements.yml: Define environment-specific values

    4. Remove duplicates: Delete redundant files from aliasfy folders

    5. Test thoroughly: Validate in lower environments before production

    Example: Before and After Migration

    Before (Using Aliasfy - 3 duplicate files):

    After (Using String Replacements - 1 file + config):

    String replacements reduce maintenance overhead by eliminating file duplication while maintaining environment-specific configurations

    Related Documentation

    • String Replacements Overview

    • String Replacements During Install

    • Aliasfy Packages

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| replacements | configuration file | Enable replacement of placeholder values with environment-specific values during installation | source |

    Overview

    The report generator analyzes your codebase through multiple perspectives:

    • Package architecture and design patterns

    • Dependencies and coupling between packages

    • Code quality and technical debt

    • Flxbl best practices compliance

    • Security and compliance considerations

    Prerequisites

    For complete setup and configuration instructions, see Configuring LLM Providers.

    Quick Setup

    Basic Usage

    Package Analysis

    Domain Analysis ( sfp-pro only)

    Provider-Specific Examples

    Anthropic (Recommended)

    GitHub Copilot

    Ensure the corresponding models are activated in GitHub Copilot Settings

    Amazon Bedrock

    Output Format

    Reports are generated in Markdown format with the following structure:

    1. Executive Summary - High-level findings and recommendations

    2. Package/Domain Overview - Architecture and design analysis

    3. Dependencies Analysis - Inter-package relationships

    4. Code Quality Insights - Technical debt and improvement opportunities

    5. Recommendations - Prioritized action items

    Troubleshooting

    OpenCode CLI Not Found

    If you see an error about OpenCode CLI not being installed:

    Rate Limiting

    If you encounter rate limits:

    • Reduce --prompt-count to lower token usage

    • Analyze smaller scopes (single package vs domain)

    • Consider using a different model with higher limits

    Cost Considerations

    • Token Usage: Package analysis typically uses 10-30K tokens, domains 30-80K tokens

    • Models: Default models are optimized for best value and performance

    • GitHub Copilot: No additional cost if you have Copilot subscription

    • Amazon Bedrock: Pay-per-use pricing through AWS, check Bedrock pricing in your region

|  | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | ❌ |
| From | October 25 | Not Available |

     {
                "path": "./src/core-crm",
                "package": "core-crm",
                "versionName": "Version 1.0.6",
                "versionNumber": "1.0.6.NEXT",
                "dependencies": [
                      {
                        "package": "sfdc-logging",
                        "versionNumber": "1.0.2.LATEST"
                    },
                    {
                        "package": "feature-mgmt",
                        "versionNumber": "1.0.6.LATEST"
                    }
                ]
            },
            
     "plugins": {
            "sfp": {
                "disableTransitiveDependencyResolver": true,
                "ignoreFiles": {
                    "prepare": ".forceignore",
                    "validate": ".forceignore",
                    "quickbuild": "forceignores/.quickbuildignore",
                    "build": "forceignores/.buildignore"
                },
                "externalDependencyMap": {
                    "trigger-framework": [
                        {
                            "package": "0H1000XRTCam",
                            "versionNumber": "1.0.3.LATEST"
                        }
                    ],
              }
          }
    }
    "plugins": {
            "sfp": {
                "disableTransitiveDependencyResolver": true,
            }
        }
    # Fetch a scratch org for your feature (aliases: pool:fetch)
    sfp pool scratch fetch --tag dev-pool --alias my-feature \
      --targetdevhubusername mydevhub
    # Fetch an org instance from the server pool
    sfp server pool instance fetch \
      --repository myorg/myrepo \
      --tag dev-pool \
      --assignment-id feature-123
    
    # The same command works for both scratch orgs and sandboxes
    # Pool type is configured at the server level
    {
      "plugins": {
        "sfp": {
          "externalDependencyMap": {
            "external-package-name": [
              {
                "package": "04tXXXXXXXXXXXXXXX",
                "versionNumber": "1.0.0.LATEST"
              }
            ]
          }
        }
      }
    }
    mkdir -p config
    touch config/ai-assist.yaml
    # config/ai-assist.yaml
    errorAnalysis:
      enabled: true
    # config/ai-assist.yaml
    
    # Error Analysis Configuration
    errorAnalysis:
      enabled: true                    # Enable/disable AI error analysis
      provider: openai                 # AI provider (openai, anthropic, etc.)
      model: gpt-4                     # Model to use
      timeout: 180000                  # Timeout in ms (default: 3 minutes)
      maxSuggestedFixes: 5            # Max number of fix suggestions
    
    # Change Significance Configuration
    architecture:
      changeSignificance:
        # Metadata types to exclude from analysis
        excludedMetadataTypes:
          - CustomLabels
          - StaticResource
          - CustomTab
    
        # Thresholds for triggering analysis
        fileTypeThresholds:
          apex:
            lines: 3    # Very strict for Apex
            files: 1
          flows:
            lines: 1    # Any flow change
            files: 1
          lwc:
            lines: 10   # More lenient for UI
            files: 2
    sfp validate org -o ci \
                     -v devhub \
                     --mode thorough \
                     --releaseconfig=config/release-domain-sales.yml
    sfp validate org -o ci \
                     -v devhub \
                     --mode thorough \
                     --releaseconfig config/release-sales.yml \
                     --releaseconfig config/release-service.yml
    # config/release-sales.yml
    name: Sales Domain
    includeOnlyPackages:
      - sales-core
      - sales-ui
      - sales-integrations
      - opportunity-management
      - quote-management
    skipPackages:
      - sales-deprecated
    sfp validate org -o ci \
                     -v devhub \
                     --diffcheck \
                     --releaseconfig=config/release.yml
    sfp validate org -o ci \
                     -v devhub \
                     --mode individual \
                     --releaseconfig=config/release.yml
    sfp validate org -o ci \
                     -v devhub \
                     --ref feature-branch \
                     --baseRef main \
                     --releaseconfig=config/release.yml
    // sfdx-project.json
    
    {
      "packageDirectories": [
        {
          "path": "util",
          "default": false,
          "package": "Expense-Manager-Util",
          "versionName": "Winter ‘24",
          "versionDescription": "Welcome to Winter 2024 Release of Expense Manager Util Package",
          "versionNumber": "4.7.0.NEXT"
        },
         {
          "path": "unpackaged",
          "default": true,
          "package": "unpackaged",
          "versionName": "v3.2",
          "type":"diff"
        }
      ],
      "sourceApiVersion": "58.0",
      "packageAliases": {
        "TriggerFramework": "0HoB00000004RFpLAM",
        "Expense Manager - Util": "0HoB00000004CFpKAM",
        "External Apex [email protected]": "04tB0000000IB1EIAW",
        "Expense Manager": "0HoB00000004CFuKAM"
      }
    }
    {
      "packageDirectories": [
        {
          "path": "core",
          "package": "core-package",
          "versionName": "Core 1.0",
          "versionNumber": "1.0.0.NEXT",
          "default": true,
          "buildCollection": [
            "core-package",
            "featureA-package",
            "featureB-package"
          ]
        },
        {
          "path": "features/featureA",
          "package": "featureA-package",
          "versionName": "Feature A 1.0",
          "versionNumber": "1.0.0.NEXT"
           "buildCollection": [
            "core-package",
            "featureA-package",
            "featureB-package"
          ]
        },
        {
          "path": "features/featureB",
          "package": "featureB-package",
          "versionName": "Feature B 1.0",
          "versionNumber": "1.0.0.NEXT",
          "buildCollection": [
            "core-package",
            "featureA-package",
            "featureB-package"
          ]
        }
  ]
}
// Consider a source package feature-management
// with path as src/feature-management

└── feature-management
    ├── main
    │   └── default
    │       └── <metadata-contents>
    ├── post-destructive
    │   └── <metadata-contents>
    ├── pre-destructive
    │   └── <metadata-contents>
    └── test
    {
      "name": "CustomObject__c",
      "operation": "Upsert",
      "externalId": "External_Id__c",
      "query": "SELECT Id, Name, IsActive__c FROM CustomObject__c WHERE SomeCondition = true"
    }
    {
      "name": "CustomObject__c",
      "operation": "Delete",
      "query": "SELECT Id FROM CustomObject__c WHERE IsActive__c = false"
    }
    // Use of deleteOldData
    {
      "name": "CustomObject__c",
      "operation": "Upsert",
      "externalId": "External_Id__c",
      "deleteOldData": true
    }
    
    # Create a shrunk version of the project configuration
    sfp dependency:shrink
    
    # Overwrite the existing sfdx-project.json with shrunk dependencies
    sfp dependency:shrink --overwrite
    {
      "packageDirectories": [...],
      "plugins": {
        "sfp": {
          "externalDependencyMap": {
            "package-name": [
              {
                "package": "04tXXXXXXXXXXXXXXX",
                "versionNumber": "1.0.0.LATEST"
              }
            ]
          }
        }
      }
    }
    {
      "plugins": {
        "sfp": {
          "externalDependencyMap": {
            "trigger-framework": [
              {
                "package": "0H1000XRTCam",
                "versionNumber": "1.0.3.LATEST"
              }
            ]
          }
        }
      }
    }
    # Explain dependencies for all packages
    sfp dependency:explain
    
    # Explain dependencies for a specific package
    sfp dependency:explain -p <package-name>
    {
      "packages": [
        {
          "name": "package-name",
          "dependencies": [
            {
              "name": "dependency-name",
              "version": "version-number",
              "type": "direct|transitive",
              "contributors": ["package-names"] // Only present for transitive dependencies
            }
          ]
        }
      ]
    }
    sfp dependency:explain
    sfp dependency:explain -p my-package
    sfp dependency:explain -p my-package --json
// Sample package

{
  "packageDirectories": [
    {
      "path": "src/data-package-cl",
      "package": "data-package-cloudlending",
      "type": "data",
      "versionNumber": "2.0.10.NEXT",
      "preDeploymentScript": "scripts/enableEmailDeliverability.sh",
      "postDeploymentScript": "scripts/pushData.sh"
    }
  ]
}
    
#!/bin/bash
# A script to enable email deliverability to 'All email'
# using the sfdx-browserforce-plugin

export SF_DISABLE_DNS_CHECK=true

# Create a temporary file
temp_file="$(mktemp)"

# Write the browserforce configuration to the temp file
# (quote the delimiter so "$schema" is not expanded by the shell)
cat << 'EOT' >> "$temp_file"
{
  "$schema": "https://raw.githubusercontent.com/amtrack/sfdx-browserforce-plugin/master/src/plugins/schema.json",
  "settings": {
    "emailDeliverability": {
      "accessLevel": "All email"
    }
  }
}
EOT

# Apply the browserforce configuration only when the org alias ($3) is 'ci'
if [ "$3" = 'ci' ]; then
  sf browserforce:apply -f "$temp_file" --target-org "$3"
fi

# Clean up by removing the temporary file
rm "$temp_file"
    src/
      your-package/
        preDeploy/
          replacements.yml
        main/
          default/
    // Demonstrating package directory with string replacements
    {
      "packageDirectories": [
        {
          "path": "src/your-package",
          "package": "your-package",
          "versionNumber": "1.0.0.NEXT"
          // No specific flag needed - replacements are auto-detected
        }
      ]
    }
    src-env-specific/
      default/classes/APIService.cls     # endpoint = "https://api-dev.example.com"
      staging/classes/APIService.cls     # endpoint = "https://api-staging.example.com"
      production/classes/APIService.cls  # endpoint = "https://api.example.com"
    // Single APIService.cls with placeholder
    public class APIService {
        private static final String ENDPOINT = '%%API_ENDPOINT%%';
    }
    
    // preDeploy/replacements.yml
    replacements:
      - name: "API Endpoint"
        pattern: "%%API_ENDPOINT%%"
        glob: "**/APIService.cls"
        environments:
          default: "https://api-dev.example.com"
          staging: "https://api-staging.example.com"
          production: "https://api.example.com"
    # Install OpenCode CLI
    npm install -g opencode-ai
    # Analyze single package
    sfp project report --package nextGen
    
    # Analyze multiple packages
    sfp project report --package core --package utils --output core-utils-analysis.md
    # Analyze all packages in a domain
    sfp project report --domain billing --output billing-analysis.md
    # Uses defaults (provider: anthropic, model: claude-sonnet-4-20250514)
    sfp project report --package nextGen --output nextgen-analysis.md
    
    # Specify different model (if needed)
    sfp project report --model claude-sonnet-4-20250514 --package core
    # Must specify provider explicitly (uses claude-sonnet-4 by default)
    sfp project report --provider github-copilot --package rate-changes
    
    # Uses default model for GitHub Copilot
    sfp project report --provider github-copilot --domain service
    # Uses default model: anthropic.claude-sonnet-4-20250514-v1:0
    sfp project report --provider amazon-bedrock --package core
    
    # Specify different region (if not in environment)
    export AWS_REGION=eu-west-1
    sfp project report --provider amazon-bedrock --domain billing
    # Install OpenCode CLI
    npm install -g opencode-ai
    
    # Verify installation
    opencode --version
• Versioned Artifact: Aligned with sfp's principle of traceability, every deployment is traceable to a versioned artifact, which is difficult to achieve when you are deploying from a folder

  • Orchestration: Data package creation and installation can be orchestrated by sfp, which means less scripting

    Defining a Data package

    Simply add an entry in the package directories, providing the package's name, path, version number and type (data). Your editor may complain that the 'type' property is not allowed, but this can be safely ignored.

    Generating the contents of the data package

    Export your Salesforce records to csv files using the SFDMU plugin. For more information on plugin installation, creating an export.json file, and exporting to csv files, refer to Plugin Basic > Basic Usage in SFDMU's documentation.
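
    For example, a minimal sketch of exporting to csv files, assuming the SFDMU plugin is installed and an export.json is present in the current directory:

    # Export records from a source org into csv files
    # (SFDMU's special 'csvfile' target writes csv files instead of loading an org)
    sf sfdmu run --sourceusername my-source-org --targetusername csvfile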

    Options with Data Packages

    Data packages support the following options, through the sfdx-project.json.

    Adding Pre/Post Deployment Scripts to Data Packages

    Refer to this link for more details on how to add a pre/post script to data package

    Defining a vlocity Data Package

sfp supports vlocity RBC migration using the vlocity build tool (vbt). sfp automatically detects whether a data package needs to be deployed using vlocity or sfdmu. (Please note: to enable vlocity when preparing scratch orgs, the enableVlocity flag needs to be turned on in the pool configuration file.)

A vlocity data package needs to have a vlocityComponents.yaml file in the root of the package directory, with the following definition:

    The same package would be defined in the sfdx-project.json as follows

    Files to be maintained to enable field history tracking deployment before the Jan 23 Release
    Sample history-tracking.yml
    Files to be maintained to enable field history tracking deployment after the Jan 23 Release. (Needs to be kept in 'postDeploy' folder)

    Release Config

Release configuration is a fundamental setup that outlines the organisation of packages within a project, streamlining the different lifecycle stages of your project, such as validating, building, and deploying/releasing artifacts. In flxbl projects, a release config is used to define the concept of a domain/subdomain. This configuration is instrumental when using sfp commands, as it allows selective operations on the packages defined by a configuration. By employing a release configuration, teams can efficiently manage a mono repository of packages across various teams.

    The table below lists the options that are currently available for release configuration.

| Parameter | Required | Type | Description |
| --- | --- | --- | --- |
| releaseName | No | String | Name of the release config; in a flxbl project, this name is used as the name of the domain |
| pool | No | String | Name of the scratch org or sandbox pool associated with this release config during validation |
| excludeArtifacts | No | Array | An array of artifacts that need to be excluded while creating the release definition |
| includeOnlyArtifacts | No | Array | An array of artifacts that should only be included while creating the release definition |
| dependencyOn | No | Array | An array of packages that denotes the dependency this configuration has. The dependencies mentioned will be used for synchronization in review sandboxes |
| excludePackageDependencies | No | Array | Exclude the mentioned package dependencies from the release definition |
| includeOnlyPackageDependencies | No | Array | Include only the mentioned package dependencies in the release definition |
| releasedefinitionProperties | No | Object | Properties of the release definition that should be added to the generated release definition. See below |

    A release configuration can also contain additional options that are used by certain sfp commands to generate release definitions. These properties in a release definition alter the behaviour of deployment of artifacts during a release.

Release Definition Properties

| Parameter | Required | Type | Description |
| --- | --- | --- | --- |
| releasedefinitionProperties.skipIfAlreadyInstalled | No | boolean | Skip installation of an artifact if it's already installed in the target org |
| releasedefinitionProperties.baselineOrg | No | string | The org used to decide whether to skip installation of an artifact. Defaults to the target org when not provided |
| releasedefinitionProperties.promotePackagesBeforeDeploymentToOrg | No | string | Promote packages before they are installed into an org that matches the alias of the org |
| releasedefinitionProperties.changelog.repoUrl | No | Prop | The URL of the version control system to push changelog files |
| releasedefinitionProperties.changelog.workItemFilters | No | Prop | An array of regular expressions used to identify work items in your commit messages |
| releasedefinitionProperties.changelog.workitemUrl | No | Prop | The generic URL of work items, to which work item codes are appended. Allows easy redirection to user stories by clicking on the work-item link in the changelog |
| releasedefinitionProperties.changelog.limit | No | Prop | Limit the number of releases to display in the changelog markdown |
| releasedefinitionProperties.changelog.showAllArtifacts | No | Prop | Whether to show artifacts that haven't changed between releases |

Overview

|  | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | ❌ |
| From | January 25 | Not Available |

    The project analysis command helps you analyze your Salesforce project for potential issues and provides detailed reports in various formats. This command is particularly useful for identifying issues such as duplicate components, compliance violations, hardcoded IDs and URLs, and other code quality concerns.

    Usage

    Common Use Cases

    The analyze command serves several key purposes:

    1. Runs various available linters across the project

    2. Generating comprehensive analysis reports

    3. Integration with CI/CD pipelines for automated checks

    Available Flags

| Flag | Description | Required | Default |
| --- | --- | --- | --- |
| --package, -p | The name of the package to analyze | No | - |
| --domain, -d | The domain to analyze | No | - |
| --source-path, -s | The path to analyze | No | - |
| --exclude-linters | Comma-separated list of linters to exclude | No | [] |
| --fail-on | Linters that should cause command failure if issues found | No | [] |
| --show-aliasfy-notes | Show notes for aliasified packages | No | true |
| --fail-on-unclaimed | Fail when duplicates are found in unclaimed packages | No | false |
| --output-format | Output format (markdown, json, github) | No | markdown |
| --report-dir | Directory for analysis reports | No | - |
| --compliance-rules | Path to compliance rules YAML file | No | config/compliance-rules.yaml |
| --generate-compliance-config | Generate sample compliance rules configuration | No | false |

    Scoping Analysis

    The command provides three mutually exclusive ways to scope your analysis:

    1. By Package: Analyze specific packages

    2. By Domain: Analyze all packages in a domain

    3. By Source Path: Analyze a specific directory

    Output Formats

    The command supports multiple output formats:

    • Markdown: Human-readable documentation format

    • JSON: Machine-readable format for integration with other tools

    • GitHub: Special format for GitHub Checks API integration

    GitHub Integration

    When running in GitHub Actions, the command automatically:

    1. Creates GitHub Check runs for each analysis

    2. Adds annotations to the code for identified issues

    3. Provides detailed summaries in the GitHub UI

Examples

    1. Basic analysis of all packages:

    2. Analyze specific packages with JSON output:

    3. Analyze with strict validation:

    4. Generate reports in a specific directory:

    5. Generate compliance configuration:

    6. Run compliance checks with custom rules:

    Aliasfy Packages - Merge Mode

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| mergeMode | boolean | Enable deployment of contents of a folder that matches the alias of the environment using merge | source |

|  | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | ❌ |
| From | September 24 | Not Available |

mergeMode adds an additional mode for deploying aliasified packages with content inheritance. During package build, the default folder's content is merged into each subfolder that matches an org alias, and subfolders can override inherited content. This reduces metadata duplication when using aliasified packages.

    Note in the image above that the AccountNumberDefault__c field is replicated in each deployment directory that matches the alias.

    Unlike plain aliasfy mode, the default folder is deployed to any type of environment, including production.

    The table below describes the behavior of each command when merge mode is enabled/disabled.

| Merge Mode | Default available? | Build | Install | Push | Pull |
| --- | --- | --- | --- | --- | --- |
| Enabled | Yes | Default metadata is merged into each subfolder | Deploys the subfolder that matches the environment alias name. If there is no match, the default subfolder is deployed (including production) | Only default is pushed | Only default is pulled |
| Disabled | Yes | Default merging not available | Deploys the subfolder that matches the environment alias name. If there is no match, deploys default (only sandboxes) | Only default is pushed | Only default is pulled |
| Disabled | No | Default merging not available | Deploys the subfolder that matches the environment alias name. If there is no match, fails | Nothing to push | Nothing to pull |

When merge mode is enabled, push/pull commands are also supported. The default subfolder is always used during this process, so make sure it is not force-ignored.

    Before merge mode, the whole package had to be force-ignored. With merge mode, you have the option to allow push/pull from aliasfy packages by not ignoring the default subfolder.

    Non-merge mode
    Merge Mode

    Aliasfy Packages

| Attribute | Type | Description | Package Types Applicable |
| --- | --- | --- | --- |
| aliasfy | boolean | Enable deployment of contents of a folder that matches the alias of the environment | source |

    Aliasified packages are available only for Source Package

Aliasfy enables deployment of a subfolder in a source package that matches the target org. For example, consider a source package as listed below.

    During installation, only the metadata contents of the folder that matches the alias get deployed. If the alias is not found, sfp falls back to the 'default' folder. If the default folder is not found either, an error is displayed saying the default folder or alias is missing.

    The default folder is only deployed to sandboxes

    When to Use Aliasfy Packages vs String Replacements

    Both aliasfy packages and string replacements provide environment-specific deployments, but serve different purposes:

    Use Aliasfy Packages When:

    • You have structural metadata differences between environments (different fields, objects, workflows, or permissions)

    • You need completely different files per environment (e.g., different page layouts, record types)

    • The entire metadata component varies by environment

    • You're managing environment-specific declarative configurations

    Use String Replacements When:

    • Only configuration values differ between environments (endpoints, emails, feature flags)

    • You want to avoid duplicating entire files just to change a few values

    • You need to maintain single source of truth for business logic

    • You're dealing with code-based metadata (Apex, LWC, Aura) with environment-specific values

    Combine Both When:

    • Large packages need both structural differences AND configuration value changes

    • Different teams manage different aspects of environment-specific configurations

    • You're gradually migrating from full file duplication to value-based replacements

    Example Comparison

    Aliasfy Approach - Different permission sets per environment:

    String Replacements Approach - Same code, different values:

    String replacements complement aliasfy packages - use them together for complete environment management

    String Replacements During Install

|  | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | ❌ |
| From | September 2025 | Not Available |

    During artifact installation, sfp automatically applies string replacements to convert placeholder values to environment-specific values based on the target org.

    How It Works

    When installing artifacts with embedded replacement configurations:

    1. Org Detection: Identifies the target org alias and whether it's a sandbox or production org

    2. Value Resolution: Determines the appropriate replacement value based on org alias or defaults

    3. File Modification: Applies replacements to matching files before deployment

    4. Deployment: Deploys the modified components to the target org

    Environment Resolution

    Replacements are resolved in this order:

    1. Exact alias match: If the org alias matches a configured environment

    2. Default value: For sandbox/scratch orgs when no exact match is found

    3. Error: For production orgs without explicit configuration

    Installation Output

    During installation with replacements, you'll see:

    Command Line Options

    Disable Replacements

    To skip replacements during installation:

    Override Replacements

    To use a custom replacement configuration file:

    Production Deployments

    Production deployments require explicit configuration for the org alias. If no configuration is found, the installation will fail:

    Troubleshooting

    No Replacements Applied

    • Verify the artifact was built with replacements embedded

    • Check that glob patterns match your files

    • Ensure the org alias has configured values

    Wrong Values Applied

    • Verify org alias with sf org list

    • Check environment resolution order

    • Ensure production orgs have explicit configuration

Related Documentation

    • String Replacements Overview

    • String Replacements Configuration

    Different types of validation

    sfp provides validation techniques to verify changes in your Salesforce projects before merging. The validate command supports two primary modes to suit different validation needs.

    Validation Modes

| Mode | Description | Flag |
| --- | --- | --- |
| Thorough (Default) | Include package dependencies, code coverage, and all test classes during full package deployments. This is the recommended mode for comprehensive validation. | --mode=thorough |
| Individual | Ignore packages installed in the scratch org, identify the list of changed packages from the PR/Merge Request, and validate each of the changed packages (respecting any dependencies) using thorough validation rules. | --mode=individual |

    Duplicate Check

|  | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | ❌ |
| From | January 25 | Not Available |

    The duplicate check functionality helps identify duplicate metadata components across your Salesforce project. This analysis is crucial for maintaining clean code organization and preventing conflicts in your deployment process.

{
    "path": "path--to--data--package",
    "package": "name--of-the-data-package", // mandatory, when used with sfp
    "versionNumber": "X.Y.Z.0", // 0 will be replaced by the build number passed
    "type": "data" // required
  }
{
    "path": "path--to--package",
    "package": "name--of-the-package", // mandatory, when used with sfp
    "versionNumber": "X.Y.Z.[NEXT/BUILDNUMBER]",
    "type": "data", // required
    "aliasfy": <boolean>, // Only for source packages; deploys a subfolder whose name matches the alias of the org when using the deploy command
    "assignPermSetsPreDeployment": ["", ""],
    "assignPermSetsPostDeployment": ["", ""],
    "preDeploymentScript": <path>, // All packages
    "postDeploymentScript": <path> // All packages
  }
    # $1 package name
    # $2 org
    # $3 alias
    # $4 working directory
    # $5 package directory
    
    sfdx force:apex:execute -f scripts/datascript.apex -u $2
projectPath: src/vlocity-config # Path to the package directory
    expansionPath: datapacks # Path to the folder containing vlocity attributes
    autoRetryErrors: true # Additional items
    manifest:
            {
                "path": "./src/vlocity-config",
                "package": "vlocity-attributes",
                "versionNumber": "1.0.0.0",
                "type": "data"
            }
// Sample package

{
  "packageDirectories": [
    {
      "path": "src/core-crm",
      "package": "core-crm",
      "versionNumber": "2.0.10.NEXT",
      "enableFHT": true,
      "enableFT": true
    }
  ]
}
    
---
releaseName: core
    pool: core_pool
    includeOnlyArtifacts:
      - src-env-specific-pre
      - src-env-specific-alias-pre
      - core-crm
      - telephony-service
    excludePackageDependencies:
      - Genesys Cloud for Salesforce
      - Marketing Cloud
    releasedefinitionProperties:
      changelog:
        workItemFilters:
          -  BRO-[0-9]{3,4}
        workItemUrl: https://bro.atlassian.net/browse
        limit: 30





    Overview

    Duplicate checking scans your project's metadata components and identifies:

    • Components that exist in multiple packages

    • Components in aliasified packages

    • Components in unclaimed directories (not associated with a package)

    How It Works

    The duplicate checker:

    1. Scans all specified directories for metadata components

    2. Creates a unique identifier for each component based on type and name

    3. Identifies components that appear in multiple locations

    4. Provides detailed reporting with context about each duplicate

    Configuration

    Duplicate checking can be configured through several flags in the project:analyze command:

    Key Configuration Options

    Option
    Description
    Default

    --show-aliasfy-notes

    Show information about aliasified packages

    true

    --fail-on-unclaimed

    Fail when duplicates are in unclaimed packages

    false

    Understanding Results

    The duplicate check provides three types of findings:

    1. Direct Duplicates: Same component in multiple packages

    2. Aliasified Components: Components in aliasified packages (typically expected)

3. Unclaimed Components: Components in directories (such as unpackaged or src/temp) not associated with packages

    Sample Output

    Understanding Indicators

    The analysis uses several indicators to help you understand the results:

    • ❌ Indicates a problematic duplicate that needs attention

    • ⚠️ Indicates a duplicate that might be intentional (e.g., in aliasified packages)

    • (aliasified) Marks components in aliasified packages

    • (unclaimed) Marks components in unclaimed directories

    Best Practices

    1. Regular Scanning: Run duplicate checks regularly during development

    2. Clean Package Structure: Keep each component in its appropriate package

    3. Proper Package Configuration: Ensure all directories are properly claimed in sfdx-project.json

    4. Documented Exceptions: Document cases where duplication is intentional

    Common Scenarios and Solutions

    1. Legitimate Duplicates

    Some components may need to exist in multiple packages. In these cases:

    • Use aliasified packages when appropriate

    • Document the reason for duplication

    • Consider creating a shared package for common components

    2. Unclaimed Directories

    If you find components in unclaimed directories:

    1. Add the directory to sfdx-project.json

    2. Assign it to an appropriate package

    3. Re-run the analysis to verify the fix
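
    For example, a minimal sfdx-project.json entry claiming a previously unclaimed directory (the path and package name are illustrative):

    {
      "packageDirectories": [
        {
          "path": "src/temp",
          "package": "temp-holding",
          "versionNumber": "1.0.0.NEXT"
        }
      ]
    }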

    3. Aliasified Package Duplicates

    Duplicates in aliasified packages are often intentional:

    • Used for environment-specific configurations

    • Different versions of components for different contexts

    • Not considered errors by default

    Integration with CI/CD

Integration is limited to GitHub at the moment. The command needs GITHUB_APP_PRIVATE_KEY and GITHUB_APP_ID to be set as environment variables for the results to be reported as a GitHub check.
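
    For instance, a minimal GitHub Actions step sketch (the workflow layout and secret names are illustrative assumptions; only the two environment variable names come from the requirement above):

    # .github/workflows/analyze.yml (illustrative)
    - name: Analyze project
      run: sfp project:analyze --fail-on duplicates --output-format github
      env:
        GITHUB_APP_ID: ${{ secrets.GITHUB_APP_ID }}
        GITHUB_APP_PRIVATE_KEY: ${{ secrets.GITHUB_APP_PRIVATE_KEY }}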

    When integrating duplicate checking in your CI/CD pipeline:

    1. Configure Failure Conditions:

    2. Generate Reports:

    3. Review Results:

      • Use generated reports for tracking

      • Address critical duplicates

      • Document accepted duplicates

    Troubleshooting

    1. False Positives

      • Verify package configurations in sfdx-project.json

      • Check if components should be in aliasified packages

      • Ensure directories are properly claimed

    2. Missing Components

      • Verify scan scope (package, domain, or path)

      • Check file permissions

      • Validate metadata format


// Demonstrating package directory with aliasfy merge mode
{
  "packageDirectories": [
    {
      "path": "src/src-env-specific-alias-post",
      "package": "src-env-specific-alias-post",
      "versionNumber": "2.0.10.NEXT",
      "aliasfy": {
        "mergeMode": true
      }
    }
  ]
}
    # .forceIgnore (aliasfy non-merge mode)
    src-env-specific-alias-post
    # .forceIgnore (aliasfy merge mode)
    src-env-specific-alias-post/dev
    src-env-specific-alias-post/test
    src-env-specific-alias-post/prod
// Demonstrating package directory with aliasfy
{
  "packageDirectories": [
    {
      "path": "src/src-env-specific-alias-post",
      "package": "src-env-specific-alias-post",
      "versionNumber": "2.0.10.NEXT",
      "aliasfy": true
    }
  ]
}
    src-env-specific/
      default/
        permissionsets/
          StandardUser.permissionset-meta.xml  # Basic permissions
      production/
        permissionsets/
          StandardUser.permissionset-meta.xml  # Production-specific permissions
      staging/
        permissionsets/
          StandardUser.permissionset-meta.xml  # Staging-specific permissions
    // Single file with placeholders
    public class IntegrationService {
        private static final String ENDPOINT = '%%API_ENDPOINT%%';
        private static final String API_KEY = '%%API_KEY%%';
    }
    Text replacements for org alias dev (sandbox)
    Total replacement configs 3
    Modified APIService.cls with 2 replacements
    Modified ConfigService.cls with 1 replacements
    Replacements summary: 3 replacements in 2 files using dev environment
    sfp install --targetorg dev --artifactdir artifacts --no-replacements
    sfp install --targetorg dev --artifactdir artifacts --replacementsoverride custom-replacements.yml
    ERROR: No replacement value found for production org with alias 'prod'
    Production deployment requires explicit configuration for alias 'prod' in replacement 'API Endpoint'
    sfp project:analyze --compliance-rules config/compliance-rules.yaml --fail-on compliance
    sfp project:analyze [flags]
    sfp project:analyze -p core,utils
    sfp project:analyze -d sales
    sfp project:analyze -s ./force-app/main/default
    sfp project:analyze
    sfp project:analyze -p core,utils --output-format json
    sfp project:analyze --fail-on duplicates --fail-on-unclaimed
    sfp project:analyze --report-dir ./analysis-reports
    sfp project:analyze --generate-compliance-config
    sfp project:analyze --fail-on duplicates --fail-on-unclaimed --output-format github
    sfp project:analyze --report-dir ./reports --output-format markdown
    sfp project:analyze --fail-on duplicates --fail-on-unclaimed
    # Duplicate Analysis Report
    
    ## Aliasified Package Information
    - Note: Package "env-specific" is aliasified - duplicates are allowed.
    
    ## Component Analysis
    
    ### ❌ CustomObject: Account.Custom_Field__c (Duplicate Component)
    - `src/package1/objects/Account.object` 
    - `src/package2/objects/Account.object`
    
    ### ⚠️ CustomLabel: Common_Label (Aliasified Component)
    - `src/env-specific/qa/labels/Custom.labels` (aliasified)
    - `src/env-specific/prod/labels/Custom.labels` (aliasified)


    Release Config Filtering

    Both validation modes support filtering packages using a release configuration file through the --releaseconfig flag. When provided, only packages defined in the release config that have changes will be validated. This is useful for:

    • Large monorepos with multiple domains

    • Focusing validation on specific package groups

    • Reducing validation time by limiting scope

    Evolution of Validation Modes

    Why Fast Feedback Was Deprecated

    Fast Feedback mode was initially introduced to provide quicker validation by:

    • Installing only changed components instead of full packages

    • Running selective tests based on impact analysis

    • Skipping coverage calculations

    • Skipping packages with only descriptor changes

    However, this mode was deprecated and removed because:

    1. Complexity vs. Value: The mode introduced significant complexity in determining what to test versus what to synchronize, while the time savings were inconsistent.

    2. Improved Alternative Approach: The validation logic was enhanced to automatically differentiate between:

      • Packages to synchronize: Already validated packages from upstream changes (deployed but not tested)

      • Packages to validate: Packages with changes introduced by the current PR (deployed and tested)

      This automatic differentiation provides the speed benefits of fast feedback without requiring a separate mode.

    3. Better Options Available:

      • Use --ref and --baseRef flags to specify exact comparison points

      • Use --releaseconfig to limit validation scope

    Currently Deprecated Modes

    • Fast Feedback (--mode=fastfeedback) - Removed in favor of automatic synchronization logic

    • Fast Feedback Release Config (--mode=ff-release-config) - Use --releaseconfig with standard modes instead

    • Thorough Release Config (--mode=thorough-release-config) - Use --mode=thorough --releaseconfig instead

    Note: The current validation intelligently handles synchronized vs. validated packages automatically when you provide --ref and --baseRef flags, achieving faster feedback without a separate mode.

    Sequence of Activities

    The following steps are orchestrated by the validate command:

    Initial Setup

    When using pools:

    • Fetch a scratch org from the provided pools in a sequential manner

    • Authenticate to the scratch org using the auth URL fetched from the Scratch Org Info Object

    When using a provided org:

    • Authenticate to the provided target org

    Package Processing

    1. Identify packages to validate:

      • Build packages that are changed by comparing the tags in your repo against the packages installed in the target

      • If --releaseconfig is provided, filter packages based on the release configuration

    2. For each package to validate:

      Thorough Mode (Default):

      • Deploy all the built packages as source packages / data packages (unlocked packages are installed as source packages)

      • Trigger Apex Tests if there are any apex tests in the package

      • Validate test coverage of the package depending on the type:

        • Source packages: Each class needs to have 75% or more coverage

      Individual Mode:

      • Ignore packages that are installed in the scratch org (eliminates the requirement of using a pooled org)

      • Compute changed packages by observing the diff of Pull/Merge Request

      • Validate each of the changed packages individually

      • Install any dependencies required for each package

    Additional Options

    Test Execution

    By default, all apex tests are triggered in parallel with automated retry. Tests that fail due to SOQL locks or other transient issues are automatically retried synchronously. You can override this behavior:

    • --disableparalleltesting: Forces all tests to run synchronously

    • --skipTesting: Skip test execution entirely (use with caution)

    Coverage Requirements

    • Default coverage threshold: 75%

    • Configure custom threshold: --coveragepercent <value>

    • Coverage is validated per class for source packages and per package for unlocked packages
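
    Combining these options, a thorough validation with synchronous test execution and a custom coverage threshold might look like the following sketch (org aliases are illustrative):

    sfp validate org -o ci \
                     -v devhub \
                     --mode thorough \
                     --disableparalleltesting \
                     --coveragepercent 80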

    Best Practice: Use "thorough" mode for comprehensive validation before merging to ensure all packages are properly tested and deployable. For faster feedback during development, consider using "individual" mode or filtering with release configs.

    Skipping tests with --skipTesting bypasses critical quality checks. Only use this option in development environments or when you're certain the changes don't require test validation.


    Unlocked Packages

    Unlocked/Org Dependent Unlocked Packages

There is a huge amount of documentation on unlocked packages. Here is a list of curated links that can help you get started on learning more about unlocked packages

    The Basics

    • Package Development Model

    • Unlocked Package for Customers

    • Successfully Creating Unlocked Package

    Advanced Materials

The following sections deal with the more operational aspects of working with unlocked packages

    Unlocked Package and Test Coverage

Unlocked packages, excluding org-dependent unlocked packages, have mandatory test coverage requirements: each package must have a minimum of 75% coverage. A validated build validates the coverage of a package during the build phase. To surface this feedback earlier in the process, sfp provides functionality to validate the test coverage of a package, such as during the Pull Request validation process.

    Managing Version Numbers of Unlocked Package

For unlocked packages, we ask users to follow a semantic versioning scheme for packages.

Please note Salesforce packages do not support the concept of PreRelease/BuildMetadata; the last segment of a version number is a build number. We recommend utilizing the auto-increment functionality provided by Salesforce rather than rolling out your own build number substitution (use 'NEXT' for the build number when describing the version of the package, and 'LATEST' for the build number where the package is used as a dependency).

Note that an unlocked package must be promoted before it can be installed in a production org, and either the major, minor or patch (not build) version must be higher than the last version of this package that was promoted. These version number changes should be made in the sfdx-project.json file before the final package build and promotion.

    Deprecating Components from an Unlocked Package

Unlocked packages provide traceability in the org by locking down metadata components to the package that introduces them. This feature, which is the main benefit of unlocked packages, can also create issues when you want to refactor components from one package to another. Let's look at some scenarios and the common strategies that need to be applied.

    For a project that has two packages.

    • Package A and Package B

    • Package B is dependent on Package A.

    Scenario 1:

    • Remove a component from Package A, provided the component has no dependency

    • Solution: Create a new version of Package A with the metadata component being removed and install the package.

    Scenario 2:

    • Move a metadata component from Package A to Package B

• Solution: This scenario is pretty straightforward; one can remove the metadata component from Package A and move it to Package B. When a new version of Package A gets installed, the following happens:

      • If the deployment of the unlocked package is set to mixed mode, and no other metadata component is dependent on the component, the component gets deleted.

    Scenario 3:

    • Move a metadata component from Package B to Package A, where the component currently has other dependencies in Package B

• Solution: In this scenario, one can move the component to Package A and get the packages built. However, during deployment to an org, Package A will fail with an error that this component exists in Package B. To mitigate this, one should do the following:

      • Deploy a version of Package B which removes the lock on the metadata component using deprecate mode. Sometimes this needs extensive refactoring of other components to break the dependencies, so evaluate whether the approach will work.

    Managing Package Dependencies

Package dependencies are defined in the sfdx-project.json. More information on defining package dependencies can be found in the Salesforce documentation.

    Let's unpack the concepts utilizing the above example:

    • There are two unlocked packages

      • Expense Manager - Util is an unlocked package in your DevHub, identifiable by 0H in the packageAlias

• Expense Manager - another unlocked package which is dependent on 'Expense Manager - Util', 'TriggerFramework' and 'External Apex Library - 1.0.0.4'

    • External Apex Library is an external dependency, it could be a managed package or any unlocked package released on a different Dev Hub. All external package dependencies must be defined with a 04t ID, which can be determined from the installation URL from AppExchange or by contacting your vendor.

    Build Options with Unlocked Packages

Unlocked packages have two build modes: one that skips validation of dependencies and one that does not. A package built with validation skipped cannot be promoted and deployed to production, while a fully validated build can take a long time. sfp tries to build packages in parallel by understanding your dependencies; however, some of your packages could still spend a significant time in validation.

    During these situations, we ask you to consider whether the time taken to build all validated packages on an average is within your build budget, If not, here are your options

• Use org-dependent unlocked packages: Org-dependent unlocked packages are a variant of unlocked packages. Org-dependent packages do not validate the dependencies of a package and will be faster. However, please note that in all the orgs where the earlier unlocked package was installed, the package has to be deprecated and the component locks removed before the new org-dependent unlocked package is installed.

• Use source packages: Use this as the last resort; source packages have a fairly loose lifecycle management.
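
    In sfp, the skip-validation mode corresponds to the quickbuild stage (note the separate quickbuild ignore file in the plugins configuration shown earlier); a minimal sketch, assuming Dev Hub authentication is already configured:

    # Build packages without validating dependencies (faster; such versions cannot be promoted)
    sfp quickbuild

    # Full validated build
    sfp build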

    Handling metadata that is not supported by unlocked packages

Create a source package and move the metadata and any associated dependencies over to that particular package.

    Expand Dependencies

|  | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | ✅ |

    The sfp dependency:expand command enriches your project's dependency declarations by automatically adding all transitive dependencies to your sfdx-project.json. This ensures that each package explicitly declares all its dependencies, both direct and indirect, making the dependency graph complete and explicit.

    What It Does

    When you run sfp dependency:expand, it:

    1. Analyzes all packages in your project to identify their direct dependencies

    2. Resolves transitive dependencies by recursively finding dependencies of dependencies

    3. Updates each package to include both direct and transitive dependencies

    4. Maintains proper ordering ensuring dependencies are listed in topological order

    Why Use Expand

    Expanding dependencies is useful for:

    • Explicit dependency management: Makes all dependencies visible in the project configuration

    • Build optimization: Helps build tools understand the complete dependency graph

    • Troubleshooting: Easier to identify dependency-related issues when all dependencies are explicit

    • CI/CD pipelines: Ensures all required dependencies are known upfront

    Usage

    Flags

    Flag | Description | Required
    -o, --overwrite | Overwrites the existing sfdx-project.json file with the expanded configuration | No
    -v, --targetdevhubusername | Username or alias of the target Dev Hub org | Yes
    --loglevel | Logging level (trace, debug, info, warn, error, fatal) | No

    Behavior

    • Without --overwrite, creates a new file at ./project-config/sfdx-project.exp.json

    • With --overwrite, backs up the existing sfdx-project.json to ./project-config/sfdx-project.json.bak and overwrites the original

    • Preserves external dependencies defined in your project configuration

    • Adds all transitive dependencies to each package's dependency list

    • Maintains topological ordering of dependencies

    • Handles version conflicts by selecting the highest version required
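    To preview what expansion will change, you can run without --overwrite and diff the generated file first; a quick sketch:

    # Generate the expanded configuration without touching sfdx-project.json
    sfp dependency:expand -v my-devhub

    # Compare the expanded file against the current configuration
    diff sfdx-project.json project-config/sfdx-project.exp.json

    # Apply in place once satisfied; the original is backed up automatically
    sfp dependency:expand -v my-devhub --overwrite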

    Example: Before and After

    Before Expansion

    After Expansion

    Relationship with Shrink

    The expand and shrink commands are complementary:

    • Expand: Adds all transitive dependencies, making them explicit

    • Shrink: Removes redundant transitive dependencies, keeping only direct ones

    Typical workflow:

    1. Use expand during development to understand full dependency graphs

    2. Use shrink before committing to maintain clean, minimal dependency declarations
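    For example, a typical loop could look like the following sketch (the shrink flags are assumed to mirror expand; check sfp dependency:shrink --help for your version):

    # Expand to inspect the full dependency graph during development
    sfp dependency:expand -v my-devhub

    # ...build and test against the expanded configuration...

    # Shrink back to direct dependencies before committing (--overwrite assumed)
    sfp dependency:shrink --overwrite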

    Version Conflict Resolution

    When multiple packages require different versions of the same dependency, expand automatically selects the highest version that satisfies all requirements. For example:

    • Package A requires base-package@1.0.0

    • Package B requires base-package@1.2.0

    • Result: Both will get base-package@1.2.0

    External Dependencies

    External dependencies (managed packages from other Dev Hubs) defined in your externalDependencyMap are:

    • Automatically included in the expansion process

    • Preserved with their version specifications

    • Added to packages that transitively depend on them
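    As an illustration, an entry typically maps an external package to the dependencies that sfp cannot introspect on its own. The exact shape below is an assumption based on this section; verify it against the sfp documentation for your version:

    // Hypothetical plugins section of sfdx-project.json (shape assumed)
    {
      "plugins": {
        "sfp": {
          "externalDependencyMap": {
            "External Apex Library - 1.0.0.4": [
              { "package": "TriggerFramework", "versionNumber": "1.7.0.LATEST" }
            ]
          }
        }
      }
    }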

    Notes

    • The command validates all dependencies exist in the project

    • Circular dependencies are detected and reported as errors

    • External dependencies must be properly configured in the plugins.sfp.externalDependencyMap section

    • The expanded configuration can become large for projects with many packages

    See Also

    • Shrink Dependencies - Remove redundant transitive dependencies

    • Explain Dependencies - Understand package dependencies

    Navigating Salesforce Deployment Strategies: Artifact vs. Delta Deployments (Medium)

    Automated Image Synchronization to Your Registry

    This guide helps organizations set up automated synchronization of sfp pro images from Flxbl's registry to their own container registry, with optional customization capabilities.

    Why Synchronize to Your Registry?

    While you can pull directly from source.flxbl.io, maintaining your own synchronized copy provides:

    • Centralized version control across all teams

    CI/CD Integration

    The project analysis command integrates seamlessly with various CI/CD platforms to provide automated code quality checks and visual feedback through GitHub Checks.

    Automatic Detection

    GitHub Actions (Default)

    Source Packages

    Availability: sfp-pro ✅ | sfp (community) ✅

    Source Packages is an sfp feature that provides a flexible alternative to native unlocked packages for metadata deployment and organization.

    Validation Scripts

    Availability: sfp-pro ✅ (from Aug 25 - 02) | sfp (community) ✅ (from December 25)

    Validation scripts allow you to execute custom logic at specific points during the validation process. These global-level scripts provide hooks for setup, cleanup, reporting, and integration with external systems during validation workflows.

    Push Changes to your org

    Availability: sfp-pro ✅ (from August 24) | sfp (community) ❌

    The sfp project:push command deploys source from your local project to a specified Salesforce org. It can push changes based on a package, domain, or specific source path. This command is useful for deploying local changes to your Salesforce org.

    # Validate with release config filtering
    sfp validate org --targetorg myorg --mode thorough --releaseconfig config/release.yml

  • Use --skipTesting when tests aren't needed

  • Use individual mode for isolated package validation

  • Unlocked packages: Package as a whole needs to have 75% or more coverage

    Apply thorough validation rules (deployment, testing, coverage)


  • Salesforce Developer Guide to Unlocked Package
    Unlocked Package FAQ
    Anti Patterns in Package Dependency Design



  • Reduced external dependencies during CI/CD runs

  • Ability to add organization-specific customizations

  • Improved pull performance from your own registry

  • Compliance with internal security policies

    Setting Up Automated Synchronization

    Step 1: Create a Dedicated Repository

    Create a GitHub repository in your organization specifically for Docker image management (e.g., docker-images or sfp-docker).

    Step 2: Configure Repository Secrets

    Add the following secrets to your repository (Settings → Secrets and variables → Actions):

    Secret Name | Description | Value
    GITEA_USER | Your Gitea username | From your welcome email
    GITEA_PAT | Personal Access Token for Gitea | Generate at source.flxbl.io (Settings → Applications → Personal Access Tokens) with read:package permission
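    Before wiring these into the workflow, you can sanity-check the credentials locally:

    # Verify the PAT works against source.flxbl.io before saving it as a secret
    echo "$GITEA_PAT" | docker login source.flxbl.io -u "$GITEA_USER" --password-stdin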

    Step 3: Create Synchronization Workflow

    Create .github/workflows/sync-sfp-pro.yml in your repository:

    Creating Custom Images

    If you need to add organization-specific tools or configurations, create a Dockerfile:

    For base sfp-pro-lite (without SF CLI):

    For sfp-pro with SF CLI:

    Then modify the workflow to build and push your custom image:

    Using Synchronized Images in Your Pipelines

    Update your project workflows to use images from your registry:

    GitHub Actions:

    GitLab CI:

    Azure DevOps:

    Verification

    After running the workflow, verify the synchronization:

    Troubleshooting

    Authentication Issues

    If you encounter authentication errors:

    1. Verify your PAT has read:package permission

    2. Check that secrets are correctly set in repository settings

    3. Ensure your Gitea username is correct

    Image Not Found

    If the source image cannot be pulled:

    1. Check the version exists at https://source.flxbl.io/flxbl/-/packages/container/sfp-pro/

    2. Verify your network can reach source.flxbl.io

    3. Confirm your credentials are valid

    Push Failures to GitHub Container Registry

    1. Ensure the workflow has packages: write permission

    2. Verify the repository name in IMAGE_PREFIX is correct

    3. Check GitHub Packages settings for your repository

    When running in GitHub Actions, everything works automatically because GitHub Actions provides built-in access to GitHub App tokens:

    Note: The GITHUB_TOKEN provided by GitHub Actions has the necessary permissions to create checks. This is why it works automatically in GitHub Actions but requires special setup in other CI platforms (see below).

    The command automatically:

    • ✅ Detects it's running in a PR context

    • ✅ Fetches changed files from the PR

    • ✅ Creates GitHub Checks with results

    • ✅ Adds annotations to files with issues

    Manual Configuration (Other CI Platforms)

    If you're using a different CI platform but want to push checks to GitHub, you need to manually provide environment variables.

    Required Environment Variables

    For PR Context Detection

    Variable | Description | Example
    GITHUB_ACTIONS | Set to "true" to enable GitHub mode | true
    GITHUB_EVENT_NAME | Event type that triggered the workflow | pull_request
    GITHUB_REPOSITORY | Repository in format owner/repo | owner/repo
    GITHUB_EVENT_PATH | Path to file containing PR event data | /tmp/pr-event.json

    For Check Creation

    Variable | Description | Example
    GITHUB_SHA | Commit SHA to attach check to | abc123...
    GITHUB_TOKEN | GitHub token with checks:write permission | ghs_xxxxx
    GITHUB_RUN_ID | Optional: Run ID for details URL | 12345

    For Git Diff Operations

    Variable/Flag | Description | Example
    --base-ref | Base commit/branch for comparison | main or abc123...
    --head-ref | Head commit/branch for comparison | HEAD or def456...

    Event Data Format

    The GITHUB_EVENT_PATH should point to a JSON file with this structure:

    Authentication

    IMPORTANT: Creating GitHub Checks requires a GitHub App installation token, not a regular GITHUB_TOKEN.

    Using sfp server (Recommended)

    If you have sfp server installed, use it to generate installation tokens:

    The token from sfp server has the required GitHub App permissions:

    • checks:write - Create/update checks

    • contents:read - Read repository contents

    • pull_requests:read - Read PR information

    Without sfp server

    If you don't have sfp server, you need to:

    1. Create your own GitHub App with the permissions listed above

    2. Install it on your repository/organization

    3. Generate an installation token using your own tooling

    4. Set GITHUB_TOKEN to that installation token

    Note: Regular GitHub Actions GITHUB_TOKEN or Personal Access Tokens will NOT work for creating checks.
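    If you generate the token with your own tooling, the relevant GitHub REST call is the installation access-token endpoint. A rough sketch, assuming you already mint the App JWT from your App's private key and know the installation ID:

    # APP_JWT and INSTALLATION_ID are supplied by your own tooling
    GITHUB_TOKEN=$(curl -s -X POST \
      -H "Authorization: Bearer $APP_JWT" \
      -H "Accept: application/vnd.github+json" \
      "https://api.github.com/app/installations/$INSTALLATION_ID/access_tokens" \
      | jq -r '.token')
    export GITHUB_TOKEN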

    Testing Your Configuration

    Test your configuration locally:

    Expected output:

    Troubleshooting

    No PR Context Detected

    Solution: Verify GITHUB_ACTIONS=true and GITHUB_EVENT_NAME=pull_request are set.

    Missing GitHub Context

    Solution: Ensure GITHUB_REPOSITORY, GITHUB_SHA, and GITHUB_EVENT_PATH are set.

    Authentication Failed

    Solution: Set GITHUB_TOKEN environment variable with a valid token.

    Wrong Line Counts

    Solution: Provide correct --base-ref and --head-ref flags. In PR contexts, use the actual base/head SHAs, not just HEAD.

    Understanding Source Packages

    Source Packages are metadata deployments from a Salesforce perspective - they are groups of components that are deployed to an org as a unit. Unlike Unlocked packages which are First Class Salesforce deployment constructs with lifecycle governance (versioning, component locking, automated dependency validation), source packages provide more flexibility at the cost of some safety guarantees.

    When to Use Source Packages

    While we generally recommend using unlocked packages over source packages for production code, source packages excel in several scenarios:

    Ideal Use Cases

    • Application Configuration: Configuring applications delivered by managed packages (e.g., changes to help text, field descriptions)

    • Org-Specific Metadata: Global or org-specific components (queues, profiles, permission sets, custom settings)

    • Environment-Specific Configuration: Components that vary significantly across environments

    • Composite UI Layouts: Complex UI configurations that don't package well

    • Rapid Development: When iteration speed is more important than package versioning

    • Large Monolithic Applications: When breaking into unlocked packages is not feasible

    • Starting Package Development: Teams beginning their journey to package-based development

    Technical Constraints Favoring Source Packages

    • Metadata not supported by Unlocked Packages

    • Known bugs or limitations with unlocked package deployment

    • Unlocked Package validation taking too long (consider org-dependent as alternative)

    • Need for destructive changes support

    • Requirement for environment-specific text replacements

    Source Packages vs Unlocked Packages

    Advantages of Source Packages

    • Deployment Speed: Significantly faster deployment times (no package creation/validation overhead)

    • Testing Control: Flexible test execution - can skip tests in sandboxes for faster iterations (see Testing Behavior)

    • Environment Management: Support for aliasified folders and text replacements for environment-specific configurations

    • Destructive Changes: Full support for pre and post-destructive changes

    • Metadata Coverage: Supports all metadata types, including those not supported by unlocked packages

    • No Version Constraints: No need to manage package versions or dependencies at the platform level

    • Quick Fixes: Can deploy hotfixes immediately without package creation

    • Development Flexibility: Easier to refactor and reorganize code

    Disadvantages of Source Packages

    • No Component Locking: Salesforce org is unaware of package boundaries - components can be overwritten by other packages

    • No Platform Validation: Dependencies are not validated by Salesforce

    • No Automated Rollback: Cannot easily rollback to previous versions

    • Manual Destructive Changes: Must be explicitly managed through pre/post-destructive folders

    • No Package Lifecycle: No built-in versioning, upgrade, or deprecation paths

    • Dependency Risks: Dependencies only validated at deployment time

    • Production Risk: Higher risk of deployment failures in production

    Common Override Scenarios

    Metadata components that commonly get overridden across packages:

    • Custom Labels

    • Profiles and Permission Sets

    • Custom Settings

    • Global Value Sets

    Advanced Features (sfp-pro)

    Environment-Specific Configuration

    Source packages in sfp-pro support two powerful mechanisms for managing environment-specific differences:

    1. Aliasified Packages

    Deploy different metadata variants based on org aliases:

    See Aliasified Packages for details.

    2. Text Replacements

    Replace configuration values during deployment without file duplication:

    See String Replacements for details.

    Destructive Changes Support

    Source packages fully support destructive changes through dedicated folders:

    • pre-destructive/: Components deleted before deployment

    • post-destructive/: Components deleted after deployment

    Destructive changes are automatically detected and processed during package installation.
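    For instance, marking a component for post-deployment deletion is a matter of placing its metadata in the corresponding folder (paths and names below are illustrative):

    # Stage an obsolete class for deletion after the main deployment
    mkdir -p my-package/post-destructive/classes
    git mv my-package/main/default/classes/ObsoleteHelper.cls my-package/post-destructive/classes/
    git mv my-package/main/default/classes/ObsoleteHelper.cls-meta.xml my-package/post-destructive/classes/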

    Apex Testing Behaviour

    Source packages have the following test execution control:

    • Development/Sandbox: Tests skipped by default for faster iterations

    • Production: Tests always run & coverage validated (Salesforce requirement)

    • Override Options:

      • sfp install --runtests: Force test execution while installing a package to a sandbox

      • Package-level skipTesting in sfdx-project.json (ignored in production, where tests are always executed and each individual class needs a coverage of 75% or more)

    This provides significant performance improvements during development while maintaining production safety.
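    For example, to opt back into tests for a sandbox install (the target-org flag shown here is an assumption; check sfp install --help for your version):

    sfp install -o my-sandbox --runtests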

    Dependency Management

    Source packages can depend on other unlocked packages or managed packages. Dependencies are validated at deployment time, meaning the dependent metadata must already exist in the target org.

    For development in scratch orgs, you can add dependencies to enable automatic installation:

    sfp commands like prepare and validate will automatically install dependencies before deploying the source package.

    Best Practices

    1. Use unlocked packages for shared libraries that need version control and component protection

    2. Use source packages for environment-specific configuration and org-specific metadata

    3. Organize large source packages into logical domains for better maintainability

    4. Leverage aliasified packages for structural differences between environments

    5. Use text replacements for configuration values that change across environments

    6. Document destructive changes clearly in your release notes

    7. Test in lower environments before production deployment

    8. Consider org-dependent unlocked packages as a middle ground when validation time is a concern

    Migration Paths

    From Unpackaged Metadata

    1. Identify logical groupings of metadata

    2. Create source package entries in sfdx-project.json

    3. Move metadata into package directories

    4. Define dependencies between packages

    5. Test deployment order

    To Unlocked Packages

    1. Start with source packages to organize metadata

    2. Identify stable components suitable for packaging

    3. Gradually convert source packages to unlocked packages

    4. Keep environment-specific components as source packages


    {
      "packageDirectories": [
        {
          "path": "util",
          "default": true,
          "package": "Expense-Manager-Util",
          "versionName": "Winter ‘20",
          "versionDescription": "Welcome to Winter 2020 Release of Expense Manager Util Package",
          "versionNumber": "4.7.0.NEXT"
        },
        {
          "path": "exp-core",
          "default": false,
          "package": "ExpenseManager",
          "versionName": "v 3.2",
          "versionDescription": "Winter 2020 Release",
          "versionNumber": "3.2.0.NEXT",
          "dependencies": [
            {
              "package": "ExpenseManager-Util",
              "versionNumber": "4.7.0.LATEST"
            },
              {
              "package": "TriggerFramework",
              "versionNumber": "1.7.0.LATEST"
            },
            {
              "package": "External Apex Library - 1.0.0.4"
            }
          ]
        }
      ],
      "sourceApiVersion": "47.0",
      "packageAliases": {
        "TriggerFramework": "0HoB00000004RFpLAM",
        "Expense Manager - Util": "0HoB00000004CFpKAM",
        "External Apex [email protected]": "04tB0000000IB1EIAW",
        "Expense Manager": "0HoB00000004CFuKAM"
      }
    }
    # Create an expanded version of the project configuration
    sfp dependency:expand
    
    # Overwrite the existing sfdx-project.json with expanded dependencies
    sfp dependency:expand --overwrite
    {
      "packageDirectories": [
        {
          "path": "packages/feature-a",
          "package": "feature-a",
          "dependencies": [
            { "package": "core-package" }
          ]
        },
        {
          "path": "packages/feature-b",
          "package": "feature-b",
          "dependencies": [
            { "package": "feature-a" }
          ]
        },
        {
          "path": "packages/core-package",
          "package": "core-package",
          "dependencies": [
            { "package": "base-package" }
          ]
        },
        {
          "path": "packages/base-package",
          "package": "base-package",
          "dependencies": []
        }
      ]
    }
    {
      "packageDirectories": [
        {
          "path": "packages/feature-a",
          "package": "feature-a",
          "dependencies": [
            { "package": "base-package" },     // Transitive dependency added
            { "package": "core-package" }       // Direct dependency
          ]
        },
        {
          "path": "packages/feature-b",
          "package": "feature-b",
          "dependencies": [
            { "package": "base-package" },     // Transitive dependency added
            { "package": "core-package" },     // Transitive dependency added
            { "package": "feature-a" }         // Direct dependency
          ]
        },
        {
          "path": "packages/core-package",
          "package": "core-package",
          "dependencies": [
            { "package": "base-package" }      // Direct dependency
          ]
        },
        {
          "path": "packages/base-package",
          "package": "base-package",
          "dependencies": []
        }
      ]
    }
    name: Sync SFP Pro Images
    
    on:
      workflow_dispatch:
        inputs:
          sfp_version:
            description: 'SFP Pro version (leave empty for latest)'
            required: false
            type: string
          include_sf_cli:
            description: 'Also sync SF CLI variant'
            required: false
            type: boolean
            default: true
    
    env:
      REGISTRY: ghcr.io
      IMAGE_PREFIX: ${{ github.repository }}
    
    jobs:
      sync-images:
        runs-on: ubuntu-latest
        permissions:
          contents: read
          packages: write
        
        steps:
          - name: Checkout repository
            uses: actions/checkout@v4
            
          - name: Login to source.flxbl.io
            run: |
              echo "${{ secrets.GITEA_PAT }}" | docker login source.flxbl.io \
                -u ${{ secrets.GITEA_USER }} \
                --password-stdin
    
          - name: Login to GitHub Container Registry
            run: |
              echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io \
                -u ${{ github.actor }} \
                --password-stdin
    
          - name: Determine version
            id: version
            run: |
              if [ -n "${{ github.event.inputs.sfp_version }}" ]; then
                echo "version=${{ github.event.inputs.sfp_version }}" >> $GITHUB_OUTPUT
              else
                # Fetch latest version from your version strategy
                echo "version=latest" >> $GITHUB_OUTPUT
              fi
    
          - name: Sync base SFP-Pro Lite image
            run: |
              SOURCE_IMAGE="source.flxbl.io/flxbl/sfp-pro-lite:${{ steps.version.outputs.version }}"
              TARGET_IMAGE="${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}/sfp-pro-lite"
              
              docker pull ${SOURCE_IMAGE}
              docker tag ${SOURCE_IMAGE} ${TARGET_IMAGE}:${{ steps.version.outputs.version }}
              docker tag ${SOURCE_IMAGE} ${TARGET_IMAGE}:latest
              
              docker push ${TARGET_IMAGE}:${{ steps.version.outputs.version }}
              docker push ${TARGET_IMAGE}:latest
    
          - name: Sync SFP-Pro with SF CLI image
        if: github.event.inputs.include_sf_cli == 'true'
            run: |
              SOURCE_IMAGE="source.flxbl.io/flxbl/sfp-pro:${{ steps.version.outputs.version }}"
              TARGET_IMAGE="${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}/sfp-pro"
              
              docker pull ${SOURCE_IMAGE}
              docker tag ${SOURCE_IMAGE} ${TARGET_IMAGE}:${{ steps.version.outputs.version }}
              docker tag ${SOURCE_IMAGE} ${TARGET_IMAGE}:latest
              
              docker push ${TARGET_IMAGE}:${{ steps.version.outputs.version }}
              docker push ${TARGET_IMAGE}:latest
    ARG BASE_VERSION=latest
    FROM source.flxbl.io/flxbl/sfp-pro-lite:${BASE_VERSION}
    
    # Add your customizations
    RUN apt-get update && apt-get install -y \
        jq \
        your-custom-tools \
        && rm -rf /var/lib/apt/lists/*
    
    # Copy custom scripts or configurations
    # COPY scripts/ /usr/local/bin/
    # COPY config/ /etc/your-app/
    ARG BASE_VERSION=latest
    FROM source.flxbl.io/flxbl/sfp-pro:${BASE_VERSION}
    
    # Your customizations here
          - name: Build and push custom image
            run: |
              docker build \
                --build-arg BASE_VERSION=${{ steps.version.outputs.version }} \
                -f Dockerfile \
                -t ${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}/sfp-pro-custom:${{ steps.version.outputs.version }} \
                -t ${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}/sfp-pro-custom:latest \
                .
              
              docker push ${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}/sfp-pro-custom:${{ steps.version.outputs.version }}
              docker push ${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}/sfp-pro-custom:latest
    jobs:
      build:
        runs-on: ubuntu-latest
        container:
          image: ghcr.io/your-org/docker-images/sfp-pro:latest
          credentials:
            username: ${{ github.actor }}
            password: ${{ secrets.GITHUB_TOKEN }}
    image: ghcr.io/your-org/docker-images/sfp-pro:latest
    
    before_script:
      - echo "$CI_REGISTRY_PASSWORD" | docker login ghcr.io -u "$CI_REGISTRY_USER" --password-stdin
    resources:
      containers:
      - container: sfp
        image: ghcr.io/your-org/docker-images/sfp-pro:latest
        endpoint: your-service-connection
    
    jobs:
    - job: Build
      container: sfp
    # Browse available images via your registry's packages UI
    # (ghcr.io does not support `docker search`)
    
    # Pull and test the synchronized image
    docker pull ghcr.io/your-org/docker-images/sfp-pro:latest
    docker run --rm ghcr.io/your-org/docker-images/sfp-pro:latest sfp --version
    # .github/workflows/pr-analysis.yml
    name: PR Analysis
    
    on: pull_request
    
    jobs:
      analyze:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v3
    
          - name: Run Project Analysis
            run: sfp project:analyze
            env:
              GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
    {
      "number": 123,
      "pull_request": {
        "number": 123,
        "base": {
          "sha": "base-commit-sha"
        },
        "head": {
          "sha": "head-commit-sha"
        }
      }
    }
    # Get installation token for your repository
    sfp server repository auth-token \
      --repository owner/repo \
      --sfp-server-url https://your-server-url \
      --email user@example.com \
      --json
    
    # Use the returned token
    export GITHUB_TOKEN=<token-from-above>
    # Step 1: Get GitHub App installation token from sfp server
    GITHUB_TOKEN=$(sfp server repository auth-token \
      --repository owner/repo \
      --sfp-server-url https://your-server-url \
      --email user@example.com \
      --json | jq -r '.token')
    
    # Step 2: Set environment variables
    export GITHUB_TOKEN
    export GITHUB_ACTIONS=true
    export GITHUB_EVENT_NAME=pull_request
    export GITHUB_REPOSITORY=owner/repo
    export GITHUB_SHA=your-commit-sha
    export GITHUB_EVENT_PATH=/tmp/pr-event.json
    
    # Step 3: Create event file
    cat > /tmp/pr-event.json <<EOF
    {
      "number": 123,
      "pull_request": {
        "number": 123,
        "base": { "sha": "base-sha" },
        "head": { "sha": "head-sha" }
      }
    }
    EOF
    
    # Step 4: Run analysis
    sfp project:analyze \
        --base-ref base-sha \
        --head-ref head-sha
    ✅ Detected PR context #123
    ✅ Retrieved X changed files from PR #123
    ✅ Creating check for [linter]...
    ✅ Successfully created GitHub check: https://github.com/owner/repo/pull/123/checks
    Skipping check creation - not supported in this environment
    Cannot create GitHub check: Missing GitHub context information
    Failed to get auth token: Neither App credentials nor personal access token found
    Changes: +0 -0 lines
    src-env-specific/
    └── main/
        ├── default/        # Fallback for sandboxes
        │   └── classes/
        ├── dev/           # Dev-specific metadata
        │   └── classes/
        └── prod/          # Production metadata
            └── classes/
    # preDeploy/replacements.yml
    replacements:
      - name: "API Endpoint"
        glob: "**/*.cls"
        pattern: "%%API_URL%%"
        environments:
          default: "https://api.dev.example.com"
          prod: "https://api.example.com"
    my-package/
    ├── main/
    │   └── default/
    ├── pre-destructive/     # Deleted before main deployment
    │   └── objects/
    └── post-destructive/    # Deleted after main deployment
        └── classes/
    {
      "packageDirectories": [
        {
          "path": "src/my-package",
          "package": "my-package",
          "versionNumber": "1.0.0.NEXT",
          "dependencies": [
            {"package": "another-source-package"},
            {"package": "[email protected]"},
            {"subscriberPackageVersionId": "04t..."}
          ]
        }
      ]
    }
    Validation Pipeline Execution

    Configuration

    Add script paths to your sfdx-project.json file:

    Script Arguments

    Scripts receive three arguments in this order:

    1. Context File Path - Absolute path to temporary JSON file containing validation context

    2. Target Org - Username of the target organization for validation

    3. Hub Org - Username of the hub organization (empty string "" if not available)

    Context Data Structure

    Pre-Validation Context

    The context file contains information about packages ready for validation:

    Post-Validation Context

    The context file includes all pre-validation data plus validation results:

    Example Scripts

    Pre-Validation Script

    Post-Validation Script

    Error Handling & Behavior

    Script Type | Failure Behavior | Timeout | Use Cases
    Pre-validation | Halts validation process | 30 minutes | Setup test data, configure environments, validate prerequisites
    Post-validation | Logged as warning, validation continues | 30 minutes | Cleanup resources, send notifications, generate reports

    Best Practices

    • Make scripts executable: chmod +x scripts/pre-validate.sh

    • Use set -e: Exit on errors to ensure proper failure handling

    • Parse JSON safely: Use jq for reliable JSON parsing

    • Handle missing data: Check if fields exist before using them

    • Log clearly: Scripts appear in validation logs with CI/CD folding

    • Keep scripts fast: Remember the 30-minute timeout limit

    • Test locally: Validate script behavior before committing
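    A quick way to test locally is to invoke the script exactly the way sfp does, with a hand-written context file matching the structures above (values illustrative):

    # Minimal pre-validation context for a local dry run
    cat > /tmp/ctx.json <<'EOF'
    {
      "phase": "pre-validation",
      "validationMode": "thorough",
      "packages": [ { "name": "core-crm", "isChanged": true } ]
    }
    EOF
    ./scripts/pre-validate.sh /tmp/ctx.json my-scratch-org my-devhub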

    Common Use Cases

    Pre-Validation Scripts

    • Set up test data specific to validation scenarios

    • Configure external API endpoints for testing

    • Validate prerequisites (licenses, feature flags, etc.)

    • Initialize monitoring or logging for the validation process

    Post-Validation Scripts

    • Clean up test data created during validation

    • Send notifications to Slack, Teams, or other systems

    • Generate custom reports or metrics

    • Update external tracking systems with validation results

    • Archive validation artifacts or logs


    Usage

    Flags

    Flag | Description | Required
    -o, --targetusername | Username or alias of the target org | Yes
    -p, --package | Name of the package to push | No
    -d, --domain | Name of the domain to push | No
    -s, --source-path | Path to the local source files to push | No
    -i, --ignore-conflicts | Ignore conflicts during push | No
    --no-replacements | Skip text replacements during push | No
    --replacementsoverride | Path to override replacements file | No
    --json | Format output as JSON | No
    --loglevel | Logging level | No

    Flag Details

    • The -p, -d, and -s flags are mutually exclusive. Use only one to specify the scope of the push operation.

    • --ignore-conflicts: Use this flag to override conflicts and push changes to the org, potentially overwriting org metadata.

    • --no-replacements: Disables automatic text replacements. By default, sfp applies configured replacements from preDeploy/replacements.yml.

    • --replacementsoverride: Specify a custom YAML file containing replacement configurations to use instead of the default.

    • --json: When specified, the command outputs a structured JSON object with detailed information about the push operation, including replacement details.

    Source Tracking

    Source tracking is a feature that keeps track of the changes made to metadata both in your local project and in the org. When source tracking is enabled, the project:push command can more efficiently deploy only the changes made locally since the last sync, rather than deploying all metadata.

    How Source Tracking Works with project:push

    • When pushing to a source-tracked org without specifying a package, domain, or source path, the command will use source tracking to deploy only the local changes.

    • For non-source-tracked orgs or when a specific scope is provided (via -p, -d, or -s flags), the command will deploy all metadata within the specified scope.

    • Source tracking provides faster and more efficient deployment of changes, especially in large projects.

    Limitations

    • Source tracking is not available for all org types. It's primarily used with scratch orgs and some sandbox orgs.

    • If source tracking is not enabled or supported, the project:push command will fall back to deploying all metadata within the specified scope.

    Text Replacements (Pro Feature)

    Availability: String replacements are available from September 2025 in sfp-pro only.

    The push command automatically applies text replacements to convert placeholder values in your source files to environment-specific values before deployment. This feature helps manage environment-specific configurations without modifying source files.

    For detailed information about string replacements, see String Replacements.

    Quick Example

    If your source contains placeholders:

    During push to a dev org, it becomes:

    To skip replacements:

    Examples

    Push changes using source tracking (if available):

    Push changes for a specific package:

    Push changes for a specific domain:

    Push changes from a specific source path:

    Push changes and ignore conflicts:

    JSON Output

    When --json is specified, the command outputs a JSON object with the following structure:

    The replacements field (available in sfp-pro) provides detailed information about text replacements applied during the push operation.

    Error Handling

    If an error occurs during the push operation, the command will throw an error with details about what went wrong. Use the --json flag to get structured error information in the output.
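    For example, in a script you might capture the structured output even on failure and surface the error details:

    # Capture structured results, then inspect errors and conflicts
    sfp project:push -o myOrg -p myPackage --json > push-result.json || true
    jq '.errorMessage, .errors, .conflicts' push-result.json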


    Pull Changes from your org

    Availability: sfp-pro ✅ (from August 24) | sfp (community) ❌

    The sfp project:pull command retrieves source from a Salesforce org and updates your local project files. It can pull changes based on a package, domain, or specific source path. This command is useful for synchronizing your local project with the latest changes in your Salesforce org.

    Source Tracking

    Source tracking is a feature in Salesforce development that keeps track of the changes made to metadata both in your local project and in the org. When source tracking is enabled, the project:pull command can more efficiently retrieve only the changes made in the org since the last sync, rather than retrieving all metadata.

    How Source Tracking Works

    • Source tracking maintains a history of changes in both your local project and the Salesforce org.

    • It allows sfp to determine which components have been added, modified, or deleted since the last synchronization.

    • This feature is automatically enabled for scratch orgs and can be enabled for non-scratch orgs that support it.

    Source Tracking and project:pull

    • When pulling from a source-tracked org without specifying a package, domain, or source path, the command will use source tracking to retrieve only the changes made in the org.

    • For non-source-tracked orgs or when a specific scope is provided (via -p, -d, or -s flags), the command will retrieve all metadata within the specified scope.

    • Source tracking provides faster and more efficient retrieval of changes, especially in large projects.

    Limitations

    • Source tracking is not available for all org types. It's primarily used with scratch orgs and some sandbox orgs.

    • If source tracking is not enabled or supported, the project:pull command will fall back to retrieving all metadata within the specified scope.

    Usage

    Flags

    Flag | Description | Required
    -o, --targetusername | Username or alias of the target org | Yes
    -p, --package | Name of the package to pull | No
    -d, --domain | Name of the domain to pull | No
    -s, --source-path | Path to the local source files to pull | No
    -r, --retrieve-path | Path where the retrieved source should be placed | No
    -i, --ignore-conflicts | Ignore conflicts during pull | No
    --no-replacements | Skip text replacements during pull | No
    --replacementsoverride | Path to override replacements file | No
    --json | Format output as JSON | No
    --loglevel | Logging level | No

    Flag Details

    • The -p, -d, and -s flags are mutually exclusive. Use only one to specify the scope of the pull operation.

    • --ignore-conflicts: Use this flag to override conflicts and pull changes from the org, potentially overwriting local changes.

    • --retrieve-path: Specifies a custom location for the retrieved source files.

    • --no-replacements: Disables automatic text replacements. By default, sfp applies reverse replacements to convert environment-specific values back to placeholders.

    • --replacementsoverride: Specify a custom YAML file containing replacement configurations to use instead of the default.

    • --json: When specified, the command outputs a structured JSON object with detailed information about the pull operation, including replacement details and pattern suggestions.

    Examples

    Pull changes using source tracking (if available):

    Pull changes for a specific package:

    Pull changes for a specific domain:

    Pull changes from a specific source path:

    Pull changes and ignore conflicts:

    Pull changes without applying reverse replacements:

    Pull changes with custom replacement configuration:

    Text Replacements (Pro Feature)

    Availability: String replacements are available from September 2025 in sfp-pro only.

    The pull command automatically applies reverse text replacements to convert environment-specific values back to placeholders when retrieving source from the org. This feature helps maintain clean, environment-agnostic code in your repository.

    For detailed information about string replacements, see String Replacements.

    How Reverse Replacements Work

    When you pull changes from an org, sfp automatically:

    1. Detects known values: Identifies environment-specific values that match your replacement configurations

    2. Converts to placeholders: Replaces these values with their placeholder equivalents

    3. Suggests new patterns: Detects potential patterns that could be added to your replacements

    Quick Example

    If your org contains:

    After pulling, it becomes:

    Pattern Detection

    During pull operations, sfp analyzes retrieved code for patterns that might benefit from replacements:

    To skip reverse replacements:

    JSON Output

    When --json is specified, the command outputs a JSON object with the following structure:

    The replacements field (available in sfp-pro) provides detailed information about reverse text replacements applied during the pull operation, including any pattern suggestions detected.

    Error Handling

    If an error occurs during the pull operation, the command will throw an error with details about what went wrong. Use the --json flag to get structured error information in the output.

    Creating a package

    All packages start out as a directory in your repo!

    A package is a collection of metadata grouped together in a directory, and defined by an entry in your sfdx-project.json (Project Manifest).

    Each package in sfp must have the following attributes as the minimum:

    Attribute | Required | Description
    path | yes | Path to the directory that contains the contents of the package
    package | yes | The name of the package
    versionNumber | yes | The version number of the package
    versionDescription | no | Description for a particular version of the package
    {
      "plugins": {
        "sfp": {
          "validateScripts": {
            "preValidation": "./scripts/pre-validate.sh",
            "postValidation": "./scripts/post-validate.sh"
          }
        }
      }
    }
    # Example script invocation:
    ./scripts/pre-validate.sh /tmp/sfp-validate-pre-1234567890.json scratch-org@example.com devhub@example.com
    {
      "phase": "pre-validation",
      "targetOrg": "scratch-org-username",
      "hubOrg": "devhub-username",
      "validationMode": "thorough",
      "packages": [
        {
          "name": "core-crm",
          "version": "2.1.0.NEXT", 
          "type": "source",
          "isChanged": true
        }
      ]
    }
    {
      "phase": "post-validation",
      "targetOrg": "scratch-org-username",
      "hubOrg": "devhub-username",
      "validationMode": "thorough",
      "packages": [/* same as pre-validation */],
      "validationResults": {
        "status": "success",
        "deployedPackages": ["core-crm", "shared-utils"],
        "failedPackages": [],
        "error": undefined
      }
    }
    #!/bin/bash
    set -e
    
    CONTEXT_FILE="$1"
    TARGET_ORG="$2"
    HUB_ORG="$3"
    
    echo "🔧 Pre-validation setup starting..."
    
    # Parse context
    PACKAGES=$(cat "$CONTEXT_FILE" | jq -r '.packages[].name' | tr '\n' ' ')
    VALIDATION_MODE=$(cat "$CONTEXT_FILE" | jq -r '.validationMode')
    
    echo "📦 Packages to validate: $PACKAGES"
    echo "🎯 Validation mode: $VALIDATION_MODE"
    echo "🏢 Target org: $TARGET_ORG"
    
    # Custom setup logic
    if [ "$VALIDATION_MODE" = "thorough" ]; then
        echo "🔧 Setting up comprehensive validation environment..."
        # Setup test data, configure external systems, etc.
    fi
    
    # Example: Notify external systems
    curl -X POST "https://internal-api.company.com/validation/started" \
         -H "Content-Type: application/json" \
         -d "{\"packages\": \"$PACKAGES\", \"targetOrg\": \"$TARGET_ORG\"}"
    
    echo "✅ Pre-validation setup completed"
    #!/bin/bash
    CONTEXT_FILE="$1"
    TARGET_ORG="$2"
    HUB_ORG="$3"
    
    echo "🏁 Post-validation processing starting..."
    
    # Parse results
    STATUS=$(cat "$CONTEXT_FILE" | jq -r '.validationResults.status')
    DEPLOYED=$(cat "$CONTEXT_FILE" | jq -r '.validationResults.deployedPackages[]' | tr '\n' ' ')
    FAILED=$(cat "$CONTEXT_FILE" | jq -r '.validationResults.failedPackages[]' | tr '\n' ' ')
    
    echo "📊 Validation status: $STATUS"
    echo "✅ Deployed packages: $DEPLOYED"
    
    if [ "$STATUS" = "failed" ]; then
        echo "❌ Failed packages: $FAILED"
        ERROR=$(cat "$CONTEXT_FILE" | jq -r '.validationResults.error // "Unknown error"')
        echo "🔍 Error details: $ERROR"
        
        # Notify failure
        curl -X POST "https://internal-api.company.com/validation/failed" \
             -H "Content-Type: application/json" \
             -d "{\"error\": \"$ERROR\", \"failedPackages\": \"$FAILED\"}"
    else
        echo "🎉 Validation successful!"
        
        # Notify success  
        curl -X POST "https://internal-api.company.com/validation/success" \
             -H "Content-Type: application/json" \
             -d "{\"deployedPackages\": \"$DEPLOYED\"}"
    fi
    
    echo "✅ Post-validation processing completed"
    sfp project:push -o <org> [flags]
    private static final String API_URL = '%%API_ENDPOINT%%';
    private static final String API_URL = 'https://api-dev.example.com';
    sfp push -p myPackage -o myOrg --no-replacements
    sfp project:push -o myOrg
    sfp project:push -o myOrg -p myPackage
    sfp project:push -o myOrg -d myDomain
    sfp project:push -o myOrg -s force-app/main/default
    sfp project:push -o myOrg -i
    {
      "hasError": boolean,
      "errorMessage": string,
      "errors": [
        {
          "Name": string,
          "Type": string,
          "Status": string,
          "Message": string
        }
      ],
      "conflicts": [
        {
          "fullName": string,
          "type": string,
          "filePath": string,
          "state": string
        }
      ],
      "replacements": {
        "success": boolean,
        "packageName": string,
        "filesModified": [
          {
            "path": string,
            "replacements": [
              {
                "pattern": string,
                "value": string,
                "count": number
              }
            ],
            "totalCount": number
          }
        ],
        "totalFiles": number,
        "totalReplacements": number,
        "errors": [],
        "orgAlias": string
      }
    }




    sfp project:pull -o <org> [flags]
    sfp project:pull -o myOrg
    sfp project:pull -o myOrg -p myPackage
    sfp project:pull -o myOrg -d myDomain
    sfp project:pull -o myOrg -s force-app/main/default
    sfp project:pull -o myOrg -p myPackage --ignore-conflicts
    sfp project:pull -o myOrg -p myPackage --no-replacements
    sfp project:pull -o myOrg -p myPackage --replacementsoverride custom-replacements.yml
    private static final String API_URL = 'https://api-dev.example.com';
    private static final String API_URL = '%%API_ENDPOINT%%';
    ⚠️  Potential replacements detected:
    
      📄 force-app/main/default/classes/APIService.cls:
         • URL detected: 'https://new-api.example.com/v2'
           New URL pattern detected. Consider adding to replacements.yml
    
      💡 To include these values in future replacements, update your replacements.yml file.
    sfp pull -p myPackage -o myOrg --no-replacements
    {
      "hasError": boolean,
      "errorMessage": string,
      "files": [
        {
          "fullName": string,
          "type": string,
          "createdByName": string,
          "lastModifiedByName": string,
          "createdDate": string,
          "lastModifiedDate": string
        }
      ],
      "conflicts": [
        {
          "fullName": string,
          "type": string,
          "filePath": string,
          "state": string
        }
      ],
      "errors": [
        {
          "fileName": string,
          "problem": string
        }
      ],
      "replacements": {
        "success": boolean,
        "packageName": string,
        "filesModified": [
          {
            "path": string,
            "replacements": [
              {
                "pattern": string,
                "value": string,
                "count": number
              }
            ],
            "totalCount": number
          }
        ],
        "totalFiles": number,
        "totalReplacements": number,
        "errors": [],
        "orgAlias": string,
        "suggestions": [
          {
            "filePath": string,
            "suggestions": [
              {
                "type": string,
                "value": string,
                "message": string,
                "pattern": string
              }
            ]
          }
        ]
      }
    }


    sfp will not consider an entry in your sfdx-project.json for its operations if it is missing the 'package' or 'versionNumber' attribute.

    Package Types

    By default, sfp treats all entries in sfdx-project.json as Source Packages. You can create different types of packages depending on your needs:

    Package Type | sfp-pro | sfp (community) | Description
    Source Package | ✅ | Manual | Default package type for deploying metadata
    Unlocked Package | ✅ | SF CLI | Versioned, upgradeable package
    Org-Dependent Unlocked | ✅ | SF CLI | Unlocked package with org dependencies
    Data Package | ✅ | Manual | Package for data migration
    Diff Package | ✅ | Manual | Package containing only changed components

    Creating Packages with sfp-pro

    Source Package

    Create a source package using the sfp-pro CLI:

    Flags:

    • -n, --name (required): Package name

    • -r, --path (required): Directory path for the package

    • -d, --description: Package description

    • --domain: Mark package as a domain package

    • --no-insert: Don't insert into sfdx-project.json automatically

    • --insert-after: Insert after a specific package

    Unlocked Package

    Create an unlocked package with automatic DevHub registration:

    Flags:

    • -n, --name (required): Package name

    • -r, --path (required): Directory path for the package

    • -v, --targetdevhubusername: DevHub alias/username

    • --org-dependent: Create org-dependent unlocked package

    • --no-namespace: Create without namespace

    • -d, --description: Package description

    • --error-notification-username: Username for error notifications

    • --domain: Mark package as a domain package

    Data Package

    Create a data package for data migration:

    Flags:

    • -n, --name (required): Package name

    • -r, --path (required): Directory path for the package

    • -d, --description: Package description

    • --domain: Mark package as a domain package

    Ensure your data package directory contains an export.json and the required CSV files. See Data Packages for details.
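    A minimal data package directory then looks like this (file names illustrative; export.json is the SFDMU export definition):

    data/my-data-package/
    ├── export.json      # SFDMU export definition
    └── Account.csv      # record data referenced by export.json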

    Diff Package

    Create a diff package to track changes from a baseline:

    Flags:

    • -n, --name (required): Package name

    • -r, --path (required): Directory path for the package

    • -c, --commit-id: Baseline commit ID

    • -v, --targetdevhubusername: DevHub alias/username

    • -d, --description: Package description

    • --domain: Mark package as a domain package

    Creating Packages for Community Edition

    For sfp community edition users, packages need to be created manually or using Salesforce CLI.

    Source Package (Manual)

    1. Create a directory for your package

    2. Add an entry to your sfdx-project.json:

    Unlocked Package (Using SF CLI)

    1. Ensure your sfdx-project.json contains an entry for the package with path, package, and versionNumber

    2. Create the package using Salesforce CLI:

    3. Commit the updated sfdx-project.json with the new package ID in packageAliases

    Data Package (Manual)

    1. Create a directory for your data package

    2. Add the required export.json and CSV files

    3. Add an entry to sfdx-project.json with type: "data":

    Diff Package (Manual)

    1. Create a directory for your diff package

    2. Add an entry to sfdx-project.json with type: "diff":

    3. Create a record in the SfpowerscriptsArtifact2__c object in your DevHub with:

      • Package name

      • Initial version number

      • Baseline commit ID

    Best Practices

    1. Use descriptive names: Package names should clearly indicate their purpose

    2. Organize by domain: Group related packages using domains

    3. Version consistently: Use semantic versioning (MAJOR.MINOR.PATCH)

    4. Document packages: Add meaningful version descriptions

    5. Choose the right type:

      • Source packages for most metadata

      • Unlocked packages for distributed, versioned components

      • Data packages for reference data

      • Diff packages for selective deployments

    Next Steps

    • Defining a Domain - Organize packages into domains

    • Building Artifacts - Build deployable artifacts from packages

    • Package Types - Learn more about different package types



    String Replacements

    Availability: sfp-pro ✅ (from September 2025) | sfp (community) ❌

    String replacements provide a mechanism to manage environment-specific values in your Salesforce code without modifying source files. This feature automatically replaces placeholders with appropriate values during build, install, and push operations, and converts values back to placeholders during pull operations.

    String replacements complement the existing aliasfy packages feature. While aliasfy packages handle structural metadata differences by deploying different files per environment, string replacements handle configuration value differences within the same files, reducing duplication and maintenance overhead.

    How It Works

    String replacements work across multiple sfp commands:

    Build Operations: During sfp build, replacement configurations are analyzed and embedded in the artifact for later use during installation.

    Install/Deploy Operations: During sfp install or sfp deploy, placeholders are replaced with environment-specific values based on the target org:

    Push Operations: During sfp push, placeholders in your source files are replaced with environment-specific values before deployment:

    Pull Operations: During sfp pull, environment-specific values are converted back to placeholders:

    Configuration

    Replacements are configured in a replacements.yml file within each package's preDeploy directory:

    Example configuration:
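    A sketch of such a configuration, using the fields documented below (values illustrative):

    # my-package/preDeploy/replacements.yml
    replacements:
      - name: "API Endpoint"
        pattern: "%%API_ENDPOINT%%"
        glob: "**/*.cls"
        isRegex: false
        environments:
          default: "https://api-dev.example.com"
          staging: "https://api-staging.example.com"
          prod: "https://api.example.com"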

    The properties accepted by the configuration file are:

    Field | Required | Description
    name | Yes | Human-readable name for the replacement
    pattern | Yes | The placeholder pattern to replace (e.g., %%API_URL%%)
    glob | Yes | File pattern for matching files (e.g., **/*.cls for all Apex classes)
    environments | Yes | Map of environment aliases to replacement values
    environments.default | No | Default value used for sandbox/scratch orgs (recommended)
    isRegex | No | Whether the pattern is a regular expression (default: false)

    Example Usage

    API Configuration Example

    Source file with placeholders:

    After pushing to a dev org, the placeholders are replaced:

    Pattern Detection

    During pull operations, sfp automatically detects potential patterns that could be converted to replacements:

    • URLs: Detects HTTP/HTTPS URLs

    • Email Addresses: Identifies email patterns

    • API Keys: Recognizes common API key formats

    • Custom Patterns: Detects repetitive values across files

    When patterns are detected, sfp provides suggestions:

    Environment Resolution

    Replacements are resolved based on the target org:

    • Exact Alias Match: First checks for an exact match with the org alias

    • Sandbox Default: For sandbox/scratch orgs, uses the default value

    • Production Requirement: Production deployments require explicit configuration

    Org Alias Mapping

    The org alias is determined from your Salesforce CLI authentication:
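    For example, list your authenticated orgs to see the aliases sfp will key on, and authenticate with an explicit alias so it matches your replacements.yml:

    # List authenticated orgs and their aliases
    sf org list

    # Authenticate a sandbox under the alias used in replacements.yml
    sf org login web --alias staging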

    Command Support

    String replacements are supported across the following sfp commands:

    Command | Support | Description
    sfp build | ✅ | Analyzes and embeds replacement configurations in artifacts
    sfp install | ✅ | Applies replacements during artifact installation
    sfp deploy | ✅ | Applies replacements during artifact deployment
    sfp push | ✅ | Applies forward replacements during source push
    sfp pull | ✅ | Applies reverse replacements during source pull
    sfp validate | ✅ | Applies replacements during validation

    Command Line Options

    Disable Replacements

    Override Replacements
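    Both options are available on the push and pull commands; for example:

    # Skip replacements entirely for one operation
    sfp push -p myPackage -o myOrg --no-replacements

    # Use an alternate replacement configuration
    sfp push -p myPackage -o myOrg --replacementsoverride custom-replacements.yml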

    JSON Output

    Both push and pull commands support JSON output with detailed replacement information:

    JSON Output Structure

    Troubleshooting

    Replacements Not Applied

    • Check File Location: Ensure replacements.yml is in preDeploy directory

    • Verify Glob Pattern: Test glob pattern matches your files

    • Check Org Alias: Verify the org alias matches your configuration

    • Review Logs: Check debug logs for replacement processing

    Pattern Not Found

    • Case Sensitivity: Patterns are case-sensitive

    • Special Characters: Escape special regex characters if needed

    • File Encoding: Ensure files are UTF-8 encoded

    Wrong Value Applied

    • Org Detection: Verify org alias with sf org list

    • Environment Priority: Check resolution order (exact match → default)

    • Override Files: Check if override file is being used

    Limitations

    • Replacements are text-based and work with any text file format

    • Binary files are not supported

    • Large files may impact performance

    • Regex patterns should be used carefully to avoid unintended matches

    Related Documentation

    • String Replacements Configuration - Configure string replacements in packages

    • String Replacements During Install - How replacements work during installation

    • sfp push - Deploy changes with replacements

    • sfp pull - Retrieve changes with reverse replacements

    Configuring LLM Providers

    sfp-pro
    sfp (community)

    This guide covers the setup and configuration of Large Language Model (LLM) providers for AI-powered features in sfp:

    AI Assisted Architecture Analysis

    sfp-pro
    sfp (community)

    The AI-powered review functionality provides intelligent architecture and code quality analysis during pull request reviews. This feature automatically analyzes changed files using advanced language models to provide contextual insights about architectural patterns, Flxbl framework compliance, and potential improvements.

    // A sample sfdx-project.json with a package
    {
      "packageDirectories": [
        {
          "path": "src/my-package",
          "package": "my-package",
          "versionNumber": "1.0.0.NEXT"
        }
      ]
    }
    sfp package create source -n "my-source-package" -r "src/my-package"
    
    # With domain (for organizing packages)
    sfp package create source -n "my-source-package" -r "src/my-package" --domain
    sfp package create unlocked -n "my-unlocked-package" -r "src/my-package" -v devhub
    
    # Org-dependent unlocked package
    sfp package create unlocked -n "my-package" -r "src/my-package" --org-dependent -v devhub
    
# Without namespace
    sfp package create unlocked -n "my-package" -r "src/my-package" --no-namespace -v devhub
    sfp package create data -n "my-data-package" -r "data/my-data-package"
    sfp package create diff -n "my-diff-package" -r "src/my-diff-package" -c "baseline-commit-id" -v devhub
    {
      "packageDirectories": [
        {
          "path": "src/my-source-package",
          "package": "my-source-package",
          "versionNumber": "1.0.0.NEXT"
        }
      ]
    }
    # Standard unlocked package
    sf package create --name my-package --package-type Unlocked --no-namespace -v devhub
    
    # Org-dependent unlocked package
    sf package create --name my-package --package-type Unlocked --org-dependent --no-namespace -v devhub
    {
      "path": "data/my-data-package",
      "package": "my-data-package",
      "versionNumber": "1.0.0.NEXT",
      "type": "data"
    }
    {
      "path": "src/my-diff-package",
      "package": "my-diff-package",
      "versionNumber": "1.0.0.NEXT",
      "type": "diff"
    }

| Package Type | Supported | Versioning | Description |
|---|---|---|---|
| Org-Dependent Unlocked Package | | | Unlocked package with org dependencies |
| Data Package | ✅ | Manual | Package for data migration |
| Diff Package | ✅ | Manual | Package containing only changed components |


    Overview

    The architecture analysis performs real-time analysis of pull request changes to:

    • Analyze architectural patterns and design consistency

    • Identify alignment with Flxbl framework best practices

    • Suggest improvements based on changed files context

    • Provide severity-based insights (info, warning, concern)

    • Generate actionable recommendations

    How It Works

    The AI assisted architecture analyzer integrates into the project:analyze command and:

    1. Detects PR Context: Automatically identifies when running in a pull request environment

    2. Analyzes Changed Files: Focuses analysis on modified files only (up to 10 files for token optimization)

    3. Applies AI Analysis: Uses configured AI provider to analyze architectural patterns

    4. Reports Findings: Generates structured insights without failing the build (informational only)

    5. Creates GitHub Checks: Posts results as GitHub check annotations when running in CI

    Prerequisites

OpenCode is currently supported only on macOS and Linux runtimes; it is not supported on Windows.

    This feature is exclusive to sfp-pro and not available in the community edition.

    For complete setup instructions, see Configuring LLM Providers.

    Quick Setup:

    Configuration

    The architecture analyzer is configured through a YAML configuration file at config/ai-architecture.yaml:

    Minimal Configuration

    For quick setup, create a minimal configuration:

    The linter will auto-detect available AI providers and use sensible defaults.

    AI Provider Setup

    For detailed provider configuration, see Configuring LLM Providers.

    Quick Reference

| Provider | Default Model | Setup Command |
|---|---|---|
| Anthropic (Recommended) | claude-4-5 | sfp ai auth --provider anthropic --auth |
| OpenAI | gpt-5 | sfp ai auth --provider openai --auth |
| Amazon Bedrock | claude-4-sonnet-xxxxx | Configure AWS credentials |

    The linter auto-detects providers in this priority:

    1. Environment variables (ANTHROPIC_API_KEY, OPENAI_API_KEY, etc.)

    2. Configuration in ai-architecture.yaml

    Usage in Pull Requests

    Automatic PR Detection

    When running in GitHub Actions or with PR environment variables:

    Manual Changed Files Specification

    For local testing or custom CI environments:

    Understanding Results

    The AI linter provides structured insights without failing builds:

    Insight Types

    • Pattern: Architectural patterns observed or missing

    • Concern: Potential issues requiring attention

    • Suggestion: Improvement recommendations

    • Alignment: Framework compliance observations

    Severity Levels

    • Info: Informational observations

    • Warning: Areas needing attention

    • Concern: Significant architectural considerations

    Sample Output

    Integration with CI/CD

    AI linter results are informational only and never fail the build. This ensures PR checks remain stable even if AI providers are unavailable.

    GitHub Actions Integration

    Handling Rate Limits

    The linter gracefully handles API limitations:

    • Rate Limits: Skips analysis with informational message

    • Timeouts: 60-second timeout protection

    • Token Limits: Analyzes up to 10 files, content limited to 5KB per file

    • Failures: Never blocks PR merge (informational only)

    Best Practices

    1. Configure Focus Areas

    Tailor analysis to your team's priorities:

    2. Add Context Files

    Provide architectural documentation for better analysis:

    3. Use with Other Linters

    Combine with other analysis tools for comprehensive coverage:

    4. Token Optimization

    For large PRs, the linter automatically:

    • Limits to 10 most relevant files

    • Truncates file content to 5KB

    • Focuses on text-based source files

    Troubleshooting

    AI Provider Not Detected

    Analysis Skipped

    Common reasons and solutions:

    1. Not Enabled: Set enabled: true in config/ai-architecture.yaml

    2. No Provider: Configure API keys or authenticate with sfp ai auth

    3. Rate Limited: Wait for rate limit reset or use different provider

    4. No Changed Files: Ensure PR context is properly detected

    Debugging

    Enable debug logging for detailed information:

    This shows:

    • Provider detection process

    • Changed files identified

    • API calls and responses

    • Error details if analysis fails

    Limitations

    1. Binary Files: Skips non-text files

    2. Build Impact: Never fails builds (informational only)

    3. Language Support: Best for Apex, JavaScript, TypeScript, XML

    Availability

| | sfp-pro | sfp (community) |
|---|---|---|
| Availability | ✅ | ❌ |
| From | October 25 | Not Available |

    Artifact (%%API_URL%%) → Install → Target Org (https://api.example.com)
    Source File (%%API_URL%%) → Push → Target Org (https://api.example.com)
    Target Org (https://api.example.com) → Pull → Source File (%%API_URL%%)
    src/
      your-package/
        preDeploy/
          replacements.yml
        main/
          default/
            classes/
    replacements:
      - name: "API Endpoint"
        pattern: "%%API_ENDPOINT%%"
        glob: "**/*.cls"
        environments:
          default: "https://api-sandbox.example.com"
          dev: "https://api-dev.example.com"
          staging: "https://api-staging.example.com"
          prod: "https://api.example.com"
    
      - name: "Support Email"
        pattern: "%%SUPPORT_EMAIL%%"
        glob: "**/*.cls"
        environments:
          default: "[email protected]"
          dev: "[email protected]"
          prod: "[email protected]"
    public class APIService {
        private static final String ENDPOINT = '%%API_ENDPOINT%%';
        private static final String API_KEY = '%%API_KEY%%';
    }
    public class APIService {
        private static final String ENDPOINT = 'https://api-dev.example.com';
        private static final String API_KEY = 'dev-key-12345';
    }
    ⚠️  Potential replacements detected:
    
      📄 src/package/main/default/classes/APIService.cls:
         • URL detected: 'https://new-api.example.com/v2'
           New URL pattern detected. Consider adding to replacements.yml
    
      💡 To include these values in future replacements, update your replacements.yml file.
    # Check your org aliases
    sf org list
    
    # Push with specific org alias
    sfp push -o dev-sandbox -p your-package
    # Skip replacements during install
    sfp install --targetorg dev --artifactdir artifacts --no-replacements
    
    # Skip replacements during push
    sfp push -p your-package -o dev --no-replacements
    
    # Skip replacements during pull
    sfp pull -p your-package -o dev --no-replacements
    # Use override file during install
    sfp install --targetorg dev --artifactdir artifacts --replacementsoverride custom-replacements.yml
    
    # Use override file during push
    sfp push -p your-package -o dev --replacementsoverride custom-replacements.yml
    
    # Use override file during pull
    sfp pull -p your-package -o dev --replacementsoverride custom-replacements.yml
    # Get JSON output with replacement details
    sfp push -p your-package -o dev --json
    sfp pull -p your-package -o dev --json
    {
      "hasError": false,
      "replacements": {
        "success": true,
        "packageName": "your-package",
        "filesModified": [
          {
            "path": "main/default/classes/APIService.cls",
            "replacements": [
              {
                "pattern": "%%API_ENDPOINT%%",
                "value": "https://api-dev.example.com",
                "count": 1
              }
            ],
            "totalCount": 1
          }
        ],
        "totalFiles": 1,
        "totalReplacements": 1,
        "errors": [],
        "orgAlias": "dev"
      }
    }
    # Install OpenCode CLI
    npm install -g opencode-ai
    
# Configure Anthropic (recommended)
    sfp ai auth --provider anthropic --auth
    # Enable/disable AI architecture analysis
    enabled: true
    
    # AI Provider Configuration (optional - auto-detects if not specified)
    provider: anthropic  # Options: anthropic, openai, google
    model: claude-4-sonnet-xxxxx # Optional - uses provider defaults if not specified
    
    # Architectural Patterns to Check
    patterns:
      - singleton
      - factory
      - repository
      - service-layer
    
    # Architecture Principles
    principles:
      - separation-of-concerns
      - single-responsibility
      - dependency-inversion
    
    # Focus Areas for Analysis
    focusAreas:
      - security
      - performance
      - maintainability
      - testability
    
    # Additional Context Files (optional)
    contextFiles:
      - ARCHITECTURE.md
      - docs/patterns.md
    enabled: true
    # Automatically detects PR context and analyzes only changed files
    sfp project:analyze
    
    # Explicitly exclude AI linter if needed
    sfp project:analyze --exclude-linters architecture
    # Manually specify changed files
    sfp project:analyze --changed-files "src/classes/MyClass.cls,src/lwc/myComponent/myComponent.js"
    📐 Architecture Analysis Results
    ════════════════════════════════
    
    ✅ Analysis Complete (AI-powered by anthropic/claude-4-sonnet)
    
    ## Summary
    Analyzed 5 changed files focusing on architectural patterns and Flxbl compliance.
    
    ## Key Insights
    
    ### ⚠️ Service Layer Pattern (Warning)
    File: src/classes/AccountController.cls
    Description: Direct SOQL queries in controller violates service layer pattern.
    Consider moving data access logic to a dedicated service class.
    
    ### ℹ️ Dependency Management (Info)
    File: src/classes/OrderService.cls
    Description: Good use of dependency injection pattern for testability.
    This aligns well with Flxbl framework principles.
    
    ### ⚠️ Error Handling (Concern)
    File: src/classes/PaymentProcessor.cls:45
    Description: Missing comprehensive error handling for external callouts.
    Implement try-catch blocks with proper logging and user feedback.
    
    ## Recommendations
    1. Extract data access logic to service layer classes
    2. Implement centralized error handling strategy
    3. Consider adding unit tests for new service methods
    4. Document architectural decisions in ARCHITECTURE.md
    - name: Run Project Analysis with AI Linter
      run: |
        sfp project:analyze --output-format github
      env:
        ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
        # GitHub context automatically detected
    focusAreas:
      - security        # For compliance-critical projects
      - performance     # For high-volume applications
      - maintainability # For long-term projects
    contextFiles:
      - ARCHITECTURE.md
      - docs/coding-standards.md
      - docs/patterns.md
    # Run all linters including AI analysis
    sfp project:analyze --fail-on duplicates,compliance
    
    # AI linter provides insights, others enforce rules
    # Check available providers
    echo $ANTHROPIC_API_KEY
    echo $OPENAI_API_KEY
    sfp project:analyze --loglevel debug
    • AI-Powered PR Linter - sfp-pro only

    • AI Assisted Insight Reports - Available in both sfp-pro and community (alpha)

    • AI-Assisted Error Analysis - Intelligent validation error analysis

    These features require OpenCode CLI and an authenticated LLM provider.

OpenCode is currently supported only on macOS and Linux runtimes; it is not supported on Windows.

    For sfp (community) users: These AI features are available in alpha. You can use npm install -g @flxbl-io/sfp@ai to access them.

    Prerequisites

    OpenCode CLI Installation

    OpenCode CLI is the underlying engine that manages AI interactions for sfp's AI-powered features. It handles provider authentication, model selection, and secure API communication.

    Installation Methods

    Global Installation (Recommended)

    Alternative Installation Methods

    For more installation options and troubleshooting, see the OpenCode documentation.

    Supported LLM Providers

    sfp currently supports the following LLM providers through OpenCode:

| Provider | Status | Recommended | Best For |
|---|---|---|---|
| Anthropic (Claude) | ✅ Fully Supported | ⭐ Yes | Best overall performance, Flxbl framework understanding |
| OpenAI | ✅ Fully Supported | Yes | Wide model selection, good performance |
| Amazon Bedrock | ✅ Fully Supported | Yes | Enterprise environments with AWS infrastructure |
    Provider Configuration

    Anthropic (Claude) - Recommended

Anthropic's Claude models provide the best understanding of Salesforce and Flxbl framework patterns. The default model is claude-4-sonnet-xxxxx, which offers an optimal balance between performance and cost.

    Setup Methods

    Method 1: Interactive Authentication (Recommended)

    Method 2: Environment Variable

Method 3: Configuration File

Create or edit config/ai-architecture.yaml:

    Getting an Anthropic API Key

    1. Visit console.anthropic.com

    2. Sign up or log in to your account

    3. Navigate to API Keys section

    4. Create a new API key for sfp usage

    5. Copy the key (starts with sk-ant-)

    Claude Models Available:

    • claude-4-sonnet-xxxxx - Recommended, best balance (default)

    • claude-4-opus-xxxxx - Most capable, higher cost

    OpenAI

    OpenAI provides access to GPT models with good code analysis capabilities.

    Setup Methods

    Method 1: Interactive Authentication

    Method 2: Environment Variable

    Method 3: Configuration File

    Getting an OpenAI API Key

    1. Visit platform.openai.com

    2. Sign up or log in

    3. Go to API Keys section

    4. Create a new secret key

    5. Copy the key (starts with sk-)

    Amazon Bedrock

    Amazon Bedrock is ideal for enterprise environments already using AWS infrastructure. It provides access to Claude models through AWS.

    Setup Methods

    Method 1: AWS Profile

    Method 2: AWS Credentials

    Method 3: Configuration File

    Important: AWS Bedrock requires both AWS_BEARER_TOKEN_BEDROCK and AWS_REGION environment variables to be set. Authentication will fail if either is missing.

    Bedrock Model Access: Ensure your AWS account has access to the Claude models in Bedrock. You may need to request access through the AWS Console under Bedrock > Model access.

    Regional Considerations

    Bedrock automatically handles model prefixes based on your AWS region:

    • US Regions: Models may require us. prefix

    • EU Regions: Models may require eu. prefix

    • AP Regions: Models may require apac. prefix

    The OpenCode SDK handles this automatically based on your AWS_REGION.
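For example, with the default Bedrock model, the effective identifier resolves roughly as follows (illustrative; exact prefixes depend on your region):

```bash
# AWS_REGION=us-east-1      → us.anthropic.claude-sonnet-4-20250514-v1:0
# AWS_REGION=eu-west-1      → eu.anthropic.claude-sonnet-4-20250514-v1:0
# AWS_REGION=ap-southeast-2 → apac.anthropic.claude-sonnet-4-20250514-v1:0
```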

    GitHub Copilot

    GitHub Copilot can be used if you have an active subscription with model access enabled.

    Setup

    GitHub Copilot requires the corresponding models to be activated in your GitHub Copilot Settings. Visit GitHub Copilot Features to enable model access.

    Configuration File Reference

    The AI features are configured through config/ai-assist.yaml in your project root:

    Authentication Management

    Checking Authentication Status

    Testing Provider Inference

    After configuring authentication, you can verify that providers are working correctly using the ai check command:

    This command performs a simple inference test to verify:

    • Authentication is configured correctly

    • The provider is accessible

    • Model inference is working

    • Response time and performance

    Authentication Storage

    Credentials are stored securely in ~/.sfp/ai-auth.json with appropriate file permissions. This file is created automatically when you authenticate.

    Rotating API Keys

    Usage Priority

    When multiple authentication methods are available, sfp uses the following priority:

    1. Environment Variables - Highest priority, useful for CI/CD

    2. Stored Credentials - From ~/.sfp/ai-auth.json

    3. Configuration File - From config/ai-assist.yaml
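As a sketch, in a CI job an exported key takes precedence over credentials stored earlier with sfp ai auth:

```bash
# Credentials for anthropic may already exist in ~/.sfp/ai-auth.json,
# but the environment variable wins for this shell session:
export ANTHROPIC_API_KEY="sk-ant-ci-key-xxxxx"   # hypothetical key
sfp ai check --provider anthropic                # uses the env var, not the stored key
```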

    Troubleshooting

    OpenCode CLI Not Found

    Provider Not Available

    AWS Bedrock Specific Issues

    Both Environment Variables Required

    Authentication Failed

    • Verify both AWS_BEARER_TOKEN_BEDROCK and AWS_REGION are set

    • Check that your bearer token is valid and not expired

    • Ensure your AWS account has access to Claude models in Bedrock

    API Rate Limits

    If you encounter rate limits:

    • Anthropic: Check your usage at console.anthropic.com

    • OpenAI: Monitor at platform.openai.com/usage

    • Bedrock: Check AWS CloudWatch metrics

    Model Not Found

    Ensure you're using the correct model identifier:

    Availability

| | sfp-pro | sfp (community) |
|---|---|---|
| Availability | ✅ | 🔶 |
| From | October 25 | December 25 |
| Features | PR Linter, Reports, Error Analysis | Reports Only |

    Development Workflow

    This guide walks through the complete development workflow using sfp in a modular Salesforce project following the Flxbl framework.

    Overview

    The development workflow in an sfp-powered project follows an iterative approach where developers work in isolated environments, make changes using source-driven development, and submit their work through pull requests that trigger automated validation and review environments.

    # Install OpenCode CLI globally via npm
    npm install -g opencode-ai
    
    # Verify installation
    opencode --version
    # Using yarn
    yarn global add opencode-ai
    
    # Using pnpm
    pnpm add -g opencode-ai
    
    # Using Homebrew (macOS/Linux)
    brew install opencode-ai
    # Authenticate with Anthropic
    sfp ai auth --provider anthropic --auth
    
    # This will prompt for your API key and store it securely
    # Add to your shell profile (.bashrc, .zshrc, etc.)
    export ANTHROPIC_API_KEY="sk-ant-xxxxxxxxxxxxx"
    enabled: true
    provider: anthropic
    # Model is optional - uses claude-sonnet-4-20250514 by default
    sfp ai auth --provider openai --auth
    export OPENAI_API_KEY="sk-xxxxxxxxxxxxx"
    # In config/ai-assist.yaml
    enabled: true
    provider: openai
    # Model is optional - uses gpt-5 by default
    # Set both required environment variables
    export AWS_BEARER_TOKEN_BEDROCK="your-bearer-token"
    export AWS_REGION="us-east-1"
    
    # Both variables must be set for authentication to work
    # Authenticate with Amazon Bedrock
    sfp ai auth --provider amazon-bedrock --auth
    
    # This will prompt for both Bearer Token and Region
    # In config/ai-assist.yaml
    enabled: true
    provider: amazon-bedrock
    model: anthropic.claude-sonnet-4-20250514-v1:0  # Default model
    # Authenticate with GitHub Copilot
    sfp ai auth --provider github-copilot --auth
    
    # Ensure models are enabled in GitHub Settings:
    # https://github.com/settings/copilot/features
    # Enable/disable AI features
    enabled: true
    
    # Provider Configuration
    provider: anthropic  # anthropic, openai, amazon-bedrock, github-copilot
    
    # Model Configuration (Optional - uses provider defaults if not specified)
    # Default models:
    # - anthropic: claude-sonnet-4-20250514
    # - github-copilot: claude-sonnet-4
    # - openai: gpt-5
    # - amazon-bedrock: anthropic.claude-sonnet-4-20250514-v1:0
    model: claude-sonnet-4-20250514  # Override default model
    
    # Architectural Patterns to Check (for PR Linter)
    patterns:
      - singleton
      - factory
      - repository
      - service-layer
    
    # Architecture Principles
    principles:
      - separation-of-concerns
      - single-responsibility
      - dependency-inversion
    
    # Focus Areas for Analysis
    focusAreas:
      - security
      - performance
      - maintainability
      - testability
    
    # Additional Context Files
    contextFiles:
      - ARCHITECTURE.md
      - docs/patterns.md
      - docs/coding-standards.md
    # Check all providers
    sfp ai auth
    
    # Check specific provider
    sfp ai auth --provider anthropic
    
    # List all supported providers
    sfp ai auth --list
    # Test all configured providers
    sfp ai check
    
    # Test specific provider with default model
    sfp ai check --provider anthropic
    
    # Test Amazon Bedrock (uses default: anthropic.claude-sonnet-4-20250514-v1:0)
    sfp ai check --provider amazon-bedrock
    
    # Test GitHub Copilot (uses default: claude-sonnet-4)
    sfp ai check --provider github-copilot
    
    # Re-authenticate to update stored credentials
    sfp ai auth --provider anthropic --auth
    
    # Or update environment variable
    export ANTHROPIC_API_KEY="sk-ant-new-key-xxxxx"
    # Verify installation
    which opencode
    
    # If not found, reinstall
    npm install -g opencode-ai
    
    # Check npm global bin path is in PATH
    npm bin -g
    # Check authentication
    sfp ai auth --provider anthropic
    
    # Verify environment variables
    echo $ANTHROPIC_API_KEY
    
    # For AWS Bedrock - check both required variables
    echo $AWS_BEARER_TOKEN_BEDROCK
    echo $AWS_REGION
    
    # Check stored credentials exist
    ls -la ~/.sfp/ai-auth.json
    
    # Test provider inference
    sfp ai check --provider <provider-name>
    # This will NOT work (missing region)
    export AWS_BEARER_TOKEN_BEDROCK="token"
    
    # This will work (both variables set)
    export AWS_BEARER_TOKEN_BEDROCK="token"
    export AWS_REGION="us-east-1"
    # Correct
    model: claude-4-sonnet-xxxxx
    


    Prerequisites

    DevHub Access Required

    Before starting development with sfp, ensure you have:

    1. DevHub access - Required for:

      • Building packages (all types)

      • Creating scratch orgs

      • Managing unlocked packages

    2. DevHub user setup:

      • Your user must be added to the DevHub org

  • Follow Salesforce's guide: Add DevHub License Users

      • Authenticate to your DevHub:

    3. Verify DevHub connection:

    1. Starting a New Feature

    Fetch a Development Environment

    Every feature or story begins with a developer fetching a fresh environment from a pre-prepared pool. The frequency depends on your team's practice:

    • Scratch Orgs: Can be fetched for every story or feature

    • Sandboxes: Typically fetched at the start of an iteration or sprint

    Fetch from Scratch Org Pool (Community Edition - Local Pools)

    Fetch from Pool using sfp Server (sfp-pro - Server-Managed Pools)

    Create a New Sandbox (if needed)

    Authenticate to Your Environment

    Once you have your environment:

    2. Development Cycle

    Pull Latest Metadata

    Before making changes, ensure you have the latest metadata from your org:

    The pull command will:

    • Retrieve metadata changes from your org

    • Apply reverse text replacements to convert environment-specific values back to placeholders

    • Update your local source files

    Make Your Changes

    Now you can work on your feature using your preferred IDE:

    1. Modify existing metadata in package directories

    2. Create new components using SF CLI or your IDE

    3. Add new packages if needed:

4. Organize packages into logical groups using release configs (domains are conceptual, not explicit commands)

    Push Changes to Your Org

    Deploy your local changes to the development org:

    The push command will:

    • Apply text replacements for environment-specific values

    • Deploy metadata to your org

    • Run tests if specified

    Build and Test Locally

    Build Artifacts

    Test that your packages can be built successfully:

    Run Apex Tests

    Execute tests in your development org to validate your changes. sfp follows a package-centric testing approach:

    For detailed information on test levels, coverage validation, output formats, and CI/CD integration, see Running Apex Tests.

    Install to Your Org (Optional)

    While developers rarely need to install built artifacts to their own orgs, you can test the installation:

    3. Dependency Management

    As you develop, you may need to manage package dependencies:

    Analyze Dependencies

    4. Submitting Your Work

    Create a Pull Request

    Once your feature is complete:

    CI/CD Pipeline Takes Over

    When you create a PR, the automated pipeline will:

1. Run sfp validate to verify your changes:

2. Create a review environment for acceptance testing:

3. Run quality checks:

      • Code coverage validation

      • Dependency validation

      • Package structure verification

    Review Environment Testing

    The review environment URL is posted to your PR for stakeholders to test:

    • Product owners can validate functionality

    • QA can run acceptance tests

    • Other developers can review the implementation

    5. Post-Merge

    After your PR is approved and merged:

    1. Artifacts are built from the main branch

    2. Published to artifact repository

    3. Ready for release to higher environments

    Common Workflows

    Working with Aliasified Packages

    When working with environment-specific metadata:

    Using Text Replacements

    For configuration values that change per environment:

    Then push/pull will automatically handle replacements:

    Handling Destructive Changes

    When you need to delete metadata:

1. Move components to pre-destructive/ or post-destructive/ folders (see the layout sketch below)

    2. Push changes normally:

    The destructive changes are automatically processed.
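A minimal layout sketch, assuming a package named my-package and a hypothetical ObsoleteHelper class being removed:

```bash
src/my-package/
├── pre-destructive/              # components deleted before the package deploys
│   └── classes/
│       ├── ObsoleteHelper.cls
│       └── ObsoleteHelper.cls-meta.xml
└── main/
    └── default/
        └── classes/
```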

    Troubleshooting

    Pool is Empty

    Community Edition (Local Pools)

    sfp-pro (Server-Managed Pools)

    Push/Pull Conflicts

    Build Failures

    DevHub Connection Issues

    # List available scratch orgs in pool (alias: pool:list)
    sfp pool scratch list --tag dev-pool
    
    # Fetch a scratch org from the pool (alias: pool:fetch)
    sfp pool scratch fetch --tag dev-pool --alias my-feature-org
    
    # Initialize a pool if empty (aliases: prepare, pool:prepare)
    sfp pool scratch init --tag dev-pool \
      --targetdevhubusername mydevhub \
      --config config/project-scratch-def.json \
      --count 5
    # List available instances (works for both scratch orgs and sandboxes)
    sfp server pool instance list \
      --repository myorg/myrepo \
      --tag dev-pool
    
    # Fetch an org from the pool (scratch or sandbox)
    sfp server pool instance fetch \
      --repository myorg/myrepo \
      --tag dev-pool \
      --assignment-id feature-123
    
    # Extend org expiration if needed
    sfp server pool instance extend \
      --repository myorg/myrepo \
      --tag dev-pool \
      --assignment-id feature-123 \
      --expiration-hours 48
    
    # Unassign and return to pool when done
    sfp server pool instance unassign \
      --repository myorg/myrepo \
      --tag dev-pool \
      --assignment-id feature-123
    # Create a new sandbox directly
    sfp sandbox create --name feature-sandbox \
      --type Developer \
      --source-org production \
      --alias my-feature-sandbox
    # Open the org to verify access (sfp-pro)
    sfp org open --targetusername my-feature-org
    
    # Open in a specific browser (sfp-pro)
    sfp org open --targetusername my-feature-org --browser chrome
    
    # For community edition, use Salesforce CLI
    sf org open --target-org my-feature-org
    
    # Set as default for convenience (sfp-pro)
    sfp config set target-org my-feature-org
    
    # Set globally (sfp-pro)
    sfp config set target-org my-feature-org --global
    # Pull all changes from the org (using aliases: pull, source:pull, project:pull)
    sfp pull --targetusername my-feature-org
    
    # Pull with conflict resolution
    sfp pull --targetusername my-feature-org --ignore-conflicts
    
    # Pull a specific package
    sfp pull --targetusername my-feature-org --package my-package
    
    # Pull and see what replacements were reversed (sfp-pro)
    sfp pull --targetusername my-feature-org --json
    # Create a new source package (sfp-pro)
    sfp package create source -n "feature-payment" \
      -r "src/payment-processing" \
      --domain
    
    # Create an unlocked package
    sfp package create unlocked -n "feature-payment" \
      -r "src/payment-processing" \
      -v mydevhub
    
    # Create a data package
    sfp package create data -n "reference-data" \
      -r "data/reference-data"
    
    # For community edition, manually add to sfdx-project.json
    # Push all changes (using aliases: push, source:push, project:push)
    sfp push --targetusername my-feature-org
    
    # Push a specific package
    sfp push --targetusername my-feature-org --package my-package
    
    # Push ignoring conflicts
    sfp push --targetusername my-feature-org --ignore-conflicts
    
    # Push and see what replacements were applied (sfp-pro)
    sfp push --targetusername my-feature-org --json
    # Build all packages (DevHub required)
    sfp build --devhubalias mydevhub
    
    # Build a specific domain
    sfp build --devhubalias mydevhub --domain sales
    
    # Build a specific package
    sfp build --devhubalias mydevhub --package payment-processing
    
    # Build with different options
    sfp build --devhubalias mydevhub \
      --branch feature/payment \
      --buildnumber 123 \
      --diffcheck
    
    # Note: DevHub is required even for source packages to resolve dependencies
    # Test a specific package (recommended)
    sfp apextests trigger -o my-feature-org -l RunAllTestsInPackage -n sales-core
    
    # Test all packages in a domain
    sfp apextests trigger -o my-feature-org -l RunAllTestsInDomain \
      -r config/release-config.yaml
    
    # Quick test during development
    sfp apextests trigger -o my-feature-org -l RunSpecifiedTests \
      --specifiedtests PaymentProcessorTest
    
    # Test with code coverage validation
    sfp apextests trigger -o my-feature-org -l RunAllTestsInPackage \
      -n sales-core -c -p 80
    # Install a single package
    sfp install --target-org my-feature-org \
      --artifacts artifacts \
      --package payment-processing
    
# Skip packages that are already installed for faster deployment
    sfp install --target-org my-feature-org \
      --artifacts artifacts \
      --skipifalreadyinstalled
    # Understand package dependencies
    sfp dependency explain --package payment-processing
    
    # Expand all transitive dependencies (for troubleshooting)
    sfp dependency expand --target-devhub mydevhub
    
    # Clean up redundant dependencies
    sfp dependency shrink --target-devhub mydevhub
    # Commit your changes
    git add .
    git commit -m "feat: implement payment processing module"
    
    # Push to your feature branch
    git push origin feature/payment-processing
    
    # Create PR using GitHub CLI (optional)
    gh pr create --title "Payment Processing Module" \
      --body "Implements new payment gateway integration"
    # This runs automatically in CI/CD
    sfp validate org --target-org validation-org \
      --mode thorough \
      --coverageThreshold 75
    # CI/CD creates an ephemeral environment
    sfp pool scratch fetch --pool review-pool \
      --alias pr-123-review
    
    # Install the changes
    sfp install --target-org pr-123-review \
      --artifacts artifacts
    # This happens automatically in CI/CD
    sfp build --branch main
    sfp publish --artifacts artifacts \
      --npm-registry https://your-registry.com
    # Pull from a specific environment
    sfp pull --targetusername dev-sandbox
    
    # The correct variant is automatically selected
    # src-env-specific/main/dev/* contents are used
    # Create preDeploy/replacements.yml in your package
    replacements:
      - name: "API Endpoint"
        glob: "**/*.cls"
        pattern: "%%API_URL%%"
        environments:
          default: "https://api.dev.example.com"
          prod: "https://api.example.com"
    # Push replaces placeholders with environment values
    sfp push --targetusername my-feature-org
    
    # Pull reverses replacements back to placeholders
    sfp pull --targetusername my-feature-org
    sfp push --targetusername my-feature-org
    # Check pool status
    sfp pool scratch list --tag dev-pool
    
    # Replenish the pool (aliases: prepare, pool:prepare)
    sfp pool scratch init --tag dev-pool \
      --targetdevhubusername mydevhub \
      --count 5
    # Check pool status
    sfp server pool status --repository myorg/myrepo --tag dev-pool
    
    # Replenish pool (works for both scratch orgs and sandboxes)
    sfp server pool replenish \
      --repository myorg/myrepo \
      --tag dev-pool
    # Ignore conflicts during pull
    sfp pull --targetusername my-feature-org --ignore-conflicts
    
    # Ignore conflicts during push
    sfp push --targetusername my-feature-org --ignore-conflicts
    # Check for issues in specific package
    sfp build --devhubalias mydevhub \
      --package problematic-package \
      --loglevel DEBUG
    
    # Validate dependencies
    sfp dependency explain --package problematic-package
    # Re-authenticate to DevHub
    sf org login web --alias mydevhub --set-default-dev-hub
    
    # Verify DevHub is enabled
    sf org display --target-dev-hub
    
    # Check DevHub limits
    sf limits api display --target-org mydevhub
    sf org login web --alias mydevhub --set-default-dev-hub
    # Check your DevHub connection
    sf org display --target-dev-hub

    Running Apex Tests

    The apextests trigger command allows you to independently execute Apex tests in your Salesforce org. While the validate command automatically runs tests per package during validation, this command gives you direct control over test execution with support for multiple test levels, code coverage validation, and output formats.

    Primary Testing Patterns

    sfp follows a package-centric testing approach where tests are organized and executed at the package or domain level, rather than running all org tests together. This aligns with how the validate command works and provides better isolation and faster feedback.

    Test Levels

    RunAllTestsInPackage (Recommended)

    Runs all tests within specified package(s). This is the primary testing pattern in sfp and matches how the validate command executes tests. Supports code coverage validation at both package and individual class levels.

    This pattern matches how validate command executes tests - each package is tested independently with its own test classes. This provides:

    • Better test isolation and faster feedback

    • Package-level code coverage validation

    • Clear attribution of test failures to specific packages

    • Parallel test execution per package (when enabled)

    RunAllTestsInDomain (Recommended for Domain Validation)

    Runs tests for all packages defined in a domain from your release config. This is the recommended pattern for validating entire domains and matches how you would validate a domain for release.

    This executes tests for each package in the domain sequentially, providing comprehensive domain validation. Use this when:

    • Validating changes across a domain before release

    • Testing related packages together as a unit

    • Performing end-to-end domain validation

    RunSpecifiedTests

    Runs specific test classes or methods. Useful for rapid iteration during active development.

    Use during development for quick feedback cycles when working on specific features.

    RunApexTestSuite

    Runs all tests in a test suite defined in your org.

    Useful for running pre-defined test groups or smoke test suites.

    RunLocalTests

    Runs all tests in your org except those from managed packages. This is the default test level in Salesforce but not the recommended pattern in sfp.

    Note: While this is the Salesforce default, sfp recommends package-level or domain-level testing for better isolation and faster feedback. Use this only when you specifically need to run all org tests together, such as for compliance requirements or full org validation.

    RunAllTestsInOrg

    Runs all tests in your org, including managed packages. Rarely used due to long execution time.

    Use only for complete org validation scenarios.

    Code Coverage Validation

    Individual Class Coverage

    Validates that each Apex class in the package meets the minimum coverage threshold. Every class must meet or exceed the specified percentage.

    Coverage threshold:

    • Default: 75%

    • Adjustable with -p flag

    • Applied per class, not as an average

    Output includes:

    • List of all classes with their coverage percentages

    • Classes that meet the threshold

    • Classes that fail to meet the threshold

    • Overall package coverage percentage

    Package Coverage

    Validates that the overall package coverage meets the minimum threshold. The average coverage across all classes must meet or exceed the specified percentage.

    Coverage calculation:

    • Aggregates coverage across all classes in package

    • Calculated as: (total covered lines / total lines) * 100

    • Only classes with Apex code count toward coverage
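A small worked example with two hypothetical classes:

```bash
# AccountService: 191 covered lines of 200  (95.5% class coverage)
# OrderService:   634 covered lines of 800  (79.25% class coverage)
#
# Package coverage = (191 + 634) / (200 + 800) * 100 = 82.5%
```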

    Coverage vs No Coverage

    Running tests without coverage flags still executes tests but doesn't fetch or validate coverage data:

    Note: Fetching coverage data adds time to test execution, so only use it when needed.

    Output Formats

    Note: The dashboard output format is a new feature introduced in the November 2025 release of sfp-pro.

    Raw Format (Default)

    Standard Salesforce API output with JUnit XML and JSON results. This is the default format.

    Generates:

    • .testresults/test-result-<testRunId>.json - Raw Salesforce test results

    • .testresults/test-result-<testRunId>-junit.xml - JUnit XML format

• .testresults/test-result-<testRunId>-coverage.json - Coverage data (if coverage enabled)

• .testresults/testresults.md - Markdown summary

    Dashboard Format

Available in: sfp-pro November 2025 release and later

    Structured JSON format optimized for dashboards, metrics systems, and reporting tools. Unlike the raw Salesforce API output, the dashboard format provides enriched, pre-processed data that's ready for consumption by external systems.

    Generates all raw format files plus:

    • .testresults/<testRunId>/dashboard.json - Structured test results

    • .testresults/<testRunId>/testresults.md - Enhanced markdown summary

    • .testresults/latest.json - Symlink to latest dashboard result

    Dashboard JSON Schema

    The dashboard.json file contains a comprehensive test execution report:

    How Dashboard Output is Used

    For Metrics and Observability:

    For Test History Tracking:

    Dashboard vs Raw Format

| Feature | Raw Format | Dashboard Format |
|---|---|---|
| Output | Salesforce API response | Processed, enriched data |
| Structure | Flat, verbose | Hierarchical, organized |
| File Location | .testresults/ root | .testresults/<testRunId>/ |
| Coverage | Separate file | Integrated in JSON |
| Latest Symlink | No | Yes (latest.json) |
| Metadata | Limited | Environment, repo, commit |
| Use Case | Salesforce tooling | External systems, dashboards |
| Exit on Failure | Yes (exit code 1) | No (exit code 0) |

    Non-Blocking Dashboard Mode

    In dashboard mode, test failures don't cause the command to exit with error code 1, allowing you to collect test results even when tests fail:

    This is useful for:

    • Collecting metrics regardless of test outcome

    • Generating reports without blocking pipelines

    • Archiving test history across passing and failing runs

    • Trend analysis and test reliability tracking

    Both Format

    Available in: sfp-pro November 2025 release and later

    Generates both raw and dashboard formats in a single execution.

    When to use:

    • Maintaining compatibility while adopting dashboard format

    • Comprehensive test result archiving

    Output Directory Structure

    After running tests, sfp creates a .testresults directory:

    Troubleshooting

    Tests Timeout

    Control how long the command waits for tests to complete:

    Wait time behavior:

    • Omit -w flag: Wait indefinitely (no timeout)

    • -w 0: Wait indefinitely (no timeout)

    • -w <minutes>: Wait up to specified minutes before timing out

    For most scenarios, omitting the wait time or using 0 is recommended to avoid premature timeouts on large test suites.

    Parallel vs Serial Execution

    Some test classes interfere with each other when run in parallel. Configure serial execution in your package descriptor:

    Or use the synchronous flag (if supported):

    Coverage Validation Failures

    See which classes failed coverage requirements:

    The debug output shows:

    • Each class and its coverage percentage

    • Which classes passed/failed threshold

    • Overall package coverage

    Tests Not Found

    If no tests are executed:

    1. Check that test classes exist in the package:

2. Ensure test classes follow naming conventions:

  • Class name ends with Test

  • Methods are annotated with @isTest

3. Verify test classes are in the correct package directory

    Mixed Results with Retries

    sfp automatically retries failed tests in serial mode. This is normal behavior to handle flaky tests that fail in parallel execution:

    1. First run: Tests execute in parallel

    2. If failures occur: Failed tests retry in serial mode

    3. Final results: Combines both runs, removes duplicates

    Additional Options

    Wait Time Control

    Control test execution timeout behavior:

    Options:

    • Omit -w: Wait indefinitely

    • -w 0: Wait indefinitely

    • -w <minutes>: Wait specified minutes before timeout

    Specifying API Version

    Override the API version for the test run:

    Git Metadata

    Include git information in test results:

    This metadata appears in:

    • Dashboard JSON output

    • Markdown summaries

    • Test reports

    Custom Environment Name

    Specify environment name for dashboard format:

    Defaults to the target org alias if not specified.

    # Test a specific package (primary pattern)
    sfp apextests trigger -o my-org -l RunAllTestsInPackage -n my-package
    
    # Test all packages in a domain (recommended for domain validation)
    sfp apextests trigger -o my-org -l RunAllTestsInDomain -r config/release-config.yaml
    
    # Test multiple packages together
    sfp apextests trigger -o my-org -l RunAllTestsInPackage -n package-a -n package-b
    
    # Quick test during development
    sfp apextests trigger -o my-org -l RunSpecifiedTests --specifiedtests MyTest


    # Single package
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n sales-core
    
    # Multiple packages
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage \
      -n sales-core \
      -n sales-ui \
      -n sales-integration
    sfp apextests trigger -o dev-org -l RunAllTestsInDomain \
      -r config/release-config-sales.yaml
    # Run specific test classes
    sfp apextests trigger -o dev-org -l RunSpecifiedTests \
      --specifiedtests AccountTest,ContactTest
    
    # Run specific test methods
    sfp apextests trigger -o dev-org -l RunSpecifiedTests \
      --specifiedtests AccountTest.testCreate,ContactTest.testUpdate
    sfp apextests trigger -o dev-org -l RunApexTestSuite \
      --apextestsuite QuickTests
    sfp apextests trigger -o dev-org -l RunLocalTests
    sfp apextests trigger -o dev-org -l RunAllTestsInOrg
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage \
      -n my-package \
      -c \
      -p 80
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage \
      -n my-package \
      --validatepackagecoverage \
      -p 75
    # Run tests without coverage data
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package
    
    # Run tests with coverage fetching and validation
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package -c
    sfp apextests trigger -o dev-org -l RunLocalTests
    # Generate dashboard format
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package \
      --outputformat dashboard \
      --environment dev
    {
      "environment": "dev",
      "timestamp": "2025-11-24T10:30:00.000Z",
      "duration": 125000,
      "testExecutionTime": 120000,
      "commandTime": 115000,
      "repository": "https://github.com/myorg/myrepo",
      "commitSha": "abc123def",
      "branch": "main",
    
      "summary": {
        "totalTests": 150,
        "passed": 145,
        "failed": 5,
        "skipped": 0,
        "passingRate": 96.67,
        "overallCoverage": 82.5,
        "coveredLines": 8250,
        "totalLines": 10000,
        "outcome": "Failed"
      },
    
      "coverage": {
        "overallCoverage": 82.5,
        "totalLines": 10000,
        "coveredLines": 8250,
        "classes": [
          {
            "name": "AccountService",
            "id": "01p...",
            "coverage": 95.5,
            "totalLines": 200,
            "coveredLines": 191,
            "status": "pass"
          }
        ],
        "uncoveredClasses": ["LegacyHelper"],
        "belowThreshold": [
          {
            "name": "OldProcessor",
            "coverage": 65.0,
            "threshold": 75
          }
        ]
      },
    
      "testCases": [
        {
          "id": "07M...",
          "name": "AccountService.testCreateAccount",
          "className": "AccountService",
          "methodName": "testCreateAccount",
          "time": 250,
          "status": "passed"
        }
      ],
    
      "topFailingTests": [
        {
          "name": "ContactTest.testValidation",
          "className": "ContactTest",
          "methodName": "testValidation",
          "failureMessage": "System.AssertException: Expected 5, but got 3"
        }
      ],
    
      "metadata": {
        "testRunId": "707...",
        "orgId": "00D...",
        "username": "[email protected]",
        "package": "sales-core",
        "testLevel": "RunAllTestsInPackage"
      }
    }
    # Run tests and extract metrics
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package \
      --outputformat dashboard --json > results.json
    
    # Extract key metrics
    PASSING_RATE=$(jq -r '.result.summary.passingRate' results.json)
    COVERAGE=$(jq -r '.result.summary.overallCoverage' results.json)
    
    # Push to your metrics backend
    curl -X POST https://metrics.example.com/api/tests \
      -d @.testresults/latest.json
    # The latest.json symlink always points to most recent result
    jq -r '.summary.passingRate' .testresults/latest.json
    
    # Track coverage trends
    jq -r '.coverage.belowThreshold[] | "\(.name): \(.coverage)%"' \
      .testresults/latest.json
    # Command succeeds even if tests fail
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package \
      --outputformat dashboard
    
    echo $?  # Always 0 in dashboard mode
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package \
      --outputformat both \
      --environment ci
    .testresults/
    ├── test-result-<testRunId>-junit.xml      # JUnit XML format
    ├── test-result-<testRunId>.json           # Raw Salesforce test results
    ├── test-result-<testRunId>-coverage.json  # Code coverage data
    ├── testresults.md                         # Markdown summary
    ├── <testRunId>/                           # Dashboard format directory
    │   ├── dashboard.json                     # Structured test results
    │   └── testresults.md                     # Enhanced markdown summary
    └── latest.json                            # Symlink to latest dashboard result
    # Wait up to 120 minutes
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package -w 120
    
    # Wait indefinitely (no timeout) - useful for very large test suites
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package -w 0
    
    # Omitting the flag also waits indefinitely
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package
    {
      "path": "src/my-package",
      "package": "my-package",
      "testSynchronous": true
    }
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage \
      -n my-package -s
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage \
      -n my-package -c --loglevel debug
    # Verify package contents
    sfp build -d mydevhub -n my-package --loglevel debug
    # Wait indefinitely (recommended for large test suites)
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package -w 0
    
    # Wait up to 120 minutes
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package -w 120
    
    # Omitting -w also waits indefinitely
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package
    sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package --apiversion 60.0
    sfp apextests trigger -o dev-org -l RunLocalTests \
      --commitsha abc123def \
      --repourl https://github.com/myorg/myrepo
    sfp apextests trigger -o dev-org -l RunLocalTests \
      --outputformat dashboard \
      --environment "dev-feature-branch"

    Compliance Check

    sfp-pro
    sfp (community)

    Availability

| | sfp-pro | sfp (community) |
|---|---|---|
| Availability | ✅ | ❌ |
| From | September 25 | |

    The compliance check functionality ensures your Salesforce metadata adheres to organizational standards and best practices. This feature helps maintain code quality, security, and consistency across your Salesforce project by enforcing configurable rules.

    Overview

    Compliance checking provides a comprehensive framework for:

    • Enforcing coding standards and best practices

    • Preventing security vulnerabilities like hardcoded IDs and URLs

    • Maintaining API version consistency

• Validating metadata field values against organizational policies

• Generating detailed violation reports with remediation guidance

    How It Works

    The compliance checker:

    1. Loads rules from your configuration file (defaults to config/compliance-rules.yaml)

    2. Scans metadata components using Salesforce's ComponentSet API

    3. Applies rules based on metadata type and field specifications

4. Evaluates content and field values using configurable operators

5. Reports violations with file locations, severity levels, and helpful messages

    Configuration

    Compliance checking is configured through YAML files that define rules and their enforcement:

    Configuration File Structure

    Create a compliance rules file using the generate command:

    This creates config/compliance-rules.yaml with sample rules:

    Built-in Rules

    The system includes several built-in rules that can be enabled:

| Rule ID | Description | Metadata Types | Default |
|---|---|---|---|
| no-hardcoded-ids | Prevents hardcoded 15/18 character IDs | ApexClass, ApexTrigger, Flow | Disabled |
| no-hardcoded-urls | Prevents hardcoded Salesforce URLs | ApexClass, ApexTrigger, Flow | Disabled |
| profile-no-modify-all | Prevents Modify All Data permission | Profile | Disabled |
| flow-inactive-check | Detects inactive or draft flows | Flow | Disabled |
| permissionset-view-all-data | Prevents ViewAllData permission | PermissionSet | Disabled |
| permissionset-modify-all-data | Prevents ModifyAllData permission | PermissionSet | Disabled |
| permissionset-author-apex | Detects AuthorApex permission | PermissionSet | Disabled |
| permissionset-customize-application | Detects CustomizeApplication permission | PermissionSet | Disabled |

Documentation & Metadata Quality

| Rule ID | Description | Metadata Types | Default |
|---|---|---|---|
| validation-rule-missing-description | Ensures validation rules have descriptions | ValidationRule | Disabled |
| field-missing-description | Ensures custom fields have descriptions | CustomField | Disabled |
| field-missing-help-text | Ensures custom fields have help text | CustomField | Disabled |

    Custom Rules

    Define custom rules to match your specific organizational requirements:

    Rule Configuration Options

| Field | Description | Required | Values |
|---|---|---|---|
| enabled | Whether the rule is active | No | true/false (default: false) |
| metadata | Metadata types to check | Yes | Array of metadata type names |
| field | Field path to evaluate | Yes | Dot-notation path or "_content" |
| operator | Comparison operator | Yes | See operators table below |
| value | Expected value for comparison | Yes | String, Number, Boolean |
| severity | Violation severity level | Yes | error, warning, info |
| message | Custom violation message | No | String |

    Supported Operators

| Operator | Description | Example |
|---|---|---|
| greater_than | Numeric field is greater than value | apiVersion > 58.0 |
| less_than | Numeric field is less than value | apiVersion < 60.0 |
| greater_or_equal | Numeric field is greater than or equal to value | apiVersion >= 59.0 |
| less_or_equal | Numeric field is less than or equal to value | apiVersion <= 59.0 |
| regex | Field matches regular expression | Content matches [a-zA-Z0-9]{15} |

    Field Path Specifications

    XML Field Paths

    For XML metadata files, use dot notation to specify field paths:
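As a hedged illustration, a Profile rule could address the nested <userPermissions><name>...</name></userPermissions> element with a dotted path (rule fragment only; the keys follow the configuration options table above):

```yaml
metadata:
  - Profile
field: userPermissions.name   # dot notation into the nested XML element
operator: regex
value: "ModifyAllData"
severity: error
```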

    Content Analysis

    Use the special _content field to analyze file content:
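For instance, a sketch of a custom rule mirroring the built-in no-hardcoded-ids rule; the top-level rules key and the id key are assumptions:

```yaml
rules:
  - id: my-no-hardcoded-ids       # illustrative rule id
    enabled: true
    metadata:
      - ApexClass
      - ApexTrigger
    field: _content               # analyze raw file content
    operator: regex
    value: "[a-zA-Z0-9]{15}"      # 15-character Salesforce ID shape
    severity: error
    message: "Avoid hardcoded record IDs; resolve them at runtime instead"
```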

    Understanding Results

    The compliance check provides detailed violation reports with multiple output formats:

    Console Output

    Violation Details

    Each violation includes:

• File Path: Exact location of the violation

• Line Number: Specific line where the issue occurs (when applicable)

• Rule Name: Which rule was violated

• Severity: Error, Warning, or Info level

• Message: Descriptive explanation and remediation guidance

• Actual Value: The found value that triggered the violation

• Expected Value: What the rule expected to find

    Integration with CI/CD

Integration is currently limited to GitHub. The command requires GITHUB_APP_PRIVATE_KEY and GITHUB_APP_ID to be set as environment variables for results to be reported as GitHub checks.

    When integrating compliance checking in your CI/CD pipeline:

    1. Enforce Compliance Standards:

    2. Generate Reports:

    3. GitHub Actions Integration:
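A sketch of a workflow step combining the flags shown elsewhere in this documentation (--fail-on compliance to enforce, an output format for reporting) with the GitHub App variables named above:

```yaml
- name: Compliance Check
  run: |
    sfp project:analyze --fail-on compliance --output-format github
  env:
    GITHUB_APP_PRIVATE_KEY: ${{ secrets.GITHUB_APP_PRIVATE_KEY }}
    GITHUB_APP_ID: ${{ secrets.GITHUB_APP_ID }}
```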

    Scoping Compliance Checks

    Use the same scoping options as other analysis commands:

    By Package

    By Domain

    By Source Path

    By Changed Files

    Analyze only specific changed files:
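For example, with the --changed-files flag (file paths are illustrative):

```bash
sfp project:analyze --changed-files "src/classes/AccountService.cls,src/classes/AccountServiceTest.cls"
```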

    In GitHub Actions PR context, the analyzer automatically detects and analyzes only changed files when no package, domain, or source path filters are specified.

    Output Formats

    The compliance checker supports multiple output formats:

    • Console: Human-readable terminal output with color coding

    • Markdown: Detailed reports suitable for documentation

    • JSON: Machine-readable format for integration with other tools

    • GitHub: Special format for GitHub Checks API integration

    Common Compliance Scenarios

    API Version Management

    Ensure all components use recent API versions:
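A minimal sketch of such a rule, assuming the rules file structure implied by the options and operators tables above (the rule id is illustrative):

```yaml
rules:
  - id: api-version-minimum        # rule id and top-level key are assumptions
    enabled: true
    metadata:
      - ApexClass
      - ApexTrigger
    field: apiVersion
    operator: greater_or_equal
    value: 59.0
    severity: warning
    message: "Components should use API version 59.0 or later"
```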

    Security Hardening

    Prevent security vulnerabilities:
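For example, enabling the built-in security rules from the table above (all built-in rules ship disabled); the top-level key is an assumption:

```yaml
rules:
  - id: no-hardcoded-ids    # built-in rule, disabled by default
    enabled: true
  - id: no-hardcoded-urls   # built-in rule, disabled by default
    enabled: true
```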

    Profile Security

    Restrict dangerous permissions:
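Similarly, a sketch enabling the built-in profile and permission set guards:

```yaml
rules:
  - id: profile-no-modify-all
    enabled: true
  - id: permissionset-view-all-data
    enabled: true
  - id: permissionset-modify-all-data
    enabled: true
```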

    Logging and Debugging

    Use different log levels to control output verbosity:
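For example, using the same --loglevel flag as the other analyzers:

```bash
sfp project:analyze --loglevel debug
```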

    Debug logging shows:

    • Rules being processed

    • Files being scanned

    • Field extraction details

    • Rule evaluation results

    Troubleshooting

    Rule Not Triggering

    1. Verify Field Path: Ensure the field path matches the XML structure

    2. Check Metadata Types: Confirm the rule applies to the correct metadata types

    3. Validate Operators: Ensure the operator logic matches your expectations

4. Enable Debug Logging: Use --loglevel debug to see detailed evaluation

    False Positives

    1. Refine Rule Conditions: Adjust operators or values to be more specific

    2. Scope Rules Appropriately: Use metadata type filters to target specific components

3. Create Exceptions: Disable rules for specific packages or paths when needed (see the sketch after this list)
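For instance, a rules file might silence a noisy built-in rule and narrow a custom rule's scope; this is a sketch reusing the no-hardcoded-credentials example from the Security Hardening scenario:

    # Sketch: quiet a built-in rule and target a custom rule more narrowly
    extends: default
    rules:
      - id: no-hardcoded-urls
        enabled: false            # disabled after false positives on doc links

      - id: no-hardcoded-credentials
        enabled: true
        metadata: [ApexClass]     # scoped to Apex classes only
        field: _content
        operator: regex
        value: 'password\s*=\s*["''][^"'']+["'']'
        severity: error
        message: Remove hardcoded credentials - use Named Credentials or Custom Settings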

    Performance Issues

    1. Scope Analysis: Use package, domain, or path filters to reduce scope

2. Optimize Rules: Complex regex patterns can slow down content analysis (see the example after this list)

    3. Exclude Large Files: Consider excluding generated or vendor files
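As an illustration, a broad ID pattern can be anchored to a known key prefix (Salesforce IDs start with a three-character object prefix, e.g. 001 for Account), which cuts both false matches and scan time; the rule id below is illustrative:

    # Sketch: match only Account IDs (001 prefix) instead of any
    # 15/18-character alphanumeric run
    - id: no-hardcoded-account-ids
      enabled: true
      metadata: [ApexClass]
      field: _content
      operator: regex
      value: '\b001[a-zA-Z0-9]{12}([a-zA-Z0-9]{3})?\b'
      severity: warning
      message: Hardcoded Account ID found - use Custom Metadata or a SOQL query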

    Configuration Examples

Organization Standards

    extends: default
    rules:
      # Security Rules
      - id: no-hardcoded-ids
        enabled: true
      - id: no-hardcoded-urls
        enabled: true

      # API Version Standards
      - id: minimum-api-version
        name: Enforce API Version 59+
        enabled: true
        metadata: [ApexClass, ApexTrigger]
        field: ApexClass.apiVersion
        operator: greater_or_equal
        value: 59.0
        severity: warning

      # Profile Security
      - id: profile-no-modify-all
        enabled: true
        severity: error

Documentation Standards

    extends: default
    rules:
      # Field Documentation
      - id: field-missing-description
        enabled: true
        severity: warning

      - id: field-missing-help-text
        enabled: true
        severity: warning

      # Object Documentation
      - id: object-missing-description
        enabled: true
        severity: warning

      # Validation Rules
      - id: validation-rule-missing-description
        enabled: true
        severity: info
        message: Validation rules should include a description explaining their purpose
