Depending on your edition, please select the guide from the links below:
Install sfp community edition
Install sfp-pro
The following list of software is the minimum required to get started with sfp. We assume you are familiar with the Salesforce CLI and are comfortable using VS Code. To keep the material simple, we also assume you already have access to a Salesforce sandbox in which you can build and install artifacts.
sfp provides various features to alter the installation behaviour of a package. These behaviours have to be applied as additional properties of a package at build time. The following section details each of the parameters that are available.
sfp is a purpose-built CLI tool for modular Salesforce development and release management. sfp streamlines and automates the build, test, and deployment processes of Salesforce metadata, code, and data. It extends sf cli functionalities, focusing on artifact-driven development to support #flxbl Salesforce project development.
sfp is available in two editions:
sfp (Community Edition): Open-source CLI with core build, deploy, and orchestration capabilities
sfp-pro: Enterprise edition with additional features including a centralized server, environment management, authentication services, and team collaboration tools
Built with codified process: sfp is derived from extensive experience in modular Salesforce implementations. By embracing the #FLXBL framework, it streamlines the process of creating a well-architected, composable Salesforce Org, eliminating time-consuming efforts usually spent on re-inventing fundamental processes.
Artifact-Centric Approach: sfp packages Salesforce code and metadata into artifacts with deployment details, ensuring consistent deployments and simplified version management across environments.
Best-in-Class Mono Repo Support: Offers robust support for mono repositories, facilitating streamlined development, integration, and collaboration.
sfp incorporates a suite of commands to aid in your end-to-end development cycle for Salesforce. Starting with the core commands, you can perform basic workflows to build and deploy artifacts across environments through the command line. As you get comfortable with the core commands, you can utilize more advanced commands and flags in your CI/CD platform to drive a complete release process, leveraging release definitions, changelogs, metrics, and more.
sfp is constantly evolving and driven by the passionate community that has embraced our work methods. Over the years, we have introduced utility commands to solve pain points specific to the Salesforce Platform. The commands have been successfully tested and used on large-scale enterprise implementations.
Below is a high-level snapshot of the main command categories in sfp.
sfp docker images are published from the flxbl-io Github packages registry at the link provided below
One can utilize the flxbl-io sfp images with a docker pull:
docker pull ghcr.io/flxbl-io/sfp:latest
You can also pin to a specific version of the docker image by using the versions published in the registry. To preview the latest images, visit the release candidate page and update your container image reference. For example:
default:
image: ghcr.io/flxbl-io/sfp-rc:<version-number>
or
image: ghcr.io/flxbl-io/sfp-rc:<sha>
sfp is a natural fit for organisations that utilize Salesforce in a large enterprise setting, as it is purpose-built to deal with modular Salesforce development. Often these organisations have teams dedicated to a particular business function; think of a team that works on features related to billing, while another team works on features related to service.
The diagram illustrates the method of organizing packages into specific categories for various business units. These categories are referred to as 'domains' within the context of sfp cli. Each domain can contain further domains (sub-domains), and each domain consists of one or more packages.
sfp cli utilizes a 'Release Config' to organise packages into domains. You can read more about creating a release config in the next section.
Packages are containers used to group related metadata together. A package would contain components such as objects, fields, apex, flows and more, allowing these elements to be easily installed, upgraded and managed as a single unit. Packages in the context of sfp are not limited to second-generation packaging (2GP); sfp supports various types of packages, which you can read about in the following sections.
The Salesforce CLI is a command-line interface that simplifies development and build automation when working with your Salesforce org. There are numerous commands that the sf cli provides natively that are beyond the scope of this site and can be found on the official Salesforce Documentation Site.
From a build, test, and deployment perspective, the following diagram depicts the bare minimum commands necessary to get up and running in setting up your sf project, retrieving and deploying code to the target environments.
An artifact is a key concept within sfp. An artifact is a point-in-time snapshot of a version of a package, as mentioned in sfdx-project.json. The snapshot contains the source code of a package directory, additional metadata information regarding the particular version, a changelog and other details. An artifact for a 2GP package would also contain additional details specific to second-generation packaging.
Artifacts provide an abstraction over version control, as they detach version control from the point of releasing into a Salesforce org. Almost all commands in sfp operate on an artifact or generate an artifact.
In the context of sfp, an artifact represents a more enhanced concept compared to a Salesforce package. While it is based on a Salesforce package or specific package types introduced by sfp, an artifact in sfp includes additional attributes and metadata that describe the package version, dependencies, installation behavior, and other context-specific information relevant to the CI/CD process. Artifacts in sfp are designed to be more than just a bundle of code and metadata; they encapsulate the package along with its CI/CD lifecycle information, making them more aligned with DevOps practices. sfp's artifacts are built to be compatible with npm-supported package registries; most CI/CD providers provide an npm compatible registry to host these packages/artifacts, such as GitHub Package Manager.
All the attributes that are configured additionally on a package in your project are recorded within the artifact. When the artifact is being installed to the target org, the following steps are undertaken in sequence, as seen in the diagram below. Each of the configured attributes belongs to one of the categories in the sequence and is executed accordingly.
sfp cli features an intuitive command to build artifacts for all the packages in your project directory. The 'build' command automatically detects the type of each package and builds an artifact individually for each one.
By default, the behaviour of sfp's build command is to build a new version of every package in your project directory and create its associated artifact.
sfp's build command is also equipped with the ability to selectively build only packages that have changed. Read more on how sfp determines whether a package should be built in the subsequent sections.
sfp provides mechanisms to control aspects of the build command; the following section details how one can configure these mechanisms in your sfdx-project.json.
An artifact can also be built individually for a package; sfp will determine the package type, and the corresponding APIs will be invoked.



In Salesforce, a package is a container that groups together metadata and code in order to facilitate the deployment and distribution of customizations and apps. Salesforce supports different types of packages, such as:
Unlocked Packages: These are more modular and flexible than traditional managed packages, allowing developers to group and release customizations independently. They are version-controlled, upgradeable, and can include dependencies on other packages.
Org-Dependent Unlocked Packages: Similar to unlocked packages but with a dependency on the metadata of the target org, making them less portable but useful for specific org customizations.
Managed Packages: Managed packages are a type of Salesforce package primarily used by ISVs (Independent Software Vendors) for distributing and selling applications on the Salesforce AppExchange. They are fully encapsulated, which means the underlying code and metadata are not accessible or editable by the installing organization. Managed packages support versioning, dependency management, and can enforce licensing. They are ideal for creating applications that need to be securely distributed and updated across multiple Salesforce orgs without exposing proprietary code.
sfp augments the above formal Salesforce package types with additional package types such as the ones below.
Source Packages: These are not formal packages in Salesforce but refer to a collection of metadata and code retrieved from a Salesforce org or version control that can be deployed to an org but aren't versioned or managed as a single entity.
Diff Packages: These are not a formal type of Salesforce package but refer to packages created by determining the differences (diff) between two sets of metadata, often used for deploying specific changes.
In the context of sfp, an artifact represents a more enhanced concept compared to a Salesforce package. While it is based on a Salesforce package (of any type mentioned above), an artifact in sfp includes additional attributes and metadata that describe the package version, dependencies, installation behavior, and other context-specific information relevant to the CI/CD process. Artifacts in sfp are designed to be more than just a bundle of code and metadata; they encapsulate the package along with its CI/CD lifecycle information, making them more aligned with DevOps practices.
Key differences between Salesforce packages and sfp artifacts include:
Versioning and Dependencies: While Salesforce packages support versioning, sfp artifacts enrich this with detailed dependency tracking, ensuring that the CI/CD pipeline respects the order of package installations based on dependencies.
Installation Behavior: Artifacts in sfp carry additional metadata that defines custom installation behaviors, such as pre- and post-installation scripts or conditional installation steps, which are not inherently a part of Salesforce packages.
CI/CD Integration: Artifacts in sfp are specifically designed to fit into a CI/CD pipeline, with support for storage in an artifact registry, version tracking, and release management that are essential for automated deployments but are outside the scope of Salesforce packages themselves.
Support for Multiple Package Types: sfp accommodates various Salesforce package types with streamlined commands, enabling modular development, independent versioning, and flexible deployment strategies.
Orchestrate Across Entire Lifecycle: sfp provides an extensive set of functionality across the entire lifecycle of your Salesforce development.
End-to-End Observability: sfp is built with comprehensive metrics emitted on every command, providing unparalleled visibility into your ALM process.
Centralized Server (sfp-pro): A backend server that provides environment management, authentication, webhooks, and API access for team-based workflows.
publish
releasedefinition
auth
impact
artifacts
changelog
environment
repo
metrics
org
flow
pool
webhook
repository
review-envs
key-value
doc-store
quickbuild
validate
init
profile
build
pool
start / stop / status
apextests
install
release
health / logs / scale

dependency
sfp build -v <devhub_name> --branch <value>
sfp build -v <devhub_name> --branch <value> --diffcheck
// Sample sfdx project json
{
"packageDirectories": [
{
"path": "./src-env-specific-pre",
"package": "src-env-specific-pre",
"versionNumber": "1.0.0.0",
},
{
"path": "./src/frameworks/feature-mgmt",
"package": "feature-mgmt",
"versionNumber": "1
}
]
}
// Build a single package
sfp build -v devhub --branch=main -p feature-mgmt
alwaysDeploy
boolean
Deploys an artifact of the package, even if it's installed already in the org. The artifact has to be present in the artifact directory for this particular option to work
unlocked
org-dependent unlocked
source
diff
To ensure that an artifact of a package is always deployed, irrespective of whether the same version of the artifact was previously deployed to the org, you can utilize alwaysDeploy as a property added to your package.
{
"packageDirectories": [
{
"path": "src-env-specific-pre",
"package": "env-specific-pre",
"versionDescription": "Environment related settings",
"versionNumber": "4.7.0.NEXT",
"alwaysDeploy":true
},
...
]
}
skipCoverageValidation
boolean
Skip apex test coverage validation of a package
unlocked
org-dependent unlocked
source
diff
sfp during validation checks the apex test coverage of a package depending on the package type. This is beneficial so that you don't run into any coverage issues while deploying into higher environments or building an unlocked package.
However, there could be situations where the test coverage calculation is flaky; sfp provides you with an option to turn the coverage validation off.
// Demonstrating how to use skipCoverageValidation
{
"packageDirectories": [
{
"path": "core-crm",
"package": "core-crm",
"versionDescription": "Package containing core schema and classes",
"versionNumber": "4.7.0.NEXT",
"skipCoverageValidation": true
},
...
]
}
A domain is defined by a release configuration. In order to define a domain, you need to create a new release config YAML file in your repository.
A simple release config can be defined as shown below
// Sample release config <business_domain.yaml>
releaseName: <business_domain> # --> The name of the domain
pool: <sandbox/scratch org pools>
excludeAllPackageDependencies: true
includeOnlyArtifacts: # --> Insert packages
- <pkg1>
releasedefinitionProperties:
promotePackagesBeforeDeploymentToOrg: prod
The resulting file could be stored in your repository and utilized by commands such as build, release, etc.
skipTesting
boolean
Skip trigger of apex tests while validating or deploying
unlocked
org-dependent unlocked
source
diff
One can utilize this attribute on a package definition in sfdx-project.json to skip testing while a change is validated. Use this option with caution, as the package, when it gets deployed (depending on the package behaviour), might still trigger tests while being deployed to production.
// Demonstrating how to use skipTesting
{
"packageDirectories": [
{
"path": "core-crm",
"package": "core-crm",
"versionDescription": "Package containing core schema and classes",
"versionNumber": "4.7.0.NEXT",
"skipTesting": true
},
...
]
}
ignoreOnStage
array
Ignore a package from being processed by a particular stage
unlocked
org-dependent unlocked
source
diff
Using the ignoreOnStage: ["build"] property on a package causes the particular package to be skipped by the build command. Similarly, you can use ignoreOnStage: ["quickbuild"] to skip packages in the quickbuild stage.
{
"packageDirectories": [
{
"path": "./src-env-specific-pre",
"package": "src-env-specific-pre",
"versionNumber": "1.0.0.NEXT",
"ignoreOnStage": [
"build"
]
},
{
"path": "./src/frameworks/feature-mgmt",
"package": "feature-mgmt",
"versionNumber": "1.0.0.NEXT"
}
]
}
Sometimes, due to certain platform errors, some metadata components need to be ignored during build (especially for unlocked packages) while still being required for other commands like validate. sfp offers an easy mechanism that allows you to switch .forceignore files depending on the operation.
Add this entry to your sfdx-project.json and as in the example below, mention the path to different files that need to be used for different stages
{
"packageDirectories": [
{
"path": "core",
"package": "core-package",
"versionName": "Core 1.0",
"versionNumber": "1.0.0.NEXT",
"default": true,
}
],
"plugins": {
"sfp": {
"ignoreFiles": {
"build": "forceignores/.buildignore"
"validate": ".forceignore"
}
}
}
}
Given a directory of artifacts and a target org, the deploy command will deploy the artifacts to the target org according to the sequence defined in the project configuration file.
// Command to install set of artifacts to devhub
sfp install -u devhub --artifactdir artifacts
The install command runs through the following steps:
Reads all the sfp artifacts provided through the artifact directory
Unzips the artifacts and finds the latest sfdx-project.json.ori to determine the deployment order; if this particular file is not found, it utilizes the sfdx-project.json in the repo
Read the installed packages in the target org utilizing the records in SfpowerscriptsArtifacts2__c
Install each artifact from the provided artifact directory to the target org based on the deployment order respecting the attributes configured for each package
Assume you have a domain 'sales' as defined by the release config sales.yaml shown in the example below.
# release-config for sales
releaseName: sales
pool: sales-pool
excludeAllPackageDependencies: true
includeOnlyArtifacts:
- sales-ui
- sales-channels
releasedefinitionProperties:
skipIfAlreadyInstalled: true
usePackageDiffDeploymentMode: true
promotePackagesBeforeDeploymentToOrg: prod
changelog:
workItemFilters:
- (AKG|GIK)-[0-9]{2,5}
workItemUrl: https://example.atlassian.net/browse
limit: 30
In order to build the artifacts of the packages defined by the above release config, you would use the build command with the flags as described here.
sfp build --releaseconfig sales.yaml -v devhub --branch main
If you require only to build packages that have changed from the last published packages, you would add the additional diffcheck flag.
sfp build --releaseconfig sales.yaml -v devhub --branch main --diffcheck
diffcheck will work accurately only if the build command is able to access the latest tags in the repository. In certain CI systems, if the command is operated on a repository where only the head commit is checked out, diffcheck will result in building artifacts for all packages within the domain.
sfp is based on the Open CLI Framework (oclif) for building a command line interface (CLI) in Node.js. Instead of being a typical Salesforce CLI plugin, sfp is standalone and leverages the same core libraries and APIs as the @salesforce/cli. sfp releases are independently managed, and as the core npm libraries are stable, we will update them as needed to ensure no breaking changes are introduced.
The diagram below depicts the basic flow of the development and test process, building artifacts, and deploying to target environments.
Once you have mastered the basic workflow, you can progress to publishing artifacts to an NPM repository that will store immutable versions of the metadata and code used to drive the release of your packages across Salesforce environments.
The list below is a curated list of core sf cli and Salesforce DX developer guides for your reference.

Artifacts built by sfp follow a naming convention of the form <name_of_the_package>_sfpowerscripts_artifact_<Major>.<Minor>.<Patch>-<BuildNumber>. One can use any of the npm commands to interact with sfp artifacts.
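Because sfp artifacts are npm-compatible, ordinary npm commands can be used to inspect them. A minimal sketch, assuming a hypothetical @myorg scope and a core-crm package already published to your registry:
# List the published versions of an sfp artifact (scope and package name are hypothetical)
npm view @myorg/core-crm_sfpowerscripts_artifact versions
# Download a specific artifact version as a tarball
npm pack @myorg/core-crm_sfpowerscripts_artifact@1.0.0-1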

In the earlier section, we looked at how to configure the project directory for sfp. Now let's walk through some core commands.
The build command will generate a zipped artifact file for each package you have defined in the sfdx-project.json file. The artifact file will contain the metadata and the source code at the point of build creation for you to use to install.
Open up a terminal within your Salesforce project directory and enter the following command:
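A minimal sketch of the build invocation, mirroring the form shown in the command summary later in this guide (substitute your own Dev Hub alias):
sfp build -v <DevHubAlias/DevHubUsername>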
You will see logs with details of your package creation; for instance, here is a sample output:
A new "artifacts" folder will be generated within your source project containing a zipped artifact file for each package defined in your sfdx-project.json file.
For example, the artifact files follow the naming convention below, with "_sfpowerscripts_artifact_" between the package name and version number.
package-name_sfpowerscripts_artifact_1.0.0-1.zip
Ensure you have authenticated your Salesforce CLI to an org first. If you haven't, please refer to the Salesforce CLI authentication instructions.
Once you have authenticated to the sandbox, you can execute the installation command as below.
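A minimal sketch of the install invocation, mirroring the form shown in the command summary later in this guide (the target org alias and artifact directory are placeholders):
sfp install -o <TargetOrgAlias/TargetOrgUsername> --artifactdir artifacts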
Navigate to your target org and confirm that the package is now installed with the expected changes from your artifact. In this example above, a new custom field has been added to the Account Standard Object.
Org-dependent unlocked packages, a variation of unlocked packages, allow you to create packages that depend on unpackaged metadata in the target org. Org-dependent packages are very useful in the context of orgs that have lots of metadata and are struggling with understanding dependencies while building an unlocked package.
Org-dependent packages significantly enhance the efficiency of #flxbl projects that are already on scratch-org-based development. By allowing installation on a clean slate, these packages validate dependencies upfront, thereby reducing the additional validation time often required by unlocked packages.
Org-dependent unlocked packages bypass the test coverage requirements, enabling installation in production without standard validation. This differs significantly from metadata deployments, where each Apex class deployed must meet a 75% coverage threshold or rely on the org's overall test coverage. While beneficial for large, established orgs, this approach should be used cautiously.
To address this, sfp incorporates a default test coverage for org-dependent unlocked packages during the validation process. To disable this test coverage check during validation, additional attributes must be added to the package directory in the sfdx-project.json file.
Good Work! If you made it past the getting started guide with minimal errors and questions, you are well on your way to introduce sfp into your Salesforce Delivery workflow.
Let's summarize what you have done:
Setup pre-requisite software on your workstation and have access to a Salesforce Org.
Installed the latest sfp cli.
Configured your source project and added additional properties required for sfp cli to generate artifacts.
Build artifact(s) locally to be used to deploy.
Installed artifact(s) to target org.
This is just the tip of the iceberg for the full features sfp can provide for you and your team. Please continue to read further and experiment.
For any comments or recommendations on sfp, please join our community. If you are adventurous, contribute!
assignPermSetsPreDeployment
array
Apply permsets before installing an artifact to the deployment user
unlocked
org-dependent unlocked
source
diff
assignPermSetsPostDeployment
array
Apply permsets after installing an artifact to the deployment user
unlocked
org-dependent unlocked
source
diff
Occasionally you might need to assign a permission set to the deployment user to successfully install the package, run apex tests or functional tests. sfp provides you an easy mechanism to assign permission sets either before or after installation of an artifact. assignPermSetsPreDeployment assumes the permission sets are already deployed on the target org and proceeds to assign these permission sets to the deployment user.
assignPermSetsPostDeployment can be used to assign permission sets that are introduced by the artifact to the target org, for testing or any other automation usage.
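A minimal sketch of a post-deployment assignment, following the same array format as the pre-deployment sample later in this guide; the package and permission set names below are hypothetical:
// Hypothetical package descriptor assigning a permset delivered by the artifact
{
"path": "src/sales",
"package": "sales",
"versionNumber": "1.0.0.NEXT",
"assignPermSetsPostDeployment": [
"Sales_User"
]
}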
Have you ever encountered this error when deploying an updated Entitlement Process to your org?
It's important to note that Salesforce prevents the deployment of an Entitlement Process that is currently in use, irrespective of its activation status in the org. This limitation holds true even for inactive processes, potentially impacting deployment strategies.
To create a new version of an Entitlement Process, navigate to the Entitlement Process Detail page in Salesforce and perform the creation. If you then use sf cli to retrieve this new version and attempt to deploy it into another org, you're likely to encounter errors. Deploying it as a new version can bypass these issues, but this requires enabling versioning in the target org first.
The culprit behind this is an attribute in the Entitlement Process metadata file: versionMaster.
According to the Salesforce documentation, versionMaster 'identifies the sequence of versions to which this entitlement process belongs, and it can be any value as long as it is identical among all versions of the entitlement process.' An important factor here: versionMaster is org-specific. When you deploy a completely new Entitlement Process to an org with a randomly defined versionMaster, Salesforce will generate an ID for the Entitlement Process and map it to this specific versionMaster.
Deploying updated Entitlement Processes from one Salesforce org to another can often lead to deployment errors due to discrepancies in the versionMaster attribute. sfp's entitlement helper enables you to automatically align the versionMaster by pulling its value from the target org and updating the deploying metadata file accordingly. This ensures that your deployment process is consistent and error-free.
Enable Versioning: First and foremost, activate versioning in the target org to manage various versions of the Entitlement Process.
Create or Retrieve Metadata File:
Create a new version of the Entitlement Process as a metadata file, which can be done via the Salesforce UI and retrieved with the Salesforce CLI (sf cli).
Ensure Metadata Accuracy
Automation around the entitlement helper can be disabled globally by using these attributes in your sfdx-project.json.
testSynchronous
boolean
Ensure all tests of the package are triggered synchronously
unlocked
org-dependent unlocked
source
diff
sfp by default attempts to trigger all the apex tests in a package in asynchronous mode during validation. This is to ensure that you have highly performant apex test classes which provide you fast feedback.
If you have inherited an existing code base or your unit tests have a lot of DML statements, you may encounter test failures when the tests in a package are triggered asynchronously. You can instruct sfp to trigger the tests of the package in synchronous mode by setting 'testSynchronous' to true.
// Command to deploy set of artifacts to devhub
sfp install -u devhub --artifactdir artifacts --skipifalreadyinstalled
By using the skipifalreadyinstalled option with the install command, you can prevent the reinstallation of an artifact that is already present in the target organization.
The --baselineorg parameter allows you to specify the alias or username of an org against which to check whether the incoming package versions have already been installed, and to form a deployment plan. This overrides the default behaviour, which is to compare against the deployment target org. This optional feature helps ensure every org in the path to production is updated with the same installation.
skipDeployOnOrgs
array
Skips installation of an artifact on a target org
org-dependent unlocked
unlocked
data
source
When sfp cli encounters the attribute skipDeployOnOrgs on a package, the generated artifact is checked during installation against the alias or the username passed to the installation command. If the username or the alias matches, the artifact installation is skipped.
// Demonstrating how to use skipDeployOnOrgs
{
"packageDirectories": [
{
"path": "core-crm",
"package": "core-crm",
"versionDescription": "Package containing core schema and classes",
"versionNumber": "4.7.0.NEXT",
"skipDeployOnOrgs":[
"qa",
"[email protected]
]
},
...
]
}
In the above example, if the alias passed to the installation command is qa, the artifact for core-crm will get skipped during installation. The same can also be applied for the username of an org.
reconcileProfiles
boolean
Reconcile profiles to only apply permissions to objects, fields and features that are present in the target org.
source
diff
In order to prevent deployment errors to target Salesforce orgs, setting reconcileProfiles to true ensures extra steps are taken so that the profile metadata is cleansed of any missing attributes prior to deployment.
During the reconcilement process, you can provide the following flags to the command:
Folder - path to the folder which contains the profiles to be reconciled; if the project contains multiple package directories, please provide a comma-separated list
Profile List - list of profiles to be reconciled. If omitted, all the profile components will be reconciled.
Source Only - set this flag to reconcile profiles only against components available in the project. Ignored permissions can be configured in the sfdx-project.json file.
Destination Folder - the destination folder for reconciled profiles; if omitted, existing profiles will be reconciled and rewritten in their current location
For more details, refer to the sfp profile reconcile command documentation.
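A hypothetical invocation sketch assembled from the flags described above; the long-form flag names are assumptions, so confirm them with sfp profile reconcile --help before use:
# Reconcile profiles in a folder against project components only (flag names assumed)
sfp profile reconcile --folder src/access-mgmt --sourceonly --destfolder reconciled-profiles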
enableFlowActivation
boolean
Enable Flows automatically in Production
source
diff
unlocked
While installing a source/diff package, flows by default are deployed as 'Inactive' in the production org. One can deploy a flow as 'Active' using the steps mentioned in the Salesforce documentation; however, this requires the flow to meet test coverage requirements.
Also, making a flow inactive is convoluted; see the detailed article provided by Gearset.
sfp has automation built in that attempts to align the status of a particular flow version as kept inside the package. It automatically activates the latest version of a Flow in the target org (assuming the package has the latest version). If an earlier version of the package is deployed, it skips the activation.
At the same time, if the package contains a Flow with status Obsolete/InvalidDraft/Draft, all versions of the flow would be rendered inactive.
Availability: sfp-pro ✅ | sfp (community) ❌
From: November 25
To ensure a package is always included during validation when its domain is impacted, add alwaysSync as a property to your package descriptor:
When framework-core is changed and validation is triggered, framework-config will automatically be included because it belongs to the same domain and has alwaysSync: true.
This also works across domains when using dependencyOn in release configs. If Domain B has a dependencyOn referencing a package from Domain A, and that package changes, all alwaysSync packages in Domain A will be included.
sfp is a purpose-built cli tool used predominantly in modular Salesforce projects.
A project utilizing sfp implements the following concepts.
In an sfp-powered Salesforce project, "Domains" represent distinct business capabilities. A project can encompass multiple domains, and each domain may include various sub-domains. Domains are not explicitly declared within sfp but are conceptually organized through release configs.
sfp cli supports operations on various types of packages within your repository. A short summary comparing the different package types is provided below.
This section details how sfp analyzes and classifies different package types using the information in sfdx-project.json.
Unlocked Packages are identified if a matching alias with a package version ID is found and verified through the DevHub. For example, the package named "Expense-Manager-Util" is found to be an Unlocked package upon correlation with its alias "Expense Manager - Util" and subsequent verification.
Source Packages are assumed if no matching alias is found in packageAliases. These packages are typically used for source code that is not meant to be packaged and released as a managed or unlocked package.
To fully leverage the capabilities of sfp, a few additional steps need to be configured in your Salesforce orgs. Please follow the steps below.
To enable modular package development, the following configurations need to be turned on in order to create scratch orgs and unlocked packages.
Enable Dev Hub in your Salesforce org so you can create and manage scratch orgs and second-generation packages. Scratch orgs are disposable Salesforce orgs to support development and testing.
The validate command helps you to test (deployability, apex tests, coverage) a change made to your configuration/code against a target org. This command is typically triggered as part of your Pull Request (PR) or merge process, to ensure the correctness of configuration/code before it is merged into your main branch.
sfp validates a change by deploying the changed packages into the target org. This is different from the 'check only' deployment in other CI/CD solutions.
validate can either utilise a scratch org from a tagged pool prepared earlier, or use a target org for this purpose.
validate pool / validate org command runs the following checks with the options to enable additional features such as dependency and impact analysis:
sfp's build commands process the packages in the order mentioned in your sfdx-project.json. The commands also read your dependencies property, and when triggered, will wait until all of a package's dependencies are resolved before triggering the equivalent package creation command (depending on the type of the package) and generating an artifact in the process.
sfp's build command is equipped with a diffcheck functionality, which is enabled when one utilizes the diffcheck flag. A comparison (using git diff) is made between the latest source code and the previous version of the package published by the publish command. If any difference is detected in the package directory, package version or scratch org definition file (applies to unlocked packages only), then the package will be created; otherwise, it is skipped. For example, consider the following packages in sfdx-project.json along with their dependencies:
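A minimal hypothetical manifest to illustrate, where pkg-B declares a dependency on pkg-A (the names are illustrative, not from the original):
// Hypothetical example: pkg-B depends on pkg-A
{
"packageDirectories": [
{ "path": "src/pkg-A", "package": "pkg-A", "versionNumber": "1.0.0.NEXT" },
{
"path": "src/pkg-B",
"package": "pkg-B",
"versionNumber": "1.0.0.NEXT",
"dependencies": [{ "package": "pkg-A", "versionNumber": "1.0.0.LATEST" }]
}
]
}
With diffcheck enabled, only the packages whose directories changed since the last published tag would be rebuilt.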
Scenario 1: Build All
Trigger creation of artifact for package A
Artifacts to be built can be limited by various mechanisms. This section deals with various techniques to limit the artifacts being built.
Artifacts to be built can be restricted during a build process in sfp by utilizing specific configurations, such as a release config. Consider the example provided earlier.
You can pass the path of the config file to the build command to limit the artifacts being built, as in the sample below.
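A minimal sketch, reusing the release-config form of the build command shown earlier in this guide:
# Build only the artifacts listed in a release config
sfp build --releaseconfig <path-to-release-config.yaml> -v devhub --branch main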
Projects that utilise sfp predominantly follow a mono-repo structure similar to the picture shown above. Each repository has a "src" folder that holds one or more packages that map to your sfdx-project.json file.
Different folders in each of the structure are explained as below:
core-crm: A folder to house the core model of your org, which is shared with all other domains.
frameworks: This folder houses multiple packages which are basically utilities/technical frameworks such as Triggers, Logging and Error Handling, Dependency Injection etc.
The Publish command pushes artifacts created by the build command to a npm registry and also provides you functionality to tag a version of the artifact in your git repository.
To publish packages to your private registry, you'll need to configure your credentials accordingly. Follow the guidelines provided by your registry's service to ensure a smooth setup.
Locate Documentation: Jump into your private registry provider's documentation or help center.
Salesforce inherently does not support the deployment of picklist values as part of unlocked package upgrades. This limitation has led to the need for workarounds, traditionally solved by either replicating the picklist in the source package or manually adding the changes in the target organization. Both approaches add a burden of extra maintenance work. This issue has been documented and recognised as a known problem.
To ensure picklist consistency between local environments and the target Salesforce org, the following pre-deployment steps outline the process for managing picklists within unlocked packages:
sfp cli optimally deploys artifacts to the target organisation by reducing the time spent on running Apex tests where possible. This section explains how this optimisation works for different package types.
A package (synonymous with modules) is a container that groups related metadata together. A package would contain components such as objects, fields, apex, flows and more, allowing these elements to be easily installed, upgraded and managed as a single unit. A package is defined as a directory in your project repository and is defined by an entry in sfdx-project.json
An artifact represents a versioned snapshot of a package at a specific point in time. It includes the source code from the package directory (as specified in sfdx-project.json), along with metadata about the version, change logs, and other relevant details. Artifacts are the deployable units in the sfp framework, ensuring consistency and traceability across the development lifecycle.
When sfp is integrated into a Salesforce project, it centers around the following key essential processes.
'Building' refers to the creation of an artifact from a package. Utilizing the build command, sfp facilitates the generation of artifacts for each package in your repository. This process encapsulates the package's source code and metadata into a versioned artifact ready for installation.
Publishing a domain involves pushing the artifacts generated by the build command into an artifact repository. This is the storage area from which the artifacts are fetched for releases, rollback, etc.
Releasing a domain involves promoting and installing a collection of artifacts to a higher org such as production, and generating an associated changelog for the domain. This process is driven by the release command along with a release definition.

sales: An example of a domain in your org. Under this particular domain, multiple packages that belong to the domain are included.
src-access-mgmt: This package is typically one of the packages that is deployed second to last in the deployment order and used to store profiles, permission sets, and permission set groups that are applied across the org. Permission Sets and Permission Set Groups particular to a domain should be in their respective package directory.
src-env-specific: An aliasified package which carries metadata for each particular stage (environment) of your path to production. Some examples include named credentials, remote site settings, web links, custom metadata, custom settings, etc.
src-temp: This folder is marked as the default folder in sfdx-project.json. This is the landing folder for all new metadata, and this particular folder doesn't get deployed anywhere other than a developer's scratch org. This place is utilized to decide where the new metadata should be placed.
src-ui: Should include page layouts, flexipages and Lightning/Classic apps unless we are sure these will only reference the components of a single domain package and its dependencies. In general, custom UI components such as LWC, Aura and Visualforce should be included in a relevant domain package.
runbooks: This folder stores markdown files required for each release and/or sandbox refresh to ensure all manual steps are accounted for and version controlled. As releases are completed to production, each release runbook can be archived, as the manual steps should typically no longer be required. Sandbox refresh runbooks should be managed according to the type of sandbox, depending on whether they have data or only contain metadata.
scripts: This optional folder is to store commonly used Apex or SOQL scripts that need to be version controlled and referenced by multiple team members.
src-env-specific should be added to .forceignore files and should not be deployed to a scratch org.


sfp --version
@flxbl-io/sfp/37.0.0 darwin-arm64 node-v20.3.1


// Org dependent unlocked package with additional attributes
{
"packageDirectories": [
{
"path": "src/core/core-crm",
"package": "core-crm",
"versionNumber": "4.7.0.NEXT",
"skipTesting": true,
"skipCoverageValidation": true
},
...
]
}
// Demonstrating how to use testSynchronous
{
"packageDirectories": [
{
"path": "core-crm",
"package": "core-crm",
"versionDescription": "Package containing core schema and classes",
"versionNumber": "4.7.0.NEXT",
"testSynchronous": true
},
...
]
}
// Command to deploy with baseline
sfp install -u qa \
--artifactdir artifacts \
--skipifalreadyinstalled --baselineorg devhub
// Demonstrating a package disabling flow activation
{
"packageDirectories": [
{
"path": "src/order-management",
"package": "order-management",
"versionNumber": "2.0.10.NEXT",
"enableFlowActivation" : false
}alwaysSync
boolean
During validation, automatically includes this package when any other package in the same domain is impacted. Useful for config/settings packages that must stay synchronized with their domain.
unlocked
org-dependent unlocked
source
data
Checks accuracy of metadata by deploying the metadata to an org
Triggers Apex Tests
Validates Apex Test Coverage of each package (default: 75%)
Toggle between different modes for validation:
Thorough (Default) - Comprehensive validation with full deployments and all tests
Individual - Validates changed packages individually, ideal for PRs
Automatically differentiates between:
Packages to synchronize (upstream changes)
Packages to validate (PR changes)
[optional] - AI-Assisted Error Analysis - Intelligent error analysis and fix suggestions
[optional] - Limit validation scope using release configurations (--releaseconfig)
[optional] - Validate dependencies between packages for changed components
[optional] - Disable diff check while validating (--diffcheck)
[optional] - Disable parallel testing of apex tests (--disableparalleltesting)
[optional] - Skip test execution entirely (--skipTesting)
[optional] - Execute custom validation scripts for setup/cleanup workflows
sfp features commands and functionality that help in dealing with the complexity of defining dependencies for unlocked packages. These functionalities are designed considering aspects of a #flxbl project, such as the predominant use of mono repositories in non-ISV scenarios.
Package dependencies are defined in the sfdx-project.json (Project Manifest). More information on defining package dependencies can be found in the Salesforce docs.
Let's unpack the concepts utilizing the above example:
There are two unlocked packages and one source package
Expense Manager - Util is an unlocked package in your DevHub, identifiable by 0H in the packageAlias
Expense Manager - another unlocked package, which is dependent on 'Expense Manager - Util', 'TriggerFramework' and 'External Apex Library - 1.0.0.4'
Expense Manager Org Config - a source package; let's assume it has some reports and dashboards
External Apex Library is an external dependency, it could be a managed package or any unlocked package released on a different Dev Hub. All external package dependencies must be defined with a 04t ID, which can be determined from the installation URL from AppExchange or by contacting your vendor.
Source packages / org-dependent unlocked packages / data packages have an implied dependency as defined by the order of installation; that is, it is assumed any dependent metadata is already available in the target org before the installation of components within the package.
The configFile flag in the build command is used to specify features and settings of the scratch org used to validate your unlocked package. It is optional; if not passed in, no definition file is assumed.
Typically, all packages in the same repo share the same scratch org definition. Therefore, you pass in the definition that you are using to build your scratch org pool and use the same to build your unlocked package.
sfp build --configFile <path-to-config-file> -v <devhub>
However, there is an option to use multiple definitions.
For example, suppose you have two packages, where package1 is dependent on SharedActivities and package2 is not. You would pass a scratch org definition file to package1 via scratchOrgDefFilePaths in sfdx-project.json. package2 would use the default definition file.
{
"packageDirectories": [
{
"path": "package1",
"default": true,
},
{
"path": "package2",
"default": false
}
],
"plugins": {
"sfp":{
"scratchOrgDefFilePaths":{
"enableMultiDefinitionFiles": true,
"packages": {
"package1":"scratchOrgDef/package1-def.json"
}
}
}
}
}
Navigate to your sfdx-project.json file and locate the packageDirectories property.
In the above example, the package directories are
force-app
unpackaged
utils
Add the following additional attributes to the sfdx-project.json and save.
package
versionNumber
That's the minimal configuration required to run sfp on a project.
Move on to the next chapter to execute sfp commands in this directory.
Registry credentials from your welcome email
Login to the Gitea registry:
Pull the desired image:
The version numbers can be found at https://source.flxbl.io/flxbl/-/packages/container/sfp-pro/
(Optional) Tag for your registry:
Use specific version tags in production
Cache images in your private registry for better performance
Implement proper access controls in your registry
Document image versions used in your pipelines
If you need to build the images yourself, you can access the source code from source.flxbl.io and follow these instructions:
Docker with BuildKit support
GitHub Personal Access Token with packages:read permissions
Node.js (for local development)
The following build arguments are supported:
NODE_MAJOR: Node.js major version (default: 22)
SFP_VERSION: Version of SFP Pro to build
GIT_COMMIT: Git commit hash for versioning
SF_COMMIT_ID: Salesforce commit ID
For issues or questions about Docker images, please contact flxbl support through your designated support channels.
The presence of an additional type attribute within a package directory will further inform sfp of the specific nature of the package; for instance, "data" for data packages or "diff" for diff packages.
The sfdx-project.json file outlines various specifications for Salesforce DX projects, including the definition and management of different types of Salesforce packages. From the sample provided, sfp analyzes the "package" attribute within each packageDirectories entry, correlating it with packageAliases to identify package IDs, thereby determining the package's type as 2GP (Second Generation Packaging).
Consider the following sfdx-project.json
The sfdx-project.json sample can be used to determine how sfp processes and categorizes packages within a Salesforce DX project. The determination process for each package type, based on the attributes defined in the packageDirectories, unfolds as follows:
Unlocked Packages: For a package to be identified as an Unlocked package, sfp looks for a correlation between the package name defined in packageDirectories and an alias within packageAliases. In the provided example, the "Expense-Manager-Util" under the util path is matched with its alias "Expense Manager - Util", subsequently confirmed through the DevHub with its package version ID, categorizing it as an Unlocked package.
Source Packages: If a package does not have a corresponding alias in packageAliases, it is treated as a Source package. These packages are typically utilized for organizing source code not intended for release. For instance, packages specified in paths like "exp-core-config" and "expense-manager-test-data" would default to Source packages if no matching aliases are found.
Specialized Package Types: The explicit declaration of a type attribute within a package directory allows for the differentiation into more specialized package types. For example, the specification of "type": "data" explicitly marks a package as a Data package, targeting specific use cases different from typical code packages.
Artifacts to be built can be limited only to certain packages. This could be very helpful, in case you want to build an artifact of only one package or a few while testing locally.
Packages can be ignored from the build command by utilising an additional descriptor on the package in project manifest (sfdx-project.json)
In the above scenario, src-env-specific-pre will be ignored when the build command is invoked.
# release-config for sales
releaseName: sales
pool: sales-pool
excludeAllPackageDependencies: true
includeOnlyArtifacts:
- sales-ui
- sales-channels
releasedefinitionProperties:
skipIfAlreadyInstalled: true
usePackageDiffDeploymentMode: true
promotePackagesBeforeDeploymentToOrg: prod
changelog:
workItemFilters:
- (AKG|GIK)-[0-9]{2,5}
workItemUrl: https://example.atlassian.net/browse
limit: 30
Credentials Setup: Find the section dedicated to setting up publishing credentials or authentication.
Follow Steps: Adhere to the detailed steps provided by your registry to configure your system for publishing.
When working with sfp and managing artifacts, it’s required to use a private NPM registry. This allows for more control over package distribution, increased security, and custom package management. Here are some popular NPM compatible private registries, including instructions on how to configure NPM to use them:
GitHub Packages acts as a hosting service for npm packages, allowing for seamless integration with existing GitHub workflows. For configuration details, visit GitHub's guide on configuring NPM for use with GitHub Packages.
GitLab offers a fully integrated npm registry within its continuous integration/continuous deployment (CI/CD) pipelines. To configure NPM to interact with GitLab’s registry, refer to GitLab's NPM registry documentation.
Azure Artifacts provides an npm feed that's compatible with NPM, enabling package hosting, sharing, and version control. For setup details, refer to the Azure Artifacts documentation.
JFrog Artifactory offers a robust solution for managing npm packages, with features supporting binary repository management. For setting up Artifactory to work with NPM, refer to JFrog’s npm Registry documentation.
MyGet provides package management support for npm, among other package managers, facilitating the hosting and management of private npm packages. For specifics on utilizing NPM with MyGet, check out MyGet’s NPM support documentation.
Each of these registries offers its own advantages, and the choice between them should be based on your project’s needs and existing infrastructure.
Follow the instructions for your npm registry to generate a .npmrc file with the correct URL and access token (which has the permission to publish into your registry).
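A minimal sketch of such a .npmrc; the @myorg scope, GitHub Packages URL, and token variable are illustrative assumptions, and your registry's documentation is authoritative:
# Hypothetical .npmrc for an npm-compatible registry (GitHub Packages shown)
@myorg:registry=https://npm.pkg.github.com
//npm.pkg.github.com/:_authToken=${NPM_TOKEN}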
Utilize the parameters in sfp publish to provide the .npmrc file and activate npm publishing.
sfp's publish command provides you with various options to tag the published artifacts in version control. Please note that these tags are utilized by sfp's build command to determine which packages are to be built when used with the diffcheck flag.
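A hypothetical publish invocation sketch; the flag names below are assumptions carried over from earlier sfpowerscripts usage, so verify them with sfp publish --help:
# Publish artifacts to an npm registry and tag the versions in git (flags assumed)
sfp publish --artifactdir artifacts --npm --scope @myorg --npmrcpath .npmrc --gittag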
Retrieval and Comparison: Initially, picklists defined within the packages marked for deployment will be retrieved from the target org. These retrieved values are then compared with the local versions of the picklists.
Update on Change Detection: Should any discrepancies be identified between the local and retrieved picklists, the differing values will be updated in the target org using the Salesforce Tooling API.
Handling New Picklists: During the retrieval phase, picklists that are present locally but absent in the target org are identified as newly created. These new picklists are deployed using the standard deployment process, as Salesforce permits the deployment of new picklists from unlocked packages.
Picklist Enabled Package Identification: Throughout the build process, the sfp cli examines the contents of each unlocked package artifact. A package is marked as 'picklist enabled' if it contains one or more picklist(s).
enablePicklist
boolean
Enable picklist upgrade for unlocked packages
unlocked
org dependent unlocked
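A minimal sketch showing the attribute on a package descriptor; the path and package name are illustrative:
// Hypothetical package descriptor enabling picklist upgrades
{
"path": "src/sales",
"package": "sales",
"versionNumber": "1.0.0.NEXT",
"enablePicklist": true
}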
| Package Type | Triggers Apex tests during deployment | Test coverage requirement |
| --- | --- | --- |
| Unlocked Package | No | 75% for the contents of a package, validated during build |
| Org-Dependent Unlocked Package | No | No |
| Source Package | Yes | Yes; either each class needs individual 75% coverage, or use the entire org's apex test coverage |
| Diff Package | Yes | Yes; either each class needs individual 75% coverage, or use the entire org's coverage |
| Data Package | No | No |
To meet Salesforce deployment requirements for source packages, each Apex class within the source package must achieve a minimum of 75% test coverage, verified individually for each class. It's imperative that the test classes responsible for this coverage are included within the same package.
During the artifact preparation phase, sfp cli will automatically identify Apex test classes included within the package. These identified test classes will then be utilised at the time of installation to verify that the test coverage requirement is met for each Apex class, ensuring compliance with Salesforce's deployment standards. When there are circumstances where individual coverage cannot be achieved by the apex classes within a package, one can disable the 'optimized deployment' feature by using the attribute mentioned below.
isOptimizedDeployment
boolean
Detects test classes in a source package automatically and utilizes them to deploy the provided package
source
diff
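A minimal sketch disabling optimized deployment for a source package whose classes cannot meet individual coverage; the package details are illustrative:
// Hypothetical package descriptor turning off optimized deployment
{
"path": "src/legacy-crm",
"package": "legacy-crm",
"versionNumber": "1.0.0.NEXT",
"isOptimizedDeployment": false
}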
sfp commands
| Command | Summary |
| --- | --- |
| apextests trigger | Triggers Apex unit tests in an org. Supports test level RunAllTestsInPackage, which optionally allows validation of individual class code coverage |
| artifacts fetch | Fetch sfp artifacts from an NPM compatible registry using a release definition file |
| artifacts promote | Promotes artifacts, predominantly for unlocked packages with code coverage greater than 75% |
| artifacts query | Fetch details about artifacts installed in a target org |
| build | Build artifact(s) of your packages in the current project |
...
sfp build -v <DevHubAlias/DevHubUsername>
sfp install -o <TargetOrgAlias/TargetOrgUsername>
{
"path": "src/health-cloud",
"package": "health-cloud",
"versionName": "health-cloud-1.0",
"versionNumber": "1.0.0.0",
"assignPermSetsPreDeployment": [
"HealthCloudFoundation",
"HealthCloudSocialDeterminants",
"HealthCloudAppointmentManagement",
"HealthCloudVideoCalls",
"HealthCloudUtilizationManagement",
"HealthCloudMemberServices",
"HealthCloudAdmin",
"HealthCloudApi",
"HealthCloudLimited",
"HealthCloudPermissionSetLicense",
"HealthCloudStandard",
"HealthcareAssociatedInfectionDiseaseGroupAccess"
]
},
// Sample package
{
"packageDirectories": [
{
"path": "packages/access-mgmt",
"package": "access-mgmt",
"default": false,
"versionName": "access-mgmt",
"versionNumber": "1.0.0.0",
"reconcileProfiles": "true"
}
{
"packageDirectories": [
{
"path": "src/frameworks/framework-core",
"package": "framework-core",
"versionNumber": "1.0.0.NEXT"
},
{
"path": "src/frameworks/framework-config",
"package": "framework-config",
"versionNumber": "1.0.0.NEXT",
"alwaysSync": true
}
]
}
// An example where validate is utilized against a pool
// of earlier prepared scratch orgs with label review
sfp validate pool -p review \
-v devhub \
--diffcheck
// An example where validate is utilized against a target org
sfp validate org -o ci \
-v devhub \
--installdeps
// An example with branch references for intelligent synchronization
sfp validate org -o ci \
-v devhub \
--ref feature-branch \
--baseRef main
// An example with release config for domain-limited validation
sfp validate org -o ci \
-v devhub \
--releaseconfig config/release-sales.yml

{
"packageDirectories": [
{
"path": "util",
"package": "Expense-Manager-Util",
"versionName": "Winter ‘25",
"versionDescription": "Welcome to Winter 2025 Release of Expense Manager Util Package",
"versionNumber": "4.7.0.NEXT"
},
{
"path": "exp-core",
"default": false,
"package": "ExpenseManager",
"versionName": "v 3.2",
"versionDescription": "Winter 2025 Release",
"versionNumber": "3.2.0.NEXT",
"dependencies": [
{
"package": "ExpenseManager-Util",
"versionNumber": "4.7.0.LATEST"
},
{
"package": "TriggerFramework",
"versionNumber": "1.7.0.LATEST"
},
{
"package": "External Apex Library - 1.0.0.4"
}
]
},
{
"path": "src/exp-core-config",
"package": "Expense-Manager-Org-Config",
"type" : "source",
"versionName": "Winter ‘25",
"versionDescription": "Welcome to Winter 2025 Release of Expense Manager Util Package",
"versionNumber": "4.7.0.NEXT"
}
],
"sourceApiVersion": "47.0",
"packageAliases": {
"TriggerFramework": "0HoB00000004RFpLAM",
"Expense Manager - Util": "0HoB00000004CFpKAM",
"External Apex [email protected]": "04tB0000000IB1EIAW",
"Expense Manager": "0HoB00000004CFuKAM"
}
}

{
"packageDirectories" : [
{ "path": "force-app", "default": true},
{ "path" : "unpackaged" },
{ "path" : "utils" }
],
"namespace": "",
"sfdcLoginUrl" : "https://login.salesforce.com",
"sourceApiVersion": "60.0"
}{
"packageDirectories" : [
{
"package": "force-app",
"versionNumber": "1.0.0.NEXT",
"path": "force-app",
"default": true
},
{ "path" : "unpackaged" }, // You can repeat the same for additonal directories
{ "path" : "utils" } // You can repeat the same for additonal directories
],
"namespace": "",
"sfdcLoginUrl" : "https://login.salesforce.com",
"sourceApiVersion": "60.0"
}

docker login source.flxbl.io -u your-username

# For base sfp-pro image
docker pull source.flxbl.io/sfp-pro-lite:version
# For sfp-pro with SF CLI
docker pull source.flxbl.io/sfp-pro:version

# Tag for your registry
docker tag source.flxbl.io/sfp-pro:version your-registry/sfp-pro:version
# Push to your registry
docker push your-registry/sfp-pro:version

# Create a file containing your GITEA token
echo "YOUR_GITEA_TOKEN" > .npmrc.token
# Build the base sfp-pro image (without SF CLI)
docker buildx build \
--secret id=npm_token,src=.npmrc.token \
--build-arg NODE_MAJOR=22 \
--file dockerfiles/sfp-pro-lite.Dockerfile \
--tag sfp-pro-lite:local .
# Remove the token file
rm .npmrc.token

# Create a file containing your GITEA token
echo "YOUR_GITEA_TOKEN" > .npmrc.token
# Build the sfp-pro image with SF CLI bundled
docker buildx build \
--secret id=npm_token,src=.npmrc.token \
--build-arg NODE_MAJOR=22 \
--file dockerfiles/sfp-pro.Dockerfile \
--tag sfp-pro:local .
# Remove the token file
rm .npmrc.token

// Sample sfdx-project.json
{
"packageDirectories": [
{
"path": "util",
"default": true,
"package": "Expense-Manager-Util",
"versionName": "Winter ‘20",
"versionDescription": "Welcome to Winter 2020 Release of Expense Manager Util Package",
"versionNumber": "4.7.0.NEXT"
},
{
"path": "exp-core",
"default": false,
"package": "ExpenseManager",
"versionName": "v 3.2",
"versionDescription": "Winter 2020 Release",
"versionNumber": "3.2.0.NEXT",
"dependencies": [
{
"package": "ExpenseManager-Util",
"versionNumber": "4.7.0.LATEST"
},
{
"package": "TriggerFramework",
"versionNumber": "1.7.0.LATEST"
},
{
"package": "External Apex Library - 1.0.0.4"
}
]
},
{
"path": "exp-core-config",
"package": "expense-manager-config",
"versionNumber": "1.0.1.NEXT",
"versionDescription": "This source package extends expense manager unlocked package",
},
{
"path": "expense-manager-test-data",
"package": "exppense-manager-test-data",
"type":"data",
"versionNumber": "1.0.1.NEXT",
"versionDescription": "This source package extends expense manager unlocked package",
}
],
"sourceApiVersion": "47.0",
"packageAliases": {
"TriggerFramework": "0HoB00000004RFpLAM",
"Expense Manager - Util": "0HoB00000004CFpKAM",
"External Apex [email protected]": "04tB0000000IB1EIAW",
"Expense Manager": "0HoB00000004CFuKAM"
}
}

// Limit build by release config
sfp build -v devhub --branch=main --domain config/sales.yaml

// Limit build by certain packages
sfp build -v devhub --branch=main -p sales-ui,sales-channels

// Sample sfdx-project.json
{
"packageDirectories": [
{
"path": "./src-env-specific-pre",
"package": "src-env-specific-pre",
"versionNumber": "1.0.0.0",
"ignoreOnStage": [
"build"
]
},
{
"path": "./src/frameworks/feature-mgmt",
"package": "feature-mgmt",
"versionNumber": "1.0.0.NEXT"
}
]
}

| Flag | Description |
| --- | --- |
| --npm | Upload artifacts to a pre-authenticated private npm registry |
| --npmrcpath=<value> | Path to the .npmrc file used for authentication to the registry. If left blank, defaults to the home directory |
| --npmtag=<value> | Add an optional distribution tag to NPM packages. If not provided, the 'latest' tag is set to the published version |
| --scope=<value> | (Required for NPM) User or organisation scope of the NPM package |
| --gittag | Tag the current commit ID with an annotated tag containing the package name and version - does not push the tag |
| --gittagage=<value> | Specifies the number of days for a tag to be retained; any tags older than the provided number will be deleted |
| --gittaglimit=<value> | Specifies the minimum number of tags to be retained for a package |
| --pushgittag | Pushes the git tags created by this command to the repo; ensure you have access to the repo |

// Demonstrating how to disable picklist deployment
{
"packageDirectories": [
{
"path": "core-crm",
"package": "core-crm",
"versionDescription": "Package containing core schema and classes",
"versionNumber": "4.7.0.NEXT",
"enablePicklist": false
},
...
]
}

// Demonstrating how to disable optimized deployment
{
"packageDirectories": [
{
"path": "core-crm",
"package": "core-crm",
"versionDescription": "Package containing core schema and classes",
"versionNumber": "4.7.0.NEXT",
"isOptimizedDeployment": false
},
...
]
}

API Name: Validate that the API name within the metadata file accurately reflects the new version.
Version Number: Update the versionNumber in your metadata file to represent the new version clearly.
Default Version: Confirm that only one version of the Entitlement Process is set as Default to avoid deployment conflicts.
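To make the checklist concrete, here is a hypothetical sketch of an EntitlementProcess metadata file for a new version. The element names follow the Metadata API's EntitlementProcess type, but the file name and values are illustrative assumptions; consult the Salesforce Metadata API documentation for the full schema.

<!-- hypothetical file: entitlementProcesses/Support_Process_v2.entitlementProcess-meta.xml -->
<EntitlementProcess xmlns="http://soap.sforce.com/2006/04/metadata">
    <name>Support Process</name>
    <active>true</active>
    <versionMaster>Support_Process</versionMaster>   <!-- ties all versions of the process together -->
    <versionNumber>2</versionNumber>                 <!-- bumped for the new version -->
    <isVersionDefault>true</isVersionDefault>        <!-- only one version may be the default -->
</EntitlementProcess>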
Download the appropriate installer for your platform:

| Platform | Format | Installer |
| --- | --- | --- |
| Windows | MSI | sfp-pro-49.3.0-windows-x64.msi |
| macOS | DMG | sfp-pro-49.3.0-darwin-universal.dmg |
| Linux (Debian/Ubuntu) | DEB | sfp-pro_49.3.0_linux_amd64.deb |
| Linux (RHEL/Fedora) | RPM | sfp-pro_49.3.0_linux_amd64.rpm |

Download and run the latest installer - it will automatically upgrade your existing installation.
Use "Add or Remove Programs" in Control Panel
| | Unlocked Package | Org-Dependent Unlocked Package | Source Package | Data Package | Diff Package |
| --- | --- | --- | --- | --- | --- |
| Dependency Declaration | Yes | Yes (supported by sfp) | Yes | Yes | Yes (supported by sfp) |
| Requires dependency to be resolved during creation | Yes | No | No | N/A | No |
| Supported Metadata Types | Unlocked Packaging section in the Metadata Coverage Report | Unlocked Packaging section in the Metadata Coverage Report | Metadata API section in the Metadata Coverage Report | N/A | Metadata API section in the Metadata Coverage Report |
| Code Coverage Requirement | Package should have 75% code coverage or more | Not enforced by Salesforce; sfp by default checks for 75% code coverage | Each Apex class should have coverage of 75% or above for optimal deployment, otherwise the entire coverage of the org will be utilised for deployment | N/A | Each Apex class that's part of the delta between the current version and the baseline needs a test class and requires a coverage of 75% |
| Component Lifecycle | Automated | Automated | Explicit; utilise destructiveManifest or manual deletion | N/A | Explicit; utilise destructiveManifest or manual deletion |
| Component Lock | Yes, only one package can own the component | Yes, only one package can own the component | No | N/A | No |
| Version Management | Salesforce-enforced versioning; promotion required to deploy to prod | Salesforce-enforced versioning; promotion required to deploy to prod | sfp-enforced versioning | sfp-enforced versioning | sfp-enforced versioning |
| Dependency Validation | Occurs during package creation | Occurs during package installation | Occurs during package installation | N/A | Occurs during package installation |
Navigate to the Setup menu
Go to Development > Dev Hub
Toggle the button to on for Enable Dev Hub
Enable Unlocked Packages and Second-Generation Managed Packages
The sfpowerscripts-artifact package is a lightweight unlocked package consisting of a custom setting SfpowerscriptsArtifact2__c that is used to keep a record of the artifacts that have been installed in the org. This enables package installation, using sfp, to be skipped if the same artifact version already exists in the target org.
Once the command completes, confirm the unlocked package has been installed.
Navigate to the Setup menu
Go to Apps > Packaging > Installed Packages
Confirm the package sfpowerscripts-artifact is listed in the "Installed Packages"
Ensure that you install the sfpowerscripts-artifact unlocked package in all target orgs that you intend to deploy to using sfp.
Once A is completed, trigger creation of artifacts for packages B & C, using the version of A created in step 1
Once C is completed, trigger creation of package D
Scenario 2 : Build with diffCheck enabled on a package with no dependencies
In this scenario, where only a single package has changed and diffCheck is enabled, the build command will only trigger the creation of Package B
Scenario 3 : Build with diffCheck enabled on changes in multiple packages
In this scenario, where there are changes in multiple packages, say B & C, the build command will trigger the creation of artifacts for these packages in parallel, as their dependent package A has not changed (its dependency is already fulfilled). Please note that even though there is a change in C, package D will not be built unless there is an explicit change to the version number (major.minor.patch) of package D
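For reference, the scenarios above assume a dependency graph along the following lines (a sketch inferred from the scenario descriptions: B and C depend on A, and D depends on C):

A (base package, no dependencies)
├── B (depends on A)
└── C (depends on A)
    └── D (depends on C)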

Breaking Changes: sfp community edition to sfp-pro

| Command | Available in sfp-pro | Notes |
| --- | --- | --- |
| sfp build (alias: orchestrator:build) | ✅ | Pro adds --commit flag |
| sfp install | ✅ | Core functionality unchanged |
| sfp publish | ✅ | |
Critical: Update Your Pipelines
Most orchestrator: command aliases have been removed in sfp-pro. You must update your pipelines:
New/Changed in sfp-pro:
Exception: sfp orchestrator:publish still works in sfp-pro.
--commit flag added to build command for specifying commit hash
Server flags added to most commands (--server-url, --api-key)
Additional commands in pro: server:*
✅ What Stays the Same
Core command functionality is unchanged
All flags from the sfp community edition still work
Command outputs remain compatible
Migration Notes:
🆕 New in sfp-pro
All core orchestrator commands (build, install, publish, release) work identically
Existing pipelines using these commands will continue to work
New flags are optional and backward compatible
This feature is activated by default whenever build/quickbuild runs, even in implicit scenarios such as validate, prepare, etc., which might result in building packages.
Let's consider the following sfdx-project.json to explain how this feature works.
The above project manifest (sfdx-project.json) describes three packages: sfdc-logging, feature-mgmt, and core-crm. Each package is defined with dependencies as described below
| Package | Declared Dependencies |
| --- | --- |
| sfdc-logging | None |
| feature-mgmt | sfdc-logging |
| core-crm | feature-mgmt |
As you might have noticed, this is an incorrect representation: as per the definition of unlocked packages, the package 'core-crm' should explicitly define all of its dependencies. This means it should be as described below.
To successfully create a version of core-crm , both sfdc-logging and feature-mgmt. should be defined as an explicit dependency in the sfdx-project.json
As the number of packages grows in your project, developers might accidentally miss declaring dependencies, or the sfdx-project.json may become very large due to heavy repetition of dependencies between packages. This condition often results in the build stage failing with missing dependencies.
sfp features transitive dependency resolution, which can autofill the dependencies of a package by inferring them from sfdx-project.json, so the above package descriptor of core-crm will be resolved correctly to:
Please note, in the current iteration, it will autofill dependency information from the current sfdx-project.json and doesn't consider variations among previous versions.
For dependencies outside of the sfdx-project.json, one could define an externalDependencyMap as shown below
If you need to disable this feature and have stringent dependency management rules, utilize the following in your sfdx-project.json
In a Flxbl project powered by sfp, development follows a structured, iterative workflow designed for team collaboration and continuous delivery. This section guides you through the complete development lifecycle - from setting up your environment to submitting code for review.
DevHub Access is Required for sfp development:
Building any type of package (source, unlocked, data)
Creating and managing scratch orgs
Resolving package dependencies
Ensure your Salesforce user has DevHub access.
Development in an sfp project follows these key principles:
Isolated Environments: Each developer works in their own sandbox or scratch org
Source-Driven Development: All changes are tracked in version control
Iterative Cycles: Developers continuously pull, modify, and push changes
Automated Validation: Pull requests trigger comprehensive validation
Developers in an sfp project typically follow this pattern:
Fetch a fresh environment from a pre-prepared pool
Scratch orgs: Can be fetched per story for maximum isolation
Sandboxes: Usually fetched at sprint start for longer-running work
Pull the latest metadata to sync with the org
Pull changes from the org to stay synchronized
Make modifications to metadata and code
Push changes back to the org for testing
Run tests locally to validate functionality
Create a pull request with your changes
Automated validation runs via CI/CD pipeline
Review environment is created for acceptance testing
Merge after approval and successful validation
sfp provides powerful commands for environment management through pools:
Pre-prepared scratch orgs managed locally via DevHub:
Centralized pool management for both scratch orgs and sandboxes:
This pooling approach means developers spend less time waiting for environment creation and more time coding.
The foundation of sfp development is bidirectional source synchronization:
Retrieves changes from your org to local source:
Automatically handles text replacement reversals
Detects conflicts and provides resolution options
Suggests new replacements for hardcoded values
Deploys local changes to your org:
Applies environment-specific text replacements
Handles destructive changes automatically
Supports various test execution levels
For a complete walkthrough of the development workflow with detailed commands and examples:
For specific development tasks:
For managing environments:
sfp provides powerful commands to manage and understand package dependencies in your Salesforce projects. These tools help you maintain clean dependency declarations, troubleshoot dependency issues, and optimize your build processes.
Add all transitive dependencies to make them explicit
In Salesforce DX projects, packages can depend on other packages. These dependencies come in two forms:
Direct Dependencies: Dependencies explicitly declared in a package's configuration
Transitive Dependencies: Dependencies of your dependencies (indirect dependencies)
For example, if Package A depends on Package B, and Package B depends on Package C, then:
Package A has a direct dependency on Package B
Package A has a transitive dependency on Package C
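As a minimal sketch (package names are hypothetical), the descriptor for Package A would declare only Package B; Package C enters the graph through B's own declaration:

// Hypothetical package directory entries in sfdx-project.json
{
  "path": "src/package-b",
  "package": "package-b",
  "versionNumber": "1.0.0.NEXT",
  "dependencies": [
    { "package": "package-c", "versionNumber": "1.0.0.LATEST" }
  ]
},
{
  "path": "src/package-a",
  "package": "package-a",
  "versionNumber": "1.0.0.NEXT",
  "dependencies": [
    { "package": "package-b", "versionNumber": "1.0.0.LATEST" }
  ]
}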
A typical dependency management workflow involves:
Development Phase: Use sfp dependency:expand to make all dependencies explicit during development, helping identify potential issues early
Analysis: Use sfp dependency:explain to understand dependency relationships and identify unnecessary dependencies
Cleanup: Use sfp dependency:shrink before committing to maintain minimal, clean dependency declarations
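A minimal sketch of that loop using the commands named above (most flags omitted; see each command's reference for required options such as the Dev Hub username):

# Development: make every transitive dependency explicit
sfp dependency:expand
# Analysis: inspect direct vs transitive dependencies for a package
sfp dependency:explain -p my-package
# Cleanup: shrink back to direct dependencies before committing
sfp dependency:shrink --overwrite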
External dependencies are packages from outside your project, typically:
Managed packages from AppExchange
Packages from other Dev Hub organizations
Third-party components
These are configured in your sfdx-project.json under plugins.sfp.externalDependencyMap:
Keep dependencies minimal: Only declare direct dependencies in your source control
Use expand for analysis: Temporarily expand dependencies to understand the full graph
Validate regularly: Run dependency commands in CI/CD to catch issues early
Document external dependencies: Clearly document why each external dependency is needed
If packages depend on each other in a circular manner, the dependency commands will report an error. Refactor your packages to break the circular dependency.
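For illustration, a circular declaration of this kind (hypothetical package names) is what the dependency commands would reject:

// package-a and package-b declare each other - refactor to break the cycle
{ "package": "package-a", "dependencies": [ { "package": "package-b", "versionNumber": "1.0.0.LATEST" } ] },
{ "package": "package-b", "dependencies": [ { "package": "package-a", "versionNumber": "1.0.0.LATEST" } ] }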
If a package references components from another package without declaring the dependency, add the missing dependency to the package's configuration.
When different packages require different versions of the same dependency, sfp will use the highest compatible version. Consider standardizing versions across your project.
- Conceptual overview of dependencies
- How sfp resolves dependencies
| | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | ❌ |
| From | October 25 | |
sfp provides intelligent AI-assisted error analysis to help developers quickly understand and resolve validation failures. When enabled through the errorAnalysis configuration in ai-assist.yaml, the system automatically analyzes error patterns and provides actionable insights.
AI-assisted error analysis requires:
The errorAnalysis.enabled flag set to true in config/ai-assist.yaml
Create config/ai-assist.yaml in your project root:
Add the minimal configuration:
Set your LLM provider credentials (e.g., in CI/CD secrets)
That's it! AI error analysis will automatically activate during validation failures.
Before triggering AI analysis, sfp evaluates if changes are significant enough to warrant review:
Metadata Type Detection: Uses Salesforce ComponentSet for accurate identification
Smart Thresholds: Different thresholds per file type (Apex: 3 lines, Flows: 1 line, LWC: 10 lines)
Automatic Exclusions: Skips non-critical metadata (CustomLabels, StaticResources, Translations)
When validation fails and changes are significant, AI provides:
Root Cause Analysis: Understanding why the error occurred
Quick Fix Suggestions: Immediate actions to resolve issues
Related Components: Other files that might be involved
Documentation Links: References to relevant Salesforce docs
Configure AI assistance through config/ai-assist.yaml:
AI error analysis is automatically enabled when:
A config/ai-assist.yaml file exists in your project
The errorAnalysis.enabled flag is set to true
Valid LLM provider credentials are available
No additional CLI flags are required - sfp automatically detects and uses the configuration.
Validation processes often need to synchronize the provided organization by installing packages that differ from those already installed. This task can become particularly time-consuming for large projects with hundreds of packages, especially in monorepo setups with multiple independent domains.
To streamline the validation process and focus it on specific domains, you can use release configurations with the --releaseconfig flag. This approach limits the scope of validation to only the packages defined in your release configuration, significantly enhancing efficiency and reducing validation time.
In this example, validation is limited to packages defined in the release-domain-sales.yml configuration file. Only packages that:
Are listed in the release configuration AND
Have changes compared to what's installed in the org
will be validated.
For projects with multiple independent domains, you can specify multiple release configurations:
Faster Feedback: Validate only the relevant packages for your team's domain
Reduced Dependencies: Avoid failures from unrelated packages in other domains
Parallel Development: Multiple teams can work independently without blocking each other
Optimized CI/CD: Shorter validation times mean more efficient pipeline execution
Organize by Domain: Create separate release configurations for each logical domain
Keep Configurations Updated: Regularly review and update package lists in release configs
Use in CI/CD: Automate domain-specific validation in your pipeline
Document Dependencies: Clearly document cross-domain dependencies in your configurations
Note: The deprecated --mode=thorough-release-config and --mode=ff-release-config have been replaced by using the standard modes with the --releaseconfig flag. This provides the same functionality with a simpler, more consistent interface.
A diff package is a variant of 'source package', where the contents of the package contain only the components that have changed. This package is generated by sfpowerscripts by computing a git diff of the current commit ID against a baseline set in the DevHub org.
A diff package mimics a Change Set model, where only changes are contained in the artifact. As such, this package is always incremental: it only deploys the changes relative to the baseline to the target org. Unless both the previous version and the current version have exactly the same components, a diff package can never be rolled back, as the impact on the org is unpredictable. It is always recommended to roll a diff package forward by creating a change that counters the unwanted impact on the target orgs.
Diff packages do not work with scratch orgs; they should be used with sandboxes only.
A diff package is the least consistent package among the various package types available within sfpowerscripts, and should only be used while transitioning to a modular development model as prescribed by flxbl.
The example below demonstrates an sfdx-project.json where the package unpackaged is a diff package. You can mark a diff package with the type 'diff'. All other attributes applicable to source packages also apply to diff packages.
A manual entry should be made in the sfpowerscripts_Artifact2__c custom object with the name of the package, the baseline commit ID, and the version. Subsequent deployments will automatically reset the baseline when the package gets deployed to the DevHub.
| Attribute | Type | Description | Applicable Package Types |
| --- | --- | --- | --- |
| buildCollection | array | Build a collection of packages together, even if only one package among the collection is changed | unlocked, org-dependent unlocked, source, diff |
In certain scenarios, it's necessary to build a new version of a package whenever any package in a collection undergoes changes. This can be accomplished by utilising the buildCollection attribute in the sfdx-project.json file. Below is an example illustrating how to define a collection of packages that should be built together.
| Attribute | Type | Description | Applicable Package Types |
| --- | --- | --- | --- |
| enableFHT | boolean | Enable field history tracking for fields | unlocked, org-dependent unlocked |
| enableFT | boolean | Enable feed tracking for fields | unlocked, org-dependent unlocked |
Salesforce has a strict limit on the number of fields that can be tracked. This limit can be increased by raising it with your Salesforce Account Executive (AE). However, it would be problematic if an unlocked package were deployed to orgs that do not have the increased limit. So don't be surprised if, while working on a flxbl project, you find that the deployment of field history tracking from unlocked packages is disabled by Salesforce.
One workaround is to keep a copy of all the fields that need to be tracked in a separate source package (field-history-tracking or similar) and deploy it as one of the last packages with the 'alwaysDeploy' option, as sketched below.
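A sketch of such a package directory entry (the path and package name are assumptions):

// Hypothetical entry for the field-history-tracking workaround
{
  "path": "src/field-history-tracking",
  "package": "field-history-tracking",
  "type": "source",
  "versionNumber": "1.0.0.NEXT",
  "alwaysDeploy": true
}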
However, this specific package has to be carefully aligned with the original source/unlocked packages, to which the fields originally belong. As the number of tracked fields increases in large projects, this package becomes larger and more difficult to maintain. In addition, since it’s often the case that the project does not own the metadata definition of fields from managed packages, it doesn’t make much sense to carry the metadata only for field history tracking purposes.
To resolve this, sfp features the ability to automate the deployment of field history tracking for both unlocked and managed packages without having to maintain additional packages.
Two mechanisms are implemented to ensure that all the fields that need field history tracking enabled are properly deployed. Specifically, a YAML file that contains all the tracked fields is added to the package directory.
During deployment, the YAML file is examined and the fields to be set are stored in an internal representation inside the deployment artifact. Meanwhile, the components to be deployed are analyzed and the ones with ‘trackHistory’ on are added to the same artifact. This acts as a double assurance that any field that is missed in the YAML files due to human error will also be picked up. After the package installation, all the fields in the artifact are retrieved from the target org and, for those fields, the trackHistory is turned on before they are redeployed to the org.
In this way, the deployment of field history tracking is completely automated. One now has a declarative way of defining field history tracking (as well as feed tracking) without having to store the metadata in the repo. The only maintenance effort left for developers is to manage the YAML file.
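Conceptually, such a YAML file maps objects to the fields whose history tracking should be enabled. The sketch below is purely illustrative; the file name and exact schema are defined by sfp's field history tracking feature, so consult its reference before adopting this shape:

# hypothetical history-tracking YAML inside the package directory
Account:
  - AccountNumber__c
MyObject__c:
  - MyTrackedField__c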
Data packages are a sfpowerscripts construct that utilises the SFDMU plugin to create a versioned artifact of Salesforce object records in CSV format, which can be deployed to a Salesforce org using the sfpowerscripts package installation command.
The data package offers a seamless method of integrating Salesforce data into your CI/CD pipelines, and is primarily intended for record-based configuration of managed packages such as CPQ, Vlocity (Salesforce Industries), and nCino.
Data packages are a wrapper around SFDMU that provide a few key benefits:
Ability to skip the package if already installed: By keeping a record of the version of the package installed in the target org with the support of an unlocked package, sfpowerscripts can skip installation of data packages if it is already installed in the org
sfp handles destructive changes according to the type of package. Here is a rundown of the behaviour for the various package types and modes.
Salesforce handles destructive changes in unlocked packages / org-dependent unlocked packages as part of the package upgrade process. From the Salesforce documentation:
Metadata that was removed in the new package version is also removed from the target org as part of the upgrade. Removed metadata is metadata not included in the current package version install, but present in the previous package version installed in the target org. If metadata is removed before the upgrade occurs, the upgrade proceeds normally. Some examples where metadata is deprecated and not deleted are:
The sfp dependency:shrink command optimizes your project's dependency declarations by removing redundant transitive dependencies from your sfdx-project.json. This results in a cleaner project configuration with only the necessary direct dependencies declared for each package.
The sfp dependency:explain command helps you understand the dependencies between packages in your project. It can analyze either a specific package's dependencies or all package dependencies in the project, showing both direct and transitive dependencies.
In some situations, you might need to execute a pre/post deployment script to manipulate data before or after it is deployed to the org.
The AI-powered report functionality generates comprehensive analysis reports for your Salesforce projects using advanced language models. This feature provides deep insights into code quality, architecture, and best practices specific to the Flxbl framework.
"plugins": {
"sfp": {
"disableEntitlementFilter": true //disable entitlement filtering
}
}

# Double-click the .msi file or run:
msiexec /i sfp-pro-*.msi

# 1. Open the DMG file
# 2. Drag sfp-pro.app to your Applications folder
# 3. Run the installer script from the DMG:
sudo bash /Volumes/sfp-pro-*/install-cli.sh
# Or if you've already unmounted the DMG:
sudo /Applications/sfp-pro.app/Contents/Resources/install-cli.sh

sudo dpkg -i sfp-pro_*.deb

sudo rpm -i sfp-pro_*.rpm
# or
sudo yum install sfp-pro_*.rpm

sfp --version
# Example output: @flxbl-io/sfp/49.3.0 linux-x64 node-v22.0.0

sudo rm -rf /Applications/sfp-pro.app
sudo rm -f /usr/local/bin/sfp

sudo dpkg -r sfp-pro

sudo rpm -e sfp-pro
# or
sudo yum remove sfp-pro

sf package install --package 04t1P000000ka9mQAA -o <your_org_alias> --security-type=AdminsOnly --wait=120
Waiting 120 minutes for package install to complete.... done
Successfully installed package [04t1P000000ka9mQAA]

{
"packageDirectories": [
{
"path": "./src/frameworks/sfdc-logging",
"package": "sfdc-logging",
"versionName": "Version 1.0.2",
"versionNumber": "1.0.2.NEXT"
},
{
"path": "./src/frameworks/feature-mgmt",
"package": "feature-mgmt",
"versionName": "Version 1.0.6",
"versionNumber": "1.0.6.NEXT",
"dependencies": [
{
"package": "sfdc-logging",
"versionNumber": "1.0.2.LATEST"
}
]
},
{
"path": "./src/core-crm",
"package": "core-crm",
"versionName": "Version 1.0.4",
"versionNumber": "1.0.4.NEXT",
"dependencies": [
{
"package": "feature-mgmt",
"versionNumber": "1.0.6.LATEST"
}
]
}
],
"namespace": "",
"sfdcLoginUrl": "https://login.salesforce.com",
"sourceApiVersion": "60.0",
"packageAliases": {
"feature-mgmt": "0Ho5f000000GmkrCAC",
"sfdc-logging": "0Ho5f000000GmerCAC",
"core-crm": "0Ho5f000000Amz7CAC"
},
"plugins": {}
}
Orchestrator alias: Both versions support sfp orchestrator:build as an alias for sfp build
--commit flag on build command
Server mode with --server-url and --api-key flags
Additional commands: server:*, dev:*, project:*, sandbox:*, config:*

| Command | Available in sfp-pro | Notes |
| --- | --- | --- |
| sfp release | ✅ | Core functionality unchanged |

| Deprecated alias | Replacement |
| --- | --- |
| sfp orchestrator:build | sfp build |
| sfp orchestrator:deploy | sfp install |
| sfp orchestrator:release | sfp release |
| sfp orchestrator:validate | sfp validate |
| sfp orchestrator:quickbuild | sfp quickbuild |




| Package | Resolved Dependencies |
| --- | --- |
| sfdc-logging | None |
| feature-mgmt | sfdc-logging |
| core-crm | sfdc-logging, feature-mgmt |
Review Environments: Stakeholders can test changes before merge
Create a feature branch for version control
Build artifacts to ensure packages are valid
Build Optimization: Expanded dependencies help build tools understand the complete dependency graph for optimized builds
Version carefully: Use specific versions for production, .LATEST for development
Remove redundant transitive dependencies for cleaner configuration
Analyze and understand package dependencies
See Configuring LLM Providers for setup instructions.

User-entered data in custom objects and fields are deprecated and not deleted. Admins can export such data if necessary.
An object such as an Apex class is deprecated and not deleted if it’s referenced in a Lightning component that is part of the package.
sfp utilizes mixed mode while installing unlocked packages to the target org. So any metadata that can be deleted is removed from the target org. If the component is deprecated, it has to be manually removed.
A list of components that are hard deleted upon a version upgrade can be found here.
Source packages support destructive changes using a folder structure to demarcate components that need to be deleted. One can make use of pre-destructive and post-destructive folders to mark components for deletion.
The package installation is a single deployment transaction containing the components that are part of pre/post deploy along with the destructive operations specified in the folder structure. This allows one to refactor the current code so that the destructive changes succeed, as deletion is often only allowed when no existing components in the org reference the component being deleted.
{% hint style="info" %} Destructive Changes support for source package is currently available only in sfp (pro) version. {% endhint %}
{% hint style="info" %}
Test destructive changes in your review environment thoroughly before merging your changes
You will need to understand the dependency implications while dealing with destructive changes, especially the follow-on effects of a deletion in other packages. It is recommended to do a compile of all Apex classes (https://salesforce.stackexchange.com/a/149955 & https://salesforce.stackexchange.com/a/391614) to detect any errors in Apex classes or triggers
After the version of the package is installed across all the target orgs, you will need to merge another change that removes the post-destructive or pre-destructive folders. You do not need to rush through this, as sfp ignores any warnings associated with missing components in the org {% endhint %}
Data packages utilize sfdmu under the hood, and one can utilize any of the below approaches to remove data records.
Approach 1: Combined Upsert and Delete Operations
One effective method involves configuring SFDMU to perform both upsert and delete operations in sequence for the same object. This approach ensures comprehensive data management: updating and inserting relevant records first, followed by removing outdated entries based on specific conditions.
Upsert Operation: Updates or inserts records based on a defined external ID, aligning the Salesforce org with new or updated data from a source file.
Delete Operation: Deletes specific records that meet certain criteria, such as being marked as inactive, to ensure the org only contains relevant and active data.
Approach 2: Utilizing deleteOldData
Another approach involves using the deleteOldData parameter. This parameter is particularly useful when needing to clean up old data that no longer matches the current dataset in the source before inserting or updating new records.
Delete Old Data: Before performing data insertion or updates, SFDMU can be configured to remove all existing records that no longer match the new dataset criteria, thus simplifying the maintenance of data freshness and relevance in the target org.
| Flag | Description | Required |
| --- | --- | --- |
| -o, --overwrite | Overwrites the existing sfdx-project.json file with the shrunk configuration | No |
| -v, --targetdevhubusername | Username or alias of the target Dev Hub org | Yes |
| --loglevel | Logging level (trace, debug, info, warn, error, fatal) | No |
Without --overwrite, creates a new file at ./project-config/sfdx-project.min.json
With --overwrite, backs up existing sfdx-project.json to ./project-config/sfdx-project.json.bak and overwrites the original
Removes transitive dependencies that are already covered by direct dependencies
Preserves external dependency mappings
External dependencies can be configured in your sfdx-project.json file using the plugins.sfp section. This is particularly useful for managed packages or packages from other Dev Hubs that your project depends on.
External dependencies must be defined with their 04t IDs (subscriber package version IDs)
The versionNumber can use .LATEST to automatically use the latest version that matches the major.minor.patch pattern
External dependencies are preserved during both shrink and expand operations
These dependencies are automatically included when calculating transitive dependencies
| | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | ✅ |
| Flag | Description | Required |
| --- | --- | --- |
| -p, --package | Name of the package to analyze dependencies for | No |
| --json | Format output as JSON | No |
| --loglevel | Logging level (trace, debug, info, warn, error, fatal) | No |
When analyzing dependencies, the command provides information about:
Direct dependencies: Dependencies explicitly declared in the package's configuration
Transitive dependencies: Dependencies that are required by your direct dependencies
For transitive dependencies, the command shows which packages contribute to requiring that dependency
When using the --json flag, the command returns data in the following structure:
Analyze all package dependencies:
Analyze dependencies for a specific package:
Get dependencies in JSON format:
The command requires a valid SFDX project configuration
Dependencies are resolved based on the information in your sfdx-project.json file
Transitive dependency resolution helps identify indirect dependencies that might not be immediately obvious from the project configuration
| | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | ✅ |
| From | November 24 | December 24 |
The scripts are called with the following parameters. In your script you can refer to the parameters using positional parameters.
Please note scripts are copied into the artifacts and are not executed from version control. sfpowerscripts only copies the script mentioned by this parameter and does not copy any additional files or dependencies. Please ensure pre/post deployment scripts are independent or able to download their dependencies
| Position | Description |
| --- | --- |
| 1 | Name of the package |
| 2 | Username of the target org where the package is being deployed |
| 3 | Alias of the target org where the package is being deployed |
| 4 | Path to the working directory that has the contents of the package |
| 5 | Path to the package directory. Combine parameters 4 and 5 to find the absolute path to the contents of the package |
Please note the script has to be completely independent and should not depend on files in version control, as scripts are executed within the context of an artifact.
| Attribute | Type | Description | Applicable Package Types |
| --- | --- | --- | --- |
| preDeploymentScript | string | Run an executable script before deploying an artifact. Users need to provide a path to the script file | unlocked, org-dependent unlocked, source, diff |
| postDeploymentScript | string | Run an executable script after deploying a package. Users need to provide a path to the script file | unlocked, org-dependent unlocked, source, diff |
String replacements enable automatic substitution of placeholder values with environment-specific values during package installation, similar to how aliasfy packages work with folder structures.
During build, when a package contains a preDeploy/replacements.yml file:
The configuration is analyzed and validated
The replacement patterns are embedded in the artifact
During installation, placeholders are replaced based on the target org
For example, if you have placeholders like %%API_ENDPOINT%% in your code, they get replaced with the appropriate value for the target environment during installation.
Place your replacements.yml file in the package's preDeploy directory:
The replacement configuration will be automatically detected and processed during build.
String replacements are only supported for source packages. If you attempt to use replacements with other package types (unlocked, org-dependent unlocked, or data), the build will fail with an error message indicating that you must either remove the replacements.yml file or change the package type to 'source'.
The replacements.yml file in the preDeploy folder is automatically detected and processed during build.
String replacements and aliasfy packages are complementary features for managing environment-specific configurations:
Configuration values that change between environments (API endpoints, email addresses, feature flags)
Reducing duplication when only specific values differ in otherwise identical files
Maintaining single source for business logic while varying configuration
Code-based metadata (Apex classes, Lightning components) with environment-specific values
Structural differences between environments (different fields, objects, workflows)
Completely different metadata per environment (unique page layouts, permission sets)
Declarative configurations that vary significantly by environment
Binary metadata that cannot be text-replaced (images, static resources)
If you're currently using aliasfy packages with duplicated files just for value changes, consider migrating to string replacements:
Identify candidates: Find files duplicated solely for configuration value changes
Consolidate files: Create single source file with placeholder patterns
Create replacements.yml: Define environment-specific values
Remove duplicates: Delete redundant files from aliasfy folders
Test thoroughly: Validate in lower environments before production
Before (Using Aliasfy - 3 duplicate files):
After (Using String Replacements - 1 file + config):
| Attribute | Type | Description | Applicable Package Types |
| --- | --- | --- | --- |
| replacements | configuration file | Enable replacement of placeholder values with environment-specific values during installation | source |
The report generator analyzes your codebase through multiple perspectives:
Package architecture and design patterns
Dependencies and coupling between packages
Code quality and technical debt
Flxbl best practices compliance
Security and compliance considerations
For complete setup and configuration instructions, see Configuring LLM Providers.
Reports are generated in Markdown format with the following structure:
Executive Summary - High-level findings and recommendations
Package/Domain Overview - Architecture and design analysis
Dependencies Analysis - Inter-package relationships
Code Quality Insights - Technical debt and improvement opportunities
Recommendations - Prioritized action items
If you see an error about OpenCode CLI not being installed:
If you encounter rate limits:
Reduce --prompt-count to lower token usage
Analyze smaller scopes (single package vs domain)
Consider using a different model with higher limits
Token Usage: Package analysis typically uses 10-30K tokens, domains 30-80K tokens
Models: Default models are optimized for best value and performance
GitHub Copilot: No additional cost if you have Copilot subscription
Amazon Bedrock: Pay-per-use pricing through AWS, check Bedrock pricing in your region
| | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | ❌ |
| From | October 25 | Not Available |
{
"path": "./src/core-crm",
"package": "core-crm",
"versionName": "Version 1.0.6",
"versionNumber": "1.0.6.NEXT",
"dependencies": [
{
"package": "sfdc-logging",
"versionNumber": "1.0.2.LATEST"
},
{
"package": "feature-mgmt",
"versionNumber": "1.0.6.LATEST"
}
]
},

"plugins": {
"sfp": {
"disableTransitiveDependencyResolver": true,
"ignoreFiles": {
"prepare": ".forceignore",
"validate": ".forceignore",
"quickbuild": "forceignores/.quickbuildignore",
"build": "forceignores/.buildignore"
},
"externalDependencyMap": {
"trigger-framework": [
{
"package": "0H1000XRTCam",
"versionNumber": "1.0.3.LATEST"
}
]
}
}
}"plugins": {
"sfp": {
"disableTransitiveDependencyResolver": true,
}
}

# Fetch a scratch org for your feature (aliases: pool:fetch)
sfp pool scratch fetch --tag dev-pool --alias my-feature \
--targetdevhubusername mydevhub

# Fetch an org instance from the server pool
sfp server pool instance fetch \
--repository myorg/myrepo \
--tag dev-pool \
--assignment-id feature-123
# The same command works for both scratch orgs and sandboxes
# Pool type is configured at the server level

{
"plugins": {
"sfp": {
"externalDependencyMap": {
"external-package-name": [
{
"package": "04tXXXXXXXXXXXXXXX",
"versionNumber": "1.0.0.LATEST"
}
]
}
}
}
}

mkdir -p config
touch config/ai-assist.yaml

# config/ai-assist.yaml
errorAnalysis:
enabled: true

# config/ai-assist.yaml
# Error Analysis Configuration
errorAnalysis:
enabled: true # Enable/disable AI error analysis
provider: openai # AI provider (openai, anthropic, etc.)
model: gpt-4 # Model to use
timeout: 180000 # Timeout in ms (default: 3 minutes)
maxSuggestedFixes: 5 # Max number of fix suggestions
# Change Significance Configuration
architecture:
changeSignificance:
# Metadata types to exclude from analysis
excludedMetadataTypes:
- CustomLabels
- StaticResource
- CustomTab
# Thresholds for triggering analysis
fileTypeThresholds:
apex:
lines: 3 # Very strict for Apex
files: 1
flows:
lines: 1 # Any flow change
files: 1
lwc:
lines: 10 # More lenient for UI
files: 2

sfp validate org -o ci \
-v devhub \
--mode thorough \
--releaseconfig=config/release-domain-sales.yml

sfp validate org -o ci \
-v devhub \
--mode thorough \
--releaseconfig config/release-sales.yml \
--releaseconfig config/release-service.yml

# config/release-sales.yml
name: Sales Domain
includeOnlyPackages:
- sales-core
- sales-ui
- sales-integrations
- opportunity-management
- quote-management
skipPackages:
- sales-deprecated

sfp validate org -o ci \
-v devhub \
--diffcheck \
--releaseconfig=config/release.yml

sfp validate org -o ci \
-v devhub \
--mode individual \
--releaseconfig=config/release.yml

sfp validate org -o ci \
-v devhub \
--ref feature-branch \
--baseRef main \
--releaseconfig=config/release.yml

// sfdx-project.json
{
"packageDirectories": [
{
"path": "util",
"default": false,
"package": "Expense-Manager-Util",
"versionName": "Winter ‘24",
"versionDescription": "Welcome to Winter 2024 Release of Expense Manager Util Package",
"versionNumber": "4.7.0.NEXT"
},
{
"path": "unpackaged",
"default": true,
"package": "unpackaged",
"versionName": "v3.2",
"type":"diff"
}
],
"sourceApiVersion": "58.0",
"packageAliases": {
"TriggerFramework": "0HoB00000004RFpLAM",
"Expense Manager - Util": "0HoB00000004CFpKAM",
"External Apex [email protected]": "04tB0000000IB1EIAW",
"Expense Manager": "0HoB00000004CFuKAM"
}
}

{
"packageDirectories": [
{
"path": "core",
"package": "core-package",
"versionName": "Core 1.0",
"versionNumber": "1.0.0.NEXT",
"default": true,
"buildCollection": [
"core-package",
"featureA-package",
"featureB-package"
]
},
{
"path": "features/featureA",
"package": "featureA-package",
"versionName": "Feature A 1.0",
"versionNumber": "1.0.0.NEXT"
"buildCollection": [
"core-package",
"featureA-package",
"featureB-package"
]
},
{
"path": "features/featureB",
"package": "featureB-package",
"versionName": "Feature B 1.0",
"versionNumber": "1.0.0.NEXT",
"buildCollection": [
"core-package",
"featureA-package",
"featureB-package"
]
}
]
}

// Consider a source package feature-management
// with path as src/feature-management
└── feature-management
├── main
├──── default
├──────── <metadata-contents>
├── post-destructive
├──────── <metadata-contents>
├── pre-destructive
├──────── <metadata-contents>
└── test

{
"name": "CustomObject__c",
"operation": "Upsert",
"externalId": "External_Id__c",
"query": "SELECT Id, Name, IsActive__c FROM CustomObject__c WHERE SomeCondition = true"
}

{
"name": "CustomObject__c",
"operation": "Delete",
"query": "SELECT Id FROM CustomObject__c WHERE IsActive__c = false"
}

// Use of deleteOldData
{
"name": "CustomObject__c",
"operation": "Upsert",
"externalId": "External_Id__c",
"deleteOldData": true
}
# Create a shrunk version of the project configuration
sfp dependency:shrink
# Overwrite the existing sfdx-project.json with shrunk dependencies
sfp dependency:shrink --overwrite

{
"packageDirectories": [...],
"plugins": {
"sfp": {
"externalDependencyMap": {
"package-name": [
{
"package": "04tXXXXXXXXXXXXXXX",
"versionNumber": "1.0.0.LATEST"
}
]
}
}
}
}

{
"plugins": {
"sfp": {
"externalDependencyMap": {
"trigger-framework": [
{
"package": "0H1000XRTCam",
"versionNumber": "1.0.3.LATEST"
}
]
}
}
}
}

# Explain dependencies for all packages
sfp dependency:explain
# Explain dependencies for a specific package
sfp dependency:explain -p <package-name>

{
"packages": [
{
"name": "package-name",
"dependencies": [
{
"name": "dependency-name",
"version": "version-number",
"type": "direct|transitive",
"contributors": ["package-names"] // Only present for transitive dependencies
}
]
}
]
}

sfp dependency:explain

sfp dependency:explain -p my-package

sfp dependency:explain -p my-package --json

// Sample package
{
"packageDirectories": [
{
"path": "src/data-package-cl",
"package": "data-package-cloudlending",
"type": "data",
"versionNumber": "2.0.10.NEXT",
"preDeploymentScript": "scripts/enableEmailDeliverability.sh"
"postDeploymentScript": "scripts/pushData.sh"
    }
  ]
}

// A script to enable email deliverability to All
#!/bin/bash
export SF_DISABLE_DNS_CHECK=true
# Create a temporary file
temp_file="$(mktemp)"
# Write the browserforce settings JSON to the temp file
# (heredoc is quoted so the shell does not expand $schema)
cat << 'EOT' >> "$temp_file"
{
"$schema": "https://raw.githubusercontent.com/amtrack/sfdx-browserforce-plugin/master/src/plugins/schema.json",
"settings": {
"emailDeliverability": {
"accessLevel": "All email"
}
}
}
EOT
if [ "$3" = 'ci' ]; then
# Execute the browserforce configuration to this org
sf browserforce:apply -f "$temp_file" --target-org $3
fi
# Clean up by removing the temporary file
rm "$temp_file"src/
your-package/
preDeploy/
replacements.yml
main/
default/

// Demonstrating package directory with string replacements
{
"packageDirectories": [
{
"path": "src/your-package",
"package": "your-package",
"versionNumber": "1.0.0.NEXT"
// No specific flag needed - replacements are auto-detected
}
]
}

src-env-specific/
default/classes/APIService.cls # endpoint = "https://api-dev.example.com"
staging/classes/APIService.cls # endpoint = "https://api-staging.example.com"
production/classes/APIService.cls # endpoint = "https://api.example.com"

// Single APIService.cls with placeholder
public class APIService {
private static final String ENDPOINT = '%%API_ENDPOINT%%';
}
// preDeploy/replacements.yml
replacements:
- name: "API Endpoint"
pattern: "%%API_ENDPOINT%%"
glob: "**/APIService.cls"
environments:
default: "https://api-dev.example.com"
staging: "https://api-staging.example.com"
production: "https://api.example.com"# Install OpenCode CLI
npm install -g opencode-ai

# Analyze single package
sfp project report --package nextGen
# Analyze multiple packages
sfp project report --package core --package utils --output core-utils-analysis.md

# Analyze all packages in a domain
sfp project report --domain billing --output billing-analysis.md

# Uses defaults (provider: anthropic, model: claude-sonnet-4-20250514)
sfp project report --package nextGen --output nextgen-analysis.md
# Specify different model (if needed)
sfp project report --model claude-sonnet-4-20250514 --package core

# Must specify provider explicitly (uses claude-sonnet-4 by default)
sfp project report --provider github-copilot --package rate-changes
# Uses default model for GitHub Copilot
sfp project report --provider github-copilot --domain service

# Uses default model: anthropic.claude-sonnet-4-20250514-v1:0
sfp project report --provider amazon-bedrock --package core
# Specify different region (if not in environment)
export AWS_REGION=eu-west-1
sfp project report --provider amazon-bedrock --domain billing

# Install OpenCode CLI
npm install -g opencode-ai
# Verify installation
opencode --version

Orchestration: Data package creation and installation can be orchestrated by sfpowerscripts, which means less scripting
Simply add an entry in the package directories, providing the package's name, path, version number and type (data). Your editor may complain that the 'type' property is not allowed, but this can be safely ignored.
Export your Salesforce records to csv files using the SFDMU plugin. For more information on plugin installation, creating an export.json file, and exporting to csv files, refer to Plugin Basic > Basic Usage in SFDMU's documentation.
Data packages support the following options, through the sfdx-project.json.
Refer to this link for more details on how to add a pre/post script to data package
sfpowerscripts supports Vlocity RBC migration using the Vlocity Build Tool (vbt). sfpowerscripts automatically detects whether a data package needs to be deployed using Vlocity or SFDMU. (Please note: to enable Vlocity while preparing scratch orgs, the enableVlocity flag needs to be turned on in the pool configuration file.)
A Vlocity data package needs to have a vlocityComponents.yaml file in the root of the package directory, with the following definition.
The same package would be defined in the sfdx-project.json as follows
Release configuration is a fundamental setup that outlines the organisation of packages within a project, streamlining the different lifecycle stages of your project, such as validating, building, and deploying/releasing artifacts. In flxbl projects, a release config is used to define the concept of a domain/subdomain. This configuration is instrumental when using sfp commands, as it allows for selective operations on the packages specified by a configuration. By employing a release configuration, teams can efficiently manage a mono repository of packages across various teams.
The table below lists the options that are currently available for release configuration.
| Parameter | Required | Type | Description |
| --- | --- | --- | --- |
| releaseName | No | String | Name of the release config; in a flxbl project, this name is used as the name of the domain |
A release configuration can also contain additional options that are used by certain sfp commands to generate release definitions. These properties in a release definition alter the behaviour of artifact deployment during a release.
| | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | ❌ |
| From | January 25 | |
The project analysis command helps you analyze your Salesforce project for potential issues and provides detailed reports in various formats. This command is particularly useful for identifying issues such as duplicate components, compliance violations, hardcoded IDs and URLs, and other code quality concerns.
The analyze command serves several key purposes:
Running the various available linters across the project
Generating comprehensive analysis reports
Integrating with CI/CD pipelines for automated checks
The command provides three mutually exclusive ways to scope your analysis:
By Package: Analyze specific packages
By Domain: Analyze all packages in a domain
By Source Path: Analyze a specific directory
The command supports multiple output formats:
Markdown: Human-readable documentation format
JSON: Machine-readable format for integration with other tools
GitHub: Special format for GitHub Checks API integration
When running in GitHub Actions, the command automatically:
Creates GitHub Check runs for each analysis
Adds annotations to the code for identified issues
Provides detailed summaries in the GitHub UI
Basic analysis of all packages:
Analyze specific packages with JSON output:
Analyze with strict validation:
Generate reports in a specific directory:
| Attribute | Type | Description | Applicable Package Types |
| --- | --- | --- | --- |
| mergeMode | boolean | Enable deployment of the contents of a folder that matches the alias of the environment, using merge | source |
mergeMode adds an additional mode for deploying aliasified packages with content inheritance. During the package build, the default folder's content is merged with the subfolder that matches the target org's alias, and subfolders are able to override inherited content. This reduces metadata duplication when using aliasified packages.
Note that, in the plain aliasified layout, the AccountNumberDefault__c field is replicated in each deployment directory that matches an alias.
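As a rough sketch (file paths are illustrative assumptions), merge mode lets the shared field live once under default/ while an alias folder overrides only what differs:

src-env-specific/
├── default/                 # base content, inherited by every alias
│   └── objects/Account/fields/AccountNumberDefault__c.field-meta.xml
├── staging/                 # inherits default/, may add or override files
└── production/
    └── objects/Account/fields/AccountNumberDefault__c.field-meta.xml   # override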
The table below describes the behavior of each command when merge mode is enabled/disabled.
Before merge mode, the whole package was "force ignored." With merge mode, you have the option to allow push/pull from aliasfy packages by not ignoring the default subfolder.
| Attribute | Type | Description | Applicable Package Types |
| --- | --- | --- | --- |
| aliasfy | boolean | Enable deployment of the contents of a folder that matches the alias of the environment | source |
Aliasfy enables deployment of the subfolder in a source package that matches the alias of the target org. For example, consider a source package laid out as sketched below.
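A sketch of such a layout (folder names are illustrative; they must match your org aliases):

src-env-specific/
├── default/      # fallback when no alias matches
├── staging/      # deployed only to the org aliased 'staging'
└── production/   # deployed only to the org aliased 'production'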
During installation, only the metadata contents of the folder that matches the alias get deployed. If the alias is not found, sfp falls back to the 'default' folder. If the default folder is also not found, an error is displayed saying the default folder or alias is missing.
Both aliasfy packages and string replacements provide environment-specific deployments, but serve different purposes:
You have structural metadata differences between environments (different fields, objects, workflows, or permissions)
You need completely different files per environment (e.g., different page layouts, record types)
The entire metadata component varies by environment
You're managing environment-specific declarative configurations
Only configuration values differ between environments (endpoints, emails, feature flags)
You want to avoid duplicating entire files just to change a few values
You need to maintain single source of truth for business logic
You're dealing with code-based metadata (Apex, LWC, Aura) with environment-specific values
Large packages need both structural differences AND configuration value changes
Different teams manage different aspects of environment-specific configurations
You're gradually migrating from full file duplication to value-based replacements
Aliasfy Approach - Different permission sets per environment:
String Replacements Approach - Same code, different values:
Availability
✅
❌
From
September 2025
During artifact installation, sfp automatically applies string replacements to convert placeholder values to environment-specific values based on the target org.
When installing artifacts with embedded replacement configurations:
Org Detection: Identifies the target org alias and whether it's a sandbox or production org
Value Resolution: Determines the appropriate replacement value based on org alias or defaults
File Modification: Applies replacements to matching files before deployment
Deployment: Deploys the modified components to the target org
Replacements are resolved in this order:
Exact alias match: If the org alias matches a configured environment
Default value: For sandbox/scratch orgs when no exact match is found
Error: For production orgs without explicit configuration
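For example, with a replacement configuration like the following, sandboxes without an exact alias match receive the default value, while the prod alias resolves to its explicit entry:

```yaml
# preDeploy/replacements.yml
replacements:
  - name: "API Endpoint"
    glob: "**/*.cls"
    pattern: "%%API_URL%%"
    environments:
      default: "https://api.dev.example.com"
      prod: "https://api.example.com"
```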
During installation with replacements, you'll see:
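```
Text replacements for org alias dev (sandbox)
Total replacement configs 3
Modified APIService.cls with 2 replacements
Modified ConfigService.cls with 1 replacement
Replacements summary: 3 replacements in 2 files using dev environment
```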
To skip replacements during installation:
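```
sfp install --targetorg dev --artifactdir artifacts --no-replacements
```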
To use a custom replacement configuration file:
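```
sfp install --targetorg dev --artifactdir artifacts --replacementsoverride custom-replacements.yml
```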
Production deployments require explicit configuration for the org alias. If no configuration is found, the installation will fail:
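```
ERROR: No replacement value found for production org with alias 'prod'
Production deployment requires explicit configuration for alias 'prod' in replacement 'API Endpoint'
```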
Verify the artifact was built with replacements embedded
Check that glob patterns match your files
Ensure the org alias has configured values
Verify org alias with sf org list
Check environment resolution order
Ensure production orgs have explicit configuration
The duplicate check functionality helps identify duplicate metadata components across your Salesforce project. This analysis is crucial for maintaining clean code organization and preventing conflicts in your deployment process.
```json
{
  "path": "path--to--data--package",
  "package": "name--of-the-data package", // mandatory, when used with sfpowerscripts
  "versionNumber": "X.Y.Z.0", // 0 will be replaced by the build number passed
  "type": "data" // required
}
```

```json
{
  "path": "path--to--package",
  "package": "name--of-the-package", // mandatory, when used with sfpowerscripts
  "versionNumber": "X.Y.Z.[NEXT/BUILDNUMBER]",
  "type": "data", // required
  "aliasfy": <boolean>, // Only for source packages; deploys a subfolder whose name matches the alias of the org when using the deploy command
  "assignPermSetsPreDeployment": ["", ""],
  "assignPermSetsPostDeployment": ["", ""],
  "preDeploymentScript": <path>, // All packages
  "postDeploymentScript": <path> // All packages
}
```

```sh
# $1 package name
# $2 org
# $3 alias
# $4 working directory
# $5 package directory
sfdx force:apex:execute -f scripts/datascript.apex -u $2
```

```yaml
projectPath: src/vlocity-config # Path to the package directory
expansionPath: datapacks # Path to the folder containing vlocity attributes
autoRetryErrors: true # Additional items
```

manifest:

```json
{
  "path": "./src/vlocity-config",
  "package": "vlocity-attributes",
  "versionNumber": "1.0.0.0",
  "type": "data"
}
```

```json
// Sample package
{
  "packageDirectories": [
    {
      "path": "src/core-crm",
      "package": "core-crm",
      "versionNumber": "2.0.10.NEXT",
      "enableFHT": true,
      "enableFT": true
    }
  ]
}
```
```yaml
---
releaseName: core
pool: core_pool
includeOnlyArtifacts:
  - src-env-specific-pre
  - src-env-specific-alias-pre
  - core-crm
  - telephony-service
excludePackageDependencies:
  - Genesys Cloud for Salesforce
  - Marketing Cloud
releasedefinitionProperties:
  changelog:
    workItemFilters:
      - BRO-[0-9]{3,4}
    workItemUrl: https://bro.atlassian.net/browse
    limit: 30
```

| Parameter | Required | Type | Description |
|-----------|----------|------|-------------|
| releasedefinitionProperties.changelog.repoUrl | No | string | The URL of the version control system to push changelog files |
| releasedefinitionProperties.changelog.workItemFilters | No | array | An array of regular expressions used to identify work items in your commit messages |
| releasedefinitionProperties.changelog.workItemUrl | No | string | The generic URL of work items, to which work item codes are appended. Allows easy redirection to user stories by clicking the work-item link in the changelog |
| releasedefinitionProperties.changelog.limit | No | number | Limit the number of releases to display in the changelog markdown |
| releasedefinitionProperties.changelog.showAllArtifacts | No | boolean | Whether to show artifacts that haven't changed between releases |
| Parameter | Required | Type | Description |
|-----------|----------|------|-------------|
| releaseName | Yes | string | Name of the release config; in a flxbl project, this name is used as the name of the domain |
| pool | No | string | Name of the scratch org or sandbox pool associated with this release config during validation |
| excludeArtifacts | No | array | An array of artifacts to be excluded while creating the release definition |
| includeOnlyArtifacts | No | array | An array of artifacts that should exclusively be included while creating the release definition |
| dependencyOn | No | array | An array of packages that denotes the dependencies of this configuration. The mentioned dependencies are used for synchronization in review sandboxes |
| excludePackageDependencies | No | array | Exclude the mentioned package dependencies from the release definition |
| includeOnlyPackageDependencies | No | array | Include only the mentioned package dependencies in the release definition |
| releasedefinitionProperties | No | object | Properties that should be added to the generated release definition. See below |
| releasedefinitionProperties.skipIfAlreadyInstalled | No | boolean | Skip installation of an artifact if it's already installed in the target org |
| releasedefinitionProperties.baselineOrg | No | string | The org used to decide whether to skip installation of an artifact. Defaults to the target org when not provided |
| releasedefinitionProperties.promotePackagesBeforeDeploymentToOrg | No | string | Promote packages before they are installed into an org whose alias matches this value |
Availability: sfp-pro ✅ · sfp (community) ❌ · From: September 24

| Command | Merge mode disabled | Merge mode enabled |
|---------|---------------------|--------------------|
| build | Default merging not available | Default metadata is merged into each subfolder |
| install | Deploys the subfolder that matches the environment alias name; if there is no match, deploys default (only sandboxes). In production, if there is no match, the installation fails | Deploys the subfolder that matches the environment alias name; if there is no match, the default subfolder is deployed (including production) |
| push | Only default is pushed; nothing to push when the whole package is force-ignored | Only default is pushed |
| pull | Only default is pulled; nothing to pull when the whole package is force-ignored | Only default is pulled |
| Flag | Description | Required | Default |
|------|-------------|----------|---------|
| --exclude-linters | Comma-separated list of linters to exclude | No | [] |
| --fail-on | Linters that should cause command failure if issues are found | No | [] |
| --show-aliasfy-notes | Show notes for aliasified packages | No | true |
| --fail-on-unclaimed | Fail when duplicates are found in unclaimed packages | No | false |
| --output-format | Output format (markdown, json, github) | No | markdown |
| --report-dir | Directory for analysis reports | No | - |
| --compliance-rules | Path to compliance rules YAML file | No | config/compliance-rules.yaml |
| --generate-compliance-config | Generate sample compliance rules configuration | No | false |
Run compliance checks with custom rules:
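```
sfp project:analyze --compliance-rules config/compliance-rules.yaml --fail-on compliance
```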
| Flag | Description | Required | Default |
|------|-------------|----------|---------|
| --package, -p | The name of the package to analyze | No | - |
| --domain, -d | The domain to analyze | No | - |
| --source-path, -s | The path to analyze | No | - |
Duplicate checking scans your project's metadata components and identifies:
Components that exist in multiple packages
Components in aliasified packages
Components in unclaimed directories (not associated with a package)
The duplicate checker:
Scans all specified directories for metadata components
Creates a unique identifier for each component based on type and name
Identifies components that appear in multiple locations
Provides detailed reporting with context about each duplicate
Duplicate checking can be configured through several flags in the project:analyze command:
| Flag | Description | Default |
|------|-------------|---------|
| --show-aliasfy-notes | Show information about aliasified packages | true |
| --fail-on-unclaimed | Fail when duplicates are in unclaimed packages | false |
The duplicate check provides three types of findings:
Direct Duplicates: Same component in multiple packages
Aliasified Components: Components in aliasified packages (typically expected)
Unclaimed Components: Components in directories (such as unpackaged or src/temp) not associated with packages
The analysis uses several indicators to help you understand the results:
❌ Indicates a problematic duplicate that needs attention
⚠️ Indicates a duplicate that might be intentional (e.g., in aliasified packages)
(aliasified) Marks components in aliasified packages
(unclaimed) Marks components in unclaimed directories
Regular Scanning: Run duplicate checks regularly during development
Clean Package Structure: Keep each component in its appropriate package
Proper Package Configuration: Ensure all directories are properly claimed in sfdx-project.json
Documented Exceptions: Document cases where duplication is intentional
Some components may need to exist in multiple packages. In these cases:
Use aliasified packages when appropriate
Document the reason for duplication
Consider creating a shared package for common components
If you find components in unclaimed directories:
Add the directory to sfdx-project.json
Assign it to an appropriate package
Re-run the analysis to verify the fix
Duplicates in aliasified packages are often intentional:
Used for environment-specific configurations
Different versions of components for different contexts
Not considered errors by default
When integrating duplicate checking in your CI/CD pipeline (example commands follow this list):
Configure Failure Conditions:
Generate Reports:
Review Results:
Use generated reports for tracking
Address critical duplicates
Document accepted duplicates
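For the failure-condition and report steps above:

```
# Configure failure conditions
sfp project:analyze --fail-on duplicates --fail-on-unclaimed --output-format github

# Generate reports
sfp project:analyze --report-dir ./reports --output-format markdown
```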
False Positives
Verify package configurations in sfdx-project.json
Check if components should be in aliasified packages
Ensure directories are properly claimed
Missing Components
Verify scan scope (package, domain, or path)
Check file permissions
Validate metadata format
Availability: sfp-pro ✅ · sfp (community) ❌ · From: January 25
```json
// Demonstrating package directory with aliasfy
{
  "packageDirectories": [
    {
      "path": "src/src-env-specific-alias-post",
      "package": "src-env-specific-alias-post",
      "versionNumber": "2.0.10.NEXT",
      "aliasfy": true
    }
  ]
}
```

```
sfp project:analyze [flags]

# Scope the analysis to packages, a domain, or a path
sfp project:analyze -p core,utils
sfp project:analyze -d sales
sfp project:analyze -s ./force-app/main/default

# Generate a sample compliance rules configuration
sfp project:analyze --generate-compliance-config
```

Sample duplicate analysis report:

```markdown
# Duplicate Analysis Report

## Aliasified Package Information
- Note: Package "env-specific" is aliasified - duplicates are allowed.

## Component Analysis

### ❌ CustomObject: Account.Custom_Field__c (Duplicate Component)
- `src/package1/objects/Account.object`
- `src/package2/objects/Account.object`

### ⚠️ CustomLabel: Common_Label (Aliasified Component)
- `src/env-specific/qa/labels/Custom.labels` (aliasified)
- `src/env-specific/prod/labels/Custom.labels` (aliasified)
```

| Mode | Flag | Description |
|------|------|-------------|
| Thorough (default) | --mode=thorough | Include package dependencies, code coverage, and all test classes during full package deployments |
| Individual | --mode=individual | Ignore packages installed in the scratch org, identify the list of changed packages from the PR/merge request, and validate each of the changed packages (respecting any dependencies) using thorough validation rules |
Both validation modes support filtering packages using a release configuration file through the --releaseconfig flag. When provided, only packages defined in the release config that have changes will be validated. This is useful for:
Large monorepos with multiple domains
Focusing validation on specific package groups
Reducing validation time by limiting scope
Fast Feedback mode was initially introduced to provide quicker validation by:
Installing only changed components instead of full packages
Running selective tests based on impact analysis
Skipping coverage calculations
Skipping packages with only descriptor changes
However, this mode was deprecated and removed because:
Complexity vs. Value: The mode introduced significant complexity in determining what to test versus what to synchronize, while the time savings were inconsistent.
Improved Alternative Approach: The validation logic was enhanced to automatically differentiate between:
Packages to synchronize: Already validated packages from upstream changes (deployed but not tested)
Packages to validate: Packages with changes introduced by the current PR (deployed and tested)
This automatic differentiation provides the speed benefits of fast feedback without requiring a separate mode.
Better Options Available:
Use --ref and --baseRef flags to specify exact comparison points
Use --releaseconfig to limit validation scope
Fast Feedback (--mode=fastfeedback) - Removed in favor of automatic synchronization logic
Fast Feedback Release Config (--mode=ff-release-config) - Use --releaseconfig with standard modes instead
Thorough Release Config (--mode=thorough-release-config) - Use --mode=thorough --releaseconfig instead
Note: The current validation intelligently handles synchronized vs. validated packages automatically when you provide the `--ref` and `--baseRef` flags, achieving faster feedback without a separate mode.
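As a sketch of how this looks in practice (assuming `origin/main` is the merge base of your PR and `review-org` is a placeholder org alias):

```
sfp validate org --targetorg review-org --mode thorough \
  --ref HEAD --baseRef origin/main \
  --releaseconfig config/release.yml
```

Packages already validated upstream are synchronized (deployed without re-testing), while packages changed between baseRef and ref are fully validated.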
The following steps are orchestrated by the validate command:
When using pools:
Fetch a scratch org from the provided pools in a sequential manner
Authenticate to the scratch org using the auth URL fetched from the Scratch Org Info Object
When using a provided org:
Authenticate to the provided target org
Identify packages to validate:
Build packages that are changed by comparing the tags in your repo against the packages installed in the target org
If --releaseconfig is provided, filter packages based on the release configuration
For each package to validate:
Thorough Mode (Default):
Deploy all the built packages as source packages / data packages (unlocked packages are installed as source packages)
Trigger Apex Tests if there are any apex tests in the package
Validate test coverage of the package depending on the type:
Source packages: Each class needs to have 75% or more coverage
Individual Mode:
Ignore packages that are installed in the scratch org (eliminates the requirement of using a pooled org)
Compute changed packages by observing the diff of Pull/Merge Request
Validate each of the changed packages individually
Install any dependencies required for each package
By default, all apex tests are triggered in parallel with automated retry. Tests that fail due to SOQL locks or other transient issues are automatically retried synchronously. You can override this behavior:
--disableparalleltesting: Forces all tests to run synchronously
--skipTesting: Skip test execution entirely (use with caution)
Default coverage threshold: 75%
Configure custom threshold: --coveragepercent <value>
Coverage is validated per class for source packages and per package for unlocked packages
Skipping tests with --skipTesting bypasses critical quality checks. Only use this option in development environments or when you're certain the changes don't require test validation.
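For example, to raise the coverage bar and force synchronous test execution (a sketch using the flags above; `ci-org` is a placeholder alias):

```
sfp validate org --targetorg ci-org --coveragepercent 80 --disableparalleltesting
```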
Thorough (Default): Include package dependencies, code coverage, and all test classes during full package deployments. This is the recommended mode for comprehensive validation.
Unlocked/Org Dependent Unlocked Packages
There is a huge amount of documentation on unlocked packages. Here is a list of curated links that can help you get started and learn more about unlocked packages.
The Basics
Advanced Materials
The following sections deal with the operational aspects of working with unlocked packages.

Unlocked packages, excluding org-dependent unlocked packages, have a mandatory test coverage requirement: each package must have a minimum of 75% coverage. A validated build (a build in sfp) checks the coverage of a package during the build phase. To surface this feedback earlier in the process, sfp also provides functionality to validate the test coverage of a package, for instance during the pull request validation process.
For unlocked packages, we ask users to follow semantic versioning of packages.
Note that an unlocked package must be promoted before it can be installed to a production org, and either the major, minor, or patch (not build) version must be higher than the last version of this package that was promoted. These version number changes should be made in the sfdx-project.json file before the final package build and promotion.
Unlocked packages provide traceability in the org by locking down metadata components to the package that introduced them. This feature, which is the main benefit of unlocked packages, can also create issues when you want to refactor components from one package to another. Let's look at some scenarios and the common strategies to apply.
Consider a project that has two packages, Package A and Package B, where Package B is dependent on Package A.
Remove a component from Package A, provided the component has no dependencies
Solution: Create a new version of Package A with the metadata component removed and install the package.
Move a metadata component from Package A to Package B
Solution: This scenario is straightforward: remove the metadata component from Package A and move it to Package B. When the new version of Package A gets installed, the following happens:
If the deployment of the unlocked package is set to mixed, and no other metadata component is dependent on the component, the component gets deleted.
Move a metadata component from Package B to Package A, where the component currently has other dependencies in Package B
Solution: In this scenario, one can move the component to Package A and get the packages built. However, during deployment to an org, Package A will fail with an error that the component already exists in Package B. To mitigate this, one should do the following:
Deploy a version of Package B which removes the lock on the metadata component using deprecate mode. Sometimes this needs extensive refactoring of other components to break the dependencies, so evaluate whether the approach will work.
Package dependencies are defined in the sfdx-project.json. More information on defining package dependencies can be found in the Salesforce documentation.
Let's unpack the concepts utilizing the above example:
There are two unlocked packages
Expense Manager - Util is an unlocked package in your DevHub, identifiable by the 0H prefix in packageAliases
Expense Manager - another unlocked package, which depends on 'Expense Manager - Util', 'TriggerFramework', and 'External Apex Library - 1.0.0.4'
External Apex Library is an external dependency; it could be a managed package or any unlocked package released on a different Dev Hub. All external package dependencies must be defined with a 04t ID, which can be determined from the installation URL on AppExchange or by contacting your vendor.
Unlocked packages have two build modes: one that skips validation of dependencies and one that does not. A package built with skipped validation cannot be promoted and deployed into production, while a fully validated build can take a long time. sfp tries to build packages in parallel by understanding your dependencies; however, some of your packages could still spend significant time in validation.
In these situations, consider whether the average time taken to build all validated packages is within your build budget. If not, here are your options:
Org-dependent unlocked packages: a variant of unlocked packages that does not validate the dependencies of a package and therefore builds faster. However, note that in every org where the earlier unlocked package was installed, the package has to be deprecated and the component locks removed before the new org-dependent unlocked package can be installed.
Source packages: use these as a last resort, as source packages have fairly loose lifecycle management. Create a source package and move the metadata and any associated dependencies over to that particular package.
Availability: sfp-pro ✅ · sfp (community) ✅
The sfp dependency:expand command enriches your project's dependency declarations by automatically adding all transitive dependencies to your sfdx-project.json. This ensures that each package explicitly declares all its dependencies, both direct and indirect, making the dependency graph complete and explicit.
When you run sfp dependency:expand, it:
Analyzes all packages in your project to identify their direct dependencies
Resolves transitive dependencies by recursively finding dependencies of dependencies
Updates each package to include both direct and transitive dependencies
Maintains proper ordering ensuring dependencies are listed in topological order
Expanding dependencies is useful for:
Explicit dependency management: Makes all dependencies visible in the project configuration
Build optimization: Helps build tools understand the complete dependency graph
Troubleshooting: Easier to identify dependency-related issues when all dependencies are explicit
CI/CD pipelines: Ensures all required dependencies are known upfront
Without --overwrite, creates a new file at ./project-config/sfdx-project.exp.json
With --overwrite, backs up existing sfdx-project.json to ./project-config/sfdx-project.json.bak and overwrites the original
The expand and shrink commands are complementary:
Expand: Adds all transitive dependencies, making them explicit
Shrink: Removes redundant transitive dependencies, keeping only direct ones
Typical workflow:
Use expand during development to understand full dependency graphs
Use shrink before committing to maintain clean, minimal dependency declarations
When multiple packages require different versions of the same dependency, expand automatically selects the highest version that satisfies all requirements. For example:
Package A requires base-package version 1.0.0
Package B requires base-package version 2.0.0
Result: both will get base-package version 2.0.0
External dependencies (managed packages from other Dev Hubs) defined in your externalDependencyMap are:
Automatically included in the expansion process
Preserved with their version specifications
Added to packages that transitively depend on them
The command validates all dependencies exist in the project
Circular dependencies are detected and reported as errors
External dependencies must be properly configured in the plugins.sfp.externalDependencyMap section
The expanded configuration can become large for projects with many packages
- Remove redundant transitive dependencies
- Understand package dependencies
This guide helps organizations set up automated synchronization of sfp pro images from Flxbl's registry to their own container registry, with optional customization capabilities
Validation scripts allow you to execute custom logic at specific points during the validation process. These global-level scripts provide hooks for setup, cleanup, reporting, and integration with external systems during validation workflows.
The sfp project:push command deploys source from your local project to a specified Salesforce org. It can push changes based on a package, domain, or specific source path. This command is useful for deploying local changes to your Salesforce org.
```
# Validate with release config filtering
sfp validate org --targetorg myorg --mode thorough --releaseconfig config/release.yml
```

Use --skipTesting when tests aren't needed
Use individual mode for isolated package validation
Unlocked packages: Package as a whole needs to have 75% or more coverage
Apply thorough validation rules (deployment, testing, coverage)
On the subsequent install of Package B, Package B restores the field and takes ownership of the component.
If not, you can go to the UI (Setup > Packaging > Installed Packages > <Name of Package> > View Components and Remove) and remove the lock for a package.
Preserves external dependencies defined in your project configuration
Maintains topological ordering of dependencies
Handles version conflicts by selecting the highest version required
| Flag | Description | Required |
|------|-------------|----------|
| -o, --overwrite | Overwrites the existing sfdx-project.json file with the expanded configuration | No |
| -v, --targetdevhubusername | Username or alias of the target Dev Hub org | Yes |
| --loglevel | Logging level (trace, debug, info, warn, error, fatal) | No |
Reduced external dependencies during CI/CD runs
Ability to add organization-specific customizations
Improved pull performance from your own registry
Compliance with internal security policies
Create a GitHub repository in your organization specifically for Docker image management (e.g., docker-images or sfp-docker).
Add the following secrets to your repository (Settings → Secrets and variables → Actions):
| Secret | Description | Where to get it |
|--------|-------------|-----------------|
| GITEA_USER | Your Gitea username | From your welcome email |
| GITEA_PAT | Personal Access Token for Gitea | Generate at source.flxbl.io (Settings → Applications → Personal Access Tokens) with read:package permission |
Create .github/workflows/sync-sfp-pro.yml in your repository:
If you need to add organization-specific tools or configurations, create a Dockerfile:
Then modify the workflow to build and push your custom image:
Update your project workflows to use images from your registry:
After running the workflow, verify the synchronization:
If you encounter authentication errors:
Verify your PAT has read:package permission
Check that secrets are correctly set in repository settings
Ensure your Gitea username is correct
If the source image cannot be pulled:
Check the version exists at https://source.flxbl.io/flxbl/-/packages/container/sfp-pro/
Verify your network can reach source.flxbl.io
Confirm your credentials are valid
Ensure the workflow has packages: write permission
Verify the repository name in IMAGE_PREFIX is correct
Check GitHub Packages settings for your repository
Note: The GITHUB_TOKEN provided by GitHub Actions has the necessary permissions to create checks. This is why it works automatically in GitHub Actions but requires special setup in other CI platforms (see below).
The command automatically:
✅ Detects it's running in a PR context
✅ Fetches changed files from the PR
✅ Creates GitHub Checks with results
✅ Adds annotations to files with issues
If you're using a different CI platform but want to push checks to GitHub, you need to manually provide environment variables.
| Variable | Description | Example |
|----------|-------------|---------|
| GITHUB_ACTIONS | Set to "true" to enable GitHub mode | true |
| GITHUB_EVENT_NAME | Event type that triggered the workflow | pull_request |
| GITHUB_REPOSITORY | Repository in format owner/repo | owner/repo |
| GITHUB_EVENT_PATH | Path to file containing PR event data | /tmp/pr-event.json |
| GITHUB_SHA | Commit SHA to attach check to | abc123... |
| GITHUB_TOKEN | GitHub token with checks:write permission | ghs_xxxxx |
| GITHUB_RUN_ID | Optional: run ID for the details URL | 12345 |

| Flag | Description | Example |
|------|-------------|---------|
| --base-ref | Base commit/branch for comparison | main or abc123... |
| --head-ref | Head commit/branch for comparison | HEAD or def456... |
The GITHUB_EVENT_PATH should point to a JSON file with this structure:
IMPORTANT: Creating GitHub Checks requires a GitHub App installation token, not a regular GITHUB_TOKEN.
If you have sfp server installed, use it to generate installation tokens:
The token from sfp server has the required GitHub App permissions:
checks:write - Create/update checks
contents:read - Read repository contents
pull_requests:read - Read PR information
If you don't have sfp server, you need to:
Create your own GitHub App with the permissions listed above
Install it on your repository/organization
Generate an installation token using your own tooling
Set GITHUB_TOKEN to that installation token
Note: Regular GitHub Actions GITHUB_TOKEN or Personal Access Tokens will NOT work for creating checks.
Test your configuration locally:
Expected output:
Solution: Verify GITHUB_ACTIONS=true and GITHUB_EVENT_NAME=pull_request are set.
Solution: Ensure GITHUB_REPOSITORY, GITHUB_SHA, and GITHUB_EVENT_PATH are set.
Solution: Set GITHUB_TOKEN environment variable with a valid token.
Solution: Provide correct --base-ref and --head-ref flags. In PR contexts, use the actual base/head SHAs, not just HEAD.
Source Packages are metadata deployments from a Salesforce perspective - they are groups of components that are deployed to an org as a unit. Unlike Unlocked packages which are First Class Salesforce deployment constructs with lifecycle governance (versioning, component locking, automated dependency validation), source packages provide more flexibility at the cost of some safety guarantees.
While we generally recommend using unlocked packages over source packages for production code, source packages excel in several scenarios:
Application Configuration: Configuring applications delivered by managed packages (e.g., changes to help text, field descriptions)
Org-Specific Metadata: Global or org-specific components (queues, profiles, permission sets, custom settings)
Environment-Specific Configuration: Components that vary significantly across environments
Composite UI Layouts: Complex UI configurations that don't package well
Rapid Development: When iteration speed is more important than package versioning
Large Monolithic Applications: When breaking into unlocked packages is not feasible
Starting Package Development: Teams beginning their journey to package-based development
Known bugs or limitations with unlocked package deployment
Unlocked Package validation taking too long (consider org-dependent as alternative)
Need for destructive changes support
Requirement for environment-specific text replacements
Deployment Speed: Significantly faster deployment times (no package creation/validation overhead)
Testing Control: Flexible test execution - can skip tests in sandboxes for faster iterations (see Testing Behavior)
Environment Management: Support for aliasified folders and text replacements for environment-specific configurations
Destructive Changes: Full support for pre and post-destructive changes
Metadata Coverage: Supports all metadata types, including those not supported by unlocked packages
No Version Constraints: No need to manage package versions or dependencies at the platform level
Quick Fixes: Can deploy hotfixes immediately without package creation
Development Flexibility: Easier to refactor and reorganize code
No Component Locking: Salesforce org is unaware of package boundaries - components can be overwritten by other packages
No Platform Validation: Dependencies are not validated by Salesforce
No Automated Rollback: Cannot easily rollback to previous versions
Manual Destructive Changes: Must be explicitly managed through pre/post-destructive folders
No Package Lifecycle: No built-in versioning, upgrade, or deprecation paths
Dependency Risks: Dependencies only validated at deployment time
Production Risk: Higher risk of deployment failures in production
Metadata components that commonly get overridden across packages:
Profiles and Permission Sets
Custom Settings
Global Value Sets
Source packages in sfp-pro support two powerful mechanisms for managing environment-specific differences:
Deploy different metadata variants based on org aliases:
See Aliasified Packages for details.
Replace configuration values during deployment without file duplication:
See String Replacements for details.
Source packages fully support destructive changes through dedicated folders:
pre-destructive/: Components deleted before deployment
post-destructive/: Components deleted after deployment
Destructive changes are automatically detected and processed during package installation.
Source packages have the following test execution control:
Development/Sandbox: Tests skipped by default for faster iterations
Production: Tests always run & coverage validated (Salesforce requirement)
Override Options:
sfp install --runtests: Force test execution while installing a package to a sandbox
Package-level skipTesting in sfdx-project.json (ignored in production, where tests are always executed and each individual class needs coverage of 75% or more)
This provides significant performance improvements during development while maintaining production safety.
Source packages can depend on other unlocked packages or managed packages. Dependencies are validated at deployment time, meaning the dependent metadata must already exist in the target org.
For development in scratch orgs, you can add dependencies to enable automatic installation:
sfp commands like prepare and validate will automatically install dependencies before deploying the source package.
Use unlocked packages for shared libraries that need version control and component protection
Use source packages for environment-specific configuration and org-specific metadata
Organize large source packages into logical domains for better maintainability
Leverage aliasified packages for structural differences between environments
Use text replacements for configuration values that change across environments
Document destructive changes clearly in your release notes
Test in lower environments before production deployment
Consider org-dependent unlocked packages as a middle ground when validation time is a concern
Identify logical groupings of metadata
Create source package entries in sfdx-project.json
Move metadata into package directories
Define dependencies between packages
Test deployment order
Start with source packages to organize metadata
Identify stable components suitable for packaging
Gradually convert source packages to unlocked packages
Keep environment-specific components as source packages
Availability: sfp-pro ✅ · sfp (community) ✅
{
"packageDirectories": [
{
"path": "util",
"default": true,
"package": "Expense-Manager-Util",
"versionName": "Winter ‘20",
"versionDescription": "Welcome to Winter 2020 Release of Expense Manager Util Package",
"versionNumber": "4.7.0.NEXT"
},
{
"path": "exp-core",
"default": false,
"package": "ExpenseManager",
"versionName": "v 3.2",
"versionDescription": "Winter 2020 Release",
"versionNumber": "3.2.0.NEXT",
"dependencies": [
{
"package": "ExpenseManager-Util",
"versionNumber": "4.7.0.LATEST"
},
{
"package": "TriggerFramework",
"versionNumber": "1.7.0.LATEST"
},
{
"package": "External Apex Library - 1.0.0.4"
}
]
}
],
"sourceApiVersion": "47.0",
"packageAliases": {
"TriggerFramework": "0HoB00000004RFpLAM",
"Expense Manager - Util": "0HoB00000004CFpKAM",
"External Apex [email protected]": "04tB0000000IB1EIAW",
"Expense Manager": "0HoB00000004CFuKAM"
}
}

```
# Create an expanded version of the project configuration
sfp dependency:expand

# Overwrite the existing sfdx-project.json with expanded dependencies
sfp dependency:expand --overwrite
```

Before expansion:

{
"packageDirectories": [
{
"path": "packages/feature-a",
"package": "feature-a",
"dependencies": [
{ "package": "core-package" }
]
},
{
"path": "packages/feature-b",
"package": "feature-b",
"dependencies": [
{ "package": "feature-a" }
]
},
{
"path": "packages/core-package",
"package": "core-package",
"dependencies": [
{ "package": "base-package" }
]
},
{
"path": "packages/base-package",
"package": "base-package",
"dependencies": []
}
]
}

After expansion:

{
"packageDirectories": [
{
"path": "packages/feature-a",
"package": "feature-a",
"dependencies": [
{ "package": "base-package" }, // Transitive dependency added
{ "package": "core-package" } // Direct dependency
]
},
{
"path": "packages/feature-b",
"package": "feature-b",
"dependencies": [
{ "package": "base-package" }, // Transitive dependency added
{ "package": "core-package" }, // Transitive dependency added
{ "package": "feature-a" } // Direct dependency
]
},
{
"path": "packages/core-package",
"package": "core-package",
"dependencies": [
{ "package": "base-package" } // Direct dependency
]
},
{
"path": "packages/base-package",
"package": "base-package",
"dependencies": []
}
]
}

name: Sync SFP Pro Images
on:
workflow_dispatch:
inputs:
sfp_version:
description: 'SFP Pro version (leave empty for latest)'
required: false
type: string
include_sf_cli:
description: 'Also sync SF CLI variant'
required: false
type: boolean
default: true
env:
REGISTRY: ghcr.io
IMAGE_PREFIX: ${{ github.repository }}
jobs:
sync-images:
runs-on: ubuntu-latest
permissions:
contents: read
packages: write
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Login to source.flxbl.io
run: |
echo "${{ secrets.GITEA_PAT }}" | docker login source.flxbl.io \
-u ${{ secrets.GITEA_USER }} \
--password-stdin
- name: Login to GitHub Container Registry
run: |
echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io \
-u ${{ github.actor }} \
--password-stdin
- name: Determine version
id: version
run: |
if [ -n "${{ github.event.inputs.sfp_version }}" ]; then
echo "version=${{ github.event.inputs.sfp_version }}" >> $GITHUB_OUTPUT
else
# Fetch latest version from your version strategy
echo "version=latest" >> $GITHUB_OUTPUT
fi
- name: Sync base SFP-Pro Lite image
run: |
SOURCE_IMAGE="source.flxbl.io/flxbl/sfp-pro-lite:${{ steps.version.outputs.version }}"
TARGET_IMAGE="${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}/sfp-pro-lite"
docker pull ${SOURCE_IMAGE}
docker tag ${SOURCE_IMAGE} ${TARGET_IMAGE}:${{ steps.version.outputs.version }}
docker tag ${SOURCE_IMAGE} ${TARGET_IMAGE}:latest
docker push ${TARGET_IMAGE}:${{ steps.version.outputs.version }}
docker push ${TARGET_IMAGE}:latest
- name: Sync SFP-Pro with SF CLI image
if: github.event.inputs.include_sf_cli == 'true'
run: |
SOURCE_IMAGE="source.flxbl.io/flxbl/sfp-pro:${{ steps.version.outputs.version }}"
TARGET_IMAGE="${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}/sfp-pro"
docker pull ${SOURCE_IMAGE}
docker tag ${SOURCE_IMAGE} ${TARGET_IMAGE}:${{ steps.version.outputs.version }}
docker tag ${SOURCE_IMAGE} ${TARGET_IMAGE}:latest
docker push ${TARGET_IMAGE}:${{ steps.version.outputs.version }}
docker push ${TARGET_IMAGE}:latest

ARG BASE_VERSION=latest
FROM source.flxbl.io/flxbl/sfp-pro-lite:${BASE_VERSION}
# Add your customizations
RUN apt-get update && apt-get install -y \
jq \
your-custom-tools \
&& rm -rf /var/lib/apt/lists/*
# Copy custom scripts or configurations
# COPY scripts/ /usr/local/bin/
# COPY config/ /etc/your-app/

ARG BASE_VERSION=latest
FROM source.flxbl.io/flxbl/sfp-pro:${BASE_VERSION}
# Your customizations here

- name: Build and push custom image
run: |
docker build \
--build-arg BASE_VERSION=${{ steps.version.outputs.version }} \
-f Dockerfile \
-t ${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}/sfp-pro-custom:${{ steps.version.outputs.version }} \
-t ${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}/sfp-pro-custom:latest \
.
docker push ${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}/sfp-pro-custom:${{ steps.version.outputs.version }}
docker push ${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}/sfp-pro-custom:latest

jobs:
build:
runs-on: ubuntu-latest
container:
image: ghcr.io/your-org/docker-images/sfp-pro:latest
credentials:
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}

image: ghcr.io/your-org/docker-images/sfp-pro:latest
before_script:
- echo "$CI_REGISTRY_PASSWORD" | docker login ghcr.io -u "$CI_REGISTRY_USER" --password-stdinresources:
containers:
- container: sfp
image: ghcr.io/your-org/docker-images/sfp-pro:latest
endpoint: your-service-connection
jobs:
- job: Build
container: sfp

# List available images in your registry
docker search ghcr.io/your-org/docker-images
# Pull and test the synchronized image
docker pull ghcr.io/your-org/docker-images/sfp-pro:latest
docker run --rm ghcr.io/your-org/docker-images/sfp-pro:latest sfp --version

# .github/workflows/pr-analysis.yml
name: PR Analysis
on: pull_request
jobs:
analyze:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Run Project Analysis
run: sfp project:analyze
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

{
"number": 123,
"pull_request": {
"number": 123,
"base": {
"sha": "base-commit-sha"
},
"head": {
"sha": "head-commit-sha"
}
}
}

# Get installation token for your repository
sfp server repository auth-token \
--repository owner/repo \
--sfp-server-url https://your-server-url \
--email user@example.com \
--json
# Use the returned token
export GITHUB_TOKEN=<token-from-above>

# Step 1: Get GitHub App installation token from sfp server
GITHUB_TOKEN=$(sfp server repository auth-token \
--repository owner/repo \
--sfp-server-url https://your-server-url \
--email user@example.com \
--json | jq -r '.token')
# Step 2: Set environment variables
export GITHUB_TOKEN
export GITHUB_ACTIONS=true
export GITHUB_EVENT_NAME=pull_request
export GITHUB_REPOSITORY=owner/repo
export GITHUB_SHA=your-commit-sha
export GITHUB_EVENT_PATH=/tmp/pr-event.json
# Step 3: Create event file
cat > /tmp/pr-event.json <<EOF
{
"number": 123,
"pull_request": {
"number": 123,
"base": { "sha": "base-sha" },
"head": { "sha": "head-sha" }
}
}
EOF
# Step 4: Run analysis
sfp project:analyze \
--base-ref base-sha \
--head-ref head-sha

✅ Detected PR context #123
✅ Retrieved X changed files from PR #123
✅ Creating check for [linter]...
✅ Successfully created GitHub check: https://github.com/owner/repo/pull/123/checks

Error messages you may encounter (each corresponds to a solution above):

Skipping check creation - not supported in this environment
Cannot create GitHub check: Missing GitHub context information
Failed to get auth token: Neither App credentials nor personal access token found
Changes: +0 -0 lines

src-env-specific/
└── main/
├── default/ # Fallback for sandboxes
│ └── classes/
├── dev/ # Dev-specific metadata
│ └── classes/
└── prod/ # Production metadata
    └── classes/

# preDeploy/replacements.yml
replacements:
- name: "API Endpoint"
glob: "**/*.cls"
pattern: "%%API_URL%%"
environments:
default: "https://api.dev.example.com"
prod: "https://api.example.com"my-package/
├── main/
│ └── default/
├── pre-destructive/ # Deleted before main deployment
│ └── objects/
└── post-destructive/ # Deleted after main deployment
    └── classes/

{
"packageDirectories": [
{
"path": "src/my-package",
"package": "my-package",
"versionNumber": "1.0.0.NEXT",
"dependencies": [
{"package": "another-source-package"},
{"package": "[email protected]"},
{"subscriberPackageVersionId": "04t..."}
]
}
]
}

Add script paths to your sfdx-project.json file:
Scripts receive three arguments in this order:
Context File Path - Absolute path to temporary JSON file containing validation context
Target Org - Username of the target organization for validation
Hub Org - Username of the hub organization (empty string "" if not available)
The context file contains information about packages ready for validation:
The context file includes all pre-validation data plus validation results:
| Hook | On failure | Timeout | Typical uses |
|------|------------|---------|--------------|
| Pre-validation | Halts the validation process | 30 minutes | Set up test data, configure environments, validate prerequisites |
| Post-validation | Logged as a warning; validation continues | 30 minutes | Clean up resources, send notifications, generate reports |
Make scripts executable: chmod +x scripts/pre-validate.sh
Use set -e: Exit on errors to ensure proper failure handling
Parse JSON safely: Use jq for reliable JSON parsing
Handle missing data: Check if fields exist before using them
Log clearly: Scripts appear in validation logs with CI/CD folding
Keep scripts fast: Remember the 30-minute timeout limit
Test locally: Validate script behavior before committing
Set up test data specific to validation scenarios
Configure external API endpoints for testing
Validate prerequisites (licenses, feature flags, etc.)
Initialize monitoring or logging for the validation process
Clean up test data created during validation
Send notifications to Slack, Teams, or other systems
Generate custom reports or metrics
Update external tracking systems with validation results
Archive validation artifacts or logs
Availability: sfp-pro ✅ · sfp (community) ✅ · From: Aug 25 - 02 (sfp-pro) / December 25 (community)
| Flag | Description | Required |
|------|-------------|----------|
| -o, --targetusername | Username or alias of the target org | Yes |
| -p, --package | Name of the package to push | No |
| -d, --domain | Name of the domain to push | No |
| -s, --source-path | Path to the local source files to push | No |
The -p, -d, and -s flags are mutually exclusive. Use only one to specify the scope of the push operation.
--ignore-conflicts: Use this flag to override conflicts and push changes to the org, potentially overwriting org metadata.
--no-replacements: Disables automatic text replacements. By default, sfp applies configured replacements from preDeploy/replacements.yml.
--replacementsoverride: Specify a custom YAML file containing replacement configurations to use instead of the default.
--json: When specified, the command outputs a structured JSON object with detailed information about the push operation, including replacement details.
Source tracking is a feature that keeps track of the changes made to metadata both in your local project and in the org. When source tracking is enabled, the project:push command can more efficiently deploy only the changes made locally since the last sync, rather than deploying all metadata.
When pushing to a source-tracked org without specifying a package, domain, or source path, the command will use source tracking to deploy only the local changes.
For non-source-tracked orgs or when a specific scope is provided (via -p, -d, or -s flags), the command will deploy all metadata within the specified scope.
Source tracking provides faster and more efficient deployment of changes, especially in large projects.
Source tracking is not available for all org types. It's primarily used with scratch orgs and some sandbox orgs.
If source tracking is not enabled or supported, the project:push command will fall back to deploying all metadata within the specified scope.
The push command automatically applies text replacements to convert placeholder values in your source files to environment-specific values before deployment. This feature helps manage environment-specific configurations without modifying source files.
For detailed information about string replacements, see String Replacements.
If your source contains placeholders:
During push to a dev org, it becomes:
To skip replacements:
Push changes using source tracking (if available):
Push changes for a specific package:
Push changes for a specific domain:
Push changes from a specific source path:
Push changes and ignore conflicts:
When --json is specified, the command outputs a JSON object with the following structure:
The replacements field (available in sfp-pro) provides detailed information about text replacements applied during the push operation.
If an error occurs during the push operation, the command will throw an error with details about what went wrong. Use the --json flag to get structured error information in the output.
Availability: sfp-pro ✅ · sfp (community) ❌ · From: August 24

Availability: sfp-pro ✅ · sfp (community) ❌ · From: August 24
The sfp project:pull command retrieves source from a Salesforce org and updates your local project files. It can pull changes based on a package, domain, or specific source path. This command is useful for synchronizing your local project with the latest changes in your Salesforce org.
Source tracking is a feature in Salesforce development that keeps track of the changes made to metadata both in your local project and in the org. When source tracking is enabled, the project:pull command can more efficiently retrieve only the changes made in the org since the last sync, rather than retrieving all metadata.
Source tracking maintains a history of changes in both your local project and the Salesforce org.
It allows sfp to determine which components have been added, modified, or deleted since the last synchronization.
This feature is automatically enabled for scratch orgs and can be enabled for non-scratch orgs that support it.
When pulling from a source-tracked org without specifying a package, domain, or source path, the project:pull command will use source tracking to retrieve only the changes made in the org.
For non-source-tracked orgs or when a specific scope is provided (via -p, -d, or -s flags), the command will retrieve all metadata within the specified scope.
Source tracking provides faster and more efficient retrieval of changes, especially in large projects.
Source tracking is not available for all org types. It's primarily used with scratch orgs and some sandbox orgs.
If source tracking is not enabled or supported, the project:pull command will fall back to retrieving all metadata within the specified scope.
The -p, -d, and -s flags are mutually exclusive. Use only one to specify the scope of the pull operation.
--ignore-conflicts: Use this flag to override conflicts and pull changes from the org, potentially overwriting local changes.
--retrieve-path: Path where the retrieved source should be placed.
Pull changes using source tracking (if available):
Pull changes for a specific package:
Pull changes for a specific domain:
Pull changes from a specific source path:
Pull changes and ignore conflicts:
Pull changes without applying reverse replacements:
Pull changes with custom replacement configuration:
The pull command automatically applies reverse text replacements to convert environment-specific values back to placeholders when retrieving source from the org. This feature helps maintain clean, environment-agnostic code in your repository.
For detailed information about string replacements, see String Replacements.
When you pull changes from an org, sfp automatically:
Detects known values: Identifies environment-specific values that match your replacement configurations
Converts to placeholders: Replaces these values with their placeholder equivalents
Suggests new patterns: Detects potential patterns that could be added to your replacements
If your org contains:
After pulling, it becomes:
During pull operations, sfp analyzes retrieved code for patterns that might benefit from replacements:
To skip reverse replacements:
When --json is specified, the command outputs a JSON object with the following structure:
The replacements field (available in sfp-pro) provides detailed information about reverse text replacements applied during the pull operation, including any pattern suggestions detected.
If an error occurs during the pull operation, the command will throw an error with details about what went wrong. Use the --json flag to get structured error information in the output.

All packages start out as a directory in your repo!
A package is a collection of metadata grouped together in a directory, and defined by an entry in your sfdx-project.json (Project Manifest).
Each package in sfp must have the following attributes as the minimum:
{
"plugins": {
"sfp": {
"validateScripts": {
"preValidation": "./scripts/pre-validate.sh",
"postValidation": "./scripts/post-validate.sh"
}
}
}
}

# Example script invocation:
./scripts/pre-validate.sh /tmp/sfp-validate-pre-1234567890.json target-org@example.com devhub@example.com

{
"phase": "pre-validation",
"targetOrg": "scratch-org-username",
"hubOrg": "devhub-username",
"validationMode": "thorough",
"packages": [
{
"name": "core-crm",
"version": "2.1.0.NEXT",
"type": "source",
"isChanged": true
}
]
}

{
"phase": "post-validation",
"targetOrg": "scratch-org-username",
"hubOrg": "devhub-username",
"validationMode": "thorough",
"packages": [/* same as pre-validation */],
"validationResults": {
"status": "success",
"deployedPackages": ["core-crm", "shared-utils"],
"failedPackages": [],
"error": undefined
}
}

#!/bin/bash
set -e
CONTEXT_FILE="$1"
TARGET_ORG="$2"
HUB_ORG="$3"
echo "🔧 Pre-validation setup starting..."
# Parse context
PACKAGES=$(cat "$CONTEXT_FILE" | jq -r '.packages[].name' | tr '\n' ' ')
VALIDATION_MODE=$(cat "$CONTEXT_FILE" | jq -r '.validationMode')
echo "📦 Packages to validate: $PACKAGES"
echo "🎯 Validation mode: $VALIDATION_MODE"
echo "🏢 Target org: $TARGET_ORG"
# Custom setup logic
if [ "$VALIDATION_MODE" = "thorough" ]; then
echo "🔧 Setting up comprehensive validation environment..."
# Setup test data, configure external systems, etc.
fi
# Example: Notify external systems
curl -X POST "https://internal-api.company.com/validation/started" \
-H "Content-Type: application/json" \
-d "{\"packages\": \"$PACKAGES\", \"targetOrg\": \"$TARGET_ORG\"}"
echo "✅ Pre-validation setup completed"#!/bin/bash
CONTEXT_FILE="$1"
TARGET_ORG="$2"
HUB_ORG="$3"
echo "🏁 Post-validation processing starting..."
# Parse results
STATUS=$(cat "$CONTEXT_FILE" | jq -r '.validationResults.status')
DEPLOYED=$(cat "$CONTEXT_FILE" | jq -r '.validationResults.deployedPackages[]' | tr '\n' ' ')
FAILED=$(cat "$CONTEXT_FILE" | jq -r '.validationResults.failedPackages[]' | tr '\n' ' ')
echo "📊 Validation status: $STATUS"
echo "✅ Deployed packages: $DEPLOYED"
if [ "$STATUS" = "failed" ]; then
echo "❌ Failed packages: $FAILED"
ERROR=$(cat "$CONTEXT_FILE" | jq -r '.validationResults.error // "Unknown error"')
echo "🔍 Error details: $ERROR"
# Notify failure
curl -X POST "https://internal-api.company.com/validation/failed" \
-H "Content-Type: application/json" \
-d "{\"error\": \"$ERROR\", \"failedPackages\": \"$FAILED\"}"
else
echo "🎉 Validation successful!"
# Notify success
curl -X POST "https://internal-api.company.com/validation/success" \
-H "Content-Type: application/json" \
-d "{\"deployedPackages\": \"$DEPLOYED\"}"
fi
echo "✅ Post-validation processing completed"sfp project:push -o <org> [flags]private static final String API_URL = '%%API_ENDPOINT%%';private static final String API_URL = 'https://api-dev.example.com';sfp push -p myPackage -o myOrg --no-replacementssfp project:push -o myOrgsfp project:push -o myOrg -p myPackagesfp project:push -o myOrg -d myDomainsfp project:push -o myOrg -s force-app/main/defaultsfp project:push -o myOrg -i{
"hasError": boolean,
"errorMessage": string,
"errors": [
{
"Name": string,
"Type": string,
"Status": string,
"Message": string
}
],
"conflicts": [
{
"fullName": string,
"type": string,
"filePath": string,
"state": string
}
],
"replacements": {
"success": boolean,
"packageName": string,
"filesModified": [
{
"path": string,
"replacements": [
{
"pattern": string,
"value": string,
"count": number
}
],
"totalCount": number
}
],
"totalFiles": number,
"totalReplacements": number,
"errors": [],
"orgAlias": string
}
| Flag | Description | Required |
|------|-------------|----------|
| -i, --ignore-conflicts | Ignore conflicts during push | No |
| --no-replacements | Skip text replacements during push | No |
| --replacementsoverride | Path to override replacements file | No |
| --json | Format output as JSON | No |
| --loglevel | Logging level | No |


| Flag | Description | Required |
|------|-------------|----------|
| -r, --retrieve-path | Path where the retrieved source should be placed | No |
| -i, --ignore-conflicts | Ignore conflicts during pull | No |
| --no-replacements | Skip text replacements during pull | No |
| --replacementsoverride | Path to override replacements file | No |
| --json | Format output as JSON | No |
| --loglevel | Logging level | No |
--no-replacements: Disables automatic text replacements. By default, sfp applies reverse replacements to convert environment-specific values back to placeholders.
--replacementsoverride: Specify a custom YAML file containing replacement configurations to use instead of the default.
--json: When specified, the command outputs a structured JSON object with detailed information about the pull operation, including replacement details and pattern suggestions.
| Flag | Description | Required |
|------|-------------|----------|
| -o, --targetusername | Username or alias of the target org | Yes |
| -p, --package | Name of the package to pull | No |
| -d, --domain | Name of the domain to pull | No |
| -s, --source-path | Path to the local source files to pull | No |
sfp project:pull -o <org> [flags]

sfp project:pull -o myOrg
sfp project:pull -o myOrg -p myPackage
sfp project:pull -o myOrg -d myDomain
sfp project:pull -o myOrg -s force-app/main/default
sfp project:pull -o myOrg -p myPackage --ignore-conflicts
sfp project:pull -o myOrg -p myPackage --no-replacements
sfp project:pull -o myOrg -p myPackage --replacementsoverride custom-replacements.yml

private static final String API_URL = 'https://api-dev.example.com';

private static final String API_URL = '%%API_ENDPOINT%%';

⚠️ Potential replacements detected:
📄 force-app/main/default/classes/APIService.cls:
• URL detected: 'https://new-api.example.com/v2'
New URL pattern detected. Consider adding to replacements.yml
💡 To include these values in future replacements, update your replacements.yml file.

sfp pull -p myPackage -o myOrg --no-replacements

{
"hasError": boolean,
"errorMessage": string,
"files": [
{
"fullName": string,
"type": string,
"createdByName": string,
"lastModifiedByName": string,
"createdDate": string,
"lastModifiedDate": string
}
],
"conflicts": [
{
"fullName": string,
"type": string,
"filePath": string,
"state": string
}
],
"errors": [
{
"fileName": string,
"problem": string
}
],
"replacements": {
"success": boolean,
"packageName": string,
"filesModified": [
{
"path": string,
"replacements": [
{
"pattern": string,
"value": string,
"count": number
}
],
"totalCount": number
}
],
"totalFiles": number,
"totalReplacements": number,
"errors": [],
"orgAlias": string,
"suggestions": [
{
"filePath": string,
"suggestions": [
{
"type": string,
"value": string,
"message": string,
"pattern": string
}
]
}
]
}
}

versionNumber (required): The version number of the package
versionDescription (optional): Description for a particular version of the package
By default, sfp treats all entries in sfdx-project.json as Source Packages. You can create different types of packages depending on your needs:
Package Type | Supported | Created Via | Description
Source Package | ✅ | Manual | Default package type for deploying metadata
Unlocked Package | ✅ | SF CLI | Versioned, upgradeable package
Org-Dependent Unlocked | ✅ | SF CLI | Unlocked package with org dependencies
Data Package | ✅ | Manual | Package for data migration
Diff Package | ✅ | Manual | Package containing only changed components
Create a source package using the sfp-pro CLI:
Flags:
-n, --name (required): Package name
-r, --path (required): Directory path for the package
-d, --description: Package description
--domain: Mark package as a domain package
--no-insert: Don't insert into sfdx-project.json automatically
--insert-after: Insert after a specific package
Create an unlocked package with automatic DevHub registration:
Flags:
-n, --name (required): Package name
-r, --path (required): Directory path for the package
-v, --targetdevhubusername: DevHub alias/username
--org-dependent: Create org-dependent unlocked package
--no-namespace: Create without namespace
-d, --description: Package description
--error-notification-username: Username for error notifications
--domain: Mark package as a domain package
Create a data package for data migration:
Flags:
-n, --name (required): Package name
-r, --path (required): Directory path for the package
-d, --description: Package description
--domain: Mark package as a domain package
Create a diff package to track changes from a baseline:
Flags:
-n, --name (required): Package name
-r, --path (required): Directory path for the package
-c, --commit-id: Baseline commit ID
-v, --targetdevhubusername: DevHub alias/username
-d, --description: Package description
--domain: Mark package as a domain package
For sfp community edition users, packages need to be created manually or using Salesforce CLI.
Create a directory for your package
Add an entry to your sfdx-project.json:
Ensure your sfdx-project.json contains an entry for the package with path, package, and versionNumber
Create the package using Salesforce CLI:
Commit the updated sfdx-project.json with the new package ID in packageAliases
Create a directory for your data package
Add the required export.json and CSV files
Add an entry to sfdx-project.json with type: "data":
Create a directory for your diff package
Add an entry to sfdx-project.json with type: "diff":
Create a record in the SfpowerscriptsArtifact2__c object in your DevHub (a sketch follows this list) with:
Package name
Initial version number
Baseline commit ID
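As a hedged sketch, such a record can be created with the Salesforce CLI. The field API names below (Version__c, CommitId__c) are placeholders, so verify them against the SfpowerscriptsArtifact2__c schema in your DevHub before running:

# Hypothetical field names; check your DevHub schema for the exact API names
sf data create record --sobject SfpowerscriptsArtifact2__c \
  --values "Name='my-diff-package' Version__c='1.0.0.0' CommitId__c='<baseline-commit-id>'" \
  --target-org mydevhub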
Use descriptive names: Package names should clearly indicate their purpose
Organize by domain: Group related packages using domains
Version consistently: Use semantic versioning (MAJOR.MINOR.PATCH)
Document packages: Add meaningful version descriptions
Choose the right type:
Source packages for most metadata
Unlocked packages for distributed, versioned components
Data packages for reference data
Diff packages for selective deployments
Defining a Domain - Organize packages into domains
Building Artifacts - Build deployable artifacts from packages
Package Types - Learn more about different package types
path (required): Path to the directory that contains the contents of the package
package (required): The name of the package

Availability: sfp-pro ✅ | sfp (community) ❌
From: September 2025
String replacements provide a mechanism to manage environment-specific values in your Salesforce code without modifying source files. This feature automatically replaces placeholders with appropriate values during build, install, and push operations, and converts values back to placeholders during pull operations.
String replacements complement the existing aliasfy packages feature. While aliasfy packages handle structural metadata differences by deploying different files per environment, string replacements handle configuration value differences within the same files, reducing duplication and maintenance overhead.
String replacements work across multiple sfp commands:
Build Operations: During sfp build, replacement configurations are analyzed and embedded in the artifact for later use during installation.
Install/Deploy Operations: During sfp install or sfp deploy, placeholders are replaced with environment-specific values based on the target org:
Push Operations: During sfp push, placeholders in your source files are replaced with environment-specific values before deployment:
Pull Operations: During sfp pull, environment-specific values are converted back to placeholders:
Replacements are configured in a replacements.yml file within each package's preDeploy directory:
Example configuration:
The properties accepted by the configuration file are:
Source file with placeholders:
After pushing to a dev org, the placeholders are replaced:
During pull operations, sfp automatically detects potential patterns that could be converted to replacements:
URLs: Detects HTTP/HTTPS URLs
Email Addresses: Identifies email patterns
API Keys: Recognizes common API key formats
Custom Patterns: Detects repetitive values across files
When patterns are detected, sfp provides suggestions:
Replacements are resolved based on the target org (illustrated after this list):
Exact Alias Match: First checks for an exact match with the org alias
Sandbox Default: For sandbox/scratch orgs, uses the default value
Production Requirement: Production deployments require explicit configuration
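Given the example configuration shown on this page (aliases dev, staging, and prod plus a default), resolution behaves as sketched below; the org aliases are illustrative:

# Exact alias match takes precedence
sfp push -o dev -p my-package        # %%API_ENDPOINT%% -> https://api-dev.example.com
# A sandbox alias with no matching entry falls back to the default value
sfp push -o qa-sandbox -p my-package # -> https://api-sandbox.example.com
# Production requires an explicit entry; no default is applied
sfp push -o prod -p my-package       # -> https://api.example.com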
The org alias is determined from your Salesforce CLI authentication:
String replacements are supported across the following sfp commands:
Both push and pull commands support JSON output with detailed replacement information:
Check File Location: Ensure replacements.yml is in preDeploy directory
Verify Glob Pattern: Test glob pattern matches your files
Check Org Alias: Verify the org alias matches your configuration
Case Sensitivity: Patterns are case-sensitive
Special Characters: Escape special regex characters if needed
File Encoding: Ensure files are UTF-8 encoded
Org Detection: Verify org alias with sf org list
Environment Priority: Check resolution order (exact match → default)
Override Files: Check if override file is being used
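The checks above can be scripted; a minimal sketch with illustrative paths:

# Is the configuration where sfp expects it?
ls src/my-package/preDeploy/replacements.yml
# Does the target org alias match a key under environments?
sf org list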
Replacements are text-based and work with any text file format
Binary files are not supported
Large files may impact performance
Regex patterns should be used carefully to avoid unintended matches
- Configure string replacements in packages
- How replacements work during installation
- Deploy changes with replacements
- Retrieve changes with reverse replacements
This guide covers the setup and configuration of Large Language Model (LLM) providers for AI-powered features in sfp:
The AI-powered review functionality provides intelligent architecture and code quality analysis during pull request reviews. This feature automatically analyzes changed files using advanced language models to provide contextual insights about architectural patterns, Flxbl framework compliance, and potential improvements.
// A sample sfdx-project.json with a package
{
"packageDirectories": [
{
"path": "src/my-package",
"package": "my-package",
"versionNumber": "1.0.0.NEXT"
}
]
}

sfp package create source -n "my-source-package" -r "src/my-package"
# With domain (for organizing packages)
sfp package create source -n "my-source-package" -r "src/my-package" --domain

sfp package create unlocked -n "my-unlocked-package" -r "src/my-package" -v devhub
# Org-dependent unlocked package
sfp package create unlocked -n "my-package" -r "src/my-package" --org-dependent -v devhub
# Without namespace
sfp package create unlocked -n "my-package" -r "src/my-package" --no-namespace -v devhub

sfp package create data -n "my-data-package" -r "data/my-data-package"

sfp package create diff -n "my-diff-package" -r "src/my-diff-package" -c "baseline-commit-id" -v devhub

{
"packageDirectories": [
{
"path": "src/my-source-package",
"package": "my-source-package",
"versionNumber": "1.0.0.NEXT"
}
]
}

# Standard unlocked package
sf package create --name my-package --package-type Unlocked --no-namespace -v devhub
# Org-dependent unlocked package
sf package create --name my-package --package-type Unlocked --org-dependent --no-namespace -v devhub

{
"path": "data/my-data-package",
"package": "my-data-package",
"versionNumber": "1.0.0.NEXT",
"type": "data"
}

{
"path": "src/my-diff-package",
"package": "my-diff-package",
"versionNumber": "1.0.0.NEXT",
"type": "diff"
}
Property | Description | Required
name | Human-readable name for the replacement | Yes
pattern | The placeholder pattern to replace (e.g., %%API_URL%%) | Yes
glob | File pattern for matching files (e.g., **/*.cls for all Apex classes) | Yes
environments | Map of environment aliases to replacement values | Yes
environments.default | Default value used for sandbox/scratch orgs (recommended) | No
isRegex | Whether the pattern is a regular expression (default: false) | No

Command | Supported | Behaviour
sfp build | ✅ | Analyzes and embeds replacement configurations in artifacts
sfp install | ✅ | Applies replacements during artifact installation
sfp deploy | ✅ | Applies replacements during artifact deployment
sfp push | ✅ | Applies forward replacements during source push
sfp pull | ✅ | Applies reverse replacements during source pull
sfp validate | ✅ | Applies replacements during validation
The architecture analysis performs real-time analysis of pull request changes to:
Analyze architectural patterns and design consistency
Identify alignment with Flxbl framework best practices
Suggest improvements based on changed files context
Provide severity-based insights (info, warning, concern)
Generate actionable recommendations
The AI assisted architecture analyzer integrates into the project:analyze command and:
Detects PR Context: Automatically identifies when running in a pull request environment
Analyzes Changed Files: Focuses analysis on modified files only (up to 10 files for token optimization)
Applies AI Analysis: Uses configured AI provider to analyze architectural patterns
Reports Findings: Generates structured insights without failing the build (informational only)
Creates GitHub Checks: Posts results as GitHub check annotations when running in CI
OpenCode is currently supported only on macOS and Linux runtimes; Windows is not supported.
For complete setup instructions, see Configuring LLM Providers.
Quick Setup:
The architecture analyzer is configured through a YAML configuration file at config/ai-architecture.yaml:
For quick setup, create a minimal configuration:
The linter will auto-detect available AI providers and use sensible defaults.
For detailed provider configuration, see Configuring LLM Providers.
Provider | Default Model | Setup
Anthropic (Recommended) | claude-4-5 | sfp ai auth --provider anthropic --auth
OpenAI | gpt-5 | sfp ai auth --provider openai --auth
Amazon Bedrock | claude-4-sonnet-xxxxx | Configure AWS credentials
The linter auto-detects providers in this priority:
Environment variables (ANTHROPIC_API_KEY, OPENAI_API_KEY, etc.)
Configuration in ai-architecture.yaml
When running in GitHub Actions or with PR environment variables:
For local testing or custom CI environments:
The AI linter provides structured insights without failing builds:
Pattern: Architectural patterns observed or missing
Concern: Potential issues requiring attention
Suggestion: Improvement recommendations
Alignment: Framework compliance observations
Info: Informational observations
Warning: Areas needing attention
Concern: Significant architectural considerations
The linter gracefully handles API limitations:
Rate Limits: Skips analysis with informational message
Timeouts: 60-second timeout protection
Token Limits: Analyzes up to 10 files, content limited to 5KB per file
Failures: Never blocks PR merge (informational only)
Tailor analysis to your team's priorities:
Provide architectural documentation for better analysis:
Combine with other analysis tools for comprehensive coverage:
For large PRs, the linter automatically:
Limits to 10 most relevant files
Truncates file content to 5KB
Focuses on text-based source files
Common reasons and solutions:
Not Enabled: Set enabled: true in config/ai-architecture.yaml
No Provider: Configure API keys or authenticate with sfp ai auth
Rate Limited: Wait for rate limit reset or use different provider
No Changed Files: Ensure PR context is properly detected
Enable debug logging for detailed information:
This shows:
Provider detection process
Changed files identified
API calls and responses
Error details if analysis fails
Binary Files: Skips non-text files
Build Impact: Never fails builds (informational only)
Language Support: Best for Apex, JavaScript, TypeScript, XML
Availability: sfp-pro ✅ (from October 25) | sfp (community) ❌ (not available)
Artifact (%%API_URL%%) → Install → Target Org (https://api.example.com)
Source File (%%API_URL%%) → Push → Target Org (https://api.example.com)
Target Org (https://api.example.com) → Pull → Source File (%%API_URL%%)

src/
your-package/
preDeploy/
replacements.yml
main/
default/
classes/

replacements:
- name: "API Endpoint"
pattern: "%%API_ENDPOINT%%"
glob: "**/*.cls"
environments:
default: "https://api-sandbox.example.com"
dev: "https://api-dev.example.com"
staging: "https://api-staging.example.com"
prod: "https://api.example.com"
- name: "Support Email"
pattern: "%%SUPPORT_EMAIL%%"
glob: "**/*.cls"
environments:
default: "[email protected]"
dev: "[email protected]"
prod: "[email protected]"public class APIService {
private static final String ENDPOINT = '%%API_ENDPOINT%%';
private static final String API_KEY = '%%API_KEY%%';
}

public class APIService {
private static final String ENDPOINT = 'https://api-dev.example.com';
private static final String API_KEY = 'dev-key-12345';
}

⚠️ Potential replacements detected:
📄 src/package/main/default/classes/APIService.cls:
• URL detected: 'https://new-api.example.com/v2'
New URL pattern detected. Consider adding to replacements.yml
💡 To include these values in future replacements, update your replacements.yml file.

# Check your org aliases
sf org list
# Push with specific org alias
sfp push -o dev-sandbox -p your-package

# Skip replacements during install
sfp install --targetorg dev --artifactdir artifacts --no-replacements
# Skip replacements during push
sfp push -p your-package -o dev --no-replacements
# Skip replacements during pull
sfp pull -p your-package -o dev --no-replacements

# Use override file during install
sfp install --targetorg dev --artifactdir artifacts --replacementsoverride custom-replacements.yml
# Use override file during push
sfp push -p your-package -o dev --replacementsoverride custom-replacements.yml
# Use override file during pull
sfp pull -p your-package -o dev --replacementsoverride custom-replacements.yml

# Get JSON output with replacement details
sfp push -p your-package -o dev --json
sfp pull -p your-package -o dev --json

{
"hasError": false,
"replacements": {
"success": true,
"packageName": "your-package",
"filesModified": [
{
"path": "main/default/classes/APIService.cls",
"replacements": [
{
"pattern": "%%API_ENDPOINT%%",
"value": "https://api-dev.example.com",
"count": 1
}
],
"totalCount": 1
}
],
"totalFiles": 1,
"totalReplacements": 1,
"errors": [],
"orgAlias": "dev"
}
}

# Install OpenCode CLI
npm install -g opencode-ai
# Configure Anthropic (recommended)
sfp ai auth --provider anthropic --auth

# Enable/disable AI architecture analysis
enabled: true
# AI Provider Configuration (optional - auto-detects if not specified)
provider: anthropic # Options: anthropic, openai, google
model: claude-4-sonnet-xxxxx # Optional - uses provider defaults if not specified
# Architectural Patterns to Check
patterns:
- singleton
- factory
- repository
- service-layer
# Architecture Principles
principles:
- separation-of-concerns
- single-responsibility
- dependency-inversion
# Focus Areas for Analysis
focusAreas:
- security
- performance
- maintainability
- testability
# Additional Context Files (optional)
contextFiles:
- ARCHITECTURE.md
- docs/patterns.md

enabled: true

# Automatically detects PR context and analyzes only changed files
sfp project:analyze
# Explicitly exclude AI linter if needed
sfp project:analyze --exclude-linters architecture

# Manually specify changed files
sfp project:analyze --changed-files "src/classes/MyClass.cls,src/lwc/myComponent/myComponent.js"

📐 Architecture Analysis Results
════════════════════════════════
✅ Analysis Complete (AI-powered by anthropic/claude-4-sonnet)
## Summary
Analyzed 5 changed files focusing on architectural patterns and Flxbl compliance.
## Key Insights
### ⚠️ Service Layer Pattern (Warning)
File: src/classes/AccountController.cls
Description: Direct SOQL queries in controller violates service layer pattern.
Consider moving data access logic to a dedicated service class.
### ℹ️ Dependency Management (Info)
File: src/classes/OrderService.cls
Description: Good use of dependency injection pattern for testability.
This aligns well with Flxbl framework principles.
### ⚠️ Error Handling (Concern)
File: src/classes/PaymentProcessor.cls:45
Description: Missing comprehensive error handling for external callouts.
Implement try-catch blocks with proper logging and user feedback.
## Recommendations
1. Extract data access logic to service layer classes
2. Implement centralized error handling strategy
3. Consider adding unit tests for new service methods
4. Document architectural decisions in ARCHITECTURE.md

- name: Run Project Analysis with AI Linter
run: |
sfp project:analyze --output-format github
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
# GitHub context automatically detected

focusAreas:
- security # For compliance-critical projects
- performance # For high-volume applications
- maintainability # For long-term projects

contextFiles:
- ARCHITECTURE.md
- docs/coding-standards.md
- docs/patterns.md

# Run all linters including AI analysis
sfp project:analyze --fail-on duplicates,compliance
# AI linter provides insights, others enforce rules

# Check available providers
echo $ANTHROPIC_API_KEY
echo $OPENAI_API_KEY

sfp project:analyze --loglevel debug

AI-Powered PR Linter - sfp-pro only
AI Assisted Insight Reports - Available in both sfp-pro and community (alpha)
AI-Assisted Error Analysis - Intelligent validation error analysis
These features require OpenCode CLI and an authenticated LLM provider.
OpenCode is currently supported only on macOS and Linux runtimes; Windows is not supported.
For sfp (community) users: These AI features are available in alpha. You can use npm install -g @flxbl-io/sfp@ai to access them.
OpenCode CLI is the underlying engine that manages AI interactions for sfp's AI-powered features. It handles provider authentication, model selection, and secure API communication.
Global Installation (Recommended)
Alternative Installation Methods
For more installation options and troubleshooting, see the OpenCode documentation.
sfp currently supports the following LLM providers through OpenCode:
Provider | Status | Recommended | Best For
Anthropic (Claude) | ✅ Fully Supported | ⭐ Yes | Best overall performance, Flxbl framework understanding
OpenAI | ✅ Fully Supported | Yes | Wide model selection, good performance
Amazon Bedrock | ✅ Fully Supported | Yes | Enterprise environments with AWS infrastructure
Anthropic's Claude models provide the best understanding of Salesforce and Flxbl framework patterns. The default model used is claude-4-sonnet-xxxxx, which offers an optimal balance between performance and cost.
Method 1: Interactive Authentication (Recommended)
Method 2: Environment Variable
Method 3: Configuration File. Create or edit config/ai-architecture.yaml:
Visit console.anthropic.com
Sign up or log in to your account
Navigate to API Keys section
Create a new API key for sfp usage
Copy the key (starts with sk-ant-)
OpenAI provides access to GPT models with good code analysis capabilities.
Method 1: Interactive Authentication
Method 2: Environment Variable
Method 3: Configuration File
Visit platform.openai.com
Sign up or log in
Go to API Keys section
Create a new secret key
Copy the key (starts with sk-)
Amazon Bedrock is ideal for enterprise environments already using AWS infrastructure. It provides access to Claude models through AWS.
Method 1: AWS Profile
Method 2: AWS Credentials
Method 3: Configuration File
Important: AWS Bedrock requires both AWS_BEARER_TOKEN_BEDROCK and AWS_REGION environment variables to be set. Authentication will fail if either is missing.
Bedrock Model Access: Ensure your AWS account has access to the Claude models in Bedrock. You may need to request access through the AWS Console under Bedrock > Model access.
Bedrock automatically handles model prefixes based on your AWS region:
US Regions: Models may require us. prefix
EU Regions: Models may require eu. prefix
AP Regions: Models may require apac. prefix
The OpenCode SDK handles this automatically based on your AWS_REGION.
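For instance (a schematic sketch; the model identifier in your config stays unprefixed):

# Exporting the region is all that is required; the SDK derives the prefix
export AWS_REGION="eu-west-1"   # models resolve with an 'eu.' prefix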
GitHub Copilot can be used if you have an active subscription with model access enabled.
The AI features are configured through config/ai-assist.yaml in your project root:
After configuring authentication, you can verify that providers are working correctly using the ai check command:
This command performs a simple inference test to verify:
Authentication is configured correctly
The provider is accessible
Model inference is working
Response time and performance
Credentials are stored securely in ~/.sfp/ai-auth.json with appropriate file permissions. This file is created automatically when you authenticate.
When multiple authentication methods are available, sfp uses the following priority (a short example follows the list):
Environment Variables - Highest priority, useful for CI/CD
Stored Credentials - From ~/.sfp/ai-auth.json
Configuration File - From config/ai-assist.yaml
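For example, in CI an exported key silently takes precedence over anything stored locally; a minimal sketch:

# 1. Environment variable wins over stored credentials and config files
export ANTHROPIC_API_KEY="sk-ant-ci-key-xxxxx"   # illustrative value
# 2. Without it, sfp falls back to ~/.sfp/ai-auth.json
# 3. Finally, provider settings in config/ai-assist.yaml are consulted
sfp ai check --provider anthropic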
Both Environment Variables Required
Authentication Failed
Verify both AWS_BEARER_TOKEN_BEDROCK and AWS_REGION are set
Check that your bearer token is valid and not expired
Ensure your AWS account has access to Claude models in Bedrock
If you encounter rate limits:
Anthropic: Check your usage at console.anthropic.com
OpenAI: Monitor at platform.openai.com/usage
Bedrock: Check AWS CloudWatch metrics
Ensure you're using the correct model identifier:
Availability: sfp-pro ✅ (from October 25) | sfp (community) 🔶 (from December 25)
Features: sfp-pro: PR Linter, Reports, Error Analysis | sfp (community): Reports Only
This guide walks through the complete development workflow using sfp in a modular Salesforce project following the Flxbl framework.
The development workflow in an sfp-powered project follows an iterative approach where developers work in isolated environments, make changes using source-driven development, and submit their work through pull requests that trigger automated validation and review environments.
# Install OpenCode CLI globally via npm
npm install -g opencode-ai
# Verify installation
opencode --version

# Using yarn
yarn global add opencode-ai
# Using pnpm
pnpm add -g opencode-ai
# Using Homebrew (macOS/Linux)
brew install opencode-ai

# Authenticate with Anthropic
sfp ai auth --provider anthropic --auth
# This will prompt for your API key and store it securely

# Add to your shell profile (.bashrc, .zshrc, etc.)
export ANTHROPIC_API_KEY="sk-ant-xxxxxxxxxxxxx"enabled: true
provider: anthropic
# Model is optional - uses claude-sonnet-4-20250514 by default

sfp ai auth --provider openai --auth

export OPENAI_API_KEY="sk-xxxxxxxxxxxxx"

# In config/ai-assist.yaml
enabled: true
provider: openai
# Model is optional - uses gpt-5 by default

# Set both required environment variables
export AWS_BEARER_TOKEN_BEDROCK="your-bearer-token"
export AWS_REGION="us-east-1"
# Both variables must be set for authentication to work

# Authenticate with Amazon Bedrock
sfp ai auth --provider amazon-bedrock --auth
# This will prompt for both Bearer Token and Region

# In config/ai-assist.yaml
enabled: true
provider: amazon-bedrock
model: anthropic.claude-sonnet-4-20250514-v1:0 # Default model

# Authenticate with GitHub Copilot
sfp ai auth --provider github-copilot --auth
# Ensure models are enabled in GitHub Settings:
# https://github.com/settings/copilot/features

# Enable/disable AI features
enabled: true
# Provider Configuration
provider: anthropic # anthropic, openai, amazon-bedrock, github-copilot
# Model Configuration (Optional - uses provider defaults if not specified)
# Default models:
# - anthropic: claude-sonnet-4-20250514
# - github-copilot: claude-sonnet-4
# - openai: gpt-5
# - amazon-bedrock: anthropic.claude-sonnet-4-20250514-v1:0
model: claude-sonnet-4-20250514 # Override default model
# Architectural Patterns to Check (for PR Linter)
patterns:
- singleton
- factory
- repository
- service-layer
# Architecture Principles
principles:
- separation-of-concerns
- single-responsibility
- dependency-inversion
# Focus Areas for Analysis
focusAreas:
- security
- performance
- maintainability
- testability
# Additional Context Files
contextFiles:
- ARCHITECTURE.md
- docs/patterns.md
- docs/coding-standards.md

# Check all providers
sfp ai auth
# Check specific provider
sfp ai auth --provider anthropic
# List all supported providers
sfp ai auth --list

# Test all configured providers
sfp ai check
# Test specific provider with default model
sfp ai check --provider anthropic
# Test Amazon Bedrock (uses default: anthropic.claude-sonnet-4-20250514-v1:0)
sfp ai check --provider amazon-bedrock
# Test GitHub Copilot (uses default: claude-sonnet-4)
sfp ai check --provider github-copilot
# Re-authenticate to update stored credentials
sfp ai auth --provider anthropic --auth
# Or update environment variable
export ANTHROPIC_API_KEY="sk-ant-new-key-xxxxx"# Verify installation
which opencode
# If not found, reinstall
npm install -g opencode-ai
# Check npm global bin path is in PATH
npm bin -g

# Check authentication
sfp ai auth --provider anthropic
# Verify environment variables
echo $ANTHROPIC_API_KEY
# For AWS Bedrock - check both required variables
echo $AWS_BEARER_TOKEN_BEDROCK
echo $AWS_REGION
# Check stored credentials exist
ls -la ~/.sfp/ai-auth.json
# Test provider inference
sfp ai check --provider <provider-name>

# This will NOT work (missing region)
export AWS_BEARER_TOKEN_BEDROCK="token"
# This will work (both variables set)
export AWS_BEARER_TOKEN_BEDROCK="token"
export AWS_REGION="us-east-1"# Correct
model: claude-4-sonnet-xxxxx
Before starting development with sfp, ensure you have:
DevHub access - Required for:
Building packages (all types)
Creating scratch orgs
Managing unlocked packages
DevHub user setup:
Your user must be added to the DevHub org
Follow Salesforce's guide:
Authenticate to your DevHub:
Verify DevHub connection:
Every feature or story begins with a developer fetching a fresh environment from a pre-prepared pool. The frequency depends on your team's practice:
Scratch Orgs: Can be fetched for every story or feature
Sandboxes: Typically fetched at the start of an iteration or sprint
Once you have your environment:
Before making changes, ensure you have the latest metadata from your org:
The pull command will:
Retrieve metadata changes from your org
Apply reverse text replacements to convert environment-specific values back to placeholders
Update your local source files
Now you can work on your feature using your preferred IDE:
Modify existing metadata in package directories
Create new components using SF CLI or your IDE
Add new packages if needed:
Organize packages into logical groups using release configs (domains are conceptual, not explicit commands)
Deploy your local changes to the development org:
The push command will:
Apply text replacements for environment-specific values
Deploy metadata to your org
Run tests if specified
Test that your packages can be built successfully:
Execute tests in your development org to validate your changes. sfp follows a package-centric testing approach:
For detailed information on test levels, coverage validation, output formats, and CI/CD integration, see Running Apex Tests.
While developers rarely need to install built artifacts to their own orgs, you can test the installation:
As you develop, you may need to manage package dependencies:
Once your feature is complete:
When you create a PR, the automated pipeline will:
Run sfp validate to verify your changes:
Create a review environment for acceptance testing:
Run quality checks:
Code coverage validation
Dependency validation
Package structure verification
The review environment URL is posted to your PR for stakeholders to test:
Product owners can validate functionality
QA can run acceptance tests
Other developers can review the implementation
After your PR is approved and merged:
Artifacts are built from the main branch
Published to artifact repository
Ready for release to higher environments
When working with environment-specific metadata:
For configuration values that change per environment:
Then push/pull will automatically handle replacements:
When you need to delete metadata:
Move components to pre-destructive/ or post-destructive/ folders
Push changes normally:
The destructive changes are automatically processed.
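A minimal sketch of the flow; the class name and package path are illustrative:

# Stage the deletion in the package's pre-destructive folder
mkdir -p src/my-package/pre-destructive
git mv src/my-package/main/default/classes/ObsoleteHelper.cls \
       src/my-package/pre-destructive/
git mv src/my-package/main/default/classes/ObsoleteHelper.cls-meta.xml \
       src/my-package/pre-destructive/
# Push as usual; the destructive change is processed automatically
sfp push --targetusername my-feature-org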
# List available scratch orgs in pool (alias: pool:list)
sfp pool scratch list --tag dev-pool
# Fetch a scratch org from the pool (alias: pool:fetch)
sfp pool scratch fetch --tag dev-pool --alias my-feature-org
# Initialize a pool if empty (aliases: prepare, pool:prepare)
sfp pool scratch init --tag dev-pool \
--targetdevhubusername mydevhub \
--config config/project-scratch-def.json \
--count 5

# List available instances (works for both scratch orgs and sandboxes)
sfp server pool instance list \
--repository myorg/myrepo \
--tag dev-pool
# Fetch an org from the pool (scratch or sandbox)
sfp server pool instance fetch \
--repository myorg/myrepo \
--tag dev-pool \
--assignment-id feature-123
# Extend org expiration if needed
sfp server pool instance extend \
--repository myorg/myrepo \
--tag dev-pool \
--assignment-id feature-123 \
--expiration-hours 48
# Unassign and return to pool when done
sfp server pool instance unassign \
--repository myorg/myrepo \
--tag dev-pool \
--assignment-id feature-123

# Create a new sandbox directly
sfp sandbox create --name feature-sandbox \
--type Developer \
--source-org production \
--alias my-feature-sandbox

# Open the org to verify access (sfp-pro)
sfp org open --targetusername my-feature-org
# Open in a specific browser (sfp-pro)
sfp org open --targetusername my-feature-org --browser chrome
# For community edition, use Salesforce CLI
sf org open --target-org my-feature-org
# Set as default for convenience (sfp-pro)
sfp config set target-org my-feature-org
# Set globally (sfp-pro)
sfp config set target-org my-feature-org --global

# Pull all changes from the org (using aliases: pull, source:pull, project:pull)
sfp pull --targetusername my-feature-org
# Pull with conflict resolution
sfp pull --targetusername my-feature-org --ignore-conflicts
# Pull a specific package
sfp pull --targetusername my-feature-org --package my-package
# Pull and see what replacements were reversed (sfp-pro)
sfp pull --targetusername my-feature-org --json

# Create a new source package (sfp-pro)
sfp package create source -n "feature-payment" \
-r "src/payment-processing" \
--domain
# Create an unlocked package
sfp package create unlocked -n "feature-payment" \
-r "src/payment-processing" \
-v mydevhub
# Create a data package
sfp package create data -n "reference-data" \
-r "data/reference-data"
# For community edition, manually add to sfdx-project.json

# Push all changes (using aliases: push, source:push, project:push)
sfp push --targetusername my-feature-org
# Push a specific package
sfp push --targetusername my-feature-org --package my-package
# Push ignoring conflicts
sfp push --targetusername my-feature-org --ignore-conflicts
# Push and see what replacements were applied (sfp-pro)
sfp push --targetusername my-feature-org --json

# Build all packages (DevHub required)
sfp build --devhubalias mydevhub
# Build a specific domain
sfp build --devhubalias mydevhub --domain sales
# Build a specific package
sfp build --devhubalias mydevhub --package payment-processing
# Build with different options
sfp build --devhubalias mydevhub \
--branch feature/payment \
--buildnumber 123 \
--diffcheck
# Note: DevHub is required even for source packages to resolve dependencies

# Test a specific package (recommended)
sfp apextests trigger -o my-feature-org -l RunAllTestsInPackage -n sales-core
# Test all packages in a domain
sfp apextests trigger -o my-feature-org -l RunAllTestsInDomain \
-r config/release-config.yaml
# Quick test during development
sfp apextests trigger -o my-feature-org -l RunSpecifiedTests \
--specifiedtests PaymentProcessorTest
# Test with code coverage validation
sfp apextests trigger -o my-feature-org -l RunAllTestsInPackage \
-n sales-core -c -p 80

# Install a single package
sfp install --target-org my-feature-org \
--artifacts artifacts \
--package payment-processing
# Install with skip testing for faster deployment
sfp install --target-org my-feature-org \
--artifacts artifacts \
--skipifalreadyinstalled

# Understand package dependencies
sfp dependency explain --package payment-processing
# Expand all transitive dependencies (for troubleshooting)
sfp dependency expand --target-devhub mydevhub
# Clean up redundant dependencies
sfp dependency shrink --target-devhub mydevhub

# Commit your changes
git add .
git commit -m "feat: implement payment processing module"
# Push to your feature branch
git push origin feature/payment-processing
# Create PR using GitHub CLI (optional)
gh pr create --title "Payment Processing Module" \
--body "Implements new payment gateway integration"# This runs automatically in CI/CD
sfp validate org --target-org validation-org \
--mode thorough \
--coverageThreshold 75

# CI/CD creates an ephemeral environment
sfp pool scratch fetch --pool review-pool \
--alias pr-123-review
# Install the changes
sfp install --target-org pr-123-review \
--artifacts artifacts

# This happens automatically in CI/CD
sfp build --branch main
sfp publish --artifacts artifacts \
--npm-registry https://your-registry.com

# Pull from a specific environment
sfp pull --targetusername dev-sandbox
# The correct variant is automatically selected
# src-env-specific/main/dev/* contents are used

# Create preDeploy/replacements.yml in your package
replacements:
- name: "API Endpoint"
glob: "**/*.cls"
pattern: "%%API_URL%%"
environments:
default: "https://api.dev.example.com"
prod: "https://api.example.com"# Push replaces placeholders with environment values
sfp push --targetusername my-feature-org
# Pull reverses replacements back to placeholders
sfp pull --targetusername my-feature-org

sfp push --targetusername my-feature-org

# Check pool status
sfp pool scratch list --tag dev-pool
# Replenish the pool (aliases: prepare, pool:prepare)
sfp pool scratch init --tag dev-pool \
--targetdevhubusername mydevhub \
--count 5

# Check pool status
sfp server pool status --repository myorg/myrepo --tag dev-pool
# Replenish pool (works for both scratch orgs and sandboxes)
sfp server pool replenish \
--repository myorg/myrepo \
--tag dev-pool

# Ignore conflicts during pull
sfp pull --targetusername my-feature-org --ignore-conflicts
# Ignore conflicts during push
sfp push --targetusername my-feature-org --ignore-conflicts

# Check for issues in specific package
sfp build --devhubalias mydevhub \
--package problematic-package \
--loglevel DEBUG
# Validate dependencies
sfp dependency explain --package problematic-package

# Re-authenticate to DevHub
sf org login web --alias mydevhub --set-default-dev-hub
# Verify DevHub is enabled
sf org display --target-dev-hub
# Check DevHub limits
sf limits api display --target-org mydevhub

sf org login web --alias mydevhub --set-default-dev-hub

# Check your DevHub connection
sf org display --target-dev-hub

The apextests trigger command allows you to independently execute Apex tests in your Salesforce org. While the validate command automatically runs tests per package during validation, this command gives you direct control over test execution with support for multiple test levels, code coverage validation, and output formats.
sfp follows a package-centric testing approach where tests are organized and executed at the package or domain level, rather than running all org tests together. This aligns with how the validate command works and provides better isolation and faster feedback.
Runs all tests within specified package(s). This is the primary testing pattern in sfp and matches how the validate command executes tests. Supports code coverage validation at both package and individual class levels.
This pattern matches how validate command executes tests - each package is tested independently with its own test classes. This provides:
Better test isolation and faster feedback
Package-level code coverage validation
Clear attribution of test failures to specific packages
Parallel test execution per package (when enabled)
Runs tests for all packages defined in a domain from your release config. This is the recommended pattern for validating entire domains and matches how you would validate a domain for release.
This executes tests for each package in the domain sequentially, providing comprehensive domain validation. Use this when:
Validating changes across a domain before release
Testing related packages together as a unit
Performing end-to-end domain validation
Runs specific test classes or methods. Useful for rapid iteration during active development.
Use during development for quick feedback cycles when working on specific features.
Runs all tests in a test suite defined in your org.
Useful for running pre-defined test groups or smoke test suites.
Runs all tests in your org except those from managed packages. This is the default test level in Salesforce but not the recommended pattern in sfp.
Note: While this is the Salesforce default, sfp recommends package-level or domain-level testing for better isolation and faster feedback. Use this only when you specifically need to run all org tests together, such as for compliance requirements or full org validation.
Runs all tests in your org, including managed packages. Rarely used due to long execution time.
Use only for complete org validation scenarios.
Validates that each Apex class in the package meets the minimum coverage threshold. Every class must meet or exceed the specified percentage.
Coverage threshold:
Default: 75%
Adjustable with -p flag
Applied per class, not as an average
Output includes:
List of all classes with their coverage percentages
Classes that meet the threshold
Classes that fail to meet the threshold
Overall package coverage percentage
Validates that the overall package coverage meets the minimum threshold. The average coverage across all classes must meet or exceed the specified percentage.
Coverage calculation (a worked example follows this list):
Aggregates coverage across all classes in package
Calculated as: (total covered lines / total lines) * 100
Only classes with Apex code count toward coverage
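As a worked example of the formula: 8,250 covered lines out of 10,000 total gives (8250 / 10000) * 100 = 82.5%, which passes a 75% threshold. The same figure can be recomputed from the dashboard output documented below; a sketch that requires jq and bc:

# Recompute package coverage from dashboard.json fields
covered=$(jq '.coverage.coveredLines' .testresults/latest.json)
total=$(jq '.coverage.totalLines' .testresults/latest.json)
echo "scale=1; $covered * 100 / $total" | bc   # e.g. 82.5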
Running tests without coverage flags still executes tests but doesn't fetch or validate coverage data:
Note: Fetching coverage data adds time to test execution, so only use it when needed.
Note: The dashboard output format is a new feature introduced in the November 2025 release of sfp-pro.
Standard Salesforce API output with JUnit XML and JSON results. This is the default format.
Generates:
.testresults/test-result-<testRunId>.json - Raw Salesforce test results
.testresults/test-result-<testRunId>-junit.xml - JUnit XML format
.testresults/test-result-<testRunId>-coverage.json - Coverage data (if coverage enabled)
Available in: sfp-pro November 2025 release and later
Structured JSON format optimized for dashboards, metrics systems, and reporting tools. Unlike the raw Salesforce API output, the dashboard format provides enriched, pre-processed data that's ready for consumption by external systems.
Generates all raw format files plus:
.testresults/<testRunId>/dashboard.json - Structured test results
.testresults/<testRunId>/testresults.md - Enhanced markdown summary
.testresults/latest.json - Symlink to latest dashboard result
The dashboard.json file contains a comprehensive test execution report:
For Metrics and Observability:
For Test History Tracking:
In dashboard mode, test failures don't cause the command to exit with error code 1, allowing you to collect test results even when tests fail:
This is useful for:
Collecting metrics regardless of test outcome
Generating reports without blocking pipelines
Archiving test history across passing and failing runs
Trend analysis and test reliability tracking
Available in: sfp-pro November 2025 release and later
Generates both raw and dashboard formats in a single execution.
When to use:
Maintaining compatibility while adopting dashboard format
Comprehensive test result archiving
After running tests, sfp creates a .testresults directory:
Control how long the command waits for tests to complete:
Wait time behavior:
Omit -w flag: Wait indefinitely (no timeout)
-w 0: Wait indefinitely (no timeout)
-w <minutes>: Wait up to specified minutes before timing out
For most scenarios, omitting the wait time or using 0 is recommended to avoid premature timeouts on large test suites.
Some test classes interfere with each other when run in parallel. Configure serial execution in your package descriptor:
Or use the synchronous flag (if supported):
See which classes failed coverage requirements:
The debug output shows:
Each class and its coverage percentage
Which classes passed/failed threshold
Overall package coverage
If no tests are executed:
Check that test classes exist in the package:
Verify test classes are in the correct package directory
Ensure test classes follow naming conventions:
Class name ends with Test
Methods are annotated with @isTest
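A quick local check for the conditions above (package path illustrative):

# List Apex classes in the package that carry the @isTest annotation
grep -ril --include="*.cls" "@istest" src/my-package/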
sfp automatically retries failed tests in serial mode. This is normal behavior to handle flaky tests that fail in parallel execution:
First run: Tests execute in parallel
If failures occur: Failed tests retry in serial mode
Final results: Combines both runs, removes duplicates
Control test execution timeout behavior:
Options:
Omit -w: Wait indefinitely
-w 0: Wait indefinitely
-w <minutes>: Wait specified minutes before timeout
Override the API version for the test run:
Include git information in test results:
This metadata appears in:
Dashboard JSON output
Markdown summaries
Test reports
Specify environment name for dashboard format:
Defaults to the target org alias if not specified.
# Test a specific package (primary pattern)
sfp apextests trigger -o my-org -l RunAllTestsInPackage -n my-package
# Test all packages in a domain (recommended for domain validation)
sfp apextests trigger -o my-org -l RunAllTestsInDomain -r config/release-config.yaml
# Test multiple packages together
sfp apextests trigger -o my-org -l RunAllTestsInPackage -n package-a -n package-b
# Quick test during development
sfp apextests trigger -o my-org -l RunSpecifiedTests --specifiedtests MyTest

.testresults/testresults.md - Markdown summary

Raw vs dashboard format (raw | dashboard):
Output: Salesforce API response | Processed, enriched data
Structure: Flat, verbose | Hierarchical, organized
File Location: .testresults/ root | .testresults/<testRunId>/
Coverage: Separate file | Integrated in JSON
Latest Symlink: No | Yes (latest.json)
Metadata: Limited | Environment, repo, commit
Use Case: Salesforce tooling | External systems, dashboards
Exit on Failure: Yes (exit code 1) | No (exit code 0)
# Single package
sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n sales-core
# Multiple packages
sfp apextests trigger -o dev-org -l RunAllTestsInPackage \
-n sales-core \
-n sales-ui \
-n sales-integration

sfp apextests trigger -o dev-org -l RunAllTestsInDomain \
-r config/release-config-sales.yaml

# Run specific test classes
sfp apextests trigger -o dev-org -l RunSpecifiedTests \
--specifiedtests AccountTest,ContactTest
# Run specific test methods
sfp apextests trigger -o dev-org -l RunSpecifiedTests \
--specifiedtests AccountTest.testCreate,ContactTest.testUpdate

sfp apextests trigger -o dev-org -l RunApexTestSuite \
--apextestsuite QuickTests

sfp apextests trigger -o dev-org -l RunLocalTests

sfp apextests trigger -o dev-org -l RunAllTestsInOrg

sfp apextests trigger -o dev-org -l RunAllTestsInPackage \
-n my-package \
-c \
-p 80

sfp apextests trigger -o dev-org -l RunAllTestsInPackage \
-n my-package \
--validatepackagecoverage \
-p 75

# Run tests without coverage data
sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package
# Run tests with coverage fetching and validation
sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package -c

sfp apextests trigger -o dev-org -l RunLocalTests

# Generate dashboard format
sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package \
--outputformat dashboard \
--environment dev

{
"environment": "dev",
"timestamp": "2025-11-24T10:30:00.000Z",
"duration": 125000,
"testExecutionTime": 120000,
"commandTime": 115000,
"repository": "https://github.com/myorg/myrepo",
"commitSha": "abc123def",
"branch": "main",
"summary": {
"totalTests": 150,
"passed": 145,
"failed": 5,
"skipped": 0,
"passingRate": 96.67,
"overallCoverage": 82.5,
"coveredLines": 8250,
"totalLines": 10000,
"outcome": "Failed"
},
"coverage": {
"overallCoverage": 82.5,
"totalLines": 10000,
"coveredLines": 8250,
"classes": [
{
"name": "AccountService",
"id": "01p...",
"coverage": 95.5,
"totalLines": 200,
"coveredLines": 191,
"status": "pass"
}
],
"uncoveredClasses": ["LegacyHelper"],
"belowThreshold": [
{
"name": "OldProcessor",
"coverage": 65.0,
"threshold": 75
}
]
},
"testCases": [
{
"id": "07M...",
"name": "AccountService.testCreateAccount",
"className": "AccountService",
"methodName": "testCreateAccount",
"time": 250,
"status": "passed"
}
],
"topFailingTests": [
{
"name": "ContactTest.testValidation",
"className": "ContactTest",
"methodName": "testValidation",
"failureMessage": "System.AssertException: Expected 5, but got 3"
}
],
"metadata": {
"testRunId": "707...",
"orgId": "00D...",
"username": "[email protected]",
"package": "sales-core",
"testLevel": "RunAllTestsInPackage"
}
}

# Run tests and extract metrics
sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package \
--outputformat dashboard --json > results.json
# Extract key metrics
PASSING_RATE=$(jq -r '.result.summary.passingRate' results.json)
COVERAGE=$(jq -r '.result.summary.overallCoverage' results.json)
# Push to your metrics backend
curl -X POST https://metrics.example.com/api/tests \
-d @.testresults/latest.json

# The latest.json symlink always points to most recent result
jq -r '.summary.passingRate' .testresults/latest.json
# Track coverage trends
jq -r '.coverage.belowThreshold[] | "\(.name): \(.coverage)%"' \
.testresults/latest.json

# Command succeeds even if tests fail
sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package \
--outputformat dashboard
echo $? # Always 0 in dashboard mode

sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package \
--outputformat both \
--environment ci

.testresults/
├── test-result-<testRunId>-junit.xml # JUnit XML format
├── test-result-<testRunId>.json # Raw Salesforce test results
├── test-result-<testRunId>-coverage.json # Code coverage data
├── testresults.md # Markdown summary
├── <testRunId>/ # Dashboard format directory
│ ├── dashboard.json # Structured test results
│ └── testresults.md # Enhanced markdown summary
└── latest.json # Symlink to latest dashboard result

# Wait up to 120 minutes
sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package -w 120
# Wait indefinitely (no timeout) - useful for very large test suites
sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package -w 0
# Omitting the flag also waits indefinitely
sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package

{
"path": "src/my-package",
"package": "my-package",
"testSynchronous": true
}

sfp apextests trigger -o dev-org -l RunAllTestsInPackage \
-n my-package -s

sfp apextests trigger -o dev-org -l RunAllTestsInPackage \
-n my-package -c --loglevel debug

# Verify package contents
sfp build -d mydevhub -n my-package --loglevel debug

# Wait indefinitely (recommended for large test suites)
sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package -w 0
# Wait up to 120 minutes
sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package -w 120
# Omitting -w also waits indefinitely
sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package

sfp apextests trigger -o dev-org -l RunAllTestsInPackage -n my-package --apiversion 60.0

sfp apextests trigger -o dev-org -l RunLocalTests \
--commitsha abc123def \
--repourl https://github.com/myorg/myrepo

sfp apextests trigger -o dev-org -l RunLocalTests \
--outputformat dashboard \
--environment "dev-feature-branch"Availability
✅
❌
From
September 25
The compliance check functionality ensures your Salesforce metadata adheres to organizational standards and best practices. This feature helps maintain code quality, security, and consistency across your Salesforce project by enforcing configurable rules.
Compliance checking provides a comprehensive framework for:
Enforcing coding standards and best practices
Preventing security vulnerabilities like hardcoded IDs and URLs
Maintaining API version consistency
Validating metadata field values against organizational policies
The compliance checker:
Loads rules from your configuration file (defaults to config/compliance-rules.yaml)
Scans metadata components using Salesforce's ComponentSet API
Applies rules based on metadata type and field specifications
Evaluates content and field values using configurable operators
Compliance checking is configured through YAML files that define rules and their enforcement:
Create a compliance rules file using the generate command:
This creates config/compliance-rules.yaml with sample rules:
The system includes several built-in rules that can be enabled:
Documentation & Metadata Quality
Define custom rules to match your specific organizational requirements:
For XML metadata files, use dot notation to specify field paths:
Use the special _content field to analyze file content:
The compliance check provides detailed violation reports with multiple output formats:
Each violation includes:
File Path: Exact location of the violation
Line Number: Specific line where the issue occurs (when applicable)
Rule Name: Which rule was violated
Severity: Error, Warning, or Info level
Message: Descriptive explanation and remediation guidance
Actual Value: The found value that triggered the violation
Expected Value: What the rule expected to find
When integrating compliance checking in your CI/CD pipeline:
Enforce Compliance Standards:
Generate Reports:
GitHub Actions Integration:
Use the same scoping options as other analysis commands:
Analyze only specific changed files:
In GitHub Actions PR context, the analyzer automatically detects and analyzes only changed files when no package, domain, or source path filters are specified.
The compliance checker supports multiple output formats (an example follows the list):
Console: Human-readable terminal output with color coding
Markdown: Detailed reports suitable for documentation
JSON: Machine-readable format for integration with other tools
GitHub: Special format for GitHub Checks API integration
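For example, to emit machine-readable results for another tool to consume (assuming json is accepted as a value, mirroring the markdown and github values used elsewhere on this page):

sfp project:analyze --compliance-rules config/compliance-rules.yaml \
  --output-format json --report-dir ./reports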
Ensure all components use recent API versions:
Prevent security vulnerabilities:
Restrict dangerous permissions:
Use different log levels to control output verbosity:
Debug logging shows:
Rules being processed
Files being scanned
Field extraction details
Rule evaluation results
Verify Field Path: Ensure the field path matches the XML structure (a sketch follows this list)
Check Metadata Types: Confirm the rule applies to the correct metadata types
Validate Operators: Ensure the operator logic matches your expectations
Enable Debug Logging: Use --loglevel debug
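To verify a field path against the actual XML structure (the first check above), a generic XML tool helps; a sketch using xmllint with an illustrative path:

# Print the apiVersion element from a class's metadata file
xmllint --xpath '//*[local-name()="apiVersion"]/text()' \
  src/classes/MyClass.cls-meta.xml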
Refine Rule Conditions: Adjust operators or values to be more specific
Scope Rules Appropriately: Use metadata type filters to target specific components
Create Exceptions: Disable rules for specific packages or paths when needed
Scope Analysis: Use package, domain, or path filters to reduce scope
Optimize Rules: Complex regex patterns can slow down content analysis
Exclude Large Files: Consider excluding generated or vendor files
Generating detailed violation reports with remediation guidance
Reports violations with file locations, severity levels, and helpful messages
Rule ID | Description | Metadata Types | Default
flow-inactive-check | Detects inactive or draft flows | Flow | Disabled
permissionset-view-all-data | Prevents ViewAllData permission | PermissionSet | Disabled
permissionset-modify-all-data | Prevents ModifyAllData permission | PermissionSet | Disabled
permissionset-author-apex | Detects AuthorApex permission | PermissionSet | Disabled
permissionset-customize-application | Detects CustomizeApplication permission | PermissionSet | Disabled
validation-rule-missing-description | Ensures validation rules have descriptions | ValidationRule | Disabled
Property | Description | Required | Values
enabled | Whether the rule is active | No | true/false (default: false)
metadata | Metadata types to check | Yes | Array of metadata type names
field | Field path to evaluate | Yes | Dot-notation path or "_content"
operator | Comparison operator | Yes | See operators table below
value | Expected value for comparison | Yes | String, Number, Boolean
severity | Violation severity level | Yes | error, warning, info
message | Custom violation message | No | String
Operator | Description | Example
equals | Field equals expected value | apiVersion equals 59.0
not_equals | Field does not equal expected value | status not_equals Inactive
contains | Field contains substring | Content contains hardcoded
not_contains | Field does not contain substring | Content not_contains TODO
greater_than | Numeric field is greater than value | apiVersion > 58.0
less_than | Numeric field is less than value | apiVersion < 60.0
greater_or_equal | Numeric field is greater than or equal | apiVersion >= 59.0
less_or_equal | Numeric field is less than or equal | apiVersion <= 59.0
regex | Field matches regular expression | Content matches [a-zA-Z0-9]{15}

Rule ID | Description | Metadata Types | Default
no-hardcoded-ids | Prevents hardcoded 15/18 character IDs | ApexClass, ApexTrigger, Flow | Disabled
no-hardcoded-urls | Prevents hardcoded Salesforce URLs | ApexClass, ApexTrigger, Flow | Disabled
profile-no-modify-all | Prevents Modify All Data permission | Profile | Disabled
field-missing-description | Ensures custom fields have descriptions | CustomField | Disabled
field-missing-help-text | Ensures custom fields have help text | CustomField | Disabled
object-missing-description | Ensures custom objects have descriptions | CustomObject | Disabled

Property | Description | Required | Values
id | Unique identifier for the rule | Yes | String
name | Human-readable rule name | No | String
description | Detailed rule description | No | String
sfp project:analyze --compliance-rules config/compliance-rules.yaml --fail-on compliance

sfp project:analyze --generate-compliance-config

# SFP Compliance Rules Configuration
extends: default
rules:
- id: no-hardcoded-ids
enabled: true
comment: Enable this rule to prevent hardcoded Salesforce IDs
- id: no-hardcoded-urls
enabled: false
comment: Enable this rule to prevent hardcoded Salesforce URLs
- id: profile-no-modify-all
enabled: false
comment: Enable this rule to prevent profiles with Modify All Data permission
- id: custom-api-version
name: Minimum API Version Check
enabled: true
metadata:
- ApexClass
- ApexTrigger
field: ApexClass.apiVersion
operator: greater_or_equal
value: 59.0
severity: warning
message: Component should use API version 59.0 or higher

rules:
- id: minimum-api-version
name: Enforce Minimum API Version
description: All components must use API version 59.0 or higher
enabled: true
metadata:
- ApexClass
- ApexTrigger
field: ApexClass.apiVersion
operator: greater_or_equal
value: 59.0
severity: warning
message: Component should use API version 59.0 or higher

# For ApexClass metadata XML
field: ApexClass.apiVersion
# For nested fields
field: ApexClass.packageVersions.majorNumber
# For Profile permissions
field: Profile.userPermissions.ModifyAllData

field: _content
operator: regex
value: '[a-zA-Z0-9]{15}|[a-zA-Z0-9]{18}' # Salesforce ID pattern

📋 Compliance Check Results
═══════════════════════════
Rules Checked: 3/5
Found 5 violations:
❌ Errors (2):
• src/classes/MyClass.cls-meta.xml:3
Component should use API version 59.0 or higher
Rule: Minimum API Version Check
⚠️ Warnings (3):
• src/classes/Controller.cls:15
Hardcoded Salesforce IDs found - use Custom Settings, Custom Metadata, or SOQL queries instead
Rule: No Hardcoded Salesforce IDs

sfp project:analyze --compliance-rules config/compliance-rules.yaml --fail-on compliance

sfp project:analyze --compliance-rules config/compliance-rules.yaml --report-dir ./reports --output-format markdown

- name: Run Compliance Check
run: |
sfp project:analyze --compliance-rules config/compliance-rules.yaml --fail-on compliance --output-format github
env:
GITHUB_APP_PRIVATE_KEY: ${{ secrets.GITHUB_APP_PRIVATE_KEY }}
GITHUB_APP_ID: ${{ secrets.GITHUB_APP_ID }}

sfp project:analyze --compliance-rules config/compliance-rules.yaml -p core,utils

sfp project:analyze --compliance-rules config/compliance-rules.yaml -d sales

sfp project:analyze --compliance-rules config/compliance-rules.yaml -s ./force-app/main/default

# Manual specification of changed files
sfp project:analyze --changed-files "src/classes/MyClass.cls,src/lwc/myComponent/myComponent.html"

- id: minimum-api-version
enabled: true
metadata: [ApexClass, ApexTrigger]
field: ApexClass.apiVersion
operator: greater_or_equal
value: 59.0
severity: warning

- id: no-hardcoded-credentials
enabled: true
metadata: [ApexClass]
field: _content
operator: regex
value: '(password|secret|key)\s*=\s*["''][^"'']+["'']'
severity: error
message: Remove hardcoded credentials - use Named Credentials or Custom Settings

- id: no-modify-all-data
enabled: true
metadata: [Profile]
field: Profile.userPermissions.ModifyAllData
operator: equals
value: true
severity: error
message: Profile should not have Modify All Data permission

# Basic progress information
sfp project:analyze --compliance-rules config/compliance-rules.yaml --loglevel info
# Detailed debugging information
sfp project:analyze --compliance-rules config/compliance-rules.yaml --loglevel debug

extends: default
rules:
# Security Rules
- id: no-hardcoded-ids
enabled: true
- id: no-hardcoded-urls
enabled: true
# API Version Standards
- id: minimum-api-version
name: Enforce API Version 59+
enabled: true
metadata: [ApexClass, ApexTrigger]
field: ApexClass.apiVersion
operator: greater_or_equal
value: 59.0
severity: warning
# Profile Security
- id: profile-no-modify-all
enabled: true
severity: error

extends: default
rules:
# Field Documentation
- id: field-missing-description
enabled: true
severity: warning
- id: field-missing-help-text
enabled: true
severity: warning
# Object Documentation
- id: object-missing-description
enabled: true
severity: warning
# Validation Rules
- id: validation-rule-missing-description
enabled: true
severity: info
message: Test classes should follow naming convention with 'Test' suffix