The following list of software is the minimum needed to get started with sfp. We assume that you are familiar with the Salesforce CLI and are comfortable using VS Code. To keep the material simple, we also assume you already have access to a Salesforce sandbox to which you can build and install artifacts.
Good work! If you made it through the getting started guide with minimal errors and questions, you are well on your way to introducing sfp into your Salesforce delivery workflow.
Let's summarize what you have done:
Set up the prerequisite software on your workstation and gained access to a Salesforce org.
Installed the latest sfp cli.
Configured your source project and added additional properties required for sfp cli to generate artifacts.
Built artifact(s) locally to be used for deployment.
Installed the artifact(s) to a target org.
This is just the tip of the iceberg of the features sfp can provide for you and your team. Please continue to read further and experiment.
For any comments or recommendations on sfp, please join our Slack Community. If you are adventurous, contribute!
sfp requires node-gyp for its dependencies. If you face issues with node-gyp during installation, please follow the instructions at https://www.npmjs.com/package/node-gyp
Head to https://source.flxbl.io/flxbl/sfp-pro/releases and download the .tgz file from one of the releases.
sfp-pro requires node-gyp for its dependencies. If you face issues with node-gyp during installation, please follow the instructions at https://www.npmjs.com/package/node-gyp
In the earlier section, we looked at how to configure the project directory for sfp. Now let's walk through some core commands.
The build command generates a zipped artifact file for each package you have defined in the sfdx-project.json file. The artifact file contains the metadata and source code at the point of build creation, ready for you to install.
Open up a terminal within your Salesforce project directory and enter the following command:
You will see logs with the details of your package creation; for instance, here is a sample output.
A new "artifacts" folder will be generated within your source project containing a zipped artifact file for each package defined in your sfdx-project.json file.
For example, the artifact files follow a naming convention with "_sfpowerscripts_artifact_" between the package name and the version number.
package-name_sfpowerscripts_artifact_1.0.0-1.zip
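As a sketch, the expected artifact file name can be assembled from the package name and version number (the values here are illustrative; the trailing `-1` is the build number):

```shell
# Assemble the expected artifact file name for a package.
# PACKAGE_NAME and VERSION are illustrative placeholders.
PACKAGE_NAME="package-name"
VERSION="1.0.0-1"
ARTIFACT_FILE="${PACKAGE_NAME}_sfpowerscripts_artifact_${VERSION}.zip"
echo "$ARTIFACT_FILE"  # package-name_sfpowerscripts_artifact_1.0.0-1.zip
```

This mirrors the naming convention above, which can be handy when scripting lookups in the artifacts folder.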
Ensure you have authenticated your Salesforce CLI to an org first. If you haven't, please find the instructions here.
Once you have authenticated to the sandbox, you can execute the installation command as below.
Navigate to your target org and confirm that the package is now installed with the expected changes from your artifact. In this example above, a new custom field has been added to the Account Standard Object.
Depending on the type of package, sfp will trigger the corresponding test classes within the package directory, which could result in failures during installation. Please fix the issues in your code and repeat until you get a successful installation. If your package doesn't have sufficient test coverage, you may need to run all tests in the org to get your package installed. Refer to the material here.
sfp docker images are published to the flxbl-io GitHub Packages registry at the link provided below.
One can utilize the flxbl-io sfp images by using the
Ensure you have enabled Dev Hub in your production org or created a developer org.
Ensure you have authenticated your local development environment to your DevHub. If you are not familiar with the process, you can follow the instructions provided by Salesforce.
sfp cli only works with a Salesforce DX project in source format, as described . If your project is not in source format, you will need to convert it to source format using the
Navigate to your sfdx-project.json file and locate the packageDirectories property.
In the above example, the package directories are
force-app
unpackaged
utils
Add the following additional attributes to the sfdx-project.json and save.
package
versionNumber
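As a minimal sketch (the package name, version, and API version are illustrative), a sfdx-project.json with these attributes added might look like:

```json
{
  "packageDirectories": [
    {
      "path": "force-app",
      "package": "force-app",
      "versionNumber": "1.0.0.NEXT",
      "default": true
    }
  ],
  "sourceApiVersion": "60.0"
}
```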
That's the minimal configuration required to run sfp on a project.
Move on to the next chapter to execute sfp commands in this directory.
The Salesforce CLI is a command-line interface that simplifies development and build automation when working with your Salesforce org. There are numerous commands that the sf cli provides natively; they are beyond the scope of this site and can be found on the official Salesforce Documentation Site.
From a build, test, and deployment perspective, the following diagram depicts the bare minimum commands necessary to get up and running: setting up your sf project, then retrieving and deploying code to the target environments.
The diagram below depicts the basic flow of the development and test process, building artifacts, and deploying to target environments.
Once you have mastered the basic workflow, you can progress to publishing artifacts to an NPM repository that stores immutable versions of the metadata and code used to drive the release of your packages across Salesforce environments.
The list below is a curated list of core sf cli and Salesforce DX developer guides for your reference.
SF CLI
sfp
You can also pin to a specific version of the docker image by using the published version. To preview the latest images for the docker image, visit the and update your container image reference. For example:
For detailed configurations on this sfdx-project.json schema for sfp, click .
sfp is based on the for building a command line interface (CLI) in . Instead of being a typical Salesforce CLI plugin, sfp is standalone and leverages the same core libraries and APIs as the . sfp releases are independently managed, and as the core npm libraries are stable, we will update them as needed to ensure no breaking changes are introduced.
Unlocked/Org Dependent Unlocked Packages
There is a huge amount of documentation on unlocked packages. Here is a list of curated links that can help you get started on learning more about unlocked packages.
The Basics
Advanced Materials
The following sections deal with the more operational aspects of working with unlocked packages.
Unlocked packages, excluding org-dependent unlocked packages, have mandatory test coverage requirements: each package must have a minimum of 75% coverage. A validated build (the build command in sfp) validates the coverage of a package during the build phase. To enable feedback earlier in the process, sfp provides functionality to validate the test coverage of a package, for example during the Pull Request validation process.
For unlocked packages, we ask users to follow semantic versioning of packages.
Please note that Salesforce packages do not support the concept of PreRelease/BuildMetadata; the last segment of a version number is a build number. We recommend utilizing the auto-increment functionality provided by Salesforce rather than rolling out your own build number substitution (use 'NEXT' when describing the build version of the package, and 'LATEST' for the build number where the package is used as a dependency).
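As a sketch of how 'NEXT' and 'LATEST' typically appear in an sfdx-project.json (the package names and versions are illustrative):

```json
{
  "packageDirectories": [
    {
      "path": "core-crm",
      "package": "core-crm",
      "versionNumber": "1.2.0.NEXT",
      "dependencies": [
        { "package": "sfdc-logging", "versionNumber": "1.0.0.LATEST" }
      ]
    }
  ]
}
```

Here 'NEXT' lets Salesforce auto-increment the build number of the package being built, while 'LATEST' resolves the dependency to its most recent build.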
Note that an unlocked package must be promoted before it can be installed in a production org, and either the major, minor, or patch (not build) version must be higher than that of the last promoted version of the package. These version number changes should be made in the sfdx-project.json file before the final package build and promotion.
Unlocked packages provide traceability in the org by locking down metadata components to the package that introduces them. This feature, which is the main benefit of unlocked packages, can also create issues when you want to refactor components from one package to another. Let's look at some scenarios and the common strategies that need to be applied.
Consider a project that has two packages, Package A and Package B, where Package B is dependent on Package A.
Remove a component from Package A, provided the component has no dependencies
Solution: Create a new version of Package A with the metadata component being removed and install the package.
Move a metadata component from Package A to Package B
Solution: This scenario is straightforward: remove the metadata component from Package A and move it to Package B. When a new version of Package A gets installed, the following things happen:
If the deployment mode of the unlocked package is set to mixed, and no other metadata component is dependent on the component, the component gets deleted.
On the subsequent install of Package B, Package B restores the field and takes ownership of the component.
Move a metadata component from Package B to Package A, where the component currently has other dependencies in Package B
Solution: In this scenario, one can move the component to Package A and get the packages built. However, during deployment to an org, Package A will fail with an error stating that the component exists in Package B. To mitigate this, one should do the following:
Deploy a version of Package B that removes the lock on the metadata component using deprecate mode. Sometimes this needs extensive refactoring of other components to break the dependencies, so evaluate whether the approach will work.
If not, you can go to the UI (Setup > Packaging > Installed Packages > <Name of Package> > View Components and Remove) and remove the lock for a package.
Package dependencies are defined in the sfdx-project.json. More information on defining package dependencies can be found in the Salesforce docs.
Let's unpack the concepts utilizing the above example:
There are two unlocked packages
Expense Manager - Util is an unlocked package in your DevHub, identifiable by 0H in the packageAlias
Expense Manager - another unlocked package, which is dependent on 'Expense Manager - Util', 'TriggerFramework' and 'External Apex Library - 1.0.0.4'
External Apex Library is an external dependency; it could be a managed package or any unlocked package released on a different Dev Hub. All external package dependencies must be defined with a 04t ID, which can be determined from the installation URL on AppExchange or by contacting your vendor.
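A sketch of how such a dependency setup might be declared; the alias IDs below are illustrative placeholders, not real package version IDs:

```json
{
  "packageDirectories": [
    {
      "path": "expense-manager",
      "package": "Expense Manager",
      "versionNumber": "1.0.0.NEXT",
      "dependencies": [
        { "package": "Expense Manager - Util", "versionNumber": "1.0.0.LATEST" },
        { "package": "External Apex Library - 1.0.0.4" }
      ]
    }
  ],
  "packageAliases": {
    "Expense Manager - Util": "0Ho...",
    "Expense Manager": "0Ho...",
    "External Apex Library - 1.0.0.4": "04t..."
  }
}
```

Note the 0H prefix for packages owned by this DevHub versus the 04t subscriber package version ID for the external dependency.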
Unlocked packages have two build modes: one that skips the dependency check and one that does not. A package built by skipping the dependency check can't be deployed into production, while a fully validated build can take a long time. sfp cli tries to build packages in parallel by understanding your dependencies; however, some of your packages could still spend a significant amount of time in validation.
In these situations, we ask you to consider whether the average time taken to build all validated packages is within your build budget. If not, here are your options:
Move to an org-dependent package: Org-dependent unlocked packages are a variant of unlocked packages. Org-dependent packages do not validate the dependencies of a package and will build faster. However, please note that in every org where the earlier unlocked package was installed, the package must be deprecated and the component locks removed before the new org-dependent unlocked package is installed.
Move to a source package: Use this as a last resort; source packages have fairly loose lifecycle management.
Create a source package and move the metadata and any associated dependencies over to that particular package.
In Salesforce, a package is a container that groups together metadata and code in order to facilitate the deployment and distribution of customizations and apps. Salesforce supports different types of packages, such as:
Unlocked Packages: These are more modular and flexible than traditional managed packages, allowing developers to group and release customizations independently. They are version-controlled, upgradeable, and can include dependencies on other packages.
Org-Dependent Unlocked Packages: Similar to unlocked packages but with a dependency on the metadata of the target org, making them less portable but useful for specific org customizations.
Managed Packages: Managed packages are a type of Salesforce package primarily used by ISVs (Independent Software Vendors) for distributing and selling applications on the Salesforce AppExchange. They are fully encapsulated, which means the underlying code and metadata are not accessible or editable by the installing organization. Managed packages support versioning, dependency management, and can enforce licensing. They are ideal for creating applications that need to be securely distributed and updated across multiple Salesforce orgs without exposing proprietary code.
sfp augments the above formal Salesforce package types with additional package types, described below.
Source Packages: These are not formal packages in Salesforce but refer to a collection of metadata and code retrieved from a Salesforce org or version control that can be deployed to an org but aren't versioned or managed as a single entity.
Diff Packages: These are not a formal type of Salesforce package but refer to packages created by determining the differences (diff) between two sets of metadata, often used for deploying specific changes.
In the context of sfp, an artifact represents a more enhanced concept compared to a Salesforce package. While it is based on a Salesforce package (of any type mentioned above), an artifact in sfp includes additional attributes and metadata that describe the package version, dependencies, installation behavior, and other context-specific information relevant to the CI/CD process. Artifacts in sfp are designed to be more than just a bundle of code and metadata; they encapsulate the package along with its CI/CD lifecycle information, making them more aligned with DevOps practices.
Key differences between Salesforce packages and sfp artifacts include:
Versioning and Dependencies: While Salesforce packages support versioning, sfp artifacts enrich this with detailed dependency tracking, ensuring that the CI/CD pipeline respects the order of package installations based on dependencies.
Installation Behavior: Artifacts in sfp carry additional metadata that defines custom installation behaviors, such as pre- and post-installation scripts or conditional installation steps, which are not inherently part of Salesforce packages.
CI/CD Integration: Artifacts in sfp are specifically designed to fit into a CI/CD pipeline, with support for storage in an artifact registry, version tracking, and release management; these capabilities are essential for automated deployments but are outside the scope of Salesforce packages themselves.
sfp is a purpose-built, cli-based tool for modular Salesforce development and release management. sfp aims to streamline and automate the build, test, and deployment processes of Salesforce metadata, code, and data. It extends sf cli functionalities, focusing on artifact-driven development to support #flxbl Salesforce project development.
Built with a codified process: sfp cli is derived from extensive experience in modular Salesforce implementations. By embracing the #FLXBL framework, it aims to streamline the process of creating a well-architected, composable Salesforce org. sfp cli eliminates the time-consuming effort usually spent on re-inventing fundamental processes, helping you achieve your objectives faster.
Artifact-Centric Approach: sfp packages Salesforce code and metadata into artifacts and deployment details, ensuring consistent deployments and simplified version management across environments.
Best-in-Class Mono Repo Support: Offers robust support for mono repositories, facilitating streamlined development, integration, and collaboration.
Support for Multiple Package Types: sfp accommodates various Salesforce package types with streamlined commands, enabling modular development, independent versioning, and flexible deployment strategies.
Orchestrate Across Entire Lifecycle: sfp provides an extensive set of functionality across the entire lifecycle of your Salesforce development.
End-to-End Observability: sfp is built with comprehensive metrics emitted on every command, providing unparalleled visibility into your ALM process.
sfp incorporates a suite of commands to aid in your end-to-end development cycle for Salesforce. Starting with the core commands, you can perform basic workflows to build and deploy artifacts (locally to start, and to an NPM artifact repository after) across environments through the command line. As you get comfortable with the core commands, you can utilize more advanced commands and flags in your CI/CD platform to drive a complete release process, leveraging release definitions, change logs, metrics, and more.
sfp is constantly evolving and driven by the passionate community that has embraced our work methods. Over the years, we have introduced utility commands to solve pain points specific to the Salesforce Platform. The commands have been successfully tested and used on large scale enterprise implementations. As we continue to grow the toolset, we hope to introduce more commands to address the future wave of challenges.
Below is a high-level snapshot of the main topics and commands of sfp.
Packages are containers used to group related metadata together. A package contains components such as objects, fields, Apex, flows, and more, allowing these elements to be easily installed, upgraded, and managed as a single unit. Packages in the context of sfp are not limited to second-generation packaging (2GP); sfp supports various types of packages, which you can read about in the following sections.
sfp is a purpose-built cli tool used predominantly in modular Salesforce projects.
A project utilizing sfp implements the following concepts.
In an sfp-powered Salesforce project, "Domains" represent distinct business capabilities. A project can encompass multiple domains, and each domain may include various sub-domains. Domains are not explicitly declared within sfp but are conceptually organized through "Release Configs".
A package (synonymous with module) is a container that groups related metadata together. A package contains components such as objects, fields, Apex, flows, and more, allowing these elements to be easily installed, upgraded, and managed as a single unit. A package is a directory in your project repository, defined by an entry in sfdx-project.json.
An artifact represents a versioned snapshot of a package at a specific point in time. It includes the source code from the package directory (as specified in sfdx-project.json), along with metadata about the version, change logs, and other relevant details. Artifacts are the deployable units in the sfp framework, ensuring consistency and traceability across the development lifecycle.
When sfp is integrated into a Salesforce project, it centers around the following key processes.
'Building' refers to the creation of an artifact from a package. Utilizing the build command, sfp facilitates the generation of artifacts for each package in your repository. This process encapsulates the package's source code and metadata into a versioned artifact ready for installation.
Publishing a domain involves the process of publishing the artifacts generated by the build command into an artifact repository. This is the storage area where the artifacts are fetched for releases, rollback, etc.
Releasing a domain involves promoting and installing a collection of artifacts to a higher org, such as production, and generating an associated changelog for the domain. This process is driven by the release command along with a release definition.
sfp cli supports operations on various types of packages within your repository. A short summary comparing the different package types is provided below.
| Features | Unlocked Packages | Org Dependent Unlocked Packages | Source Packages | Data Packages | Diff Packages |
| --- | --- | --- | --- | --- | --- |
sfp is a natural fit for organisations that utilize Salesforce in a large enterprise setting, as it is purpose-built to deal with modular Salesforce development. Often these organisations have teams dedicated to a particular business function; think of a team that works on features related to billing, while another team works on features related to service.
The diagram illustrates the method of organizing packages into specific categories for various business units. These categories are referred to as 'domains' within the context of sfp cli. Each domain can contain further domains (sub-domains), and each domain consists of one or more packages.
sfp cli utilizes to organise packages into domains. You can read more about creating a release config in the next section
A diff package is a variant of '' , where the contents of the package only contain the components that have changed. This package is generated by sfpowerscripts by computing a git diff of the current commit id against a baseline set in the Dev Hub org.
A diff package mimics a model where only changes are contained in the artifact. As such, this package is always an incremental package: it deploys only the changes relative to the baseline to the target org. Unless both the previous version and the current version have exactly the same components, a diff package can never be rolled back, as the impact on the org is unpredictable. It is always recommended to roll forward a diff package by fixing or creating a change that counters the unwanted impact on the target orgs.
Diff packages don't work with scratch orgs; they should be used with sandboxes only.
A diff package is the least consistent package among the various package types available within sfpowerscripts, and should only be used for transitioning to a modular development model as prescribed by flxbl
The example below demonstrates an sfdx-project.json where the package unpackaged is a diff package. You can mark a diff package with the type 'diff'. All other attributes applicable to source packages are also applicable to diff packages.
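A sketch of such an entry (the path and version number are illustrative):

```json
{
  "packageDirectories": [
    {
      "path": "unpackaged",
      "package": "unpackaged",
      "versionNumber": "1.0.0.NEXT",
      "type": "diff"
    }
  ]
}
```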
A manual entry should be made in the sfpowerscripts_Artifact2__c custom object with the name of the package, the baseline commit id, and the version. Subsequent deployments will automatically reset the baseline when the package gets deployed to the Dev Hub.
Source packages are an sfp cli feature that mimics the behaviour of native unlocked packages.
From a Salesforce perspective, source packages are metadata deployments: a group of components that are deployed to an org. Unlocked packages, in contrast, are a first-class Salesforce deployment construct whose lifecycle is governed by the org, including deleting/deprecating metadata and validating versions.
We always recommend using unlocked packages over source packages whenever you can. As a matter of preference, this is our priority order of package types:
Source Packages
Source packages are typically used for application config (configuring an application delivered by a managed package, such as changes to the help text/description of fields) or when you come across these constraints:
Facing bugs while deploying the metadata using unlocked packages
Unlocked package validation takes too long (we still recommend going org-dependent)
Dealing with metadata that is global or org-specific in nature (such as queues, profiles, or composite UI layouts, which don't make sense to package using an unlocked package)
Development teams who are starting to adopt package-based development and want to organize their metadata
A Salesforce org can be composed only of source packages; however, the lack of dependency validation (as in unlocked packages) and the lack of automated destruction of changes can make this a bit challenging. As the Salesforce org is not aware of a source package, there is no component lock, so one source package can always overwrite a metadata component deployed by another package. For these reasons, we always recommend you prefer unlocked packages or their variant, org-dependent unlocked packages.
An example of a common metadata component that typically gets overridden is which can span across multiple packages.
Source packages can depend on other unlocked packages or managed packages; however, the dependencies of source packages are validated at deployment time, i.e., source packages assume the dependent metadata is already in your org before the metadata in the source package is deployed. That said, for the purposes of development in scratch orgs, you can add 'unlocked/managed package' dependencies to a source package, so that sfp cli commands like prepare and validate (in sfpowerscripts:orchestrator) will install the dependencies to the target org before proceeding to install the source package.
Org-dependent unlocked packages, a variation of unlocked packages, allow you to create packages that depend on unpackaged metadata in the target org. Org-dependent packages are very useful in the context of orgs that have lots of metadata and are struggling with understanding the dependencies while building a ''
Org dependent packages significantly enhance the efficiency of #flxbl projects who are already on scratch org based development. By allowing installation on a clean slate, these packages validate dependencies upfront, thereby reducing the additional validation time often required by unlocked packages.
Org-dependent unlocked packages bypass the test coverage requirements, enabling installation in production without standard validation. This differs significantly from metadata deployments, where each Apex class deployed must meet a 75% coverage threshold or rely on the org's overall test coverage. While beneficial for large, established orgs, this approach should be used cautiously.
To address this, sfpowerscripts incorporates a default test coverage validation for org-dependent unlocked packages during the validation process. To disable this test coverage check during validation, additional attributes must be added to the package directory in the sfdx-project.json file.
Data packages are a sfpowerscripts construct that utilises the to create a versioned artifact of Salesforce object records in CSV format, which can be deployed to a Salesforce org using the sfpowerscripts package installation command.
The data package offers a seamless method of integrating Salesforce data into your CI/CD pipelines, and is primarily intended for record-based configuration of managed packages such as CPQ, Vlocity (Salesforce Industries), and nCino.
Data packages are a wrapper around SFDMU that provide a few key benefits:
Ability to skip the package if already installed: by keeping a record of the installed version of the package in the target org (with the support of an unlocked package), sfpowerscripts can skip installation of a data package if it is already installed in the org.
Versioned artifact: aligned with the sfpowerscripts principle of traceability, every deployment is traceable to a versioned artifact, which is difficult to achieve when you deploy from a folder.
Orchestration: data package creation and installation can be orchestrated by sfpowerscripts, which means less scripting.
Simply add an entry in the package directories, providing the package's name, path, version number, and type (data). Your editor may complain that the 'type' property is not allowed; this can be safely ignored.
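A sketch of a data package entry (the path, name, and version are illustrative):

```json
{
  "packageDirectories": [
    {
      "path": "data/reference-data",
      "package": "reference-data",
      "versionNumber": "1.0.0.NEXT",
      "type": "data"
    }
  ]
}
```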
Data packages support the following options, through the sfdx-project.json.
sfpowerscripts supports Vlocity RBC migration using the vlocity build tool (vbt). sfpowerscripts will automatically detect whether a data package needs to be deployed using vlocity or sfdmu. (Please note: to enable vlocity while preparing scratch orgs, the enableVlocity flag needs to be turned on in the pool configuration file.)
A vlocity data package needs to have a vlocityComponents.yaml file in the root of the package directory, with the following definition:
The same package would be defined in the sfdx-project.json as follows
This feature is activated by default whenever build/quickbuild runs, including in implicit scenarios such as validate, prepare, etc., which might result in building packages.
Let's consider the following sfdx-project.json to explain how this feature works.
The above project manifest (sfdx-project.json) describes three packages: sfdc-logging, feature-mgmt, and core-crm. Each package is defined with dependencies as described below.
As you might have noticed, this is an incorrect representation: per the definition of unlocked packages, the package 'core-crm' should explicitly define all of its dependencies. This means it should be as described below.
To successfully create a version of core-crm, both sfdc-logging and feature-mgmt should be defined as explicit dependencies in the sfdx-project.json.
As the number of packages in your project grows, developers might accidentally miss declaring dependencies, or the sfdx-project.json may become very large due to the repetition of dependencies between packages. This condition often results in the build stage failing with missing dependencies.
sfp features transitive dependency resolution, which can autofill the dependencies of a package by inferring them from the sfdx-project.json, so the above package descriptor of core-crm will be resolved correctly.
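A sketch of the resolved descriptor for core-crm, explicitly listing both inferred dependencies (the version numbers are illustrative):

```json
{
  "path": "core-crm",
  "package": "core-crm",
  "versionNumber": "1.0.0.NEXT",
  "dependencies": [
    { "package": "sfdc-logging", "versionNumber": "1.0.0.LATEST" },
    { "package": "feature-mgmt", "versionNumber": "1.0.0.LATEST" }
  ]
}
```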
Please note that, in the current iteration, it autofills dependency information from the current sfdx-project.json and doesn't consider variations among previous versions.
For dependencies outside of the sfdx-project.json, one could define an externalDependencyMap as shown below
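A sketch of what such a mapping could look like; the plugin key and the exact structure shown here are assumptions, so consult the sfp sfdx-project.json schema for the authoritative shape:

```json
{
  "plugins": {
    "sfp": {
      "externalDependencyMap": {
        "tech-framework@2.0.0.38": [
          { "package": "sfdc-framework" }
        ]
      }
    }
  }
}
```

The idea is to map an externally built package (here, a hypothetical "tech-framework") to the dependencies that must be resolved for it.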
If you need to disable this feature and have stringent dependency management rules, utilize the following in your sfdx-project.json
An external dependency is a package that is not defined within the current repository's sfdx-project.json. Managed packages and unlocked packages built from other repositories fall into the 'external dependency' bucket. The IDs of external packages have to be defined explicitly in the packageAliases section.
SFP-Pro provides Docker images through our self-hosted Gitea registry at source.flxbl.io. These pre-built images are maintained and updated regularly with the latest features and security patches.
Access to source.flxbl.io (Gitea server)
Docker installed on your machine
Registry credentials from your welcome email
Login to the Gitea registry:
Pull the desired image:
The version numbers can be found at https://source.flxbl.io/flxbl/-/packages/container/sfp-pro/
(Optional) Tag for your registry:
Use specific version tags in production
Cache images in your private registry for better performance
Implement proper access controls in your registry
Document image versions used in your pipelines
If you need to build the images yourself, you can access the source code from source.flxbl.io and follow these instructions:
Docker with BuildKit support
GitHub Personal Access Token with packages:read permissions
Node.js (for local development)
The following build arguments are supported:
NODE_MAJOR: Node.js major version (default: 22)
SFP_VERSION: Version of SFP Pro to build
GIT_COMMIT: Git commit hash for versioning
SF_COMMIT_ID: Salesforce commit ID
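A sketch of a local build using the arguments above; the Dockerfile location and the image tag are assumptions:

```shell
# Build the sfp-pro image locally, overriding the supported build arguments
docker build \
  --build-arg NODE_MAJOR=22 \
  --build-arg SFP_VERSION=<version> \
  --build-arg GIT_COMMIT="$(git rev-parse HEAD)" \
  --build-arg SF_COMMIT_ID=<salesforce_commit_id> \
  -t sfp-pro:local .
```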
For issues or questions about Docker images, please contact flxbl support through your designated support channels.
This section details how sfp analyzes and classifies different package types using the information in sfdx-project.json.
Unlocked Packages are identified if a matching alias with a package version ID is found and verified through the DevHub. For example, the package named "Expense-Manager-Util" is found to be an Unlocked package upon correlation with its alias "Expense Manager - Util" and subsequent verification.
Source Packages are assumed if no matching alias is found in packageAliases. These packages are typically used for source code that is not meant to be packaged and released as a managed or unlocked package.
The presence of an additional type attribute within a package directory will further inform sfp of the specific nature of the package, for instance, "data" for data packages or "diff" for diff packages.
The sfdx-project.json file outlines various specifications for Salesforce DX projects, including the definition and management of different types of Salesforce packages. From the sample provided, sfp analyzes the "package" attribute within each packageDirectories entry, correlating it with packageAliases to identify package IDs, thereby determining whether the package is a 2GP (Second Generation Packaging) package.
Consider the following sfdx-project.json
sfdx-project.json
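A minimal sketch of such a project manifest, assuming only the package paths and aliases referenced in the walkthrough below (the 0Ho ID is a placeholder):

```json
{
  "packageDirectories": [
    {
      "path": "util",
      "package": "Expense-Manager-Util",
      "versionNumber": "4.7.0.NEXT"
    },
    {
      "path": "exp-core-config",
      "package": "exp-core-config",
      "versionNumber": "1.0.0.NEXT"
    },
    {
      "path": "expense-manager-test-data",
      "package": "expense-manager-test-data",
      "type": "data",
      "versionNumber": "1.0.0.NEXT"
    }
  ],
  "name": "expense-manager",
  "sourceApiVersion": "59.0",
  "packageAliases": {
    "Expense Manager - Util": "0HoXXXXXXXXXXXXXXX"
  }
}
```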
The sfdx-project.json sample can be used to determine how sfp processes and categorizes packages within a Salesforce DX project. The determination process for each package type, based on the attributes defined in packageDirectories, unfolds as follows:
Unlocked Packages: For a package to be identified as an Unlocked package, sfp looks for a correlation between the package name defined in packageDirectories and an alias within packageAliases. In the provided example, "Expense-Manager-Util" under the util path is matched with its alias "Expense Manager - Util", subsequently confirmed through the DevHub with its package version ID, categorizing it as an Unlocked package.
Source Packages: If a package does not have a corresponding alias in packageAliases, it is treated as a Source package. These packages are typically utilized for organizing source code not intended for release. For instance, packages specified in paths like "exp-core-config" and "expense-manager-test-data" would default to Source packages if no matching aliases are found.
Specialized Package Types: The explicit declaration of a type attribute within a package directory allows for differentiation into more specialized package types. For example, the specification of "type": "data" explicitly marks a package as a Data package, targeting use cases different from typical code packages.
sfp features commands and functionality that help in dealing with the complexity of defining dependencies of unlocked packages. These functionalities are designed considering aspects of a #flxbl project, such as the predominant use of mono repositories in non-ISV scenarios.
Package dependencies are defined in the sfdx-project.json (Project Manifest). More information on defining package dependencies can be found in the Salesforce documentation.
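A sketch of such a manifest, matching the example unpacked below (the 0Ho and 04t IDs are placeholders):

```json
{
  "packageDirectories": [
    {
      "path": "util",
      "package": "Expense Manager - Util",
      "versionNumber": "4.7.0.NEXT"
    },
    {
      "path": "exp-core",
      "package": "Expense Manager",
      "versionNumber": "3.2.0.NEXT",
      "dependencies": [
        { "package": "Expense Manager - Util", "versionNumber": "4.7.0.LATEST" },
        { "package": "TriggerFramework", "versionNumber": "1.7.0.LATEST" },
        { "package": "External Apex Library - 1.0.0.4" }
      ]
    },
    {
      "path": "exp-org-config",
      "package": "Expense Manager Org Config",
      "versionNumber": "1.0.0.NEXT"
    }
  ],
  "packageAliases": {
    "Expense Manager - Util": "0HoXXXXXXXXXXXXXXX",
    "Expense Manager": "0HoXXXXXXXXXXXXXXX",
    "External Apex Library - 1.0.0.4": "04tXXXXXXXXXXXXXXX"
  }
}
```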
Let's unpack the concepts utilizing the above example:
There are two unlocked packages and one source package
Expense Manager - Util is an unlocked package in your DevHub, identifiable by the 0H prefix of its ID in packageAliases
Expense Manager - another unlocked package, which is dependent on 'Expense Manager - Util', 'TriggerFramework' and 'External Apex Library - 1.0.0.4'
Expense Manager Org Config - a source package; let's assume it has some reports and dashboards
External Apex Library is an external dependency; it could be a managed package or an unlocked package released from a different Dev Hub. All external package dependencies must be defined with a 04t ID, which can be determined from the installation URL on AppExchange or by contacting your vendor.
Source packages, org-dependent unlocked packages and data packages have an implied dependency defined by the order of installation; that is, sfp assumes any dependent metadata is already available in the target org before the installation of components within the package.
An artifact is a key concept within sfp. An artifact is a point-in-time snapshot of a version of a package, as mentioned in sfdx-project.json. The snapshot contains the source code of a package directory, additional metadata regarding the particular version, a changelog and other details. An artifact for a 2GP package would also contain details such as the package version ID.
In the context of sfp, packages are represented in your version control, and an artifact is the output of the build command when operated on a package.
Artifacts provide an abstraction over version control, as they detach version control from the point of releasing into a Salesforce org. Almost all commands in sfp operate on an artifact or generate an artifact.
Artifacts built by sfp follow a naming convention: <name_of_the_package>_sfpowerscripts_artifact_<Major>.<Minor>.<Patch>-<BuildNumber>. One can use any of the npm commands to interact with sfp artifacts.
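For instance, version 1.0.3 (build 45) of an illustrative package named core-crm would yield the following artifact name:

```shell
# Compose an artifact name following sfp's convention:
# <name_of_the_package>_sfpowerscripts_artifact_<Major>.<Minor>.<Patch>-<BuildNumber>
PACKAGE="core-crm"
VERSION="1.0.3"
BUILD_NUMBER="45"
echo "${PACKAGE}_sfpowerscripts_artifact_${VERSION}-${BUILD_NUMBER}"
# prints: core-crm_sfpowerscripts_artifact_1.0.3-45
```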
Projects that utilise sfp predominantly follow a mono-repo structure similar to the picture shown above. Each repository has a "src" folder that holds one or more packages that map to your sfdx-project.json file.
Different folders in each of the structure are explained as below:
core-crm: A folder housing the core model of your org, which is shared with all other domains.
frameworks: This folder houses multiple packages which are basically utilities/technical frameworks such as Triggers, Logging and Error Handling, Dependency Injection etc.
sales: An example of a domain in your org. Under this particular domain, multiple packages that belong to the domain are included.
src-access-mgmt: This package is typically one of the packages that is deployed second to last in the deployment order and used to store profiles, permission sets, and permission set groups that are applied across the org. Permission Sets and Permission Set Groups particular to a domain should be in their respective package directory.
src-env-specific: A package which carries metadata for each particular stage (environment) of your path to production. Some examples include named credentials, remote site settings, web links, custom metadata, custom settings, etc.
src-temp: This folder is marked as the default folder in sfdx-project.json
. This is the landing folder for all metadata and this particular folder doesn't get deployed anywhere other than a developers scratch org. This place is utilized to decide where the new metadata should be placed into.
src-ui: Should include page layouts, flexipages and Lightning/Classic apps unless we are sure these will only reference the components of a single domain package and its dependencies. In general, custom UI components such as LWC, Aura and Visualforce should be included in a relevant domain package.
runbooks: This folder stores markdown files required for each release and/or sandbox refresh, to ensure all manual steps are accounted for and version controlled. As releases are completed to production, each release runbook can be archived, as the manual steps should typically no longer be required. Sandbox refresh runbooks should be managed according to the type of sandbox, depending on whether it contains data or only metadata.
scripts: This optional folder stores commonly used Apex or SOQL scripts that need to be version controlled and referenced by multiple team members.
src-env-specific should be added to .forceignore files and should not be deployed to a scratch org.
Release configuration is a fundamental setup that outlines the organisation of packages within a project, streamlining different lifecycle stages of your project, such as validating, building and deploying/releasing artifacts. In flxbl projects, a release config is used to define the concept of a domain/subdomain. This configuration is instrumental when using sfp commands, as it allows for selective operations on the packages specified by a configuration. By employing a release configuration, teams can efficiently manage a mono repository of packages across various teams.
The table below lists the options that are currently available for release configuration.
Export your Salesforce records to csv files using the SFDMU plugin. For more information on plugin installation, creating an export.json file, and exporting to csv files, refer to Plugin Basics > Basic Usage in SFDMU's documentation.
Refer to this section for more details on how to add a pre/post script to a data package.
| Package | Incorrectly Defined Dependencies |
|---|---|
| sfdc-logging | None |
| feature-mgmt | sfdc-logging |
| core-crm | feature-mgmt |

| Package | Correctly Defined Dependencies |
|---|---|
| sfdc-logging | None |
| feature-mgmt | sfdc-logging |
| core-crm | sfdc-logging, feature-mgmt |
In the context of sfp, an artifact represents an enhanced concept compared to a Salesforce package. While it is based on a Salesforce package or the specific package types introduced by sfp, an artifact in sfp includes additional attributes and metadata that describe the package version, dependencies, installation behavior, and other context-specific information relevant to the CI/CD process. Artifacts in sfp are designed to be more than just a bundle of code and metadata; they encapsulate the package along with its CI/CD lifecycle information, making them more aligned with DevOps practices. sfp's artifacts are built to be compatible with npm-compatible package registries, and most CI/CD providers offer an npm-compatible registry to host these packages/artifacts, for instance GitHub Packages.
A release configuration can also contain additional options that are used by certain sfp commands to generate release definitions. These properties in a release definition alter the behaviour of the deployment of artifacts during a release.
| | Unlocked Packages | Org-Dependent Unlocked Packages | Source Packages | Data Packages | Diff Packages |
|---|---|---|---|---|---|
| Dependency Validation | Occurs during package creation | Occurs during package installation | Occurs during package installation | N/A | Occurs during package installation |
| Dependency Declaration | Yes | Yes (supported by sfp) | Yes | Yes | Yes (supported by sfp) |
| Requires dependency to be resolved during creation | Yes | No | No | N/A | No |
| Supported Metadata Types | Unlocked Package Section in Metadata Coverage Report | Unlocked Package Section in Metadata Coverage Report | Metadata API Section in Metadata Coverage Report | N/A | Metadata API Section in Metadata Coverage Report |
| Code Coverage Requirement | Package should have 75% code coverage or more | Not enforced by Salesforce; sfp by default checks for 75% code coverage | Each Apex class should have a coverage of 75% or above for optimal deployment, otherwise the entire coverage of the org will be utilized for deployment | N/A | Each Apex class that's part of the delta between the current version and the baseline needs a test class and requires a coverage of 75% |
| Component Lifecycle | Automated | Automated | Explicit; utilize destructiveManifest or manual deletion | N/A | Explicit; utilize destructiveManifest or manual deletion |
| Component Lock | Yes, only one package can own the component | Yes, only one package can own the component | No | N/A | No |
| Version Management | Salesforce enforced versioning; promotion required to deploy to prod | Salesforce enforced versioning; promotion required to deploy to prod | sfp enforced versioning | sfp enforced versioning | sfp enforced versioning |
| Parameter | Required | Type | Description |
|---|---|---|---|
| releaseName | No | String | Name of the release config; in a flxbl project, this name is used as the name of the domain |
| pool | No | String | Name of the scratch org or sandbox pool associated with this release config during validation |
| excludeArtifacts | No | Array | An array of artifacts that need to be excluded while creating the release definition |
| includeOnlyArtifacts | No | Array | An array of artifacts that should only be included while creating the release definition |
| dependencyOn | No | Array | An array of packages that denotes the dependencies this configuration has; the dependencies mentioned will be used for synchronization in review sandboxes |
| excludePackageDependencies | No | Array | Exclude the mentioned package dependencies from the release definition |
| includeOnlyPackageDependencies | No | Array | Include only the mentioned package dependencies in the release definition |
| releasedefinitionProperties | No | Object | Properties of the release definition that should be added to the generated release definition; see below |

| Parameter | Required | Type | Description |
|---|---|---|---|
| releasedefinitionProperties.skipIfAlreadyInstalled | No | Boolean | Skip installation of an artifact if it's already installed in the target org |
| releasedefinitionProperties.baselineOrg | No | String | The org used to decide whether to skip installation of an artifact; defaults to the target org when not provided |
| releasedefinitionProperties.promotePackagesBeforeDeploymentToOrg | No | String | Promote packages before they are installed into an org that matches the alias of the org |
| releasedefinitionProperties.changelog.repoUrl | No | String | The URL of the version control system to push changelog files to |
| releasedefinitionProperties.changelog.workItemFilters | No | Array | An array of regular expressions used to identify work items in your commit messages |
| releasedefinitionProperties.changelog.workitemUrl | No | String | The generic URL of work items, to which work item codes are appended, allowing easy redirection to user stories by clicking on the work-item link in the changelog |
| releasedefinitionProperties.changelog.limit | No | Number | Limit the number of releases to display in the changelog markdown |
| releasedefinitionProperties.changelog.showAllArtifacts | No | Boolean | Whether to show artifacts that haven't changed between releases |
sfp cli features an intuitive command to build artifacts of all the packages in your project directory. The 'build' command automatically detects the type of each package and builds an artifact individually for each package.
By default, sfp's build command builds a new version of every package in your project directory and creates its associated artifact.
sfp's build command is also equipped with the ability to selectively build only packages that have changed. Read more on how sfp determines whether a package should be built in the subsequent sections.
Assume you have a domain 'sales' as defined by the release config sales.yaml, as shown in this example here.
In order to build the artifacts of the packages defined by the above release config, you would use the build command with the flags as described here.
If you only need to build packages that have changed from the last published packages, you would add the additional diffcheck flag.
diffcheck will work accurately only if the build command is able to access the latest tag in the repository. In certain CI systems, if the command is operated on a repository where only the head commit is checked out, diffcheck will result in building artifacts for all packages within the domain.
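A sketch of the invocation; the flag names (-v for the DevHub alias, --branch, --diffcheck) are assumptions based on sfp's usual command syntax:

```shell
# Build only the packages that have changed since the last published artifacts
sfp build -v devhub --branch main --diffcheck
```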
An artifact can be built individually for a package; sfp will determine the type of the package, and the corresponding APIs will be invoked.
sfp provides various features to alter the installation behaviour of a package. These behaviours have to be applied as additional properties of a package during build time. The following section details each of the parameters that are available.
sfp is built on the concept of immutable artifacts, hence any properties to control the installation aspects of a package need to be applied during the build command. The installation behaviour of a package cannot be controlled dynamically. If you need to alter or add a new behaviour, please build a new version of the artifact.
Artifacts to be built can be limited through various mechanisms. This section covers the techniques available to limit the artifacts being built.
Artifacts to be built can be restricted by domain during a build process in sfp by utilizing specific configurations. Consider the release config example provided in the section above.
You can pass the path of the config file to the build command to limit the artifacts being built, as in the sample below.
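A sketch, assuming the release config lives at config/sales.yaml and that the build command accepts a releaseconfig flag:

```shell
# Limit the build to the packages defined in the 'sales' release config
sfp build -v devhub --branch main --releaseconfig config/sales.yaml
```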
Artifacts to be built can also be limited to certain packages. This can be very helpful when you want to build an artifact for only one package, or a few, while testing locally.
Packages can be ignored by the build command by utilising an additional descriptor on the package in the project manifest (sfdx-project.json).
With such a descriptor, src-env-specific-pre will be ignored when the build command is invoked.
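For example, a package entry carrying this descriptor (path and version number illustrative):

```json
{
  "path": "src/src-env-specific-pre",
  "package": "src-env-specific-pre",
  "versionNumber": "1.0.0.0",
  "ignoreOnStage": ["build"]
}
```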
All packages start out as a directory in your repo!
A package is a collection of metadata grouped together in a directory and defined by an entry in your sfdx-project.json (Project Manifest).
Each package in the context of sfp needs to have the following attributes as the absolute minimum:

| Attribute | Required | Description |
|---|---|---|
| package | Yes | Name of the package |
| versionNumber | Yes | Version of the package |

sfp will not consider any entries in your sfdx-project.json for its operations if they are missing the 'package' or 'versionNumber' attribute. By default, sfp treats all entries in sfdx-project.json as Source Packages. If you need to create an unlocked or an org-dependent unlocked package, proceed to create the packages using the steps detailed below.
sfp-pro users can create a source package by using the following command:

```shell
sfp package create source -n "my-source-package" --domain "my-domain" -r "path"
```
sfp-pro users can create an unlocked package by using the following command:

```shell
sfp package create unlocked -n "my-unlocked-package" --domain "my-domain" -r "path" -v devhub
```
To create an unlocked package using the Salesforce CLI, follow the steps below:
Identify the Package Directory: Ensure that your sfdx-project.json contains an entry for the package you wish to turn into an unlocked package. The entry must include path, package, and versionNumber.
Create a Package: If the package does not already exist, create it with the command:
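A sketch of the command using current sf CLI syntax:

```shell
# Create an unlocked package; an alias for it is recorded in packageAliases of sfdx-project.json
sf package create --name <package_name> --path <package_directory> \
  --package-type Unlocked --target-dev-hub <alias_for_org>
```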
Replace <package_name> with the name of your unlocked package, <package_directory> with the package directory specified in your sfdx-project.json, and <alias_for_org> with the alias for your Salesforce org.
Ensure that an entry for your package is created in your packageAliases section. Please commit the updated sfdx-project.json to your version control, before proceeding with sfp commands
sfp-pro users can create an org-dependent unlocked package by using the following command:

```shell
sfp package create unlocked -n "my-unlocked-package" --domain "my-domain" -r "path" --orgdependent -v devhub
```
Identify the Package Directory: Ensure that your sfdx-project.json contains an entry for the package you wish to turn into an org-dependent unlocked package. The entry must include path, package, and versionNumber.
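The package itself can be created with the sf CLI's org-dependent flag; a sketch using current sf CLI syntax:

```shell
# Create an org-dependent unlocked package (dependency validation is deferred to installation)
sf package create --name <package_name> --path <package_directory> \
  --package-type Unlocked --org-dependent --target-dev-hub <alias_for_org>
```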
Ensure your package directory is populated with an export.json and the required CSV files. Read on here to learn more about data packages.
Add an additional attribute of "type": "data" to the package entry.
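A sketch of the resulting entry in sfdx-project.json (path, name and version illustrative):

```json
{
  "path": "path-to-data-package",
  "package": "my-data-package",
  "type": "data",
  "versionNumber": "1.0.0.0"
}
```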
sfp-pro users can create a data package by using the following command:

```shell
sfp package create data -n "my-data-package" -r "path" --domain "my-domain"
```
Identify the Package Directory: Ensure that your sfdx-project.json contains an entry for the package you wish to turn into a diff package. The entry must include path, package, and versionNumber.
Read on here to learn more about diff packages.
Add an additional attribute of "type": "diff" to the package entry.
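A sketch of the resulting entry in sfdx-project.json (path, name and version illustrative):

```json
{
  "path": "path-to-diff-package",
  "package": "my-diff-package",
  "type": "diff",
  "versionNumber": "1.0.0.0"
}
```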
Ensure a new record is created in the SfpowerscriptsArtifact2__c object in your DevHub, with details such as the name of the package, the initial version number, and the baseline commit ID.
sfp-pro users can create a diff package by using the following command:

```shell
sfp package create diff -n "my-diff-package" -r "path" -c "commit-id" --domain "my-domain" -v devhub
```
To fully leverage the capabilities of sfp, a few additional steps need to be configured in your Salesforce orgs. Please follow the steps below.
To enable modular package development, the following configurations need to be turned on in order to create scratch orgs and unlocked packages.
Enable Dev Hub in your Salesforce org so you can create and manage scratch orgs and second-generation packages. Scratch orgs are disposable Salesforce orgs to support development and testing.
Navigate to the Setup menu
Go to Development > Dev Hub
Toggle the button to on for Enable Dev Hub
Enable Unlocked Packages and Second-Generation Managed Packages
The sfpowerscripts-artifact package is a lightweight unlocked package consisting of a custom setting SfpowerscriptsArtifact2__c that is used to keep a record of the artifacts that have been installed in the org. This enables package installation, using sfp, to be skipped if the same artifact version already exists in the target org.
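Installation can be scripted with the sf CLI; the package version ID below is a placeholder, so use the 04t ID of the current sfpowerscripts-artifact release:

```shell
# Install the sfpowerscripts-artifact unlocked package into the target org.
# Replace <sfpowerscripts_artifact_package_id> with the 04t ID of the current release.
sf package install --package <sfpowerscripts_artifact_package_id> \
  --target-org <target_org_alias> --wait 10 --security-type AdminsOnly
```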
Once the command completes, confirm the unlocked package has been installed.
Navigate to the Setup menu
Go to Apps > Packaging > Installed Packages
Confirm the package sfpowerscripts-artifact is listed in the "Installed Packages"
Ensure that you install the sfpowerscripts-artifact unlocked package in all target orgs that you intend to deploy to using sfp.
If refreshing from Production with sfpowerscripts-artifact already installed, you do not need to install again to your sandboxes.
sfp handles destructive changes according to the type of package. Here is a rundown of the behaviour for the various package types and modes.
Salesforce handles destructive changes in unlocked packages / org dependent unlocked packages as part of the package upgrade process. From the Salesforce documentation (https://developer.salesforce.com/docs/atlas.en-us.sfdx_dev.meta/sfdx_dev/sfdx_dev_unlocked_pkg_install_pkg_upgrade.htm?q=delete+metadata)
Metadata that was removed in the new package version is also removed from the target org as part of the upgrade. Removed metadata is metadata not included in the current package version install, but present in the previous package version installed in the target org. If metadata is removed before the upgrade occurs, the upgrade proceeds normally. Some examples where metadata is deprecated and not deleted are:
User-entered data in custom objects and fields are deprecated and not deleted. Admins can export such data if necessary.
An object such as an Apex class is deprecated and not deleted if it’s referenced in a Lightning component that is part of the package.
sfp utilizes mixed mode while installing unlocked packages to the target org, so any metadata that can be deleted is removed from the target org. If a component is deprecated instead, it has to be manually removed. Components that are hard deleted upon a version upgrade are listed here.
Source packages support destructive changes using a folder structure to demarcate components that need to be deleted. One can make use of pre-destructive and post-destructive folders to mark components that need to be deleted.
The package installation is a single deployment transaction, with the components that are part of pre/post deployment applied along with the destructive operations specified in the folder structure. This allows one to refactor the current code so that the destructive changes succeed, as deletion is often only allowed when no existing components in the org reference the component being deleted.
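A sketch of the layout, with illustrative component names:

```
my-source-package/
├── main/                      # components deployed as usual
├── pre-destructive/           # components deleted before the deployment of main
│   └── classes/ObsoleteHelper.cls-meta.xml
└── post-destructive/          # components deleted after the deployment of main
    └── objects/Legacy__c/fields/Old_Field__c.field-meta.xml
```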
Destructive changes support for source packages is currently available only in the sfp-pro version.
Data packages utilize sfdmu under the hood, and one can utilize any of the below approaches to remove data records.
Approach 1: Combined Upsert and Delete Operations
One effective method involves configuring SFDMU to perform both upsert and delete operations in sequence for the same object. This approach ensures comprehensive data management: updating and inserting relevant records first, followed by removing outdated entries based on specific conditions.
Upsert Operation: Updates or inserts records based on a defined external ID, aligning the Salesforce org with new or updated data from a source file.
Delete Operation: Deletes specific records that meet certain criteria, such as being marked as inactive, to ensure the org only contains relevant and active data.
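A sketch of an export.json for this approach; the object and field names are illustrative, and SFDMU processes the object entries in order:

```json
{
  "objects": [
    {
      "query": "SELECT Id, Name, Active__c FROM Product__c",
      "operation": "Upsert",
      "externalId": "Name"
    },
    {
      "query": "SELECT Id FROM Product__c WHERE Active__c = false",
      "operation": "Delete"
    }
  ]
}
```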
Approach 2: Utilizing deleteOldData
Another approach involves using the deleteOldData parameter. This parameter is particularly useful when needing to clean up old data that no longer matches the current dataset in the source before inserting or updating new records.
Delete Old Data: Before performing data insertion or updates, SFDMU can be configured to remove all existing records that no longer match the new dataset criteria, thus simplifying the maintenance of data freshness and relevance in the target org
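A sketch using the deleteOldData parameter (object and field names illustrative):

```json
{
  "objects": [
    {
      "query": "SELECT Id, Name FROM Product__c",
      "operation": "Upsert",
      "externalId": "Name",
      "deleteOldData": true
    }
  ]
}
```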
sfp's build command processes the packages in the order mentioned in your sfdx-project.json. The command also reads each package's dependencies property and, when triggered, waits until all of a package's dependencies are resolved before triggering the equivalent package creation command (depending on the type of the package) and generating an artifact in the process.
sfp's build command is equipped with diffcheck functionality, which is enabled by the diffcheck flag. A comparison (using git diff) is made between the latest source code and the previous version of the package published by the 'publish' command. If any difference is detected in the package directory, package version, or scratch org definition file (applies to unlocked packages only), then the package will be built; otherwise it is skipped. For example, consider the following packages in sfdx-project.json along with their dependencies:
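The scenarios below assume a manifest along these lines, where B and C depend on A, and D depends on C (paths and version numbers illustrative):

```json
{
  "packageDirectories": [
    { "path": "src/A", "package": "A", "versionNumber": "1.0.0.NEXT" },
    {
      "path": "src/B", "package": "B", "versionNumber": "1.0.0.NEXT",
      "dependencies": [{ "package": "A", "versionNumber": "1.0.0.LATEST" }]
    },
    {
      "path": "src/C", "package": "C", "versionNumber": "1.0.0.NEXT",
      "dependencies": [{ "package": "A", "versionNumber": "1.0.0.LATEST" }]
    },
    {
      "path": "src/D", "package": "D", "versionNumber": "1.0.0.NEXT",
      "dependencies": [{ "package": "C", "versionNumber": "1.0.0.LATEST" }]
    }
  ]
}
```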
Scenario 1 : Build All
Trigger creation of artifact for package A
Once A is completed, trigger creation of artifacts for packages B & C, using the version of A created in step 1
Once C is completed, trigger creation of package D
Scenario 2 : Build with diffCheck enabled on a package with no dependencies
In this scenario, where only a single package has changed and diffCheck is enabled, the build command will only trigger the creation of Package B
Scenario 3 : Build with diffCheck enabled on changes in multiple packages
In this scenario, where there are changes in multiple packages, say B & C, the build command will trigger the creation of artifacts for these packages in parallel, as their dependency, package A, has not changed (and is hence fulfilled). Please note that even though there is a change in C, the creation of package D will not be triggered unless there is an explicit change to the version number (major.minor.patch) of package D.
A domain is defined by a release configuration. In order to define a domain, you need to create a new release config yaml file in your repository
A simple release config can be defined as shown below
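A minimal sketch, assuming the option names from the release configuration table (the domain and package names are illustrative):

```yaml
releaseName: sales
pool: sales-pool
includeOnlyArtifacts:
  - sales-core
  - sales-ui
```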
The resulting file could be stored in your repository and utilized by the commands such as build, release etc.
Sometimes, due to certain platform errors, some metadata components need to be ignored during build (especially for unlocked packages) while still being required for other commands like validate. sfp offers an easy mechanism that allows you to switch .forceignore files depending on the operation.
Add this entry to your sfdx-project.json and, as in the example below, mention the paths to the different files that need to be used for the different stages.
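A sketch of the entry; the plugins.sfpowerscripts.ignoreFiles key and the stage names are assumptions based on sfp's conventions, and the file paths are illustrative:

```json
{
  "plugins": {
    "sfpowerscripts": {
      "ignoreFiles": {
        "prepare": ".forceignore",
        "validate": ".forceignore",
        "quickbuild": "forceignores/.buildignore",
        "build": "forceignores/.buildignore"
      }
    }
  }
}
```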
sfp provides mechanisms to control the aspects of the build command, the following section details how one can configure these mechanisms in your sfdx-project.json
| Attribute | Type | Description | Package Types Applicable |
|---|---|---|---|
| buildCollection | Array | Builds a collection of packages together; a change in any package of the collection triggers a build for every package in it | All |
In certain scenarios, it's necessary to build a new version of a package whenever any package in a specified collection undergoes a change. This can be accomplished by utilizing the buildCollection attribute in the sfdx-project.json file. Below is an example illustrating how to define a collection of packages that should be built together.
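A sketch with two illustrative packages that should always be built together; whether each entry lists only its sibling packages or every member of the collection is an assumption:

```json
{
  "packageDirectories": [
    {
      "path": "src/core-crm",
      "package": "core-crm",
      "versionNumber": "1.0.0.NEXT",
      "buildCollection": ["core-ui"]
    },
    {
      "path": "src/core-ui",
      "package": "core-ui",
      "versionNumber": "1.0.0.NEXT",
      "buildCollection": ["core-crm"]
    }
  ]
}
```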
The configFile flag in the build command specifies the features and settings of the scratch org used to validate your unlocked package. It is optional; if it is not passed in, no scratch org definition is assumed.
Typically, all packages in the same repo share the same scratch org definition. Therefore, you pass in the definition that you use to build your scratch org pool and use the same one to build your unlocked package.
However, there is an option to use multiple definitions.
For example, if you have two packages, where package 1 depends on SharedActivities and package 2 does not, you would pass a scratch org definition file to package 1 via scratchOrgDefFilePaths in sfdx-project.json. Package 2 would use the default definitionFile.
| Attribute | Type | Description | Package Types Applicable |
|---|---|---|---|
Using the ignoreOnStage: ["build"] property on a package causes that particular package to be skipped by the build command. Similarly, you can use ignoreOnStage: ["quickbuild"] to skip packages in the quickbuild stage.
| Attribute | Type | Description | Package Types Applicable |
|---|---|---|---|
| ignoreOnStage | Array | The stages of sfp in which this package should be ignored | All |
sfp cli optimally deploys artifacts to the target organisation by reducing the time spent on running Apex tests where possible. This section explains how this optimisation works for different package types.
| Package Type | Apex Test Execution during installation | Coverage Requirement |
|---|---|---|
| Unlocked | Skipped; coverage is already validated during package creation | 75% at package creation |
| Source | Only the Apex test classes within the package are executed (optimized deployment) | 75% per Apex class |
| Diff | Only the Apex test classes covering the changed classes are executed | 75% per changed Apex class |
To meet Salesforce deployment requirements for source packages, each Apex class within the source package must achieve a minimum of 75% test coverage, verified individually for each class. It's imperative that the test classes responsible for this coverage are included within the same package.
During the artifact preparation phase, sfp cli will automatically identify the Apex test classes included within the package. These identified test classes are then utilised at the time of installation to verify that the test coverage requirement is met for each Apex class, ensuring compliance with Salesforce's deployment standards. When there are circumstances where individual coverage cannot be achieved by the Apex classes within a package, one can disable the 'optimized deployment' feature by using the attribute mentioned below.
This option is only applicable to source/diff packages. Disabling this option will trigger the entire local tests in the org and can lead to considerable delays
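A sketch of disabling the behaviour on a package entry; the attribute name isOptimizedDeployment is an assumption based on sfp's descriptor conventions, and the package details are illustrative:

```json
{
  "path": "src/frameworks/logging",
  "package": "sfdc-logging",
  "versionNumber": "1.0.0.NEXT",
  "isOptimizedDeployment": false
}
```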
| Attribute | Type | Description | Package Types Applicable |
|---|---|---|---|
| isOptimizedDeployment | Boolean | Detects the test classes in the package and uses them during installation; set to false to run all local tests instead (default: true) | Source, Diff |
In some situations, you might need to execute a pre/post deployment script to manipulate data before or after it is deployed to the org. sfp allows you to provide a path to a shell script (Mac/Unix) or batch script (Windows).
The scripts are called with the following parameters; in your script you can refer to the parameters by their positional values ($1, $2, and so on in shell scripts).
Please note that scripts are copied into the artifact and are not executed from version control. sfp only copies the script mentioned by this parameter and does not copy any additional files or dependencies. Please ensure pre/post deployment scripts are independent, or are able to download their dependencies.
Position | Value |
---|---|
Please note the script has to be completely independent and should not have dependency on a file in the version control, as scripts are executed within the context of an artifact.
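As an illustration, a minimal shell script skeleton that reads these positional parameters might look like this (the package name, alias, and paths below are placeholder defaults used only for demonstration; sfp supplies the real values at runtime):

```shell
#!/bin/bash
# Sketch of an sfp pre/post deployment script.
# sfp invokes the script with five positional parameters:
PACKAGE_NAME="${1:-core-crm}"            # 1: name of the package
TARGET_USERNAME="${2:-user@example.com}" # 2: username of the target org
TARGET_ALIAS="${3:-qa}"                  # 3: alias of the target org
WORKING_DIR="${4:-/tmp/sfp-work}"        # 4: working directory holding the artifact contents
PACKAGE_DIR="${5:-force-app}"            # 5: package directory within the artifact

# Combine parameters 4 and 5 to derive the absolute path to the package contents
PACKAGE_PATH="${WORKING_DIR}/${PACKAGE_DIR}"

echo "Deployment hook for ${PACKAGE_NAME} -> ${TARGET_ALIAS} (${TARGET_USERNAME})"
echo "Package contents located at: ${PACKAGE_PATH}"
```

Because the script runs from within the unpacked artifact, any helper files it needs must be downloaded by the script itself rather than referenced from the repository.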
Salesforce inherently does not support the deployment of picklist values as part of unlocked package upgrades. This limitation has led to the need for workarounds, traditionally solved by either replicating the picklist in a source package or manually adding the changes in the target organization. Both approaches add a burden of extra maintenance work. This issue has been documented and recognised as a known problem, which can be reviewed in detail.
To ensure picklist consistency between local environments and the target Salesforce org, the following pre-deployment steps outline the process for managing picklists within unlocked packages:
Retrieval and Comparison: Initially, picklists defined within the packages marked for deployment will be retrieved from the target org. These retrieved values are then compared with the local versions of the picklists.
Update on Change Detection: Should any discrepancies be identified between the local and retrieved picklists, the differing values will be updated in the target org using the Salesforce Tooling API.
Handling New Picklists: During the retrieval phase, picklists that are present locally but absent in the target org are identified as newly created. These new picklists are deployed using the standard deployment process, as Salesforce permits the deployment of new picklists from unlocked packages.
Picklist Enabled Package Identification: Throughout the build process, the sfp cli examines the contents of each unlocked package artifact. A package is marked as 'picklist enabled' if it contains one or more picklists.
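Picklist handling is opted into per unlocked package via the enablePicklist attribute. A sketch of such a package entry in sfdx-project.json (the package name, path, and version are illustrative placeholders):

```json
{
  "path": "src/sales-app",
  "package": "sales-app",
  "versionNumber": "1.0.0.NEXT",
  "enablePicklist": true
}
```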
Have you ever encountered this error when deploying an updated Entitlement Process to your org?
It's important to note that Salesforce prevents the deployment of an Entitlement Process that is currently in use, irrespective of its activation status in the org. This limitation holds true even for inactive processes, potentially impacting deployment strategies.
To create a new version of an Entitlement Process, navigate to the Entitlement Process Detail page in Salesforce and perform the creation. If you then use sf cli to retrieve this new version and attempt to deploy it into another org, you're likely to encounter errors. Deploying it as a new version can bypass these issues, but this requires enabling versioning in the target org first.
The culprit behind this is an attribute in the Entitlement Process metadata file: versionMaster.
According to the Salesforce documentation, versionMaster ‘identifies the sequence of versions to which this entitlement process belongs and it can be any value as long as it is identical among all versions of the entitlement process.’ An important factor here about versionMaster is: versionMaster is org specific. When you deploy a completely new Entitlement Process to an org with a randomly defined versionMaster, Salesforce will generate an ID for the Entitlement Process and map it to this specific versionMaster.
Deploying updated Entitlement Processes from one Salesforce org to another can often lead to deployment errors due to discrepancies in the versionMaster attribute. sfp's entitlement helper enables you to automatically align the versionMaster by pulling its value from the target org and updating the deploying metadata file accordingly. This ensures that your deployment process is consistent and error-free.
Enable Versioning: First and foremost, activate versioning in the target org to manage various versions of the Entitlement Process.
Create or Retrieve Metadata File:
Create a new version of the Entitlement Process as a metadata file, which can be done via the Salesforce UI and retrieved with the Salesforce CLI (sf cli).
Ensure Metadata Accuracy:
API Name: Validate that the API name within the metadata file accurately reflects the new version.
Version Number: Update the versionNumber in your metadata file to represent the new version clearly.
Default Version: Confirm that only one version of the Entitlement Process is set as Default to avoid deployment conflicts.
Automation around the entitlement filter can be disabled globally by using these attributes in your sfdx-project.json.
When sfp cli encounters the attribute skipDeployOnOrgs on a package, the generated artifact is checked during installation against the alias or username passed to the installation command. If the username or alias matches, installation of the artifact is skipped.
In the above example, if the alias passed to the installation command is qa, the artifact for core-crm will be skipped during installation. The same applies to the username of an org.
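A sketch of how this looks in sfdx-project.json (package details and org identifiers are illustrative placeholders):

```json
{
  "path": "src/core-crm",
  "package": "core-crm",
  "versionNumber": "1.0.0.NEXT",
  "skipDeployOnOrgs": ["qa", "uat-user@example.com"]
}
```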
sfp-pro | sfp (community) |
---|---|
mergeMode adds an additional mode for deploying aliasified packages with content inheritance. During package build, the default folder's content is merged into each subfolder that matches an org alias name, and subfolders are able to override inherited content. This reduces metadata duplication when using aliasified packages.
Note in the image above that the AccountNumberDefault__c field is replicated in each deployment directory that matches the alias.
Unlike when only aliasfy mode is used, the default folder is deployed to any type of environment, including production.
The table below describes the behavior of each command when merge mode is enabled/disabled.
When merge mode is enabled, push/pull commands are also supported; the default subfolder is always used during this process, so make sure it is not force-ignored.
Before merge mode, the whole package had to be force-ignored. With merge mode, you have the option to allow push/pull from aliasfy packages by not ignoring the default subfolder.
Salesforce has a strict limit on the number of fields that can be tracked. This limit can be increased by raising a request with your Salesforce Account Executive (AE). However, it would be problematic if an unlocked package were deployed to orgs that do not have the increased limit. So don't be surprised when, working on a flxbl project, you find that the deployment of field history tracking from unlocked packages is disabled by Salesforce.
One workaround is to keep a copy of all the fields that need to be tracked in a separate source package (field-history-tracking or similar) and deploy it as one of the last packages with the ‘alwaysDeploy’ option.
However, this specific package has to be carefully aligned with the original source/unlocked packages, to which the fields originally belong. As the number of tracked fields increases in large projects, this package becomes larger and more difficult to maintain. In addition, since it’s often the case that the project does not own the metadata definition of fields from managed packages, it doesn’t make much sense to carry the metadata only for field history tracking purposes.
To resolve this, sfp features the ability to automate the deployment of field history tracking for both unlocked packages and managed packages without having to maintain additional packages.
Two mechanisms are implemented to ensure that all fields requiring field history tracking are properly deployed. Specifically, a YAML file that contains all the tracked fields is added to the package directory.
During deployment, the YAML file is examined and the fields to be set are stored in an internal representation inside the deployment artifact. Meanwhile, the components to be deployed are analyzed and the ones with ‘trackHistory’ on are added to the same artifact. This acts as a double assurance that any field that is missed in the YAML files due to human error will also be picked up. After the package installation, all the fields in the artifact are retrieved from the target org and, for those fields, the trackHistory is turned on before they are redeployed to the org.
In this way, the deployment of field history tracking is completely automated. One now has a declarative way of defining field history tracking (as well as feed tracking) without having to store the metadata in the repo. The only maintenance effort left for developers is to manage the YAML file.
Aliasified packages are available only for Source Packages.
Aliasify enables deployment of a subfolder in a source package that matches the target org. For example, you have a source package as listed below.
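For instance, an aliasified source package might be laid out as follows (the folder and alias names are illustrative assumptions):

```
src/core-crm/
├── default/     # fallback contents, used when no alias matches
├── dev/         # deployed to orgs with alias 'dev'
├── qa/          # deployed to orgs with alias 'qa'
└── uat/         # deployed to orgs with alias 'uat'
```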
During installation, only the metadata contents of the folder that matches the alias get deployed. If the alias is not found, sfp falls back to the 'default' folder. If the default folder is also not found, an error is displayed stating that the default folder or alias is missing.
The default folder is only deployed to sandboxes.
To ensure that an artifact of a package is always deployed, irrespective of whether the same version of the artifact was previously deployed to the org, you can utilize alwaysDeploy as a property on your package.
By using the skipifalreadyinstalled option with the deploy command, you can prevent the reinstallation of an artifact that is already present in the target organization.
The --baselineorg parameter allows you to specify the alias or username of an org against which to check whether the incoming package versions have already been installed, and to form a deployment plan accordingly. This overrides the default behaviour, which is to compare against the deployment target org. This optional feature helps ensure every org in the path to production receives the same installations.
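An illustrative invocation is shown below; the org aliases, directory, and exact flag spellings are assumptions, so consult sfp deploy --help for your version:

```shell
sfp deploy -u uat --artifactdir artifacts --baselineorg prod
```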
All attributes additionally configured on a package in your project are recorded within the artifact. When the artifact is installed to the target org, the following steps are undertaken in sequence, as seen in the diagram below. Each of the configured attributes belongs to one of the categories in the sequence and is executed accordingly.
Merge Mode | Default available? | Build | Install | Push | Pull |
---|---|---|---|---|---|
Non-merge mode | Merge Mode |
---|---|
Attribute | Required | Description |
---|---|---|
path | yes | Path to the directory that contains the contents of the package |
package | yes | The name of the package |
versionNumber | yes | The version number of the package |
versionDescription | no | Description for a particular version of the package |

Attribute | Type | Description | Package Types Applicable |
---|---|---|---|
ignoreOnStage | array | Ignore a package from being processed by a particular stage | unlocked, org-dependent unlocked, source, diff |
Default metadata is merged into each subfolder | Deploys subfolder that matches environment alias name. If there is no match, then default subfolder is deployed (including production) | Only default is pushed | Only default is pulled |
Default merging not available | Deploys subfolder that matches environment alias name. If there is no match, then default subfolder is deployed (including production) |
Default merging not available | Deploys subfolder that matches environment alias name. If there is no match, deploys default (only sandboxes) | Only default is pushed | Only default is pulled |
Default merging not available | Deploys subfolder that matches environment alias name. If there is no match, fails | Nothing to push | Nothing to pull |
Attribute | Type | Description | Package Types Applicable |
---|---|---|---|
reconcileProfiles | boolean | Reconcile profiles to only apply permissions to objects, fields and features that are c | |
ignoreOnStage | array | Ignore a package from being processed by a particular stage | |
isOptimizedDeployment | boolean | Detects test classes in a source package automatically and utilises them to deploy the provided package | |
Package Type | Apex Test Execution during installation | Coverage Requirement |
---|---|---|
Unlocked Package | No | 75% for the contents of a package, validated during build |
Org-Dependent Unlocked Package | No | No |
Source Package | Yes | Yes; either each class needs individual 75% coverage, or the entire org's Apex test coverage is used |
Diff Package | Yes | Yes; either each class needs individual 75% coverage, or the entire org's coverage is used |
Data Package | No | No |
Attribute | Type | Description | Package Types Applicable |
---|---|---|---|
preDeploymentScript | string | Run an executable script before deploying an artifact. Users need to provide a path to the script file | |
postDeploymentScript | string | Run an executable script after deploying a package. Users need to provide a path to the script file | |
Position | Value |
---|---|
1 | Name of the package |
2 | Username of the target org where the package is being deployed |
3 | Alias of the target org where the package is being deployed |
4 | Path to the working directory that has the contents of the package |
5 | Path to the package directory. Combine parameters 4 and 5 to get the absolute path to the contents of the package |
Attribute | Type | Description | Package Types Applicable |
---|---|---|---|
enablePicklist | boolean | Enable picklist upgrade for unlocked packages | |
alwaysDeploy | boolean | Deploys an artifact of the package even if it's already installed in the org. The artifact has to be present in the artifact directory for this option to work | |
skipDeployOnOrgs | array | Skips installation of an artifact on a target org | |
The fetch command provides you with the ability to fetch artifacts from your artifact registry to your local file system. Artifacts, once fetched from your registry, can then be used for installation or for further analysis.
The fetch command requires a release definition file as the input.
Let's assume you have the following release definition file with the name of the file as Sprint2-13-11.yaml
One can use the fetch command to fetch all the artifacts by issuing the command as shown below
Please note the scope flag; it should match exactly the scope that was provided during the publish command.
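An illustrative invocation is sketched below; the flag names are assumptions for demonstration, so verify them against sfp fetch --help for your version:

```shell
sfp fetch -r Sprint2-13-11.yaml -d artifacts --npm --scope @myorg --npmrcpath .npmrc
```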
In a high-velocity project operating on a trunk, such as a #flxbl project with a substantial number of packages, manually generating a release definition can be a chore. This can be eased by generating the release definition file automatically after every publish. One can utilise a release config along with the release definition generate command to automate the process of generating release definitions.
The above command will generate a release definition file based on the head of the branch 'main'. One can also provide a commit ref (by providing a commit id to the -c or --gitref flag) to utilize the latest artifacts on that particular commit. The generated release definition is then written to a directory called releasedefns_directory and pushed to a branch called releasedefns.
One can utilize the --nopush flag if the intent is only to create a release definition locally and not push it to the git repository.
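Putting this together, an illustrative invocation might look like the following; only the -c/--gitref and --nopush flags are documented above, and the branch name is a placeholder:

```shell
sfp releasedefinition generate -c main --nopush
```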
The publish command pushes artifacts created by the build command to an npm registry and also provides functionality to tag a version of the artifact in your git repository.
To publish packages to your private registry, you'll need to configure your credentials accordingly. Follow the guidelines provided by your registry's service to ensure a smooth setup.
Locate Documentation: Jump into your private registry provider's documentation or help center.
Credentials Setup: Find the section dedicated to setting up publishing credentials or authentication.
Follow Steps: Adhere to the detailed steps provided by your registry to configure your system for publishing.
When working with sfp and managing artifacts, it’s required to use a private NPM registry. This allows for more control over package distribution, increased security, and custom package management. Here are some popular NPM compatible private registries, including instructions on how to configure NPM to use them:
GitHub Packages acts as a hosting service for npm packages, allowing for seamless integration with existing GitHub workflows. For configuration details, visit GitHub's guide on configuring NPM for use with GitHub Packages.
GitLab offers a fully integrated npm registry within its continuous integration/continuous deployment (CI/CD) pipelines. To configure NPM to interact with GitLab’s registry, refer to GitLab's NPM registry documentation.
Azure Artifacts provides an npm feed that's compatible with NPM, enabling package hosting, sharing, and version control. For setup details, refer to the Azure Artifacts documentation.
JFrog Artifactory offers a robust solution for managing npm packages, with features supporting binary repository management. For setting up Artifactory to work with NPM, refer to JFrog’s npm Registry documentation.
MyGet provides package management support for npm, among other package managers, facilitating the hosting and management of private npm packages. For specifics on utilizing NPM with MyGet, check out MyGet’s NPM support documentation.
Each of these registries offers its own advantages, and the choice between them should be based on your project’s needs and existing infrastructure.
Follow the instructions for your npm registry to generate a .npmrc file with the correct URL and an access token that has permission to publish into your registry.
Utilize the parameters in sfp publish to provide the npmrc file along with activating the npm option.
sfp's publish command provides various options to tag the published artifacts in version control. Please note that these tags are utilized by sfp's build command to determine which packages are to be built when used with the diffcheck flag.
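An illustrative publish invocation is sketched below; the scope, paths, and exact flag spellings are assumptions, so verify them against sfp publish --help for your version:

```shell
sfp publish -d artifacts --npm --scope @myorg --npmrcpath .npmrc --gittag
```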
Given a directory of artifacts and a target org, the deploy command will deploy the artifacts to the target org according to the sequence defined in the project configuration file.
The deploy command runs through the following steps
Reads all the sfp artifacts provided through the artifact directory
Unzips the artifacts and finds the latest sfdx-project.json.ori to determine the deployment order; if this particular file is not found, it utilizes the sfdx-project.json in the repo
Read the installed packages in the target org utilizing the records in SfpowerscriptsArtifacts2__c
Install each artifact from the provided artifact directory to the target org based on the deployment order respecting the attributes configured for each package
The validate command helps you to validate a change made to your configuration/code against a target org. This command is typically triggered as part of your Pull Request (PR) or merge process to ensure the correctness of configuration/code before it is merged into your main branch. validate can either utilise a scratch org from a tagged pool prepared earlier using the prepare command, or one can use a target org for this purpose.
validate pool / validate org command runs the following checks with the options to enable additional features such as dependency and impact analysis:
Checks accuracy of metadata by deploying the metadata to an org
Triggers Apex Tests
Validate Apex Test Coverage of each package (default: 75%)
Toggle between different modes for validation
Individual
Fast Feedback
Thorough (Default)
Fast Feedback Release Config
Thorough Release Config
[optional] - Validate dependencies between packages for changed component
[optional] - Disable diff check while validating, this will validate all the packages in the repository
[optional] - Disable parallel testing of apex tests, this will validate apex tests of each package in synchronous mode
Validation processes often aim to synchronize the provided organization by installing packages that differ from those already installed. This task can become particularly time-consuming for large projects with hundreds of packages.
To streamline the validation process and focus it on specific domains, employing release config based modes is highly recommended. This approach limits the scope of validation, enhancing efficiency and reducing time.
In the above example the changes are only validated against the packages as mentioned in the provided release config
When developers take on the task of creating their own local environments, the process, although tailored, can introduce significant inefficiencies, particularly in terms of spent time and resources. Dissecting the setup into its fundamental steps reveals why this method can be more tedious than beneficial:
Data Imports: The process of importing configurations and test data is not only crucial but also time-intensive.
User Configurations: Assigning the correct permissions requires careful consideration, further slowing down the setup.
Deploy Metadata: Adjusting and deploying metadata to fit the environment involves dealing with intricacies like endpoints and user integrations.
Install Package Dependencies: Vital for functionality, the installation of necessary packages consumes additional time.
Org Shape and Scratch Org Definition: Establishing a scratch org involves creating or utilizing a definition file, a step foundational yet time-consuming.
Scratch Orgs, lauded for their ability to provide a fresh, customizable development environment, also come with a significant drawback—the massive amount of setup time before they're ready for use. Starting with only the standard features, they require a thorough, often lengthy process to incorporate project-specific requirements. While Scratch Orgs offer unmatched flexibility and customization, it's essential to weigh these benefits against the considerable time investment required for their setup.
Scratch org pools represent an indispensable asset for development teams aiming to refine their operational efficiency and enhance workflow optimization. A pool of pre-prepared scratch orgs facilitates a more rapid and straightforward setup and management of scratch orgs, thereby bolstering team productivity. By adopting scratch org pools, development teams are empowered to access new environments on demand, circumventing the delays inherent to manual configuration.
List all the active scratch orgs with the given pool tag, only 'available' and 'in progress' scratch orgs are displayed
To keep the pools up to date, a nightly scheduled job can be utilised to delete all the scratch orgs in each pool. In the case of developer pools (pools with source tracking, used by developers as opposed to CI pools), care must be taken to delete only the unassigned scratch orgs by omitting the --allscratchorgs flag.
Be careful when running the delete command against developer pools with the --allscratchorgs flag. It will delete scratch orgs that are in use by developers and can result in potential loss of work.
During prepare, sfpowerscripts attempts to create multiple scratch orgs in parallel while respecting the timeout provided in the pool configuration. At times when the Salesforce infrastructure is stretched, such as during release or maintenance windows, some of the requested scratch orgs time out and are discarded by the rest of the pool activities. However, some of these scratch orgs are in fact created by Salesforce without respecting the timeout parameter, and are counted against the active scratch org limits.
These scratch orgs, titled orphaned scratch orgs in sfpowerscripts lingo, can be reclaimed by issuing the pool: delete command with the --orphans flag. This will delete these scratch orgs and allow you to recover your active scratch org limits.
prepare command has inbuilt capability to orchestrate installation of external package dependencies. Package dependencies are defined in the sfdx-project.json. More information on defining package dependencies can be found in the Salesforce docs.
Let's unpack the concepts utilizing the above example:
There are two unlocked packages
Expense Manager - Util is an unlocked package in your DevHub, identifiable by 0H in the packageAlias
Expense Manager - another unlocked package which is dependent on ' Expense Manager - Util', 'TriggerFramework' and 'External Apex Library - 1.0.0.4'
External Apex Library is an external dependency, It could be a managed package or any unlocked package released on a different Dev Hub. All external package dependencies have to be defined with a 04t ID, which can be determined from the installation URL from AppExchange or by contacting your vendor.
sfpowerscripts parses sfdx-project.json and does the following in order
Skips Expense manager - Util as it doesn't have any dependencies
For Expense manager
Checks whether any of the packages are part of the same repo; in this example, 'Expense Manager - Util' is part of the same repository and will not be installed as a dependency
Installs the latest version of TriggerFramework ( with major, minor and patch versions matching 1.7.0) to the scratch org
Install the 'External Apex Library - 1.0.0.4' by utilizing the 04t id provided in the packageAliases
If any of the managed package has keys, it can be provided as an argument to the prepare command. Check the command's flags for more information.
The format for the 'keys' parameter is a string of key-value pairs separated by spaces, where the key is the name of the package, the value is the protection key of the package, and the key-value pair itself is delimited by a colon.
e.g. --keys "packageA:12345 packageB:pw356 packageC:pw777"
The time taken by this command depends on how many managed packages and repository packages need to be installed. Please note, if you are triggering this command in a CI server, ensure proper timeouts are provided for this task, as most cloud-based CI providers have time limits on how long a single task can run. Explore multi-stage prepare jobs in case this is a blocker for the team.
sfp's prepare command helps you build a pool of pre-built scratch orgs, which can include managed packages as well as packages in your repository. This allows you to cut down the time spent re-creating a scratch org during the validation process, whether a scratch org is used as a just-in-time environment or as a developer environment. As you integrate more automated processes into Salesforce, incorporating third-party managed packages into your repository's configuration metadata and code becomes necessary. This integration increases the setup time for CI scratch orgs or developer environments, with various tasks such as running data loading scripts or assigning permission sets.
The prepare command does the following sequence of activities:
Calculate the number of scratch orgs to be allocated (Based on your requested number of scratch orgs and your org limits, we calculate what is the number of scratch orgs to be allocated at this point in time)
Fetch the artifacts from registry if "fetchArtifacts" is defined in config, otherwise build all artifacts
Create the scratch orgs in parallel (respecting the timeout requested in the config), and update Allocation_status__c of each of these orgs to "In Progress"
On each scratch org, in parallel, do the following activities:
Install SFPOWERSCRIPTS_ARTIFACT_PACKAGE (04t1P000000ka9mQAA) for keeping track of all the packages which will be installed in the org. You can set an environment variable SFPOWERSCRIPTS_ARTIFACT_PACKAGE to override the installation with your own package id (the source code is available here)
Install all the dependencies of your packages, such as managed packages that are marked as dependencies in your sfdx-project.json
Install all the artifacts that are either built or fetched
If enableSourceTracking is specified in the configuration, the command will create and deploy source tracking files to the scratch org using sfpowerscripts artifacts. To restore local source tracking, it is re-created when fetching a scratch org from a pool, using the @salesforce/source-tracking library: the commit from which each sfpowerscripts artifact was created is checked out, and the local source tracking is updated using the package directory. The files are retrieved to the local ".sf" directory when using sfpowerscripts:pool:fetch to fetch a scratch org, allowing users to deploy only their changes through source tracking. Refer to the decision log for more details.
Mark each completed scratch org as "Available" if the pool config `succeedOnDeploymentErrors` is true; otherwise, scratch orgs with deployment errors are deleted
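The sequence above is driven by a pool configuration file. A minimal sketch is shown below; the key names are assumptions inferred from the options mentioned in this section, so validate them against the pool configuration schema for your sfp version:

```json
{
  "tag": "dev-pool",
  "maxAllocation": 10,
  "expiry": 2,
  "batchSize": 5,
  "configFilePath": "config/project-scratch-def.json",
  "enableSourceTracking": true,
  "succeedOnDeploymentErrors": true,
  "fetchArtifacts": {
    "npm": {
      "scope": "@myorg"
    }
  }
}
```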
Ensure that your DevHub is authenticated using SFDX Auth URL and the auth URL is stored in a secure place (Key Management System or Secure Storage).
A developer can fetch a scratch org from a pool with all the dependencies installed and the latest code base already pushed, ready for feature development. The developer can start developing new features without spending a few hours preparing the development environment.
Fetches a review environment from a specified pool and assigns it to a pull request/issue.
The commands are only available in sfp-pro (August 24 onwards) and currently limited to GitHub. Using these commands requires an equivalent APP_ID & PRIVATE_KEY in your environment variable.
--repository
: The repository path that stores the pool lock (default: current repo).
--pool
: The name of the pool to fetch the review environment from (required).
--poolType
: The type of the pool, either sandbox
or scratchorg
(required).
--branch
: The pull request branch to fetch the environment for (required).
--issue
: The pull request number to assign the environment to, or a unique id that will be used subsequently to identify (required).
--devhubAlias
: The DevHub alias associated with the pool (default: devhub
).
--wait
: Time in minutes to wait for an environment to become available (default: 20).
--leaseFor
: Time in minutes for which this environment is typically leased during similar operations.
Checks if an environment is already assigned to the issue.
If assigned, verifies whether the previous lease has expired based on the leaseFor duration. If a job has not released the environment within this duration, the new job will be provided the environment.
If no environment is assigned or the environment assigned to the issue has expired, fetches a new environment from the pool.
Automatically marks the fetched or reassigned environment as "InUse".
Waits for up to the specified wait
time if no environment is immediately available.
The environment remains valid for 24 hours from assignment, regardless of the leaseFor
duration.
The leaseFor
parameter determines how long the current process can use the environment before it becomes available for reassignment for a different job within the same issue.
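As a sketch, fetching a review environment might look like the following. The flags match those documented above, but the command token ("reviewenv fetch") and all values are placeholder assumptions; verify the exact command name in your sfp-pro installation:

```shell
sfp reviewenv fetch \
  --pool review \
  --poolType scratchorg \
  --branch feature/quote-flow \
  --issue 1234 \
  --devhubAlias devhub \
  --wait 20 \
  --leaseFor 45
```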
Removes the assignment of a review environment from an issue.
The commands are only available in sfp-pro (August 24 onwards) and currently limited to GitHub. Using these commands requires an equivalent APP_ID & PRIVATE_KEY in your environment variable.
--repository
: The repository path that stores the pool lock (default: current repo).
--issue
: The pull request number to assign the environment to, or a unique id that will be used subsequently to identify (required).
--returntopool
: If set to true, the environment will be returned to the pool for reuse. If false or not set, it will be marked as expired.
Locates the environment assigned to the specified issue.
Removes the assignment of the environment from the issue.
Based on the --returntopool
flag:
If true: Marks the environment as available for reuse within the pool.
If false or not set: Marks the environment as expired, to be picked up by Pool commands for deletion.
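A corresponding unassignment might be sketched as follows. Again, the command token ("reviewenv unassign") is a placeholder assumption and only the flags are documented above; verify the exact command name in your sfp-pro installation:

```shell
sfp reviewenv unassign --issue 1234 --returntopool true
```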
This command should be used when a review environment is no longer needed for an issue.
Consider carefully whether to return the environment to the pool or mark it as expired based on your project's needs and the state of the environment.
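As a sketch, an unassignment that returns the environment to the pool might look like the following. The subcommand name `reviewenv unassign` and the values are assumptions for illustration; the flags are those documented above.

```bash
# Hypothetical invocation: release the environment for PR #1234 back to the pool
sfp reviewenv unassign \
  --repository myorg/myrepo \
  --issue 1234 \
  --returntopool true
```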
Review environments are a crucial component in the CI/CD workflow of flxbl projects. They provide isolated, ephemeral environments (either scratch orgs or sandboxes) for validating deployments, running tests, and performing acceptance testing during the pull request process.
Isolation: Each pull request gets its own environment, preventing conflicts between different features or bug fixes.
Reproducibility: Environments are created from consistent pool configurations, ensuring predictable testing conditions.
Early Validation: Catch deployment issues early in the development process.
Automated Testing: Run tests against review environments to verify changes and prevent regressions.
Collaboration: Provide a shared space for team members to perform acceptance testing and provide feedback.
Compliance: Enforce governance policies and organizational standards before merging changes.
Environment Pools: Review environments are managed in pools, which can be either sandbox or scratch org pools.
Fetching: When a pull request is opened or updated, a review environment is fetched from the appropriate pool. If an environment is already assigned to the issue, it may be reused if its lease hasn't expired.
Usage: The environment is populated with the pull request changes and used for automated checks, manual testing, and review processes.
Leasing: Environments are leased for specific durations to manage resource usage efficiently. While an environment is valid for 24 hours, it can be leased for shorter periods for specific processes.
Status Management: Throughout its lifecycle, an environment's status can be transitioned between 'InUse', 'Available', and 'Expired' to reflect its current state and availability.
Extension: If needed, an environment's overall validity can be extended beyond the initial 24-hour period.
Unassignment: When no longer needed, environments are unassigned from issues and either returned to the pool or marked for deletion.
This lifecycle is managed through a set of sfp reviewenv commands, which automate the process of fetching, checking, transitioning, extending, and unassigning review environments.
Extends the lease of a review environment assigned to a specific issue.
These commands are only available in sfp-pro (August '24 onwards) and are currently limited to GitHub. Using these commands requires an equivalent APP_ID and PRIVATE_KEY in your environment variables.
--repository
: The repository path that stores the pool lock (default: current repo).
--issue
: The pull request number to assign the environment to, or a unique id that will be used subsequently to identify the assignment (required).
Locates the environment assigned to the specified issue.
Extends the overall validity of the environment by an additional 24 hours from the current time.
This command is useful when more time is needed for thorough testing or when waiting for stakeholder approval.
It extends the overall validity of the environment, not the lease time for a specific process.
Use judiciously to avoid unnecessarily tying up resources.
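As a sketch, extending the environment for a pull request might look like the following. The subcommand name `reviewenv extend` and the values are assumptions for illustration; the flags are those documented above.

```bash
# Hypothetical invocation: extend the validity of PR #1234's environment
# by a further 24 hours
sfp reviewenv extend \
  --repository myorg/myrepo \
  --issue 1234
```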
A release definition is a YAML file that carries attributes of artifacts and other details. A release definition is the main input for a release command.
Let's examine the above release definition file closely to understand the various attributes and their purposes.
This defines the name of the release. This value will be utilised to generate changelogs and serves as an identifier for other systems to understand what release went into an org.
This attribute determines whether installation of an artifact should be skipped if the same version number is already installed in the target org.
This attribute determines which artifacts constitute the provided release.
Please note, when using a release definition with exact version numbers, the format of the version number is <packageName>: X.Y.Z-BuildNumber, where X is the major version, Y the minor version, Z the patch version, and BuildNumber the build number that produced the package. This exactly follows the semantics used by the artifact registry (such as GitHub Packages). However, the format is slightly different from the version tagged in the git repository, which follows X.Y.Z.BuildNumber.
Example: A package that has the version core_crm_v3.0.0.6936, tagged in git should be referenced as core_crm: 3.0.0-6936
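The tag-to-registry mapping above can be reproduced with a small shell transformation (illustrative only, using the example tag from above):

```shell
# Convert a git tag of the form <name>_vX.Y.Z.BuildNumber
# into the registry form "<name>: X.Y.Z-BuildNumber"
echo "core_crm_v3.0.0.6936" \
  | sed -E 's/^(.+)_v([0-9]+)\.([0-9]+)\.([0-9]+)\.([0-9]+)$/\1: \2.\3.\4-\5/'
# prints: core_crm: 3.0.0-6936
```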
The above attribute determines whether an artifact of type unlocked package should be promoted before installing into the target org as provided by the alias. If the alias of the requested org matches the targetOrgAlias, the unlocked packages are promoted before installation into the org.
The remainder of the attributes are about generating the changelog, which is explained here.
Here is a rundown of all the attributes that make up a release definition.
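Putting the attributes together, a minimal release definition might look like the following sketch. All names, versions, and URLs here are placeholders, not real values:

```yaml
# Illustrative release definition; every value is a placeholder
release: release-march-2024
skipIfAlreadyInstalled: true
artifacts:
  core_crm: 3.0.0-6936
  sales_ui: 1.2.0-45
promotePackagesBeforeDeploymentToOrg: prod
changelog:
  repoUrl: https://github.com/myorg/myrepo
  workItemFilters:
    - BE-[0-9]{2,5}
  workitemUrl: https://mytracker.example.com/browse
  limit: 30
  showAllArtifacts: false
```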
Pools are defined by creating a pool definition file. The file can be placed in a directory within your repository.
The below configuration details a pool with tag 'DEV-POOL', allocated 20 scratch orgs. Read on to understand the other configurations that are available.
The latest JSON schema for scratch org pool configuration can be found here.
The pool prepare
command accepts a JSON configuration file that defines the settings and attributes of the scratch org pool. The properties accepted by the configuration file are shown in the table below.
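As a sketch, a pool definition using the documented properties might look like this (all values are illustrative):

```json
{
  "tag": "DEV-POOL",
  "maxAllocation": 20,
  "expiry": 10,
  "batchSize": 5,
  "waitTime": 6,
  "configFilePath": "config/project-scratch-def.json",
  "installAll": true,
  "retryOnFailure": true
}
```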
The Scratch Org Pooling Unlocked Package adds additional custom fields, validation rules, and workflow to the standard object "ScratchOrgInfo" in the DevHub to enable associated scratch org pool commands to work for the pipeline.
In order for pool commands to work effectively, ensure that you have authenticated to the DevHub using an SFDX Auth URL rather than other authentication strategies wherever you execute pool operations.
Save only the following part of the sfdxAuthUrl to secret storage and use sf org login sfdx-url:
force://PlatformCLI::Cq$QLeQvDxpvUoNKgiDkoTqyVHdeoMupiZvkgHYcdVHsfMaDpqKJNbg#8ZtUpfBuIdVaUD0B21cFav5X2Pzv5X2@yoursalesforce.com
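For example, after saving that part of the URL to a file, you can authenticate with the standard Salesforce CLI command (the file name and alias are placeholders):

```bash
# authfile.txt contains only the force://... URL saved earlier
sf org login sfdx-url --sfdx-url-file authfile.txt --alias devhub --set-default-dev-hub
```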
For developers (who are on limited access license) to access scratch orgs created by the CI service user, for their local development, a sharing setting needs to be created on the ScratchOrgInfo object. The sharing setting should grant read/write access to the ScratchOrgInfo records owned by a public group consisting of the CI service user and a public group consisting of the developer users.
Create Public Groups (Setup > Users > Public Groups)
CI Users (Admin users / CI users who create scratch orgs in the pool)
Developers (developers who are allowed to fetch scratch orgs from pool)
Create Sharing Rule "ScratchOrgInfo RW to Developers" (Setup > Security > Sharing Settings)
Grant Read/Write access to the ScratchOrgInfo records owned by the CI Users to Developers
Assign Users to Public Groups (Setup > Security > Sharing Settings)
CI Users
Developers
The developers must also have object-level and FLS permissions on the ScratchOrgInfo object. One way to achieve this is to assign a permission set that has Read, Create, Edit and Delete access on ScratchOrgInfo, as well as Read and Edit access to the custom fields used for scratch org pooling: Allocation_status__c, Password__c, Pooltag__c and SfdxAuthUrl__c.
Permission Set Name: "Scratch Org Developer"
Object: Scratch Org Info
Object Permissions
Read, Create, Edit and Delete
Field Permissions
Read, Edit for Custom Fields
Allocation_status__c
Password__c
Pooltag__c
SfdxAuthUrl__c
System Permissions:
API Enabled = True
API Only User = False
Create and Update Second-Generation Packages = True
To onboard new developers, the following Profiles and Permission Set will need to be assigned to the new Developer User Account in Salesforce.
Profile (Choose 1 Only)
Minimum Access - Salesforce
Licence Type - Salesforce
Limited Access User
Licence Type - Salesforce Limited Access - Free
Permission Set
Scratch Org Developer
Public Groups
Developers - Add Users to "Developers" Group
| | sfp-pro | sfp (community) |
|---|---|---|
| Availability | ✅ | ❌ |
| From | September '24 | |

| Attribute | Type | Description | Package Types Applicable |
|---|---|---|---|
| mergeMode | boolean | Enable deployment of the contents of a folder that matches the alias of the environment, using merge | |
| enableFHT | boolean | Enable Field History Tracking for fields | |
| enableFT | boolean | Enable Feed Tracking for fields | |
| aliasfy | boolean | Enable deployment of the contents of a folder that matches the alias of the environment | |
While installing a source/diff package, flows by default are deployed as 'Inactive' in the production org. One can deploy a flow as 'Active' using the steps mentioned here; however, this requires the flow to meet the test coverage requirement.
Making a flow inactive is also convoluted; see the detailed article provided by Gearset.
sfp has built-in automation that attempts to align the status of a particular flow version with what is kept inside the package. It automatically activates the latest version of a flow in the target org (assuming the package has the latest version). If an earlier version of the package is deployed, it skips the activation.
At the same time, if the package contains a flow with status Obsolete/InvalidDraft/Draft, all versions of the flow will be rendered inactive.
This feature is enabled by default. If you would like to disable it, set enableFlowActivation to false in your package descriptor.
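For example, disabling the behaviour on a package descriptor in sfdx-project.json might look like this sketch (the path and package name are placeholders):

```json
{
  "path": "src/core-crm",
  "package": "core-crm",
  "versionNumber": "1.0.0.NEXT",
  "enableFlowActivation": false
}
```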
sfp provides two commands to install artifacts into a target org: sfp install and sfp release. While sfp install installs a set of artifacts from a provided directory into a target org, sfp release allows you to install a defined collection of artifacts from an artifact registry directly into a target org, providing the consistency required in a pipeline. sfp release is predominantly a combination of fetch and install commands, and the sequence can be explained with the below image.
The release command is invoked by using the command sfp release with the inputs as shown below
sfp has a comprehensive collection of validation techniques one can adopt according to the requirements of your project. Read on for more about the various validation modes to achieve your objective.
Mode | Description | Flag |
---|---|---|
The following is the list of steps orchestrated by the validate command.
If used with pools:
Fetch a scratch org from the provided pools in a sequential manner
Authenticate to the Scratch org using the auth URL fetched from the Scratch Org Info Object
If used against a provided org
Authenticate to the provided target org
Build packages that are changed by comparing the tags in your repo against the packages installed in the target org
For each of the packages (internally calls the Install Command):
Thorough Mode (Default)
Deploy all the built packages as source packages / data packages (unlocked packages are installed as source package)
Trigger Apex Tests if there are any apex test in the package
Validate test coverage of the package depending on the type of the package (source packages: each class needs 75% or more; unlocked packages: the package as a whole needs 75% or more)
Fast Feedback Mode
Deploy only changed metadata components for built packages as source packages / data packages
Trigger selective Apex Tests based on impact analysis of the changes in the package
Skip coverage calculations
Skip deployment of package if the descriptor is changed
Skip deployment of top-level packages that have a direct dependency on the package containing changed components
Individual Mode
Ignore packages that are installed in the scratch org (basically eliminate the requirement of using a pooled org)
Compute changed packages by observing the diff of the Pull/Merge Request
Validate each of the changed packages (install any dependencies) using Thorough Mode
Fast Feedback Release Config Mode
Inherit Fast Feedback Mode Features but filtered using a release configuration file containing list of packages to focus validation on
Deploy only changed packages in the list by comparing against what is installed in the org
Thorough Release Config Mode
Inherit Thorough Mode Features but filtered using a release configuration file containing list of packages to focus validation on
Deploy only changed packages in the list by comparing against what is installed in the org
Use of the default "thorough" mode is still recommended in the pipeline alongside fast feedback, to ensure coverage computation and scripts added to the descriptor files are working correctly and are deployable to an empty org.
By default, all the apex tests are triggered in parallel, with an automated retry where any tests that fail to execute due to SOQL locks etc. are retried synchronously. You can override this behaviour using --disableparalleltesting, which will always trigger tests in synchronous mode.
Attribute | Type | Description | Package Types Applicable |
---|---|---|---|
One can utilize this attribute on a package definition in sfdx-project.json to skip testing while a change is validated. Use this option with caution, as the package (depending on its behaviour) might still trigger testing when it is deployed to production.
sfp's release command automates the generation of a changelog that helps the team understand the constituent work items/commits that are part of the release, which artifacts were impacted, and what was installed.
Here is an example of the changelog that is generated by sfp
The changelog is available both as a markdown file and as an associated JSON file containing all the necessary information to render the changelog, for instance in a different tool.
One can instruct the release command to generate the changelog by utilizing the below flags
For example, in the below release command, the command looks for an existing releasechangelog.json
to understand the previous release deployed in the environment, and appends the changelog information to both the releasechangelog.json
and Release-Changelog.md
files.
Additional attributes, for instance to control changelog links, are available in the release definition.
The changelog identifies work items in commit messages using a simple regex pattern, as provided in the workItemFilters section of your release definition.
For instance, given a commit message such as the following, the work item BE-1836 is identified by the work item filter BE-[0-9]{2,5}.
sfp will identify the part BE-1836 and add a URL link to your work item tracking system using the provided workItemUrl attribute.
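The matching described above can be reproduced with a plain grep (the commit message is a made-up example; the pattern is the workItemFilters regex from above):

```shell
# Extract the work item identifier from a commit message
echo "BE-1836 Add discount calculation to quote engine" \
  | grep -oE 'BE-[0-9]{2,5}'
# prints: BE-1836
```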
sfp also has a standalone command to generate a changelog outside of the normal release command. One can use the changelog generate
command, for example:
Attribute | Type | Description | Package Types Applicable |
---|---|---|---|
sfp, during validation, checks the apex test coverage of a package depending on the package type. This is beneficial so that you don't run into any coverage issues while deploying into higher environments or building an unlocked package.
However, there could be situations where the test coverage calculation is flaky; sfp provides you with an option to turn the coverage validation off.
Attribute | Type | Description | Package Types Applicable |
---|---|---|---|
sfp by default attempts to trigger all the apex tests in a package in asynchronous mode during validation. This is to ensure that you have highly performant apex test classes that provide fast feedback.
During installation of packages, test cases are triggered synchronously for source and diff packages. Unlocked and org-dependent packages do not trigger any apex tests during installation.
If you have inherited an existing code base or your unit tests have a lot of DML statements, you will encounter test failures when the tests in a package are triggered asynchronously. You can instruct sfp to trigger the tests of the package in synchronous mode by setting testSynchronous
to true.
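For example, a package descriptor entry in sfdx-project.json opting into synchronous testing might look like this sketch (the path and package name are placeholders):

```json
{
  "path": "src/legacy-package",
  "package": "legacy-package",
  "versionNumber": "1.0.0.NEXT",
  "testSynchronous": true
}
```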
sfp-pro | sfp (community) | |
---|---|---|
The sfp pool sandbox init
command is used to create and initialize a pool of Salesforce sandboxes.
-f, --poolconfig=<path>
: Path to the sandbox pool configuration file (default: 'config/sandbox-pool-config.json')
-v, --targetdevhubusername=<devhub-alias>
: Alias of the target Dev Hub org
-r, --repo=<owner/repo>
: GitHub repository in the format owner/repo
The configuration file should be a JSON file containing an array of pool configurations. Each configuration should include:
pool
: Name of the sandbox pool (will be converted to uppercase)
count
: Number of sandboxes to create for this pool
sourceSB
: Source sandbox name (use 'production' for creating from scratch)
branch
: Git branch associated with this pool
defaultExpirationHours
: (Optional) Default expiration time in hours (default: 24)
extendedExpirationHours
: (Optional) Extended expiration time in hours (default: 24)
averageOrgCreationTime
: (Optional) Average time in hours for sandbox creation (default: 2)
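Based on the attributes above, a configuration file might look like the following sketch (pool names and counts are illustrative):

```json
[
  {
    "pool": "DEV",
    "count": 5,
    "sourceSB": "production",
    "branch": "main",
    "defaultExpirationHours": 24,
    "extendedExpirationHours": 24,
    "averageOrgCreationTime": 2
  }
]
```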
Reads the configuration file
Authenticates with GitHub
Creates sandboxes for each pool configuration
Sets up GitHub repository variables for tracking sandboxes
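An example invocation using the documented flags (the file name, Dev Hub alias, and repository are placeholders):

```bash
sfp pool sandbox init \
  -f my-sandbox-pools.json \
  -v my-devhub \
  -r myorg/myrepo
```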
This command initializes sandbox pools as defined in my-sandbox-pools.json
, using the Dev Hub 'my-devhub', and creates corresponding variables in the GitHub repository 'myorg/myrepo'.
sfp-pro | sfp (community) | |
---|---|---|
Sandbox pools in sfp are designed to provide instantly available Salesforce environments for development, testing, and review processes. By maintaining a pool of pre-created sandboxes, teams can significantly reduce wait times and streamline their development workflows.
Immediate Availability: Eliminate waiting times for sandbox creation, allowing developers to start work instantly.
Reduced Overhead: Minimize the administrative burden of creating and managing individual sandboxes for each task.
Consistent Environment: Ensure all team members work with standardized, pre-configured sandbox environments.
Seamless Integration: Easily incorporate sandbox allocation into automated CI/CD pipelines and development workflows.
The following diagram illustrates the lifecycle of a sandbox within a pool:
Initializes sandbox pools based on configuration files.
Fetches an available sandbox from a pool and assigns it to an issue.
Monitors sandbox status, handles activations, expirations, and deletions.
The sfp sandbox monitor
command is designed to be run as a continuous cron job. This ensures that:
Newly created sandboxes are activated promptly.
Expired sandboxes are identified and marked for deletion.
Marked sandboxes are deleted, freeing up resources.
The pool is kept in a healthy state, always ready for use.
It's recommended to set up this command to run at regular intervals (e.g., every 15-30 minutes) to maintain an up-to-date and efficient sandbox pool.
For detailed information on each command and the expiration process, please refer to the individual command documentation.
sfp-pro | sfp (community) | |
---|---|---|
The sfp pool sandbox fetch
command is used to fetch an available sandbox from a pool and assign it to a specific issue or pull request.
Please note that for this feature to work, you need to use a GitHub/GitLab token and have at least maintainer access.
--repository, -r
: The repository path (e.g., owner/repo
). Default is the GITHUB_REPOSITORY
environment variable.
--pool, -p
: The name of the pool to fetch the sandbox from (required).
--branch, -b
: The branch of the pool from where the environment is to be fetched (required).
--issue, -i
: Issue number to be associated with the sandbox (required).
--devhubalias
: The DevHub alias associated with the pool (default: 'devhub').
--wait
: Time in minutes to wait for an available sandbox (default: 20).
--leasefor
: Time in minutes to lease the sandbox for the current task (default: 15).
Checks for existing sandbox assignments to the issue
If no assignment, acquires a lock on the pool
Fetches an available sandbox
Assigns the sandbox to the specified issue
Sets GitHub Actions output variable (if in GitHub Actions environment)
Releases the lock on the pool
The --leasefor parameter specifies how long the sandbox is reserved for the current task before it can be reassigned within the same issue context. This is different from the overall expiration time of the sandbox in the pool.
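An example invocation using the documented flags:

```bash
sfp pool sandbox fetch \
  --pool dev-pool \
  --branch feature/new-feature \
  --issue 1234 \
  --leasefor 60
```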
This fetches a sandbox from 'dev-pool' for the branch 'feature/new-feature', assigns it to issue 1234, and leases it for 60 minutes.
sfp-pro | sfp (community) | |
---|---|---|
The sfp sandbox monitor
command is used to monitor the status of sandboxes in pools, activate new sandboxes, handle expirations, and manage deletions. This command is designed to be run as a continuous cron job.
-v, --targetdevhubusername
: Alias of the target Dev Hub org (required)
-r, --repo
: Repository in the format owner/repo (required)
-f, --configfile
: Path to the sandbox pool configuration file(s) (required, can be specified multiple times)
This command should be set up as a cron job to run at regular intervals (e.g., every 15-30 minutes). This ensures:
Prompt activation of newly created sandboxes
Timely identification and marking of expired sandboxes
Deletion of marked sandboxes to free up resources
Maintenance of a healthy and efficient sandbox pool
Identifies sandboxes in 'InProgress' state
Attempts to activate them
Updates their status in GitHub variables
Checks sandboxes against their expiration times
Considers default expiration, extended expiration, and average creation time
Marks expired sandboxes in GitHub variables
Identifies sandboxes marked as 'Expired'
Attempts to delete them from the Salesforce org
Removes corresponding GitHub variables
Logs results of the deletion process
Default Expiration: 24 hours (configurable)
Extended Expiration: Additional 24 hours (configurable)
Average Creation Time: 2 hours (configurable)
These times can be customized in the pool configuration file.
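An example invocation using the documented flags (aliases and file paths are placeholders):

```bash
sfp sandbox monitor \
  -v my-devhub \
  -r myorg/myrepo \
  -f config/pool-config-1.json \
  -f config/pool-config-2.json
```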
This command monitors sandbox pools defined in both configuration files, using the Dev Hub 'my-devhub', and updates statuses in the 'myorg/myrepo' GitHub repository.
Updates the status of a review environment assigned to a specific issue.
These commands are only available in sfp-pro (August '24 onwards) and are currently limited to GitHub. Using these commands requires an equivalent APP_ID and PRIVATE_KEY in your environment variables.
--repository
: The repository path that stores the pool lock (default: current repo).
--issue
: The pull request number to assign the environment to, or a unique id that will be used subsequently to identify the assignment (required).
--status
: The status to transition the review environment to (required). Options: 'InUse', 'Available', 'Expired'
Locates the environment assigned to the specified issue.
Updates the status of the environment to the specified status.
'InUse': The environment is currently being used by automated checks or another automated process.
'Available': The environment is available for reuse by another automation within the same issue's context.
'Expired': The environment will be picked up by Pool commands for deletion.
This command doesn't reflect the state of the pull request, but rather the current usage state of the environment within the issue's context.
Transitioning to 'Available' before the lease expires allows for efficient reuse within the same issue.
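As a sketch, marking an environment as reusable within the same issue might look like the following. The subcommand name `reviewenv transition` and the values are assumptions for illustration; the flags are those documented above.

```bash
# Hypothetical invocation: mark the environment for PR #1234 as reusable
sfp reviewenv transition \
  --repository myorg/myrepo \
  --issue 1234 \
  --status Available
```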
| Attribute | Type | Description | Package Types Applicable |
|---|---|---|---|
| enableFlowActivation | boolean | Enable Flows automatically in Production | source, diff, unlocked |

| Parameter | Required | Type | Description |
|---|---|---|---|
| release | Yes | string | Name of the release |
| skipIfAlreadyInstalled | No | boolean | Skip installation of an artifact if it's already installed in the target org |
| baselineOrg | No | string | The org used to decide whether or not to skip installation of an artifact. Defaults to the target org when not provided. |
| artifacts | Yes | Object | Map of artifacts to deploy and their corresponding versions |
| promotePackagesBeforeDeploymentToOrg | No | string | Promote packages before they are installed into an org that matches the alias of the org |
| packageDependencies | No | Object | Package dependencies (e.g. managed packages) to install as part of the release. Provide the 04t subscriber package version Id. |
| changelog.repoUrl | No | Prop | The URL of the version control system to push changelog files |
| changelog.workItemFilters | No | Prop | An array of regular expressions used to identify work items in your commit messages |
| changelog.workitemUrl | No | Prop | The generic URL of work items, to which work item codes are appended. Allows easy redirection to user stories by clicking on the work-item link in the changelog. |
| changelog.limit | No | Prop | Limit the number of releases to display in the changelog markdown |
| changelog.showAllArtifacts | No | Prop | Whether to show artifacts that haven't changed between releases |
| Mode | Description | Flag |
|---|---|---|
| Individual | Ignore packages installed in the scratch org, identify the list of changed packages from the PR/Merge Request, and validate each of the changed packages (respecting any dependencies) using thorough mode | `--mode=individual` |
| Fast Feedback | Skip package dependencies and code coverage, trigger only selective test classes, install only changed components, and ignore changes in package descriptors | `--mode=fastfeedback` |
| Thorough (Default) | Include package dependencies, code coverage, and all test classes during full package deployments | `--mode=thorough` |
| Fast Feedback Release Config | Extension of fast feedback mode, filtered using a release configuration file that defines the list of packages to validate; only changed packages in the list are validated against the scratch org | `--mode=ff-release-config --releaseconfig=<value>` |
| Thorough Release Config | Extension of thorough (default) mode, filtered using a release configuration file that defines the list of packages to validate; only changed packages in the list are validated against the scratch org | `--mode=thorough-release-config --releaseconfig=<value>` |
| Property | Type | Description |
|---|---|---|
| tag | string | Name used to identify the scratch org pool |
| waitTime | int | Minutes the command should wait for a scratch org to be created. Default is 6 minutes |
| expiry | int | Number of days for which the scratch orgs are active |
| maxAllocation | int | Maximum capacity of the pool |
| batchSize | int | Number of processes for creating scratch orgs in parallel |
| configFilePath | string | Path to the scratch org definition JSON file |
| succeedOnDeploymentErrors | boolean | Whether to persist a scratch org to the pool on a deployment error. Default: true |
| snapshotPool | string | Name of an earlier prepared scratch org pool that can be utilized by this pool, to prepare pools in multiple stages |
| installAll | boolean | Install all package artifacts, in addition to the managed package dependencies |
| releaseConfigFile | string | Path to a release config file to create pools with selected packages. Use in conjunction with installAll |
| enableSourceTracking | boolean | Enable source tracking by deploying packages using source:push and persisting source tracking files |
| disableSourcePackageOverride | boolean | Disable overriding unlocked packages as source packages; rather, install unlocked packages as unlocked |
| relaxAllIPRanges | boolean | Relax all IP ranges, allowing global access to the scratch orgs |
| ipRangesToBeRelaxed | array | Range of IP addresses that can access the scratch orgs |
| retryOnFailure | boolean | Retry installation of a package on a failed deployment |
| maxRetryCount | number | Maximum number of times a package should be retried while deploying to a scratch org. Default is 2 |
| preDependencyInstallationScriptPath | string | Path to a script file that needs to be executed before dependent packages are installed in a scratch org |
| postDeploymentScriptPath | string | Path to a script file that needs to be executed after all the packages (dependencies + repository) are installed |
| enableVlocity | boolean | Enable Vlocity settings and config deployment. Please note it doesn't install the Vlocity managed package |
| fetchArtifacts | object | Fetch artifacts, to be deployed to scratch orgs, from an artifact registry |
| fetchArtifacts.artifactFetchScript | string | Path to the shell script containing logic for fetching artifacts from a universal registry, if not using npm |
| fetchArtifacts.npm | object | Fetch artifacts from an NPM registry |
| fetchArtifacts.npm.scope | string | Scope of the NPM package |
| | sfp-pro | sfp (community) |
|---|---|---|
| Availability | ✅ | ❌ |
| From | September '24 | |

| | sfp-pro | sfp (community) |
|---|---|---|
| Availability | ✅ | ❌ |
| From | September '24 | |

| | sfp-pro | sfp (community) |
|---|---|---|
| Availability | ✅ | ❌ |
| From | September '24 | |
| Attribute | Type | Description | Package Types Applicable |
|---|---|---|---|
| skipTesting | boolean | Skip triggering of apex tests while validating or deploying | unlocked, org-dependent unlocked, source, diff |
| skipCoverageValidation | boolean | Skip apex test coverage validation of a package | unlocked, org-dependent unlocked, source, diff |
| testSynchronous | boolean | Ensure all tests of the package are triggered synchronously | unlocked, org-dependent unlocked, source, diff |
| | sfp-pro | sfp (community) |
|---|---|---|
| Availability | ✅ | ❌ |
| From | September '24 | |
Occasionally you might need to assign a permission set to the deployment user to successfully install the package, run apex tests or functional tests. sfp provides an easy mechanism to assign permission sets either before or after installation of an artifact. assignPermSetsPreDeployment assumes the permission sets are already deployed in the target org and proceeds to assign these permission sets to the deployment user.
assignPermSetsPostDeployment can be used to assign permission sets that are introduced by the artifact to the target org, for any aspects of testing or other automation usage.

| Attribute | Type | Description | Package Types Applicable |
|---|---|---|---|
| assignPermSetsPreDeployment | array | Apply permsets to the deployment user before installing an artifact | unlocked, org-dependent unlocked, source, diff |
| assignPermSetsPostDeployment | array | Apply permsets to the deployment user after installing an artifact | unlocked, org-dependent unlocked, source, diff |
The following section details how to operate a pool of scratch orgs.
Checks the status of review environments assigned to a specific pull request/issue.
These commands are only available in sfp-pro (August '24 onwards) and are currently limited to GitHub. Using these commands requires an equivalent APP_ID and PRIVATE_KEY in your environment variables.
--repository
: The repository path that stores the pool lock (default: current repo).
--issue
: The pull request number to assign the environment to, or a unique id that will be used subsequently to identify the assignment (required).
--pool
: The name of the pool to filter by (optional).
--poolType
: The type of the pool to filter by, either sandbox or scratchorg (optional).
--branch
: The pull request branch to filter by (optional).
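As a sketch, checking for a sandbox environment assigned to a pull request might look like the following. The subcommand name `reviewenv check` and the values are assumptions for illustration; the flags are those documented above.

```bash
# Hypothetical invocation: list environments assigned to PR #1234, sandboxes only
sfp reviewenv check \
  --repository myorg/myrepo \
  --issue 1234 \
  --poolType sandbox
```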
Searches the repository's stored environment data for environments associated with the specified issue.
Returns details of any found environments, including:
Environment name or username
Environment type (sandbox or scratch org)
Pool the environment belongs to
Associated pull request branch
Environment status and expiration date
This command is typically used as needed, not within the regular pull request workflow.
It's useful for verifying environment details or troubleshooting.