Destructive Changes
sfp handles destructive changes differently depending on the type of package. Here is a rundown of the behaviour for the various package types and modes.
Salesforce handles destructive changes in unlocked packages and org-dependent unlocked packages as part of the package upgrade process. From the Salesforce documentation:
Metadata that was removed in the new package version is also removed from the target org as part of the upgrade. Removed metadata is metadata not included in the current package version install, but present in the previous package version installed in the target org. If metadata is removed before the upgrade occurs, the upgrade proceeds normally. Some examples where metadata is deprecated and not deleted are:
User-entered data in custom objects and fields are deprecated and not deleted. Admins can export such data if necessary.
An object such as an Apex class is deprecated and not deleted if it’s referenced in a Lightning component that is part of the package.
sfp utilizes the `mixed` mode while installing unlocked packages to the target org, so any metadata that can be deleted is removed from the target org. If a component is only deprecated, it has to be removed manually.
The list of components that are hard deleted upon a version upgrade can be found in the Salesforce documentation.
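To illustrate what mixed mode means at the platform level, the install sfp performs corresponds to the following Salesforce CLI call (the package version id and org alias below are placeholders; sfp issues the equivalent call for you during deployment):

```bash
# Illustration only: sfp performs the equivalent of a mixed-mode install.
# 04tXXXXXXXXXXXXXXX and my-target-org are placeholder values.
sf package install \
  --package 04tXXXXXXXXXXXXXXX \
  --target-org my-target-org \
  --upgrade-type Mixed \
  --wait 30
```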
Source packages support destructive changes using a folder structure to demarcate components that need to be deleted. One can make use of `pre-destructive` and `post-destructive` folders to mark the components that should be removed.
The package installation is a single deployment transaction in which the package components are deployed together with the destructive operations specified by the pre/post-destructive folders. This allows you to refactor the existing code in the same transaction so that the destructive changes succeed, as deletion is often only permitted when no remaining component in the org references the component being deleted.
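As a rough sketch of the layout (the package directory and class names here are hypothetical, and the exact structure may vary between sfp versions, so verify against your project), components slated for removal are placed under the corresponding destructive folder of the package directory:

```
src/core-crm/
├── main/default/classes/
│   └── ActiveService.cls
├── pre-destructive/
│   └── classes/
│       ├── ObsoleteHelper.cls
│       └── ObsoleteHelper.cls-meta.xml
└── post-destructive/
    └── classes/
        ├── ObsoleteService.cls
        └── ObsoleteService.cls-meta.xml
```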
Test destructive changes thoroughly in your review environment before merging your changes.
After the package version is installed across all the target orgs, you will need to merge another change that removes the pre-destructive or post-destructive folders. There is no need to rush this, as sfp ignores any warnings associated with components that are already missing from the org.
Data packages utilize SFDMU under the hood, and one can use any of the approaches below to remove data records.
Approach 1: Combined Upsert and Delete Operations
One effective method involves configuring SFDMU to perform both upsert and delete operations in sequence for the same object, as sketched in the example after this list. This approach updates and inserts the relevant records first, and then removes outdated entries based on specific conditions.
Upsert Operation: Updates or inserts records based on a defined external ID, aligning the Salesforce org with new or updated data from a source file.
Delete Operation: Deletes specific records that meet certain criteria, such as being marked as inactive, to ensure the org only contains relevant and active data.
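A minimal sketch of an SFDMU export.json that chains the two operations, assuming a hypothetical Product_Catalog__c object with a Status__c field: records are upserted by Name first, and records marked Inactive are then deleted.

```json
{
  "objects": [
    {
      "query": "SELECT Id, Name, Status__c FROM Product_Catalog__c",
      "operation": "Upsert",
      "externalId": "Name"
    },
    {
      "query": "SELECT Id FROM Product_Catalog__c",
      "deleteQuery": "SELECT Id FROM Product_Catalog__c WHERE Status__c = 'Inactive'",
      "operation": "Delete"
    }
  ]
}
```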
Approach 2: Utilizing deleteOldData
Another approach involves using the `deleteOldData` parameter. This parameter is particularly useful when you need to clean up old data that no longer matches the current source dataset before inserting or updating new records.
Delete Old Data: Before performing data insertion or updates, SFDMU can be configured to remove all existing records that no longer match the new dataset criteria, keeping the data in the target org fresh and relevant.
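A minimal sketch using the same hypothetical object: with deleteOldData set to true, target records that are no longer present in the source dataset are removed before the upsert runs.

```json
{
  "objects": [
    {
      "query": "SELECT Id, Name, Status__c FROM Product_Catalog__c",
      "operation": "Upsert",
      "externalId": "Name",
      "deleteOldData": true
    }
  ]
}
```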
You will need to understand the dependency implications when dealing with destructive changes, especially the follow-on effects of a deletion on other packages. It is recommended that you compile all Apex classes to detect any errors in Apex classes or triggers that still reference the deleted components.