For those new to the platform, there is a popular misconception that Salesforce needs no backup because it runs on cloud storage. This is not true. Cloud service providers do have solid disaster recovery practices, but these mostly cover failures on their side. For everyday use of Salesforce, where your data is exposed to risks such as user error and bad imports, the cloud provider does not take responsibility on your behalf. For this, you need a solid backup strategy of your own.
Historically, Salesforce offered disaster recovery as a last resort for users, but effective July 2020 this service was discontinued. Even while it was available, Salesforce recommended that users have a strong recovery solution of their own in place. Below, we discuss some of the possible causes of data loss and the recovery strategies to have in place.
Data loss scenarios
Here are some typical scenarios of Salesforce data loss and disaster:
- User errors, such as changing a field value or accidentally deleting reports.
- Data imports or mass-change operations that overwrite existing data.
- Incorrect values overwriting existing data.
- A new workflow rule, anonymous Apex script, or trigger modifying values in an unwanted way.
- Malicious user behavior.
When considering Salesforce data backup, it is also important to distinguish data from metadata. In the context of Salesforce administration, data means the stored values, similar to the contents of database tables. Metadata, on the other hand, is the configuration layered on top for your specific use case: custom fields, Apex triggers, layouts, reports, rules, and so on. Any such metadata can be customized or added to the Salesforce org. For organizations using Salesforce without customization, metadata backup may not be as important as data backup, at least in the initial stages of a Salesforce implementation.
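The distinction can be illustrated with a toy representation (the object and field names below are hypothetical examples; Salesforce stores both data and metadata very differently internally):

```python
# Toy illustration of the data-vs-metadata split in a Salesforce org.
# Object and field names here are made up for illustration.

# Metadata: the configuration that shapes the org -- custom fields,
# layouts, triggers, validation rules, and so on.
metadata = {
    "object": "Account",
    "custom_fields": [
        {"name": "Region__c", "type": "Picklist"},
        {"name": "Renewal_Date__c", "type": "Date"},
    ],
}

# Data: the record values stored under that configuration,
# analogous to rows in a database table.
records = [
    {"Name": "Acme Corp", "Region__c": "EMEA", "Renewal_Date__c": "2024-06-30"},
    {"Name": "Globex", "Region__c": "APAC", "Renewal_Date__c": "2024-09-15"},
]

# A data backup captures `records`; a metadata backup captures `metadata`.
field_names = [f["name"] for f in metadata["custom_fields"]]
print(field_names)  # the custom fields a metadata backup would preserve
```

Losing either half hurts: losing `records` loses business content, while losing `metadata` loses the customization that makes the records usable.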
For organizations that heavily customize their Salesforce instance, however, the metadata represents a considerable investment. The Salesforce Recycle Bin does not track updates or metadata changes, so it cannot be relied on as the only means of data protection.
Salesforce data backup methods
Salesforce offers a few standard options for users to back up their data. Here are some practical approaches:
- Exporting data manually. Go to Setup and choose Data Management; the Data Export option can be used for manual data exports.
- Exporting data through reports. Reports can be exported from the report view in the same area of the interface.
- Importing and exporting data with Data Loader.
- Backing up metadata using Package Manager. For this, go to Setup and then Package Manager.
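As a minimal sketch of the export step, the snippet below serializes a set of already-fetched records to CSV, the format produced by Data Export and consumed by Data Loader. The records are made up, and fetching them from the Salesforce API (which requires authentication) is out of scope here:

```python
import csv
import io

def records_to_csv(records, field_order):
    """Serialize Salesforce-style records (a list of dicts) to CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=field_order, extrasaction="ignore")
    writer.writeheader()
    for rec in records:
        writer.writerow(rec)
    return buf.getvalue()

# Hypothetical records, shaped like a query against the Account object.
accounts = [
    {"Id": "001xx0000001", "Name": "Acme Corp", "Industry": "Manufacturing"},
    {"Id": "001xx0000002", "Name": "Globex", "Industry": "Energy"},
]

csv_text = records_to_csv(accounts, ["Id", "Name", "Industry"])
print(csv_text)
```

Pinning the column order explicitly, as `field_order` does here, keeps successive backup files diffable even if the API returns fields in a different order.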
Using a full sandbox
In Salesforce, a full sandbox is a complete mirror of the production instance, holding both data and metadata. It costs extra, and it can be refreshed only every 29 days. Organizations typically use a full sandbox for Salesforce development or for testing larger data migrations, but if needed it can serve as a backup too.
Managing metadata with development tools
For organizations with Salesforce development teams, metadata is typically retrieved using tools such as Illuminated Cloud for IntelliJ or MavensMate. The retrieved metadata can be versioned effectively and backed up in Git or a similar source code management system.
Since the actual Salesforce database is not directly accessible, tools such as cData, Heroku (Heroku Connect), and Skyvia can mirror Salesforce data into a relational database such as PostgreSQL or Microsoft SQL Server. This is an ideal option when an RDBMS is already in operation, since the mirror is then covered by the existing database backups. Note that these solutions generally do not back up metadata beyond custom object fields, so metadata should be backed up separately by the development team or database administrator.
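At its core, the mirroring approach boils down to upserting API records into relational tables. Below is a simplified sketch using SQLite as a stand-in for PostgreSQL or SQL Server; real tools like the ones named above manage schema creation and incremental sync automatically:

```python
import sqlite3

# Hypothetical records, shaped like results from the Salesforce API.
accounts = [
    {"Id": "001xx0000001", "Name": "Acme Corp"},
    {"Id": "001xx0000002", "Name": "Globex"},
]

conn = sqlite3.connect(":memory:")  # stand-in for a real RDBMS
conn.execute("CREATE TABLE account (id TEXT PRIMARY KEY, name TEXT)")

# Upsert keyed on the Salesforce Id, so repeated sync runs stay idempotent:
# new records are inserted, changed records are updated in place.
for rec in accounts:
    conn.execute(
        "INSERT INTO account (id, name) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        (rec["Id"], rec["Name"]),
    )
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM account").fetchone()[0]
print(count)  # 2
```

Keying the upsert on the immutable Salesforce Id is what makes the mirror safe to refresh on a schedule without creating duplicates.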
Another important thing to note is that not all of these products support bi-directional synchronization, so restoration may require additional steps such as preparing data files, exporting, and running Data Loader scripts.
Usage of data backup tool
Using a dedicated tool is one of the best solutions, especially for organizations using Salesforce as more than just a CRM. For example, users of Nextian Service Analyzer are advised to use a backup solution, because if customer orders or subscription services are deleted, they are very difficult to re-create from the Recycle Bin or from Data Loader dumps.
Conventional backup works at the database or filesystem level: copying files or writing directly to backup storage. Salesforce offers no direct access to its filesystem or database; everything must go through its APIs. A Salesforce backup solution is therefore closer to an advanced import/export system than to a classic backup-and-restore system.
The API also does not distinguish between backup tools and ordinary applications, which means workflows and triggers execute as usual. For an effective restore, these may need to be disabled, along with batch jobs, outbound email, and so on. Some of the top tools for Salesforce data backup and recovery are:
- Nextian, etc.
The Salesforce restore process also differs significantly from that of relational databases. Object IDs and auto-generated names cannot be set by users, so restored records receive new IDs, and lookup fields pointing at them must be remapped; good backup packages handle this for you.
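The ID problem can be sketched as follows: because Salesforce assigns fresh IDs on insert, a restore must rewrite lookup fields from the old IDs captured in the backup to the new IDs assigned when the parent records were re-inserted. The object and field names below are illustrative:

```python
def remap_lookups(records, lookup_fields, old_to_new):
    """Rewrite lookup fields to point at newly inserted record IDs."""
    restored = []
    for rec in records:
        rec = dict(rec)          # don't mutate the backup copy
        rec.pop("Id", None)      # Salesforce assigns a fresh Id on insert
        for field in lookup_fields:
            if rec.get(field) in old_to_new:
                rec[field] = old_to_new[rec[field]]
        restored.append(rec)
    return restored

# Backed-up Contacts whose AccountId points at old, now-deleted Accounts.
contacts = [{"Id": "003xx01", "LastName": "Diaz", "AccountId": "001xxOLD"}]
# Mapping produced when the parent Accounts were re-inserted first.
id_map = {"001xxOLD": "001xxNEW"}

print(remap_lookups(contacts, ["AccountId"], id_map))
```

This also shows why restore order matters: parent objects must be inserted first so their new IDs exist before dependent records are remapped and loaded.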
On top of all this, Salesforce backup tools often offer additional features that enable much more effective Salesforce administration.