Certified Data Architect Exam Questions

Certified Data Architect Exam - Question 57


Universal Containers (UC) requires 2 years of customer-related cases to be available in Salesforce for operational reporting. Any cases older than 2 years and up to 7 years need to be available on demand to service agents. UC creates 5 million cases per year.

Which two data archiving strategies should a data architect recommend? (Choose two.)

Correct Answer: AD

To manage the large volume of cases and meet the requirement of making older cases accessible, syncing cases older than 2 years to an external database allows service agents to access this data without overloading the Salesforce instance. Big Objects are designed to handle massive amounts of data and provide a scalable solution within Salesforce for older cases, ensuring they remain accessible without affecting the main org's performance. Both options ensure that the cases are available on demand and efficiently managed without hitting storage limits.
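A minimal sketch of the external-database side of this approach, assuming access through the Salesforce REST API query endpoint; the instance URL, API version, access token, and field list below are placeholders, and a production archive job would more likely use the Bulk API.

```python
import requests
from datetime import datetime, timedelta, timezone

# Assumed placeholders: a real job would use proper OAuth and configuration.
INSTANCE_URL = "https://yourInstance.my.salesforce.com"
API_VERSION = "v58.0"
ACCESS_TOKEN = "<access token>"

# Cases created more than ~2 years ago (SOQL datetime literals are unquoted).
cutoff = (datetime.now(timezone.utc) - timedelta(days=2 * 365)).strftime("%Y-%m-%dT%H:%M:%SZ")
SOQL = f"SELECT Id, CaseNumber, Subject, Status, CreatedDate FROM Case WHERE CreatedDate < {cutoff}"

def fetch_old_cases():
    """Page through REST query results, yielding each case to be loaded into the external DB."""
    url = f"{INSTANCE_URL}/services/data/{API_VERSION}/query"
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    params = {"q": SOQL}
    while url:
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        body = resp.json()
        yield from body["records"]
        next_url = body.get("nextRecordsUrl")  # present until the last page is consumed
        url = f"{INSTANCE_URL}{next_url}" if next_url else None
        params = None  # the next-records URL already encodes the query

if __name__ == "__main__":
    for case in fetch_old_cases():
        # In a real sync, upsert the row into the external database here.
        print(case["CaseNumber"], case["CreatedDate"])
```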

Discussion

6 comments
vkm
Sep 9, 2023

Why can't it be CD?

tobicky
Nov 19, 2023

The question doesn’t specify that records need to be deleted. The use of the Bulk API to hard delete them from Salesforce in option C might not be necessary if the requirement is to only make the cases available on demand. The main focus should be on strategies that allow for older cases to be easily accessed without affecting the performance of the Salesforce org.

noox (Options: CD)
Oct 4, 2023

C and D are correct. A - You can't expect a service agent to know how to use a database. B - Custom Objects are not designed for LDV (Large Data Volumes).

DonDemik
Jan 21, 2024

1) UC creates 5 million cases per year. 2) There are records up to 7 years old. Together, 1) and 2) mean that a large amount of data is generated at UC and should be stored somewhere other than Salesforce, so deleting data from Salesforce is necessary.
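A quick back-of-the-envelope check of that volume argument, using only the figures stated in the question:

```python
# Figures from the question: 5 million new cases per year, 7-year retention, 2 years kept live.
cases_per_year = 5_000_000
retention_years = 7
live_years = 2

live_records = cases_per_year * live_years                           # 10,000,000 cases for operational reporting
archived_records = cases_per_year * (retention_years - live_years)   # 25,000,000 cases needed only on demand

print(f"Live cases in Salesforce:  {live_records:,}")
print(f"Cases to archive elsewhere: {archived_records:,}")
```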

tobicky (Options: AD)
Nov 19, 2023

Option A is recommended because syncing older cases to an external database and providing service agents access to this database can help reduce the load on the Salesforce org while still making the data accessible. Option D is suggested because Big Objects provide a way to handle and store massive amounts of data within Salesforce’s multi-tenant environment. They are designed to provide consistent performance, whether you have 1 million records, 100 million, or even 1 billion.
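As a rough sketch of what option D could look like at query time, assuming the archived cases were copied into a custom Big Object: Case_Archive__b and its fields are hypothetical names, and Big Object SOQL filters are limited to the object's index fields.

```python
import requests

# Assumed placeholders; Case_Archive__b and its fields are hypothetical.
INSTANCE_URL = "https://yourInstance.my.salesforce.com"
API_VERSION = "v58.0"
ACCESS_TOKEN = "<access token>"

def archived_cases_for_account(account_id: str):
    """Fetch one account's archived cases from the Big Object.

    Big Object SOQL must filter on the object's index fields, so this
    assumes AccountId__c is the leading field of the Big Object's index.
    """
    soql = (
        "SELECT AccountId__c, CaseNumber__c, Subject__c, ClosedDate__c "
        f"FROM Case_Archive__b WHERE AccountId__c = '{account_id}'"
    )
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/{API_VERSION}/query",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"q": soql},
    )
    resp.raise_for_status()
    return resp.json()["records"]
```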

vip_10 (Options: AD)
Aug 13, 2023

With option B, even if the data is stored in a custom object, it will still count against the org's data storage limit.
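To put rough numbers on that storage point, assuming the typical ~2 KB of data storage that Salesforce counts for most records:

```python
# Rough estimate only; most Salesforce records count as about 2 KB of data storage each.
archived_records = 5_000_000 * 5          # years 3-7 of cases moved into a custom archive object
kb_per_record = 2

storage_gb = archived_records * kb_per_record / (1024 * 1024)  # KB -> GB
print(f"Archive object alone: {archived_records:,} records, about {storage_gb:.0f} GB of data storage")
```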

lizbette (Options: AD)
Apr 21, 2024

A is definitely right because it's a recommended solution. B cannot be right because over time you'll hit a data limit. C cannot be right because Heroku has a 20 million record limitation, and 5 years x 5 million cases is 25 million, which is over the limit. D - Big Objects have enough capacity to handle the data volume. A & D.

bb0607978 (Options: CD)
May 17, 2024

If the requirement is to archive cases older than 2 but not older than 7 years, then you certainly need an archiving strategy but also a deletion strategy. About option B: with such a big number of records to store, a Custom Object is out of the question -> NOT B. About option A: Heroku is also an external DB, and it adds extra storage and an archiving place (e.g., the Heroku Connect app), so Heroku is a natural choice over A. In the end you are left with options C and D.