Certified Data Architect Exam Questions

Certified Data Architect Exam - Question 23


Universal Containers (UC) is in the process of selling half of its company. As part of this split, UC’s main Salesforce org will be divided into two orgs: Org A and Org B. UC has delivered these requirements to its data architect:

1. The data model for Org B will drastically change with different objects, fields, and picklist values.

2. Three million records will need to be migrated from Org A to Org B for compliance reasons.

3. The migration will need to occur within the next two months, prior to the split.

Which migration strategy should a data architect use to successfully migrate the data?

Correct Answer: A

Given the complexity of the data model changes, the volume of records, and the tight timeline for migration, using an ETL (Extract, Transform, Load) tool to orchestrate the migration is the most suitable option. ETL tools are specifically designed to handle complex data migrations like this, including handling different objects, fields, and picklist values, and they also provide robust error handling and logging capabilities that are crucial for ensuring a successful migration of three million records within two months.
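As a purely illustrative sketch of the transform step such a tool automates, consider the remapping below. All object, field, and picklist names (Legacy_Status__c, Status__c, the value mappings) are hypothetical assumptions, not details from the question:

```python
# Minimal ETL transform sketch: remap Org A fields and picklist values
# onto Org B's changed data model. All names here are hypothetical.

# Hypothetical mapping from Org A picklist values to Org B's new values.
PICKLIST_MAP = {
    "In Progress": "Active",
    "On Hold": "Paused",
    "Done": "Closed",
}

# Hypothetical field renames between the two data models.
FIELD_MAP = {
    "Legacy_Status__c": "Status__c",
    "Old_Region__c": "Region__c",
}

def transform(org_a_record: dict) -> dict:
    """Map one extracted Org A record onto Org B's new object shape."""
    org_b_record = {}
    for old_field, new_field in FIELD_MAP.items():
        value = org_a_record.get(old_field)
        # Translate picklist values where a mapping is defined.
        org_b_record[new_field] = PICKLIST_MAP.get(value, value)
    return org_b_record

# Example: one extracted record flowing through the transform step.
print(transform({"Legacy_Status__c": "On Hold", "Old_Region__c": "EMEA"}))
# -> {'Status__c': 'Paused', 'Region__c': 'EMEA'}
```

An ETL tool provides this mapping layer declaratively, plus the error handling and logging around it, which is what makes it faster to deliver than hand-written scripts.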

Discussion

13 comments
Alokv (Option: A)
Jun 4, 2023

In this scenario, considering the complexity of the data model changes, the volume of records, and the timeframe for migration, the most suitable migration strategy would be: A. Use an ETL tool to orchestrate the migration.

tobicky (Option: A)
Nov 18, 2023

The most accurate answer is A: Use an ETL tool to orchestrate the migration. Given the complexity of the migration (drastic changes in the data model, a large volume of records, and a tight timeline), an ETL (Extract, Transform, Load) tool is the most suitable option. ETL tools are designed to handle complex data migrations, including changes in data models and large volumes of data. They also provide robust error handling and logging capabilities, which are crucial for a successful migration.

B (write a script to use the Bulk API) could be a viable option, but it would require significant development effort and may not be feasible given the two-month timeline. Additionally, this approach would require extensive testing to ensure the accuracy of the data migration.
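To make the "significant development effort" of option B concrete, here is a rough sketch of what a hand-written Bulk API script might look like using the simple-salesforce Python library. The credentials, object, and transform logic are placeholders, not details from the question:

```python
# Rough sketch of option B: a hand-written Bulk API migration script.
# Credentials, the object queried, and transform() are all placeholders.
from simple_salesforce import Salesforce

org_a = Salesforce(username="user@org-a.example", password="...",
                   security_token="...")
org_b = Salesforce(username="user@org-b.example", password="...",
                   security_token="...")

def transform(record: dict) -> dict:
    # Placeholder: remapping objects, fields, and picklist values onto
    # Org B's new model is where most of the development effort goes.
    return {"Name": record.get("Name")}

# Extract from Org A; a real migration repeats this per migrated object.
records = org_a.query_all("SELECT Name FROM Account")["records"]

# Load into Org B through the Bulk API in 10,000-record batches.
results = org_b.bulk.Account.insert(
    [transform(r) for r in records], batch_size=10_000
)

# Error handling, retries, and audit logging are not shown and would all
# have to be built and tested by hand within the two-month window.
```

Everything an ETL tool provides out of the box (mapping UI, retries, logging, scheduling) has to be written and tested by hand here, which is the core of the argument for A.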

Alokv (Option: A)
May 29, 2023

I think A is the correct option. A batch script is also a kind of ETL tool, but it is developed by the user/developer and is therefore prone to errors.

thneeb (Option: A)
Jun 28, 2023

Drastic changes in the data model lead me to choose A.

ksho (Option: A)
Sep 15, 2023

ETL or batch scripting would both require investment in development resources. However, a batch script would be a from-scratch development effort, and there are only two months to complete it. Using an ETL tool would greatly shorten the development time to transform and migrate the data, and it can be easily updated during testing.

ETH777 (Option: A)
Dec 30, 2023

Not B - Bulk API is efficient for bulk data transfer, but it requires significant scripting effort, especially for data mapping and transformation in this complex scenario. A - ETL tool handles complexity, mapping, and orchestration.

BorisBoris (Option: A)
Jun 23, 2023

Answer A. An ETL tool provides a robust and scalable solution for data migration between Salesforce orgs, especially when dealing with large volumes of data and complex transformations. Here's why it is the recommended approach:

Scalability: An ETL tool can handle large data volumes efficiently by leveraging parallel processing capabilities. With three million records to migrate, using an ETL tool can ensure optimal performance and faster data transfer.
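A minimal sketch of the parallel-load idea mentioned above: batches submitted concurrently rather than one after another. Here load_batch is a hypothetical stand-in for an ETL tool's or Bulk API's loader, and real jobs must respect Salesforce concurrency and record-locking limits:

```python
# Illustration of parallel batch loading; load_batch() is a placeholder
# for whatever the ETL tool or Bulk API actually does with each batch.
from concurrent.futures import ThreadPoolExecutor

def chunks(records, size=10_000):
    """Yield successive fixed-size batches (the Bulk API batch limit)."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def load_batch(batch):
    # Placeholder loader: a real job would push the batch to Org B here.
    return len(batch)

records = [{"Name": f"Record {i}"} for i in range(30_000)]  # stand-in data

with ThreadPoolExecutor(max_workers=4) as pool:
    loaded = sum(pool.map(load_batch, chunks(records)))

print(f"Loaded {loaded} records across parallel batches")
```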

DavidHolland (Option: A)
Nov 22, 2023

Big changes to the data model mean I would select A.

BorisBoris (Option: B)
Jun 24, 2023

On reflection, B is probably more appropriate since this is a one-off operation (maybe in batches); the investment in an ETL tool therefore seems illogical unless one is already in use, and in this scenario we cannot assume that. Batch scripting is appropriate and will yield the required results with accuracy and reliability.

6967185
Mar 20, 2024

The topics covered in the Data Migration study guide are: serial load, parallel mode, defer sharing calculation, record locks, hierarchical relationship, and Bulk API limits. Given this is a test on data migration, I would opt for a solution that is mentioned there.

6967185
Mar 20, 2024

Scratch that, a single batch of records can contain a maximum of 10,000 records. The requirement states 3,000,000 records. :)
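For scale, the arithmetic behind that limit (a quick sketch, not from the question): the 10,000-record batch cap means the three million records simply split across many batches.

```python
# Batches required at the Bulk API limit of 10,000 records per batch.
import math

total_records = 3_000_000
batch_limit = 10_000
print(math.ceil(total_records / batch_limit))  # -> 300 batches
```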

lizbette (Option: A)
Apr 20, 2024

A is correct.

Nilesh_Nanda (Option: A)
May 7, 2024

A is correct

Rangya (Option: A)
May 31, 2024

A script would facilitate the extract and load, but transformation is also required here.