Cloud Kicks has the following requirements:
• Their Shipment custom object must always relate to a Product, a Sender, and a Receiver (all
separate custom objects).
• If a Shipment is currently associated with a Product, Sender, or Receiver, deletion of those records
should not be allowed.
• Each custom object must have separate sharing models.
What should an Architect do to fulfill these requirements?
B
Explanation:
A required Lookup relationship ensures that every Shipment record has a value for each of the three parent records, and because the lookups are required, Salesforce prevents deletion of a Product, Sender, or Receiver record while it is referenced by a Shipment. Since lookup relationships do not inherit sharing from the parent, each custom object also keeps its own sharing model. A Master-Detail relationship would not allow separate sharing models for each custom object, and a VLOOKUP formula field would neither enforce the relationship nor prevent deletion.
Universal Containers (UC) is planning to move away from its legacy CRM to Salesforce. As part of a one-time data migration, UC will need to keep the original date when a contact was created in the legacy system. How should an Architect design the data migration solution to meet this requirement?
C
Explanation:
Enabling the “Set Audit Fields upon Record Creation” permission allows the user loading the data to set the value of the standard CreatedDate field to match the original date from the legacy system. This permission is intended for one-time migrations and can be revoked after the migration is completed. The other options would either not work or would require additional customization.
An architect has been asked to provide error messages when a future date is detected in a custom Birthdate__c field on the Contact object. The client wants the ability to translate the error messages. What are two approaches the architect should use to achieve this solution? Choose 2 answers
B, D
Explanation:
Creating a trigger on Contact and adding an error to the record with a custom label allows the
architect to use the translation workbench to translate the error message based on the user’s
language. Creating a validation rule and translating the error message with translation workbench
also achieves the same result.
The other options would either not provide translation functionality or not display an error message.
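For illustration, a minimal sketch of the trigger-plus-custom-label approach follows; the field name Birthdate__c comes from the question, while the trigger name and the custom label Future_Birthdate_Error are assumptions. The equivalent validation rule would use the formula Birthdate__c > TODAY() with its error message translated through Translation Workbench.

// Hypothetical sketch: reject future birthdates and surface a translatable
// custom label as the error message. Trigger and label names are illustrative.
trigger ContactBirthdateCheck on Contact (before insert, before update) {
    for (Contact c : Trigger.new) {
        if (c.Birthdate__c != null && c.Birthdate__c > Date.today()) {
            // Label.Future_Birthdate_Error can be translated per language
            // in Translation Workbench.
            c.Birthdate__c.addError(Label.Future_Birthdate_Error);
        }
    }
}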
What is an advantage of using Custom metadata type over Custom setting?
C
Explanation:
Custom metadata records are deployable using packages and change sets, which makes them easier to migrate from one environment to another. Custom settings records are not deployable using packages, so their data has to be re-created or loaded separately in each environment. By contrast, custom metadata types are not available for standard reporting, and custom metadata records can be read but not created or updated with Apex DML.
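A short, hedged illustration of that last point follows; the type and field names (Integration_Setting__mdt, Endpoint_URL__c) are hypothetical and only show the read-only nature of custom metadata in Apex.

// Custom metadata records can be read in Apex (SOQL or getInstance/getAll),
// but DML against them is not allowed; changes go through deployments or
// the Metadata API. Names below are illustrative only.
Integration_Setting__mdt setting = Integration_Setting__mdt.getInstance('Default');
if (setting != null) {
    System.debug('Endpoint: ' + setting.Endpoint_URL__c);
}

// The following would not compile -- DML on custom metadata is rejected:
// insert new Integration_Setting__mdt(DeveloperName = 'New_Setting');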
Get Cloudy Consulting uses an invoicing system that has specific requirements. One requirement is that attachments associated with the Invoice__c custom object be classified by Type (e.g., "Purchase Order", "Receipt", etc.) so that reporting can be performed on invoices showing the number of attachments grouped by Type.
What should an Architect do to categorize the attachments to fulfill these requirements?
D
Explanation:
Creating a custom object related to the Invoice__c object with a picklist field for the Type allows the architect to categorize the attachments and report on them by Type. The standard Attachment object's ContentType field stores the file's MIME type rather than a business classification, and the Attachment object does not support custom fields, so a custom picklist cannot simply be added to it.
Universal Containers has a legacy system that captures Conferences and Venues. These Conferences
can occur at any Venue. They create hundreds of thousands of Conferences per year. Historically,
they have only used 20 Venues. Which two things should the data architect consider when
denormalizing this data model into a single Conference object with a Venue picklist? Choose 2
answers
C, D
Explanation:
When denormalizing a data model into a single object with a picklist field, the data architect should consider the Bulk API limitations on picklist fields and standard list view in-line editing. The Bulk API has a limit of 1,000 distinct picklist values per file, which could become an issue if there are more than 1,000 venues in the future. Standard list view in-line editing allows users to edit multiple records at once, which could introduce data quality issues if the venue picklist is not validated or restricted. The other options are not relevant to denormalizing a data model.
Universal Containers (UC) has around 200,000 Customers (stored in the Account object). They get 1 or 2 Orders every month from each Customer. Orders are stored in a custom object called Order__c, which has about 50 fields. UC is expecting growth of 10% year-over-year. Which two considerations should an architect take into account to improve the performance of SOQL queries that retrieve data from the Order__c object? Choose 2 answers
B, D
Explanation:
To improve the performance of SOQL queries that retrieve data from the Order__c object, the data architect should work with Salesforce Support to enable skinny tables and make the queries more selective using indexed fields. Skinny tables are special tables that contain frequently used fields and are kept in sync with the base tables; they improve performance by reducing the number of table joins. Making queries more selective by filtering on indexed fields also improves performance by reducing query execution time and avoiding query timeouts. The other options are not effective or recommended for improving SOQL performance.
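As a sketch only, a selective query might look like the example below; the filter field Order_Number__c (assumed to be defined as an External ID and therefore indexed) and Status__c are hypothetical names, not part of the question.

// Illustrative only: filter on an indexed field and a bounded date range so
// the query optimizer can use an index instead of scanning the full table.
String orderNumber = 'ORD-000123';   // example value for the indexed filter
List<Order__c> recentOrders = [
    SELECT Id, Name, Status__c
    FROM Order__c
    WHERE Order_Number__c = :orderNumber   // selective, indexed filter
      AND CreatedDate = LAST_N_DAYS:30     // narrows the scan further
    LIMIT 200
];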
Universal Containers (UC) provides shipping services to its customers. They use Opportunities to
track customer shipments. At any given time, the shipping status can be one of 10 values. UC has
200,000 Opportunity records. When creating a new field to track shipping status on opportunity,
what should the architect do to improve data quality and avoid data skew?
A
Explanation:
To improve data quality and avoid data skew, the data architect should create a picklist field with values sorted alphabetically for tracking shipping status on the Opportunity. A picklist field ensures that only valid values are entered and prevents typos or variations in spelling, and sorting the values alphabetically makes it easier for users to find and select the correct value. Data skew is a concern when a very large number of records share a single owner or a single parent record in a lookup or master-detail relationship; a lookup to a separate status object, for example, would concentrate 200,000 Opportunities on only ten parent records and create lookup skew. A picklist value is not a relationship, so a status picklist does not create ownership or lookup skew.
Universal Containers (UC) management has identified a total of ten text fields on the Contact object as important and wants to capture any changes made to these fields: who made the change, when the change was made, the old value, and the new value. UC needs to be able to report on
these field data changes within Salesforce for the past 3 months. What are two approaches that will
meet this requirement? Choose 2 answers
B, C
Explanation:
To capture and report on changes made to ten text fields on the Contact object for the past 3 months, the data architect should either write an Apex trigger on Contact for the after insert and after update events and store the old values in another custom object, or turn on field history tracking on the Contact object for these ten fields and create reports on Contact History. An Apex trigger can capture the old and new values of the fields, as well as the user and time of the change, and store them in a custom object that can be used for reporting. Field history tracking (limited to 20 fields per object, so ten fields fit) also tracks the changes and stores them in a history table that can be reported on. However, field history data is retained for 18 months in the org (24 months via the API) unless Field Audit Trail is purchased, so while it easily covers the 3-month requirement here, it may not suit longer-term reporting needs. The other options are not feasible or effective for capturing and reporting on field data changes.
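A minimal sketch of the trigger approach follows; the custom object Field_Audit__c, all of its fields, and the specific tracked field names are assumptions made for illustration.

// Hedged sketch: write one Field_Audit__c record per changed field.
// Field_Audit__c and all of its fields are hypothetical names.
trigger ContactFieldAudit on Contact (after insert, after update) {
    // Hypothetical list of tracked text fields -- replace with the ten
    // fields UC actually cares about.
    List<String> trackedFields = new List<String>{ 'Department', 'Title' };
    List<Field_Audit__c> audits = new List<Field_Audit__c>();
    for (Contact c : Trigger.new) {
        Contact oldC = Trigger.isUpdate ? Trigger.oldMap.get(c.Id) : null;
        for (String f : trackedFields) {
            Object newVal = c.get(f);
            Object oldVal = (oldC == null) ? null : oldC.get(f);
            if (newVal != oldVal) {
                audits.add(new Field_Audit__c(
                    Contact__c    = c.Id,
                    Field_Name__c = f,
                    Old_Value__c  = (oldVal == null) ? null : String.valueOf(oldVal),
                    New_Value__c  = (newVal == null) ? null : String.valueOf(newVal),
                    Changed_By__c = UserInfo.getUserId(),
                    Changed_On__c = System.now()
                ));
            }
        }
    }
    if (!audits.isEmpty()) {
        insert audits;
    }
}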
Universal Containers (UC) has an open sharing model for its Salesforce users to allow all its Salesforce
internal users to edit all contacts, regardless of who owns the contact. However, UC management
wants to allow only the owner of a contact record to delete that contact. If a user does not own the
contact, then the user should not be allowed to delete the record. How should the architect
approach the project so that the requirements are met?
A
Explanation:
To allow only the owner of a contact record to delete that contact, the data architect should create a
“before delete” trigger to check if the current user is not the owner. The trigger can use the
UserInfo.getUserId() method to get the current user’s ID and compare it with the OwnerId field of
the contact record. If they are not equal, the trigger can add an error to the record and prevent it
from being deleted. The other options are not suitable for meeting the requirements, as they would
either restrict the edit access or delete access for all users, regardless of ownership.
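A minimal sketch of such a "before delete" guard follows; the trigger name and error text are illustrative.

// Only the record owner may delete a Contact; everyone else is blocked.
trigger ContactDeleteGuard on Contact (before delete) {
    for (Contact c : Trigger.old) {
        if (c.OwnerId != UserInfo.getUserId()) {
            // addError blocks the delete for this record and surfaces the message.
            c.addError('Only the contact owner can delete this record.');
        }
    }
}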
Universal Containers (UC) uses Salesforce for tracking opportunities (Opportunity). UC uses an
internal ERP system for tracking deliveries and invoicing. The ERP system supports SOAP API and
OData for bi-directional integration between Salesforce and the ERP system. UC has about one
million opportunities. For each opportunity, UC sends 12 invoices, one per month. UC sales reps have
requirements to view current invoice status and invoice amount from the opportunity page. When
creating an object to model invoices, what should the architect recommend, considering
performance and data storage space?
B
Explanation:
Creating an external object Invoice_x with a Lookup relationship with Opportunity is the best option
for modeling invoices, considering performance and data storage space. An external object allows
the data to be stored in the ERP system and accessed via OData in Salesforce. This reduces the data
storage consumption in Salesforce and improves the performance of queries and reports. A Lookup
relationship allows the sales reps to view the invoice status and amount from the opportunity
page.
The other options would either consume more data storage space, require additional customization, or not provide real-time data access.
Universal Containers has a large number of Opportunity fields (100) that they want to track field
history on. Which two actions should an architect perform in order to meet this requirement?
Choose 2 answers
A, B
Explanation:
Creating a custom object to store a copy of the record when changed and creating a custom object to
store the previous and new field values are two possible actions that an architect can perform to
meet the requirement of tracking field history on 100 Opportunity fields. A custom object can store
more fields and records than the standard field history tracking feature, which has a limit of 20 fields
per object and 18 or 24 months of data retention. A custom object can also be used for reporting and
analysis of field history data.
The other options are not feasible or effective for meeting the requirement.
DreamHouse Realty has a Salesforce org that is used to manage Contacts.
What are two things an Architect should consider using to maintain data quality in this situation?
(Choose two.)
B, C
Explanation:
Using Salesforce duplicate management and using validation rules on new record create and edit are
two things that an architect should consider using to maintain data quality for managing Contacts.
Salesforce duplicate management allows the architect to create matching rules and duplicate rules to
identify, prevent, or allow duplicate records based on various criteria. Validation rules allow the
architect to enforce data quality standards and business logic by displaying error messages when
users try to save invalid data.
The other options are not relevant or helpful for maintaining data quality.
Universal Containers is looking to use Salesforce to manage their sales organization. They will be
migrating legacy account data from two aging systems into Salesforce. Which two design
considerations should an architect take to minimize data duplication? Choose 2 answers
B, C
Explanation:
Cleaning data before importing to Salesforce and using Salesforce matching and duplicate rules are
two design considerations that an architect should take to minimize data duplication when migrating
legacy account data from two aging systems into Salesforce. Cleaning data before importing involves
removing or correcting any inaccurate, incomplete, or inconsistent data from the source systems, as
well as identifying and resolving any potential duplicates. This ensures that only high-quality and
unique data is imported to Salesforce. Using Salesforce matching and duplicate rules allows the
architect to define how Salesforce identifies duplicate records during import and how users can
handle them. This prevents or reduces the creation of duplicate records in Salesforce and improves
data quality. The other options are not effective or recommended for minimizing data duplication.
Universal Containers (UC) has a Salesforce instance with over 10,000 Account records. They have noticed similar, but not identical, Account names and addresses. What should UC do to ensure proper data quality?
C
Explanation:
Enabling Account de-duplication by creating matching rules in Salesforce and then merging the duplicates they surface is what UC should do to ensure proper data quality for their Account records. Matching rules allow UC to define how Salesforce identifies duplicate Accounts based on criteria such as name, address, or phone number, and the resulting duplicate record sets can then be merged (the standard merge tool combines up to three Accounts per merge). This largely automates the process of de-duplicating Accounts and improves data quality. The other options are more time-consuming, costly, or error-prone for ensuring proper data quality.