The following is a technique that you may find useful when implementing your Reference and Master
Data Management program:
A
Explanation:
When implementing a Reference and Master Data Management (RMDM) program, it is crucial to
utilize techniques that ensure consistency, accuracy, and reliability of data across various systems.
Business key cross-references is one such technique. This technique involves creating a mapping
between different identifiers (keys) used across systems to represent the same business entity. This
mapping ensures that data can be accurately and consistently referenced, integrated, and analyzed
across different systems.
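The cross-reference idea can be sketched in a few lines of Python. The system names, local keys, and enterprise key values below are invented for illustration; the point is only that each (system, local key) pair resolves to one shared enterprise key.

```python
# Business key cross-reference (illustrative sketch): each source system
# uses its own key for the same customer; the cross-reference table maps
# every (system, local key) pair to a single enterprise-wide key.

xref = {
    ("CRM", "C-1001"): "ENT-42",       # CRM system's customer id
    ("ERP", "900017"): "ENT-42",       # ERP id for the same customer
    ("BILLING", "ACCT-77"): "ENT-42",  # billing account, same customer
    ("CRM", "C-2002"): "ENT-43",
}

def enterprise_key(system, local_key):
    """Resolve a system-specific key to the shared enterprise key."""
    return xref.get((system, local_key))

# Records from different systems resolve to the same business entity:
assert enterprise_key("CRM", "C-1001") == enterprise_key("ERP", "900017")
```

In practice the cross-reference is usually a governed table in the MDM hub rather than an in-memory dictionary, but the lookup semantics are the same.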
Reference:
DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 10: Reference and
Master Data Management.
"Master Data Management and Data Governance" by Alex Berson and Larry Dubov, which
emphasizes the importance of business key cross-referencing in MDM.
Which of the following is NOT a characteristic of a deterministic matching algorithm?
B
Explanation:
Deterministic matching algorithms rely on exact matches between data fields to determine if records
are the same. These algorithms require high-quality data because any discrepancy, such as
typographical errors or variations in data entry, can prevent a match.
Characteristics of deterministic matching:
It has a discrete all or nothing outcome (C).
It matches exact character to character of one or more fields (D).
All identifiers being matched have equal weight (E).
Since deterministic matching is highly dependent on the quality of the data being matched, option B
is incorrect.
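The characteristics listed above can be demonstrated with a minimal sketch. The field names and record values are invented; the key behaviors shown are the exact character-to-character comparison, the equal weight of every identifier, and the all-or-nothing outcome.

```python
# Deterministic matching (illustrative sketch): records match only when
# every compared field is character-for-character identical. All fields
# carry equal weight, and the outcome is all-or-nothing.

def deterministic_match(rec_a, rec_b, fields):
    """Return True only when every listed field matches exactly."""
    return all(rec_a.get(f) == rec_b.get(f) for f in fields)

a = {"ssn": "123-45-6789", "last_name": "Smith"}
b = {"ssn": "123-45-6789", "last_name": "Smith"}
c = {"ssn": "123-45-6789", "last_name": "Smyth"}  # one typo -> no match

assert deterministic_match(a, b, ["ssn", "last_name"]) is True
assert deterministic_match(a, c, ["ssn", "last_name"]) is False
```

Note how a single typographical variation ("Smyth") defeats the match entirely, which is exactly why this approach demands high-quality data.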
Reference:
DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 10: Reference and
Master Data Management.
"Master Data Management and Data Governance" by Alex Berson and Larry Dubov.
Within the Corporate Information Factory, what data is used to understand transactions?
C
Explanation:
In the context of the Corporate Information Factory, understanding transactions involves integrating
various types of data to get a comprehensive view. Master Data (core business entities), Reference
Data (standardized information), and External Data (information sourced from outside the
organization) are essential for providing context and enriching transactional data.
Reference:
DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 4: Data Architecture
and Chapter 10: Reference and Master Data Management.
"Building the Data Warehouse" by W.H. Inmon, which introduces the Corporate Information Factory
concept.
For MDM, what is meant by a classification scheme?
A
Explanation:
In Master Data Management (MDM), a classification scheme refers to a structured way of organizing
data by using codes that represent a controlled set of values. These codes help in categorizing and
standardizing data, making it easier to manage, search, and analyze.
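A classification scheme can be pictured as a controlled code list plus a rule that only those codes may be applied. The industry codes below are invented for illustration.

```python
# Classification scheme (illustrative sketch): codes drawn from a
# controlled set of values categorize records consistently; any code
# outside the scheme is rejected.

INDUSTRY_CODES = {
    "RET": "Retail",
    "MFG": "Manufacturing",
    "FIN": "Financial Services",
}

def classify(record, code):
    """Apply a classification code, enforcing the controlled value set."""
    if code not in INDUSTRY_CODES:
        raise ValueError(f"Unknown classification code: {code}")
    record["industry_code"] = code
    return record

customer = classify({"name": "Acme Corp"}, "MFG")
assert customer["industry_code"] == "MFG"
```

Rejecting unknown codes at write time is what makes downstream search and analysis reliable: every stored value is guaranteed to come from the controlled set.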
Reference:
DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 10: Reference and
Master Data Management.
"Master Data Management and Data Governance" by Alex Berson and Larry Dubov.
Information Governance is a concept that covers the 'what', 'how', and 'why' pertaining to the data
assets of an organization. The 'what', 'how', and 'why' are respectively handled by the following
functional areas:
D
Explanation:
Information Governance involves managing and controlling the data assets of an organization,
addressing the 'what', 'how', and 'why'.
'What' pertains to Data Governance, which defines policies and procedures for data management.
'How' relates to Information Security, ensuring that data is protected and secure.
'Why' is about Compliance, ensuring that data management practices meet legal and regulatory
requirements.
Reference:
DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 3: Data Governance.
"Information Governance: Concepts, Strategies, and Best Practices" by Robert F. Smallwood.
What is a registry as it applies to Master Data?
A
Explanation:
A registry in the context of Master Data Management (MDM) is a centralized index that maintains
pointers to master data located in various systems of record. This type of architecture is commonly
referred to as a "registry" model and allows organizations to create a unified view of their master
data without consolidating the actual data into a single repository. The registry acts as a directory,
providing metadata and linkage information to the actual data sources.
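The registry model can be sketched as an index that stores only pointers, never the master attributes themselves. The entity ids, system names, and keys below are invented for illustration.

```python
# Registry-style MDM (illustrative sketch): the registry holds no master
# data attributes itself, only pointers to the systems of record where
# the actual data lives.

registry = {
    "ENT-42": [
        {"system": "CRM", "key": "C-1001"},
        {"system": "ERP", "key": "900017"},
    ],
}

def locate(enterprise_id):
    """Return pointers to every system of record holding this entity."""
    return registry.get(enterprise_id, [])

pointers = locate("ENT-42")
assert {p["system"] for p in pointers} == {"CRM", "ERP"}
```

A consuming application follows the pointers to fetch attributes at read time, which is how the registry provides a unified view without physically consolidating the data.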
Reference:
DAMA-DMBOK2 Guide: Chapter 10 – Master and Reference Data Management
"Master Data Management: Creating a Single Source of Truth" by David Loshin
The concept of tracking the number of MDM subject areas and source system attributes is referred to
as:
D
Explanation:
Tracking the number of MDM subject areas and source system attributes refers to defining the scope
and coverage of the subject areas and attributes involved in an MDM initiative. This process includes
identifying all the data entities (subject areas) and the specific attributes (data elements) within
those entities that need to be managed across the organization. By establishing a clear scope and
coverage, organizations can ensure that all relevant data is accounted for and appropriately
managed.
Reference:
DAMA-DMBOK2 Guide: Chapter 10 – Master and Reference Data Management
"Master Data Management and Data Governance" by Alex Berson, Larry Dubov
All of the following methods are a means to protect and secure master data in a production
environment except for which of the following?
D
Explanation:
Protecting and securing master data in a production environment can be achieved through various
methods. Encryption ciphers, static masking, trust model technologies, and dynamic masking are all
techniques used to safeguard data. However, usage agreements, while important for data
governance and legal compliance, are not a technical method for securing data in the same way that
the other options are. Usage agreements define the terms under which data can be accessed and
used, but they do not directly protect the data itself.
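As one concrete technique from the list above, static masking can be sketched in a few lines. The record values and field names are invented; the point is that the sensitive value is irreversibly obscured before the data leaves the protected environment.

```python
# Static masking (illustrative sketch): a sensitive field is replaced
# with an obscured value before the record is copied out of production,
# so the original can never be recovered from the masked copy.

def mask_ssn(ssn):
    """Keep only the last four digits; mask the rest."""
    return "***-**-" + ssn[-4:]

record = {"name": "Jane Doe", "ssn": "123-45-6789"}
masked = {**record, "ssn": mask_ssn(record["ssn"])}

assert masked["ssn"] == "***-**-6789"
assert record["ssn"] == "123-45-6789"  # the production source is untouched
```

Dynamic masking applies the same kind of transformation, but at query time based on the requesting user's entitlements rather than in a stored copy.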
Reference:
DAMA-DMBOK2 Guide: Chapter 7 – Data Security
"Data Masking: A Key Component of a Secure Data Management Strategy" by Anjali Kaushik
Managing master data elements can be performed at which of the following points?
D
Explanation:
Managing master data elements can be performed at multiple levels within an organization. This
includes third-party providers such as Dun & Bradstreet (D&B) which can supply enriched and
standardized master data. At the enterprise level, organizations manage master data centrally to
ensure consistency and quality across all systems and processes. Within application suites such as
ERP (Enterprise Resource Planning) systems, master data management ensures that data is
consistent and accurate within and across different applications. Therefore, master data elements
can be managed at all these points.
Reference:
DAMA-DMBOK2 Guide: Chapter 10 – Master and Reference Data Management
"The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling" by Ralph Kimball
Which of the following is a method of deterministic matching?
E
Explanation:
Deterministic matching is a method of record linkage that relies on exact matching criteria. This
means that records are considered a match if certain key fields (e.g., name, Social Security Number)
have exactly the same values. Exact string match is a straightforward example of deterministic
matching, where the strings in specific fields must be identical for a match to be declared. Other
methods like sorted neighborhood, regional frequency, editing distance, and phonetic matching are
probabilistic or heuristic approaches that allow for some degree of variation or error in the data.
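The contrast with editing distance, one of the tolerant methods named above, can be made concrete. The implementation below is the classic dynamic-programming edit distance; the example strings are invented.

```python
# Exact string match (deterministic) vs. editing distance (a tolerant
# technique): exact match fails on any variation, while edit distance
# scores near-matches as close.

def levenshtein(s, t):
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(t) + 1))
    for i, cs in enumerate(s, 1):
        cur = [i]
        for j, ct in enumerate(t, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (cs != ct)))   # substitution
        prev = cur
    return prev[-1]

# A one-letter variation defeats exact match but scores as very close:
assert ("Smith" == "Smyth") is False
assert levenshtein("Smith", "Smyth") == 1
```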
Reference:
DAMA-DMBOK2 Guide: Chapter 10 – Master and Reference Data Management
"Entity Resolution and Information Quality" by John R. Talburt
Master Data is similar to a physical product produced and sold by a company except for which of the
following characteristics?
D
Explanation:
Master Data, similar to a physical product, must meet certain requirements such as fitting
consumers' needs, needing information about its characteristics, impacting business when
unavailable, and having a useful lifespan. However, unlike physical products, Master Data does not
deplete when pulled from inventory. Master Data remains available for use even after being
accessed multiple times, as it is digital information that can be replicated and shared without loss.
Reference:
DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 10: Reference and
Master Data Management.
"Master Data Management and Data Governance" by Alex Berson and Larry Dubov.
Which of the following is a characteristic of a probabilistic matching algorithm?
D
Explanation:
Probabilistic matching algorithms assign a score based on the weight and degree of match, assign
weights to variables based on their discriminating power, and use individual attribute matching
scores to create a match probability percentage. Additionally, after the matching process, some
records typically require manual review and decisioning to ensure accuracy. Therefore, all provided
characteristics describe the nature of probabilistic matching algorithms accurately.
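These characteristics can be sketched together in a few lines. The attribute weights and decision thresholds below are invented for illustration; real implementations derive weights statistically from each attribute's discriminating power.

```python
# Probabilistic matching (illustrative sketch): each attribute carries a
# weight reflecting its discriminating power; per-attribute agreement
# combines into a match score, and borderline scores are routed to
# manual review and decisioning.

WEIGHTS = {"ssn": 0.6, "last_name": 0.25, "zip": 0.15}

def match_score(rec_a, rec_b):
    """Weighted fraction of attributes that agree, in [0, 1]."""
    return sum(w for f, w in WEIGHTS.items() if rec_a.get(f) == rec_b.get(f))

def decide(score, auto_match=0.9, auto_reject=0.5):
    """Classify a score; mid-range scores need a human decision."""
    if score >= auto_match:
        return "match"
    if score < auto_reject:
        return "no match"
    return "manual review"

a = {"ssn": "123-45-6789", "last_name": "Smith", "zip": "30301"}
b = {"ssn": "123-45-6789", "last_name": "Smyth", "zip": "30301"}
assert decide(match_score(a, b)) == "manual review"
```

Unlike the deterministic case, the mismatched surname does not veto the match outright; it merely lowers the score into the review band.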
Reference:
DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 10: Reference and
Master Data Management.
"Master Data Management and Data Governance" by Alex Berson and Larry Dubov
The ISO definition of Master Data quality is which of the following?
D
Explanation:
The ISO definition of Master Data quality focuses on the degree to which the data's characteristics
meet the requirements of individual users. This implies that quality is subjective and depends on
whether the data is suitable and adequate for its intended purpose, fulfilling the specific needs of its
users.
Reference:
ISO 8000-8:2015 - Data quality — Part 8: Information and data quality: Concepts and measuring.
DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 13: Data Quality
Management.
Where is the most time/energy typically spent for any MDM effort?
C
Explanation:
In any Master Data Management (MDM) effort, the most time and energy are typically spent on
vetting business entities and data attributes through the Data Governance process. This step ensures
that the data is accurate, consistent, and adheres to defined standards and policies. It involves
significant collaboration and decision-making among stakeholders to validate and approve the data
elements to be managed.
Reference:
DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 10: Reference and
Master Data Management.
"Master Data Management and Data Governance" by Alex Berson and Larry Dubov.
Can Reference data be used for financial trading?
E
Explanation:
Reference data plays a crucial role in financial trading. It includes data such as financial instrument
identifiers, market data, currency codes, and regulatory classifications. Despite the dynamic nature
of financial trades, reference data provides the necessary static information to execute and settle
transactions. Industry estimates suggest that approximately 70% of the data used in financial
transactions is reference data, underscoring its importance in the financial sector.
Reference:
DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 10: Reference and
Master Data Management.
"The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling" by Ralph Kimball and
Margy Ross.
Industry publications and whitepapers on reference data management in financial services.