IBM C1000-150 practice test

IBM Cloud Pak for Business Automation v21.0.3 Administration

Last exam update: Nov 18, 2025
Page 1 out of 4. Viewing questions 1-15 out of 60

Question 1

Container Application Software for Enterprises (CASE) is a specification that defines what?

  • A. Metadata and structure for packaging, managing, and unpacking containerized applications.
  • B. OpenShift command syntax for file mirroring.
  • C. An industry standard naming convention for OpenShift namespaces.
  • D. An Authentication standard for accessing containerized applications.
Answer:

A


Explanation:
Container Application Software for Enterprises (CASE) is a specification that defines the metadata
and structure for packaging, managing, and unpacking containerized applications. By defining a
standard package format, CASE lets containerized applications be shared, distributed, and deployed
consistently across different platforms and environments.
https://developer.ibm.com/blogs/container-application-software-for-enterprises-packaging-spec/


Question 2

What is the benefit of using the Rsyslog Sidecar?

  • A. Offers easy adoption of audit logging and shifts the burden of transmitting the messages to the sidecar.
  • B. Multiple deployments of audit logging controller policies.
  • C. Ability to run a service and the rsyslog sidecar in a separate namespace from the fluentd instance.
  • D. More flexibility as to the format structure of the audit logs which can support JSON, XML, and YAML.
Answer:

A


Explanation:
https://www.ibm.com/docs/en/cpfs?topic=operator-architecture-audit-logging-version-370
Rsyslog is an open-source software utility for forwarding log messages in an IP network. It can act as
a centralized log server and can also be used as a sidecar in a Kubernetes environment. The Rsyslog
sidecar can be used to provide easier adoption of audit logging by shifting the burden of transmitting
log messages to the sidecar. This allows for a more streamlined process for implementing audit
logging within a Kubernetes environment.
Reference:
https://rsyslog.com/
https://kubernetes.io/docs/concepts/cluster-administration/logging/
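
To make the sidecar pattern concrete, here is a minimal sketch (using the Python kubernetes client) of a pod in which an application container writes audit messages to a shared emptyDir volume and an rsyslog sidecar container picks them up for forwarding. The image names, mount path, and namespace are illustrative assumptions, not the actual Cloud Pak audit-logging configuration.

```python
# Illustrative sketch only: a generic "app + rsyslog sidecar" pod, not the
# actual IBM Cloud Pak audit-logging setup. All names are assumptions.
from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() inside a cluster

# Shared scratch volume where the app writes audit messages and the sidecar reads them.
shared_logs = client.V1Volume(
    name="audit-logs",
    empty_dir=client.V1EmptyDirVolumeSource(),
)

app = client.V1Container(
    name="app",
    image="example/app:latest",  # hypothetical image
    volume_mounts=[client.V1VolumeMount(name="audit-logs", mount_path="/var/log/audit")],
)

sidecar = client.V1Container(
    name="rsyslog-sidecar",
    image="example/rsyslog:latest",  # hypothetical image that forwards /var/log/audit
    volume_mounts=[client.V1VolumeMount(name="audit-logs", mount_path="/var/log/audit")],
)

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="audited-app", namespace="demo"),
    spec=client.V1PodSpec(containers=[app, sidecar], volumes=[shared_logs]),
)

client.CoreV1Api().create_namespaced_pod(namespace="demo", body=pod)
```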


Question 3

A business user wants to integrate events coming from BPMN workflows and from ADS. Which setup
would serve this purpose?

  • A. Avro schema
  • B. BAI Canonical model
  • C. Fixed format
  • D. Kafka unified data model
Answer:

D


Explanation:
Kafka is a distributed streaming platform that can be used to integrate events coming from BPMN
workflows and from ADS (Automation Decision Services). A Kafka unified data model gives events from
these different sources a standardized, centralized representation, so the data can be normalized and
then used for analytics, reporting, and real-time processing.
Reference:
https://kafka.apache.org/
https://kafka.apache.org/documentation/streams/
https://kafka.apache.org/documentation/streams/core-concepts
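
As an illustration of the idea of a unified event model over Kafka, the sketch below (Python, kafka-python library) wraps events from different producers in one common envelope before publishing them to a shared topic. The broker address, topic name, and envelope fields are assumptions for this example, not IBM's actual event format.

```python
# Hedged sketch: publishing BPMN and ADS events through Kafka with one shared
# envelope. Broker, topic, and field names are illustrative assumptions.
import json
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="kafka:9092",  # assumed broker address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def emit(source: str, event_type: str, payload: dict) -> None:
    """Wrap every event in one common envelope so downstream consumers
    can process BPMN and ADS events uniformly."""
    producer.send("automation-events", {  # assumed topic name
        "source": source,       # e.g. "bpmn-workflow" or "ads-decision"
        "type": event_type,
        "payload": payload,
    })

emit("bpmn-workflow", "task-completed", {"processId": "P-42", "task": "approve"})
emit("ads-decision", "decision-evaluated", {"ruleset": "pricing", "result": "ok"})
producer.flush()
```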


Question 4

What kind of data is written to the Business Automation Workflow transaction log file?

  • A. Installation and profile creation data
  • B. Data written to databases
  • C. LDAP query data
  • D. REST requests data
Answer:

B


Explanation:
The Business Automation Workflow (BAW) transaction log file contains information about the data
that is written to databases. It records the data that is written to the database as part of a BAW
process. This can include information such as the process instance ID, the task that was executed, the
user that performed the task, and the data that was written to the database. This log file is useful for
troubleshooting and auditing purposes.
Reference:
https://www.ibm.com/support/knowledgecenter/en/SSYHZ8_18.0.x/com.ibm.dba.baw/t_baw_transaction_log.html
https://www.ibm.com/support/knowledgecenter/en/SSYHZ8_18.0.x/com.ibm.dba.baw/t_baw_transaction_log_file.html


Question 5

Which item is best for troubleshooting FileNet Content Engine authentication issues?

  • A. messages.log
  • B. console.log
  • C. systemout.log
  • D. LDAP configuration data
Answer:

C


Explanation:
The SystemOut.log file contains details about the security credentials used to access the FileNet
Content Engine, and any authentication errors are recorded there. Other useful resources for
troubleshooting authentication issues include the IBM Knowledge Center, IBM Support, and the IBM
FileNet Content Engine forum.


Question 6

Prior to deploying the Cloud Pak for Business Automation operator, which two common prerequisites
exist for all Cloud Pak for Business Automation capabilities (excluding Business Automation Insights)?

  • A. LDAP
  • B. Database
  • C. Persistent Volumes
  • D. Network Policies
  • E. Routes
Answer:

BC


Explanation:
Before deploying the Cloud Pak for Business Automation operator, the required databases and
persistent volumes must be in place so that the deployed capabilities can store and persist their
data. The other options are not prerequisites for all Cloud Pak for Business Automation capabilities.
Reference:
[1] https://www.ibm.com/support/knowledgecenter/SSFTN5_9.7.1/com.ibm.wbpm.inst.doc/topics/t_k8s_prereq.html
[2] https://www.ibm.com/support/knowledgecenter/SSFTN5_9.7.1/com.ibm.wbpm.inst.doc/topics/t_k8s_install_operator.html
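
In practice the persistent-volume prerequisite means having storage the deployment can claim before the operator is installed. The following sketch (Python kubernetes client) creates a persistent volume claim; the storage class, size, access mode, and namespace are placeholder assumptions rather than values required by Cloud Pak for Business Automation.

```python
# Illustrative sketch: requesting persistent storage ahead of an install.
# Storage class, size, and namespace are placeholder assumptions.
from kubernetes import client, config

config.load_kube_config()

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="cp4ba-shared-pvc", namespace="cp4ba"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteMany"],             # shared across pods
        storage_class_name="managed-nfs-storage",   # assumed storage class
        resources=client.V1ResourceRequirements(requests={"storage": "20Gi"}),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(namespace="cp4ba", body=pvc)
```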


Question 7

Which permission can be granted in order to see the RPA Server option in the Platform UI navigation
menu?

  • A. rpa-manage
  • B. rpa-owner
  • C. rpa-develop
  • D. rpa-edit
Answer:

B


Explanation:
The rpa-owner permission must be granted for the RPA Server option to appear in the Platform UI
navigation menu. To grant the permission: 1) Log in to the Robotic Process Automation (RPA) server.
2) Navigate to the Settings tab. 3) Select the Security tab. 4) Select the Roles & Permissions tab.
5) Select the rpa-owner permission. 6) Click Save.


Question 8

Which two roles have the permission to connect to an LDAP directory?

  • A. Editor
  • B. Cloud Pak Administrator
  • C. Cluster Administrator
  • D. Operator
  • E. Viewer
Answer:

BC


Explanation:
The two roles that have the permission to connect to an LDAP directory are Cloud Pak Administrator
and Cluster Administrator.
Cloud Pak Administrator is a role that has the highest level of access and can perform all the
administrator tasks for the Cloud Pak.
Cluster Administrator is a role that has the permission to manage the resources of a Kubernetes
cluster such as connecting to an LDAP directory, configuring security settings, and managing users
and roles.
https://www.ibm.com/support/knowledgecenter/en/SSYHZ8_20.0.x/com.ibm.dba.baw.rpa/rpa_security_authorization.html
https://www.ibm.com/support/knowledgecenter/en/SSYHZ8_20.0.x/com.ibm.dba.baw.rpa/rpa_user_manage_users.html
https://www.ibm.com/support/knowledgecenter/en/SSYHZ8_20.0.x/com.ibm.dba.baw.rpa/rpa_security_permission.html


Question 9

What kind of probe can be used to determine if an application running in a pod is healthy?

  • A. Liveness probe
  • B. Readiness probe
  • C. Status probe
  • D. Starter probe
Answer:

A


Explanation:
The most suitable probe in this case would be a liveness probe. This type of probe is used to detect if
an application is running correctly and is able to respond to requests. It is usually used in conjunction
with a readiness probe to ensure that the application is both healthy and ready to serve requests.
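
Below is a minimal sketch of how liveness and readiness probes are declared on a container, here using the Python kubernetes client; the endpoints, port, and timings are illustrative assumptions.

```python
# Minimal sketch: liveness and readiness probes on a container.
# Paths, port, and timings are illustrative assumptions.
from kubernetes import client

liveness = client.V1Probe(
    http_get=client.V1HTTPGetAction(path="/healthz", port=8080),
    initial_delay_seconds=30,   # give the application time to start
    period_seconds=10,          # check every 10 seconds
    failure_threshold=3,        # restart the container after 3 consecutive failures
)

readiness = client.V1Probe(
    http_get=client.V1HTTPGetAction(path="/ready", port=8080),
    period_seconds=5,           # gate traffic until the application reports ready
)

container = client.V1Container(
    name="app",
    image="example/app:latest",  # hypothetical image
    liveness_probe=liveness,
    readiness_probe=readiness,
)
```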


Question 10

The migration of data from Cloud Pak for Business Automation versions that do not support an
upgrade requires an Administrator to follow which process?

  • A. All Cloud Pak for Business Automation versions currently support a direct upgrade with the standard applicable procedures.
  • B. Uninstall the current deployment and follow the migration instructions for each component to point to the existing persistent stores.
  • C. Upgrade the core capabilities first, then the Cloud Pak for Business Automation Operator, followed by the Foundation Operator.
  • D. Mirror the existing persistent stores allowing the Operator to upgrade accordingly.
Answer:

B


Explanation:
When migrating data from a Cloud Pak for Business Automation version that does not support a direct
upgrade, an Administrator must uninstall the current deployment and then follow the migration
instructions for each component, pointing the new deployment to the existing persistent stores. This
involves migrating data from the existing databases and persistent volumes to the new deployment,
and it may also require creating new configurations and customizing the new deployment to match
the previous one.
Reference:
https://www.ibm.com/support/knowledgecenter/en/SSYHZ8_20.0.x/com.ibm.dba.baw.install/topics/install_migrate_data.html
https://www.ibm.com/support/knowledgecenter/en/SSYHZ8_20.0.x/com.ibm.dba.baw.install/topics/install_migrate_data_prereq.html
https://www.ibm.com/support/knowledgecenter/en/SSYHZ8_20.0.x/com.ibm.dba.baw.install/topics/install_migrate_data_procedure.html


Question 11

During a Cloud Pak for Business Automation installation, which component helps install, update, and
manage the lifecycle of all operators and services that are deployed in OpenShift Container Platform
clusters?

  • A. Operator Hub catalog
  • B. Operator Lifecycle Manager
  • C. Operator Framework
  • D. Operator discovery services
Answer:

B


Explanation:
Operator Lifecycle Manager (OLM) is a component of the OpenShift Container Platform that helps
install, update, and manage the lifecycle of all operators and their associated services that are
deployed in OpenShift Container Platform clusters, including the Cloud Pak for Business Automation
operators.
Reference:
https://www.ibm.com/blogs/cloud-pak-for-automation/what-is-the-operator-lifecycle-manager-olm/
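
For illustration, an operator installation is expressed to OLM as a Subscription custom resource. The sketch below creates one with the Python kubernetes client; the package name, channel, and catalog source are placeholders, not the exact Cloud Pak for Business Automation values.

```python
# Hedged sketch: asking OLM to install and manage an operator by creating a
# Subscription. Package, channel, and catalog source are placeholder values.
from kubernetes import client, config

config.load_kube_config()

subscription = {
    "apiVersion": "operators.coreos.com/v1alpha1",
    "kind": "Subscription",
    "metadata": {"name": "example-operator", "namespace": "openshift-operators"},
    "spec": {
        "channel": "stable",               # assumed update channel
        "name": "example-operator",        # assumed package name in the catalog
        "source": "certified-operators",   # assumed catalog source
        "sourceNamespace": "openshift-marketplace",
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="operators.coreos.com",
    version="v1alpha1",
    namespace="openshift-operators",
    plural="subscriptions",
    body=subscription,
)
```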


Question 12

Where do the images reside for an air-gapped Cloud Pak for Business Automation upgrade?

  • A. IBM registry
  • B. RedHat quay.io registry
  • C. Local registry
  • D. Docker Hub
Answer:

C


Explanation:
When performing an air-gapped upgrade of Cloud Pak for Business Automation, the images used for
the upgrade reside in a local registry. An air-gapped environment is one in which there is no external
network access, so the images cannot be pulled from a remote registry such as IBM registry, RedHat
quay.io registry, or Docker Hub. Instead, the images must be pre-pulled and stored in a local registry
that is accessible to the OpenShift cluster.
Reference:
https://www.ibm.com/support/knowledgecenter/en/SSYHZ8_20.0.x/com.ibm.dba.baw.install/topics/install_airgap_prereq.html
https://www.ibm.com/support/knowledgecenter/en/SSYHZ8_20.0.x/com.ibm.dba.baw.install/topics/install_airgap_procedure.html
https://www.ibm.com/support/knowledgecenter/en/SSYHZ8_20.0.x/com.ibm.dba.baw.install/topics/install_airgap_registry.html


Question 13

What is a best practice for application pod high availability?

  • A. Use multiple pods across different worker nodes.
  • B. Use multiple pods across different master nodes.
  • C. Use multiple small pods on a master node.
  • D. Use multiple pods across both worker and master nodes.
Answer:

D


Explanation:
A best practice for application pod high availability is to run multiple pod replicas spread across
both worker and master nodes; this provides redundancy so that the application remains available if
any single node fails.
Reference:
1. "Pod Autoscaler Best Practices" from the Kubernetes docs: https://kubernetes.io/docs/tasks/run-application/horizontal-pod-autoscale-best-practices/
2. "High Availability for Containerized Workloads with Kubernetes" from AWS: https://aws.amazon.com/blogs/compute/high-availability-for-containerized-workloads-with-kubernetes/
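
One common way to keep replicas on different nodes is pod anti-affinity. The sketch below (Python kubernetes client) requires replicas carrying the same app label to land on different hosts; the label and topology key are conventional values but still assumptions for this example.

```python
# Illustrative sketch: spread replicas across nodes with pod anti-affinity.
# The label selector and topology key are common conventions, used here as assumptions.
from kubernetes import client

anti_affinity = client.V1Affinity(
    pod_anti_affinity=client.V1PodAntiAffinity(
        required_during_scheduling_ignored_during_execution=[
            client.V1PodAffinityTerm(
                label_selector=client.V1LabelSelector(match_labels={"app": "myapp"}),
                topology_key="kubernetes.io/hostname",  # at most one matching replica per node
            )
        ]
    )
)

pod_spec = client.V1PodSpec(
    affinity=anti_affinity,
    containers=[client.V1Container(name="app", image="example/app:latest")],
)
```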


Question 14

A starter deployment requires which two capabilities to be installed independently?

  • A. Operational Decision Manager and Automation Decision Services
  • B. Content Platform Engine and Navigator
  • C. Process Mining and Robotic Process Automation
  • D. Business Automation Insights and Kafka
Answer:

A


Explanation:
In a starter deployment of Cloud Pak for Business Automation, two capabilities must be installed
independently:
Operational Decision Manager (ODM) and Automation Decision Services (ADS): these capabilities
provide a set of tools for creating and managing business rules, decision services, and analytics, and
they are typically used to automate decision-making processes within an organization.
https://www.ibm.com/support/knowledgecenter/en/SSYHZ8_20.0.x/com.ibm.dba.baw.install/topics/install_overview.html


Question 15

How are the parameters set for accessing the images in an OpenShift Container Platform
environment?

  • A. In the custom resource file
  • B. In the XML config file
  • C. Using the oc set command
  • D. In the environment variable
Answer:

C


Explanation:
In an OpenShift Container Platform environment, the parameters for accessing images are set using
the oc set command. This command allows an administrator to configure image pull secrets and other
image-related parameters, along with settings such as environment variables and resource limits, for
a specific deployment, service, pod, or other resource in the OpenShift cluster.
Reference:
[1] https://docs.openshift.com/container-platform/4.4/openshift_images/image-pull-secrets.html
[2] https://docs.openshift.com/container-platform/4.4/cli_reference/openshift_cli/oc-set.html
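
As a complementary view of what such commands configure, the sketch below uses the Python kubernetes client to create a registry pull secret and attach it to the default service account; the registry address, credentials, and namespace are placeholders for this example.

```python
# Hedged sketch: wiring registry credentials for image pulls via the API,
# an alternative view of what image-access configuration amounts to.
# Registry, credentials, and namespace are placeholder assumptions.
import base64, json
from kubernetes import client, config

config.load_kube_config()

docker_cfg = {
    "auths": {
        "registry.example.com": {  # assumed registry host
            "auth": base64.b64encode(b"user:password").decode(),
        }
    }
}

secret = client.V1Secret(
    metadata=client.V1ObjectMeta(name="regcred", namespace="demo"),
    type="kubernetes.io/dockerconfigjson",
    string_data={".dockerconfigjson": json.dumps(docker_cfg)},
)

core = client.CoreV1Api()
core.create_namespaced_secret(namespace="demo", body=secret)

# Attach the pull secret to the default service account so pods can use it.
core.patch_namespaced_service_account(
    name="default",
    namespace="demo",
    body={"imagePullSecrets": [{"name": "regcred"}]},
)
```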
