Qlik Replicate (QREP) practice test

Qlik Replicate

Last exam update: Nov 18, 2025
Page 1 out of 4. Viewing questions 1-15 out of 60

Question 1

Which is the path to add a new column to a single table in a task?

  • A. Table Selection -> Schemas -> Add Column
  • B. New Transformation -> Column -> Add Column
  • C. Select Table -> Transform -> Add New
  • D. Table Settings -> General -> Add New Column
Answer: D


Explanation:
To add a new column to a single table in a Qlik Replicate task, the correct path is through Table Settings. The typical process is:
Navigate to the Table Settings of the table you wish to modify within your task.
Go to the General section.
Use the option to Add New Column.
This adds a column directly to the table's schema as part of the task configuration. Note that this action is part of the task's design phase, where you specify the schema changes that should be applied to the data as it is replicated.
The other options, such as New Transformation or Select Table -> Transform, are not the direct paths for adding a new column to a table's schema within a task; they relate to other aspects of task configuration and transformation.


Question 2

Using Qlik Replicate, how can the timestamp shown be converted to Unix time (Unix epoch - the number of seconds since January 1st, 1970)?

  • A. SELECT datetime(1092941466, 'unixepoch', 'localtime');
  • B. SELECT datetime(482340664, 'localtime', 'unixepoch');
  • C. strftime('%s', SAR_H_COMMIT_TIMESTAMP) - datetime.datetime('%s','1970-01-01 00:00:00')
  • D. strftime('%s', SAR_H_COMMIT_TIMESTAMP) - strftime('%s','1970-01-01 00:00:00')
  • E. Time.now.strftime('%s','1970-01-01 00:00:00')
Answer: D


Explanation:
The goal is to convert a timestamp to Unix time (seconds since January 1, 1970). The strftime function formats date and time values, and the %s format specifier returns the number of seconds since the Unix epoch. The correct syntax is:
strftime('%s', SAR_H_COMMIT_TIMESTAMP) - strftime('%s','1970-01-01 00:00:00')
Here's a breakdown of the expression:
strftime('%s', SAR_H_COMMIT_TIMESTAMP) converts SAR_H_COMMIT_TIMESTAMP to Unix time.
strftime('%s','1970-01-01 00:00:00') gives the Unix time of the epoch start date, which is 0.
Because the second term evaluates to 0, the subtraction does not change the result; it simply makes the reference to the epoch start explicit. If the timestamp is in a different time zone or format, adjustments may be needed. This usage is consistent with the Qlik Replicate documentation and standard SQLite date and time functions.
The other options do not correctly perform the conversion:
Options A and B use datetime instead of strftime; datetime converts a Unix epoch value into a date/time string, which is the opposite of what is required.
Option C includes datetime.datetime, which is not a valid function in Qlik Replicate and appears to mix Python code with SQL.
Option E uses Time.now.strftime, which is Ruby code and is not applicable in the context of Qlik Replicate.
Therefore, the verified answer is D, as it correctly uses the strftime function to convert a timestamp to Unix time in Qlik Replicate.
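
As an illustration only, the following SQLite-style snippet (the syntax Qlik Replicate expressions are based on) shows the same conversion with a hypothetical literal timestamp standing in for SAR_H_COMMIT_TIMESTAMP:
-- Minimal sketch; '2024-06-01 12:30:00' is an example value, not data from the question.
SELECT strftime('%s', '2024-06-01 12:30:00')
       - strftime('%s', '1970-01-01 00:00:00') AS unix_time;
-- strftime('%s', ...) already returns seconds since 1970-01-01 00:00:00 UTC,
-- so the second term evaluates to 0 and the subtraction leaves the result unchanged.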


Question 3

Which information in Qlik Replicate can be retrieved from the server logs?

  • A. Network and performance issues
  • B. Load status and performance of task
  • C. Specific task information
  • D. Qlik Replicate Server status
Answer: D


Explanation:
The server logs in Qlik Replicate provide information about the Qlik Replicate Server instance rather than individual tasks. The logs can include various levels of information, such as errors, warnings, info, trace, and verbose details. In practice, the server logs may touch on:
Network and performance issues: these might be indicated by error or warning messages related to connectivity or performance bottlenecks.
Load status and performance of tasks: while the server logs focus on the server instance, they may contain information about overall load status and performance, especially if server-level issues affect tasks.
Specific task information: the server logs can include information about tasks, particularly errors or warnings that pertain to task execution at the server level.
Qlik Replicate Server status: general information about the server's health, status, and any significant events that affect its operation.
Therefore, while the server logs can contain a range of information, their primary purpose is to provide details on the Qlik Replicate Server status (D), including any issues that may impact the server's ability to function properly and manage tasks.


Question 4

Which two components are responsible for reading data from the source endpoint and writing it to
the target endpoint in Full Load replication? (Select two.)

  • A. SOURCE_UNLOAD
  • B. TARGET_APPLY
  • C. TARGET_UNLOAD
  • D. SOURCE_CAPTURE
  • E. TARGET_LOAD
Answer: A, E


Explanation:
The SOURCE_UNLOAD component is responsible for reading (unloading) data from the source endpoint; it extracts the data that needs to be replicated to the target system.
The TARGET_LOAD component is responsible for writing the data to the target endpoint; after the data is extracted by SOURCE_UNLOAD, TARGET_LOAD ensures that the data is properly inserted into the target system.
These two components work in tandem during the Full Load replication process to move data from the source to the target, handling the extraction and loading phases respectively.
The other options do not align with the Full Load replication process:
B. TARGET_APPLY and D. SOURCE_CAPTURE are associated with the Change Data Capture (CDC) process, not the Full Load process.
C. TARGET_UNLOAD is not a recognized component in the context of Qlik Replicate's Full Load replication.
Therefore, the correct answers are A. SOURCE_UNLOAD and E. TARGET_LOAD, as they are the components that handle the reading and writing of data during Full Load replication.


Question 5

Where are the three options in Qlik Replicate used to read the log files located? (Select three.)

  • A. In Windows Event log
  • B. In Diagnostic package
  • C. In External monitoring tool
  • D. In Data directory of Installation
  • E. In Monitor of Qlik Replicate
  • F. In Enterprise Manager
Answer: B, D, E


Explanation:
In Qlik Replicate, the log files can be read from the following locations:
In Diagnostic package (B): The diagnostic package in Qlik Replicate includes various log files that can be used for troubleshooting and analysis purposes.
In Data directory of Installation (D): The log files are written to the log directory within the data directory. This is the primary location where Qlik Replicate writes its log files, and it is not possible to change this location.
In Monitor of Qlik Replicate (E): The Monitor feature of Qlik Replicate allows users to view and manage log files. Users can access the Log Viewer from the Server Logging Levels or File Transfer Service Logging Level sub-tabs.
The other options do not align with the locations where log files can be read in Qlik Replicate:
A. In Windows Event log: This is not a location where Qlik Replicate log files are stored.
C. In External monitoring tool: While external monitoring tools can be used to read log files, they are not a direct feature of Qlik Replicate for reading log files.
F. In Enterprise Manager: The Enterprise Manager is a separate component that can manage and monitor multiple Qlik Replicate instances, but it is not where log files are directly read.
Therefore, the verified answers are B, D, and E, as they represent the locations within Qlik Replicate where log files can be accessed and read.


Question 6

In the CDC mode of a Qlik Replicate task, which option can be set for Batch optimized apply mode?

  • A. Source connection processes
  • B. Number of changed records
  • C. Time and/or volume
  • D. Maximum time to batch transactions
Answer: C


Explanation:
 In Change Data Capture (CDC) mode, Batch optimized apply mode can be set based on time and/or
volume.
 This means that the batching of transactions can be controlled by specifying time intervals or the
volume of data changes to be batched together.
 This optimization helps improve performance by reducing the frequency of writes to the target
system and handling large volumes of changes efficiently. The Qlik Replicate documentation outlines
this option as a method to enhance the efficiency of data replication in CDC mode by batching
transactions based on specific criteria.
In the Change Data Capture (CDC) mode of a Qlik Replicate task, when using the Batch optimized
apply mode, the system allows for tuning based on time and/or volume. This setting is designed to
optimize the application of changes in batches to the target system. Here’s how it works:
Time: You can set intervals at which batched changes are applied.
This includes setting a minimum
amount of time to wait between each application of batch changes, as well as a maximum time to
wait before declaring a timeout1
.
Volume: The system can be configured to force apply a batch when the processing memory exceeds a
certain threshold.
This allows for the consolidation of operations on the same row, reducing the
number of operations on the target to a single transaction2
.
The other options provided do not align with the settings for Batch optimized apply mode in CDC
tasks:
A . Source connection processes: This is not a setting related to the batch apply mode.
B . Number of changed records: While the number of changed records might affect the batch size, it
is not a setting that can be directly configured in this context.
D . Maximum time to batch transactions: This option is related to the time aspect but does not fully
capture the essence of the setting, which includes both time and volume considerations.
Therefore, the verified answer is C. Time and/or volume, as it accurately represents the options that
can be set for Batch optimized apply mode in the CDC tasks of Qlik Replicate21
.


Question 7

How should missing metadata be added in a Qlik Replicate task after the task has been stopped?

  • A. Drop tables or delete tables and data on target side, then run task from a certain timestamp
  • B. Under Advanced Run option choose reload target, stop task again, and then resume processing
  • C. Under Advanced Run option choose metadata only, stop task again, and then resume processing
  • D. Drop tables and data on the target side, run advanced option, create metadata, and then resume task
Answer: C


Explanation:
If a task has missing metadata, first stop the task, then use the Advanced Run options to process the metadata only. The procedure is:
Select the task that requires metadata to be added and stop it.
Go to the Advanced Run options for the task.
Choose the Metadata Only option, which has two sub-options:
Recreate all tables and then stop: rebuilds metadata for all available tables in the task.
Create missing tables and then stop: rebuilds metadata only for the missing tables or the tables that were newly added to the task.
Start the task with this setting so that only the metadata is processed, without affecting the existing data on the target side.
Stop the task again after the metadata is added, then resume normal processing.
The other options are not the recommended methods for adding missing metadata:
A and D suggest dropping tables or data, which is not necessary for simply adding metadata.
B suggests reloading the target, which is not the same as updating metadata only.
Therefore, the verified answer is C, as it accurately describes the process of adding missing metadata to a Qlik Replicate task using the Advanced Run options.


Question 8

When running a task in Qlik Replicate (from Oracle to MS SQL), the following error message appears: "Failed adding supplemental logging for table 'Table name'". What must be done to fix this error?

  • A. Contact the Oracle DBA
  • B. Check the permission on the target endpoint
  • C. Enable supplemental logging
  • D. Check the permission of the source endpoint
Answer: C


Explanation:
The error message "Failed adding supplemental logging for table 'Table name'" indicates that supplemental logging is not enabled on the Oracle source database. Supplemental logging is necessary for Qlik Replicate to capture changes in the Oracle database accurately, especially for Change Data Capture (CDC) operations.
To resolve this error:
Enable supplemental logging at the database level by executing the following SQL command in the Oracle database:
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
This command enables minimal supplemental logging, which is required for Qlik Replicate to function correctly.
If you need to enable supplemental logging for all columns, you can use the following SQL command:
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
This ensures that all necessary column data is logged for replication purposes.
After enabling supplemental logging, verify that it is active by querying the v$database view:
SELECT supplemental_log_data_min FROM v$database;
The return value should be 'YES', indicating that supplemental logging is enabled. Once logging is enabled and verified, retry the replication task.
The other options are not directly related to the issue of supplemental logging:
A. Contact the Oracle DBA: While contacting the DBA might be helpful, the specific action needed is to enable supplemental logging.
B. Check the permission on the target endpoint: Permissions on the target endpoint are not related to the supplemental logging requirement on the source database.
D. Check the permission of the source endpoint: Permissions on the source endpoint are important, but the error message specifically refers to the need for supplemental logging.
Therefore, the verified answer is C. Enable supplemental logging, as it directly addresses the requirement to fix the error.
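
As a consolidated sketch only (the schema and table name HR.EMPLOYEES are placeholders, not taken from the question), the statements an Oracle DBA would typically run look like this:
-- Enable minimal (database-level) supplemental logging, required by Qlik Replicate:
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
-- Optionally log all columns at the database level:
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
-- Or enable it per table, e.g. primary-key logging for a placeholder table:
ALTER TABLE HR.EMPLOYEES ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS;
-- Verify that minimal supplemental logging is active (expected result: YES):
SELECT supplemental_log_data_min FROM v$database;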


Question 9

Which is the minimum level of permissions required for a user to delete tasks?

  • A. Operator
  • B. Viewer
  • C. Designer
  • D. Admin
Answer: C


Explanation:
According to the Qlik Replicate documentation, the minimum level of permissions required for a user to delete tasks is the Designer role. The predefined roles in Qlik Replicate come with different sets of permissions: the Admin and Designer roles can delete tasks, while the Operator and Viewer roles cannot.
Here's a breakdown of the permissions for each role related to task management:
Admin: Can create, design, and delete tasks.
Designer: Can create, design, and delete tasks.
Operator: Can perform runtime operations like start, stop, or reload targets, but cannot delete tasks.
Viewer: Can view task history and other details, but cannot perform task management operations like deleting tasks.


Question 10

A Qlik Replicate administrator needs to configure Oracle as a source endpoint before running a task in Qlik Replicate. Which are three key prerequisites? (Select three.)

  • A. Enable supplemental logging
  • B. Install the Oracle Instant Client
  • C. Complete a full backup of the source
  • D. Enable ARCHIVELOG mode
  • E. Provide Oracle read-only privileges
  • F. Configure Oracle Recovery Model
Answer: A, B, D


Explanation:
When configuring Oracle as a source endpoint for Qlik Replicate, there are several key prerequisites
that need to be met:
Enable supplemental logging (A): Supplemental logging is crucial for capturing the changes in the
Oracle database accurately, especially for Change Data Capture (CDC) operations.
It ensures that all
necessary column data is logged for replication purposes1
.
Install the Oracle Instant Client (B): The Oracle Instant Client provides the necessary libraries for Qlik
Replicate to connect to and interact with the Oracle database.
It’s required for the proper functioning
of Qlik Replicate2
.
Enable ARCHIVELOG mode (D): ARCHIVELOG mode is necessary for the Oracle database to archive
redo logs, which Qlik Replicate uses to capture changes.
This mode allows the database to continue
functioning and preserve the logs even after a log switch, which is essential for CDC1
.
The other options provided are not listed as key prerequisites for configuring Oracle as a source
endpoint in Qlik Replicate:
C . Complete a full backup of the source: While it’s a good practice to have a backup, it’s not a
prerequisite for configuring the source endpoint.
E . Provide Oracle read-only privileges: Read-only privileges might be necessary for certain
operations, but they are not listed as a key prerequisite.
F . Configure Oracle Recovery Model: This is not mentioned as a prerequisite in the Qlik Replicate
documentation.
Therefore, the verified answers are A, B, and D, as they represent the necessary steps to configure
Oracle as a source endpoint in Qlik Replicate12
.
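
For illustration, and assuming SYSDBA access on the source database, the following Oracle statements can be used to check and enable the two database-level prerequisites (ARCHIVELOG mode and supplemental logging); they are a sketch, not part of the exam question:
-- Check the current log mode (expected: ARCHIVELOG):
SELECT log_mode FROM v$database;
-- Check whether minimal supplemental logging is enabled (expected: YES):
SELECT supplemental_log_data_min FROM v$database;
-- If needed, switch the database to ARCHIVELOG mode (requires a restart in MOUNT state):
-- SHUTDOWN IMMEDIATE;
-- STARTUP MOUNT;
-- ALTER DATABASE ARCHIVELOG;
-- ALTER DATABASE OPEN;
-- Enable minimal supplemental logging:
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;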


Question 11

Which two task logging components are associated with a Full Load to a target endpoint? (Select two.)

  • A. TARGET_APPLY
  • B. TARGET_LOAD
  • C. FILE_TRANSFER
  • D. STREAM
  • E. SOURCE_UNLOAD
Answer: B, E


Explanation:
When performing a Full Load to a target endpoint in Qlik Replicate, the task logging components associated with this process are TARGET_LOAD and SOURCE_UNLOAD.
TARGET_LOAD: This component is responsible for loading the data into the target endpoint. It represents the process where Qlik Replicate reads all columns/rows from the source database and creates an exact copy on the target database.
SOURCE_UNLOAD: This component is involved in unloading the data from the source endpoint. It is the part of the Full Load process where the data is read from the source and prepared for transfer to the target.
The other options are not directly associated with the Full Load process to a target endpoint:
TARGET_APPLY is related to the Change Data Capture (CDC) phase, where changes from the source are applied to the target.
FILE_TRANSFER is not a logging component directly associated with Full Load.
STREAM refers to Log Stream tasks, a different type of task configuration used to save data changes from the transaction log of a single source database and apply them to multiple targets.
For a comprehensive understanding of the task types and options in Qlik Replicate, refer to the official Qlik community articles "Qlik Replicate Task Configuration Options" and "An Introduction to Qlik Replicate Tasks: Full Load vs CDC".


Question 12

Which are valid source endpoint types for Qlik Replicate change processing (CDC)? (Select two.)

  • A. Classic Relational RDBMS
  • B. MS Dynamics direct access
  • C. SAP ECC and Extractors
  • D. Generic REST APIs and Data Lake file formats
Answer: A, C


Explanation:
For Qlik Replicate's Change Data Capture (CDC) process, the valid source endpoint types include:
A. Classic Relational RDBMS: Traditional relational database management systems support CDC. Qlik Replicate captures changes from these systems using log-based CDC.
C. SAP ECC and Extractors: SAP ECC (ERP Central Component) and its extractors are also supported as source endpoints for CDC in Qlik Replicate, allowing data changes to be replicated from SAP's complex data structures.
The other options are not typically associated with CDC in Qlik Replicate:
B. MS Dynamics direct access: While Qlik Replicate can connect to various data sources, MS Dynamics is not commonly listed as a direct source for CDC.
D. Generic REST APIs and Data Lake file formats: REST APIs and Data Lake file formats are not standard sources for CDC, as they do not maintain transaction logs, which are essential for CDC to track changes.
For detailed information on setting up source endpoints and enabling CDC, refer to the official Qlik documentation and community articles that discuss the prerequisites and configurations needed for various source endpoints.


Question 13

How can the task diagnostic package be downloaded?

  • A. Open task from overview -> Monitor -> Tools -> Support -> Download diagnostic package
  • B. Open task from overview -> Run -> Tools -> Download diagnostic package
  • C. Go to server settings -> Logging -> Right-click task -> Support -> Download diagnostic package
  • D. Right-click task from overview -> Download diagnostic package
Answer: A


Explanation:
To download the task diagnostic package in Qlik Replicate, follow these steps:
Open the task from the overview in the Qlik Replicate Console.
Switch to the Monitor view.
Click the Tools toolbar button.
Navigate to Support.
Select Download Diagnostic Package.
This generates a task-specific diagnostics package that contains the task log files and various debugging data that may assist in troubleshooting task-related issues. Depending on your browser settings, the file will either be downloaded automatically to your designated download folder, or you will be prompted to download it. The file is named in the format <task_name>__diagnostics__<timestamp>.zip.
The other options do not accurately describe the process for downloading a diagnostic package in Qlik Replicate:
B does not provide a valid path.
C incorrectly suggests going to server settings and logging, which is not the correct procedure.
D suggests a method that is not documented in the official Qlik Replicate help resources.
Therefore, the verified answer is A, as it correctly outlines the steps to download a diagnostic package in Qlik Replicate.


Question 14

An operative database can only commit two engines to Qlik Replicate for initial loads at any given time. How should the task settings be modified?

  • A. Apply Change Processing Tuning and increase the Apply batched changes intervals to 60 seconds
  • B. Qlik Replicate tasks only load one table at a time by default, so the task settings do not need to be modified.
  • C. Apply Full Load Settings to limit the number of engines to two.
  • D. Apply Full Load Tuning to read a maximum number of tables not greater than two.
Answer: C


Explanation:
In a scenario where an operative database can commit only two engines to Qlik Replicate for initial loads, the task settings should be modified so that no more than two tables are loaded at any given time. This is achieved by:
C. Apply Full Load Settings to limit the number of engines to two: This setting lets you specify the maximum number of tables loaded concurrently during the Full Load operation. By limiting this number to two, you ensure that the operative database's capacity is not exceeded.
The other options are not suitable because:
A. Apply Change Processing Tuning: This option is related to the CDC (Change Data Capture) phase, not the initial Full Load phase. Increasing the apply batched changes interval would not limit the number of engines used during the Full Load.
B. Qlik Replicate tasks only load one table at a time by default: This statement is not accurate; Qlik Replicate can load multiple tables concurrently, depending on the task settings.
D. Apply Full Load Tuning to read a maximum number of tables not greater than two: While this option seems similar to the correct answer, it is not a recognized setting in Qlik Replicate's configuration options.
For detailed guidance on configuring task settings in Qlik Replicate, particularly for managing the number of concurrent loads, refer to the official Qlik community article "Qlik Replicate Task Configuration Options".


Question 15

Which is the default port of Qlik Replicate Server on Linux?

  • A. 3550
  • B. 443
  • C. 80
  • D. 3552
Answer: D


Explanation:
The default port for Qlik Replicate Server on Linux is 3552. This port is used for outbound and inbound communication unless it is overridden during installation or configuration. The official Qlik Replicate documentation states that "Port 3552 (the default rest port) needs to be opened for outbound and inbound communication, unless you override it as described below," confirming that 3552 is the default port to consider during the installation and setup of Qlik Replicate on a Linux system.
The other options do not correspond to the default port for Qlik Replicate Server on Linux:
A. 3550: This is not listed as the default port in the documentation.
B. 443: This is commonly the default port for HTTPS traffic, but not for Qlik Replicate Server.
C. 80: This is commonly the default port for HTTP traffic, but not for Qlik Replicate Server.
Therefore, the verified answer is D. 3552, as it is the port designated for Qlik Replicate Server on Linux according to the official documentation.
