In this tutorial, you learn how to use an SAP Datasphere Replication Flow to replicate data from SAP Datasphere to the Kafka endpoint exposed by an Eventstream custom endpoint source. Once the data arrives in Eventstream, it can be used for alerting and reporting, or routed to various destinations in Fabric for further processing or analysis.
In this tutorial, you will:
- Create an eventstream and add a custom endpoint source
- Get the Kafka endpoint information from a custom endpoint source
- Create the Kafka connection in SAP Datasphere
- Set up the Replication Flow with the Kafka connection in SAP Datasphere
- Deploy and activate the replication flow to replicate the data to Eventstream
Prerequisites
- Get access to a workspace with Contributor or higher permissions where your eventstream is located.
- An SAP Datasphere account with ‘Premium Outbound’ support.
Create an eventstream
Navigate to the Fabric portal.
Select My workspace on the left navigation bar.
On the My workspace page, select + New item on the command bar.
On the New item page, search for Eventstream, and then select Eventstream.
In the New Eventstream window, enter a name for the eventstream, and then select Create.
Creation of the new eventstream in your workspace can take a few seconds. After the eventstream is created, you're directed to the main editor where you can start adding sources to the eventstream.
Get the Kafka endpoint information from an added custom endpoint source
To get the Kafka topic endpoint information from an eventstream, add a custom endpoint source to your eventstream. The Kafka connection endpoint is then readily available and exposed within the custom endpoint source.
To add a custom endpoint source, on the get-started page, select Use custom endpoint.
In the Custom endpoint dialog, enter a name for the custom source under Source name, and then select Add.
After you create the custom endpoint source, it's added to your eventstream on the canvas in edit mode. To enable the newly added custom endpoint source, select Publish.
After you successfully publish the eventstream, you can retrieve the Kafka endpoint details that are needed later to configure the Kafka connection in SAP Datasphere.
Select the custom endpoint source tile on the canvas. Then, in the bottom pane of the custom endpoint source node, select the Kafka tab. On the SAS Key Authentication page, you can get the following important Kafka endpoint information:
- Bootstrap servers
- Topic name
- Connection string (primary or secondary)
- Security protocol = SASL_SSL
- SASL mechanism = PLAIN
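For reference, these values map directly onto a standard Kafka client configuration. The following sketch (a minimal illustration using kafka-python parameter names; the server, topic, and connection string are hypothetical placeholders, not real values) shows how the endpoint details fit together:

```python
# Minimal sketch of how the Eventstream custom endpoint values map onto a
# Kafka client configuration (kafka-python parameter names). The server,
# topic, and connection string below are hypothetical placeholders - use
# the values shown on your custom endpoint's Kafka tab.
bootstrap_server = "<your-namespace>.servicebus.windows.net:9093"
topic_name = "es_<your-topic-name>"
connection_string = "Endpoint=sb://...;SharedAccessKeyName=...;SharedAccessKey=..."

kafka_config = {
    "bootstrap_servers": bootstrap_server,
    "security_protocol": "SASL_SSL",             # as shown in the custom endpoint
    "sasl_mechanism": "PLAIN",                   # as shown in the custom endpoint
    "sasl_plain_username": "$ConnectionString",  # literal string, not a substitution
    "sasl_plain_password": connection_string,    # primary or secondary connection string
}
```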
This information is used in the next steps when you configure the SAP Datasphere Kafka connection.
Create Kafka connection in SAP Datasphere
To define the Replication Flow in SAP Datasphere, you first need to create a Kafka connection with the Kafka endpoint information from the Eventstream custom endpoint source.
In the connection management tab of an SAP Datasphere space, select the Apache Kafka connection type in the connection creation wizard:
Configure the connection properties in the second step of the wizard with the Kafka information from Eventstream’s custom endpoint source:
- Kafka Brokers: the Bootstrap servers value from the Eventstream custom endpoint source.
- Authentication Type: select “User Name And Password.”
- Kafka SASL User Name: use constant value “$ConnectionString”.
- Kafka SASL Password: either the Connection string-primary key value or the Connection string-secondary key value from the Eventstream custom endpoint source.
- Replication Flows: “Enable”.
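The mapping between the Eventstream endpoint values and the SAP Datasphere connection fields can be summarized as follows. This is an illustrative sketch only (the endpoint values are hypothetical placeholders); the actual configuration is entered in the Datasphere connection wizard:

```python
# Illustrative mapping from Eventstream custom endpoint values to the
# SAP Datasphere Kafka connection fields. Placeholder values only; the
# real configuration is entered in the Datasphere connection wizard.
eventstream_endpoint = {
    "bootstrap_servers": "<namespace>.servicebus.windows.net:9093",       # hypothetical
    "connection_string_primary": "Endpoint=sb://...;SharedAccessKey=...", # hypothetical
}

datasphere_connection = {
    "Kafka Brokers": eventstream_endpoint["bootstrap_servers"],
    "Authentication Type": "User Name And Password",
    "Kafka SASL User Name": "$ConnectionString",  # literal constant value
    "Kafka SASL Password": eventstream_endpoint["connection_string_primary"],
    "Replication Flows": "Enable",
}
```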
After filling in all the needed information, select Save to create the new Kafka connection.
Create Replication Flow with Kafka connection as target
A Replication Flow allows you to replicate data from SAP sources to a target, and a Kafka connection is one of the supported targets. For more information, see Creating a Replication Flow | SAP Help Portal.
Follow the steps below to create a Replication Flow with a Kafka connection target.
Select Data Builder in the left navigation and select New Replication Flow to open the editor.
Select a source connection and a source container, and then add source objects. The supported source objects are listed in Creating a Replication Flow | SAP Help Portal.
In the target configuration, select the Kafka connection that you created in the previous step as the target connection:
IMPORTANT: By default, the target object name is automatically assigned from the source object name. The target object name is the name of the Kafka topic that is created to receive data from the source object. Because the Kafka topic for receiving data in Eventstream already exists, it's critical to rename the target object to the Topic name shown in the Eventstream custom endpoint source.
Deploy and activate the Replication Flow
After the Replication Flow is created and configured with the proper source and target objects, deploy and activate it by using the command buttons in the ribbon. You can check its status in the SAP Datasphere monitoring environment.
You can now verify the end-to-end flow via Eventstream to confirm whether data is being received. In the Fabric portal, open your eventstream, and select the default stream node - which appears as the central node displaying your eventstream's name - to preview the data present in your eventstream.
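As an additional check, you can also send a test event to the same Kafka endpoint yourself and watch it appear in the data preview. The sketch below assumes the third-party kafka-python package is installed and uses hypothetical placeholder values for the server, topic, and connection string:

```python
import json

def producer_config(bootstrap_server, connection_string):
    # SASL_SSL / PLAIN settings as shown on the custom endpoint's Kafka tab.
    return {
        "bootstrap_servers": bootstrap_server,
        "security_protocol": "SASL_SSL",
        "sasl_mechanism": "PLAIN",
        "sasl_plain_username": "$ConnectionString",  # literal string
        "sasl_plain_password": connection_string,
    }

def build_event(event_id):
    # Hypothetical sample payload; any JSON-serializable event works.
    return json.dumps({"id": event_id, "source": "smoke-test"}).encode("utf-8")

if __name__ == "__main__":
    # Requires: pip install kafka-python (assumption; not part of the tutorial)
    from kafka import KafkaProducer

    config = producer_config(
        "<namespace>.servicebus.windows.net:9093",        # placeholder
        "Endpoint=sb://...;SharedAccessKeyName=...;...",  # placeholder
    )
    producer = KafkaProducer(**config)
    producer.send("es_<your-topic-name>", build_event(1))  # placeholder topic
    producer.flush()
```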
Related content
This tutorial showed you how to use a Replication Flow to transfer data from SAP Datasphere to your eventstream via an Eventstream custom endpoint source. Once the data reaches the eventstream, you can process it and route it to different destinations for analysis, alerts, and reports. The following resources are helpful for further reference: