
Connect to streaming destinations and activate data in Adobe's Real-time Customer Data Platform using APIs

The Amazon Kinesis and Azure Event Hubs destinations in Adobe Real-time CDP are currently in beta. The documentation and the functionality are subject to change.
This tutorial demonstrates how to use API calls to connect to your Adobe Experience Platform data, create a connection to a streaming cloud storage destination (Amazon Kinesis or Azure Event Hubs), create a dataflow to your newly created destination, and activate data to it.
This tutorial uses the Amazon Kinesis destination in all examples, but the steps are identical for Azure Event Hubs.
If you prefer to use the user interface in Adobe's Real-time CDP to connect to a destination and activate data, see the Connect a destination and Activate profiles and segments to a destination tutorials.

Get started

This guide requires a working understanding of the following components of Adobe Experience Platform:
  • Experience Data Model (XDM) System : The standardized framework by which Experience Platform organizes customer experience data.
  • Catalog Service : Catalog is the system of record for data location and lineage within Experience Platform.
  • Sandboxes : Experience Platform provides virtual sandboxes which partition a single Platform instance into separate virtual environments to help develop and evolve digital experience applications.
The following sections provide additional information that you will need to know in order to activate data to streaming destinations in Adobe Real-time CDP.

Gather required credentials

To complete the steps in this tutorial, you should have the following credentials ready, depending on the type of destinations that you are connecting and activating segments to.
  • For Amazon Kinesis connections: accessKeyId , secretKey , region or connectionUrl
  • For Azure Event Hubs connections: sasKeyName , sasKey , namespace

Reading sample API calls

This tutorial provides example API calls to demonstrate how to format your requests. These include paths, required headers, and properly formatted request payloads. Sample JSON returned in API responses is also provided. For information on the conventions used in documentation for sample API calls, see the section on how to read example API calls in the Experience Platform troubleshooting guide.

Gather values for required and optional headers

In order to make calls to Platform APIs, you must first complete the authentication tutorial . Completing the authentication tutorial provides the values for each of the required headers in all Experience Platform API calls, as shown below:
  • Authorization: Bearer {ACCESS_TOKEN}
  • x-api-key: {API_KEY}
  • x-gw-ims-org-id: {IMS_ORG}
Resources in Experience Platform can be isolated to specific virtual sandboxes. In requests to Platform APIs, you can specify the name and ID of the sandbox that the operation will take place in. These are optional parameters.
  • x-sandbox-name: {SANDBOX_NAME}
All requests that contain a payload (POST, PUT, PATCH) require an additional media type header:
  • Content-Type: application/json
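
If you plan to run the cURL examples in this tutorial from a shell, you can export these values once and substitute them into each call. A minimal sketch, using illustrative variable names:

# Illustrative only: replace with the values you obtained from the authentication tutorial
export ACCESS_TOKEN='<access token>'
export API_KEY='<API key / client ID>'
export IMS_ORG='<IMS organization ID>'
export SANDBOX_NAME='<sandbox name>'

# Each call in this tutorial can then pass the headers as:
#   --header "Authorization: Bearer $ACCESS_TOKEN" \
#   --header "x-api-key: $API_KEY" \
#   --header "x-gw-ims-org-id: $IMS_ORG" \
#   --header "x-sandbox-name: $SANDBOX_NAME"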

Swagger documentation

You can find accompanying reference documentation for all the API calls in this tutorial in Swagger. See the Flow Service API documentation on Adobe.io . We recommend that you use this tutorial and the Swagger documentation page in parallel.

Get the list of available streaming destinations

As a first step, decide which streaming destination to activate data to. Then, perform the following GET request to the connectionSpecs endpoint to retrieve a list of available destinations that you can connect and activate segments to:
API format
GET /connectionSpecs

Request
curl --location --request GET 'https://platform.adobe.io/data/foundation/flowservice/connectionSpecs' \
--header 'accept: application/json' \
--header 'x-gw-ims-org-id: {IMS_ORG}' \
--header 'x-api-key: {API_KEY}' \
--header 'x-sandbox-name: {SANDBOX_NAME}' \
--header 'Authorization: Bearer {ACCESS_TOKEN}'


Response
A successful response contains a list of available destinations and their unique identifiers ( id ). Store the value of the destination that you plan to use, as it will be required in further steps. For example, if you want to connect and deliver segments to Amazon Kinesis or Azure Event Hubs, look for the following snippet in the response:
{
    "id": "86043421-563b-46ec-8e6c-e23184711bf6",
    "name": "Amazon Kinesis",
    ...
}

{
    "id": "bf9f5905-92b7-48bf-bf20-455bc6b60a4e",
    "name": "Azure Event Hubs",
    ...
}
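
Optionally, if you have jq installed, you can filter the full response down to just the streaming destinations. This is a convenience sketch, assuming the list is returned under an items array as with other Flow Service list responses:

curl --silent --location --request GET 'https://platform.adobe.io/data/foundation/flowservice/connectionSpecs' \
--header 'accept: application/json' \
--header 'x-gw-ims-org-id: {IMS_ORG}' \
--header 'x-api-key: {API_KEY}' \
--header 'x-sandbox-name: {SANDBOX_NAME}' \
--header 'Authorization: Bearer {ACCESS_TOKEN}' \
| jq '.items[] | select(.name == "Amazon Kinesis" or .name == "Azure Event Hubs") | {id, name}'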

Connect to your Experience Platform data

Next, you must connect to your Experience Platform data, so you can export profile data and activate it in your preferred destination. This consists of two substeps which are described below.
  1. First, you must perform a call to authorize access to your data in Experience Platform, by setting up a base connection.
  2. Then, using the base connection ID, you will make another call in which you create a source connection, which establishes the connection to your Experience Platform data.

Authorize access to your data in Experience Platform

API format
POST /connections

Request
curl --location --request POST 'https://platform.adobe.io/data/foundation/flowservice/connections' \
--header 'Authorization: Bearer {ACCESS_TOKEN}' \
--header 'x-api-key: {API_KEY}' \
--header 'x-gw-ims-org-id: {IMS_ORG}' \
--header 'x-sandbox-name: {SANDBOX_NAME}' \
--header 'Content-Type: application/json' \
--data-raw '{
            "name": "Base connection to Experience Platform",
            "description": "This call establishes the connection to Experience Platform data",
            "connectionSpec": {
                "id": "{CONNECTION_SPEC_ID}",
                "version": "1.0"
            }
}'


  • {CONNECTION_SPEC_ID} : Use the connection spec ID for Unified Profile Service - 8a9c3494-9708-43d7-ae3f-cda01e5030e1 .
Response
A successful response contains the base connection's unique identifier ( id ). Store this value as it is required in the next step to create the source connection.
{
    "id": "1ed86558-59b5-42f7-9865-5859b552f7f4"
}
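
If you are scripting these calls, you can capture the returned identifier directly, for example with jq (a convenience sketch, assuming the response above was saved to response.json):

# Store the base connection ID for use in the source connection call below
BASE_CONNECTION_ID=$(jq -r '.id' response.json)
echo "$BASE_CONNECTION_ID"   # e.g. 1ed86558-59b5-42f7-9865-5859b552f7f4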

Connect to your Experience Platform data

API format
POST /sourceConnections

Request
curl --location --request POST 'https://platform.adobe.io/data/foundation/flowservice/sourceConnections' \
--header 'Authorization: Bearer {ACCESS_TOKEN}' \
--header 'x-api-key: {API_KEY}' \
--header 'x-gw-ims-org-id: {IMS_ORG}' \
--header 'x-sandbox-name: {SANDBOX_NAME}' \
--header 'Content-Type: application/json' \
--data-raw '{
            "name": "Connecting to Unified Profile Service",
            "description": "Optional",
            "connectionSpec": {
                "id": "{CONNECTION_SPEC_ID}",
                "version": "1.0"
            },
            "baseConnectionId": "{BASE_CONNECTION_ID}",
            "data": {
                "format": "json"
            },
            "params" : {}
}'

  • {BASE_CONNECTION_ID} : Use the base connection ID you obtained in the previous step.
  • {CONNECTION_SPEC_ID} : Use the connection spec ID for Unified Profile Service - 8a9c3494-9708-43d7-ae3f-cda01e5030e1 .
Response
A successful response returns the unique identifier ( id ) for the newly created source connection to Unified Profile Service. This confirms that you have successfully connected to your Experience Platform data. Store this value as it is required in a later step.
{
    "id": "ed48ae9b-c774-4b6e-88ae-9bc7748b6e97"
}

Connect to streaming destination

In this step, you are setting up a connection to your desired streaming destination. This consists of two substeps which are described below.
  1. First, you must perform a call to authorize access to the streaming destination, by setting up a base connection.
  2. Then, using the base connection ID, you will make another call in which you create a target connection, which specifies the location in your storage account where the exported data will be delivered, as well as the format of the data that will be exported.

Authorize access to the streaming destination

API format
POST /connections

Request
curl --location --request POST 'https://platform.adobe.io/data/foundation/flowservice/connections' \
--header 'Authorization: Bearer {ACCESS_TOKEN}' \
--header 'x-api-key: {API_KEY}' \
--header 'x-gw-ims-org-id: {IMS_ORG}' \
--header 'x-sandbox-name: {SANDBOX_NAME}' \
--header 'Content-Type: application/json' \
--data-raw '{
    "name": "Connection for Amazon Kinesis/ Azure Event Hubs",
    "description": "your company's holiday campaign",
    "connectionSpec": {
        "id": "{_CONNECTION_SPEC_ID}",
        "version": "1.0"
    },
    "auth": {
        "specName": "{AUTHENTICATION_CREDENTIALS}",
        "params": { // use these values for Amazon Kinesis connections
            "accessKeyId": "{ACCESS_ID}",
            "secretKey": "{SECRET_KEY}",
            "region": "{REGION}"
        },
        "params": { // use these values for Azure Event Hubs connections
            "sasKeyName": "{SAS_KEY_NAME}",
            "sasKey": "{SAS_KEY}",
            "namespace": "{EVENT_HUB_NAMESPACE}"
        }        
    }
}'

  • {CONNECTION_SPEC_ID} : Use the connection spec ID you obtained in the step Get the list of available destinations .
  • {AUTHENTICATION_CREDENTIALS} : fill in the name of your streaming destination, e.g.: Amazon Kinesis authentication credentials or Azure Event Hubs authentication credentials .
  • {ACCESS_ID} : For Amazon Kinesis connections. Your access ID for your Amazon Kinesis storage location.
  • {SECRET_KEY} : For Amazon Kinesis connections. Your secret key for your Amazon Kinesis storage location.
  • {REGION} : For Amazon Kinesis connections. The region in your Amazon Kinesis account where Adobe Real-time CDP will stream your data.
  • {SAS_KEY_NAME} : For Azure Event Hubs connections. Fill in your SAS key name. Learn about authenticating to Azure Event Hubs with SAS keys in the Microsoft documentation .
  • {SAS_KEY} : For Azure Event Hubs connections. Fill in your SAS key. Learn about authenticating to Azure Event Hubs with SAS keys in the Microsoft documentation .
  • {EVENT_HUB_NAMESPACE} : For Azure Event Hubs connections. Fill in the Azure Event Hubs namespace where Adobe Real-time CDP will stream your data. For more information, see Create an Event Hubs namespace in the Microsoft documentation.
Response
A successful response contains the base connection's unique identifier ( id ). Store this value as it is required in the next step to create a target connection.
{
    "id": "1ed86558-59b5-42f7-9865-5859b552f7f4"
}

Specify storage location and data format

API format
POST /targetConnections

Request
curl --location --request POST 'https://platform.adobe.io/data/foundation/flowservice/targetConnections' \
--header 'Authorization: Bearer {ACCESS_TOKEN}' \
--header 'x-api-key: {API_KEY}' \
--header 'x-gw-ims-org-id: {IMS_ORG}' \
--header 'Content-Type: application/json' \
--data-raw '{
    "name": "Amazon Kinesis/ Azure Event Hubs target connection",
    "description": "Connection to Amazon Kinesis/ Azure Event Hubs",
    "baseConnection": "{BASE_CONNECTION_ID}",
    "connectionSpec": {
        "id": "{CONNECTION_SPEC_ID}",
        "version": "1.0"
    },
    "data": {
        "format": "json",
    },
    "params": { // use these values for Amazon Kinesis connections
        "stream": "{NAME_OF_DATA_STREAM}", 
        "region": "{REGION}"
    },
    "params": { // use these values for Azure Event Hubs connections
        "eventHubName": "{EVENT_HUB_NAME}"
    }
}'

  • {BASE_CONNECTION_ID} : Use the base connection ID you obtained in the step above.
  • {CONNECTION_SPEC_ID} : Use the connection spec ID you obtained in the step Get the list of available destinations .
  • {NAME_OF_DATA_STREAM} : For Amazon Kinesis connections. Provide the name of your existing data stream in your Amazon Kinesis account. Adobe Real-time CDP will export data to this stream.
  • {REGION} : For Amazon Kinesis connections. The region in your Amazon Kinesis account where Adobe Real-time CDP will stream your data.
  • {EVENT_HUB_NAME} : For Azure Event Hubs connections. Fill in the Azure Event Hub name where Adobe Real-time CDP will stream your data. For more information, see Create an event hub in the Microsoft documentation.
Response
A successful response returns the unique identifier ( id ) for the newly created target connection to your streaming destination. Store this value as it is required in later steps.
{
    "id": "12ab90c7-519c-4291-bd20-d64186b62da8"
}

Create a data flow

Using the IDs you obtained in the previous steps, you can now create a dataflow between your Experience Platform data and the destination where you will activate data to. Think of this step as constructing the pipeline, through which data will later flow, between Experience Platform and your desired destination.
To create a dataflow, perform the following POST request, providing the values mentioned below within the payload.
API format
POST /flows

Request
curl -X POST \
'https://platform.adobe.io/data/foundation/flowservice/flows' \
-H 'Authorization: Bearer {ACCESS_TOKEN}' \
-H 'x-api-key: {API_KEY}' \
-H 'x-gw-ims-org-id: {IMS_ORG}' \
-H 'x-sandbox-name: {SANDBOX_NAME}' \
-H 'Content-Type: application/json' \
-d  '{
   
        "name": "Create dataflow to Amazon Kinesis/ Azure Event Hubs",
        "description": "This operation creates a dataflow to Amazon Kinesis/ Azure Event Hubs",
        "flowSpec": {
            "id": "{FLOW_SPEC_ID}",
            "version": "1.0"
        },
        "sourceConnectionIds": [
            "{SOURCE_CONNECTION_ID}"
        ],
        "targetConnectionIds": [
            "{TARGET_CONNECTION_ID}"
        ]
    }'

  • {FLOW_SPEC_ID} : The flow spec ID for profile based destinations is 71471eba-b620-49e4-90fd-23f1fa0174d8 . Use this value in the call.
  • {SOURCE_CONNECTION_ID} : Use the source connection ID you obtained in the step Connect to your Experience Platform .
  • {TARGET_CONNECTION_ID} : Use the target connection ID you obtained in the step Connect to streaming destination .
Response
A successful response returns the ID ( id ) of the newly created dataflow and an etag . Note down both values, as you will need them in the next step to activate segments.
{
    "id": "8256cfb4-17e6-432c-a469-6aedafb16cd5",
    "etag": "8256cfb4-17e6-432c-a469-6aedafb16cd5"
}
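
If you are scripting the calls, both values can be captured in the same way (a convenience sketch, assuming the response above was saved to flow.json and jq is available). Note that the If-Match header in the next step expects the etag wrapped in double quotes:

DATAFLOW_ID=$(jq -r '.id' flow.json)
ETAG=$(jq -r '.etag' flow.json)
# The activation call below then passes:
#   --header "If-Match: \"$ETAG\""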

Activate data to your new destination

Having created all the connections and the dataflow, you can now activate your profile data to the streaming platform. In this step, you select which segments and which profile attributes to send to the destination, and you schedule and send the data.
To activate segments to your new destination, you must perform a JSON PATCH operation, similar to the example below. You can activate multiple segments and profile attributes in one call. To learn more about JSON PATCH, see the RFC specification .
API format
PATCH /flows/{DATAFLOW_ID}

Request
curl --location --request PATCH 'https://platform.adobe.io/data/foundation/flowservice/flows/{DATAFLOW_ID}' \
--header 'Authorization: Bearer {ACCESS_TOKEN}' \
--header 'x-api-key: {API_KEY}' \
--header 'x-gw-ims-org-id: {IMS_ORG}' \
--header 'Content-Type: application/json' \
--header 'x-sandbox-name: {SANDBOX_NAME}' \
--header 'If-Match: "{ETAG}"' \
--data-raw '[
    {
        "op": "add",
        "path": "/transformations/0/params/segmentSelectors/selectors/-",
        "value": {
            "type": "PLATFORM_SEGMENT",
            "value": {
                "name": "Name of the segment that you are activating",
                "description": "Description of the segment that you are activating",
                "id": "{SEGMENT_ID}"
            }
        }
    },
        {
        "op": "add",
        "path": "/transformations/0/params/segmentSelectors/selectors/-",
        "value": {
            "type": "PLATFORM_SEGMENT",
            "value": {
                "name": "Name of the segment that you are activating",
                "description": "Description of the segment that you are activating",
                "id": "{SEGMENT_ID}"
            }
        }
    },
        {
        "op": "add",
        "path": "/transformations/0/params/profileSelectors/selectors/-",
        "value": {
            "type": "JSON_PATH",
            "value": {
                "operator": "EXISTS",
                "path": "{PROFILE_ATTRIBUTE}"
            }
        }
    },
        {
        "op": "add",
        "path": "/transformations/0/params/profileSelectors/selectors/-",
        "value": {
            "type": "JSON_PATH",
            "value": {
                "operator": "EXISTS",
                "path": "{PROFILE_ATTRIBUTE}"
            }
        }
    }
]'

  • {DATAFLOW_ID} : Use the dataflow ID you obtained in the previous step.
  • {ETAG} : Use the etag that you obtained in the previous step.
  • {SEGMENT_ID} : Provide the segment ID that you want to export to this destination. To retrieve segment IDs for the segments that you want to activate, go to https://www.adobe.io/apis/experienceplatform/home/api-reference.html#/, select Segmentation Service API in the left navigation menu, and look for the GET /segment/jobs operation. A sample call is sketched after this list.
  • {PROFILE_ATTRIBUTE} : For example, personalEmail.address or person.lastName
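
As a convenience, a call to list segment jobs and read their segment IDs might look like the sketch below; the base path shown here is an assumption, so confirm it against the Segmentation Service API reference linked above before relying on it.

curl --location --request GET 'https://platform.adobe.io/data/core/ups/segment/jobs' \
--header 'Authorization: Bearer {ACCESS_TOKEN}' \
--header 'x-api-key: {API_KEY}' \
--header 'x-gw-ims-org-id: {IMS_ORG}' \
--header 'x-sandbox-name: {SANDBOX_NAME}'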
Response
Look for a 202 Accepted response. No response body is returned. To validate that the request was correct, see the next step, Validate the data flow.

Validate the data flow

As a final step in the tutorial, you should validate that the segments and profile attributes have indeed been correctly mapped to the data flow.
To validate this, perform the following GET request:
API format
GET /flows/{DATAFLOW_ID}

Request
curl --location --request GET 'https://platform.adobe.io/data/foundation/flowservice/flows/{DATAFLOW_ID}' \
--header 'Authorization: Bearer {ACCESS_TOKEN}' \
--header 'x-api-key: {API_KEY}' \
--header 'x-gw-ims-org-id: {IMS_ORG}' \
--header 'x-sandbox-name: {SANDBOX_NAME}' \
--header 'If-Match: "{ETAG}"' 

  • {DATAFLOW_ID} : Use the dataflow ID from the previous step.
  • {ETAG} : Use the etag from the previous step.
Response
The returned response should include, in the transformations parameter, the segments and profile attributes that you submitted in the previous step. A sample transformations parameter in the response might look like this:
"transformations": [
    {
        "name": "GeneralTransform",
        "params": {
            "profileSelectors": {
                        "selectors": [
                            {
                                "type": "JSON_PATH",
                                "value": {
                                    "path": "personalEmail.address",
                                    "operator": "EXISTS"
                                }
                            },
                            {
                                "type": "JSON_PATH",
                                "value": {
                                    "path": "person.lastname",
                                    "operator": "EXISTS"
                                }
                            }
                        ]
                    },
            "segmentSelectors": {
                "selectors": [
                    {
                        "type": "PLATFORM_SEGMENT",
                        "value": {
                            "name": "Men over 50",
                            "description": "",
                            "id": "72ddd79b-6b0a-4e97-a8d2-112ccd81bd02"
                        }
                    }
                ]
            }
        }
    }
],

Exported Data
In addition to the profile attributes and the segments selected in the step Activate data to your new destination , the exported data in Amazon Kinesis and Azure Event Hubs also includes information about the identity map. This represents the identities of the exported profiles (for example ECID, mobile ID, Google ID, email address). See an example below.
{
  "person": {
    "email": "yourstruly@adobe.con"
  },
  "segmentMembership": {
    "ups": {
      "72ddd79b-6b0a-4e97-a8d2-112ccd81bd02": {
        "lastQualificationTime": "2020-03-03T21:24:39Z",
        "status": "exited"
      },
      "7841ba61-23c1-4bb3-a495-00d695fe1e93": {
        "lastQualificationTime": "2020-03-04T23:37:33Z",
        "status": "existing"
      }
    }
  },
  "identityMap": {
    "ecid": [
      {
        "id": "14575006536349286404619648085736425115"
      },
      {
        "id": "66478888669296734530114754794777368480"
      }
    ],
    "email_lc_sha256": [
      {
        "id": "655332b5fa2aea4498bf7a290cff017cb4"
      },
      {
        "id": "66baf76ef9de8b42df8903f00e0e3dc0b7"
      }
    ]
  }
}
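
To spot-check that records are arriving in your Amazon Kinesis stream, you can read from it with the AWS CLI. The sketch below assumes a single-shard stream and reuses the placeholders from earlier steps; adjust the shard ID and region to your setup.

# Get an iterator for the first shard of the stream (placeholder names from earlier steps)
SHARD_ITERATOR=$(aws kinesis get-shard-iterator \
  --stream-name {NAME_OF_DATA_STREAM} \
  --shard-id shardId-000000000000 \
  --shard-iterator-type TRIM_HORIZON \
  --region {REGION} \
  --query 'ShardIterator' --output text)

# Fetch a batch of records; the Data field of each record is base64-encoded JSON like the sample above
aws kinesis get-records --shard-iterator "$SHARD_ITERATOR" --region {REGION}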



Next steps

By following this tutorial, you have successfully connected Real-time CDP to one of your preferred streaming destinations and set up a data flow to the respective destination. Outgoing data can now be used in the destination for customer analytics or any other data operations you may wish to perform. See the following pages for more details: