Overview
This is a streamlined guide to creating a new Sources configuration with the Adobe Streaming Sources SDK and testing it in your Adobe Experience Platform (AEP) sandbox. The target audience is an Adobe technology (ISV) partner who wants to publish a Sources connector in the AEP Sources Catalog for use by all Adobe RT-CDP customers. This guide does not cover every configuration option available for building a customized Source connector that works with your APIs; for that, see the full Streaming Sources SDK documentation. Instead, it takes you quickly through all the steps, giving you a solid overview of the technology and the process so that you can confidently scope and resource your own connector project.
Prerequisites
To successfully complete the steps in this guide you will need the following:
- Membership in the Adobe Technology Partner Program (formerly called Adobe Exchange program)
- Access to an AEP Sandbox environment
- An Adobe Developer Console project with the "Experience Platform API" added. The project's API credentials must be assigned to a Role by an administrator; this step may require opening a TPP support ticket.
- Access to Postman and basic experience with it
Postman Prep
In this guide, we will execute several API calls using Postman. Before we do that, we need to import a Postman collection and a Postman environment. We will also need to make a few additions to the environment.
First, download the How to Create a Streaming Source – Postman Collection (attached) and import it into your workspace.
Second, download the Postman environment file from the Adobe developer console project UI and import it. (Figure 1)
Figure 1: Developer Console page to download environment file for Postman
Third, we need to augment the environment with a few additional values.
- Add a new environment variable called "PLATFORM_GATEWAY" with the value "https://platform.adobe.io"
- Add a new environment variable called "SANDBOX_NAME" with the value of your sandbox name. The sandbox name is visible in the URL when you are logged in to the AEP UI; it is the “sname” value (for example, https://experience.adobe.com/#/@exchangeinternal/sname:sbx-a/platform...)
- Add a new environment variable called "UNIQUE_NAME" to provide a unique label for each run through this tutorial. Initially you can give it a value like “Luma ISV”; if you run through the tutorial a second time you could use “Luma ISV2”.
Figure 2: Postman Environment with additional fields ‘PLATFORM_GATEWAY’, ‘SANDBOX_NAME’, and ‘UNIQUE_NAME’
Generate an Access Token
Now that Postman is set up, let’s generate an access token so that we can make all the API calls in the remaining sections.
Open the “Create a Streaming Source...” collection in your Postman app. Expand the first subfolder, named “Authentication”, and you will see one request. Run this request.
You’ll know that this was successful if you see a 200 response and if you now have a ‘Current value’ for your ACCESS_TOKEN environment variable.
Figure 3: Postman Environment showing generated Access Token’s Current Value
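If you prefer to script this step rather than use Postman, the call is a standard OAuth token exchange against Adobe IMS. The sketch below is a minimal, hedged example that assumes an OAuth Server-to-Server credential; the client ID, client secret, and scope values are placeholders for what your Developer Console project provides, and the Postman request in the collection performs the equivalent exchange.

```python
# Minimal sketch of the access-token request, assuming an OAuth Server-to-Server
# credential. CLIENT_ID, CLIENT_SECRET, and SCOPES are placeholders taken from
# your Developer Console project; adjust them to match your credential.
import requests

CLIENT_ID = "<client id from the Developer Console project>"
CLIENT_SECRET = "<client secret from the Developer Console project>"
SCOPES = "openid,AdobeID,read_organizations,additional_info.projectedProductContext,session"

resp = requests.post(
    "https://ims-na1.adobelogin.com/ims/token/v3",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": SCOPES,
    },
)
resp.raise_for_status()
access_token = resp.json()["access_token"]  # equivalent of the ACCESS_TOKEN environment variable
```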
Sandbox Setup
Now we will be creating a Schema, Dataset, and Identity. These will be needed for the end-to-end testing later on.
The Schema is a blueprint for the data to be stored in Adobe Experience Platform. A Schema is made up of a Class and zero or more Field Groups.
A Dataset is a storage and management construct for collecting data. A Dataset is always linked with a schema.
An Identity is data that is unique to an entity, typically an individual person. An identity, such as a log-in ID, ECID, or loyalty ID, is also referred to as a “known identity”.
We can create all of the above by running the “Sandbox Setup” folder of the Postman collection. Right-click the folder and select “Run folder”. (Make sure you have generated an access token first, or this step will not work!)
Figure 4: Postman workspace showing how to run the ‘Sandbox Setup’ folder
Figure 5: Postman workspace showing options on running the ‘Sandbox Setup’ folder
Verify that all four calls returned a 200 or 201 success response.
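For reference, a hedged sketch of one of these calls made outside Postman is shown below: creating the dataset through the Catalog Service API. The header names follow standard Platform API conventions, and all placeholder values (the schema $id, organization ID, and so on) stand in for what the Sandbox Setup folder generates for you.

```python
# Hedged sketch of the "create dataset" call from the Sandbox Setup folder,
# assuming the schema has already been created. All angle-bracket values are
# placeholders; the Postman folder fills these in automatically.
import requests

PLATFORM_GATEWAY = "https://platform.adobe.io"
headers = {
    "Authorization": "Bearer <ACCESS_TOKEN>",
    "x-api-key": "<client id from the Developer Console project>",
    "x-gw-ims-org-id": "<your organization ID>",
    "x-sandbox-name": "<SANDBOX_NAME>",
    "Content-Type": "application/json",
}

payload = {
    "name": "Luma ISV Dataset",  # hypothetical name; the collection derives it from UNIQUE_NAME
    "schemaRef": {
        "id": "<$id of the schema created by the previous call>",
        "contentType": "application/vnd.adobe.xed-full+json;version=1",
    },
}

resp = requests.post(
    f"{PLATFORM_GATEWAY}/data/foundation/catalog/dataSets",
    headers=headers,
    json=payload,
)
print(resp.status_code, resp.json())  # expect 201 and the new dataset ID
```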
That’s it! We are now ready to create the connector.
Create a Connection Spec
A connection specification represents the structure of a source. It contains information about the source's attributes and authentication requirements, and it defines how source data can be explored and inspected.
Step 1: Navigate to the Create a Streaming Source Folder.
Step 2: Move to the Create Source Configs folder and open the POST request Create Connection Spec.
Figure 6: Postman workspace showing the Create Connection Spec call
Change the label.text and description.text of the connector to your desired values, or leave them as they are and the UNIQUE_NAME will be used. As we will see in a moment, these values appear in the Sources Catalog UI.
Figure 7: Postman body showing where to edit the description and label values
Run the POST request and you should receive a 200 response.
Copy the Connection Spec "id" value from the response received. We will be using this ID in a following step.
Figure 8: Postman response highlighting the ‘id’ value
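For completeness, a hedged sketch of the same call made outside Postman is shown below. It assumes the Flow Service connectionSpecs endpoint used by the collection, and it loads the request body from a hypothetical local export of the Postman body rather than reproducing it here; edit the body's label.text and description.text as described above before sending.

```python
# Hedged sketch of the Create Connection Spec call. The request body is taken
# verbatim from the Postman request (exported to a hypothetical local JSON file);
# only the display values are edited, as described above.
import json
import requests

PLATFORM_GATEWAY = "https://platform.adobe.io"
headers = {
    "Authorization": "Bearer <ACCESS_TOKEN>",
    "x-api-key": "<client id from the Developer Console project>",
    "x-gw-ims-org-id": "<your organization ID>",
    "x-sandbox-name": "<SANDBOX_NAME>",
    "Content-Type": "application/json",
}

with open("create_connection_spec_body.json") as f:  # hypothetical export of the Postman body
    payload = json.load(f)

resp = requests.post(
    f"{PLATFORM_GATEWAY}/data/foundation/flowservice/connectionSpecs",
    headers=headers,
    json=payload,
)
resp.raise_for_status()
connection_spec_id = resp.json()["id"]  # keep this value; it is needed in the next section
print(connection_spec_id)
```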
Update the Flow Spec
Step 1: Navigate to Get Flow Specification request in the Create Source Configs folder.
Step 2: Run the Get Flow Specification request.
Figure 9: Postman collection showing the Get Flow Specification call
Step 3: The response contains a list of Source Connection Spec IDs, as shown below. Copy these values and check whether the Connection Spec ID returned by the Create Connection Spec call is already among them. If it is not, append your Connection Spec ID to the list of Source Connection Spec IDs.
Figure 10: Postman request body highlighting the connection spec ids
Step 4: Navigate to the Update Flow Specification request in the same folder.
Step 5: Update the request body with the values of SourceConnectionSpecIds copied from the previous step and click "Send" to make the PUT request. You should receive a 200 response.
Figure 11: Postman request body highlighting where to paste Connection Spec IDs
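The same get-check-append-update logic can also be scripted. The sketch below is a hedged outline of steps 3 through 5; the flow spec ID is a placeholder for the one targeted by the Get Flow Specification request, and the exact PUT body should mirror what the Postman request sends.

```python
# Hedged sketch of steps 3-5: fetch the flow spec, append the new connection spec
# ID to sourceConnectionSpecIds if it is missing, then PUT the updated spec back.
# FLOW_SPEC_ID and connection_spec_id are placeholders.
import requests

PLATFORM_GATEWAY = "https://platform.adobe.io"
FLOW_SPEC_ID = "<id of the flow spec returned by Get Flow Specification>"
connection_spec_id = "<id returned by Create Connection Spec>"
headers = {
    "Authorization": "Bearer <ACCESS_TOKEN>",
    "x-api-key": "<client id from the Developer Console project>",
    "x-gw-ims-org-id": "<your organization ID>",
    "x-sandbox-name": "<SANDBOX_NAME>",
    "Content-Type": "application/json",
}

# Step 3: get the current flow spec and its list of source connection spec IDs.
resp = requests.get(
    f"{PLATFORM_GATEWAY}/data/foundation/flowservice/flowSpecs/{FLOW_SPEC_ID}",
    headers=headers,
)
resp.raise_for_status()
data = resp.json()
flow_spec = data["items"][0] if "items" in data else data  # responses may be wrapped in "items"

spec_ids = flow_spec["sourceConnectionSpecIds"]
if connection_spec_id not in spec_ids:
    spec_ids.append(connection_spec_id)  # only append if it is not already present

# Steps 4-5: send the updated spec back; the Postman request body is the source of truth here.
resp = requests.put(
    f"{PLATFORM_GATEWAY}/data/foundation/flowservice/flowSpecs/{FLOW_SPEC_ID}",
    headers=headers,
    json=flow_spec,
)
print(resp.status_code)  # expect 200
```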
Create Data Flow in Sources UI
By running the Create Connection Spec and Update Flow Specification calls, we have now created a streaming Source Connector.
We will now configure the dataflow in the UI. These are the same steps that each AEP customer will take to activate and configure the Source Connector.
Log in to platform.adobe.com and make sure you have selected the correct provisioned sandbox. In this example, we are on the Adobe Exchange Partner Team org and Sandbox A.
Figure 12: Adobe Experience Platform UI showing current Org (Adobe Exchange Partner Team) and Sandbox (Sandbox A).
Navigate to Sources in the left-hand navigation under Connections and select the Source card that was created in the previous step.
The label we gave the connector is Luma ISV. Click “Set up”.
Figure 13: AEP UI of Sources Catalog, showing the ‘Luma ISV’ Source that was created earlier in this tutorial.
Click “Upload files” and upload the sample JSON file (attached).
Figure 14: AEP UI of Sources configuration menu, showing the ‘Select Data’ step.
Figure 15: AEP UI of Sources configuration menu, showing the sample schema that is created when uploading a file.
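The actual sample file is provided as an attachment; the snippet below is only an illustrative guess at the shape of one record, using the fields referenced later in the mapping step (name.firstName, name.lastName, personalEmail.address).

```python
# Illustrative only -- this is not the attached sample file, just a guess at the
# shape of a single record using the fields referenced in the mapping step.
import json

sample_record = {
    "name": {"firstName": "Jane", "lastName": "Doe"},
    "personalEmail": {"address": "jane.doe@example.com"},
}

# Write a one-record file of this shape (hypothetical filename).
with open("sample_source_data.json", "w") as f:
    json.dump([sample_record], f, indent=2)
```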
Click “Next”. For the Target dataset, choose “Existing dataset” and select the dataset that was created earlier in the Sandbox Setup step.
Ensure that the Profile dataset toggle is enabled and provide a valid name for the dataflow.
Click “Next”
Figure 16: AEP UI of Sources configuration menu, showing the Dataset that data will flow into. Note the ‘Profile dataset’ toggle is enabled.
On the Mapping screen that appears, make sure that the Source and Target fields are mapped correctly.
Then click “Next”.
Figure 17: AEP UI of Sources configuration menu, showing the ‘Mapping’ step and personalEmail fields.
Figure 18: AEP UI of Sources configuration menu, showing the ‘Review’ step.
Clicking "Finish" on the Review step will complete the setup of the connector.
We are now ready to ingest data with this dataflow from our new streaming Source Connector.
End-to-end Testing
Now, we will ingest sample profiles into the platform using the streaming ingestion API.
Navigate to Sources and click on the Dataflows tab. Select the dataflow you created in the previous step and take note of the Dataflow ID and the Streaming endpoint.
Figure 19: AEP UI showing the ‘Dataflows’ tab of the source just configured. Note that a Dataflow is highlighted, and the Dataflow ID and Streaming Endpoint properties in the right-side menu.
Open the Postman collection, go to the End-to-end testing folder, and open the Import: Stream to PROFILE API call.
Copy the Streaming endpoint URL into the request URL of the Stream to PROFILE call, and paste the Dataflow ID into the “x-adobe-flow-id” header. After making these changes, run the API call.
Figure 20: Postman 'End-to-end testing' folder and 'Import: Stream to PROFILE' request
A 200 response means the record was successfully sent to the platform.
Note: you can edit the record details (name.firstName, name.lastName, personalEmail.address, etc.) and run this call multiple times to create additional profiles.
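If you want to drive this ingestion from a script rather than Postman, a hedged sketch is shown below. The streaming endpoint URL and dataflow ID are the two values copied from the Dataflows tab; the payload is a simplified stand-in for the body of the Import: Stream to PROFILE request, which remains the source of truth for the exact envelope.

```python
# Hedged sketch of the streaming ingestion call. STREAMING_ENDPOINT and DATAFLOW_ID
# are the values copied from the Dataflows tab; the payload is a simplified
# stand-in for the body of the Postman "Import: Stream to PROFILE" request.
import requests

STREAMING_ENDPOINT = "<Streaming endpoint copied from the Dataflows tab>"
DATAFLOW_ID = "<Dataflow ID copied from the Dataflows tab>"

headers = {
    "Content-Type": "application/json",
    "x-adobe-flow-id": DATAFLOW_ID,
}

# Edit these record details and re-run to create additional profiles.
payload = {
    "name": {"firstName": "Jane", "lastName": "Doe"},
    "personalEmail": {"address": "jane.doe@example.com"},
}

resp = requests.post(STREAMING_ENDPOINT, headers=headers, json=payload)
print(resp.status_code)  # 200 means the record was accepted by the platform
```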
Log in to platform.adobe.com and navigate to the Profiles section in the left-hand navigation pane. Choose Email as the identity namespace and enter the email address that was ingested in the API request above (you can copy it from the request body); a list of matching Profiles appears.
Figure 21: AEP Profiles, searching for a specific profile
Click on the Profile ID to see the Attributes, Events and Audience Membership of the newly created Profile.
Figure 22: AEP Profile detail on sample profile
Click the Segment Membership tab to see all the segments this profile qualifies for:
Figure 23: AEP Sample Profile audience membership
Note: it may take some time, up to 24 hours, for the first few profiles to appear. Newly created datasets, dataflows, and audiences have some initial latency.
Summary and Next Steps
In this guide you have created and tested a new Source connector within your AEP sandbox. Once published, this connector would be available to AEP customers that have access to the public Sources Catalog. They can then use the connector to stream profile data from your platform into AEP.
However, before you offer this Source connector to AEP customers, there are a few more steps to follow:
1. Create and customize a connector that better matches your needs
You will need to go through this exercise again and create a connector that matches the requirements of your product. You can modify the Postman requests provided, adjusting the naming, authentication, data mapping, and so on, to configure a successful connector for your customers.
2. Communicate your final connection specification to Adobe.
Additionally, you will need to request access and submit a Git pull request to a private Git repository (provided by Adobe) that will communicate final settings on configs (icon, description, Flow Specification, etc.) to the Streaming Sources SDK team for review.
3. Submit your source for review.
After you have configured, tested, and documented your source, you are ready to submit it. Read more in the article on testing and submitting your streaming source for review.
Attachments: