Salesforce Data Cloud simplifies the otherwise complex process of integrating and ingesting streaming data through easy-to-use tools.
Below, I document my work on a mini-project (my GitHub repo is linked at the end).
A Step by Step Walkthrough of How I Configured Streaming Ingestion
In this post I have used the terms CDP and Data Cloud interchangeably.
- Defined an Ingestion API connector in CDP Setup.
- Created and uploaded a YAML file describing the source database table (a MySQL database on Heroku). The YAML file is available in the Resources directory of this repository.
- CDP creates a Data Lake Object (DLO) when you upload the YAML schema – the YAML is essentially the table definition of the external database object.
- Created a Data Stream corresponding to the generated DLO.
- I now have a Data Lake Object where the incoming data from the MySQL database table will land.
- Created a custom Data Model Object (DMO).
- Created a data mapping from the Data Lake Object to my Data Model Object
- Defined the MuleSoft flow in Mule Anypoint Studio (the JAR file is available in the Resources directory of this repository).
- To authenticate the Mule connector with CDP, I defined a Connected App in CDP – the Connected App provides the Consumer Key and Consumer Secret you use in Mule.
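To make the YAML step concrete, here is a minimal sketch of what such a schema file can look like. The Ingestion API expects an OpenAPI 3.0-style schema; the object and field names below (`orders`, `order_id`, etc.) are hypothetical stand-ins for your own MySQL table definition, not the actual file from the repo.

```yaml
openapi: 3.0.3
components:
  schemas:
    orders:               # becomes the object name of the Data Lake Object
      type: object
      properties:
        order_id:
          type: string
        amount:
          type: number
        created_at:
          type: string
          format: date-time
```

When this schema is uploaded to the Ingestion API connector, CDP generates the corresponding DLO with these fields.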
The following illustrations explain the steps for loading data into CDP with the MuleSoft CDP Connector. I used the streaming feature of the Ingestion API; the same Mule connector can be leveraged for both the batch and streaming capabilities of the API.
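Under the hood, the Mule connector posts records to the Ingestion API's streaming endpoint as a JSON body with a top-level `data` array. As a rough sketch (the tenant endpoint, connector name, and object name below are hypothetical placeholders, not values from my project), the request the connector sends can be built like this:

```python
import json

# Hypothetical tenant endpoint -- find yours in CDP Setup.
TENANT_ENDPOINT = "https://mytenant.c360a.salesforce.com"

def build_stream_request(connector_api_name, object_name, records):
    """Build the URL and JSON body for a Data Cloud streaming ingest call.

    The streaming Ingestion API takes a POST whose body is a JSON object
    with a top-level "data" array of records matching the YAML schema.
    """
    url = (f"{TENANT_ENDPOINT}/api/v1/ingest/sources/"
           f"{connector_api_name}/{object_name}")
    body = json.dumps({"data": records})
    return url, body

url, body = build_stream_request(
    "MySQL_Orders",   # Ingestion API connector name (hypothetical)
    "orders",         # object name from the YAML schema (hypothetical)
    [{"order_id": "1001", "amount": 42.5,
      "created_at": "2023-01-15T10:00:00Z"}],
)
# POST `body` to `url` with an OAuth bearer token obtained using the
# Connected App's Consumer Key and Consumer Secret (e.g. via requests).
```

The actual HTTP call needs the OAuth token from the Connected App described above; the Mule connector handles that token exchange for you.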
My GitHub Repo for this Project
This contains my Mule JAR file and the example YAML file I used for the Ingestion API: https://github.com/innovationworkshops/datacloudportal
This is not official Salesforce documentation; it is meant as a quick test/demo only and is not for production environments. Please use the official Salesforce and MuleSoft documentation for your project.
What is Salesforce Data Cloud: https://help.salesforce.com/s/articleView?id=sf.customer360_a.htm&type=5