Salesforce Data Cloud
Overview
Salesforce Data Cloud (formerly Customer Data Platform / CDP) is Salesforce's customer data platform for unifying data across Salesforce clouds and external systems into a single, queryable customer profile.
Integrating Lytics with Salesforce Data Cloud lets you exchange profile, segment, and insight data between the two platforms. You can import unified Individual profiles, segment membership, and calculated insights from Data Cloud into Lytics to enrich behavioral profiles, and you can export Lytics audiences and user fields back to Data Cloud either in real time via the Ingestion API or in bulk via Google Cloud Storage for zero-copy ingestion.
Authorization
Lytics connects to Salesforce Data Cloud using the JWT Bearer Token flow against an External Client App in your Salesforce org. There is no browser-based OAuth flow for Data Cloud — JWT is required.
Two authorization methods are available, one for production orgs and one for sandbox orgs. Both use the same configuration fields; only the Salesforce login host differs.
If you are new to creating authorizations in Lytics, see the Authorizations documentation for more information.
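Under the hood, the JWT Bearer Token flow signs a short-lived assertion with your private key and exchanges it for an access token at the Salesforce token endpoint. The sketch below assembles the unsigned portion of that assertion; the consumer key and username values are placeholders, and Lytics handles all of this for you once the authorization is saved.

```python
import base64
import json
import time


def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def build_jwt_signing_input(consumer_key: str, username: str, login_host: str) -> str:
    """Assemble the unsigned header.payload portion of the JWT assertion."""
    header = {"alg": "RS256"}
    claims = {
        "iss": consumer_key,  # Consumer Key of the External Client App
        "sub": username,      # pre-authorized Salesforce username
        "aud": login_host,    # login.salesforce.com, or test.salesforce.com for sandboxes
        "exp": int(time.time()) + 300,
    }
    return f"{b64url(json.dumps(header).encode())}.{b64url(json.dumps(claims).encode())}"


signing_input = build_jwt_signing_input(
    "3MVG9...consumer_key",            # placeholder Consumer Key
    "integration.user@example.com",    # placeholder username
    "https://login.salesforce.com",
)
# Sign `signing_input` with sfdc_private_key.pem using RS256 (e.g. via the PyJWT
# or cryptography libraries), append the signature as a third dot-separated
# segment, then POST the result to
#   https://login.salesforce.com/services/oauth2/token
# with grant_type=urn:ietf:params:oauth:grant-type:jwt-bearer&assertion=<jwt>
```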
Prerequisites
Before creating an authorization in Lytics, you must set up an External Client App in Salesforce that is enabled for Data Cloud and grant the connecting user access to it.
- Generate an RSA key pair:

```
# Generate a private key
openssl genpkey -algorithm RSA -out sfdc_private_key.pem -pkeyopt rsa_keygen_bits:2048

# Create a self-signed X.509 certificate
openssl req -new -x509 -key sfdc_private_key.pem -out sfdc_cert.pem -days 365 -subj "/CN=Lytics Data Cloud JWT"
```

- Create an External Client App in Salesforce:
- Go to Setup > App Manager > New External Client App.
- Fill in the basic info (name, contact email).
- Check Enable OAuth Settings.
- Set the callback URL to any valid URL (e.g. https://login.salesforce.com/services/oauth2/callback). This value is not used by JWT Bearer Token authentication but is required by Salesforce.
- Select the following OAuth scopes:
- Manage user data via APIs (api)
- Perform requests at any time (refresh_token, offline_access)
- Perform ANSI SQL queries on Data Cloud data (cdp_query_api)
- Manage Data Cloud Ingestion API data (cdp_ingest_api)
- Manage Data Cloud profile data (cdp_profile_api)
- Manage Data Cloud Calculated Insight data (cdp_calculated_insight_api)
- Manage Data Cloud segment membership (cdp_segment_api)
- Check Use digital signatures and upload the sfdc_cert.pem certificate file you created above.
- Click Save.
- Pre-authorize users for the app:
- Open the External Client App you created and click Manage.
- Click Edit Policies.
- Under Permitted Users, select Admin approved users are pre-authorized.
- Click Save.
- Scroll down to the Profiles section and click Manage Profiles.
- Select the profile(s) that include the Salesforce user you want to authorize (e.g. System Administrator).
- Click Save.
- Copy the Consumer Key:
- In the app detail page, copy the Consumer Key value.
Create the Authorization in Lytics
- Select Salesforce Data Cloud from the list of providers.
- Select the JWT Bearer Token method (or JWT Bearer Token (Sandbox) for sandbox orgs).
- Enter the Consumer Key from your Salesforce External Client App.
- Enter the Salesforce Username of the pre-authorized user.
- Paste the contents of your Private Key file (sfdc_private_key.pem).
- Enter a Label to identify your authorization.
- (Optional) Enter a Description for further context on your authorization.
- Click Save Authorization.
Import Data Cloud Segment Membership
This is the default import. It pulls unified Individual profile data along with segment membership and contact-point details (email, phone, and address) from Salesforce Data Cloud into Lytics. The Individual data populates user profile fields, while contact points and segment events are written to additional streams that can be joined into the user profile.
Integration Details
- Implementation Type: Server-side Integration.
- Implementation Technique: REST API Integration.
- Frequency: Batch Integration, configurable hourly, daily, weekly, or monthly.
- Resulting data: User Profiles populated from Data Cloud Individuals, with contact-point and segment-membership events available as additional streams.
This integration uses the Data Cloud Query API to retrieve data from the standard unified Data Model Objects (DMOs):
- UnifiedIndividual — the core profile DMO
- UnifiedContactPointEmail
- UnifiedContactPointPhone
- UnifiedContactPointAddress
- Segment membership records
Each DMO is written to its own Lytics stream, and a default LQL query maps the streams into the user entity by email or hashed email.
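The kind of ANSI SQL statement this import runs against the Query API can be sketched as follows. The DMO object name (UnifiedIndividual__dlm) and ssot-style field suffixes are illustrative assumptions about Data Cloud's naming, not guaranteed to match your org's schema; Lytics builds and executes the real queries for you.

```python
def unified_individual_sql(fields: list, limit: int = 1000) -> str:
    """Build an ANSI SQL statement for the Query API over a unified DMO."""
    cols = ", ".join(fields)
    return f"SELECT {cols} FROM UnifiedIndividual__dlm LIMIT {limit}"


sql = unified_individual_sql(["ssot__Id__c", "ssot__FirstName__c", "ssot__LastName__c"])
# POST {"sql": sql} to the Data Cloud Query API endpoint (e.g.
# <instance>/api/v2/query) with the JWT-derived bearer token in the
# Authorization header; rows come back per DMO and land in the streams below.
```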
Streams and Fields
Each run of this import writes to up to five streams. Default field mappings are listed below; you can customize the LQL queries after the import is created if you need to map additional DMO fields.
sdc_individuals
Maps the unified Individual DMO to user profile fields.
| Source Field | Lytics User Field | Description | Type |
|---|---|---|---|
| sdc_individual_id | sdc_individual_ids (unique id) | Data Cloud: Individual IDs | []string |
| sdc_email | email (unique id) | Email | string |
| sdc_email | email_domain | Email Domain | string |
| sdc_email | email_sha256 | SHA-256 hash of email | string |
| sdc_first_name | first_name, sdc_first_name | First Name | string |
| sdc_last_name | last_name, sdc_last_name | Last Name | string |
| sdc_city | city, sdc_city | City | string |
| sdc_country | country, sdc_country | Country (ISO code) | string |
| sdc_birth_date | dob | Date of Birth | date |
| sdc_created_date | sdc_individual_created_date | Data Cloud: Individual Created Date | date |
| sdc_last_modified_date | sdc_individual_last_modified_date | Data Cloud: Individual Last Modified Date | date |
|  | sdc_last_user_imported | Data Cloud: Last User Imported | date |
sdc_contact_point_emails
Maps the UnifiedContactPointEmail DMO and applies email opt-out as a Lytics consent value.
| Source Field | Lytics User Field | Description | Type |
|---|---|---|---|
| sdc_email_address | email (unique id) | Email | string |
| sdc_email_address | email_domain | Email Domain | string |
| sdc_email_address | email_sha256 | SHA-256 hash of email | string |
| sdc_email_opt_out | consent_email_marketing | false if opted out, otherwise true | map |
| sdc_email_address | sdc_contact_point_email | Data Cloud: Contact Point Email | string |
| sdc_email_type | sdc_contact_point_email_type | Data Cloud: Email Type | string |
| sdc_email_opt_out | sdc_contact_point_email_opt_out | Data Cloud: Email Opt Out | bool |
| sdc_email_last_modified_date | sdc_email_last_modified_date | Data Cloud: Email Last Modified Date | date |
sdc_contact_point_phones
Maps the UnifiedContactPointPhone DMO. Phone numbers are joined to user profiles via a related email address from the linked Individual.
| Source Field | Lytics User Field | Description | Type |
|---|---|---|---|
| sdc_phone_email | email (unique id) | Email (from linked Individual) | string |
| sdc_phone_number | phone | Phone Number | string |
| sdc_phone_type | sdc_phone_type | Data Cloud: Phone Type | string |
| sdc_phone_opt_out | sdc_phone_opt_out | Data Cloud: Phone Opt Out | bool |
| sdc_phone_last_modified_date | sdc_phone_last_modified_date | Data Cloud: Phone Last Modified Date | date |
sdc_contact_point_addresses
Maps the UnifiedContactPointAddress DMO. Addresses are joined to user profiles via a related email address from the linked Individual.
| Source Field | Lytics User Field | Description | Type |
|---|---|---|---|
| sdc_address_email | email (unique id) | Email (from linked Individual) | string |
| sdc_address_street | sdc_address_street | Data Cloud: Street Address | string |
| sdc_address_city | city | City | string |
| sdc_address_state | sdc_address_state | Data Cloud: State | string |
| sdc_address_postal_code | sdc_address_postal_code | Data Cloud: Postal Code | string |
| sdc_address_country | country | Country (ISO code) | string |
| sdc_address_type | sdc_address_type | Data Cloud: Address Type | string |
| sdc_address_last_modified_date | sdc_address_last_modified_date | Data Cloud: Address Last Modified Date | date |
sdc_segments
Records segment enter/exit events from Data Cloud. Active segment names are accumulated into a set user field for easy targeting in Lytics audiences.
| Source Field | Lytics User Field | Description | Type |
|---|---|---|---|
| sdc_email | email (unique id) | Email | string |
| sdc_segment_id | sdc_segment_id | Data Cloud: Segment ID | string |
| sdc_segment_name | sdc_segment_name | Data Cloud: Segment Name | string |
| sdc_membership_status | sdc_membership_status | Data Cloud: Membership Status | string |
| sdc_segment_name | sdc_active_segments | Set of currently active Data Cloud segments (only added when membership status is active) | []string |
Joining DMOs by email
Contact-point, segment, and change-event streams are joined into the user profile by email or hashed email, following the default LQL identity mapping.
Configuration
- Select Salesforce Data Cloud from the list of providers.
- Select the Import Data Cloud Segment Membership job type.
- Select the Authorization you would like to use or create a new one.
- Enter a Label to identify this job in Lytics.
- (Optional) Enter a Description for further context on your job.
- (Optional) From the Data Cloud Segments input, select specific Data Cloud segment IDs to import. Leave empty to import membership for all segments.
- In the Stream input, select an existing Lytics stream or enter a new stream name. The default is sdc_segments. (Other DMO streams use the fixed names listed above.)
- From the Import Frequency input, select Hourly, Daily, Weekly, or Monthly.
- Click Start Import.
Import Data Cloud Calculated Insights
Calculated Insights are pre-computed metrics in Data Cloud (for example, lifetime value or recency/frequency scores). This job imports the most recent insight values into a Lytics stream so they can be mapped onto the user profile.
Integration Details
- Implementation Type: Server-side Integration.
- Implementation Technique: REST API Integration.
- Frequency: Batch Integration, configurable hourly, daily, weekly, or monthly.
- Resulting data: Insight metrics written as Event records that map onto the user profile.
This integration uses the Data Cloud Calculated Insights API to retrieve insight result rows and writes them to a stream in Lytics.
Fields
The default mapping for the sdc_calculated_insights stream:
| Source Field | Lytics User Field | Description | Type |
|---|---|---|---|
| sdc_email | email (unique id) | Email | string |
| sdc_insight_name | sdc_insight_name | Data Cloud: Insight Name | string |
| sdc_calculated_at | sdc_insight_calculated_at | Data Cloud: Insight Calculated At | date |
Because each calculated insight has its own dimensions and measures, you'll typically extend the LQL query for this stream to map the specific dimension and measure fields you care about onto the user profile after the import is set up.
Configuration
- Select Salesforce Data Cloud from the list of providers.
- Select the Import Data Cloud Calculated Insights job type.
- Select the Authorization you would like to use or create a new one.
- Enter a Label to identify this job in Lytics.
- (Optional) Enter a Description for further context on your job.
- From the Calculated Insights input, select one or more calculated insights to import.
- (Optional) From the Dimensions input, select dimension filters to limit which insight rows are imported. Leave empty to import all rows.
- In the Stream input, select an existing Lytics stream or enter a new stream name. The default is sdc_calculated_insights.
- From the Import Frequency input, select Hourly, Daily, Weekly, or Monthly.
- Click Start Import.
Import Data Cloud Change Events
This job polls Salesforce Data Cloud for CREATE, UPDATE, and DELETE events on selected Data Model Objects and writes them to a Lytics stream. Use it when you need a near-real-time signal in Lytics that something changed in Data Cloud — for example, to trigger a workflow on a profile update.
Integration Details
- Implementation Type: Server-side Integration.
- Implementation Technique: REST API Integration (interval polling).
- Frequency: Continuous, polled at a configurable interval (default 5 minutes).
- Resulting data: Event records describing DMO change events.
The job tracks changes by reading a configurable timestamp field on each DMO (default LastModifiedDate__c). On the first run it fetches changes within the configured initial lookback window; subsequent runs use the timestamp of the previous successful poll.
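The window logic described above can be sketched as follows. The defaults mirror the job's configuration options; the function itself is illustrative, not Lytics source code.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional


def poll_since(last_success: Optional[datetime],
               initial_lookback_minutes: int = 15) -> datetime:
    """Return the timestamp to filter DMO changes from on the next poll.

    First run: now minus the configured initial lookback window.
    Subsequent runs: the timestamp of the previous successful poll.
    """
    if last_success is None:
        return datetime.now(timezone.utc) - timedelta(minutes=initial_lookback_minutes)
    return last_success
```

Each poll then queries the selected DMOs for rows whose timestamp field (LastModifiedDate__c by default) is newer than this value.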
Fields
The default mapping for the sdc_changes stream:
| Source Field | Lytics User Field | Description | Type |
|---|---|---|---|
| sdc_email | email unique id | string | |
| sdc_change_entity | sdc_change_entity | Data Cloud: Changed Entity | string |
| sdc_change_timestamp | sdc_change_timestamp | Data Cloud: Change Timestamp | date |
Configuration
- Select Salesforce Data Cloud from the list of providers.
- Select the Import Data Cloud Change Events job type.
- Select the Authorization you would like to use or create a new one.
- Enter a Label to identify this job in Lytics.
- (Optional) Enter a Description for further context on your job.
- From the Data Model Objects input, select the DMOs to monitor for change events.
- In the Stream input, select an existing Lytics stream or enter a new stream name. The default is sdc_changes.
- (Optional) In Poll Interval (seconds), set how often to poll Data Cloud. Default is 300 (5 minutes).
- (Optional) In Initial Lookback (minutes), set how far back to look on the first run. Default is 15.
- (Optional, Advanced) In Timestamp Field, set the DMO field used to track changes. The field must exist on every selected DMO. Default is LastModifiedDate__c.
- Click Start Import.
Export Audience Membership to Data Cloud
This export keeps Lytics audience membership in sync with a Salesforce Data Cloud Ingestion API source in real time. Users entering an audience are sent as adds; users exiting are sent as removes. Use this when you want Data Cloud to reflect the current state of a Lytics audience.
Integration Details
- Implementation Type: Server-side Integration.
- Implementation Technique: REST API Integration — Audience Trigger Integration.
- Frequency: Real-time Integration with a one-time Backfill of existing audience members after setup.
- Resulting data: Records in a Data Cloud Ingestion API target object reflecting Lytics audience membership.
This integration uses the Data Cloud Ingestion API. Each record sent to Data Cloud is keyed by the Match Field you select in Lytics — typically email or another identifier — so you can match incoming records to existing Individuals in Data Cloud.
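A record sent for an audience add or remove might look like the sketch below. The field names (email, audience, status) are placeholders — the actual schema is whatever your Salesforce administrator defined on the Ingestion API target object — but the keyed-by-match-field shape is what this export produces.

```python
import json


def audience_membership_record(audience_slug: str, email: str, is_member: bool) -> dict:
    """One Ingestion API record keyed by the Lytics match field (email here)."""
    return {
        "email": email,            # match field selected in Lytics
        "audience": audience_slug,
        "status": "add" if is_member else "remove",
    }


records = [
    audience_membership_record("high_value_shoppers", "jane@example.com", True),
    audience_membership_record("high_value_shoppers", "john@example.com", False),
]
# Records are POSTed to the Ingestion API streaming endpoint for the selected
# source and target object, wrapped as {"data": [...]}:
body = json.dumps({"data": records})
```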
Configuration
Before configuring this export, your Salesforce administrator must create an Ingestion API source and define a target object (stream/schema) in Data Cloud to receive Lytics audience membership.
- Select Salesforce Data Cloud from the list of providers.
- Select the Export Audience Membership to Data Cloud job type.
- Select the Authorization you would like to use or create a new one.
- Enter a Label to identify this job in Lytics.
- (Optional) Enter a Description for further context on your job.
- From the Ingestion API Source input, select the Ingestion API source configured in Salesforce Data Cloud.
- From the Target Object input, select the object (stream) within that source where audience membership should be written.
- From the Audiences input, select the Lytics audience(s) whose membership should be synced.
- From the Match Field input, select the Lytics user field used to identify each record in Data Cloud (e.g. email). Users without a value for this field will be skipped.
- Click Start Export.
Export Users to Data Cloud
This export streams Lytics user profile data to a Salesforce Data Cloud Ingestion API object. Unlike the audience-membership export, you supply a custom mapping of Lytics user fields to Data Cloud object fields, which makes it suitable for sending arbitrary profile attributes (not just audience adds/removes).
Integration Details
- Implementation Type: Server-side Integration.
- Implementation Technique: REST API Integration.
- Frequency: Real-time Integration with a one-time Backfill of existing audience members after setup.
- Resulting data: Records in a Data Cloud Ingestion API target object populated from Lytics user fields.
The integration uses the Data Cloud Ingestion API and supports per-hour rate limiting. Records are sent in batches of up to 200 per request (the Data Cloud limit).
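The batching rule can be sketched as a simple chunker; this mirrors the behavior described above rather than reproducing Lytics internals.

```python
def chunk_records(records: list, batch_size: int = 200) -> list:
    """Split records into Ingestion API request batches of at most 200."""
    if not 1 <= batch_size <= 200:
        raise ValueError("batch_size must be between 1 and 200")
    return [records[i:i + batch_size] for i in range(0, len(records), batch_size)]
```

For example, 450 queued records become three requests of 200, 200, and 50; the optional per-hour rate limit then spaces those requests out over time.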
Configuration
- Select Salesforce Data Cloud from the list of providers.
- Select the Export Users to Data Cloud job type.
- Select the Authorization you would like to use or create a new one.
- Enter a Label to identify this job in Lytics.
- (Optional) Enter a Description for further context on your job.
- From the Ingestion API Source input, select the Ingestion API source configured in Salesforce Data Cloud.
- From the Target Object input, select the object (stream) within that source.
- From the Audiences input, select the Lytics audience(s) to export.
- From the Field Mappings input, map Lytics user fields (left) to Data Cloud object fields (right). The Data Cloud field list is populated from the selected target object.
- (Optional) From the Match Field input, select a field used to match records in Data Cloud. The chosen field must be one of the mapped target fields.
- (Optional) In Batch Size, set how many records are sent per Ingestion API request. Default is 200. Maximum is 200.
- (Optional) In Max Requests Per Hour, set a rate limit for outbound requests to the Ingestion API. Default is 0 (no rate limit).
- Click Start Export.
Bulk Export to Data Cloud via GCS
This export stages Lytics audience data as a CSV or newline-delimited JSON file in a Google Cloud Storage bucket so that Salesforce Data Cloud can pick it up via zero-copy ingestion. Use this when you need to move large volumes of audience data into Data Cloud and prefer file-based ingestion over per-record API calls.
Integration Details
- Implementation Type: Server-side Integration.
- Implementation Technique: File Based Transfer Integration.
- Frequency: Batch Integration.
- Resulting data: A CSV or JSON file written to GCS at <gcs_prefix>/<timestamp>.<ext>, available for Data Cloud zero-copy ingestion.
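The object naming can be sketched as below. Only the <prefix>/<timestamp>.<ext> shape comes from this documentation; the exact timestamp format used by the job is an assumption for illustration.

```python
from datetime import datetime, timezone


def export_object_name(prefix: str, file_format: str, now: datetime) -> str:
    """Build the GCS object name for a bulk export file.

    file_format is "csv" or "json" (newline-delimited JSON).
    """
    ext = {"csv": "csv", "json": "json"}[file_format]
    stamp = now.strftime("%Y%m%d%H%M%S")  # assumed timestamp format
    return f"{prefix.rstrip('/')}/{stamp}.{ext}" if prefix else f"{stamp}.{ext}"
```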
Configuration
You must have a Google Cloud Storage bucket that is accessible to Salesforce Data Cloud. Configuring the GCS-side data source for zero-copy ingestion is a one-time setup performed in Salesforce.
- Select Salesforce Data Cloud from the list of providers.
- Select the Bulk Export to Data Cloud via GCS job type.
- Select the Authorization you would like to use or create a new one.
- Enter a Label to identify this job in Lytics.
- (Optional) Enter a Description for further context on your job.
- From the Audiences input, select the Lytics audience(s) to include in the bulk export file.
- From the Field Mappings input, map Lytics user fields (left) to the column names that should appear in the exported file (right).
- (Optional) From the File Format input, choose CSV (default) or JSON (newline-delimited).
- In the GCS Bucket input, enter the Google Cloud Storage bucket where the file should be staged.
- (Optional) In the GCS Prefix input, enter a prefix (folder path) within the bucket. Files are written as <prefix>/<timestamp>.<ext>.
- Click Start Export.
API Limitations
Salesforce Data Cloud has its own API limits separate from the standard Salesforce CRM API limits. Limits vary based on your Data Cloud edition. Key limits to be aware of:
- The Ingestion API caps each request at 200 records. Lytics enforces this limit automatically; the Export Users to Data Cloud job lets you set a per-hour request rate limit if you need to throttle further.
- The Query API has per-org concurrency and rate limits. Polling-based imports (Change Events, Calculated Insights, Segment Membership) honor the configured run frequency or poll interval to stay within these limits.
- For high-volume audience exports, consider the Bulk Export to Data Cloud via GCS job, which uses zero-copy ingestion instead of per-record API calls.
See the Salesforce Data Cloud Limits documentation for current per-edition limits.