Salesforce Data Cloud

Share data between Airship and Salesforce Data Cloud using out-of-the-box tools on both platforms.


Salesforce Data Cloud is a real-time data platform that helps you unify, harmonize, and activate customer data across your organization. You can share data between Airship and Salesforce Data Cloud using native integration capabilities on both platforms without requiring custom development or third-party tools.

This guide covers bidirectional data flow:

Import to Airship

You can import customer data from Salesforce Data Cloud into Airship using SFTP. This lets you use Salesforce Data Cloud data for audience targeting and personalization in your Airship messaging campaigns.

  1. Set up Attributes — In Airship, create the Attributes that will receive the data from Salesforce Data Cloud. Attributes are metadata used for audience segmentation and personalization; they extend the concept of Tags by adding comparison operators and values that determine whether to target a user, helping you better evaluate your audience. See the Attributes guide for details.

  2. Configure SFTP in Airship — Airship’s SFTP implementation uses SSH key pairs for authentication. You must generate a key pair: a private key for your SFTP client and a public key for Airship.

    Follow the steps in Generate keys and Add your public key to Airship in SFTP upload for CSV files. Make sure to set the Purpose to Attributes, and note the host, port, and username for use in the next section.

  3. Create a Data Lake Object — In Salesforce Data Cloud, create a Data Lake Object (DLO) that matches the CSV structure required by Airship:

    1. Go to Data Cloud, then Data Streams.
    2. Create a new Data Lake Object.
      • Configure the schema to match your Airship Attributes.
      • Configure the output format as CSV.
      • Set up SFTP as the destination using your credentials from Airship.
      • Schedule the data export frequency based on your requirements.
  4. Verify data import — Verify that data is being successfully imported into Airship:

    1. Go to Audience, then Attributes.
    2. Check that Attribute values are being populated for your audience.

Monitor your SFTP logs in Salesforce Data Cloud for delivery issues.
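As a sketch of the file Salesforce Data Cloud would deliver, the snippet below writes a simple attributes CSV. The column names (`channel_id`, `loyalty_tier`, `last_purchase_date`) are hypothetical; use the Attributes you actually created in step 1 and confirm the required CSV layout in the Airship SFTP upload documentation.

```python
import csv
import io

# Hypothetical attribute columns; replace with the Attributes you
# defined in Airship and the schema of your Data Lake Object.
FIELDS = ["channel_id", "loyalty_tier", "last_purchase_date"]

def write_attributes_csv(rows, fileobj):
    """Write rows of attribute data as CSV with a header row."""
    writer = csv.DictWriter(fileobj, fieldnames=FIELDS)
    writer.writeheader()
    for row in rows:
        writer.writerow(row)

buf = io.StringIO()
write_attributes_csv(
    [{
        "channel_id": "9c36e8c7-example",   # placeholder channel ID
        "loyalty_tier": "gold",
        "last_purchase_date": "2024-05-01",
    }],
    buf,
)
print(buf.getvalue())
```

Generating a sample file like this locally and uploading it over SFTP is a quick way to confirm your Attribute mappings before scheduling the Data Cloud export.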

Export to Salesforce Data Cloud

You can export Airship event data to Salesforce Data Cloud using the Data Export integration with Amazon S3 as the destination. This provides hourly batches of structured Real-Time Data Streaming data that Salesforce Data Cloud can ingest. Real-Time Data Streaming is a service that delivers user-level events in real time to your backend or third-party systems using the Data Streaming API.

Real-Time Data Streaming is required for this integration.

Before configuring the integration, you must set up a storage location:

  1. Create an Amazon S3 bucket that will serve as the intermediate storage for Airship data.
  2. Configure IAM credentials with write access to the S3 bucket. Follow Amazon’s documentation: Managing access keys.
  3. Grant Salesforce Data Cloud read access to the S3 bucket. Follow Salesforce documentation on S3 data sources.
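As a rough sketch, the IAM credentials from step 2 need a policy along these lines. The bucket name `airship-export-bucket` is a placeholder, and the exact set of actions Airship requires may differ; confirm against the Data Export documentation and Amazon’s IAM reference.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::airship-export-bucket/*"
    }
  ]
}
```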

Next, complete the integration steps:

  1. Configure Data Export in Airship — Set up the Data Export integration to send event data to your S3 bucket:

    1. Next to your project name, select the dropdown menu, then Settings.
    2. Under Project settings, select Partner Integrations.
    3. Select Data Export.
    4. Select Configure for the outbound integration and follow the onscreen instructions to:
      • Select Amazon S3 as your storage provider.
      • Enter your AWS access key ID and secret access key.
      • Select your AWS region.
      • Enter your S3 bucket name.
      • Select Simplified as the file layout. This structure works well with Salesforce Data Cloud ingestion.
      • Select the Airship events to send to Salesforce Data Cloud.
      • Configure options:
        • Specify a folder path.
        • Enable compression to save storage space.
        • Enable server-side encryption.

    After completing configuration, it may take several minutes for event data to begin appearing. Files are generated hourly for each event type, following this pattern:

    [folder_path]/appKey/integrationId/eventType/YYYY_MM_DD_HH_mm_ss.csv

  2. Create Data Lake Objects — For each Airship event type you selected for export, create a corresponding Data Lake Object in Salesforce Data Cloud:

    1. Navigate to Data Cloud, then Data Streams, and create a new Data Lake Object.
    2. Configure the DLO to match the schema of the Airship event data. See the Snowflake integration for detailed event schemas and field definitions.
    3. Add the S3 bucket and folder path as a source for the DLO.
    4. Configure the ingestion schedule. Hourly is recommended to align with Airship’s export frequency.
    5. Create field mappings between the CSV columns and the DLO fields.

    Creating DLOs and mappings is a one-time setup exercise. Once configured, new data will automatically flow from Airship to Salesforce Data Cloud as events occur.

  3. Verify data export — Monitor the integration to ensure data is flowing correctly:

    1. In your S3 bucket, verify that CSV files are being created hourly in the expected folder structure.
    2. In Salesforce Data Cloud, check the Data Stream status and verify that data is being ingested successfully.
    3. Review any ingestion errors in Salesforce Data Cloud and adjust field mappings if necessary.
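The hourly file layout from the export configuration can be sketched as a key builder, which is handy when defining the S3 source path for each Data Lake Object. The folder path, app key, and integration ID values below are placeholders, and `SEND` stands in for whichever event types you selected.

```python
from datetime import datetime, timezone

def export_key(folder_path, app_key, integration_id, event_type, ts):
    """Build an S3 object key following the documented pattern:
    [folder_path]/appKey/integrationId/eventType/YYYY_MM_DD_HH_mm_ss.csv
    """
    stamp = ts.strftime("%Y_%m_%d_%H_%M_%S")
    return f"{folder_path}/{app_key}/{integration_id}/{event_type}/{stamp}.csv"

key = export_key(
    "airship-exports",   # placeholder folder path
    "exampleAppKey",     # placeholder Airship app key
    "int-123",           # placeholder integration ID
    "SEND",              # example event type
    datetime(2024, 5, 1, 13, 0, 0, tzinfo=timezone.utc),
)
print(key)
# airship-exports/exampleAppKey/int-123/SEND/2024_05_01_13_00_00.csv
```

Because each event type lands under its own prefix, you can point each DLO at the prefix up through `eventType/` and let the ingestion schedule pick up each new hourly file.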