# Salesforce Data Cloud

Share data between Airship and Salesforce Data Cloud using out-of-the-box tools on both platforms.

Salesforce Data Cloud is a real-time data platform that helps you unify, harmonize, and activate customer data across your organization. You can share data between Airship and Salesforce Data Cloud using native integration capabilities on both platforms without requiring custom development or third-party tools.

This guide covers bidirectional data flow:
* **From Salesforce Data Cloud to Airship** — Import customer data from Salesforce Data Cloud using SFTP to enrich your Airship [Attributes](https://www.airship.com/docs/reference/glossary/#attributes).
* **From Airship to Salesforce Data Cloud** — Export Airship [Real-Time Data Streaming](https://www.airship.com/docs/reference/glossary/#rtds) event data to Salesforce Data Cloud using the Data Export integration with Amazon S3 storage.

## Import to Airship

You can import customer data from Salesforce Data Cloud into Airship using SFTP. You can then use that data for audience targeting and personalization in your Airship messaging campaigns.

1. **Set up Attributes** — In Airship, create the necessary [Attributes](https://www.airship.com/docs/reference/glossary/#attributes) that will receive the data from Salesforce Data Cloud. See the [Attributes guide](https://www.airship.com/docs/guides/audience/attributes/) for details.

1. **Configure SFTP in Airship** — Airship's SFTP implementation uses SSH key pairs for authentication. You must create a pair of keys: a private key for your client and a public key for Airship.

   Follow the steps in [Generate keys](https://www.airship.com/docs/guides/audience/segmentation/sftp-upload/#generate-keys) and [Add your public key to Airship](https://www.airship.com/docs/guides/audience/segmentation/sftp-upload/#add-your-public-key-to-airship) in *SFTP upload for CSV files*. Make sure to set the Purpose to `Attributes`, and note the host, port, and username for use in the next step. You can test the connection with the sketch at the end of this section.

1. **Create a Data Lake Object** — In Salesforce Data Cloud, create a Data Lake Object (DLO) that matches the CSV structure required by Airship:
   1. Go to **Data Cloud**, then **Data Streams**.
   1. Create a new Data Lake Object.
      * Configure the schema to match your Airship Attributes.
         * You must include a column for the channel identifier and one column for each Attribute. The identifier can be a [Channel ID](https://www.airship.com/docs/reference/glossary/#channel_id), [Named User](https://www.airship.com/docs/reference/glossary/#named_user), or email address.
         * Ensure field names match the Attribute keys defined in Airship.
         * For more information, see [Attributes CSV format](https://www.airship.com/docs/reference/messages/csv-formatting/#attributes) in the *CSV Formatting Reference*. A sample of the expected file shape appears at the end of this section.
      * Configure the output format as CSV.
      * Set up SFTP as the destination, using the host, port, and username you noted in Airship and the private key from your key pair.
      * Schedule the data export frequency based on your requirements.

1. **Verify data import** — Verify that data is being successfully imported into Airship:
   1. Go to **Audience**, then **Attributes**.
   1. Check that Attribute values are being populated for your audience.
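
If you want to confirm the key pair and connection details before scheduling exports from Salesforce Data Cloud, a quick connectivity check from any machine holding the private key can save a debugging cycle later. Below is a minimal sketch using the `paramiko` library (`pip install paramiko`); the host, port, username, and key path are placeholders for the values you noted in Airship.

```python
# Connectivity check for the Airship SFTP endpoint (illustrative).
# HOST, PORT, USERNAME, and the key path are placeholders for the
# values from your Airship SFTP configuration.
import paramiko

HOST = "sftp.airship.example"    # placeholder: use the host shown in Airship
PORT = 22                        # placeholder: use the port shown in Airship
USERNAME = "your-sftp-username"  # placeholder: use the username shown in Airship

key = paramiko.RSAKey.from_private_key_file("/path/to/private_key")

transport = paramiko.Transport((HOST, PORT))
transport.connect(username=USERNAME, pkey=key)
sftp = paramiko.SFTPClient.from_transport(transport)

print(sftp.listdir("."))  # a successful listing confirms authentication works

sftp.close()
transport.close()
```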

Monitor your SFTP logs in Salesforce Data Cloud for delivery issues.
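
It can also help to validate the file shape your DLO will produce against what Airship's Attributes import expects: one identifier column plus one column per Attribute. The sketch below writes a small sample file with Python's standard `csv` module. The column names used here (`named_user_id`, `favorite_team`, `loyalty_tier`) are hypothetical; take the authoritative header format from the *CSV Formatting Reference* linked in the steps above.

```python
# Writes a small sample file in the general shape Airship's Attributes import
# expects: one identifier column plus one column per Attribute. The column
# names below are hypothetical; confirm the exact header format against the
# Attributes CSV format reference before relying on this shape.
import csv

FIELDS = ["named_user_id", "favorite_team", "loyalty_tier"]  # hypothetical columns

rows = [
    {"named_user_id": "user-123", "favorite_team": "Sounders", "loyalty_tier": "gold"},
    {"named_user_id": "user-456", "favorite_team": "Timbers", "loyalty_tier": "silver"},
]

with open("attributes_sample.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```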

## Export to Salesforce Data Cloud

You can export Airship event data to Salesforce Data Cloud using the [Data Export](https://www.airship.com/docs/integrations/data-export/) integration with Amazon S3 as the destination. This provides hourly batches of structured [Real-Time Data Streaming](https://www.airship.com/docs/reference/glossary/#rtds) data that Salesforce Data Cloud can ingest.

[Real-Time Data Streaming](https://www.airship.com/docs/reference/feature-packages/#data) is required for this integration.

Before configuring the integration, you must set up a storage location:

1. Create an Amazon S3 bucket that will serve as the intermediate storage for Airship data.
1. Configure IAM credentials with write access to the S3 bucket. Follow [Amazon's documentation: Managing access keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html#Using_CreateAccessKey).
1. Grant Salesforce Data Cloud read access to the S3 bucket. Follow [Salesforce documentation on S3 data sources](https://developer.salesforce.com/docs/data/data-cloud-int/guide/c360-a-awss3-bucket-policies.html).
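
Before entering the IAM credentials in Airship, you may want to confirm that they can actually write to the bucket. Below is a minimal sketch using `boto3` (`pip install boto3`); the bucket name, region, and credential values are placeholders for your own.

```python
# Verifies that the IAM credentials can write to the export bucket
# (illustrative). Bucket name, region, and credentials are placeholders
# for the values you created above.
import boto3

s3 = boto3.client(
    "s3",
    region_name="us-west-2",           # placeholder region
    aws_access_key_id="YOUR_KEY_ID",   # placeholder key ID
    aws_secret_access_key="YOUR_KEY",  # placeholder secret
)

s3.put_object(
    Bucket="my-airship-export-bucket",  # placeholder bucket name
    Key="airship/connectivity_test.txt",
    Body=b"ok",
)
print("Write access confirmed")
```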

Next, complete the integration steps:

1. **Configure Data Export in Airship** — Set up the Data Export integration to send event data to your S3 bucket:
   1. Next to your project name, select the dropdown menu, then **Settings**.
   1. Under **Project settings**, select **Partner Integrations**.
   1. Select **Data Export**.
   1. Select **Configure** for the outbound integration and follow the onscreen instructions to:
      * Select **Amazon S3** as your storage provider.
      * Enter your AWS access key ID and secret access key.
      * Select your AWS region.
      * Enter your S3 bucket name.
      * Select **Simplified** as the [file layout](https://www.airship.com/docs/integrations/data-export/#structure-and-files). This structure works well with Salesforce Data Cloud ingestion.
      * Select the Airship events to send to Salesforce Data Cloud.
      * Configure options:
         * Specify a folder path.
         * Enable compression to save storage space.
         * Enable server-side encryption.

   After you complete configuration, it may take several minutes for event data to begin populating. Airship generates files hourly for each event type, following this pattern:

   > `[folder_path]/appKey/integrationId/eventType/YYYY_MM_DD_HH_mm_ss.csv`

1. **Create Data Lake Objects** — For each Airship event type you selected for export, create a corresponding Data Lake Object in Salesforce Data Cloud:
   1. Navigate to **Data Cloud**, then **Data Streams**, and create a new Data Lake Object.
   1. Configure the DLO to match the schema of the Airship event data. See the [Snowflake integration](https://www.airship.com/docs/integrations/snowflake/) for detailed event schemas and field definitions.
   1. Add the S3 bucket and folder path as a source for the DLO.
   1. Configure the ingestion schedule. Hourly is recommended to align with Airship's export frequency.
   1. Create field mappings between the CSV columns and the DLO fields. The header-inspection sketch at the end of this section shows one way to list the columns in an exported file.

   Creating DLOs and mappings is a one-time setup task. Once they are in place, new data flows automatically from Airship to Salesforce Data Cloud as events occur.

1. **Verify data export** — Monitor the integration to ensure data is flowing correctly:
   1. In your S3 bucket, verify that CSV files are being created hourly in the expected folder structure. The listing sketch at the end of this section shows one way to check this programmatically.
   1. In Salesforce Data Cloud, check the Data Stream status and verify that data is being ingested successfully.
   1. Review any ingestion errors in Salesforce Data Cloud and adjust field mappings if necessary.
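
Once files begin landing in the bucket, you can spot-check the hourly cadence and folder structure described above without clicking through the S3 console. Below is a minimal sketch using `boto3`; the bucket name and every path component in the prefix are placeholders for your own values.

```python
# Lists recent export files for one event type, following the
# [folder_path]/appKey/integrationId/eventType/YYYY_MM_DD_HH_mm_ss.csv pattern.
# The bucket name and every prefix component are placeholders.
import boto3

BUCKET = "my-airship-export-bucket"                 # placeholder
PREFIX = "airship/appKey/integrationId/eventType/"  # placeholder path components

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        print(obj["Key"], obj["LastModified"])
```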

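And when you create the field mappings for a DLO, pulling one exported file and printing its header row is a quick way to enumerate the columns you need to map. This sketch assumes compression was enabled during Data Export configuration (drop the gzip handling if it was not); the bucket name and object key, including the compression suffix, are placeholders for a real key from your bucket listing.

```python
# Downloads one exported file and prints its header row, a quick way to list
# the columns that need DLO field mappings. Assumes gzip compression was
# enabled in the Data Export configuration; the key below is a placeholder
# for a real object key from your bucket.
import boto3
import csv
import gzip
import io

BUCKET = "my-airship-export-bucket"  # placeholder
KEY = "airship/appKey/integrationId/eventType/2024_01_01_00_00_00.csv.gz"  # placeholder

s3 = boto3.client("s3")
body = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()

text = gzip.decompress(body).decode("utf-8")
header = next(csv.reader(io.StringIO(text)))
print(header)  # the CSV columns to map to DLO fields
```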