You can set up automated import using AWS S3 when you want to bring in data from a system that is not supported by one of D&B CDP's automated import connectors. Systems supported by automated import connectors are listed in the Connectors for Automated Import article.
To use this capability, follow these steps:
- Create the Import Template for the object (e.g. Accounts, Contacts etc.) before setting up the automation.
- Generate Automation Credentials.
- Get the folder location where users can transfer data from their systems to Atlas. Atlas will automatically ingest data from this folder.
- Transfer data to the drop folder location of the object.
- Track progress through email notifications and Jobs page.
Create Import Template
An Import Template must be created before you set up the automation. Each object from each system has its own template. For more information on how to create a template, visit the following links and scroll to the template creation sections:
1. Creating Accounts Import Template
2. Creating Contacts Import Template
3. Creating Product Purchases Import Template
4. Creating Product Bundles Import Template
5. Creating Product Hierarchy Import Template
6. Creating a Web Visit Import Template
7. Creating a Marketing Activity Import Template
8. Creating an Opportunity Import Template
Generate Automation Credentials
Every D&B CDP tenant has an AWS S3 account already set up that can be readily used. To access this information, you must be an Admin; no other user profile is allowed to generate or regenerate these credentials. To generate the credentials:
1. Click "Import Data" from the My Data page.
2. Click on Get Access Tokens.
3. You will be taken to the Connections section of Lattice, where you will see an AWS S3 connection that is automatically created for each tenant. Click on Get Existing Token.
It is recommended that you click the download button to save the credentials for future use. You will need the Access Key, Secret Key, Bucket, and drop folder location to automate your file transfers to AWS S3.
4. Copy the "Automated Import location" of your object, which is available on each template, by clicking the copy button.
5. Set up and test your automation outside the platform using small data files. Each template has its own location on S3, and each object file should always be transferred to its respective location, because every location is tied to the Import Template that was created and validates incoming files against it.
6. Transfer your data files to the respective folder. An action will be generated on the Data Processing Jobs page immediately.
Customer Admins also receive an email notification of the job's progress.
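Before transferring a full extract, it can help to sanity-check a small test file locally against the fields you mapped in the Import Template, since the drop folder location validates each file against its template. A minimal sketch (the field names below are hypothetical examples, not actual template fields):

```python
import csv
import io

def missing_fields(csv_text, required_fields):
    """Return template fields that are absent from the file's header row."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader, [])
    return [f for f in required_fields if f not in header]

# Hypothetical fields from an Accounts Import Template mapping.
required = ["Account ID", "Company Name", "Country"]

sample = "Account ID,Company Name,Country\nA001,Acme Corp,US\n"
print(missing_fields(sample, required))  # → []
```

A non-empty result means the file would likely fail validation at the drop folder, so it is cheaper to catch the mismatch before transferring.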
Here is a good resource for programmatically transferring data to S3 buckets:
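As one possible approach, the transfer step can be scripted with the AWS SDK for Python (boto3), using the Access Key, Secret Key, Bucket, and drop folder location you downloaded from Get Existing Token. The bucket and folder values below are placeholders, not real tenant values:

```python
import posixpath

def object_key(drop_folder, filename):
    """Build the S3 object key by joining the drop folder path and the file name."""
    return posixpath.join(drop_folder.strip("/"), filename)

def upload_file(local_path, bucket, drop_folder, access_key, secret_key):
    """Upload one data file to the object's drop folder location on S3."""
    import boto3  # AWS SDK for Python; install with `pip install boto3`
    s3 = boto3.client(
        "s3",
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
    )
    filename = local_path.split("/")[-1]
    s3.upload_file(local_path, bucket, object_key(drop_folder, filename))

# Placeholder call; substitute the values from your downloaded credentials:
# upload_file("accounts.csv", "my-tenant-bucket",
#             "dropfolder/Templates/AccountsData/", "MY_ACCESS_KEY", "MY_SECRET_KEY")
```

Because each Import Template has its own S3 location, keep one drop folder value per object and reuse the same upload routine for all of them.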