Technical requirements
This article outlines the current technical requirements and limitations for using CDC on Data Loader.
Agent setup
Each CDC pipeline requires its own Matillion agent to control the replication process. For more information, see our articles on agents for CDC.
Amazon S3
- An Amazon Web Services (AWS) account. Signing up is free. Go to https://aws.amazon.com to create an account if you don't have one already.
- An Amazon username and password for the AWS instance used during testing.
- Permissions to create and manage S3 buckets in AWS. Your AWS user must be able to create a bucket (if one doesn't already exist), add/modify bucket policies, and upload files to the bucket.
- The IAM role used by the agent container must have putObject permissions for the S3 bucket and prefix used as the pipeline's destination (see the sketch after this list).
- An up-and-running Amazon S3 bucket.
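If you want to confirm the putObject requirement before running a pipeline, a quick check like the following can help. This is a minimal Python sketch, not part of Data Loader: the bucket name and prefix are placeholders, and it assumes the boto3 credentials in scope belong to the agent's IAM role.

```python
import boto3
from botocore.exceptions import ClientError

# Placeholder names -- replace with your own destination bucket and prefix.
BUCKET = "my-cdc-destination-bucket"
PREFIX = "cdc-output/"


def check_put_object(bucket: str, prefix: str) -> bool:
    """Write a small test object to confirm the current role has putObject access."""
    s3 = boto3.client("s3")
    try:
        s3.put_object(Bucket=bucket, Key=f"{prefix}permissions-check.txt", Body=b"ok")
        return True
    except ClientError as err:
        print(f"putObject failed: {err.response['Error']['Code']}")
        return False


if __name__ == "__main__":
    print("putObject allowed:", check_put_object(BUCKET, PREFIX))
```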
Azure Blob Storage
- Your destination should be an Azure Storage account that supports containers, such as BlobStorage, Storage, or StorageV2.
- At a minimum, the Reader & Data Access role is required for sufficient permissions. The role should apply to the Azure Storage account in which your destination container is located.
- The destination container needs to use an access key for authentication.
- The agent container needs to authenticate to the storage container using a shared key injected as an environment variable (see the sketch after this list).
- If your storage account only allows access from selected networks, you'll need to add the IP addresses the agent connects from to the allowed list.
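To sanity-check the shared-key setup, a sketch like the one below can verify that the key grants write access to the destination container. The storage account name, container name, and environment variable name are placeholders (the actual variable name is whatever you inject into the agent container), and the example uses the azure-storage-blob Python package rather than anything built into Data Loader.

```python
import os

from azure.storage.blob import BlobServiceClient

# Placeholder names -- replace with your storage account, container, and the
# environment variable your agent container uses for the shared key.
ACCOUNT_NAME = "mycdcstorageaccount"
CONTAINER = "cdc-output"
SHARED_KEY = os.environ["AZURE_STORAGE_SHARED_KEY"]


def check_container_write() -> None:
    """Upload a small test blob to confirm the shared key grants write access."""
    service = BlobServiceClient(
        account_url=f"https://{ACCOUNT_NAME}.blob.core.windows.net",
        credential=SHARED_KEY,
    )
    container = service.get_container_client(CONTAINER)
    container.upload_blob(name="permissions-check.txt", data=b"ok", overwrite=True)
    print("Shared key write succeeded.")


if __name__ == "__main__":
    check_container_write()
```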
Google Cloud Storage
- A Google Cloud Storage account. You can sign up with a trial account for free. Go to https://cloud.google.com/storage to sign up.
- A Google username and password for the Google Cloud Storage account used during testing.
- An up-and-running Google Cloud Storage bucket.
- Permissions to create and manage storage in your Google Cloud Storage account. Your Google Cloud Storage user must be able to create a bucket (if one doesn't already exist), add/modify bucket policies, and upload files to the bucket (see the sketch after this list).
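As a rough way to confirm those permissions, the sketch below creates the bucket if it doesn't exist and uploads a small test object. The project ID and bucket name are placeholders, and it assumes the google-cloud-storage Python package is installed and that application default credentials belong to the user you want to test.

```python
from google.cloud import storage

# Placeholder names -- replace with your own project and bucket.
PROJECT_ID = "my-cdc-project"
BUCKET = "my-cdc-destination-bucket"


def check_bucket_write() -> None:
    """Create the bucket if it doesn't exist, then upload a small test object."""
    client = storage.Client(project=PROJECT_ID)
    bucket = client.lookup_bucket(BUCKET)
    if bucket is None:
        bucket = client.create_bucket(BUCKET)
    blob = bucket.blob("permissions-check.txt")
    blob.upload_from_string("ok")
    print("Bucket write succeeded.")


if __name__ == "__main__":
    check_bucket_write()
```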