Custom Kafka Connect Connectors


In this document we will look at how to configure and provide custom connectors for your Instaclustr managed Kafka Connect cluster.


Using an AWS S3 bucket as your custom connector storage

In this example we will use an AWS CloudFormation template to provision the S3 bucket, access policy, IAM user, and user access keys required to configure S3 as your custom connector storage.

  1. Create a CloudFormation template file named instaclustr_custom_connectors with the content below.
  2. Log in to your AWS console and go to https://console.aws.amazon.com/cloudformation
  3. Click the Create Stack dropdown and select the With new resources option.
  4. In the create stack view, select Template is ready for the Prerequisite – Prepare template section.
  5. In the create stack view, select Upload a template file for the Specify template section.
  6. Upload the previously created instaclustr_custom_connectors CloudFormation template file, and enter the resource prefix that will be used to name the created resources.
  7. Follow the rest of the create stack wizard according to the requirements of your account.
  8. When stack creation is complete, the CloudFormation stack output view shows the S3 bucket name and the created user’s access key and secret key, which are required to configure custom connectors with Instaclustr.
  9. Once you’ve set up a Kafka Connect cluster and an S3 bucket, you can upload connectors to your bucket; see Updating custom connectors for more details.
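The steps above reference a template file. A minimal sketch of what instaclustr_custom_connectors might contain is shown below; the resource names and the ResourcePrefix parameter are illustrative assumptions, not Instaclustr’s official template.

```yaml
# Illustrative sketch only -- not Instaclustr's official template.
AWSTemplateFormatVersion: '2010-09-09'
Parameters:
  ResourcePrefix:
    Type: String
    Description: Prefix used to name the created resources
Resources:
  ConnectorBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub '${ResourcePrefix}-custom-connectors'
  ConnectorUser:
    Type: AWS::IAM::User
    Properties:
      UserName: !Sub '${ResourcePrefix}-connector-user'
      Policies:
        - PolicyName: !Sub '${ResourcePrefix}-connector-read'
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action: ['s3:GetObject', 's3:ListBucket']
                Resource:
                  - !GetAtt ConnectorBucket.Arn
                  - !Sub '${ConnectorBucket.Arn}/*'
  ConnectorUserKey:
    Type: AWS::IAM::AccessKey
    Properties:
      UserName: !Ref ConnectorUser
Outputs:
  BucketName:
    Value: !Ref ConnectorBucket
  AccessKeyId:
    Value: !Ref ConnectorUserKey
  SecretAccessKey:
    Value: !GetAtt ConnectorUserKey.SecretAccessKey
```

The Outputs section is what surfaces the bucket name and key pair in the stack output view mentioned in step 8.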

Using Azure storage as your custom connector storage

In order to use Azure storage for your custom connectors, you will need to create a storage account and container to hold your connectors. When creating your Instaclustr Kafka Connect cluster, you can configure it to download these connectors and make them available for use on the cluster. We recommend you create a storage account specifically for this purpose, as it will be accessed via the storage account key.

  1. Go to the Azure portal and navigate to the “Storage Accounts” section.
  2. Click the ‘Add’ button.
  3. Go through the storage account creation screens; the default options will be sufficient.
  4. Navigate to your new storage account, then go to its Containers section.
  5. Create a new container with the default private access level.
  6. Upload your connectors.
  7. Navigate back to your storage account and take note of the storage account access key.
  8. When creating your Instaclustr Kafka Connect cluster, you will need your storage account name, your container name, and your storage account access key. You can update your custom connectors later; see Updating custom connectors for more details.
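The portal steps above can also be sketched with the Azure CLI; the account, container, resource group, and connector file names below are placeholder assumptions.

```
# Illustrative sketch; names are placeholders.
az storage account create --name myconnectorstore --resource-group my-rg --location eastus
az storage container create --name connectors --account-name myconnectorstore
az storage blob upload --account-name myconnectorstore --container-name connectors \
    --name my-connector.jar --file ./my-connector.jar

# Retrieve the storage account access key needed at cluster creation
az storage account keys list --account-name myconnectorstore --resource-group my-rg
```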

Using GCP storage as your custom connector storage

In order to use GCP Storage for your custom connectors, you will need to create a service account and a storage bucket to hold your connectors. 

You can then configure your Instaclustr Kafka Connect cluster, in the creation page or in your Provisioning API request, to download the connectors stored in the defined bucket and make them available for use on the cluster. We recommend you create a service account specifically for this purpose, as it will be accessed via the service account key.

  1. Go to the IAM page in your GCP Console, and click into the Service Accounts section.
  2. Create a service account by clicking the add button.
  3. In the permission section, assign this service account the Storage Object Viewer role.
  4. Once the service account is created, you need to add a key to authenticate with it. From the Edit page of that service account, create a new key in JSON format; this will trigger a download of a JSON file to your local computer. This file is needed during the cluster creation process.
  5. Next, create a storage bucket to put your custom connectors in, if you haven’t got one yet; you can do this on the Storage page.
  6. The service account you just created needs to be added to this bucket so that the service account can read from it.
  7. During the cluster creation process, you need the project_id, private_key_id, private_key, client_email, and client_id from the JSON file of the key you downloaded, as well as the bucket name.
  8. In the Provisioning API request for cluster creation, add the GCP storage details with the keys gcp.private.key.id, gcp.private.key, gcp.client.email, gcp.client.id, gcp.storage.bucket.name, and gcp.project.id in the options object for the Kafka Connect bundle. Below is a sample Kafka Connect cluster creation request body with the GCP storage options included.
  9. You can add more connectors to the bucket later; please refer to Updating custom connectors for more details.
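A sketch of such a request body is below, based only on the option keys named in step 8; the surrounding fields (cluster name, bundle version, placeholder values) are illustrative assumptions about the Provisioning API request shape, not an authoritative example — consult the Provisioning API documentation for the exact format.

```json
{
  "clusterName": "my-kafka-connect-cluster",
  "bundles": [
    {
      "bundle": "KAFKA_CONNECT",
      "options": {
        "gcp.project.id": "<project_id from the key JSON file>",
        "gcp.private.key.id": "<private_key_id from the key JSON file>",
        "gcp.private.key": "<private_key from the key JSON file>",
        "gcp.client.email": "<client_email from the key JSON file>",
        "gcp.client.id": "<client_id from the key JSON file>",
        "gcp.storage.bucket.name": "<your bucket name>"
      }
    }
  ]
}
```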
