2024 © Lenses.io Ltd. Apache, Apache Kafka, Kafka and associated open source project names are trademarks of the Apache Software Foundation.

Developing a connector

This section describes how to contribute a new connector to the Stream Reactor.

SBT, Java 11, and Scala 2.13 are required.

Setting up a new module

The Stream Reactor is built using SBT. Each connector is defined in a submodule in the root project.

  1. Add a new directory called kafka-connect-[your-connector].

  2. Under this, add the standard source path /src/main/
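The two steps above can be sketched as follows. The name kafka-connect-foo is a placeholder; substitute your connector's name.

```shell
# Create the module directory and the standard source path.
mkdir -p kafka-connect-foo/src/main
```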

Package name

Use io.lenses.streamreactor.connector as the parent package. The following convention is common, but each connector is different and may add further sub-packages:

  • config - configuration and settings

  • sink - sink connectors, tasks and writers

  • source - source connectors, tasks and readers
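For a hypothetical connector named foo, the convention above maps onto a directory layout like the following (assuming the usual sbt Scala source root, src/main/scala):

```shell
# Conventional package directories under the module's Scala source root.
# "foo" is a placeholder connector name.
BASE=kafka-connect-foo/src/main/scala/io/lenses/streamreactor/connector/foo
mkdir -p "$BASE/config" "$BASE/sink" "$BASE/source"
```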

Dependencies

Dependencies are declared in project/Dependencies.scala. Add a version for your connector's dependencies as a new field in the version object, then declare the Maven coordinates, for example:

object version {
  // ...
  val azureServiceBusVersion = "7.14.7"
  // ...
}

lazy val azureServiceBus: ModuleID = "com.azure" % "azure-messaging-servicebus" % azureServiceBusVersion

Next, in the Dependencies trait, add a sequence to hold your dependencies:

val kafkaConnectAzureServiceBusDeps: Seq[ModuleID] = Seq(azureServiceBus)

Next, declare the submodule in build.sbt.

  1. Add the project to the subproject list.

  2. Define the dependencies for your module. In this example, kafkaConnectAzureServiceBusDeps holds the dependencies defined earlier.

lazy val subProjects: Seq[Project] = Seq(
  `query-language`,
  common,
  `cloud-common`,
  `aws-s3`,
  `azure-documentdb`,
  `azure-datalake`,
  `azure-servicebus`,
  `azure-storage`,
  cassandra,
  elastic6,
  elastic7,
  ftp,
  `gcp-storage`,
  influxdb,
  jms,
  mongodb,
  mqtt,
  redis,
)

-----

lazy val `azure-servicebus` = (project in file("kafka-connect-azure-servicebus"))
  .dependsOn(common)
  .settings(
    settings ++
      Seq(
        name := "kafka-connect-azure-servicebus",
        description := "Kafka Connect compatible connectors to move data between Kafka and popular data stores",
        libraryDependencies ++= baseDeps ++ kafkaConnectAzureServiceBusDeps,
        publish / skip := true,
        packExcludeJars := Seq(
          "scala-.*\\.jar",
          "zookeeper-.*\\.jar",
        ),
      ),
  )
  .configureAssembly(true)
  .configureTests(baseTestDeps)
  .enablePlugins(PackPlugin)  
