Kafka to Database Sync Application


This application template demonstrates continuous big-data sync from a source to a destination, reading data messages from a configured Kafka topic. Any developer can easily use and extend it to build a fast, fault-tolerant, and scalable big-data sync or retention application that serves the business with continuous data.

Required Properties

The end user must specify values for the following properties.

| Property | Type | Example | Notes |
| --- | --- | --- | --- |
| Csv Parser Schema | String | `{"separator": "\|", "quoteChar": "\"", "fields": [{"name": "accountNumber", "type": "Integer"}, {"name": "name", "type": "String"}, {"name": "amount", "type": "Integer"}]}` | JSON representing the schema to be used by the CSV parser |
| Jdbc Output Database Url | String | `jdbc:postgresql://dest1.corp1.com:5432/testdb` | Connection URL for the destination database |
| Jdbc Output Store Password | String | `postgres` | Password for the destination database |
| Jdbc Output Store Username | String | `postgres` | Username for the destination database |
| Jdbc Output Table Name | String | `test_event_output_table` | Destination table name |
| Kafka Broker List | String | `localhost:9092` or `node1.corp1.com:9092,node2.corp1.com:9092` | Comma-separated list of Kafka brokers |
| Kafka Topic Name | String | `transactions` | Topic name on Kafka |
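To make the Csv Parser Schema concrete, the sketch below shows how one delimited record maps to typed fields under the example schema above (separator `|`, fields `accountNumber:Integer`, `name:String`, `amount:Integer`). This is an illustrative stand-in, not the template's actual parser: the class and method names are hypothetical, and `quoteChar` handling is omitted for brevity.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CsvSchemaSketch {
    // Field names and types taken from the example schema above.
    static final String[] FIELD_NAMES = {"accountNumber", "name", "amount"};
    static final String[] FIELD_TYPES = {"Integer", "String", "Integer"};

    // Split one delimited line and convert each token per its schema type.
    static Map<String, Object> parse(String line) {
        String[] tokens = line.split("\\|", -1); // "|" is a regex metacharacter
        Map<String, Object> record = new LinkedHashMap<>();
        for (int i = 0; i < FIELD_NAMES.length; i++) {
            Object value = FIELD_TYPES[i].equals("Integer")
                    ? Integer.valueOf(tokens[i])
                    : tokens[i];
            record.put(FIELD_NAMES[i], value);
        }
        return record;
    }

    public static void main(String[] args) {
        System.out.println(parse("1001|alice|500"));
        // {accountNumber=1001, name=alice, amount=500}
    }
}
```

A record that fails conversion (for example, a non-numeric token in an `Integer` field) would typically be routed to an error port rather than written to the destination table.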

Advanced Properties (optional)

| Property | Default | Type | Example | Notes |
| --- | --- | --- | --- | --- |
| Initial Offset Of Topic For Kafka Consumer | LATEST | String | | Whether to read from the beginning of the topic or from the current offset |
| Jdbc Output Database Driver | org.postgresql.Driver | String | | FQCN of the JDBC driver class for the destination database |
| Transform Expression Info | `{"accountNumber":"{$.accountNumber}", "name":"{$.name}.toUpperCase()", "amount":"{$.amount}"}` | String | | JSON map whose keys are output fields and whose values are the expressions used to compute them |
| Transform Output Field Info | `{"accountNumber":"INTEGER", "name":"STRING", "amount":"INTEGER"}` | String | | JSON map whose keys are output fields and whose values are the fields' data types |
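As a rough illustration of what the default Transform Expression Info computes, the sketch below applies the example expressions to one parsed record. The class and method names are hypothetical, not the template's operator API, and the expressions are hard-coded here for clarity; the real transform operator evaluates them dynamically from the JSON map.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class TransformSketch {
    // Apply the example transform: copy accountNumber and amount
    // unchanged, and upper-case name.
    static Map<String, Object> transform(Map<String, Object> in) {
        Map<String, Object> out = new LinkedHashMap<>();
        out.put("accountNumber", in.get("accountNumber"));        // {$.accountNumber}
        out.put("name", ((String) in.get("name")).toUpperCase()); // {$.name}.toUpperCase()
        out.put("amount", in.get("amount"));                      // {$.amount}
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> record = new LinkedHashMap<>();
        record.put("accountNumber", 1001);
        record.put("name", "alice");
        record.put("amount", 500);
        System.out.println(transform(record));
        // {accountNumber=1001, name=ALICE, amount=500}
    }
}
```

The Transform Output Field Info map would then declare the types of these output fields (`INTEGER`, `STRING`, `INTEGER`) so the downstream JDBC output operator can bind them to the destination table's columns.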