I’m using the MongoDB Kafka sink connector (via Kafka Connect) to read a Kafka topic and insert the records into a MongoDB collection.
Here is an example of my connector configuration:
{
  "name": "order-mongodb-sink",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
    "database": "Trade",
    "collection": "Order",
    "topics": "Order",
    "connection.uri": "{{mongodb-uri}}",
    "mongo.errors.tolerance": "all",
    "mongo.errors.log.enable": "true",
    "errors.log.include.messages": "true",
    "writemodel.strategy": "com.mongodb.kafka.connect.sink.writemodel.strategy.ReplaceOneBusinessKeyStrategy",
    "document.id.strategy": "com.mongodb.kafka.connect.sink.processor.id.strategy.PartialValueStrategy",
    "document.id.strategy.overwrite.existing": "true",
    "document.id.strategy.partial.value.projection.type": "allowlist",
    "document.id.strategy.partial.value.projection.list": "date,orderId"
  }
}
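As I understand it (this is my reading of the docs, not verified code), PartialValueStrategy with the allowlist above builds the business key from the date and orderId fields of the record value, and ReplaceOneBusinessKeyStrategy then issues an upserted replaceOne filtered on that key. Roughly like this Java-driver sketch, where the sample document is made up:

    import com.mongodb.client.MongoClients;
    import com.mongodb.client.MongoCollection;
    import com.mongodb.client.model.Filters;
    import com.mongodb.client.model.ReplaceOptions;
    import org.bson.Document;

    public class BusinessKeyUpsertSketch {
        public static void main(String[] args) {
            MongoCollection<Document> orders = MongoClients
                    .create(System.getenv("MONGODB_URI"))
                    .getDatabase("Trade")
                    .getCollection("Order");

            // Stand-in for a parsed Kafka record value (made-up sample).
            Document value = new Document("date", "2024-01-31")
                    .append("orderId", 42)
                    .append("quantity", 10);

            // Filter on the business key (date + orderId); upsert so the first
            // event for a key inserts and later events replace the document.
            orders.replaceOne(
                    Filters.and(
                            Filters.eq("date", value.get("date")),
                            Filters.eq("orderId", value.get("orderId"))),
                    value,
                    new ReplaceOptions().upsert(true));
        }
    }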
After the document is inserted into the collection, I would also like to insert the same document into another collection, with some modifications applied to it.
My initial idea was to write a Kafka Streams application that consumes this topic, transforms each event, and writes the result to a new topic, with a second sink connector similar to the one above reading from that topic (something like the sketch below). It sounds complicated.
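For what it’s worth, the Streams part of that idea is small. A minimal sketch, assuming string serdes, where the Order.Enriched output topic is a hypothetical name and the mapValues body stands in for the real transformation:

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class OrderTransformApp {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-transform");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> orders = builder.stream("Order");
            orders
                // Placeholder: the real code would parse the JSON and reshape it.
                .mapValues(value -> value.toUpperCase())
                .to("Order.Enriched"); // hypothetical topic for the second sink connector

            new KafkaStreams(builder.build(), props).start();
        }
    }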
I was wondering whether I could instead use a MongoDB feature, such as a trigger, to do this directly.
I came across the post-processor feature of the MongoDB Kafka sink connector, but it doesn’t seem to allow writing to a different collection.
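If Atlas database triggers aren’t an option (they only exist on Atlas), a change stream listener on the source collection would get the same effect without going back through Kafka. A minimal Java sketch, assuming a replica set (change streams require one), where the OrderView collection name and the modification itself are placeholders:

    import com.mongodb.client.MongoClients;
    import com.mongodb.client.MongoCollection;
    import com.mongodb.client.MongoDatabase;
    import com.mongodb.client.model.changestream.ChangeStreamDocument;
    import com.mongodb.client.model.changestream.FullDocument;
    import org.bson.Document;

    public class OrderCopyListener {
        public static void main(String[] args) {
            MongoDatabase trade = MongoClients
                    .create(System.getenv("MONGODB_URI"))
                    .getDatabase("Trade");
            MongoCollection<Document> orders = trade.getCollection("Order");
            MongoCollection<Document> orderView = trade.getCollection("OrderView");

            // UPDATE_LOOKUP so replace/update events carry the full document.
            for (ChangeStreamDocument<Document> event :
                    orders.watch().fullDocument(FullDocument.UPDATE_LOOKUP)) {
                Document doc = event.getFullDocument();
                if (doc == null) {
                    continue; // e.g. delete events have no full document
                }
                Document modified = new Document(doc);
                modified.remove("_id"); // let the second collection assign its own id
                modified.append("copiedAt", new java.util.Date()); // placeholder change
                orderView.insertOne(modified);
            }
        }
    }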