We are using a Kafka pub/sub architecture in our code, with 3 data pipelines. In the 1st pipeline we have to stream data from PostgreSQL to Kafka. In the 2nd pipeline, Kafka is the input to a Python module; that module contains 3 separate modules that run in parallel, each doing some calculation and returning its own JSON. In the 3rd pipeline we have to send the JSON output of the 2nd pipeline to PostgreSQL as nested JSON (all three modules' JSON should be combined here).
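The fan-out/fan-in step of pipelines 2 and 3 can be sketched as below. This is a minimal sketch only: the three calculation functions are hypothetical placeholders, and the actual Kafka consumer (pipeline 1/2 input) and Postgres write (pipeline 3 output) are stubbed out so the combining logic stands alone.

```python
import json
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for the three parallel calculation modules of
# pipeline 2; each takes a decoded Kafka record (a dict) and returns
# its own JSON-serializable result.
def module_a(record):
    return {"sum": record["x"] + record["y"]}

def module_b(record):
    return {"product": record["x"] * record["y"]}

def module_c(record):
    return {"flag": record["x"] > record["y"]}

def process_record(record):
    """Run the three modules in parallel and nest their outputs
    under one top-level JSON object for pipeline 3."""
    with ThreadPoolExecutor(max_workers=3) as pool:
        futures = {
            "module_a": pool.submit(module_a, record),
            "module_b": pool.submit(module_b, record),
            "module_c": pool.submit(module_c, record),
        }
        combined = {name: f.result() for name, f in futures.items()}
    # In pipeline 3, this string would be written to a Postgres
    # jsonb column (e.g. via psycopg2) instead of returned.
    return json.dumps(combined)

print(process_record({"x": 3, "y": 2}))
```

In a real deployment the `record` would come from a Kafka consumer loop, and each consumed message would produce one combined nested-JSON row in Postgres.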
Hi,
I am a certified big data developer and have designed and developed many enterprise-level applications using Spark and the entire Hadoop ecosystem.
I have also worked on the same kind of requirements before with Kafka and a Postgres DB, and I know how to handle everything you describe.
Please let's connect and discuss your requirements in more detail.
Thank you for your posting.
I have already reviewed your project details, and I am pleased to say I have done the same type of work before.
I can provide you a free demo.
I have a team with 15 years of expertise across different software, web, and mobile solutions.
Looking forward to your reply.
Thanks and Regards,