We scrape thousands of data points per second in Node.js and POST them to a Node.js server.
I want a better architecture on my server: it should handle several hundred requests properly and supply data to mobile clients as needed.
DataGrabber app:-
1. It receives those curl/POST requests from the API and inserts them into MongoDB or MySQL at runtime using Node.js scripts. Is Redis a better option here?
We receive more than a thousand requests a second. Which mechanism handles that with the best architectural approach? Let me know your thoughts.
Open to any other solutions.
2. [login to view URL] is required on the DataGrabber app: when a new row is inserted, we emit specific data, say the most recent 50 rows, to some other client app.
From your specifications I feel a hybrid approach of Redis + MongoDB fits. Redis is insanely fast, given that it is an in-memory [login to view URL] it supports pub/sub well, which suits the second [login to view URL] but Redis is not so well suited for warehousing the data; for that we can use the [login to view URL] the hot data will be operated on in Redis, and older data, which rarely needs to be accessed, will be stored in MongoDB.
$1,052 USD in 3 days
0.0 (0 reviews)
5 freelancers are bidding on average $699 USD for this job
Greetings!!!
We are glad to bid on your project and would be happy to get a chance to work on it.
Please see PMB for details.
Thanks
Sonali@TeamUnipixel