Hi, I'm looking for a company or ...it on WordPress. I need an automated web crawl set up on the website. It needs to crawl images, prices, content, and sale notifications in real time every day. I also need a filter option on top of it. I know ShopStyle is a custom-built website. I need an experienced IT technician who can implement this on my WordPress website.
...and the hyperlink of the API/website to crawl should be read from a database table. The solution should be generalised so that websites with similar patterns are read in the same way. Ex: 1. Standard APIs (samples): [login to view URL] [login to view URL] [login to view URL] 2. Website data crawl (samples): https://angrypool
My website is not indexed by Google. I want someone who can make some changes, such as in [login to view URL] and the sitemap, to get the site indexed fast.
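For requests like this one, the sitemap side is mechanical enough to sketch. Below is a minimal Python sketch that builds a sitemaps.org-style XML sitemap from a list of page URLs; the URLs are placeholders, and the generated file would still need to be referenced from robots.txt (a `Sitemap: https://example.com/sitemap.xml` line) and submitted via Google Search Console to actually speed up indexing.

```python
from xml.sax.saxutils import escape


def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from page URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )
```

On a WordPress site this is normally handled by an SEO plugin rather than hand-rolled code, but the output shape is the same.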
We need a custom dynamic crawl tool for an online car-marketplace portal. We need car data such as make, model, features, and price directly from the websites of the car dealers / directly from the merchants. Naturally, the URLs (about 10,000) have different structures/formats, each with its own semantics (e.g. category names). The data should be normalized. SO
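The normalization requirement is the crux here: each dealer site uses its own names for the same attributes. A minimal Python sketch, assuming a hand-maintained per-site field map (the site keys, field names, and canonical schema below are hypothetical, for illustration only):

```python
# Per-site field maps: each dealer site exposes its own names for the
# same attributes. Both the site keys and the canonical schema
# (make/model/price) are assumptions for this sketch.
FIELD_MAPS = {
    "dealer_a": {"marke": "make", "modell": "model", "preis": "price"},
    "dealer_b": {"brand": "make", "car_model": "model", "cost": "price"},
}


def normalize(site, raw):
    """Map one site's raw record onto the canonical schema,
    dropping fields the map does not know about."""
    mapping = FIELD_MAPS[site]
    out = {}
    for key, value in raw.items():
        canon = mapping.get(key)
        if canon:
            out[canon] = value
    return out
```

With ~10,000 differently structured URLs, the per-site maps are the part that scales badly; grouping sites by shared template (as the posting in line 2 also asks for) keeps the table manageable.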
I'm looking for a partner to create a new e-commerce website for the automotive world in Europe (modern and classic cars). It will be built on WooCommerce and translated into French and English at a minimum. TAKE A LOOK AT [login to view URL]. Responsive web, full design, and paired with a crawler robot for pulling data from different automotive
...links for 90% of the sites and crawl the remaining sites. (Many input files; the format always remains the same, but the data/names will be different.) • All of the data is in a table on the site • All output formats and documentation are written • Basic features such as enabling/disabling sites, custom crawl delay, pause, play, skip, on-screen
...features: - Admin/user system - Admin needs to be able to add users - Users need to be able to add search requests/queries (a form with 5 fields) - These requests dictate the crawl queries - 5 different websites need to be crawled (it currently crawls only 1) - The current code works but needs refactoring (the current project is very small, so this won't
Looking for an experienced big-data specialist to use the Common Crawl data set to find websites that offer tours and travel to [login to view URL] The successful candidate should have experience with the Common Crawl data set and with processing this data using MapReduce running on AWS EMR. You should be able to do this as cheaply
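The processing pattern being asked for is a classic MapReduce filter-and-count. A minimal pure-Python sketch of the logic only (on EMR this would actually run via Hadoop Streaming or Spark over WARC files; the keyword list and the `(domain, text)` record shape are simplified assumptions):

```python
from collections import Counter

# Illustrative keyword filter; a real job would use a richer classifier.
TOUR_KEYWORDS = ("tour", "travel", "excursion")


def map_record(record):
    """Map step: emit (domain, 1) when a page's text mentions a tour keyword.
    `record` stands in for one Common Crawl entry as a (domain, text) pair."""
    domain, text = record
    lower = text.lower()
    if any(k in lower for k in TOUR_KEYWORDS):
        yield (domain, 1)


def reduce_counts(pairs):
    """Reduce step: sum matching pages per domain."""
    counts = Counter()
    for domain, n in pairs:
        counts[domain] += n
    return dict(counts)


def run(records):
    pairs = [p for r in records for p in map_record(r)]
    return reduce_counts(pairs)
```

Running "as cheaply" as possible on EMR usually means spot instances and restricting the job to one crawl segment rather than the full corpus.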
Consolidating these WP sites: [login to view URL], [login to view URL], [login to view URL] ... into these domains: [login to view URL], [login to view URL], [login to view URL] 1. Backup 2. Migrate 3. Set up 301 redirects (regex) 4. Site crawl to ensure paths are working with no 404s
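Step 4, the post-migration crawl, can be sketched as a small checker. The Python sketch below injects the HTTP fetcher as a parameter so it can be tested against a stub; in a real run `fetch_status` would wrap `urllib.request` or `requests`, follow the 301s, and report the final status of each old path:

```python
def check_redirects(paths, fetch_status):
    """Crawl a list of pre-migration paths and report broken ones.

    `fetch_status` is a callable mapping a path to its final HTTP status
    code after redirects; injecting it keeps this testable offline.
    """
    broken = []
    for path in paths:
        status = fetch_status(path)
        if status in (404, 410):
            broken.append((path, status))
    return broken
```

The path list would normally come from the old sites' sitemaps or server logs, so the crawl covers the URLs that actually received traffic.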
...freelancer to build us a PHP script like this: - Search for a predefined search term on Google (result: 4,100 hits). - For every entry: + Go to the result's site + Crawl 4 pieces of information + Create an XLS with an entry for every result, including the crawled information (the structure of the data will be presented when the job has been assigned). That's
I am looking for a PHP expert who can solve an issue in PHP cURL. It's a simple PHP cURL script that crawls a given URL and gets the title, description, etc. from that URL. If a URL has Cloudflare enabled, it returns "access denied". Bid only if you can solve this.
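Setting the Cloudflare issue aside (getting past its challenge page generally requires a headless browser or the site's sanctioned API, not plain cURL), the title/description extraction itself is routine. A Python sketch of just that parsing step, using only the standard library:

```python
from html.parser import HTMLParser


class MetaExtractor(HTMLParser):
    """Pull the <title> text and the meta description out of an HTML page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


def extract_meta(html):
    """Return (title, meta description) for an HTML document string."""
    p = MetaExtractor()
    p.feed(html)
    return p.title.strip(), p.description
```

In PHP the equivalent would typically use `DOMDocument` on the cURL response body; the parsing logic is the same either way.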
A PowerShell script is needed which anonymizes IPs, FQDNs, and usernames and puts the real values in a new table. It should run in a folder and crawl all *.txt and *.log files, search for IPs, FQDNs, and usernames, write the original value into a new file with count, hash, and real value, and replace the real value with the hash. E.g. source: "https://*.[login to view URL],https://*
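The requested script is PowerShell, but the core logic is language-agnostic. A Python sketch of the anonymization step (the regexes are illustrative simplifications; a real FQDN/username matcher would need tightening against the actual log format):

```python
import hashlib
import re
from collections import Counter

# Illustrative patterns only: IPv4, a simple dotted-hostname shape,
# and "user=NAME" tokens. Real logs will need format-specific regexes.
PATTERNS = {
    "ip": re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}\b"),
    "fqdn": re.compile(r"\b[a-z0-9-]+(?:\.[a-z0-9-]+)+\b", re.I),
    "user": re.compile(r"(?<=user=)\w+"),
}


def anonymize(text):
    """Replace each sensitive value with a short hash and return the
    anonymized text plus a mapping table: hash -> (real value, count)."""
    counts = Counter()

    def repl(match):
        value = match.group(0)
        counts[value] += 1
        return hashlib.sha256(value.encode()).hexdigest()[:12]

    for pattern in PATTERNS.values():
        text = pattern.sub(repl, text)

    table = {
        hashlib.sha256(v.encode()).hexdigest()[:12]: (v, n)
        for v, n in counts.items()
    }
    return text, table
```

Running over a folder would wrap this in a loop over `*.txt` and `*.log` files and dump `table` to the side file; because the hash is deterministic, the same value maps to the same token across files, which keeps the anonymized logs correlatable.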
...Write code to automatically update the database (sometimes the data is updated, edited, or deleted on the source website from which it is scraped, so these changes should be reflected in our MySQL database after every crawl) 7. Do not have duplicate data in the database 8. Intelligently rotate IPs to avoid getting banned 9. Run the crawler