We are looking for a Python developer with web-scraping experience on a full-time basis. Applicants must be based in Delhi-NCR.
Looking for a simple scraper; this should take 1-2 hours max for anyone with Scrapy or a similar library. The URL [login to view URL] returns the latest design uploads on Dribbble for the keyword "Crypto". I want a scraper that pulls down the latest ones, running perhaps once every 3 days on DigitalOcean. It creates a new card in
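The extraction step of a post like this one can be sketched with the standard library alone. The `shot-thumbnail-link` anchor class below is a hypothetical placeholder; the real Dribbble markup would have to be inspected before use, and the "every 3 days" schedule would simply be a cron entry around a script like this.

```python
# Hedged sketch: collect links of "shot" cards from a listing page's HTML.
# The CSS class name and page structure are assumptions, not Dribbble's
# actual markup; verify against the live page before relying on this.
from html.parser import HTMLParser


class ShotParser(HTMLParser):
    """Collects hrefs of anchors whose class contains 'shot-thumbnail-link'."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a" and "shot-thumbnail-link" in a.get("class", ""):
            self.links.append(a.get("href"))


def extract_shot_links(html: str) -> list:
    """Return the hrefs of all shot-card anchors found in the HTML."""
    parser = ShotParser()
    parser.feed(html)
    return parser.links
```

In practice the HTML would come from an HTTP fetch (or a Scrapy spider, as the post suggests), and each extracted link would drive the "create a new card" step.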
[login to view URL] I need to be able to input a custom URL, and the tool needs to cycle-scrape the top section using the "previous" button. The information contained in the "+" section also needs to be included. The results must be exported to an Excel spreadsheet with some basic formatting and calculations. An example from a similar project is uploaded.
I want to build a scraper in Python to extract a list of companies with the details of each company. The details I need are: Company Name, Website, Address, Phone, Revenue Range, Revenue, Industry and SIC Codes. Max budget is INR 4000.
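Whatever the data source turns out to be, the output side of this post is concrete: one row per company with exactly those columns. A minimal sketch of that serialisation step, assuming the scraped records arrive as dicts:

```python
import csv
import io

# The exact column order requested in the post.
FIELDS = ["Company Name", "Website", "Address", "Phone",
          "Revenue Range", "Revenue", "Industry", "SIC Codes"]


def companies_to_csv(companies: list) -> str:
    """Serialise a list of company dicts to a CSV string with the
    requested column order; fields missing from a record are left blank."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS, restval="")
    writer.writeheader()
    writer.writerows(companies)
    return buf.getvalue()
```

`restval=""` keeps the column layout stable even when the source page omits a field, which matters if the client opens the file in Excel.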
We need a logo for our new secure file transfer service, https://sft.mbmltd.co.uk. It needs to depict security, data, movement, global reach, ease of use, etc. We’d like it to pick up on the yellow colour used on our site and, ideally, be square.
Environment: PHP with PHPDesktop and SQLite. I am looking for a solution where I can install my PHP-based app on a client's PC and ensure that no one else can copy it and install it on another PC. For this, we can read the motherboard ID, merge it with a key from a remote database server, generate a new key, encrypt it, and save it in the registry or the database.
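The key-derivation scheme the post describes can be sketched as follows. This is an illustration in Python (the production code would be PHP), and the function names and the idea of using an HMAC over the motherboard ID are my assumptions, not the poster's design; the scheme only deters casual copying, since a determined attacker can patch the check.

```python
import hashlib
import hmac


def derive_activation_key(motherboard_id: str, server_secret: str) -> str:
    """Merge the machine's motherboard ID with a secret issued by the
    remote licence server, producing an installation-specific key that
    can be stored locally (registry or SQLite) and re-checked at startup."""
    return hmac.new(server_secret.encode(), motherboard_id.encode(),
                    hashlib.sha256).hexdigest()


def is_licence_valid(stored_key: str, motherboard_id: str,
                     server_secret: str) -> bool:
    """Recompute the key for this machine and compare in constant time."""
    expected = derive_activation_key(motherboard_id, server_secret)
    return hmac.compare_digest(stored_key, expected)
```

Because the key depends on the motherboard ID, a copied installation on different hardware fails `is_licence_valid` at startup.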
Scraper needed ASAP! I need an Instagram scraper; please check the Excel file. The FIRST spreadsheet, "SCRAPER DESIGN", shows how the script/scraper might look visually. The SECOND spreadsheet, "OUTPUT EXAMPLE", shows an example of the generated CSV report with the selected variables. THE INPUTS: * Instagram usernames (I may enter as many accounts as
I need someone to add a scraper for a manga page to my CMS; I already have other scrapers, but I need one for a particular website. I use the Manga Reader CMS created by cyberziko. FEATURES: Crawler/scraper engine: automatically creates chapters with images by downloading them from other manga websites (sources: mangapanda, mangafox...). I want to add https://nhentai
I am working on an Instagram project and I need to collect good hashtag lists for my accounts. I need a script where I can input Instagram accounts/handles/usernames, and the script will scrape all posts (you specify how many of the latest posts) and grab the hashtags posted in the captions across all accounts. I will probably have to supply proxies so the script can scrape many accounts without getting banned...
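Once the captions are fetched (the fetching itself, with proxy rotation, is the hard part and is not shown), the hashtag-collection step is straightforward. A minimal sketch, assuming captions arrive as plain strings:

```python
import re
from collections import Counter

# A word character run after '#'; this is a simplifying assumption and
# does not cover every hashtag Unicode edge case Instagram allows.
HASHTAG_RE = re.compile(r"#(\w+)")


def hashtags_from_captions(captions: list) -> Counter:
    """Count hashtag frequency (case-insensitive) across post captions,
    so the most-used tags across all scraped accounts rank first."""
    counts = Counter()
    for caption in captions:
        counts.update(tag.lower() for tag in HASHTAG_RE.findall(caption))
    return counts
```

`Counter.most_common(n)` on the result gives the "good hashtag list" the post asks for, ordered by how often the tags actually appear.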
I made a start page for a project, [login to view URL], with calls to some MySQL databases, and as I am a poor programmer of PHP + CSS + AJAX, I have been copying and pasting various snippets until it works. I need a professional to turn this page into a safe, professional layout. I need to present this landing page to an American company, and if it works out
Hi, I need someone who has experience scraping ZoomInfo, with their own ZoomInfo account. - I pay a fixed amount of $200 for 1 million records (NO less, NO more). - The milestone will be created in two installments: $100 for every 500K records. - I will supply the titles I need results for. NOTE: When you bid, please write "I am agree with Rate". Thanks!
I need a super Mikrotik RouterOS EXPERT, NO cowboys, to investigate loss of pings and slow speeds and to secure our Mikrotik router. We are running the latest version of RouterOS on a VPS (100 GB SSD, 16 GB RAM). All worked PERFECTLY for 8 months until the beginning of August, when our customers began complaining of extremely slow speeds and loss of pings
I need a scraper tool (either a script that can be run on a local computer or a web application) that will be able to do the following: 1) The user can import keywords in bulk, line by line (from a text file or pasted directly into the software). 2) The tool must take each keyword and run a Google query with "allintitle:keyword".
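Steps 1 and 2 of the post can be sketched with the standard library. The URL format is the public Google search endpoint; actually fetching and parsing result counts at scale runs into Google's anti-bot measures and terms of service, which any bidder would need to address separately.

```python
from urllib.parse import quote_plus


def read_keywords(text: str) -> list:
    """Step 1: one keyword per line; blank lines are skipped."""
    return [line.strip() for line in text.splitlines() if line.strip()]


def allintitle_query_url(keyword: str) -> str:
    """Step 2: build the Google search URL for an allintitle: query
    on a single keyword, percent-encoding it for safety."""
    return "https://www.google.com/search?q=" + quote_plus(f"allintitle:{keyword}")
```

Running `allintitle_query_url` over `read_keywords(...)` yields one query URL per keyword, which the rest of the tool would fetch and parse.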