
Simple Traffic Simulating/Faking Program (Google Analytics Detected Traffic)

$100-250 USD

Cancelled
Posted over 11 years ago

Paid on delivery
I need a fairly simple program written to search in Google, then simulate website traffic to a target URL. The program should run on a Windows platform. It should use or simulate an actual internal browser, so the traffic is registered in Google Analytics. It should use a list of proxies. (See the detailed requirements below for a full description of operation.)

## Deliverables

If possible, I should be able to enter a list of user agents for the program to rotate through, passing the user-agent data through to the site being visited so as to simulate multiple browsers and platforms being used. (Here is a site with many different user/browser agent formats listed as examples: [login to view URL])

The program should also be able to use a list of proxies, deleting any proxies from the list that fail.

The program should simulate real traffic by running an actual Google search for each keyword entered in one of the program's fields. It will search in Google, drilling down as many pages as necessary until it finds a page from the target URL, then click on that search result and go to the target page. For example, in one data entry area you could enter a list of keyword/URL pairs (separated by a pipe "|") such as:

Blue widgets|<[login to view URL]>
Yellow widget|<[login to view URL]>
Right hand widget|<[login to view URL]>/[login to view URL]
Widget on sale|<[login to view URL]>/[login to view URL]

The program would then run beginning with "blue widgets", searching for it in Google; if it finds a match for the target URL such as <[login to view URL]> on, say, page 6 of the search results, it would click on that link and go to the page. The program will drill 50 pages deep into the search results, and if no match is found it will move on to the next keyword.

I would then like a data entry area where you can specify minimum and maximum values, in seconds, for how long the browser agent stays on the page. For example:

Min = 15 sec
Max = 120 sec

In this case the program would pick a random time between 15 seconds and 2 minutes to stay on the page it reached from the search results.

After the random time expires, I would like the ability to tell the program to either click on a random link on the page or click on a specific link. For example, there could be a check box labeled "Click on random link" and a check box labeled "Click on specific link". If "Click on specific link" is checked, the program will look for an additional URL in the original list, separated by another pipe. Example:

Blue widgets|<[login to view URL]>|<[login to view URL]>
Yellow widget|<[login to view URL]>|<[login to view URL]>
Right hand widget|<[login to view URL]>/[login to view URL]|<[login to view URL]>
Widget on sale|<[login to view URL]>/[login to view URL]|<[login to view URL]>

(If no additional URL is found after the pipe, it just picks a random URL.)
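As a rough illustration of the data handling this implies (the config parsing and the self-pruning proxy list, not the browser automation itself), here is a minimal Python sketch. All names here (SearchJob, parse_jobs, ProxyPool) are hypothetical, invented for illustration; they are not part of the spec.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class SearchJob:
    """One config line: Keyword|Target search-result URL[|Specific click URL]."""
    keyword: str
    target_url: str
    click_url: Optional[str] = None  # absent -> "click a random link", per the spec


def parse_jobs(text: str) -> list[SearchJob]:
    """Parse the pipe-delimited keyword/URL list from the data entry area."""
    jobs: list[SearchJob] = []
    for raw in text.splitlines():
        line = raw.strip()
        if not line:
            continue
        parts = [p.strip() for p in line.split("|")]
        if len(parts) < 2:
            continue  # malformed entry: the spec says log the error and move on
        jobs.append(SearchJob(parts[0], parts[1], parts[2] if len(parts) > 2 else None))
    return jobs


class ProxyPool:
    """Rotates through proxies; failed ones are deleted, per the requirements."""

    def __init__(self, proxies: list[str]):
        self._proxies = list(proxies)
        self._index = 0

    def current(self) -> str:
        if not self._proxies:
            raise RuntimeError("proxy list exhausted")
        return self._proxies[self._index % len(self._proxies)]

    def advance(self) -> None:
        """Move to the next proxy before the next keyword search."""
        self._index += 1

    def mark_failed(self) -> None:
        """Drop the current proxy from the list so it is never retried."""
        del self._proxies[self._index % len(self._proxies)]
```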
Example: if the program searched for "blue widgets" in Google, clicked on the search result, and landed on <[login to view URL]>, and it was set to wait between 15 and 120 seconds, then after the time was up it would click on a link on the page that goes to the home page, <[login to view URL]>.

The program would then wait on that second page for a random amount of time between two specified min/max values, and then click on a random link on that page, taking it to a third page. (No "specific page" option is necessary for the third page.) Once on that third page, the program would wait again for a random delay before clearing cookies, changing proxies, changing user agents, and moving on to the next keyword in the list to start the process all over again.

So the setup or data entry options in the program's user interface would include:

- List of browser agents to simulate
- List of proxies to use
- List of keyword/URL combos in the following format: Keyword|Target Search Result URL|Specific Click-on URL
- Min time on page 1 in seconds
- Max time on page 1 in seconds
- Check box for "Click on random link"
- Check box for "Click on specific link"
- Min time on page 2 in seconds
- Max time on page 2 in seconds
- Check box for "Click on random link page 2"
- Delay before next search, min in seconds (just stays on the third page)
- Delay before next search, max in seconds

The program should detect and log any errors and simply move on to the next keyword search. It should also clear cookies, change proxies, and change user agents before moving on to the next search.

So a full run of one cycle of the program would look like this. Setup:

List of browser agents to simulate:
Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/[login to view URL] Safari/535.2
Mozilla/5.0 (Windows; U; WinNT4.0; en-US; rv:1.7.5) Gecko/20041107 Firefox/1.0

List of proxies to use:
[login to view URL]
[login to view URL]

List of keyword/URL combos:
Hotels in Detroit|[login to view URL]|<[login to view URL]>
Hotels in Chicago|[login to view URL]|[login to view URL]

Min time on page 1 in seconds: 10
Max time on page 1 in seconds: 60
Check box for "Click on random link": Unchecked
Check box for "Click on specific link": Checked
Min time on page 2 in seconds: 15
Max time on page 2 in seconds: 30
Check box for "Click on random link page 2": Checked
Delay before next search, min in seconds: 30
Delay before next search, max in seconds: 120

With this setup, a full run of the program would do the following:

1. Use browser agent: Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/[login to view URL] Safari/535.2
2. Use proxy: [login to view URL]
3. Search in Google for: Hotels in Detroit
4. Drill down to page 2 and click on the 5th search result (or wherever it finds the target URL [login to view URL])
5. Wait on the page a random time between 10 and 60 seconds
6. Click on the link to: <[login to view URL]>
7. Wait on that page a random time between 15 and 30 seconds
8. Click on any random link found on the page
9. Wait on that page a random time between 30 and 120 seconds
10. Begin again: clear cookies, switch the browser agent to Mozilla/5.0 (Windows; U; WinNT4.0; en-US; rv:1.7.5) Gecko/20041107 Firefox/1.0, switch the proxy to [login to view URL], search in Google for "Hotels in Chicago", and repeat the entire process with the new target and specific-click URLs.

That is it!
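The min/max dwell-time settings amount to a uniformly random delay between the two bounds. A minimal sketch, again assuming Python; the DwellSettings name and its fields are hypothetical:

```python
import random
import time
from dataclasses import dataclass


@dataclass
class DwellSettings:
    """Min/max dwell time for one page, e.g. page 1 = (10, 60) in the example run."""
    min_seconds: float
    max_seconds: float

    def wait(self) -> float:
        """Sleep for a uniformly random duration in [min_seconds, max_seconds]."""
        delay = random.uniform(self.min_seconds, self.max_seconds)
        time.sleep(delay)
        return delay


# Example: the "page 1" setting from the sample configuration above.
page1 = DwellSettings(min_seconds=10, max_seconds=60)
# elapsed = page1.wait()  # blocks for 10-60 s and returns the actual delay
```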
Hopefully I've explained the program, its requirements, and its function in enough detail for you to understand its desired operation. I know the explanation was long, but if you are a decent programmer it should be a fairly quick and easy program to write. Let me know if you have any questions.
Project ID: 2773831

About the project

Remote project
Active 12 yrs ago


About the client

Kansas City, United States
Member since Jun 10, 2010

