I have a file called "[login to view URL]". This is attached.
This file lets you crawl a website.
However, a site like Google will only allow a limited number of requests before temporarily blocking you from crawling.
I need the file to use proxy servers (IP addresses) selected at random, so Google cannot tell that the same client is crawling it again and again.
This is how the file is used to crawl a Google search result:
<?php
// Load the crawler library (the attached file).
include("[login to view URL]");

// Read the search term from the query string and URL-encode it
// so spaces and special characters do not break the request.
$searchterm = urlencode($_GET["searchterm"]);

// Build the search URL and fetch the result page.
$findresult = "[login to view URL]$searchterm";
$resultfound = file_get_html($findresult);
echo $resultfound;
?>
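One way the random-proxy requirement could be approached is a sketch like the following, which picks a proxy at random and routes the request through it with a PHP stream context. This is not the attached file's actual code: the proxy addresses are placeholder assumptions, and `pickRandomProxy`/`fetchViaProxy` are hypothetical helper names.

```php
<?php
// Hypothetical proxy pool -- replace with real proxy IPs and ports.
$proxies = [
    "tcp://203.0.113.10:8080",
    "tcp://203.0.113.11:8080",
    "tcp://203.0.113.12:8080",
];

// Pick one proxy at random so repeated crawls appear to come
// from different IP addresses.
function pickRandomProxy(array $proxies) {
    return $proxies[array_rand($proxies)];
}

// Fetch a URL through the chosen proxy using a stream context.
function fetchViaProxy($url, $proxy) {
    $context = stream_context_create([
        "http" => [
            "proxy"           => $proxy, // route the request through this proxy
            "request_fulluri" => true,   // most HTTP proxies require the full URI
            "timeout"         => 10,     // give up after 10 seconds
        ],
    ]);
    return file_get_contents($url, false, $context);
}

$proxy = pickRandomProxy($proxies);
// Example call (commented out -- needs a live proxy to work):
// $html = fetchViaProxy("https://www.google.com/search?q=" . urlencode($term), $proxy);
?>
```

If the attached library's `file_get_html` wraps `file_get_contents`, the same stream context could likely be passed through it, but that depends on the attached file, which is not visible here.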
Hi, let me do it right now. I did the same for Amazon product-ordering software, I have 8+ years of experience with PHP/MySQL, and I am very interested in working on your project. Thanks.
I will route the script's requests through randomly selected proxies so each crawl appears to come from a different IP.
I will take a maximum of 3-4 hours to complete the task.
I am online, so we can chat and discuss the project.
Thank you!
Best Regards.
You will get it done on an urgent basis, and we can chat right now. We have made many websites, so you can have a look at our portfolio; we also have 3 years of experience in the development field.