A proxy server lets you route requests through random IP addresses of cloud machines, hiding your server's IP while you scrape data from websites anonymously. Sending thousands of requests from the same IP address may result in 4xx errors or temporary blocks if the website enforces a rate limit.
So, to protect the Agenty web scraper from being blocked, blacklisted or misled by target websites during scalable web scraping for businesses, we offer a smart rotating proxy pool that automatically refreshes the IP address every 30 seconds, with full control over which proxy, country and city to use and when auto-rotation should happen in your web scraping agent. This helps data scraping professionals succeed in 99.9% of their website scraping projects running on Agenty cloud.
Attach a proxy to a scraper
Agenty has static, residential and Geo-based proxy servers available on different plans. You can use these servers for anonymous web scraping, with the IP address auto-rotating every 30 seconds to prevent getting blocked while scraping websites.
- Go to your agent page, click on the edit tab to change the proxy configuration
- Scroll down to the proxy settings and use the Select a proxy option to select a proxy server as per your plan
- The country setting lets you specify which country's IP address should be used for a particular website
- The auto-rotate setting automatically refreshes the IP address after (n) number of pages crawled, so you can use this option to control when the IP address should be refreshed
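Agenty applies this rotation on its servers, but the idea behind the (n)-pages setting can be sketched in plain Python. This is only an illustration: the proxy addresses below are placeholders, not real Agenty endpoints.

```python
import itertools

# Hypothetical pool of proxy endpoints (placeholder addresses, not real servers).
PROXY_POOL = ["10.0.0.1:8000", "10.0.0.2:8000", "10.0.0.3:8000"]

def rotating_proxies(pool, pages_per_proxy):
    """Yield a proxy for each page crawled, switching to the next
    proxy in the pool after every `pages_per_proxy` pages -- the
    (n) auto-rotate setting described above."""
    cycle = itertools.cycle(pool)
    proxy = next(cycle)
    for page in itertools.count(1):
        yield proxy
        if page % pages_per_proxy == 0:
            proxy = next(cycle)  # refresh the IP after n pages

# Example: rotate to a fresh IP after every 2 pages crawled
gen = rotating_proxies(PROXY_POOL, pages_per_proxy=2)
print([next(gen) for _ in range(6)])
# → each proxy is used for exactly 2 consecutive pages
```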
Static proxies are data centre proxies; when you select this proxy type in your scraping agent, the web page crawling request is routed through one of the data centres in the available regions. Agenty provides static proxies with up to 5,000 static IPs in 40+ countries, including the United States (US), United Kingdom (UK), Canada, Australia, Japan, Germany, France, India, Singapore and more…
A residential proxy is an IP address assigned by an Internet Service Provider (ISP) to an individual (also called a home user). Each residential proxy address therefore has a physical location, with latitude and longitude, which makes it look like a real personal computer's IP address.
It’s nearly impossible to detect or block such proxies, and they provide very high anonymity for web scraping, which results in a high success rate for enterprise-level data scraping projects.
Geo-based proxies let you route your web crawling request through a selected country or city. For example, you might want to extract store-level pricing or promotions from an ecommerce website with stores across 100+ zip codes, where the storefront shown is selected automatically based on your IP’s geolocation.
That means the website displays different product pricing for each city and country, so you will need a Geo-based proxy in your web scraping agent to scrape the local prices and content from that website.
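Routing through a country-specific endpoint can be sketched as a small helper that builds the proxies mapping a client library such as `requests` expects. The country codes and endpoint URLs below are hypothetical placeholders, not real Agenty servers.

```python
# Hypothetical mapping of country codes to Geo proxy endpoints
# (placeholder addresses for illustration only).
GEO_PROXIES = {
    "us": "http://us.proxy.example:8000",
    "uk": "http://uk.proxy.example:8000",
    "de": "http://de.proxy.example:8000",
}

def proxies_for(country):
    """Build the proxies mapping that `requests` accepts, routing both
    HTTP and HTTPS traffic through the selected country's endpoint."""
    endpoint = GEO_PROXIES[country]
    return {"http": endpoint, "https": endpoint}

# The same product URL fetched through each country's proxy would see
# that country's local storefront and prices, e.g.:
#   requests.get("https://shop.example/product/123", proxies=proxies_for("de"))
print(proxies_for("us"))
```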
Add a custom proxy
Customers on the Professional plan and higher can also bring their own custom proxies, add them in Agenty and use them with their web scraping agents.
- Go to proxy page
- Click on the New proxy button to add a new proxy
- The proxy dialog will appear, where you can enter your proxy details (proxy_address, username, password and domain) to insert a new custom proxy. The proxy_address must be in IP_address:port format, as shown in the screenshot.
- Finally, click on the Add new proxy button to add this proxy
- You may leave the username, password and domain fields blank if you are using a public, elite or transparent proxy. However, we highly recommend using our highly anonymous proxy servers available under the different plans, and using the custom proxy option only if you really need it.
Update a proxy
- Go to your proxy page
- Click on the proxy name hyperlink to open the proxy editor
- Here you can change the values and save them
Delete a proxy
- Select the proxies you want to delete from your account and click on the red Delete button
- A confirmation box will appear; click on the Yes button to delete the proxy permanently
Remember - you cannot delete the public proxies (with the lock icon) created by Agenty. They are created and maintained by the Agenty team, and you have view-only access to select them in your web scraping agent and scrape data from websites anonymously.