General
How to use the API
ScraperAPI is a proxy solution for web scraping. It is designed to make scraping the web at scale as simple as possible by removing the hassle of finding high-quality proxies, rotating proxy pools, detecting bans, solving CAPTCHAs, managing geotargeting, and rendering JavaScript.
Send a request to our simple REST API, our Proxy Port, or our Async API, and we will return the HTML (or JSON, for a list of supported domains) response from the target website. For more information on how to get started, check out our documentation here.
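For example, a minimal request to the REST API might look like this in Python (a sketch using the requests library; YOUR_API_KEY and the target URL are placeholders):

```python
import requests

# Minimal sketch of a REST API request.
# "YOUR_API_KEY" is a placeholder for the key from your dashboard.
payload = {
    "api_key": "YOUR_API_KEY",
    "url": "https://example.com/",
}
# timeout is set above 60s because the API may retry internally for up to a minute
response = requests.get("https://api.scraperapi.com/", params=payload, timeout=70)
print(response.status_code)
print(response.text[:500])  # first 500 characters of the returned HTML
```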
What is an API Credit?
We know that scraping is complex, and so is pricing. To properly reflect the effort it takes to scrape each site and make it easier for you to grow and scale, our pricing model is based on API credits. This also allows us to bring everything we have to bear on keeping Success Rates as high as possible and opens up all our features to you, rather than restricting them based on your plan.
Depending on the type of website you want to scrape and what parameters you need to use when making a request, a different number of API Credits will be charged for a single request.
For more details, please see our Documentation page here.
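If you want to keep an eye on credit usage programmatically, you can poll the account endpoint. A minimal sketch (the exact field names in the JSON response may vary, so inspect the response for your own account):

```python
import requests

# Sketch: check how many credits your account has used so far.
response = requests.get(
    "https://api.scraperapi.com/account",
    params={"api_key": "YOUR_API_KEY"},  # placeholder key
)
usage = response.json()
print(usage)  # e.g. credits used so far and your plan's credit limit
```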
Increasing scraping speed
When you send a request to the API, we will route it through our proxy pools, check the response for CAPTCHAs, bans, etc. and then either return the valid HTML response to you or keep retrying the request with different proxies for up to 60 seconds before returning you a 500 status error.
Latencies can sometimes be higher than sending requests directly through a standard proxy: average latencies typically range from 4-12 seconds (depending on the website), and single requests can occasionally take up to 60 seconds. This is offset by the fact that our average success rate is around 98%.
If you would like to increase your scraping speed, then we can increase the number of concurrent threads on your account. Contact our support team to enquire about increasing your concurrency limit.
If you would like to reduce the latency of each request, or trim the long tail of requests that take 20-30 seconds, you can use our premium proxy pools by adding premium=true to your request, or contact our support team to see if they can help you optimize your requests.
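Enabling the premium pools is just one extra parameter on the request shown earlier (a sketch; the key and URL are placeholders):

```python
import requests

payload = {
    "api_key": "YOUR_API_KEY",   # placeholder
    "url": "https://example.com/",
    "premium": "true",           # route the request through the premium proxy pools
}
response = requests.get("https://api.scraperapi.com/", params=payload, timeout=70)
```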
Concurrent Requests
Every ScraperAPI plan has a limited number of concurrent threads, which caps how many requests you can make to the API in parallel (the Async API supports batch requests). The more concurrent threads your plan has, the faster you can scrape. If you would like to increase the number of concurrent requests you can make, please contact our customer support team.
If your scraper is not using the full concurrency limit of your subscription, first check that its settings are correct and configured to use exactly the number of threads your subscription allows.
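A common way to do this is to cap your worker pool at exactly your plan's limit. A minimal Python sketch (the limit value and URLs are placeholders):

```python
import requests
from concurrent.futures import ThreadPoolExecutor

API_KEY = "YOUR_API_KEY"   # placeholder
CONCURRENCY_LIMIT = 10     # set this to your plan's concurrent thread limit

urls = [f"https://example.com/page/{i}" for i in range(50)]  # illustrative URLs

def fetch(url):
    payload = {"api_key": API_KEY, "url": url}
    return requests.get("https://api.scraperapi.com/", params=payload, timeout=70)

# Size the pool to your plan's limit so every allowed thread is used, no more.
with ThreadPoolExecutor(max_workers=CONCURRENCY_LIMIT) as pool:
    for response in pool.map(fetch, urls):
        print(response.status_code)
```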
If you are still not using all of your concurrent threads after that, something else on your machine may be blocking your requests. Check both your antivirus/firewall and your network for issues. Another possible culprit is the resource limits applied to the user running your scraper.
You can raise these resource limits on both Linux and Windows to make sure nothing restricts the users set up on your machines. On Linux, adjust the limits ("ulimit") by modifying /etc/security/limits.conf; on Windows, modify the relevant Registry values. There are multiple guides on the Internet covering exactly how to do this.
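On Linux, for example, the relevant entries might look like this (an illustrative sketch; "scraper" stands in for the user running your scraper, and the values are examples only):

```
# /etc/security/limits.conf (example values, adjust to your needs)
# Raise the open-file and process limits for the scraping user:
scraper  soft  nofile  65535
scraper  hard  nofile  65535
scraper  soft  nproc   4096
scraper  hard  nproc   4096
```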
Getting Failed Requests from the API
ScraperAPI routes your requests through proxy pools with over 40 million proxies and retries requests for up to 60 seconds to get a successful response. However, some of the requests will inevitably fail. You can expect 1-3% of your requests to fail, but you won’t be charged for those. If you configure your code to automatically retry failed requests, then in the majority of cases the retry will be successful.
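A simple retry wrapper is usually enough. A minimal Python sketch (building on the REST call above; the retry count and backoff are illustrative):

```python
import time
import requests

API_KEY = "YOUR_API_KEY"  # placeholder

def fetch_with_retries(url, max_retries=3):
    """Retry a ScraperAPI request a few times before giving up."""
    payload = {"api_key": API_KEY, "url": url}
    for attempt in range(1, max_retries + 1):
        response = requests.get("https://api.scraperapi.com/", params=payload, timeout=70)
        if response.status_code == 200:
            return response
        time.sleep(2 ** attempt)  # brief backoff; failed requests are not charged
    raise RuntimeError(f"Request still failing after {max_retries} attempts: {url}")

html = fetch_with_retries("https://example.com/").text
```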
If you are experiencing failure rates in excess of 10%, contact our support team for assistance.
Unused API Credits
At the moment, unused API Credits cannot be rolled over. When your subscription renews, the API Credit counter resets. If you want to increase or decrease your amount of API Credits, you can change your plan from the Billing section. For any changes outside of the standard plans, please contact our support team.
Custom Built Scrapers
ScraperAPI is a proxy API for web scraping, so unfortunately we don't develop custom scrapers for users. If you would like someone to build a scraper for you, we recommend ScrapeSimple, who provide high-quality scraping services at low cost.