Cost Control

ScraperAPI helps you control and manage your costs efficiently. Pass the max_cost parameter with your requests to set a limit on the maximum number of API credits you want to spend on each individual scrape. This helps prevent overspending and keeps you within your project's budget.

  • API REQUEST

import requests

# max_cost caps the number of API credits this single scrape may consume
payload = {'api_key': 'APIKEY', 'premium': 'true', 'max_cost': '5', 'url': 'https://example.com/'}
r = requests.get('https://api.scraperapi.com', params=payload)
print(r.text)
  • ASYNC REQUEST

import requests

# Submit an asynchronous batch job; max_cost is passed through apiParams
url = 'https://async.scraperapi.com/batchjobs'
data = {
    'apiKey': 'API_KEY',
    'url': 'https://example.com/page1',
    'apiParams': {
        'premium': 'true',
        'max_cost': '5'
    }
}
r = requests.post(url=url, json=data)
print(r.text)
  • PROXY MODE - COMING SOON!

If the cost of a scrape would exceed your limit, the request returns a 403 status code with the following error message:

"This request would cost more than your max_cost. You can see the cost per request in your response header or in API Playground in the dashboard."
