Cost Control
Cost limits can be configured using the max_cost parameter, which sets the maximum number of API credits a single scrape may consume. This keeps each request within your defined budget and prevents unexpected usage.
If the cost of a scrape would exceed your limit, the request returns a 403 status code with the following error message:
"This request exceeds your max_cost. You can view the cost per request in your response header or in the API Playground on the dashboard."
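As a minimal sketch of the behavior described above (the endpoint and parameter names are taken from the examples below; the helper function is hypothetical), the request URL can be built with Python's standard urlencode so the target URL is escaped safely, and the 403 budget signal can be checked on the response status code:

```python
from urllib.parse import urlencode

# Build the scrape URL with a max_cost budget of 5 credits.
# API_KEY is a placeholder for your actual API Key.
params = {
    'api_key': 'API_KEY',
    'premium': 'true',
    'max_cost': '5',
    'url': 'https://example.com/',
}
request_url = 'https://api.scraperapi.com/?' + urlencode(params)
print(request_url)
# → https://api.scraperapi.com/?api_key=API_KEY&premium=true&max_cost=5&url=https%3A%2F%2Fexample.com%2F

def exceeded_budget(status_code: int) -> bool:
    # Hypothetical helper: per the docs above, a 403 status code
    # signals that the scrape cost would exceed max_cost.
    return status_code == 403
```

Using urlencode also percent-encodes the target URL, which matters once the page you scrape carries its own query string.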
API REQUEST
curl --request GET \
--url 'https://api.scraperapi.com?api_key=API_KEY&premium=true&max_cost=5&url=https://example.com/'

import requests
target_url = 'https://example.com/'
# Replace the value for api_key with your actual API Key.
api_key = 'API_KEY'
request_url = (
f'https://api.scraperapi.com?'
f'api_key={api_key}'
f'&premium=true'
f'&max_cost=5'
f'&url={target_url}'
)
response = requests.get(request_url)
print(response.text)

import fetch from 'node-fetch';
// Replace the value for api_key with your actual API Key.
const url = 'https://api.scraperapi.com/?api_key=API_KEY&premium=true&max_cost=5&url=https://example.com/';
fetch(url)
  .then(response => response.text())
  .then(body => {
    console.log(body);
  })
  .catch(error => {
    console.error(error);
  });

ASYNC REQUEST
curl -X POST \
-H "Content-Type: application/json" \
-d '{
"apiKey": "API_KEY",
"url": "https://example.com/",
"apiParams": {
"premium": "true",
"max_cost": "5"
}
}' \
"https://async.scraperapi.com/jobs"

import requests
r = requests.post(
    url='https://async.scraperapi.com/jobs',
    json={
        # Replace the value for apiKey with your actual API Key.
        'apiKey': 'API_KEY',
        'url': 'https://example.com/',
        'apiParams': {
            'premium': 'true',
            'max_cost': '5'
        }
    }
)
print(r.text)