Custom Headers

Important note: Only use this feature if you need to send custom headers to retrieve specific results from the website. The API includes a sophisticated header management system designed to increase success rates and performance on difficult sites. When you send your own custom headers, you override that system, which often lowers your success rates. Unless you absolutely need custom headers to get the data you need, we advise against using this functionality.

If you would like to use your own custom headers (user agents, cookies, etc.) when making a request to the website, simply set keep_headers=true and send the API the headers you want to use. The API will then use these headers when sending requests to the website.

If you need to get results for mobile devices, use the device_type parameter to set the user-agent header for your request, instead of setting your own.
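For example, here is a minimal sketch of a mobile request (assuming device_type accepts the value mobile; APIKEY and the httpbin target URL are placeholders):

import requests
payload = {
    'api_key': 'APIKEY',
    'url': 'https://httpbin.org/anything',  # httpbin echoes the request headers, so you can see which user-agent was sent
    'device_type': 'mobile',  # assumed value; tells the API to use a mobile user-agent for this request
}
r = requests.get('http://api.scraperapi.com', params=payload)
print(r.text)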

  • API REQUEST

import requests
url = 'http://httpbin.org/anything'  # httpbin echoes back the headers it receives
headers = {
    'Accept': 'application/json',
    'X-MyHeader': '123',
}
payload = {'api_key': 'APIKEY', 'url': url, 'keep_headers': 'true'}
r = requests.get('http://api.scraperapi.com', params=payload, headers=headers)
print(r.text)

# Scrapy users can simply replace the URLs in their start_urls and parse function with API-wrapped URLs
# ...other scrapy setup code
url = 'http://httpbin.org/anything'  # the target page you want to scrape
headers = {
    'Accept': 'application/json',
    'X-MyHeader': '123'
}
start_urls = ['http://api.scraperapi.com?api_key=APIKEY&url=' + url + '&keep_headers=true']

def parse(self, response):
    # ...your parsing logic here
    scraper_url = 'http://api.scraperapi.com/?api_key=APIKEY&url=' + url + '&keep_headers=true'
    yield scrapy.Request(scraper_url, self.parse, headers=headers)
  • PROXY MODE

import requests
headers = {
  'Accept': 'application/json',
  'X-MyHeader': '123',
}
proxies = {
  "http": "http://scraperapi.keep_headers=true:APIKEY@proxy-server.scraperapi.com:8001"
}
r = requests.get('http://httpbin.org/ip', proxies=proxies, headers=headers, verify=False)
print(r.text)

# Scrapy users can likewise send their custom headers, passing the API key in the proxy string.
# NB: Scrapy skips SSL verification by default.
# ...other scrapy setup code
url = 'http://httpbin.org/ip'
start_urls = [url]
meta = {
  "proxy": "http://scraperapi.keep_headers=true:APIKEY@proxy-server.scraperapi.com:8001"
}
headers = {
  'Accept': 'application/json',
  'X-MyHeader': '123',
}
def parse(self, response):
    # ...your parsing logic here
    yield scrapy.Request(url, callback=self.parse, headers=headers, meta=meta)
  • SDK Method

# remember to install the library: pip install scraperapi-sdk
from scraperapi_sdk import ScraperAPIClient
client = ScraperAPIClient("API_KEY")
content = client.get(
    'https://example.com/', 
    params={'keep_headers': True},
    headers={
        "X-MyHeader": "123"  # Add your custom header here
    }
)
print(content)
