ScraperAPI exposes a single API endpoint to which you send GET requests. Simply send a GET request to http://api.scraperapi.com with two query string parameters and the API will return the HTML response for that URL:
api_key, which contains your API key, and
url, which contains the URL you would like to scrape
You should format your requests to the API endpoint as follows:
import requests

payload = {'api_key': 'APIKEY', 'url': 'https://httpbin.org/ip'}
r = requests.get('https://api.scraperapi.com', params=payload)
print(r.text)

# Scrapy users can simply replace the urls in their start_urls and parse function
# ...other scrapy setup code
start_urls = ['http://api.scraperapi.com?api_key=APIKEY&url=' + url]

def parse(self, response):
    # ...your parsing logic here
    yield scrapy.Request('http://api.scraperapi.com/?api_key=APIKEY&url=' + url, self.parse)
To enable other API functionality when sending a request to the API endpoint, simply add the appropriate query parameters to the end of the ScraperAPI URL.
For example, if you want to enable JavaScript rendering with a request, then add render=true to the request:
import requests

payload = {'api_key': 'APIKEY', 'url': 'https://httpbin.org/ip', 'render': 'true'}
r = requests.get('https://api.scraperapi.com', params=payload)
print(r.text)

# Scrapy users can simply replace the urls in their start_urls and parse function
# ...other scrapy setup code
start_urls = ['http://api.scraperapi.com?api_key=APIKEY&url=' + url + '&render=true']

def parse(self, response):
    # ...your parsing logic here
    yield scrapy.Request('http://api.scraperapi.com/?api_key=APIKEY&url=' + url + '&render=true', self.parse)
To use two or more parameters, simply add them to the payload.
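As a minimal sketch, the payload below combines render with a second parameter, country_code (used here purely for illustration; check which parameters your plan supports before relying on it):

import requests

# Multiple ScraperAPI parameters are passed as additional keys in the same payload.
# country_code is an illustrative example of a second parameter alongside render.
payload = {
    'api_key': 'APIKEY',
    'url': 'https://httpbin.org/ip',
    'render': 'true',
    'country_code': 'us',
}
r = requests.get('https://api.scraperapi.com', params=payload)
print(r.text)

The same applies to the Scrapy style shown above: append each extra parameter to the query string, e.g. '&render=true&country_code=us'.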