API Endpoint Method
ScraperAPI exposes a single API endpoint for you to send GET requests to. Simply send a GET request to http://api.scraperapi.com with two query string parameters and the API will return the HTML response for that URL: api_key, which contains your API key, and url, which contains the URL you would like to scrape.
You should format your requests to the API endpoint as follows:
import requests
payload = {'api_key': 'APIKEY', 'url': 'https://httpbin.org/ip'}
r = requests.get('http://api.scraperapi.com', params=payload)
print(r.text)

# Scrapy users can simply replace the urls in their start_urls and parse function
# ...other scrapy setup code
start_urls = ['http://api.scraperapi.com?api_key=APIKEY&url=' + url]

def parse(self, response):
    # ...your parsing logic here
    yield scrapy.Request('http://api.scraperapi.com/?api_key=APIKEY&url=' + url, self.parse)
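For Scrapy users, a minimal self-contained spider might look like the sketch below. The spider name, target URL, and the field yielded from parse are purely illustrative and not part of ScraperAPI; the sketch also URL-encodes the target URL with urlencode, which is good practice when the target contains its own query string.

import scrapy
from urllib.parse import urlencode

class ExampleSpider(scrapy.Spider):
    # Illustrative spider: the name, target URL and yielded fields are placeholders.
    name = 'example'

    # Build the ScraperAPI endpoint URL, URL-encoding the target URL.
    target_url = 'https://httpbin.org/ip'
    start_urls = ['http://api.scraperapi.com/?' + urlencode(
        {'api_key': 'APIKEY', 'url': target_url})]

    def parse(self, response):
        # ...your parsing logic here
        yield {'body_length': len(response.text)}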
To enable other API functionality when sending a request to the API endpoint simply add the appropriate query parameters to the end of the ScraperAPI URL.
For example, if you want to enable JavaScript rendering with a request, then add render=true to the request:

import requests
payload = {'api_key': 'APIKEY', 'url':'https://httpbin.org/ip', 'render': 'true'}
r = requests.get('http://api.scraperapi.com', params=payload)
print(r.text)
# Scrapy users can simply replace the urls in their start_urls and parse function
# ...other scrapy setup code
start_urls = ['http://api.scraperapi.com?api_key=APIKEY&url=' + url + '&render=true']

def parse(self, response):
    # ...your parsing logic here
    yield scrapy.Request('http://api.scraperapi.com/?api_key=APIKEY&url=' + url + '&render=true', self.parse)
To use two or more parameters, simply add them to the payload.
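As a minimal sketch of combining parameters, the example below adds a second parameter alongside render. The country_code geotargeting parameter is used here purely for illustration; check the ScraperAPI documentation and your plan for the parameters available to you.

import requests

# Illustrative only: 'country_code' is assumed as a second parameter here.
payload = {
    'api_key': 'APIKEY',
    'url': 'https://httpbin.org/ip',
    'render': 'true',
    'country_code': 'us',
}
r = requests.get('http://api.scraperapi.com', params=payload)
print(r.text)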