Along with the traditional method of passing parameters in the request URL, we also support passing them as headers. Parameters such as api_key, render, ultra_premium, and instruction_set can all be sent this way.
API REQUEST
Instead of including the parameters in the URL:
require 'net/http'
require 'uri'
require 'json'

# Define parameters, including your API key and the rendering option
params = {
  api_key: "<YOUR_API_KEY>",
  url: "http://httpbin.org/ip",
  render: true
}

# URI for the API endpoint (note: use HTTPS)
uri = URI('https://api.scraperapi.com/')
uri.query = URI.encode_www_form(params)

# Create a GET request
req = Net::HTTP::Get.new(uri)
req['Accept'] = 'application/json'
req['X-MyHeader'] = '123'

# Create an HTTPS connection
http = Net::HTTP.new(uri.hostname, uri.port)
http.use_ssl = true
http.verify_mode = OpenSSL::SSL::VERIFY_PEER

# Perform the HTTPS request
website_content = http.request(req)

# Output the response body
puts website_content.body
you can just pass them as headers:
require 'net/http'
require 'uri'
require 'json'

# Define parameters including the target URL
params = {
  url: "https://httpbin.org/ip" # Specify the URL you want to scrape
}

# Set the API endpoint URI (HTTPS)
uri = URI('https://api.scraperapi.com/')
uri.query = URI.encode_www_form(params) # Encode parameters into the query string

# Create a new GET request
req = Net::HTTP::Get.new(uri)

# Set Scraper API headers
req['x-sapi-render'] = 'true'            # Enable rendering
req['x-sapi-api_key'] = '<YOUR_API_KEY>' # Replace with your actual Scraper API key

# Create an HTTPS connection
http = Net::HTTP.new(uri.hostname, uri.port)
http.use_ssl = true                          # Enable SSL/TLS encryption
http.verify_mode = OpenSSL::SSL::VERIFY_PEER # Verify the server's certificate

# Perform the HTTPS request and store the response
website_content = http.request(req)

# Output the response body (the scraped content)
puts website_content.body
Please note the 'x-sapi-' prefix on each header: it is used to avoid collisions with headers used by target sites. All standard API parameters can be passed this way; the instruction_set parameter, in particular, is supported only through headers.
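For instance, since instruction_set is only accepted via headers, a request using it could be built as in the following sketch. The 'x-sapi-instruction_set' header name is assumed from the prefix convention described above, and the instruction set value is a placeholder, not a real instruction payload.

require 'net/http'
require 'uri'
require 'json'

# The target URL is still passed as a query parameter
params = { url: "https://httpbin.org/ip" }

uri = URI('https://api.scraperapi.com/')
uri.query = URI.encode_www_form(params)

req = Net::HTTP::Get.new(uri)

# Credentials and rendering passed as headers, as in the example above
req['x-sapi-api_key'] = '<YOUR_API_KEY>'
req['x-sapi-render']  = 'true'

# Assumed header name, following the 'x-sapi-' prefix convention;
# replace the placeholder with your actual instruction set JSON
req['x-sapi-instruction_set'] = '<YOUR_INSTRUCTION_SET_JSON>'

http = Net::HTTP.new(uri.hostname, uri.port)
http.use_ssl = true
http.verify_mode = OpenSSL::SSL::VERIFY_PEER

website_content = http.request(req)
puts website_content.body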
PROXY MODE
require 'httparty'

# Define headers with the required parameters
headers = {
  "x-sapi-render" => "true"
}

# Set default options for HTTParty, disabling SSL verification
HTTParty::Basement.default_options.update(verify: false)

# Make the HTTP request through the proxy with HTTParty
response = HTTParty.get('http://httpbin.org/ip', {
  http_proxyaddr: "proxy-server.scraperapi.com",
  http_proxyport: "8001",
  http_proxyuser: "scraperapi",
  http_proxypass: "<YOUR_API_KEY>",
  headers: headers
})

# Capture the response body
results = response.body

# Output the response body
puts results
Note that credentials must still be passed to the proxy in the manner shown above, not as headers.