Proxy Port Method

To simplify implementation for users with existing proxy pools, we offer a proxy front-end to the API. The proxy will take your requests and pass them through to the API, which will take care of proxy rotation, CAPTCHAs, and retries.

The proxy mode is a light front-end for the API and has all the same functionality and performance as sending requests to the API endpoint.

The username for the proxy is scraperapi and the password is your API key.

"http://scraperapi:APIKEY@proxy-server.scraperapi.com:8001"

const axios = require('axios');

// Send the request through the ScraperAPI proxy port using basic auth
axios.get('http://httpbin.org/ip', {
  proxy: {
    host: 'proxy-server.scraperapi.com',
    port: 8001,
    auth: {
      username: 'scraperapi',
      password: 'APIKEY'
    },
    protocol: 'http'
  }
})
  .then(response => {
    console.log(response.data);
  })
  .catch(error => {
    console.log(error);
  });

Note: So that we can properly direct your requests through the API, your code must be configured not to verify SSL certificates.
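
With the axios setup shown above, one way to do this for HTTPS targets is to pass a custom HTTPS agent that skips certificate verification. This is a minimal sketch (the target URL and APIKEY are placeholders); other HTTP clients expose an equivalent setting under a different name.

const axios = require('axios');
const https = require('https');

// Skip certificate verification so the proxy can handle the HTTPS request
const insecureAgent = new https.Agent({ rejectUnauthorized: false });

axios.get('https://httpbin.org/ip', {
  proxy: {
    host: 'proxy-server.scraperapi.com',
    port: 8001,
    auth: {
      username: 'scraperapi',
      password: 'APIKEY'
    },
    protocol: 'http'
  },
  httpsAgent: insecureAgent
})
  .then(response => {
    console.log(response.data);
  })
  .catch(error => {
    console.log(error);
  });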

To enable extra functionality whilst using the API in proxy mode, you can pass parameters to the API by adding them to the username, separated by periods.

For example, if you want to enable JavaScript rendering with a request, the username would be scraperapi.render=true

"http://scraperapi.render=true:APIKEY@proxy-server.scraperapi.com:8001"

Multiple parameters can be included by separating them with periods; for example:

"http://scraperapi.render=true.country_code=us:APIKEY@proxy-server.scraperapi.com:8001"

Proxy Mode with SSL Verification

If you would like to send requests to our Proxy API with SSL verification, you can manually trust our certificate by following these steps:

  • Download Our Proxy CA Certificate:

Please follow this link to download our proxy CA certificate.

  • Manual trust:

Once you've downloaded the certificate, manually trust it in your scraping tool or library settings. This step varies depending on the tool or library you're using, but typically involves importing the certificate into your trusted root store or configuring your SSL/TLS settings (a Node.js sketch follows the OS-specific steps below). Depending on your operating system, follow the instructions below to install the ScraperAPI CA Certificate:

Windows 10/11

  1. Press the Win + R hotkey and enter mmc in the Run dialog to open the Microsoft Management Console window.

  2. Click File and select Add/Remove Snap-ins.

  3. In the opened window select Certificates and press the Add > button.

  4. In the Certificates Snap-in window select Computer account > Local Account, and press the Finish button to close the window.

  5. Press the OK button in the Add or Remove Snap-in window.

  6. Back in the Microsoft Management Console window, select Certificates under Console Root and right-click Trusted Root Certification Authorities.

  7. From the context menu select All Tasks > Import to open the Certificate Import Wizard window from which you can add the Scraper API certificate.

More details can be found here.

If you encounter any Certificate Revocation List (CRL) Distribution Points (DPs) related errors, please add rejectUnauthorized: false to your script.
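
If you are using the axios setup shown earlier, that flag is the rejectUnauthorized option on the HTTPS agent, for example:

const https = require('https');

// Fall back to skipping certificate checks if CRL-related errors occur
const httpsAgent = new https.Agent({ rejectUnauthorized: false });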

macOS

  1. Open Keychain Access window (Launchpad > Other > Keychain Access).

  2. Select the System tab under Keychains, then drag and drop the downloaded certificate file (or select File > Import Items... and navigate to the file).

  3. Enter the administrator password to modify the keychain.

  4. Double-click the ScraperAPI CA certificate entry, expand Trust, and next to When using this certificate: select Always Trust.

  5. Close the window and enter the administrator password again to update the settings.

Linux

  1. Install the downloaded ScraperAPI proxyca.pem file into the local CA store (update-ca-certificates only picks up files with a .crt extension):

sudo cp proxyca.pem /usr/local/share/ca-certificates/proxyca.crt

  2. Update the stored Certificate Authority files:

sudo update-ca-certificates
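
If you prefer to trust the certificate at the library level instead of system-wide, most HTTP clients accept a custom CA bundle. A minimal Node.js sketch, assuming the downloaded certificate is saved as proxyca.pem next to your script:

const fs = require('fs');
const https = require('https');
const axios = require('axios');

// Trust the ScraperAPI proxy CA for this client only, keeping verification enabled
const httpsAgent = new https.Agent({ ca: fs.readFileSync('proxyca.pem') });

axios.get('https://httpbin.org/ip', {
  proxy: {
    host: 'proxy-server.scraperapi.com',
    port: 8001,
    auth: {
      username: 'scraperapi',
      password: 'APIKEY'
    },
    protocol: 'http'
  },
  httpsAgent
})
  .then(response => {
    console.log(response.data);
  })
  .catch(error => {
    console.log(error);
  });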

If you have any questions or need further assistance regarding web scraping or certificate management, don't hesitate to reach out to our support team. We're here to help you every step of the way!
