Proxy Port Method

To simplify implementation for users with existing proxy pools, we offer a proxy front-end to the API. The proxy takes your requests and passes them through to the API, which handles proxy rotation, CAPTCHAs, and retries.

The proxy mode is a light front-end for the API and has all the same functionality and performance as sending requests to the API endpoint.

The username for the proxy is scraperapi and the password is your API key.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.Authenticator;
import java.net.HttpURLConnection;
import java.net.PasswordAuthentication;
import java.net.URL;
import java.util.Properties;

try {
    String apiKey = "APIKEY";
    String proxy = "proxy-server.scraperapi.com";
    URL server = new URL("https://httpbin.org/ip");
    String proxyUsername = "scraperapi";
    // Authenticate against the proxy with the scraperapi username and your API key.
    Authenticator.setDefault(new Authenticator() {
        protected PasswordAuthentication getPasswordAuthentication() {
            return new PasswordAuthentication(proxyUsername, apiKey.toCharArray());
        }
    });
    Properties systemProperties = System.getProperties();
    systemProperties.setProperty("http.proxyHost", proxy);
    systemProperties.setProperty("http.proxyPort", "8001");
    // The target URL uses https, so the proxy must be configured for https traffic too.
    systemProperties.setProperty("https.proxyHost", proxy);
    systemProperties.setProperty("https.proxyPort", "8001");
    // Since Java 8u111, Basic authentication to a proxy is disabled by default
    // for https tunnels; clear the restriction so the Authenticator is used.
    systemProperties.setProperty("jdk.http.auth.tunneling.disabledSchemes", "");
    HttpURLConnection httpURLConnection = (HttpURLConnection) server.openConnection();
    httpURLConnection.connect();
    int responseCode = httpURLConnection.getResponseCode();
    if (responseCode == HttpURLConnection.HTTP_OK) {
        BufferedReader in = new BufferedReader(new InputStreamReader(httpURLConnection.getInputStream()));
        StringBuilder response = new StringBuilder();
        String readLine;
        while ((readLine = in.readLine()) != null) {
            response.append(readLine);
        }
        in.close();
        System.out.println(response.toString());
    } else {
        throw new Exception("Error in API call: HTTP " + responseCode);
    }
} catch (Exception ex) {
    ex.printStackTrace();
}

Note: So that we can properly direct your requests through the API, your code must be configured to not verify SSL certificates.
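The JDK offers no single flag for skipping certificate checks, so this has to be wired up by hand. Below is one common approach (a sketch only, with a hypothetical class name): install a trust-all TrustManager and hostname verifier before opening any connections. Only do this for traffic routed through the proxy, since it disables TLS validation for the whole process.

```java
import javax.net.ssl.HttpsURLConnection;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;
import java.security.SecureRandom;
import java.security.cert.X509Certificate;

public class TrustAllSetup {

    // Install a trust-all SSLContext and hostname verifier process-wide so
    // HttpsURLConnection stops rejecting the proxy's re-signed certificates.
    public static void disableCertificateValidation() throws Exception {
        TrustManager[] trustAll = new TrustManager[]{
            new X509TrustManager() {
                public X509Certificate[] getAcceptedIssuers() {
                    return new X509Certificate[0];
                }
                public void checkClientTrusted(X509Certificate[] chain, String authType) {
                    // accept every client certificate
                }
                public void checkServerTrusted(X509Certificate[] chain, String authType) {
                    // accept every server certificate
                }
            }
        };
        SSLContext sslContext = SSLContext.getInstance("TLS");
        sslContext.init(null, trustAll, new SecureRandom());
        HttpsURLConnection.setDefaultSSLSocketFactory(sslContext.getSocketFactory());
        // Also skip hostname checks, which would otherwise still fail.
        HttpsURLConnection.setDefaultHostnameVerifier((hostname, session) -> true);
    }
}
```

Call `TrustAllSetup.disableCertificateValidation()` once at startup, before the proxy-configured requests shown in the examples on this page.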

To enable extra functionality whilst using the API in proxy mode, you can pass parameters to the API by adding them to the username, separated by periods.

For example, if you want to enable JavaScript rendering with a request, the username would be scraperapi.render=true

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.Authenticator;
import java.net.HttpURLConnection;
import java.net.PasswordAuthentication;
import java.net.URL;
import java.util.Properties;

try {
    String apiKey = "APIKEY";
    String proxy = "proxy-server.scraperapi.com";
    URL server = new URL("https://httpbin.org/ip");
    // Appending render=true to the username enables JavaScript rendering.
    String proxyUsername = "scraperapi.render=true";
    Authenticator.setDefault(new Authenticator() {
        protected PasswordAuthentication getPasswordAuthentication() {
            return new PasswordAuthentication(proxyUsername, apiKey.toCharArray());
        }
    });
    Properties systemProperties = System.getProperties();
    systemProperties.setProperty("http.proxyHost", proxy);
    systemProperties.setProperty("http.proxyPort", "8001");
    // The target URL uses https, so the proxy must be configured for https traffic too.
    systemProperties.setProperty("https.proxyHost", proxy);
    systemProperties.setProperty("https.proxyPort", "8001");
    // Since Java 8u111, Basic authentication to a proxy is disabled by default
    // for https tunnels; clear the restriction so the Authenticator is used.
    systemProperties.setProperty("jdk.http.auth.tunneling.disabledSchemes", "");
    HttpURLConnection httpURLConnection = (HttpURLConnection) server.openConnection();
    httpURLConnection.connect();
    int responseCode = httpURLConnection.getResponseCode();
    if (responseCode == HttpURLConnection.HTTP_OK) {
        BufferedReader in = new BufferedReader(new InputStreamReader(httpURLConnection.getInputStream()));
        StringBuilder response = new StringBuilder();
        String readLine;
        while ((readLine = in.readLine()) != null) {
            response.append(readLine);
        }
        in.close();
        System.out.println(response.toString());
    } else {
        throw new Exception("Error in API call: HTTP " + responseCode);
    }
} catch (Exception ex) {
    ex.printStackTrace();
}

Multiple parameters can be included by separating them with periods; for example:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.Authenticator;
import java.net.HttpURLConnection;
import java.net.PasswordAuthentication;
import java.net.URL;
import java.util.Properties;

try {
    String apiKey = "APIKEY";
    String proxy = "proxy-server.scraperapi.com";
    URL server = new URL("https://httpbin.org/ip");
    // Multiple parameters are chained onto the username, separated by periods.
    String proxyUsername = "scraperapi.render=true.country_code=us";
    Authenticator.setDefault(new Authenticator() {
        protected PasswordAuthentication getPasswordAuthentication() {
            return new PasswordAuthentication(proxyUsername, apiKey.toCharArray());
        }
    });
    Properties systemProperties = System.getProperties();
    systemProperties.setProperty("http.proxyHost", proxy);
    systemProperties.setProperty("http.proxyPort", "8001");
    // The target URL uses https, so the proxy must be configured for https traffic too.
    systemProperties.setProperty("https.proxyHost", proxy);
    systemProperties.setProperty("https.proxyPort", "8001");
    // Since Java 8u111, Basic authentication to a proxy is disabled by default
    // for https tunnels; clear the restriction so the Authenticator is used.
    systemProperties.setProperty("jdk.http.auth.tunneling.disabledSchemes", "");
    HttpURLConnection httpURLConnection = (HttpURLConnection) server.openConnection();
    httpURLConnection.connect();
    int responseCode = httpURLConnection.getResponseCode();
    if (responseCode == HttpURLConnection.HTTP_OK) {
        BufferedReader in = new BufferedReader(new InputStreamReader(httpURLConnection.getInputStream()));
        StringBuilder response = new StringBuilder();
        String readLine;
        while ((readLine = in.readLine()) != null) {
            response.append(readLine);
        }
        in.close();
        System.out.println(response.toString());
    } else {
        throw new Exception("Error in API call: HTTP " + responseCode);
    }
} catch (Exception ex) {
    ex.printStackTrace();
}

Proxy Mode with SSL Verification

If you encounter any certificate issues while using our proxy mode, don't worry! You can now manually trust our certificate by following these steps:

  • Download Our Proxy CA Certificate:

To download our proxy CA certificate, simply make a call to: https://api.scraperapi.com/proxyca.pem
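Since this page's examples are in Java, the download can also be scripted rather than done in a browser. The sketch below (the class and method names are illustrative) fetches a URL and writes it to a local file:

```java
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class DownloadProxyCa {

    /** Download the file at the given URL to the given local path. */
    public static void download(String url, Path destination) throws Exception {
        try (InputStream in = new URL(url).openStream()) {
            Files.copy(in, destination, StandardCopyOption.REPLACE_EXISTING);
        }
    }
}
```

For example: `DownloadProxyCa.download("https://api.scraperapi.com/proxyca.pem", java.nio.file.Paths.get("proxyca.pem"));`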

  • Manual trust:

Once you've downloaded the certificate, manually trust it in your scraping tool or library settings. This step varies depending on the tool or library you're using, but typically involves importing the certificate into your trusted root store or configuring its SSL/TLS settings. Depending on your operating system, follow the instructions below to install the ScraperAPI CA certificate:

Windows 10/11

  1. Press the Win key + R hotkey and input mmc in Run to open the Microsoft Management Console window.

  2. Click File and select Add/Remove Snap-ins.

  3. In the opened window select Certificates and press the Add > button.

  4. In the Certificates Snap-in window select Computer account, then Local computer, and press the Finish button to close the window.

  5. Press the OK button in the Add or Remove Snap-in window.

  6. Back in the Microsoft Management Console window, select Certificates under Console Root and right-click Trusted Root Certification Authorities.

  7. From the context menu select All Tasks > Import to open the Certificate Import Wizard window from which you can add the Scraper API certificate.


macOS

  1. Open Keychain Access window (Launchpad > Other > Keychain Access).

  2. Select the System tab under Keychains, then drag and drop the downloaded certificate file (or select File > Import Items... and navigate to the file).

  3. Enter the administrator password to modify the keychain.

  4. Double-click the ScraperAPI CA certificate entry, expand Trust, next to When using this certificate: select Always Trust.

  5. Close the window and enter the administrator password again to update the settings.

Linux

  1. Copy the downloaded ScraperAPI proxyca.pem file into the local CA directory (on Debian/Ubuntu, update-ca-certificates only picks up files with a .crt extension):

sudo cp proxyca.pem /usr/local/share/ca-certificates/proxyca.crt

  2. Update the stored Certificate Authority files:

sudo update-ca-certificates
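If installing the certificate system-wide isn't an option, a Java client can instead trust the downloaded proxyca.pem for a single process. The sketch below (the class name is illustrative) loads the PEM into an in-memory KeyStore and builds an SSLContext from it:

```java
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManagerFactory;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.KeyStore;
import java.security.cert.Certificate;
import java.security.cert.CertificateFactory;

public class TrustProxyCa {

    /** Build an SSLContext that trusts the certificate(s) in a PEM file. */
    public static SSLContext contextTrusting(Path pemFile) throws Exception {
        // Parse every certificate found in the PEM file.
        CertificateFactory cf = CertificateFactory.getInstance("X.509");
        KeyStore trustStore = KeyStore.getInstance(KeyStore.getDefaultType());
        trustStore.load(null, null); // start from an empty in-memory store
        int i = 0;
        try (InputStream in = Files.newInputStream(pemFile)) {
            for (Certificate cert : cf.generateCertificates(in)) {
                trustStore.setCertificateEntry("proxyca-" + i++, cert);
            }
        }
        // Wrap the store in a TrustManager and build the context.
        TrustManagerFactory tmf =
            TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
        tmf.init(trustStore);
        SSLContext ctx = SSLContext.getInstance("TLS");
        ctx.init(null, tmf.getTrustManagers(), null);
        return ctx;
    }
}
```

A connection can then use it via `HttpsURLConnection.setDefaultSSLSocketFactory(TrustProxyCa.contextTrusting(java.nio.file.Paths.get("proxyca.pem")).getSocketFactory());`. Note this replaces the default trust store for those connections, so only certificates from the PEM file will be accepted.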

If you have any questions or need further assistance regarding web scraping or certificate management, don't hesitate to reach out to our support team. We're here to help you every step of the way!
