
Use ScraperAPI Proxy Port in Java

Learn to implement ScraperAPI's proxy port in Java. Configure rotating proxies, bypass CAPTCHAs, and enable JS rendering with API keys. Includes SSL setup guide.

To simplify implementation for users with existing proxy pools, we offer a proxy front-end to the API. The proxy takes your requests and passes them through to the API, which handles proxy rotation, CAPTCHAs, and retries.

Proxy mode is a lightweight front-end for the API and offers the same functionality and performance as sending requests to the API endpoint directly.

The proxy username is scraperapi and the password is your API key.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.Authenticator;
import java.net.HttpURLConnection;
import java.net.PasswordAuthentication;
import java.net.URL;

try {
    String apiKey = "APIKEY";
    String proxy = "proxy-server.scraperapi.com";
    URL server = new URL("https://httpbin.org/ip");
    String proxyUsername = "scraperapi";
    Authenticator.setDefault(new Authenticator() {
        @Override
        protected PasswordAuthentication getPasswordAuthentication() {
            return new PasswordAuthentication(proxyUsername, apiKey.toCharArray());
        }
    });
    // Route both plain-HTTP and HTTPS traffic through the proxy port
    System.setProperty("http.proxyHost", proxy);
    System.setProperty("http.proxyPort", "8001");
    System.setProperty("https.proxyHost", proxy);
    System.setProperty("https.proxyPort", "8001");
    // Since JDK 8u111, Basic auth is disabled for HTTPS tunneling by default;
    // clear the restriction so the proxy credentials are actually sent
    System.setProperty("jdk.http.auth.tunneling.disabledSchemes", "");
    HttpURLConnection connection = (HttpURLConnection) server.openConnection();
    int responseCode = connection.getResponseCode();
    if (responseCode == HttpURLConnection.HTTP_OK) {
        BufferedReader in = new BufferedReader(new InputStreamReader(connection.getInputStream()));
        StringBuilder response = new StringBuilder();
        String line;
        while ((line = in.readLine()) != null) {
            response.append(line);
        }
        in.close();
        System.out.println(response);
    } else {
        throw new Exception("Error in API call, status code: " + responseCode);
    }
} catch (Exception ex) {
    ex.printStackTrace();
}

Note: So that we can properly direct your requests through the API, your code must be configured not to verify SSL certificates (or you can trust our CA certificate, as described in the Proxy Mode with SSL verification section below).
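In Java, one common way to skip certificate verification for HttpsURLConnection is to install a trust-all SSLContext and hostname verifier. This is a sketch, not part of any ScraperAPI SDK, and it disables certificate validation process-wide, so apply it only in code that routes traffic through the proxy:

```java
import javax.net.ssl.HostnameVerifier;
import javax.net.ssl.HttpsURLConnection;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;
import java.security.cert.X509Certificate;

public class TrustAllCerts {
    // Install a trust-all SSLContext so HTTPS requests routed through the
    // proxy are not rejected for presenting ScraperAPI's certificate.
    // WARNING: this disables certificate validation for the whole process.
    public static void disableSslVerification() throws Exception {
        TrustManager[] trustAll = new TrustManager[] {
            new X509TrustManager() {
                public X509Certificate[] getAcceptedIssuers() { return new X509Certificate[0]; }
                public void checkClientTrusted(X509Certificate[] certs, String authType) {}
                public void checkServerTrusted(X509Certificate[] certs, String authType) {}
            }
        };
        SSLContext sslContext = SSLContext.getInstance("TLS");
        sslContext.init(null, trustAll, new java.security.SecureRandom());
        HttpsURLConnection.setDefaultSSLSocketFactory(sslContext.getSocketFactory());
        // Also accept any hostname presented by the proxy's certificate
        HttpsURLConnection.setDefaultHostnameVerifier((hostname, session) -> true);
    }

    public static void main(String[] args) throws Exception {
        disableSslVerification();
        System.out.println("SSL verification disabled");
    }
}
```

Call `disableSslVerification()` once at startup, before opening any connections.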

To enable extra functionality while using the API in proxy mode, you can pass parameters to the API by appending them to the username, separated by periods.

For example, if you want to enable JavaScript rendering with a request, the username would be scraperapi.render=true

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.Authenticator;
import java.net.HttpURLConnection;
import java.net.PasswordAuthentication;
import java.net.URL;

try {
    String apiKey = "APIKEY";
    String proxy = "proxy-server.scraperapi.com";
    URL server = new URL("https://httpbin.org/ip");
    String proxyUsername = "scraperapi.render=true";
    Authenticator.setDefault(new Authenticator() {
        @Override
        protected PasswordAuthentication getPasswordAuthentication() {
            return new PasswordAuthentication(proxyUsername, apiKey.toCharArray());
        }
    });
    // Route both plain-HTTP and HTTPS traffic through the proxy port
    System.setProperty("http.proxyHost", proxy);
    System.setProperty("http.proxyPort", "8001");
    System.setProperty("https.proxyHost", proxy);
    System.setProperty("https.proxyPort", "8001");
    // Since JDK 8u111, Basic auth is disabled for HTTPS tunneling by default;
    // clear the restriction so the proxy credentials are actually sent
    System.setProperty("jdk.http.auth.tunneling.disabledSchemes", "");
    HttpURLConnection connection = (HttpURLConnection) server.openConnection();
    int responseCode = connection.getResponseCode();
    if (responseCode == HttpURLConnection.HTTP_OK) {
        BufferedReader in = new BufferedReader(new InputStreamReader(connection.getInputStream()));
        StringBuilder response = new StringBuilder();
        String line;
        while ((line = in.readLine()) != null) {
            response.append(line);
        }
        in.close();
        System.out.println(response);
    } else {
        throw new Exception("Error in API call, status code: " + responseCode);
    }
} catch (Exception ex) {
    ex.printStackTrace();
}

Multiple parameters can be included by separating them with periods; for example:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.Authenticator;
import java.net.HttpURLConnection;
import java.net.PasswordAuthentication;
import java.net.URL;

try {
    String apiKey = "APIKEY";
    String proxy = "proxy-server.scraperapi.com";
    URL server = new URL("https://httpbin.org/ip");
    String proxyUsername = "scraperapi.render=true.country_code=us";
    Authenticator.setDefault(new Authenticator() {
        @Override
        protected PasswordAuthentication getPasswordAuthentication() {
            return new PasswordAuthentication(proxyUsername, apiKey.toCharArray());
        }
    });
    // Route both plain-HTTP and HTTPS traffic through the proxy port
    System.setProperty("http.proxyHost", proxy);
    System.setProperty("http.proxyPort", "8001");
    System.setProperty("https.proxyHost", proxy);
    System.setProperty("https.proxyPort", "8001");
    // Since JDK 8u111, Basic auth is disabled for HTTPS tunneling by default;
    // clear the restriction so the proxy credentials are actually sent
    System.setProperty("jdk.http.auth.tunneling.disabledSchemes", "");
    HttpURLConnection connection = (HttpURLConnection) server.openConnection();
    int responseCode = connection.getResponseCode();
    if (responseCode == HttpURLConnection.HTTP_OK) {
        BufferedReader in = new BufferedReader(new InputStreamReader(connection.getInputStream()));
        StringBuilder response = new StringBuilder();
        String line;
        while ((line = in.readLine()) != null) {
            response.append(line);
        }
        in.close();
        System.out.println(response);
    } else {
        throw new Exception("Error in API call, status code: " + responseCode);
    }
} catch (Exception ex) {
    ex.printStackTrace();
}
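When several parameters are in play, the period-joined username can be assembled from a parameter map instead of hand-concatenating strings. ProxyUsernameBuilder below is a hypothetical helper, not part of any ScraperAPI SDK:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ProxyUsernameBuilder {
    // Builds the proxy username by joining key=value parameters with
    // periods, e.g. scraperapi.render=true.country_code=us
    public static String build(Map<String, String> params) {
        StringBuilder sb = new StringBuilder("scraperapi");
        for (Map.Entry<String, String> e : params.entrySet()) {
            sb.append('.').append(e.getKey()).append('=').append(e.getValue());
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // LinkedHashMap preserves insertion order in the generated username
        Map<String, String> params = new LinkedHashMap<>();
        params.put("render", "true");
        params.put("country_code", "us");
        System.out.println(build(params)); // scraperapi.render=true.country_code=us
    }
}
```

The result can be passed straight into the PasswordAuthentication used in the examples above.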

Proxy Mode with SSL verification

If you would like to send requests to our Proxy API with SSL verification, you can manually trust our certificate by following these steps:

  • Download our proxy CA certificate.

  • Manually trust it:

Once you've downloaded the certificate, manually trust it in your scraping tool or library settings. This step varies depending on the tool or library you're using, but typically involves importing the certificate into your trusted root store or configuring your SSL/TLS settings. Depending on your operating system, follow the instructions below to install the ScraperAPI CA Certificate:
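As an alternative to the OS-level trust stores described below, a Java application can trust the downloaded certificate programmatically by loading it into an in-memory KeyStore. This sketch assumes the certificate was saved as proxyca.pem in the working directory:

```java
import javax.net.ssl.HttpsURLConnection;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManagerFactory;
import java.io.FileInputStream;
import java.security.KeyStore;
import java.security.cert.Certificate;
import java.security.cert.CertificateFactory;

public class TrustProxyCa {
    // Load a PEM certificate into an empty KeyStore and install an
    // SSLContext that trusts it for all HttpsURLConnections.
    public static void trustCertificate(String pemPath) throws Exception {
        CertificateFactory cf = CertificateFactory.getInstance("X.509");
        Certificate ca;
        try (FileInputStream in = new FileInputStream(pemPath)) {
            ca = cf.generateCertificate(in);
        }
        KeyStore ks = KeyStore.getInstance(KeyStore.getDefaultType());
        ks.load(null, null); // start from an empty keystore
        ks.setCertificateEntry("scraperapi-proxy-ca", ca);
        TrustManagerFactory tmf =
            TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
        tmf.init(ks);
        SSLContext ctx = SSLContext.getInstance("TLS");
        ctx.init(null, tmf.getTrustManagers(), null);
        HttpsURLConnection.setDefaultSSLSocketFactory(ctx.getSocketFactory());
    }

    public static void main(String[] args) throws Exception {
        try {
            trustCertificate("proxyca.pem");
            System.out.println("ScraperAPI proxy CA trusted");
        } catch (java.io.FileNotFoundException e) {
            System.out.println("proxyca.pem not found");
        }
    }
}
```

Note that this trust store replaces the JVM default for connections made afterwards, so only the proxy CA (not the usual public roots) is trusted by those connections.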

Windows 10/11

  1. Press the Win key + R hotkey and input mmc in Run to open the Microsoft Management Console window.

  2. Click File and select Add/Remove Snap-ins.

  3. In the opened window select Certificates and press the Add > button.

  4. In the Certificates Snap-in window select Computer account > Local computer, and press the Finish button to close the window.

  5. Press the OK button in the Add or Remove Snap-ins window.

  6. Back in the Microsoft Management Console window, expand Certificates under Console Root and right-click Trusted Root Certification Authorities.

  7. From the context menu select All Tasks > Import to open the Certificate Import Wizard window from which you can add the Scraper API certificate.

If you encounter any Certificate Revocation List (CRL) Distribution Points (DPs) related errors, please configure your SSL context to ignore certificate revocation checks.
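On the Oracle/OpenJDK stack, one way to relax revocation checking is through the com.sun.net.ssl.checkRevocation system property and the ocsp.enable security property. A minimal sketch; note that both switches affect the whole JVM:

```java
import java.security.Security;

public class DisableRevocationChecks {
    public static void main(String[] args) {
        // Skip CRL distribution point lookups during PKIX path validation
        System.setProperty("com.sun.net.ssl.checkRevocation", "false");
        // Disable OCSP-based revocation checks as well
        Security.setProperty("ocsp.enable", "false");
        System.out.println("Revocation checking disabled");
    }
}
```

Set these before the first HTTPS connection is opened, since the default SSL context is initialized lazily and caches its configuration.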

macOS

  1. Open Keychain Access window (Launchpad > Other > Keychain Access).

  2. Select the System keychain under Keychains, then drag and drop the downloaded certificate file into it (or select File > Import Items... and navigate to the file).

  3. Enter the administrator password to modify the keychain.

  4. Double-click the ScraperAPI CA certificate entry, expand Trust, next to When using this certificate: select Always Trust.

  5. Close the window and enter the administrator password again to update the settings.

Linux

  1. Copy the downloaded ScraperAPI proxyca.pem file into the system certificate directory (on Debian/Ubuntu, update-ca-certificates only picks up files with a .crt extension):

sudo cp proxyca.pem /usr/local/share/ca-certificates/proxyca.crt

  2. Update the stored Certificate Authority files:

sudo update-ca-certificates

Last updated 8 months ago


Please follow this link to download our proxy CA certificate.

More details can be found here.

If you have any questions or need further assistance regarding web scraping or certificate management, don't hesitate to reach out to our support team. We're here to help you every step of the way!