# Proxy Port Method

To simplify implementation for users with existing proxy pools, we offer a proxy front-end to the API. The proxy takes your requests, forwards them to ScraperAPI, and automatically handles proxy rotation, CAPTCHAs, and retries. It provides the same features and performance as the API endpoint.

The `username` for the proxy is **scraperapi** and the `password` is your **API key**.

{% tabs %}
{% tab title="cURL" %}

```bash
curl --proxy 'http://scraperapi:API_KEY@proxy-server.scraperapi.com:8001' \
  -k \
  'https://www.example.com'
```

{% endtab %}

{% tab title="Python" %}

```python
import requests

# Replace API_KEY with your actual API key.
# requests selects the proxy by the target URL's scheme,
# so both http and https must be set for https pages.
proxies = {
    "http": "http://scraperapi:API_KEY@proxy-server.scraperapi.com:8001",
    "https": "http://scraperapi:API_KEY@proxy-server.scraperapi.com:8001",
}

r = requests.get('https://www.example.com', proxies=proxies, verify=False)
print(r.text)
```

{% endtab %}

{% tab title="NodeJS" %}

```javascript
import axios from 'axios';

axios.get('https://www.example.com', {
  proxy: {
    host: 'proxy-server.scraperapi.com',
    port: 8001,
    auth: {
      username: 'scraperapi',
      //Replace the value for password with your actual API Key
      password: 'API_KEY'  
    },
    protocol: 'http'
  }
})
  .then(response => {
    console.log(response.data)
  })
  .catch(error => {
    console.log(error)
  });
```

{% endtab %}

{% tab title="PHP" %}

```php
<?php

$ch = curl_init();

curl_setopt($ch, CURLOPT_URL, "https://www.example.com");
// Replace API_KEY with your actual API key
curl_setopt($ch, CURLOPT_PROXY, "http://scraperapi:API_KEY@proxy-server.scraperapi.com:8001");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_HEADER, FALSE);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);

$response = curl_exec($ch);
curl_close($ch);
var_dump($response);
```

{% endtab %}

{% tab title="Ruby" %}

```ruby
require 'httparty'

HTTParty::Basement.default_options.update(verify: false)

response = HTTParty.get('https://www.example.com', {
  http_proxyaddr: "proxy-server.scraperapi.com",
  http_proxyport: "8001",
  http_proxyuser: "scraperapi",
  # Replace API_KEY with your actual API key
  http_proxypass: "API_KEY"
})

results = response.body
puts results
```

{% endtab %}

{% tab title="Java" %}

```java
import java.io.*;
import java.net.*;
import javax.net.ssl.*;

public class Main {
    public static void main(String[] args) {
        try {
            SSLContext sc = SSLContext.getInstance("TLS");
            sc.init(null, new TrustManager[]{new X509TrustManager() {
                public java.security.cert.X509Certificate[] getAcceptedIssuers() { return null; }
                public void checkClientTrusted(java.security.cert.X509Certificate[] c, String a) {}
                public void checkServerTrusted(java.security.cert.X509Certificate[] c, String a) {}
            }}, new java.security.SecureRandom());
            HttpsURLConnection.setDefaultSSLSocketFactory(sc.getSocketFactory());
            HttpsURLConnection.setDefaultHostnameVerifier((h, s) -> true);

            String apiKey = "API_KEY"; // Replace API_KEY with your actual API key
            Authenticator.setDefault(new Authenticator() {
                protected PasswordAuthentication getPasswordAuthentication() {
                    return new PasswordAuthentication("scraperapi", apiKey.toCharArray());
                }
            });

            System.setProperty("jdk.http.auth.tunneling.disabledSchemes", "");
            System.setProperty("https.proxyHost", "proxy-server.scraperapi.com");
            System.setProperty("https.proxyPort", "8001");

            BufferedReader in = new BufferedReader(new InputStreamReader(
                new URL("https://example.com").openConnection().getInputStream()
            ));
            StringBuilder resp = new StringBuilder();
            for (String line; (line = in.readLine()) != null; ) resp.append(line).append("\n");
            in.close();
            System.out.println(resp);
        } catch (Exception e) { e.printStackTrace(); }
    }
}
```

{% endtab %}
{% endtabs %}

***Note:** So that we can properly direct your requests through the API, your code must be configured not to verify SSL certificates (or you can trust our CA certificate, as described in the SSL verification section below).*

To enable extra functionality while using the API in proxy mode, you can pass parameters to the API by adding them to the `username`, separated by dots. For example, to enable JavaScript rendering for a request, the username would be `scraperapi.render=true`. You can combine multiple parameters in a single request.

{% tabs %}
{% tab title="cURL" %}

```bash
curl --proxy 'http://scraperapi.render=true.country_code=us:API_KEY@proxy-server.scraperapi.com:8001' \
  -k \
  'https://www.example.com'
```

{% endtab %}

{% tab title="Python" %}

```python
import requests

# Replace API_KEY with your actual API key.
# requests selects the proxy by the target URL's scheme,
# so both http and https must be set for https pages.
proxies = {
    "http": "http://scraperapi.render=true.country_code=us:API_KEY@proxy-server.scraperapi.com:8001",
    "https": "http://scraperapi.render=true.country_code=us:API_KEY@proxy-server.scraperapi.com:8001",
}

r = requests.get('https://www.example.com', proxies=proxies, verify=False)
print(r.text)
```

{% endtab %}

{% tab title="NodeJS" %}

```javascript
import axios from 'axios';

axios.get('https://www.example.com', {
  proxy: {
    host: 'proxy-server.scraperapi.com',
    port: 8001,
    auth: {
      username: 'scraperapi.render=true.country_code=us',
      //Replace the value for password with your actual API Key
      password: 'API_KEY'  
    },
    protocol: 'http'
  }
})
  .then(response => {
    console.log(response.data)
  })
  .catch(error => {
    console.log(error)
  });
```

{% endtab %}

{% tab title="PHP" %}

```php
<?php

$ch = curl_init();

curl_setopt($ch, CURLOPT_URL, "https://www.example.com");
// Replace API_KEY with your actual API key
curl_setopt($ch, CURLOPT_PROXY, "http://scraperapi.render=true.country_code=us:API_KEY@proxy-server.scraperapi.com:8001");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_HEADER, FALSE);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);

$response = curl_exec($ch);
curl_close($ch);
var_dump($response);
```

{% endtab %}

{% tab title="Ruby" %}

```ruby
require 'httparty'

HTTParty::Basement.default_options.update(verify: false)

response = HTTParty.get('https://www.example.com', {
  http_proxyaddr: "proxy-server.scraperapi.com",
  http_proxyport: "8001",
  http_proxyuser: "scraperapi.render=true.country_code=us",
  # Replace API_KEY with your actual API key
  http_proxypass: "API_KEY"
})

results = response.body
puts results
```

{% endtab %}

{% tab title="Java" %}

```java
import java.io.*;
import java.net.*;
import javax.net.ssl.*;

public class Main {
    public static void main(String[] args) {
        try {
            SSLContext sc = SSLContext.getInstance("TLS");
            sc.init(null, new TrustManager[]{new X509TrustManager() {
                public java.security.cert.X509Certificate[] getAcceptedIssuers() { return null; }
                public void checkClientTrusted(java.security.cert.X509Certificate[] c, String a) {}
                public void checkServerTrusted(java.security.cert.X509Certificate[] c, String a) {}
            }}, new java.security.SecureRandom());
            HttpsURLConnection.setDefaultSSLSocketFactory(sc.getSocketFactory());
            HttpsURLConnection.setDefaultHostnameVerifier((h, s) -> true);

            String apiKey = "API_KEY"; // Replace API_KEY with your actual API key
            Authenticator.setDefault(new Authenticator() {
                protected PasswordAuthentication getPasswordAuthentication() {
                    return new PasswordAuthentication("scraperapi.render=true.country_code=us", apiKey.toCharArray());
                }
            });

            System.setProperty("jdk.http.auth.tunneling.disabledSchemes", "");
            System.setProperty("https.proxyHost", "proxy-server.scraperapi.com");
            System.setProperty("https.proxyPort", "8001");

            BufferedReader in = new BufferedReader(new InputStreamReader(
                new URL("https://example.com").openConnection().getInputStream()
            ));
            StringBuilder resp = new StringBuilder();
            for (String line; (line = in.readLine()) != null; ) resp.append(line).append("\n");
            in.close();
            System.out.println(resp);
        } catch (Exception e) { e.printStackTrace(); }
    }
}
```

{% endtab %}
{% endtabs %}
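If you build these usernames in code, a small helper keeps the dot-separated `key=value` syntax in one place. The `build_proxy_url` function below is our own illustration, not part of any ScraperAPI client library:

```python
def build_proxy_url(api_key, params=None,
                    host="proxy-server.scraperapi.com", port=8001):
    """Assemble a ScraperAPI proxy URL, appending optional parameters
    to the username as dot-separated key=value pairs."""
    username = "scraperapi"
    if params:
        username += "." + ".".join(f"{k}={v}" for k, v in params.items())
    return f"http://{username}:{api_key}@{host}:{port}"

print(build_proxy_url("API_KEY", {"render": "true", "country_code": "us"}))
# http://scraperapi.render=true.country_code=us:API_KEY@proxy-server.scraperapi.com:8001
```

The returned string can be passed straight to a `proxies` dict in requests or to cURL's `--proxy` flag.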

### Proxy Mode with SSL Verification

If you would like to send SSL-verified requests to our Proxy API, you can manually trust our certificate by following these steps:

{% stepper %}
{% step %}
**Download Our Proxy CA Certificate**

Follow [this](https://api.scraperapi.com/proxyca.pem) link to download our proxy CA certificate.
{% endstep %}

{% step %}
**Manually trust it on your device**

Once you've downloaded the certificate, manually trust it in your scraping tool or library settings. This step varies depending on the tool or library you're using, but typically involves importing the certificate into your trusted root store or configuring your SSL/TLS settings. Depending on your operating system, follow the instructions below to install the ScraperAPI CA certificate.
{% endstep %}
{% endstepper %}
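Many HTTP clients can also use the downloaded bundle directly, without installing it system-wide. For example, cURL accepts it via `--cacert` in place of `-k` (this assumes `proxyca.pem` is in the current directory and `API_KEY` is replaced with your key):

```shell
curl --proxy 'http://scraperapi:API_KEY@proxy-server.scraperapi.com:8001' \
  --cacert proxyca.pem \
  'https://www.example.com'
```

This keeps certificate verification enabled while trusting the ScraperAPI proxy CA for this request only.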

#### Windows 10/11 <a href="#windows-10-11" id="windows-10-11"></a>

1. Press **`Win + R`** and enter **`mmc`** in the Run dialog to open the Microsoft Management Console window.
2. Click **`File`** and select **`Add/Remove Snap-in`**.
3. In the opened window, select **`Certificates`** and press the **`Add >`** button.
4. In the Certificates snap-in window, select **`Computer account > Local computer`**, then press the **`Finish`** button to close the window.
5. Press the **`OK`** button in the Add or Remove Snap-ins window.
6. Back in the Microsoft Management Console window, expand **`Certificates`** under Console Root and right-click **`Trusted Root Certification Authorities`**.
7. From the context menu, select **`All Tasks > Import`** to open the Certificate Import Wizard, from which you can add the ScraperAPI certificate.

More details can be found [here](https://windowsreport.com/install-windows-10-root-certificates/).

{% hint style="warning" %}
If you encounter any Certificate Revocation List (CRL) Distribution Points (DPs) related errors, please configure your SSL context to ignore certificate revocation checks.
{% endhint %}

#### macOS <a href="#macos" id="macos"></a>

1. Open the Keychain Access window (**`Launchpad > Other > Keychain Access`**).
2. Select the **`System`** keychain under Keychains, then drag and drop the downloaded certificate file into it (or select **`File > Import Items...`** and navigate to the file).
3. Enter the administrator password to modify the keychain.
4. Double-click the **`ScraperAPI CA`** certificate entry, expand **Trust**, and next to *When using this certificate:* select **`Always Trust`**.
5. Close the window and enter the administrator password again to save the settings.

#### Linux <a href="#linux" id="linux"></a>

1. Copy the downloaded ScraperAPI `proxyca.pem` file into the system certificate directory (on Debian/Ubuntu, `update-ca-certificates` only picks up files with a `.crt` extension):

```bash
sudo cp proxyca.pem /usr/local/share/ca-certificates/proxyca.crt
```

2. Update the stored Certificate Authority files:

```bash
sudo update-ca-certificates
```
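After updating the store, you can sanity-check that your HTTP stack sees the system CAs. For example, Python's `ssl` module reports how many CA certificates the default context trusts (a generic diagnostic, not specific to ScraperAPI):

```python
import ssl

# Build the default SSL context, which loads the system trust store,
# and report counts of trusted certificates, CRLs, and CA certificates.
ctx = ssl.create_default_context()
print(ctx.cert_store_stats())
```

After a successful `update-ca-certificates` run, the `x509_ca` count should include the newly installed ScraperAPI CA.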

If you have any questions or need further assistance regarding web scraping or certificate management, don't hesitate to reach out to the [support team](https://dashboard.scraperapi.com/contact-support).


