# Sending Custom Headers

{% hint style="danger" %}
**IMPORTANT**\
\
**Only use this feature if you need to send custom headers to retrieve specific results from the target website. The API includes a sophisticated header management system designed to increase success rates and performance on difficult sites. When you send your own custom headers, you override that system, which often lowers your success rate. Unless you absolutely need custom headers to get the data you need, we advise against using this functionality.**

**Custom headers are not supported when `ultra_premium=true` is enabled. Even if `keep_headers=true` is specified, all custom headers will be discarded for ultra-premium-enabled requests.**
{% endhint %}

You can send custom headers (User Agent, Referer, Cookies, etc.) when making a request through our API. By adding the **`keep_headers=true`** parameter, the API will forward these headers to the target website.

* **API REQUEST**

{% tabs %}
{% tab title="cURL" %}

```bash
curl --request GET \
  --header "X-MyHeader: 123" \
  --url 'http://api.scraperapi.com/?api_key=API_KEY&keep_headers=true&url=http://example.com' 
```

{% endtab %}

{% tab title="Python" %}

```python
import requests
from urllib.parse import quote

target_url = 'https://example.com'
# Replace the value for api_key with your actual API Key.
api_key = 'API_KEY'
headers = {
    'X-MyHeader': '123'
}

request_url = (
    f'http://api.scraperapi.com?'
    f'api_key={api_key}'
    # URL-encode the target so its query characters pass through safely.
    f'&url={quote(target_url, safe="")}'
    f'&keep_headers=true'
)

response = requests.get(request_url, headers=headers)
print(response.text)
```

{% endtab %}

{% tab title="NodeJS" %}

```javascript
import fetch from 'node-fetch';

const targetUrl = 'https://www.example.com';
const apiKey = 'API_KEY';

const requestUrl =
  `http://api.scraperapi.com?` +
  // Replace the value for api_key with your actual API Key.
  `api_key=${apiKey}` +
  `&url=${encodeURIComponent(targetUrl)}` +
  `&keep_headers=true`;

const headers = {
  'X-MyHeader': '123'
};

const response = await fetch(requestUrl, { method: 'GET', headers });
console.log(await response.text());
```

{% endtab %}

{% tab title="PHP" %}

```php
<?php
# Replace the value for api_key with your actual API Key.
$url = "http://api.scraperapi.com?api_key=API_KEY&url=" . urlencode("https://www.example.com") . "&keep_headers=true";
$headers = [
    "X-MyHeader: 123"
];
$ch = curl_init();

curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

$response = curl_exec($ch);

curl_close($ch);
echo $response;
?>
```

{% endtab %}

{% tab title="Ruby" %}

```ruby
require 'net/http'

params = {
  # Replace the value for api_key with your actual API Key.
  api_key: 'API_KEY',
  url: 'https://example.com/',
  keep_headers: 'true'
}

uri = URI('https://api.scraperapi.com/')
uri.query = URI.encode_www_form(params)

req = Net::HTTP::Get.new(uri)
req['X-MyHeader'] = '123'

website_content = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) do |http|
  http.request(req).body
end

puts website_content
```

{% endtab %}

{% tab title="Java" %}

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class Main {
    public static void main(String[] args) throws Exception {
        String apiKey = "API_KEY";
        String targetUrl = "https://www.example.com";

        String scraperApiUrl =
                "https://api.scraperapi.com?"
                // Replace the value for api_key with your actual API Key.
                + "api_key=" + apiKey
                // URL-encode the target so its query characters pass through safely.
                + "&url=" + URLEncoder.encode(targetUrl, StandardCharsets.UTF_8)
                + "&keep_headers=true";

        HttpClient client = HttpClient.newHttpClient();

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(scraperApiUrl))
                .header("X-MyHeader", "123")
                .GET()
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.body());
    }
}
```

{% endtab %}
{% endtabs %}

* **PROXY MODE**

{% tabs %}
{% tab title="cURL" %}

```bash
curl --request GET \
  --header "X-MyHeader: 123" \
  --proxy "http://scraperapi.keep_headers=true:API_KEY@proxy-server.scraperapi.com:8001" \
  --insecure \
  "http://example.com"
```

{% endtab %}

{% tab title="Python" %}

```python
import requests

target_url = 'https://example.com'
# Replace the value for api_key with your actual API Key.
api_key = 'API_KEY'

headers = {
    'X-MyHeader': '123'
}

proxies = {
    # Both schemes must be mapped, or requests bypasses the proxy for https targets.
    'http': f'http://scraperapi.keep_headers=true:{api_key}@proxy-server.scraperapi.com:8001',
    'https': f'http://scraperapi.keep_headers=true:{api_key}@proxy-server.scraperapi.com:8001'
}

response = requests.get(
    target_url,
    headers=headers,
    proxies=proxies,
    verify=False
)

print(response.text)
```

{% endtab %}

{% tab title="NodeJS" %}

```javascript
import axios from 'axios';

axios.get('https://www.example.com', {
  headers: {
    'X-MyHeader': '123'
  },
  proxy: {
    host: 'proxy-server.scraperapi.com',
    port: 8001,
    auth: {
      username: 'scraperapi.keep_headers=true',
      //Replace the value for password with your actual API Key.
      password: 'API_KEY'
    },
    protocol: 'http'
  }
})
  .then(response => {
    console.log(response.data);
  })
  .catch(error => {
    console.log(error);
  });
```

{% endtab %}

{% tab title="PHP" %}

```php
<?php
$headers = [
    "X-MyHeader: 123"
];
$ch = curl_init();

curl_setopt($ch, CURLOPT_URL, "https://www.example.com");

curl_setopt(
    $ch,
    CURLOPT_PROXY,
    //Replace the value for api_key with your actual API Key.
    "http://scraperapi.keep_headers=true:API_KEY@proxy-server.scraperapi.com:8001"
);

curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);

$response = curl_exec($ch);
curl_close($ch);

echo $response;
```

{% endtab %}

{% tab title="Ruby" %}

```ruby
require 'httparty'

HTTParty::Basement.default_options.update(verify: false)

headers = {
  'X-MyHeader' => '123'
}

response = HTTParty.get(
  'https://www.example.com',
  http_proxyaddr: 'proxy-server.scraperapi.com',
  http_proxyport: 8001,
  http_proxyuser: 'scraperapi.keep_headers=true',
  # Replace the value for http_proxypass with your actual API Key.
  http_proxypass: 'API_KEY',
  headers: headers
)

puts response.body
```

{% endtab %}

{% tab title="Java" %}

```java
import java.io.*;
import java.net.*;
import javax.net.ssl.*;
import java.security.cert.X509Certificate;

public class Main {
    public static void main(String[] args) {
        try {
            SSLContext sc = SSLContext.getInstance("TLS");
            sc.init(null, new TrustManager[]{
                new X509TrustManager() {
                    public X509Certificate[] getAcceptedIssuers() { return null; }
                    public void checkClientTrusted(X509Certificate[] certs, String authType) {}
                    public void checkServerTrusted(X509Certificate[] certs, String authType) {}
                }
            }, new java.security.SecureRandom());
            HttpsURLConnection.setDefaultSSLSocketFactory(sc.getSocketFactory());
            HttpsURLConnection.setDefaultHostnameVerifier((hostname, session) -> true);

            String apiKey = "API_KEY"; //Replace the value for api_key with your actual API Key.

            Authenticator.setDefault(new Authenticator() {
                protected PasswordAuthentication getPasswordAuthentication() {
                    return new PasswordAuthentication(
                        "scraperapi.keep_headers=true",
                        apiKey.toCharArray()
                    );
                }
            });

            System.setProperty("jdk.http.auth.tunneling.disabledSchemes", "");
            System.setProperty("https.proxyHost", "proxy-server.scraperapi.com");
            System.setProperty("https.proxyPort", "8001");

            HttpsURLConnection conn =
                (HttpsURLConnection) new URL("https://www.example.com").openConnection();

            conn.setRequestMethod("GET");
            conn.setRequestProperty("X-MyHeader", "123");

            BufferedReader in;
            if (conn.getResponseCode() >= 400) {
                in = new BufferedReader(new InputStreamReader(conn.getErrorStream()));
            } else {
                in = new BufferedReader(new InputStreamReader(conn.getInputStream()));
            }

            StringBuilder response = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                response.append(line).append("\n");
            }
            in.close();

            System.out.println(response.toString());

        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```

{% endtab %}
{% endtabs %}

{% hint style="success" %}
***PRO TIP:***\
*If you want to get mobile results, you can use the `device_type=mobile` parameter to have the API set a mobile User-Agent header for your request, instead of setting one manually.*
{% endhint %}
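
The `device_type=mobile` parameter slots into any of the API-request examples above. As a minimal Python sketch, here is one way to build such a request URL with the standard library's `urlencode`, which also handles URL-encoding the target for you:

```python
from urllib.parse import urlencode

# Replace API_KEY with your actual API Key.
params = {
    'api_key': 'API_KEY',
    'url': 'https://example.com',
    'device_type': 'mobile',  # the API sets a mobile User-Agent for you
}

# urlencode URL-encodes each value, including the target URL itself.
request_url = 'http://api.scraperapi.com/?' + urlencode(params)
print(request_url)
```

Pass `request_url` to `requests.get()` exactly as in the Python examples above.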
