Use ScraperAPI Proxy Port in Ruby

Learn to implement ScraperAPI's proxy port in Ruby. Configure rotating proxies, bypass CAPTCHAs, and enable JS rendering with API keys. Includes SSL setup guide.

To simplify implementation for users with existing proxy pools, we offer a proxy front-end to the API. The proxy will take your requests and pass them through to the API, which will take care of proxy rotation, CAPTCHAs, and retries.

The proxy mode is a light front-end for the API and has all the same functionality and performance as sending requests to the API endpoint.

The username for the proxy is scraperapi and the password is your API key.

require 'httparty'

HTTParty::Basement.default_options.update(verify: false)

response = HTTParty.get('http://httpbin.org/ip', {
  http_proxyaddr: "proxy-server.scraperapi.com",
  http_proxyport: "8001",
  http_proxyuser: "scraperapi",
  http_proxypass: "APIKEY"
})
results = response.body
puts results

Note: So that we can properly direct your requests through the API, your code must be configured not to verify SSL certificates.

To enable extra functionality whilst using the API in proxy mode, you can pass parameters to the API by appending them to the username, separated by periods.

For example, if you want to enable JavaScript rendering with a request, the username would be scraperapi.render=true

require 'httparty'

HTTParty::Basement.default_options.update(verify: false)

response = HTTParty.get('http://httpbin.org/ip', {
  http_proxyaddr: "proxy-server.scraperapi.com",
  http_proxyport: "8001",
  http_proxyuser: "scraperapi.render=true",
  http_proxypass: "APIKEY"
})
results = response.body
puts results

Multiple parameters can be included by separating them with periods; for example:

require 'httparty'

HTTParty::Basement.default_options.update(verify: false)

response = HTTParty.get('http://httpbin.org/ip', {
  http_proxyaddr: "proxy-server.scraperapi.com",
  http_proxyport: "8001",
  http_proxyuser: "scraperapi.render=true.country_code=us",
  http_proxypass: "APIKEY"
})
results = response.body
puts results
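The period-separated username can also be assembled programmatically. The sketch below uses Ruby's built-in Net::HTTP; the helper name scraperapi_proxy is our own illustration, not part of any official SDK:

```ruby
require 'net/http'

# Build a Net::HTTP client routed through ScraperAPI's proxy port.
# Extra API parameters are appended to the proxy username, separated
# by periods, e.g. "scraperapi.render=true.country_code=us".
# (Illustrative helper, not an official ScraperAPI SDK method.)
def scraperapi_proxy(host, api_key, params = {})
  user = (['scraperapi'] + params.map { |k, v| "#{k}=#{v}" }).join('.')
  Net::HTTP.new(host, 80, 'proxy-server.scraperapi.com', 8001, user, api_key)
end

http = scraperapi_proxy('httpbin.org', 'APIKEY', render: true, country_code: 'us')
puts http.proxy_user  # => "scraperapi.render=true.country_code=us"
# response = http.get('/ip')  # uncomment once APIKEY is your real key
```

Net::HTTP.new is lazy, so no connection is opened until a request is made, which makes it convenient to inspect the generated proxy username before sending anything.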

Proxy Mode with SSL verification

If you would like to send requests to our Proxy API with SSL verification, you can manually trust our certificate by following these steps:

  • Download our proxy CA certificate.

  • Manual trust: once you've downloaded the certificate, manually trust it in your scraping tool or library settings. This step may vary depending on the tool or library you're using, but typically involves importing the certificate into your trusted root store or configuring SSL/TLS settings. Depending on your operating system, follow the instructions below to install the ScraperAPI CA certificate:

Windows 10/11

  1. Press the Win key + R hotkey, type mmc in the Run dialog, and press Enter to open the Microsoft Management Console window.

  2. Click File and select Add/Remove Snap-ins.

  3. In the opened window select Certificates and press the Add > button.

  4. In the Certificates Snap-in window select Computer account, then Local computer, and press the Finish button to close the window.

  5. Press the OK button in the Add or Remove Snap-in window.

  6. Back in the Microsoft Management Console window, expand Certificates under Console Root and right-click Trusted Root Certification Authorities.

  7. From the context menu select All Tasks > Import to open the Certificate Import Wizard window from which you can add the Scraper API certificate.

If you encounter any Certificate Revocation List (CRL) Distribution Points (DPs) related errors, please set the http.verify_mode option to OpenSSL::SSL::VERIFY_NONE in your script.
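With Ruby's built-in Net::HTTP, disabling verification for a request routed through the proxy port looks like the following sketch (httpbin.org stands in for your target site):

```ruby
require 'net/http'
require 'openssl'

# Route an HTTPS request through the ScraperAPI proxy port while
# skipping certificate verification, which sidesteps CRL-related errors.
http = Net::HTTP.new('httpbin.org', 443,
                     'proxy-server.scraperapi.com', 8001,
                     'scraperapi', 'APIKEY')
http.use_ssl = true
http.verify_mode = OpenSSL::SSL::VERIFY_NONE

# response = http.get('/ip')  # uncomment with your real API key
```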

macOS

  1. Open Keychain Access window (Launchpad > Other > Keychain Access).

  2. Select the System keychain under Keychains, then drag and drop the downloaded certificate file into it (or select File > Import Items... and navigate to the file).

  3. Enter the administrator password to modify the keychain.

  4. Double-click the ScraperAPI CA certificate entry, expand Trust, and next to When using this certificate: select Always Trust.

  5. Close the window and enter the administrator password again to update the settings.

Linux

  1. Install the downloaded ScraperAPI proxyca.pem file (update-ca-certificates only picks up files with a .crt extension, so rename it on copy):

sudo cp proxyca.pem /usr/local/share/ca-certificates/proxyca.crt

  2. Update the stored Certificate Authority files:

sudo update-ca-certificates
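Once the certificate is in your trust store, you can keep SSL verification enabled in your script. A minimal sketch with Net::HTTP, assuming the system CA bundle now includes the ScraperAPI certificate (the ca_file path shown in the comment is illustrative):

```ruby
require 'net/http'
require 'openssl'

http = Net::HTTP.new('httpbin.org', 443,
                     'proxy-server.scraperapi.com', 8001,
                     'scraperapi', 'APIKEY')
http.use_ssl = true
http.verify_mode = OpenSSL::SSL::VERIFY_PEER  # verification stays on
# If the certificate isn't in the system store, point at the file directly:
# http.ca_file = '/path/to/proxyca.pem'

# response = http.get('/ip')  # uncomment with your real API key
```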

Last updated 8 months ago


Please follow this link to download our proxy CA certificate.

More details can be found here.

If you have any questions or need further assistance regarding web scraping or certificate management, don't hesitate to reach out to our support team. We're here to help you every step of the way!