Google SERP API: Async Structured Data in Python

Scrape Google SERPs into structured JSON or CSV with ScraperAPI's async service in Python. Extract search results, filter by region and language, and automate SERP insights.

This endpoint retrieves search data from a Google search results page and transforms it into usable JSON.

Single Query Request:

import requests

url = "https://async.scraperapi.com/structured/google/search"
headers = {
    "Content-Type": "application/json"
}
data = {
    "apiKey": "APIKEY",        # your ScraperAPI key
    "query": "QUERY",          # search keywords
    "tld": "TLD",              # Google domain, e.g. "com"
    "uule": "UULE",            # encoded region
    "num": "NUM",              # number of results
    "hl": "HOSTLANGUAGE",
    "gl": "GUESTLANGUAGE",
    "ie": "QUERYENCODING",
    "oe": "RESULTENCODING",
    "start": "START",          # result offset
    "callback": {
        "type": "webhook",
        "url": "YYYY"          # your webhook URL
    }
}

response = requests.post(url, json=data, headers=headers)
print(response.text)

Multiple Query Request:

import requests

url = "https://async.scraperapi.com/structured/google/search"
headers = {
    "Content-Type": "application/json"
}
data = {
    "apiKey": "APIKEY",                          # your ScraperAPI key
    "queries": ["QUERY1", "QUERY2", "QUERY3"],   # one job is created per query
    "tld": "TLD",
    "uule": "UULE",
    "num": "NUM",
    "hl": "HOSTLANGUAGE",
    "gl": "GUESTLANGUAGE",
    "ie": "QUERYENCODING",
    "oe": "RESULTENCODING",
    "start": "START",
    "callback": {
        "type": "webhook",
        "url": "YYYY"                            # your webhook URL
    }
}

response = requests.post(url, json=data, headers=headers)
print(response.text)

Parameter
Details

API_KEY (required)

User's normal API Key

QUERY (required)

Query keywords that a user wants to search for

COUNTRY_CODE

Valid values are two letter country codes for which we offer Geo Targeting (e.g. “au”, “es”, “it”, etc.). Where a Google domain needs to be scraped from another country (e.g. scraping google.com from Canada), both TLD and COUNTRY_CODE parameters must be specified.

TLD

Country of Google domain to scrape. This is an optional argument and defaults to "com" (google.com). Valid values include: com (google.com), co.uk (google.co.uk), ca (google.ca), de (google.de), es (google.es), fr (google.fr), it (google.it), co.jp (google.co.jp), in (google.in), cn (google.cn), com.sg (google.com.sg), com.mx (google.com.mx), ae (google.ae), com.br (google.com.br), nl (google.nl), com.au (google.com.au), com.tr (google.com.tr), sa (google.sa), se (google.se), pl (google.pl)

OUTPUT_FORMAT

For structured data methods we offer CSV and JSON output. JSON is the default if the parameter is not added. Options:

  • csv

  • json
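The output-format choice can be handled with a small helper that builds the request body and only adds the format field when a non-default value is wanted. This is a minimal sketch; the `outputFormat` field name and the `build_body` helper are assumptions for illustration, not a documented part of the API, so check the field name against your account's reference before relying on it.

```python
def build_body(api_key, query, output_format=None):
    """Build a request body for the async Google SERP endpoint.

    `output_format` is optional; the service defaults to JSON when it
    is omitted. NOTE: the `outputFormat` field name is an assumption
    used for illustration.
    """
    body = {"apiKey": api_key, "query": query}
    if output_format is not None:
        body["outputFormat"] = output_format  # "csv" or "json"
    return body

# Request CSV output for a single query
body = build_body("YOUR_API_KEY", "webscraping", output_format="csv")
```

Omitting `output_format` keeps the body minimal and lets the service fall back to its JSON default.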

Google parameters supported by this endpoint

Google Parameters
Details

UULE

Set a region for a search. For example: w+CAIQICINUGFyaXMsIEZyYW5jZQ. You can find UULE generators online.

NUM

Number of results

HL

Host Language. For example: DE

GL

Boosts matches whose country of origin matches the parameter value. For example: DE

TBS

Limits results to a specific time range. For example: tbs=d returns results from the past day. Possible values:

  • tbs=h - Hour

  • tbs=d - Day

  • tbs=w - Week

  • tbs=m - Month

  • tbs=y - Year

IE

Character encoding the engine uses to interpret the query string. For example: UTF8

OE

Character encoding used for the results. For example: UTF8

START

Sets the starting offset in the result list. With start=10, the first element in the result list will be the 10th search result (i.e. the response starts with page 2 of results when "num" is 10).
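The num/start relationship above can be sketched as a small helper that computes the offset for a given results page. The `start_offset` function is a hypothetical convenience for illustration, not part of the API:

```python
def start_offset(page, num=10):
    """Return the `start` value for a 1-indexed page of `num` results."""
    if page < 1:
        raise ValueError("page is 1-indexed")
    return (page - 1) * num

# Page 1 -> start=0; page 2 with num=10 -> start=10, matching the example above
offsets = [start_offset(p) for p in (1, 2, 3)]
```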

Sample Response

Single Query Request:

{
	"id": "89b388bf-0cea-49c1-8db6-9e00042c8c3a",
	"status": "running",
	"statusUrl": "https://async.scraperapi.com/structured/google/search/89b388bf-0cea-49c1-8db6-9e00042c8c3a",
	"query": "webscraping"
}

Multiple Query Request:

[
	{
		"id": "2955ad23-c812-475c-bc52-572576815d78",
		"status": "running",
		"statusUrl": "http://async.scraperapi.com/structured/google/search/2955ad23-c812-475c-bc52-572576815d78",
		"query": "webscraping"
	},
	{
		"id": "120a7344-7832-4fd0-a64f-ce9cd39726f3",
		"status": "running",
		"statusUrl": "https://async.scraperapi.com/structured/google/search/120a7344-7832-4fd0-a64f-ce9cd39726f3",
		"query": "data grabbing"
	}
]

After the job(s) finish, you will find the result under the response key in the response JSON object. The structure is the same as in the corresponding SYNC data endpoint.
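If you are not using a webhook callback, one way to collect the result is to poll the statusUrl until the job leaves the "running" state. This is a minimal sketch, not an official client: the `poll_job` helper is an assumption, and the fetch function is injected so the loop stays testable; in practice you would pass something like `lambda url: requests.get(url).json()`.

```python
import time

def poll_job(status_url, fetch, interval=5, max_attempts=60):
    """Poll an async job until its status is no longer "running".

    `fetch` is a callable taking a URL and returning the parsed JSON
    status object; `interval` is the delay between attempts in seconds.
    """
    for attempt in range(max_attempts):
        job = fetch(status_url)
        if job.get("status") != "running":
            return job  # finished jobs carry the data under the "response" key
        if attempt + 1 < max_attempts:
            time.sleep(interval)
    raise TimeoutError(f"job still running after {max_attempts} attempts")
```

Injecting `fetch` also makes it easy to add retry or authentication logic without touching the polling loop.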

