Async Batch Requests

To scrape multiple URLs at the same time, we provide a separate batch endpoint that accepts an array of URLs instead of a single one: https://async.scraperapi.com/batchjobs. The API is almost identical to the single-job endpoint, except that it expects an array of strings in the urls field rather than a single string in url.

<?php
// Submit a batch of URLs to the async batch endpoint.
$payload = json_encode(
    array(
        "apiKey" => "xxxxxx",
        "urls"   => array("https://example.com/page1", "https://example.com/page2")
    )
);

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "https://async.scraperapi.com/batchjobs");
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Content-Type: application/json"));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

$response = curl_exec($ch);
curl_close($ch);

print_r($response);

In response, you'll get an array of job objects, each in the same format returned by our single-job endpoint:

[
  {
    "id": "0962a8e0-5f1a-4e14-bf8c-5efcc18f0953",
    "status": "running",
    "statusUrl": "https://async.scraperapi.com/jobs/0962a8e0-5f1a-4e14-bf8c-5efcc18f0953",
    "url": "https://example.com/page1"
  },
  {
    "id": "238d54a1-62af-41a9-b0b4-63f240bad439",
    "status": "running",
    "statusUrl": "https://async.scraperapi.com/jobs/238d54a1-62af-41a9-b0b4-63f240bad439",
    "url": "https://example.com/page2"
  }
]
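
Each job can then be polled individually via its statusUrl until its status changes from running. Here is a minimal polling sketch, assuming $response holds the batch response above; only the status, statusUrl, and url fields shown in that response are relied on, and the shape of a finished job follows the single-job endpoint's documentation:

<?php
// Illustrative only: poll each job's statusUrl until it stops "running".
$jobs = json_decode($response, true);

foreach ($jobs as $job) {
    do {
        sleep(5); // wait between polls to avoid hammering the API

        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $job["statusUrl"]);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $status = json_decode(curl_exec($ch), true);
        curl_close($ch);
    } while ($status["status"] === "running");

    echo $job["url"] . " finished with status: " . $status["status"] . "\n";
}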
