Async Batch Requests

To scrape multiple URLs at the same time, we provide a separate endpoint that accepts an array of URLs instead of a single one: https://async.scraperapi.com/batchjobs. The API is almost identical to the single-job endpoint, except that it expects an array of strings in the urls field instead of a single string in url.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

try {
    URL asyncApiUrl = new URL("https://async.scraperapi.com/batchjobs");
    HttpURLConnection connection = (HttpURLConnection) asyncApiUrl.openConnection();
    connection.setRequestMethod("POST");
    connection.setRequestProperty("Content-Type", "application/json");
    connection.setDoOutput(true);

    // Send the API key and the array of URLs as a JSON body
    try (OutputStream outputStream = connection.getOutputStream()) {
        outputStream.write("{\"apiKey\": \"xxxxxx\", \"urls\": [\"https://example.com/page1\", \"https://example.com/page2\"]}".getBytes(StandardCharsets.UTF_8));
    }

    int responseCode = connection.getResponseCode();

    if (responseCode == HttpURLConnection.HTTP_OK) {
        // Read the response body line by line
        try (BufferedReader in = new BufferedReader(new InputStreamReader(connection.getInputStream()))) {
            String readLine = null;
            StringBuilder response = new StringBuilder();

            while ((readLine = in.readLine()) != null) {
                response.append(readLine);
            }

            System.out.println(response);
        }
    } else {
        throw new Exception("Error in API Call");
    }
} catch (Exception ex) {
    ex.printStackTrace();
}

The response is an array of job objects, each in the same format as the response from our single job endpoint:

[
  {
    "id":"0962a8e0-5f1a-4e14-bf8c-5efcc18f0953",
    "status":"running",
    "statusUrl":"https://async.scraperapi.com/jobs/0962a8e0-5f1a-4e14-bf8c-5efcc18f0953",
    "url":"https://example.com/page1"
  },
  {
    "id":"238d54a1-62af-41a9-b0b4-63f240bad439",
    "status":"running",
    "statusUrl":"https://async.scraperapi.com/jobs/238d54a1-62af-41a9-b0b4-63f240bad439",
    "url":"https://example.com/page2"
  }
]
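
Each job in the array can then be tracked individually via its statusUrl. The sketch below is a minimal example, not part of the official client: it assumes the same plain HttpURLConnection approach (and imports) as the snippet above, and it hard-codes one statusUrl from the sample response instead of parsing the JSON array.

// Minimal sketch: fetch the current state of one job from the batch response.
// The statusUrl value is the example id shown above; in practice you would
// read it from the JSON returned by the batchjobs call.
String statusUrl = "https://async.scraperapi.com/jobs/0962a8e0-5f1a-4e14-bf8c-5efcc18f0953";

try {
    HttpURLConnection statusConnection =
            (HttpURLConnection) new URL(statusUrl).openConnection();
    statusConnection.setRequestMethod("GET");

    try (BufferedReader in = new BufferedReader(
            new InputStreamReader(statusConnection.getInputStream()))) {
        StringBuilder status = new StringBuilder();
        String line;
        while ((line = in.readLine()) != null) {
            status.append(line);
        }
        // While the job is still running the body contains "status":"running";
        // check it again later (or on an interval) until the job finishes.
        System.out.println(status);
    }
} catch (Exception ex) {
    ex.printStackTrace();
}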
