Job Handling
Submitting a Job
Using the Async API is easy: submit a scraping job and you receive a status URL, which you can poll to monitor the job and collect the results once it has finished.
Sample Request
cURL:

curl --request POST \
--url "https://async.scraperapi.com/jobs" \
--header "Content-Type: application/json" \
--data '{
"apiKey": "API_KEY",
"url": "https://example.com"
}'

Python:

import requests
r = requests.post(url = 'https://async.scraperapi.com/jobs',
json={ 'apiKey': 'API_KEY', 'url': 'https://example.com' })
print(r.text)

Node.js:

import axios from 'axios';
(async () => {
const { data } = await axios({
method: 'POST',
url: 'https://async.scraperapi.com/jobs',
headers: { 'Content-Type': 'application/json' },
data: {
apiKey: 'API_KEY',
url: 'https://example.com'
}
});
console.log(data);
})();

Response:

{
"id":"0962a8e0-5f1a-4e14-bf8c-5efcc18f0953",
"status":"running",
"statusUrl":"https://async.scraperapi.com/jobs/0962a8e0-5f1a-4e14-bf8c-5efcc18f0953",
"url":"https://example.com"
}

To make the Async API send a POST request to the target site, include "method": "POST" in the payload. Here is an example:

cURL:
curl --request POST \
--url "https://async.scraperapi.com/jobs" \
--header "Content-Type: application/json" \
--data '{
"apiKey": "API_KEY",
"url": "https://postman-echo.com/post",
"method": "POST",
"headers": {
"content-type": "application/x-www-form-urlencoded"
},
"body": "foo=bar"
}'

Python:

import requests
body = {
    'apiKey': 'API_KEY',
    'url': 'https://postman-echo.com/post',
    'method': 'POST',
    'headers': {'content-type': 'application/x-www-form-urlencoded'},
    'body': 'foo=bar'
}
r = requests.post('https://async.scraperapi.com/jobs', headers={"Content-Type": "application/json"}, json=body)
print(r.text)

Node.js:

const url = 'https://async.scraperapi.com/jobs';
const body = {
apiKey: 'API_KEY',
url: 'https://postman-echo.com/post',
method: 'POST',
headers: { "content-type": "application/x-www-form-urlencoded" },
body: "foo=bar"
};
(async () => {
try {
const res = await fetch(url, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(body)
});
const data = await res.json();
console.log(data);
} catch (err) {
console.error(err);
}
})();

Should you wish, you can include a meta object in your request to store custom data (your own request ID, for example), which will be returned in the response as well.
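For example, attaching your own request ID could look like this (a minimal Python sketch; the keys inside meta, such as myRequestId, are hypothetical and entirely up to you):

import requests

# Submit a job with a custom meta object; the API returns it in the
# job response so you can correlate jobs with your own records.
r = requests.post(
    'https://async.scraperapi.com/jobs',
    json={
        'apiKey': 'API_KEY',
        'url': 'https://example.com',
        'meta': {'myRequestId': '12345'}  # hypothetical custom field
    }
)
print(r.text)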
Job Status
The statusUrl is a unique job URL used to retrieve the status and results of your scraping job. While the job is still running, invoking that endpoint returns its current status. Once the job has finished, the same statusUrl response will contain the results of your scraping job.
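A minimal polling sketch in Python, using the statusUrl returned above (the 5-second interval is an arbitrary choice; the finished response carries the scraped results alongside the job fields):

import time
import requests

status_url = 'https://async.scraperapi.com/jobs/0962a8e0-5f1a-4e14-bf8c-5efcc18f0953'

# Poll the statusUrl until the job stops running, then print the result.
while True:
    job = requests.get(status_url).json()
    if job['status'] != 'running':
        break
    time.sleep(5)  # wait a few seconds between polls

print(job)  # the finished payload includes your scraping results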
Please note that the response for an Async job is stored for up to 72 hours (24 hours guaranteed) or until you retrieve the results, whichever comes first. If you do not fetch the response in time, it is deleted on our side and you will have to submit a new request for the same job.
Cancelling a Job
Should you wish to cancel a running job, you can do so by sending a DELETE request to the job endpoint using the job ID:
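A minimal Python sketch, reusing the job ID from the sample response above:

import requests

job_id = '0962a8e0-5f1a-4e14-bf8c-5efcc18f0953'

# Cancel the running job by sending DELETE to its job endpoint.
r = requests.delete(f'https://async.scraperapi.com/jobs/{job_id}')
print(r.status_code, r.text)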