You can use webhooks to trigger actions in external services every time one of your scrapers finishes running. You can trigger all kinds of actions: send an email when a scraper is finished; immediately pull fresh data from a scraper’s API; or run real-time analysis—anything you like.
Add webhook URLs under the Webhooks heading on your scraper’s Settings page. Every time your scraper finishes running we’ll make an HTTP POST request to each URL you’ve added. Use this request to trigger your custom code.
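Here is a minimal sketch of what a receiving endpoint might look like, using only the Python standard library. The port, path, and helper function are assumptions for illustration—your webhook URL is whatever you register on the Settings page, and the triggered code is whatever you want to run.

```python
# Minimal webhook receiver sketch (illustrative only).
from http.server import BaseHTTPRequestHandler, HTTPServer

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # The request body is empty, so just acknowledge it
        # and kick off your own code.
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"OK")
        run_my_custom_code()

def run_my_custom_code():
    # Placeholder: pull fresh data, send an email, run analysis, etc.
    print("Scraper finished - doing something useful now")

if __name__ == "__main__":
    # Assumes you registered a URL pointing at this server on port 8000.
    HTTPServer(("", 8000), WebhookHandler).serve_forever()
```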
We currently make an empty HTTP POST request to each of your webhook URLs; no information about the scraper run is included in the payload. You can still pass information through the request by putting it in the webhook URL itself, either in the path or as query parameters. If there’s something specific you’d like included in the webhook payload, please ask a question on our help forum.
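For example, you could register a URL that already carries the details you care about and read them back on your server when the POST arrives. The URL and parameter names below are invented for illustration; they are simply whatever you choose to put in the URL on the Settings page.

```python
# Sketch: recovering information you embedded in the webhook URL yourself.
from urllib.parse import urlparse, parse_qs

# Say you registered this URL under the Webhooks heading:
registered = "https://example.com/hooks/scraper-finished?scraper=prices&env=production"

# The path and query string of the incoming POST carry that information back:
parsed = urlparse(registered)
params = parse_qs(parsed.query)
print(parsed.path)           # /hooks/scraper-finished
print(params["scraper"][0])  # prices
print(params["env"][0])      # production
```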