Apify Web Scraper



Apify is a platform built to serve large-scale and high-performance web scraping and automation needs. It provides easy access to compute instances (Actors), convenient request and result storages, proxies, scheduling, webhooks and more, accessible through a web interface or an API.


Apify SDK builds on popular tools like Playwright, Puppeteer and Cheerio to deliver large-scale, high-performance web scraping and crawling of any website. It automates any web workflow: run headless Chrome, Firefox, WebKit or other browsers, manage lists and queues of URLs to crawl, and run crawlers in parallel at maximum system capacity.

Apify offers several different ways to scrape. You can start from scratch with your own solution, build upon existing tools, use ready-made tools, or get a solution created for you. For example, you can learn how to scrape a website with Apify's Cheerio Scraper (apify/cheerio-scraper): build an actor's page function, extract information from a web page and download your data.
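
As a rough illustration (not from the original article), the snippet below shows what a small crawler built on the SDK can look like. It assumes the older Apify SDK v2-style API exposed by the `apify` npm package; in newer releases the crawler classes live in the separate `crawlee` package, so treat this as a sketch rather than the definitive interface.

    // Minimal CheerioCrawler sketch, assuming Apify SDK v2-style APIs.
    const Apify = require('apify');

    Apify.main(async () => {
        // Persistent queue of URLs to crawl.
        const requestQueue = await Apify.openRequestQueue();
        await requestQueue.addRequest({ url: 'https://example.com' });

        const crawler = new Apify.CheerioCrawler({
            requestQueue,
            // Runs for every page; `$` is the Cheerio handle to the parsed HTML.
            handlePageFunction: async ({ request, $ }) => {
                await Apify.pushData({
                    url: request.url,
                    title: $('title').text(),
                });
            },
        });

        await crawler.run();
    });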

While we think that the Apify platform is super cool, and you should definitely try the free account, Apify SDK is and will always be open source, runnable locally or on any cloud infrastructure.


Note that we do not test Apify SDK in other cloud environments such as Lambda or on specific architectures such as Raspberry Pi. We strive to make it work, but there's no guarantee.

Logging into Apify platform from Apify SDK


To access your Apify account from the SDK, you must provide credentials - your API token. You can do that either by utilizing Apify CLI or by setting environment variables.

Once you provide credentials to your scraper, you will be able to use all the Apify platform features of the SDK, such as calling Actors, saving to cloud storages, using Apify proxies, setting up webhooks and so on.
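
As a hedged illustration of those features, the sketch below calls the public apify/hello-world actor, saves a record to cloud storage and creates an Apify Proxy configuration. It assumes the v2-style `Apify` object; the exact calls may differ in your SDK version.

    const Apify = require('apify');

    Apify.main(async () => {
        // Call another actor on the platform and wait for it to finish.
        const run = await Apify.call('apify/hello-world', { message: 'Hello from my scraper' });

        // Save a result to the default dataset in Apify cloud storage.
        await Apify.pushData({ calledRunId: run.id });

        // Create a proxy configuration that crawlers can use for Apify Proxy.
        const proxyConfiguration = await Apify.createProxyConfiguration();
        console.log('Proxy configured:', !!proxyConfiguration);
    });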

Log in with CLI

Apify CLI allows you to log in to your Apify account on your computer. If you then run your scraper using the CLI, your credentials will automatically be added.

In your project folder:
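
A minimal sketch of that flow, assuming the standard apify-cli commands (installed globally via npm; `apify login` will ask for your API token):

    npm install -g apify-cli
    apify login
    apify run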

Log in with environment variables

If you prefer not to use Apify CLI, you can always provide credentials to your scraper by setting the APIFY_TOKEN environment variable to your API token.

There's also the APIFY_PROXY_PASSWORD environment variable. It is automatically inferred from your token by the SDK, but it can be useful when you need to access proxies from a different account than your token represents.
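
For example (a hedged illustration; the placeholder values and the main.js filename are assumptions about your project layout):

    export APIFY_TOKEN=<your API token>
    export APIFY_PROXY_PASSWORD=<your proxy password>
    node main.js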

What is an Actor

When you deploy your script to the Apify platform, it becomes an actor. An actor is a serverless microservice that accepts an input and produces an output. It can run for a few seconds, hours or even infinitely. An actor can perform anything from a simple action such as filling out a web form or sending an email, to complex operations such as crawling an entire website and removing duplicates from a large dataset.
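
A minimal sketch of that input/output contract, assuming the Apify SDK v2-style getInput/setValue helpers (not taken from the original text):

    const Apify = require('apify');

    Apify.main(async () => {
        // Read the JSON input the actor run was started with.
        const input = await Apify.getInput();

        // ... do the actual scraping or automation work here ...

        // Store the result as the run's OUTPUT record in the key-value store.
        await Apify.setValue('OUTPUT', { receivedInput: input });
    });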

Actors can be shared in the Apify Store so that other people can use them. But don't worry: if you share your actor in the store and somebody uses it, it runs under their account, not yours.

Related links