Crawler
A crawler, also known as a web crawler, spider, or search engine bot, is an automated program or script that systematically browses the World Wide Web. It discovers and indexes website content for search engines, allowing them to return up-to-date results to users.
Crawlers focus on indexing websites for search engines, while scrapers aim to extract specific data from targeted web pages.
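At its core, a crawler repeats a simple loop: fetch a page, record its content, extract the links it contains, and queue those links for later visits. Below is a minimal sketch of that loop in Python using only the standard library; the seed URL, the page limit, and the same-host restriction are illustrative assumptions, not features of any particular search engine's crawler.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a fetched page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl from seed_url, staying on the same host."""
    seen = {seed_url}
    queue = deque([seed_url])
    host = urlparse(seed_url).netloc
    index = {}  # url -> raw HTML, standing in for a search engine's index

    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip unreachable or malformed pages

        index[url] = html  # "index" the page content

        # Extract links and add unseen same-host URLs to the frontier
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            link = urljoin(url, href)  # resolve relative links
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)

    return index


if __name__ == "__main__":
    pages = crawl("https://example.com", max_pages=5)
    print(f"Crawled {len(pages)} page(s):", list(pages))
```

A production crawler would additionally respect robots.txt, throttle its request rate, deduplicate content, and persist its crawl frontier; this sketch omits all of that to show only the fetch-index-follow cycle described above.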