Search systems are what you typically refer to as search engines (Google, Bing, DuckDuckGo, etc.). They are massively complex systems that tackle some of the biggest challenges in the history of technology.
Search systems have three main responsibilities:
- Crawling – the process of going through the Web and parsing the content of websites. This is a massive task, as there are over 350 million registered domains.
- Indexing – storing the data gathered during the crawling stage so that it can be accessed quickly when queried.
- Ranking – querying the index to craft relevant results pages based on user input. This is where search engines apply their ranking criteria to give users the best answer for their intent.
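The three stages above can be illustrated with a toy sketch. This is not how a real search engine works at scale; it uses an in-memory dictionary in place of real HTTP fetches, and the names (`crawl`, `build_index`, `rank`) and the sample pages are all hypothetical, chosen only to show how the stages feed into each other:

```python
# A toy crawl -> index -> rank pipeline. PAGES simulates the Web:
# each URL maps to (outgoing links, page text).
from collections import defaultdict

PAGES = {
    "a.com": (["b.com"], "fast web framework"),
    "b.com": (["c.com"], "web search ranking"),
    "c.com": ([], "search engine crawling"),
}

def crawl(start):
    """Crawling: follow links from a seed URL, collecting each page's text."""
    seen, queue, docs = set(), [start], {}
    while queue:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        links, text = PAGES[url]
        docs[url] = text
        queue.extend(links)
    return docs

def build_index(docs):
    """Indexing: build an inverted index mapping each term to its URLs."""
    index = defaultdict(set)
    for url, text in docs.items():
        for term in text.split():
            index[term].add(url)
    return index

def rank(index, query):
    """Ranking: score pages by how many query terms they contain."""
    scores = defaultdict(int)
    for term in query.split():
        for url in index.get(term, ()):
            scores[url] += 1
    return sorted(scores, key=scores.get, reverse=True)

docs = crawl("a.com")
index = build_index(docs)
print(rank(index, "web search"))  # "b.com" ranks first: it matches both terms
```

Real systems replace each piece with something far more sophisticated (distributed fetching with politeness rules for crawling, sharded indexes for storage, and hundreds of ranking signals instead of simple term counts), but the flow of data between the stages is the same.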
In the next section, we will learn more specifically how Googlebot works. Googlebot is Google's web crawler, the part of the search system that gathers the information needed to build the massive database of content from which search results are served.