Answer:
Option C) Text and keywords are selected and recorded in huge data centers.
Explanation:
A crawler can be defined as a program that visits Web sites, reads their pages and other information, and then creates entries for a search engine index. The major search engines on the Web all have such a program, which is also known as a "spider".
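To make the idea concrete, here is a minimal sketch of how such a crawler might work, using only the Python standard library. The seed URL, the page limit, and the in-memory dictionary used as the "index" are illustrative assumptions for this example, not how a real search engine stores its data.

```python
# Minimal crawler ("spider") sketch: visit pages, record their text in a
# simple index, and follow outgoing links. Purely illustrative assumptions:
# the seed URL, max_pages limit, and the in-memory dict acting as the index.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class PageParser(HTMLParser):
    """Collects visible text and outgoing links from one HTML page."""

    def __init__(self):
        super().__init__()
        self.text_parts = []
        self.links = []

    def handle_data(self, data):
        data = data.strip()
        if data:
            self.text_parts.append(data)

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=5):
    """Visit pages breadth-first and record their text as index entries."""
    index = {}                      # url -> extracted text (the "index entry")
    queue = [seed_url]
    seen = {seed_url}

    while queue and len(index) < max_pages:
        url = queue.pop(0)
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue                # skip pages that fail to load

        parser = PageParser()
        parser.feed(html)
        index[url] = " ".join(parser.text_parts)

        # Queue newly discovered links for later visits
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

    return index


if __name__ == "__main__":
    pages = crawl("https://example.com")
    for url, text in pages.items():
        print(url, "->", text[:60], "...")
```

The key point the example illustrates is the same as in the answer: the crawler reads each page's text and keywords and records them as entries (here, dictionary entries; in a real search engine, records stored in large data centers) that the index can later search.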