In AI Applications > Data Stores, I've ingested mysite.com/*, but only ~100 URLs were indexed even though the site has many more pages.
How can I index all the pages without adding them individually by hand?
Hey,
Hope you’re keeping well.
When you ingest a full domain into a Vertex AI Search data store, the crawler only indexes URLs it can discover by following links from the seed URL, within the allowed crawl depth. Common reasons pages get skipped:

- JavaScript-heavy navigation the crawler can't follow
- Paths disallowed by robots.txt
- Pages that require authentication

To improve coverage, make sure your sitemap is accessible and submitted in the Data Store settings, and consider increasing the crawl depth in the ingestion configuration. For guaranteed coverage, you can also upload a full URL list as a Cloud Storage file and point the Data Store at it.
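If you go the URL-list route, a quick way to build the list is to pull every `<loc>` entry out of your sitemap. Here's a minimal sketch using only the Python standard library; it assumes a standard `<urlset>` sitemap (a sitemap index file would need one extra level of recursion), and the sample XML is just illustrative:

```python
# Hypothetical sketch: extract every URL from a sitemap.xml so you can
# review coverage or build a full URL list to upload (e.g. to Cloud Storage).
import xml.etree.ElementTree as ET

# Namespace used by the standard sitemap protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_urls(sitemap_xml: str) -> list[str]:
    """Return all <loc> entries from a sitemap XML document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc") if loc.text]

if __name__ == "__main__":
    sample = """<?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>https://mysite.com/</loc></url>
      <url><loc>https://mysite.com/about</loc></url>
    </urlset>"""
    for url in extract_urls(sample):
        print(url)
```

You'd fetch the real sitemap (e.g. with `urllib.request`), write the resulting URLs to a file, and upload that file to your bucket. It's also a handy sanity check: if the sitemap itself only lists ~100 URLs, that explains the indexing gap.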
Thanks and regards,
Taz