Technical & Infrastructure
LLM Indexing
The process by which AI platforms crawl and store your site's content for use in generating answers.
What it means
LLM indexing is analogous to Google indexing but executed by AI platform crawlers such as GPTBot (OpenAI), ClaudeBot (Anthropic), PerplexityBot, and Google's AI crawlers. These bots visit your site, read your content, and store parsed representations that can be used in live retrieval or future training. Blocking these crawlers, whether intentionally or through an overly broad robots.txt rule, means your content cannot be retrieved when buyers ask AI platforms about your category.
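As an illustrative sketch (not a definitive configuration), a robots.txt that explicitly permits the major AI crawlers named above could look like the following; each user-agent token is the one published by the respective vendor, and Google-Extended is Google's token governing AI use of crawled content:

```
# Allow OpenAI's crawler
User-agent: GPTBot
Allow: /

# Allow Anthropic's crawler
User-agent: ClaudeBot
Allow: /

# Allow Perplexity's crawler
User-agent: PerplexityBot
Allow: /

# Allow Google's AI-use token
User-agent: Google-Extended
Allow: /
```

Explicit Allow rules like these are only needed if a broader rule (such as a blanket Disallow) would otherwise apply; a robots.txt with no matching Disallow already permits these crawlers by default.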
Why it matters for Shopify
Shopify generates a store's robots.txt automatically, and it can be customized through the robots.txt.liquid template. Merchants should audit their store's /robots.txt to confirm that GPTBot, ClaudeBot, PerplexityBot, and similar AI crawlers are permitted to access product and brand content, and that no blanket Disallow rule blocks them unintentionally.
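A minimal sketch of such an audit, using only Python's standard-library robots.txt parser; the crawler list and the example URL are illustrative assumptions, and in practice you would fetch your store's live /robots.txt instead of the inline sample:

```python
from urllib.robotparser import RobotFileParser

# AI crawler user-agent tokens to check (illustrative list).
AI_CRAWLERS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

def audit_robots(robots_txt: str, url: str) -> dict:
    """Map each AI crawler to whether robots_txt permits it to fetch url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {bot: parser.can_fetch(bot, url) for bot in AI_CRAWLERS}

# Hypothetical robots.txt that blocks GPTBot but leaves other bots alone.
sample = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /checkout
"""

# A product URL on a hypothetical store.
print(audit_robots(sample, "https://example.com/products/widget"))
```

Running this against the sample flags GPTBot as blocked while the other crawlers, which fall back to the wildcard rules, remain allowed for the product URL.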