Robots.txt (AI Context)
The site configuration file that controls which crawlers — including AI crawlers like GPTBot and PerplexityBot — can access your content.
What it means
Robots.txt is a plain-text file at the root of your domain that tells crawlers which pages they may or may not access. Shopify merchants commonly block AI crawlers by accident, because wildcard rules written to deter scrapers also match user agents like GPTBot and PerplexityBot. As AI platforms have introduced their own crawlers, merchants need to audit their robots.txt specifically to confirm those crawlers are permitted on product, collection, and brand pages. A single misconfigured robots.txt rule can remove a Shopify store from AI citation consideration across every platform simultaneously.
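A minimal sketch of what a correctly scoped file can look like — the user-agent names are real AI crawlers, but the specific paths and rules here are illustrative, not a recommended configuration for any particular store (on Shopify, robots.txt is auto-generated and customized via the robots.txt.liquid theme template rather than edited as a static file):

```
# Keep generic crawlers out of transactional pages...
User-agent: *
Disallow: /cart
Disallow: /checkout

# ...but explicitly permit AI crawlers on storefront content.
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /
```

The key point is that a named `User-agent` group overrides the `*` group for that crawler, so an overly broad `Disallow: /` under `User-agent: *` silently applies to every AI crawler you haven't named.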
Why it matters for Shopify
Every Shopify merchant should audit their robots.txt for inadvertent GPTBot, PerplexityBot, or Bingbot blocks — it's the most common silent killer of AI visibility.
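One way to run that audit programmatically is Python's standard-library `urllib.robotparser`, which applies the same matching logic crawlers do. The robots.txt content below is a hypothetical misconfiguration (a wildcard block with only one explicit allow), not any real store's file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: a wildcard rule blocks everything,
# with an explicit allow only for GPTBot.
robots_txt = """\
User-agent: *
Disallow: /

User-agent: GPTBot
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# GPTBot matches its own named group and is allowed;
# every other crawler falls under the wildcard block.
print(parser.can_fetch("GPTBot", "/products/widget"))          # True
print(parser.can_fetch("PerplexityBot", "/products/widget"))   # False
```

In practice you would fetch the live file (e.g. `RobotFileParser("https://yourstore.com/robots.txt")` plus `read()`) and check each AI user agent against representative product and collection URLs.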