Meta Robots AI-Compatible
A noindex directive on production is a kill-switch: search engines and AI crawlers will not index the site, and the directive is often left over from staging. SaaSalyst checks both the <meta name="robots"> tag and the X-Robots-Tag HTTP header for AI-blocking directives.
What SaaSalyst Checks
SaaSalyst parses the content attribute of the homepage's <meta name="robots"> tag and the X-Robots-Tag HTTP response header. The check fails when any of noindex, none, or nosnippet appears (case-insensitively) in either source, and passes when neither source contains those directives.
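A minimal sketch of that detection logic, assuming a fetch-based check; the function names and the naive regex extraction are illustrative, not SaaSalyst's implementation:

```typescript
// Illustrative sketch of the detection logic described above -- not SaaSalyst's actual code.
const BLOCKING_DIRECTIVES = ["noindex", "none", "nosnippet"];

function containsBlockingDirective(value: string | null): boolean {
  if (!value) return false;
  // Robots directives are comma-separated and case-insensitive, e.g. "noindex, nofollow".
  return value
    .toLowerCase()
    .split(",")
    .map((d) => d.trim())
    .some((d) => BLOCKING_DIRECTIVES.includes(d));
}

async function checkMetaRobots(url: string): Promise<boolean> {
  const res = await fetch(url);
  const headerValue = res.headers.get("x-robots-tag");

  // Naive regex extraction of <meta name="robots" content="..."> for illustration;
  // a production check would use a real HTML parser.
  const html = await res.text();
  const match = html.match(/<meta[^>]+name=["']robots["'][^>]*content=["']([^"']*)["']/i);
  const metaValue = match ? match[1] : null;

  // The check passes only when neither source carries a blocking directive.
  return !containsBlockingDirective(headerValue) && !containsBlockingDirective(metaValue);
}
```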
Why This Matters
This is a critical-severity check because the failure mode is invisible. A staging site has <meta name="robots" content="noindex"> to keep it out of search. The site goes to production. Someone forgets to remove the tag. The site is live, but Googlebot, Bingbot, GPTBot, ClaudeBot, PerplexityBot — every crawler that respects robots directives — refuses to index it. Founders see fine traffic on direct visits and don't realize organic discovery is gone until weeks later.
Google's March 2025 update extended nosnippet and max-snippet to AI Overviews and AI Mode, so nosnippet on production removes you from AI answer panels too — same kill-switch shape, broader blast radius.
The X-Robots-Tag header has the same effect as the meta tag and is even easier to forget: it is set in nginx/Apache/CDN config, is not visible in the source HTML, and often persists across deploys. SaaSalyst checks both sources because real production breakage usually involves the one developers don't think to look at.
How to Fix It
- Search your codebase for any <meta name="robots" ...> tag containing noindex, none, or nosnippet. Remove it on production. Conditional rendering by environment (NODE_ENV !== 'production' → noindex) is the safest pattern.
- Search your nginx / Apache / CDN configuration for X-Robots-Tag headers. The header may be set globally or per-route. Remove or scope it to staging only.
- Test by curling your homepage with a bot user-agent and inspecting the response: curl -A "Mozilla/5.0 (compatible; Googlebot/2.1)" -I https://yourdomain.com — look for X-Robots-Tag in the headers.
- Add an automated check to your deploy pipeline: fetch the homepage and assert that no AI-blocking directive is present (a minimal sketch follows this list). Several teams have shipped an accidental noindex; assertion-on-deploy is cheap insurance.
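A minimal sketch of such a deploy-time assertion, assuming a Node/TypeScript pipeline step; SITE_URL and the failure handling are placeholders to adapt to your own CI:

```typescript
// Post-deploy smoke test: fail the pipeline if production serves an AI-blocking robots directive.
// SITE_URL is a placeholder -- point it at your production homepage in your CI config.
const SITE_URL = process.env.SITE_URL ?? "https://yourdomain.com";
const BLOCKING = new Set(["noindex", "none", "nosnippet"]);

const hasBlocking = (value: string): boolean =>
  value
    .toLowerCase()
    .split(",")
    .some((d) => BLOCKING.has(d.trim()));

async function main(): Promise<void> {
  const res = await fetch(SITE_URL);
  const header = res.headers.get("x-robots-tag") ?? "";
  const html = await res.text();
  const meta =
    html.match(/<meta[^>]+name=["']robots["'][^>]*content=["']([^"']*)["']/i)?.[1] ?? "";

  if (hasBlocking(header) || hasBlocking(meta)) {
    console.error(`Blocking robots directive found. header="${header}" meta="${meta}"`);
    process.exit(1); // non-zero exit fails the deploy step
  }
  console.log("Robots check passed: no noindex/none/nosnippet in the production response.");
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```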
Frequently Asked Questions
Why critical severity for what could be a one-line fix?
Because the consequence is total: the site is invisible to search and AI. SaaSalyst rates by failure mode impact, not by fix difficulty. A one-line fix that locks you out of organic traffic is critical — same logic as a missing privacy policy on a production site.
What does nosnippet do for AI specifically?
Per the March 2025 Google update, nosnippet and max-snippet now extend to AI Overviews and AI Mode. Before: the directive only suppressed search result snippets. After: it suppresses AI answer panels too. SaaSalyst flags any nosnippet directive on production because the AI side-effect is often unintentional.
Should I worry if my staging site has noindex?
No — that's the right configuration for staging. The SaaSalyst check is about production. Use environment-conditional rendering (NODE_ENV !== 'production' → noindex) or per-domain configuration so the staging directive doesn't leak to production.
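A minimal sketch of that environment-conditional pattern in plain TypeScript; the helper name and the NODE_ENV check are illustrative, so adapt them to however your framework renders head tags and distinguishes environments:

```typescript
// Environment-conditional robots meta tag: staging gets noindex, production gets nothing.
// Illustrative helper -- not a SaaSalyst-provided snippet.
export function robotsMetaTag(): string {
  if (process.env.NODE_ENV !== "production") {
    // Staging / preview builds: keep crawlers out.
    return '<meta name="robots" content="noindex, nofollow">';
  }
  // Production builds: emit no blocking directive at all.
  return "";
}
```

Because the directive is derived from the build environment rather than hard-coded in a template, it cannot leak into a production build by someone forgetting to delete a line.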
Check Your SaaS Now | Free
SaaSalyst scans your website in 30 seconds and checks for Meta Robots AI-Compatible along with 101+ other business readiness signals.
Scan Your App