marcustas 3 hours ago

I’ve been experimenting with an unusual idea: a static, multilingual HTML page optimized specifically for LLM crawlers rather than for human visitors.

No JS, no frameworks, no styling. Only:

• clean HTML
• JSON-LD (Restaurant, FAQ, WebSite, WebPage)
• hreflang for 5 languages
• a hand-written sitemap.xml
• robots.txt explicitly allowing GPTBot, CCBot, ClaudeBot, PerplexityBot
• hosted on Netlify, with no language negotiation or redirects
• stable 200 responses served to all bots
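Concretely, the hreflang and JSON-LD pieces look roughly like this — a minimal sketch, where the restaurant name, address fields, and per-language URLs are illustrative placeholders, not copied from the live page:

```html
<!-- hreflang alternates in <head>: one per language, plus x-default -->
<link rel="alternate" hreflang="en" href="https://ai.asasushi.pl/en/" />
<link rel="alternate" hreflang="pl" href="https://ai.asasushi.pl/pl/" />
<link rel="alternate" hreflang="x-default" href="https://ai.asasushi.pl/" />

<!-- JSON-LD: schema.org Restaurant (all values are placeholders) -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Example Restaurant",
  "url": "https://ai.asasushi.pl/",
  "servesCuisine": "Japanese",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Example City",
    "addressCountry": "PL"
  }
}
</script>
```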

The experiment is here: https://ai.asasushi.pl/
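For a single-page site the hand-written sitemap can stay tiny. A sketch of what that might look like, assuming hreflang annotations are duplicated into the sitemap (the per-language URLs are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://ai.asasushi.pl/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://ai.asasushi.pl/en/"/>
    <xhtml:link rel="alternate" hreflang="pl" href="https://ai.asasushi.pl/pl/"/>
  </url>
</urlset>
```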

Goal: see whether LLM-backed tools (ChatGPT, Bing, Claude, Perplexity) start surfacing this page as local-business knowledge, and how long indexing takes when everything is handcrafted and deterministic.
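One cheap sanity check before waiting on crawlers: verify that your robots.txt rules actually allow the bots you intend to allow, using Python's stdlib robot parser. The robots.txt body below is a hypothetical reconstruction matching the bots listed above, not the live file:

```python
import urllib.robotparser

# Hypothetical robots.txt mirroring the allowed bots (not the live file).
ROBOTS_TXT = """\
User-agent: GPTBot
Allow: /

User-agent: CCBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: *
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# can_fetch(agent, path) returns True when that agent may crawl the path
for bot in ("GPTBot", "CCBot", "ClaudeBot", "PerplexityBot"):
    print(bot, rp.can_fetch(bot, "/"))
```

Pointing `RobotFileParser.set_url()` at the deployed robots.txt instead of parsing a string would check the real file the same way.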

I’m interested in feedback from people who have played with AI-SEO, crawler behavior, LLM retrieval patterns, or static-site indexing.

Does this approach make sense? Anything obvious I should change?