Standards, Files & Examples
This page summarizes common files and structured formats used in generative engine optimization (GEO) and AI optimization (AIO) implementations. Exact configuration depends on your goals and risk tolerance.
robots.txt
robots.txt controls crawler access. Misconfiguration can unintentionally block AI systems and search engines from high-value pages.
- Ensure important pages are crawlable
- Avoid blocking CSS/JS required for rendering key content
- Prevent crawl traps and infinite URL patterns
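The points above can be sketched in a minimal robots.txt. The crawler token `GPTBot` is a real example of an AI user agent, but the paths and patterns here are illustrative only; verify current bot names with each vendor before relying on them.

```text
# Allow general crawling; keep CSS/JS rendering assets accessible
User-agent: *
Allow: /
# Illustrative: block low-value parameterized URLs that create crawl traps
Disallow: /*?sort=
Disallow: /*?sessionid=

# Illustrative per-bot rule for an AI crawler (token names vary by vendor)
User-agent: GPTBot
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Note that `Disallow` only controls crawling, not indexing or content usage; policies beyond access control belong elsewhere.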
ai.txt and llms.txt (Optional)
Some organizations choose to publish AI-focused policy and discovery files. These are emerging conventions rather than formal standards, so they are optional and should align with business policy.
Typical contents
- Preferred crawling behavior
- Content usage guidance
- Important canonical sources
- Contact / policy references
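A sketch of an llms.txt file covering the contents above. The llms.txt proposal uses Markdown (a title, a short summary, and annotated links); the company name, URLs, and sections here are hypothetical placeholders.

```text
# Example Co
> Concise summary of who we are and where our canonical content lives.

## Canonical sources
- [Product overview](https://example.com/products): Authoritative product descriptions
- [Documentation](https://example.com/docs): Developer and user documentation

## Policy
- [AI usage policy](https://example.com/ai-policy): Guidance on content usage
- [Contact](https://example.com/contact): Policy and licensing inquiries
```

The file is typically served at the site root (e.g. `/llms.txt`), alongside robots.txt.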
Structured Data (JSON-LD)
Structured data helps disambiguate entities (company, services, products) and improves machine interpretation. Implementation should be accurate and consistent.
Common schemas we use
- Organization
- LocalBusiness (if applicable)
- Service
- Product (ecommerce)
- FAQPage (where appropriate)
- Article / TechArticle (documentation pages)
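As an illustration of the first schema in the list, here is a minimal JSON-LD `Organization` block of the kind placed in a page's `<head>`. All names, URLs, and contact details are placeholders; real markup should mirror verified business information exactly.

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://example.com/",
  "logo": "https://example.com/logo.png",
  "sameAs": [
    "https://www.linkedin.com/company/example-co"
  ],
  "contactPoint": {
    "@type": "ContactPoint",
    "contactType": "customer support",
    "email": "support@example.com"
  }
}
```

Embed it in a `<script type="application/ld+json">` tag, and keep the values consistent with the visible page content and with markup elsewhere on the site.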
Next: FAQ & Glossary
