Build with AI

Analog provides integrations that support AI-assisted development practices.

LLMs index files

Analog's docs site publishes two AI-friendly index files at the site root:

  • https://analogjs.org/llms.txt
  • https://analogjs.org/llms-full.txt

These files make it easier to feed the docs into AI-assisted workflows without crawling the full site manually.

What's the difference?

llms.txt

llms.txt is a compact index of the docs. It contains the page titles, URLs, and short descriptions so an assistant or retrieval pipeline can discover the relevant docs pages quickly.

Use it when you want:

  • a lightweight entry point for retrieval
  • a page index for custom RAG pipelines
  • a quick way to point an AI tool at the Analog docs corpus
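For a custom retrieval pipeline, the index can be parsed into structured page records. The sketch below assumes the common llms.txt convention of Markdown-style link lists (`- [Title](url): description`); verify the shape against the actual file before relying on it.

```python
import re

def parse_llms_index(text: str) -> list[dict]:
    """Parse llms.txt-style entries of the form '- [Title](url): description'.

    The entry shape is an assumption based on the common llms.txt
    convention, not a documented guarantee of Analog's file.
    """
    pattern = re.compile(
        r"-\s*\[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<desc>.*))?"
    )
    entries = []
    for line in text.splitlines():
        match = pattern.match(line.strip())
        if match:
            entries.append({
                "title": match.group("title"),
                "url": match.group("url"),
                "description": match.group("desc") or "",
            })
    return entries
```

Each record can then be fed to a crawler or retriever that fetches only the pages relevant to a query.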

llms-full.txt

llms-full.txt is the expanded version. It concatenates the full Markdown content for the docs pages into a single text file.

Use it when you want:

  • a single file for local indexing
  • a fuller context window for long-form prompting
  • offline processing without fetching each docs page individually
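Because the full file can exceed a model's context window, a typical first step is splitting it into overlapping chunks. A minimal sketch (the sizes here are illustrative defaults, not Analog recommendations):

```python
def chunk_text(text: str, max_chars: int = 4000, overlap: int = 200) -> list[str]:
    """Split a large corpus into overlapping fixed-size chunks.

    Overlap preserves context that would otherwise be cut at chunk
    boundaries. Character-based splitting is a simple stand-in for
    token-aware chunking.
    """
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + max_chars, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        start = end - overlap
    return chunks
```

Splitting on Markdown headings instead of fixed offsets keeps chunks semantically coherent, at the cost of uneven sizes.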

How Analog generates these files

The docs app builds both files automatically in apps/docs-app/docusaurus.config.js.

During the docs build:

  • llms.txt is generated from the current docs route records
  • llms-full.txt is generated by concatenating the Markdown source files under apps/docs-app/docs

That means the files stay aligned with the published docs instead of requiring a separate export step.
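The concatenation step can be sketched as follows. This is a simplified illustration, not the actual build code, which lives in apps/docs-app/docusaurus.config.js:

```python
from pathlib import Path

def build_llms_full(docs_dir: Path) -> str:
    """Concatenate all Markdown files under docs_dir into one corpus.

    A simplified sketch of what the docs build does; the divider and
    sort order here are assumptions, not Analog's actual output format.
    """
    parts = []
    for md_file in sorted(docs_dir.rglob("*.md")):
        parts.append(md_file.read_text(encoding="utf-8"))
    return "\n\n---\n\n".join(parts)
```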

Example workflows

Point an assistant at the docs index

Use llms.txt when your AI tool supports a remote docs index:

Use https://analogjs.org/llms.txt as the primary AnalogJS documentation index.

Build a local retrieval corpus

Use llms-full.txt when you want one source file for embeddings or local search:

curl -O https://analogjs.org/llms-full.txt
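Once downloaded, the file can back a simple local search. The ranking below uses naive term overlap as a toy stand-in for a real embedding-based retriever:

```python
from collections import Counter

def score_chunks(chunks: list[str], query: str) -> list[tuple[int, str]]:
    """Rank text chunks by how often the query's terms appear in them.

    A deliberately naive scorer for illustration; swap in embeddings
    or BM25 for real retrieval quality.
    """
    query_terms = set(query.lower().split())
    scored = []
    for chunk in chunks:
        term_counts = Counter(chunk.lower().split())
        score = sum(term_counts[term] for term in query_terms)
        scored.append((score, chunk))
    return sorted(scored, key=lambda pair: pair[0], reverse=True)
```

Splitting llms-full.txt into chunks and running them through a scorer like this gives a fully offline question-answering corpus.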

The AI-oriented files are a supplement to, not a replacement for, the published docs UI. Keep linking users to the canonical docs pages when you want navigable documentation, and use the llms files when you want AI-friendly ingestion.