Where to find it

The documentation agent powers the chat experience on this very site. You can use it in two places:
  • Top: The search bar at the top of the page (“Ask Contextual AI…”) — type your question and send.
  • Right: The chat panel persists on the right after you run your first query — use it to run multi-turn conversations.
When you ask a question, Mintlify’s built-in search first surfaces keyword-based results so you get quick, relevant links. In parallel, the agent handles the longer-running task: it retrieves across the documentation, performs multi-step research (Agentic Search), and streams back a detailed, cited answer. You get both instant keyword hits and a synthesized response.
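The two paths above run concurrently. A minimal sketch of that fan-out, using hypothetical `keyword_search` and `ask_agent` stand-ins (the real implementations would call the Mintlify search and agent APIs):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins; real code would call the respective services.
def keyword_search(query: str) -> list[str]:
    return [f"/docs/page-about-{query.replace(' ', '-')}"]

def ask_agent(query: str) -> str:
    return f"Detailed, cited answer to: {query}"

def answer(query: str) -> tuple[list[str], str]:
    """Fire both paths at once: quick keyword links plus the slower agent run."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        links_future = pool.submit(keyword_search, query)
        agent_future = pool.submit(ask_agent, query)
        return links_future.result(), agent_future.result()
```

In a real UI the keyword links would render as soon as they arrive, while the agent response streams in afterward.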
GitHub repo: The sync code for keeping a Contextual AI datastore in sync with your docs is public: ContextualAI/datastore-sync. See the section below for an overview.

Why this agent

(and why not generic LLMs or basic RAG)

Use cases: This is the kind of agent we use for customer engineering and self-serve users. We believe all docs, and all websites, need agents, because people would rather prompt than search for information. Pointing a generic LLM at a docs page tends to produce more general, less relevant responses (based on qualitative review). Our agent gives very specific, detailed answers, because the docs provided to it more precisely constrain the information it can use. In short: it’s a better docs agent, and you can set one up for yourself in hours. Compared with web search, basic RAG, or uncited answers:
  • Citations are key — People often want to read the source doc after the initial response: to verify, go deeper, or share the link. This agent cites every factual claim and adds a References section with doc URLs.
  • Multi-step retrieval — Single-shot RAG can miss context or mix in irrelevant chunks; Agentic Search does breadth-then-depth over the docs and produces more precise answers.
  • Grounded generation — Responses are generated only from retrieved content, reducing hallucination.
We’ve already seen users rely on it for troubleshooting and support-style questions—before we had even written about it—because they can trust the references and follow them into the docs.
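The citation behavior described above can be sketched with a small post-processing step: fill in the empty `[n]()` markers the model emits with the retrieved doc URLs and append a References section. The helper below is a hypothetical illustration, not the agent’s actual implementation:

```python
import re

def link_citations(answer: str, sources: list[str]) -> str:
    """Fill empty [n]() citation markers with source URLs (1-indexed by
    citation number) and append a numbered References list."""
    def fill(match: re.Match) -> str:
        n = int(match.group(1))
        return f"[{n}]({sources[n - 1]})"
    body = re.sub(r"\[(\d+)\]\(\)", fill, answer)
    refs = "\n".join(f"{i + 1}. {url}" for i, url in enumerate(sources))
    return f"{body}\n\nReferences:\n{refs}"
```

Every factual claim ends up with a clickable marker, and the References block gives readers the doc URLs to verify, go deeper, or share.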

Challenges we solved with this agent

Running a docs agent well comes with a few realities we’ve designed for:
  • Multimodal content — Documentation often has important information in images (diagrams, screenshots, UI). Our pipeline handles multimodal content with state-of-the-art parsing: we parse and index both text and images so the agent can retrieve and use information from figures, screenshots, and diagrams, not just body text.
  • Keeping the agent up to date — Docs change many times a day. The agent must stay in sync with the latest content. You can do that with Contextual AI connectors (e.g. Confluence, SharePoint, Google Drive), or—for GitHub-backed docs—with the code we wrote and share: datastore-sync. It syncs your repo (and optionally your website) into a Contextual AI datastore via webhooks and incremental updates so the agent always has fresh content.
  • Citations — As above: citations aren’t just for trust; they let users open the source and read the full doc after the initial response.
  • Repo vs. web — Unlike web search over docs, our documentation lives in our GitHub repo, and we sync only what’s published (e.g. pages listed in the nav in docs.json). The repo may contain outdated or unlisted pages that still exist but aren’t findable on the site; our agent won’t surface those, because they’re not in the synced datastore. LLMs that use web search can sometimes surface such older or hidden pages, while our agent deliberately reflects only the docs we publish, so answers stay aligned with what we actually ship.
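The published-content filtering from the last bullet can be sketched as follows. This assumes a simplified docs.json shape (a `navigation` list of groups with `pages` slugs); the real schema and the datastore-sync implementation may differ:

```python
import json

def published_pages(docs_json: str) -> set[str]:
    """Collect page slugs listed in a (simplified) docs.json navigation."""
    nav = json.loads(docs_json)["navigation"]
    pages: set[str] = set()
    for group in nav:
        pages.update(group.get("pages", []))
    return pages

def files_to_sync(pushed_files: list[str], docs_json: str) -> list[str]:
    """Keep only changed .mdx files whose slug appears in the published nav,
    so unlisted or outdated repo pages never reach the datastore."""
    published = published_pages(docs_json)
    return [
        path for path in pushed_files
        if path.endswith(".mdx") and path.removesuffix(".mdx") in published
    ]
```

Anything in the repo but outside the nav (drafts, retired pages) is filtered out before ingestion, which is exactly why the agent only answers from what’s published.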

How it works

  1. Your documentation as data: These docs are ingested into a Contextual AI datastore, chunked and indexed. They’re kept in sync so when content is updated, the agent has the latest info (here, via sync code we share; see below).
  2. Keyword search + agent: Mintlify’s built-in search returns keyword-based results immediately. The agent then runs the Agentic Search template—multi-turn research over the datastore—and streams a response with citations and a References section.
  3. Grounded generation: Answers are generated only from the research step, with citations in [n]() format and links to the documentation URLs.
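Step 3 amounts to constraining the model to the research output. A hedged sketch of how such a grounded prompt could be assembled (hypothetical prompt text; the agent’s actual prompting is internal to the platform):

```python
def grounded_prompt(question: str, chunks: list[str]) -> str:
    """Build a prompt that restricts generation to the retrieved chunks,
    numbering each source so the model can emit [n]() citations."""
    sources = "\n".join(f"[{i + 1}] {chunk}" for i, chunk in enumerate(chunks))
    return (
        "Answer using ONLY the sources below. Cite each claim as [n]().\n"
        "If the sources do not contain the answer, say so.\n\n"
        f"Sources:\n{sources}\n\nQuestion: {question}"
    )
```

Because nothing outside the numbered sources is offered to the model, hallucination pressure drops and every `[n]()` marker maps back to a real documentation chunk.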

Try it

Use the search bar at the top and ask a question about Contextual AI.

Build your own: sync + agent

We share the building blocks so you can run a documentation agent on your own content.

datastore-sync

Open-source repo that keeps a Contextual AI datastore in sync with your docs (and optionally your website). We use it for this site: ContextualAI/datastore-sync.
  • GitHub docs sync: Automatically syncs MDX/MD and OpenAPI specs from a GitHub repo (e.g. Mintlify docs). Incremental sync (commit SHAs, tree diffs), published-content filtering via docs.json, and OpenAPI parsing into per-endpoint markdown with human-readable URLs. Compiles MDX to clean PDFs before ingestion for better RAG quality.
  • Website sync (optional): Firecrawl-based sync for other website content, with sitemap crawling and diffing (an agent using this is coming soon).
  • State: Redis-backed state (or local JSON in dev); Vercel-ready with webhook endpoints (e.g. POST /sync/github for GitHub push events).
Use it to feed an agent with your published documentation so it stays up to date as you push changes.
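The incremental-sync idea (commit SHAs plus tree diffs) can be sketched like this. The state shape and status values here are illustrative assumptions, not datastore-sync’s actual format:

```python
def plan_sync(state: dict, head_sha: str, changed: dict[str, str]) -> dict:
    """Decide what to re-ingest given the last synced commit and a tree diff.
    `changed` maps path -> status ("added"/"modified"/"removed"); a head SHA
    matching the stored one means there is nothing to do."""
    if state.get("last_sha") == head_sha:
        return {"upsert": [], "delete": []}
    upsert = [p for p, s in changed.items() if s in ("added", "modified")]
    delete = [p for p, s in changed.items() if s == "removed"]
    return {"upsert": upsert, "delete": delete}
```

In production this state would live in Redis (per the State bullet above) and the plan would be computed inside the `POST /sync/github` webhook handler on each push event.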

Next steps