Nicholas Narmada

Introducing docusaurus-plugin-chat-page: An AI-Powered Chat Interface for Your Documentation

Imagine if your documentation could answer your users' questions in real time, providing context-aware responses complete with source references, all without requiring maintainers to build and manage a separate backend. Today, I'm excited to introduce docusaurus-plugin-chat-page, a plug-and-play Docusaurus plugin that adds an AI-powered chat interface directly to your documentation site.

What Is docusaurus-plugin-chat-page?

Docusaurus has transformed how we build documentation sites with its modern, React-based architecture, Markdown-driven content, and a robust plugin ecosystem. However, one challenge remains: users often have to sift through extensive docs to find the information they need.

docusaurus-plugin-chat-page solves this problem by integrating an interactive chat interface into your documentation site. End users can ask questions in natural language and receive context-aware answers generated from your documentation content.

Key Features

  • AI-Powered Assistance: Users can ask questions and receive intelligent, context-driven answers.
  • Semantic Search via Embeddings: The plugin processes your Markdown files at build time—splitting content into chunks, computing embeddings (using OpenAI for now), and saving these as a JSON asset. At runtime, a cosine similarity search retrieves the most relevant content.
  • Source References: Answers include metadata about the source (file paths, titles) so users know exactly where the information came from.
  • Plug-and-Play Integration: With minimal configuration (just supplying an API key), you can add a chat page to your documentation without managing a backend database.
  • Future-Proof & Model-Agnostic (Coming Soon): While the current release uses OpenAI for both embeddings and chat completions, future releases will allow you to choose from multiple providers by simply updating your configuration.

How It Works

1. Build-Time Ingestion

During the Docusaurus build process, the plugin:

  • Scans Local Markdown Files: It traverses your /docs and /src/pages directories to collect all your Markdown content—no need to scrape HTML.
  • Processes and Chunks Content: Using tools like gray-matter and remark (with strip-markdown), the plugin extracts frontmatter (such as titles and tags) and converts the Markdown into plain text. Then it splits this content into manageable chunks (with configurable maximum chunk size) while preserving metadata like file paths.
  • Generates Embeddings: For each chunk, the plugin calls OpenAI’s Embedding API (currently hardcoded) to compute a vector representation. These embeddings, along with their corresponding text and metadata, are bundled into a JSON file via Docusaurus’s createData API. (A simplified sketch of this pipeline follows this list.)
  • Static Asset Creation: The resulting JSON file is then deployed as a static asset, making it available on the client without needing a live database.
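
To make this concrete, here is a simplified sketch of what the build-time step could look like. It is not the plugin's actual source: the chunk size, model name, and helper names are illustrative assumptions, but it uses the same libraries mentioned above (gray-matter, remark with strip-markdown, and the OpenAI SDK).

// Build-time ingestion sketch (illustrative; not the plugin's actual source)
import fs from "node:fs/promises";
import matter from "gray-matter";
import { remark } from "remark";
import strip from "strip-markdown";
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Split plain text into fixed-size chunks; the 1500-character default is an assumption.
function chunkText(text, maxChars = 1500) {
  const chunks = [];
  for (let i = 0; i < text.length; i += maxChars) {
    chunks.push(text.slice(i, i + maxChars));
  }
  return chunks;
}

async function ingestFile(filePath) {
  const raw = await fs.readFile(filePath, "utf8");

  // Extract frontmatter (title, tags, ...) and the Markdown body.
  const { data: frontmatter, content } = matter(raw);

  // Convert the Markdown body to plain text.
  const plainText = String(await remark().use(strip).process(content));

  // Chunk the text and compute one embedding per chunk.
  const chunks = chunkText(plainText);
  const response = await openai.embeddings.create({
    model: "text-embedding-3-small", // model name is an assumption
    input: chunks,
  });

  // Pair each chunk with its embedding and its source metadata.
  return chunks.map((text, i) => ({
    text,
    embedding: response.data[i].embedding,
    metadata: { filePath, title: frontmatter.title },
  }));
}

In the real plugin, the resulting chunks and embeddings are written out through Docusaurus's createData API so they ship as a static JSON asset with the rest of the build.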

2. Client-Side Chat Processing

At runtime, the chat page:

  • Loads the Embeddings: The precomputed embeddings JSON file is loaded into the client.
  • Processes User Queries: When a user submits a question, the plugin:
    • Converts the query into an embedding using OpenAI’s API.
    • Performs a cosine similarity search against the stored embeddings to find the top relevant chunks (a minimal sketch of this search follows this list).
  • Generates Contextual Answers: The relevant chunks (with their source metadata) are combined with the user’s query to form a prompt. This prompt is sent to the Chat API (currently OpenAI) to generate an answer, which is streamed back in real time.
  • Displays Source References: The chat response includes source information (e.g., “Source: docs/intro.md”) to show where the answer originated.
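
The retrieval step is simple enough to sketch in a few lines. Again, this is an illustration rather than the plugin's actual source; the function names and the top-k value are assumptions.

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a, b) {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank stored chunks against the query embedding and keep the top k.
function findRelevantChunks(queryEmbedding, storedChunks, topK = 5) {
  return storedChunks
    .map((chunk) => ({
      ...chunk,
      score: cosineSimilarity(queryEmbedding, chunk.embedding),
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topK);
}

The top-ranked chunks and their metadata are then concatenated into the prompt sent to the chat completion API, which is how answers can point back to files such as docs/intro.md.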

Configuration

Right now, the plugin requires you to supply an OpenAI API key, which is used for both embeddings and chat completions. Here’s an example of how to configure it in your docusaurus.config.js:

plugins: [
  [
    "docusaurus-plugin-chat-page",
    {
      path: "chat", // URL path for the chat page
      openai: {
        apiKey: process.env.OPENAI_API_KEY, // Your OpenAI API key
        // Currently, both embeddings and chat completions use OpenAI.
        // Future releases will allow you to specify different providers.
      },
    },
  ],
]
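
Because the key is read from process.env, it has to be available when Docusaurus builds your site. One common approach (assuming you add dotenv as a dependency yourself) is to load a local .env file at the top of docusaurus.config.js:

// At the very top of docusaurus.config.js (assumes `npm install dotenv`)
require("dotenv").config();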

Adding to Navigation

To add the chat page to your site's navigation bar, update the themeConfig in your docusaurus.config.js:

module.exports = {
  // ...
  themeConfig: {
    navbar: {
      items: [
        // ... your other navbar items
        {
          to: "/chat", // Make sure this matches your plugin's path configuration
          label: "Chat",
          position: "left",
        },
        // ...
      ],
    },
  },
}

Future Plans

In upcoming releases, I plan to make the plugin model-agnostic, allowing you to specify separate providers for embeddings and chat completions. For now, the plugin uses OpenAI exclusively.

Overcoming Build-Time Memory Challenges

One challenge I ran into was out-of-memory (OOM) errors during the build, caused by the large number of chunks and embeddings being processed at once. To address this, the plugin implements several optimizations:

  • Optimized Chunking: Increasing the default chunk size and capping the maximum number of chunks per file reduce the total number of chunks that need to be embedded.
  • Batch Processing Improvements: Embeddings are generated in smaller batches, with short delays between batches, which lowers peak memory usage (see the sketch after this list).
  • Optional Caching: Future releases may include a caching mechanism so that embeddings are only recomputed for files that have changed.
  • Dynamic Imports: Provider-specific code is loaded dynamically, ensuring that only necessary dependencies are loaded during build and runtime.
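
As a rough illustration of the batching idea, here is a sketch of how embeddings can be generated in small batches with a pause between them. The batch size, delay, and model name are illustrative assumptions, not the plugin's actual defaults.

// Generate embeddings in small batches to keep peak memory low (illustrative sketch)
async function embedInBatches(openai, chunks, batchSize = 20, delayMs = 500) {
  const results = [];
  for (let i = 0; i < chunks.length; i += batchSize) {
    const batch = chunks.slice(i, i + batchSize);
    const response = await openai.embeddings.create({
      model: "text-embedding-3-small", // assumption; the plugin's model may differ
      input: batch.map((chunk) => chunk.text),
    });
    results.push(...response.data.map((d) => d.embedding));
    // Brief pause before the next batch.
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  return results;
}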

Installation & Deployment

Installation is straightforward via npm or yarn:

npm install docusaurus-plugin-chat-page
# or
yarn add docusaurus-plugin-chat-page

For more details, check out the plugin on npm.

After installing, update your docusaurus.config.js as shown above. When you run docusaurus build, the plugin processes your documentation at build time and saves the embeddings as a JSON asset. When deployed, users can access the chat page at the configured route (e.g., /chat) and start interacting with your documentation.

Conclusion

docusaurus-plugin-chat-page transforms your static documentation site into an interactive, AI-powered experience—making it easier for your users to find the information they need. While the current version uses OpenAI for both embeddings and chat completions, future releases will offer model agnosticism, allowing you to choose the best provider for your needs.

I’m excited to see how this plugin empowers documentation maintainers to deliver a richer and more interactive experience. Give it a try on your Docusaurus site and share your feedback!
