
How to serve Markdown instead of HTML for AI Agents

The Bun.sh team uses a really interesting technique to make their docs LLM-friendly.

Basically, when an AI agent (Claude Code in this case) fetches their docs, the server responds with Markdown instead of HTML.

They claim this approach reduces token usage by roughly 10x. And LLMs generally handle Markdown much better than HTML anyway.

How does it work?

AI agents generally don’t include text/html in their Accept header.

For example, when fetching a URL, Claude Code sends a custom Accept header.

Instead of the default */*, it sends application/json, text/plain, */*.

[Screenshot: Claude Code request showing the custom Accept header]

The Bun team uses this to identify agent requests and serve the content accordingly.
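This isn’t Bun’s actual implementation, but here’s a minimal sketch of the idea using Bun.serve: check whether the Accept header mentions text/html and pick the representation accordingly. The docs/getting-started.* file paths are made up for the example.

```ts
// Minimal sketch of Accept-based content negotiation with Bun.serve.
// The docs/getting-started.* paths are hypothetical.
Bun.serve({
  port: 3000,
  fetch(req) {
    const accept = req.headers.get("accept") ?? "*/*";

    // Browsers advertise text/html; most AI agents do not.
    const wantsHtml = accept.includes("text/html");

    const path = wantsHtml
      ? "docs/getting-started.html"
      : "docs/getting-started.md";

    return new Response(Bun.file(path), {
      headers: {
        "Content-Type": wantsHtml
          ? "text/html; charset=utf-8"
          : "text/markdown; charset=utf-8",
      },
    });
  },
});
```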

For example, this is the response we get if we fetch a docs page with a normal browser-style request:

[Screenshot: HTML response for a default request]

And by sending the Accept: application/json, text/plain, */* header, we get Markdown back:

[Screenshot: Markdown response when the agent-style Accept header is sent]
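You can reproduce this from any HTTP client. Here’s a sketch that fetches the same page twice with different Accept headers and compares the content types (https://bun.sh/docs is used as an example URL; the exact response depends on the server):

```ts
// Sketch: request the same page as a browser would and as an agent would,
// then compare the Content-Type of each response.
const url = "https://bun.sh/docs";

const asBrowser = await fetch(url, {
  headers: { Accept: "text/html,*/*" },
});
const asAgent = await fetch(url, {
  headers: { Accept: "application/json, text/plain, */*" },
});

console.log("browser-style:", asBrowser.headers.get("content-type"));
console.log("agent-style:  ", asAgent.headers.get("content-type"));
```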

What’s next?

The Claude Code team says that upcoming versions will send Accept: text/markdown when making requests.
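If you want a server to be ready for that, the negotiation check from the sketch above could also treat an explicit text/markdown as a positive signal (this is an assumption about how that header would look, not confirmed behavior):

```ts
// Sketch: prefer Markdown when the client asks for text/markdown explicitly,
// or when it doesn't advertise text/html at all.
function wantsMarkdown(accept: string): boolean {
  return accept.includes("text/markdown") || !accept.includes("text/html");
}
```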


Happy optimizing docs!

#Claude-Code #AI-Agents
