Interesting proposal similar to robots.txt but for LLMs. When AI is your parser, you can have a single file that is readable by both humans and machines:

> The llms.txt file is unusual in that it uses Markdown to structure the information rather than a classic structured format such as XML. The reason for this is that we expect many of these files to be read by language models and agents.
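For illustration, an llms.txt file following that Markdown structure might look something like the sketch below. The project name, section names, and URLs are made up for the example; the general shape (an H1 title, a short blockquote summary, then H2 sections listing links) follows the proposal.

```markdown
# Example Project

> A one-paragraph summary of the site so a language model knows what it covers.

## Docs

- [Quickstart](https://example.com/docs/quickstart.md): How to get up and running
- [API reference](https://example.com/docs/api.md): Endpoints and parameters

## Optional

- [Changelog](https://example.com/changelog.md): Release history, safe to skip for short contexts
```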

Thomas Brand

@manton clearly we should have stuck with Gopher.

marius

> It can complement robots.txt by providing context for allowed content.

@manton call me cynical, but since a lot of AI crawlers don't really respect robots.txt, I am pretty sure they won't follow any other type of document that tries a similar thing. This is wasted effort.

Matt Anderson

@mariusor maybe, although robots.txt is kind of like a “keep off the grass” sign, easily ignored with no motivation to follow it. If I’m understanding the llms.txt proposal, it seems more valuable to the reader, as it offers more streamlined, text-friendly information. There’s motivation to use that in non-crawling/indexing scenarios.
