LM Studio's 7,437-line llms.txt shows what thorough AI preparation looks like
Run Llama, DeepSeek, and Phi locally with LM Studio—a powerful desktop app for experimenting with LLMs. Easy setup, chat interface, and model management!
- 7,437 lines (+426% vs. average)
- 149 sections (+521% vs. average)
- 742+ companies using llms.txt
- 2 files: llms.txt + llms-full.txt
Key Insights
- Comprehensive structure: 149 distinct sections give AI systems thorough topical coverage.
- Extensive detail: 7,437 lines of documentation for AI systems to draw on.
- Two-file approach: both llms.txt and llms-full.txt are provided to serve different AI use cases.
llms.txt Preview
First 100 lines of 7,437 total
# app
# About LM Studio
> Learn how to run Llama, DeepSeek, Phi, and other LLMs locally with LM Studio.
LM Studio is a desktop app for developing and experimenting with LLMs locally on your computer.
**Key functionality**
1. A desktop application for running local LLMs
2. A familiar chat interface
3. Search & download functionality (via Hugging Face 🤗)
4. A local server that can listen on OpenAI-like endpoints
5. Systems for managing local models and configurations
<hr>
### How do I install LM Studio?
Head over to the [Downloads page](/download) and download an installer for your operating system.
LM Studio is available for macOS, Windows, and Linux.
<hr>
### System requirements
LM Studio generally supports Apple Silicon Macs, x64/ARM64 Windows PCs, and x64 Linux PCs.
Consult the [System Requirements](app/system-requirements) page for more detailed information.
<hr>
### Run llama.cpp (GGUF) or MLX models
LM Studio supports running LLMs on Mac, Windows, and Linux using [`llama.cpp`](https://github.com/ggerganov/llama.cpp).
On Apple Silicon Macs, LM Studio also supports running LLMs using Apple's [`MLX`](https://github.com/ml-explore/mlx).
To install or manage LM Runtimes, press `⌘` `Shift` `R` on Mac or `Ctrl` `Shift` `R` on Windows/Linux.
<hr>
### Run an LLM like `Llama`, `Phi`, or `DeepSeek R1` on your computer
To run an LLM on your computer you first need to download the model weights.
You can do this right within LM Studio! See [Download an LLM](app/basics/download-model) for guidance.
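The same workflow can be sketched from the terminal. This is a rough illustration only: the `lms get` subcommand and the model identifier below are assumptions, not taken from this page (`lms load` appears later in the document).
```
# Illustrative sketch; command names and model identifier are assumptions
# Download model weights from Hugging Face via the lms CLI
lms get deepseek-r1-distill-qwen-7b

# Load the downloaded model so it can serve chat or API requests
lms load deepseek-r1-distill-qwen-7b
```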
<hr>
### Chat with documents entirely offline on your computer
You can attach documents to your chat messages and interact with them entirely offline, a capability commonly known as "RAG" (retrieval-augmented generation).
Read more about how to use this feature in the [Chat with Documents](app/basics/rag) guide.
### Use LM Studio's API from your own apps and scripts
LM Studio provides a REST API that you can use to interact with your local models from your own apps and scripts.
- [OpenAI Compatibility API](api/openai-api)
- [LM Studio REST API (beta)](api/rest-api)
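As a minimal sketch of the OpenAI Compatibility API, assuming the local server is running on its default port (1234) and the model named below has already been downloaded, a chat completion request could look like this:
```
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-r1-distill-qwen-7b",
    "messages": [
      { "role": "user", "content": "What can you run locally?" }
    ]
  }'
```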
<hr>
### Community
Join the LM Studio community on [Discord](https://discord.gg/aPQfnNkxGC) to ask questions, share knowledge, and get help from other users and the LM Studio team.
## API Changelog
> LM Studio API Changelog - new features and updates
###### [👾 LM Studio 0.3.9](blog/lmstudio-v0.3.9) • 2025-01-30
### Idle TTL and Auto Evict
Set a TTL (in seconds) for models loaded via API requests (docs article: [Idle TTL and Auto-Evict](/docs/api/ttl-and-auto-evict))
```diff
curl http://localhost:1234/api/v0/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-r1-distill-qwen-7b",
    "messages": [ ... ],
+   "ttl": 300
  }'
```
With `lms`:
```
lms load --ttl <seconds>
```
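For example, assuming the same illustrative model identifier as in the curl snippet above:
```
lms load deepseek-r1-distill-qwen-7b --ttl 300
```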
LM Studio is ready for AI search. Are you?
Join 742+ companies preparing their websites for the future of search. Create your llms.txt file in minutes.
Generate Your llms.txt
Don't get left behind
Your competitors are preparing for AI search.
LM Studio has 149 organized sections ready for AI crawlers. Generate your llms.txt file and join the companies optimizing for the future of search.