# CLI Reference
Moltext is a high-density documentation compiler designed to be invoked via the command line. It transforms standard web-based documentation into a deterministic, agent-readable Markdown structure.
## Usage

```
moltext <url> [options]
```
## Arguments

| Argument | Description |
| :--- | :--- |
| `<url>` | The entry point URL of the documentation site you wish to compile. Moltext will crawl links within the same domain from this starting point. |
## Options

| Flag | Description | Default |
| :--- | :--- | :--- |
| `-r, --raw` | Raw mode. Skips LLM normalization and returns cleaned, structured Markdown without AI-driven compression. | `false` |
| `-o, --output <path>` | Destination file path for the compiled context. | `context.md` |
| `-l, --limit <number>` | Maximum number of pages to crawl and parse. Prevents accidental infinite loops on massive sites. | `100` |
| `-k, --key <key>` | API key for LLM processing. Can also be set via the `OPENAI_API_KEY` environment variable. | — |
| `-u, --base-url <url>` | API endpoint for the LLM provider. Useful for local inference or alternative providers. | `https://api.openai.com/v1` |
| `-m, --model <model>` | Model used for documentation normalization and compression. | `gpt-4o-mini` |
## Authentication & Modes

Moltext adapts its authentication requirements based on your processing mode:

- **Raw Mode** (`--raw`): No API key is required. Moltext performs standard HTML-to-Markdown conversion with aggressive noise filtering (removing navs, footers, and scripts).
- **Local Inference**: If `--base-url` points away from OpenAI (e.g., to Ollama or LM Studio at `http://localhost:11434/v1`), Moltext uses a placeholder key.
- **Managed LLM (OpenAI)**: Requires a valid API key via the `-k` flag or the `OPENAI_API_KEY` environment variable.
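As a sketch of the managed mode, the key can be exported once instead of passed per invocation. The key below is a placeholder, and the call is guarded so the snippet is a no-op on machines where `moltext` is not installed:

```shell
# Export a placeholder key; substitute your real OpenAI key.
export OPENAI_API_KEY="sk-your-key-here"

# With the variable exported, the -k flag can be omitted.
# Guarded so this is a no-op where moltext is not on PATH.
if command -v moltext >/dev/null 2>&1; then
  moltext https://docs.example.com -m gpt-4o-mini
fi
```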
## Examples

### Agent-Native "Raw" Ingestion

Recommended for agents with large context windows that prefer to handle raw data themselves.

```shell
moltext https://docs.example.com --raw -o internal_docs.md
```
### High-Density Compression (OpenAI)

Compresses documentation into a hyper-concise format optimized for vector retrieval and RAG.

```shell
moltext https://docs.example.com -k sk-proj-... -m gpt-4o
```
### Local Logic Flow (Ollama)

Process documentation through a local model to keep data on-premise.

```shell
moltext https://docs.example.com \
  --base-url http://localhost:11434/v1 \
  --model llama3 \
  --limit 50
```
## Environment Variables

You can define a `.env` file in your working directory to avoid passing keys via the CLI:

```
OPENAI_API_KEY=sk-your-key-here
```
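A minimal sketch of creating that file from the shell (the key is a placeholder):

```shell
# Write a .env file in the current directory with a placeholder key.
cat > .env <<'EOF'
OPENAI_API_KEY=sk-your-key-here
EOF
```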