# Command Line Interface
The moltext CLI is the primary entry point for compiling human-centric documentation into agent-native context. It handles crawling, HTML cleaning, and LLM-based normalization.
## Basic Syntax
```shell
moltext <url> [options]
```
- `<url>`: The base URL of the documentation site you wish to compile. The crawler stays within this domain and protocol.
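The "same domain and protocol" scope rule can be sketched in shell. This is an illustrative helper, not moltext's actual code; `origin` and `in_scope` are invented names:

```shell
# Hypothetical sketch of the crawl-scope rule: a link is followed only if
# its scheme and host match the base URL's. Not part of moltext itself.
origin() {
  scheme=${1%%://*}     # e.g. "https"
  rest=${1#*://}        # strip the scheme
  host=${rest%%/*}      # e.g. "docs.example.com"
  printf '%s://%s\n' "$scheme" "$host"
}

in_scope() {
  [ "$(origin "$1")" = "$(origin "$2")" ]
}

in_scope https://docs.example.com https://docs.example.com/guide && echo in-scope
in_scope https://docs.example.com https://blog.example.com/post  || echo skipped
```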
## Options
| Flag | Description | Default |
| :--- | :--- | :--- |
| `-k, --key <key>` | Your LLM API key. Required for OpenAI processing. | `process.env.OPENAI_API_KEY` |
| `-u, --base-url <url>` | The base URL for the LLM API. Supports local providers such as Ollama. | `https://api.openai.com/v1` |
| `-m, --model <model>` | The model to use for normalization. | `gpt-4o-mini` |
| `-r, --raw` | Raw mode: skips LLM processing and returns structured Markdown only. | `false` |
| `-o, --output <path>` | The destination file path for the compiled context. | `context.md` |
| `-l, --limit <number>` | Maximum number of pages to crawl and parse. | `100` |
## Authentication Modes
Moltext adapts its authentication requirements based on your processing mode:
- Raw Mode (`--raw`): No API key is required. The CLI simply extracts and cleans the Markdown.
- Local Inference: If you point `--base-url` to a local endpoint (e.g., `http://localhost:11434/v1`), the CLI uses a dummy key automatically.
- OpenAI Mode: Requires a valid API key via the `-k` flag or the `OPENAI_API_KEY` environment variable.
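The precedence between these modes can be sketched as a shell function. This is a hypothetical illustration of the rules above, not moltext's implementation; `resolve_key` and its arguments are invented names, and raw mode is omitted since it needs no key at all:

```shell
# Hypothetical sketch of key selection: an explicit -k value wins, then
# OPENAI_API_KEY, then a dummy key for local endpoints; otherwise fail.
resolve_key() {
  flag_key=$1
  base_url=$2
  if [ -n "$flag_key" ]; then
    printf '%s\n' "$flag_key"          # explicit -k flag
  elif [ -n "$OPENAI_API_KEY" ]; then
    printf '%s\n' "$OPENAI_API_KEY"    # fall back to the environment
  else
    case $base_url in
      http://localhost*|http://127.0.0.1*)
        printf 'dummy\n' ;;            # local inference: key is unused
      *)
        echo 'error: API key required' >&2
        return 1 ;;
    esac
  fi
}
```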
## Usage Examples
### 1. Quick Ingestion (Raw Mode)
Best for high-speed parsing where you want the agent to handle the raw data directly. No LLM costs or keys involved.
```shell
moltext https://docs.example.com --raw -o example_docs.md
```
### 2. Agentic Optimization (OpenAI)
Compiles documentation into a high-density, "agent-readable" format using `gpt-4o-mini`. This strips conversational filler and optimizes the output for vector retrieval.
```shell
moltext https://docs.example.com -k sk-xxxx...
```
### 3. Local Privacy Mode (Ollama/LM Studio)
Keep your documentation parsing entirely local by routing requests to a local inference server.
```shell
moltext https://docs.example.com \
  --base-url http://localhost:11434/v1 \
  --model llama3
```
### 4. Targeted Deep Crawl
Increase the page limit for massive documentation repositories.
```shell
moltext https://massive-docs.io --limit 500 --output deep_context.md
```
## Environment Variables
You can define a `.env` file in your working directory or set variables in your shell to avoid passing flags manually:
- `OPENAI_API_KEY`: Used as the default key for LLM processing if the `-k` flag is omitted.
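For example, a minimal `.env` file (the key value is a placeholder):

```shell
# .env — picked up from the working directory
OPENAI_API_KEY=sk-xxxx...
```

Setting the same variable in your shell (e.g., `export OPENAI_API_KEY=...`) has the equivalent effect.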