# Examples

Real-world examples showing how to use livectx effectively in different scenarios.

## Quick Context for AI Chat

The simplest use case: generate context and paste it into an AI chat.

```sh
$ cd my-project && livectx generate
$ cat SYSTEM_PROMPT.md | pbcopy  # Copy to clipboard (macOS)
```

Then paste into Claude, ChatGPT, or your preferred AI assistant.
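If you copy the prompt often, a small helper can pick whichever clipboard tool the machine actually has. This is a sketch, not part of livectx, and the tool names (`pbcopy`, `xclip`, `wl-copy`) are assumptions about the host system:

```python
import shutil
import subprocess
from pathlib import Path

def clipboard_command():
    """Return the first available clipboard command, or None if there is none."""
    candidates = (
        ["pbcopy"],                            # macOS
        ["xclip", "-selection", "clipboard"],  # Linux (X11)
        ["wl-copy"],                           # Linux (Wayland)
    )
    for cmd in candidates:
        if shutil.which(cmd[0]):
            return cmd
    return None

def copy_prompt(path="SYSTEM_PROMPT.md"):
    """Copy the generated prompt to the clipboard, printing it as a fallback."""
    text = Path(path).read_text(encoding="utf-8")
    cmd = clipboard_command()
    if cmd:
        subprocess.run(cmd, input=text.encode(), check=True)
    else:
        print(text)  # no clipboard tool found; print so it can be copied manually
```

The fallback `print` keeps the helper usable over SSH, where no clipboard tool is available.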
## Large Codebase

For large projects, you may want to limit the number of files to reduce token usage and cost:

```yaml
max_files: 200
max_file_size: 51200 # 50KB
exclude:
  - "**/*.test.*"
  - "**/*.spec.*"
  - "**/fixtures/**"
  - "**/testdata/**"
  - "**/__mocks__/**"
  - "**/node_modules/**"
  - "**/dist/**"
  - "**/build/**"
```

## Monorepo
For monorepos, generate context for specific packages:
```sh
$ cd packages/api && livectx generate --output .
$ cd packages/web && livectx generate --output .
```

Or create a single context for the entire monorepo with selective includes:

```yaml
include:
  - "packages/shared/**"
  - "packages/api/src/**"
  - "packages/web/src/**"
  - "package.json"
  - "turbo.json"
exclude:
  - "**/node_modules/**"
  - "**/*.test.*"
```

## Go Project
Configuration optimized for Go codebases:
```yaml
include:
  - "*.go"
  - "go.mod"
  - "go.sum"
  - "Makefile"
  - "*.yaml"
  - "*.yml"
exclude:
  - "*_test.go"
  - "**/testdata/**"
  - "**/vendor/**"
```

## TypeScript/React Project
Configuration for a typical React application:
```yaml
include:
  - "src/**/*.ts"
  - "src/**/*.tsx"
  - "package.json"
  - "tsconfig.json"
  - "tailwind.config.*"
  - "next.config.*"
exclude:
  - "**/*.test.*"
  - "**/*.spec.*"
  - "**/__tests__/**"
  - "**/node_modules/**"
  - "**/dist/**"
  - "**/.next/**"
max_file_size: 102400 # 100KB
```

## Python/Django Project
Configuration for a Django or general Python project:

```yaml
include:
  - "**/*.py"
  - "requirements*.txt"
  - "pyproject.toml"
  - "setup.py"
exclude:
  - "**/tests/**"
  - "**/test_*.py"
  - "**/*_test.py"
  - "**/migrations/**"
  - "**/__pycache__/**"
  - "**/venv/**"
  - "**/.venv/**"
```

## Dry Run
Preview what will be generated without writing files:

```sh
$ livectx generate --dry-run --verbose
```

This is useful for:

- Debugging exclude/include patterns
- Seeing which files will be included
- Estimating token count before generation
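You can also approximate the pattern matching locally without rerunning the tool. The sketch below uses Python's `fnmatch`, which is only a rough approximation of livectx's matcher (for example, real `**/` patterns typically also match zero leading directories, which `fnmatch` does not); the sample paths are hypothetical:

```python
from fnmatch import fnmatch

# Exclude patterns like those in the configs above, plus some sample paths.
excludes = ["**/*.test.*", "**/node_modules/**"]
paths = [
    "src/app.ts",
    "src/app.test.ts",
    "web/node_modules/react/index.js",
]

for path in paths:
    # A path is dropped if it matches any exclude pattern.
    excluded = any(fnmatch(path, pat) for pat in excludes)
    print(f"{path}: {'excluded' if excluded else 'included'}")
```

Running this shows `src/app.ts` as included and the other two as excluded, which is a quick sanity check before trusting a pattern in the real config.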
## Custom Output Location

Output to a docs directory:

```sh
$ livectx generate --output ./docs/ai-context
```

Or configure it in YAML:

```yaml
output_dir: "./docs/ai-context"
context_file: "codebase.md"
prompt_file: "assistant-instructions.md"
```

## Using with Different AI Models
### Claude (Anthropic)

```yaml
llm:
  provider: anthropic
  model: claude-sonnet-4-20250514 # or claude-3-opus, claude-3-haiku
```

### OpenRouter (Multiple Providers)

```yaml
llm:
  provider: openrouter
  model: anthropic/claude-3.5-sonnet # Anthropic via OpenRouter
  # model: openai/gpt-4-turbo        # OpenAI via OpenRouter
  # model: google/gemini-pro         # Google via OpenRouter
```

## Integration with AI Assistants
livectx generates two files that work together:
- SYSTEM_PROMPT.md — Instructions for the AI (how to work with your codebase)
- CONTEXT.md — The actual codebase content (file tree, key files, code)
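For tools that accept only a single system prompt, the two files can be joined into one paste-able block. A minimal sketch, assuming the default file names; the `---` separator is an arbitrary choice, not a livectx convention:

```python
from pathlib import Path

def combined_prompt(prompt_file="SYSTEM_PROMPT.md", context_file="CONTEXT.md"):
    """Join the AI instructions and the codebase context into one string."""
    parts = []
    for path in (prompt_file, context_file):
        p = Path(path)
        if p.exists():  # tolerate a missing file instead of crashing
            parts.append(p.read_text(encoding="utf-8"))
    return "\n\n---\n\n".join(parts)
```

Keeping SYSTEM_PROMPT.md first means the instructions precede the code they describe, which is the order the two files are meant to be read in.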
### Automatic Integration

The easiest way to set up your AI tools is with the integrate command:

```sh
$ livectx generate   # Generate context files
$ livectx integrate  # Configure AI tools automatically
```

This auto-detects your AI coding tools and configures them to reference the context files:
- Claude Code → Updates CLAUDE.md
- Cursor → Updates .cursorrules
- GitHub Copilot → Updates .github/copilot-instructions.md
You can also target specific tools:
```sh
$ livectx integrate claude-code cursor
```

### Manual Integration

If you prefer manual setup, add references to both files in your tool's config:

```markdown
# Project Instructions

Read SYSTEM_PROMPT.md for codebase conventions and patterns.
Read CONTEXT.md for the full codebase context and file contents.
```

### Web-based AI
**ChatGPT / Claude web:** For manual conversations, paste SYSTEM_PROMPT.md at the start. For larger context needs, also include relevant sections from CONTEXT.md.
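Before pasting CONTEXT.md into a web chat, it can help to estimate its size against the model's context window. A rough sketch; the ~4 characters-per-token heuristic is an assumption for English text and code, not livectx's actual tokenizer:

```python
from pathlib import Path

def rough_token_count(path="CONTEXT.md"):
    """Estimate token count assuming roughly 4 characters per token."""
    return len(Path(path).read_text(encoding="utf-8")) // 4
```

If the estimate approaches the model's context limit, paste only the relevant sections of CONTEXT.md instead of the whole file.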
## CI/CD Integration
See the GitHub Action documentation for comprehensive CI/CD integration examples.