Examples

Real-world examples showing how to use livectx effectively in different scenarios.

Quick Context for AI Chat

The simplest use case: generate context and paste it into an AI chat.

Terminal
$ cd my-project && livectx generate
Terminal
$ cat SYSTEM_PROMPT.md | pbcopy # Copy to clipboard (macOS)
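On Linux, a similar one-liner works with xclip or wl-copy (assuming one of them is installed):

Terminal
$ cat SYSTEM_PROMPT.md | xclip -selection clipboard # Copy to clipboard (Linux/X11)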

Then paste into Claude, ChatGPT, or your preferred AI assistant.

Large Codebase

For large projects, you may want to cap the number of files and the maximum file size to reduce token usage and cost:

.livectx.yaml
max_files: 200
max_file_size: 51200  # 50KB

exclude:
  - "**/*.test.*"
  - "**/*.spec.*"
  - "**/fixtures/**"
  - "**/testdata/**"
  - "**/__mocks__/**"
  - "**/node_modules/**"
  - "**/dist/**"
  - "**/build/**"

Monorepo

For monorepos, generate context for specific packages:

Terminal
$ cd packages/api && livectx generate --output .
Terminal
$ cd packages/web && livectx generate --output .

Or create a single context for the entire monorepo with selective includes:

.livectx.yaml
include:
  - "packages/shared/**"
  - "packages/api/src/**"
  - "packages/web/src/**"
  - "package.json"
  - "turbo.json"

exclude:
  - "**/node_modules/**"
  - "**/*.test.*"

Go Project

Configuration optimized for Go codebases:

.livectx.yaml
include:
  - "*.go"
  - "go.mod"
  - "go.sum"
  - "Makefile"
  - "*.yaml"
  - "*.yml"

exclude:
  - "*_test.go"
  - "**/testdata/**"
  - "**/vendor/**"

TypeScript/React Project

Configuration for a typical React application:

.livectx.yaml
include:
  - "src/**/*.ts"
  - "src/**/*.tsx"
  - "package.json"
  - "tsconfig.json"
  - "tailwind.config.*"
  - "next.config.*"

exclude:
  - "**/*.test.*"
  - "**/*.spec.*"
  - "**/__tests__/**"
  - "**/node_modules/**"
  - "**/dist/**"
  - "**/.next/**"

max_file_size: 102400  # 100KB

Python/Django Project

.livectx.yaml
include:
  - "**/*.py"
  - "requirements*.txt"
  - "pyproject.toml"
  - "setup.py"

exclude:
  - "**/tests/**"
  - "**/test_*.py"
  - "**/*_test.py"
  - "**/migrations/**"
  - "**/__pycache__/**"
  - "**/venv/**"
  - "**/.venv/**"

Dry Run

Preview what will be generated without writing files:

Terminal
$ livectx generate --dry-run --verbose

This is useful for:

  • Debugging exclude/include patterns
  • Seeing which files will be included
  • Estimating token count before generation

Custom Output Location

Output to a docs directory:

Terminal
$ livectx generate --output ./docs/ai-context

Or configure it in .livectx.yaml:

.livectx.yaml
output_dir: "./docs/ai-context"
context_file: "codebase.md"
prompt_file: "assistant-instructions.md"

Using with Different AI Models

Claude (Anthropic)

.livectx.yaml
llm:
  provider: anthropic
  model: claude-sonnet-4-20250514  # or claude-3-opus, claude-3-haiku

OpenRouter (Multiple Providers)

.livectx.yaml
llm:
  provider: openrouter
  model: anthropic/claude-3.5-sonnet  # Anthropic via OpenRouter
  # model: openai/gpt-4-turbo        # OpenAI via OpenRouter
  # model: google/gemini-pro          # Google via OpenRouter

Integration with AI Assistants

livectx generates two files that work together:

  • SYSTEM_PROMPT.md — Instructions for the AI (how to work with your codebase)
  • CONTEXT.md — The actual codebase content (file tree, key files, code)

Automatic Integration

The easiest way to set up your AI tools is with the integrate command:

Terminal
$ livectx generate # Generate context files
Terminal
$ livectx integrate # Configure AI tools automatically

This auto-detects your AI coding tools and configures them to reference the context files:

  • Claude Code → Updates CLAUDE.md
  • Cursor → Updates .cursorrules
  • GitHub Copilot → Updates .github/copilot-instructions.md

You can also target specific tools:

Terminal
$ livectx integrate claude-code cursor

Manual Integration

If you prefer manual setup, add references to both files in your tool's config:

CLAUDE.md
# Project Instructions

Read SYSTEM_PROMPT.md for codebase conventions and patterns.
Read CONTEXT.md for the full codebase context and file contents.
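The same idea applies to the other tools. For Cursor, for example, a .cursorrules entry might look like this (a sketch, not a required format):

.cursorrules
Read SYSTEM_PROMPT.md for codebase conventions and patterns.
Read CONTEXT.md for the full codebase context and file contents.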

Web-based AI

ChatGPT / Claude web: For manual conversations, paste the contents of SYSTEM_PROMPT.md at the start of the chat. If the model needs more context, also include the relevant sections of CONTEXT.md.
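On macOS, for example, you can copy both files to the clipboard in one step before pasting them into the chat:

Terminal
$ cat SYSTEM_PROMPT.md CONTEXT.md | pbcopy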

CI/CD Integration

See the GitHub Action documentation for comprehensive CI/CD integration examples.
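As a minimal starting point, you can also hand-roll a workflow that regenerates the context on every push. The sketch below assumes livectx is already available on the runner (the installation step is left as a placeholder) and commits the regenerated files back to the repository:

.github/workflows/livectx.yml
name: Regenerate AI context

on:
  push:
    branches: [main]

permissions:
  contents: write

jobs:
  context:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Placeholder: install livectx here (see the installation docs)

      - name: Generate context files
        run: livectx generate

      - name: Commit updated context
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add SYSTEM_PROMPT.md CONTEXT.md
          git diff --cached --quiet || git commit -m "chore: regenerate AI context"
          git push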