autocoder/.env.example
nogataka 3588dc8df7 feat: add EXTRA_READ_PATHS for read-only external file access
Allow agents to read files from directories outside the project folder
via the EXTRA_READ_PATHS environment variable.

Changes:
- Add EXTRA_READ_PATHS_VAR constant in client.py
- Parse comma-separated paths and add Read/Glob/Grep permissions
- Log configured extra read paths on agent startup
- Document the feature in .env.example

Usage:
  EXTRA_READ_PATHS=/path/to/docs,/path/to/libs

Security: External paths are read-only (no Write/Edit permissions)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-29 11:55:49 +09:00
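
A minimal sketch of the parsing described in this commit, for orientation only; the helper name and the permission-rule format are assumptions, not the actual client.py implementation:

import logging
import os

logger = logging.getLogger(__name__)

EXTRA_READ_PATHS_VAR = "EXTRA_READ_PATHS"

def extra_read_permissions() -> list[str]:
    """Hypothetical helper: turn EXTRA_READ_PATHS into read-only permission rules."""
    raw = os.environ.get(EXTRA_READ_PATHS_VAR, "")
    paths = [p.strip() for p in raw.split(",") if p.strip()]
    if paths:
        logger.info("Extra read paths: %s", ", ".join(paths))
    rules = []
    for path in paths:
        # Only read-oriented tools; Write/Edit are deliberately omitted.
        for tool in ("Read", "Glob", "Grep"):
            rules.append(f"{tool}({path}/**)")  # rule format is an assumption
    return rules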

# Optional: N8N webhook for progress notifications
# PROGRESS_N8N_WEBHOOK_URL=https://your-n8n-instance.com/webhook/...

# Playwright Browser Configuration
#
# PLAYWRIGHT_BROWSER: Which browser to use for testing
# - firefox: Lower CPU usage, recommended (default)
# - chrome: Google Chrome
# - webkit: Safari engine
# - msedge: Microsoft Edge
# PLAYWRIGHT_BROWSER=firefox
#
# PLAYWRIGHT_HEADLESS: Run browser without visible window
# - true: Browser runs in background, saves CPU (default)
# - false: Browser opens a visible window (useful for debugging)
# PLAYWRIGHT_HEADLESS=true

# Extra Read Paths (Optional)
# Comma-separated list of absolute paths for read-only access to external directories.
# The agent can read files from these paths but cannot write to them.
# Useful for referencing documentation, shared libraries, or other projects.
# Example: EXTRA_READ_PATHS=/Volumes/Data/dev,/Users/shared/libs
# EXTRA_READ_PATHS=

# GLM/Alternative API Configuration (Optional)
# To use Zhipu AI's GLM models instead of Claude, uncomment and set these variables.
# This only affects AutoCoder - your global Claude Code settings remain unchanged.
# Get an API key at: https://z.ai/subscribe
#
# ANTHROPIC_BASE_URL=https://api.z.ai/api/anthropic
# ANTHROPIC_AUTH_TOKEN=your-zhipu-api-key
# API_TIMEOUT_MS=3000000
# ANTHROPIC_DEFAULT_SONNET_MODEL=glm-4.7
# ANTHROPIC_DEFAULT_OPUS_MODEL=glm-4.7
# ANTHROPIC_DEFAULT_HAIKU_MODEL=glm-4.5-air

# Ollama Local Model Configuration (Optional)
# To use local models via Ollama instead of Claude, uncomment and set these variables.
# Requires Ollama v0.14.0+ with Anthropic API compatibility.
# See: https://ollama.com/blog/claude
#
# ANTHROPIC_BASE_URL=http://localhost:11434
# ANTHROPIC_AUTH_TOKEN=ollama
# API_TIMEOUT_MS=3000000
# ANTHROPIC_DEFAULT_SONNET_MODEL=qwen3-coder
# ANTHROPIC_DEFAULT_OPUS_MODEL=qwen3-coder
# ANTHROPIC_DEFAULT_HAIKU_MODEL=qwen3-coder
#
# Model recommendations:
# - For best results, use a capable coding model like qwen3-coder or deepseek-coder-v2
# - You can use the same model for all tiers, or different models per tier
# - Larger models (70B+) work best for Opus tier, smaller (7B-20B) for Haiku
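
Before pointing AutoCoder at a GLM or Ollama backend configured above, it can help to confirm the endpoint answers Anthropic-style requests. A minimal sketch, assuming the configured base URL exposes Anthropic's /v1/messages route and that the requests package is installed; some backends expect an "Authorization: Bearer" header instead of "x-api-key":

import os

import requests

base_url = os.environ.get("ANTHROPIC_BASE_URL", "https://api.anthropic.com").rstrip("/")
token = os.environ.get("ANTHROPIC_AUTH_TOKEN", "")
model = os.environ.get("ANTHROPIC_DEFAULT_SONNET_MODEL", "glm-4.7")  # falls back to the GLM example above

resp = requests.post(
    f"{base_url}/v1/messages",
    headers={
        "x-api-key": token,  # some backends want "Authorization": f"Bearer {token}" instead
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    },
    json={
        "model": model,
        "max_tokens": 16,
        "messages": [{"role": "user", "content": "ping"}],
    },
    timeout=60,
)
print(resp.status_code)
print(resp.text[:300])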