RCoder is a modern AI-powered development platform built in Rust. It provides unified interaction with multiple AI agents through the SACP (Symposium ACP) protocol, and features a microservice architecture with Docker-based containerized deployment and high-performance gRPC communication.
Computer Agent - Containerized AI agent environment with VNC remote desktop, audio streaming, and IME input
Architecture
Overview
```
External Client (HTTP/SSE)
        |
RCoder (HTTP API Server + Docker Management + gRPC Client)
        |  gRPC (Chat, CancelSession, SubscribeProgress)
Agent Runner (gRPC Server in Docker)
        |  Server Streaming (real-time progress events)
RCoder (converts gRPC stream to SSE)
        |
External Client (SSE)
```
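The gRPC-to-SSE conversion step in the flow above can be sketched in shell: each progress event received from the Agent Runner stream is re-emitted to the external client as an SSE `data:` frame. The event payloads below are placeholders, not the actual event schema.

```shell
# Re-emit each progress event as an SSE "data:" frame followed by a blank
# line, which is the framing SSE clients expect. Payloads are placeholders.
emit_sse() {
  printf 'data: %s\n\n' "$1"
}

frames=$(emit_sse '{"event":"started"}'; emit_sse '{"event":"done"}')
echo "$frames"
```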
Core Components
- RCoder Main Service - Axum HTTP server + container management + gRPC client
- Agent Runner - isolated AI agent runtime environment (inside Docker), provides the gRPC service
- Pingora Proxy - high-performance reverse proxy with port-based routing
- Docker Manager - global container lifecycle management
Tech Stack
| Category | Technology | Description |
|----------|------------|-------------|
| Language | Rust 2024 Edition | Modern systems programming language |
| HTTP Framework | Axum + Tower | High-performance async web framework |
| RPC Framework | Tonic (gRPC) | High-performance RPC communication |
| AI Protocol | SACP + MCP | Multi-agent protocol support |
| Containerization | Docker + Bollard | Container management and orchestration |
| Database | DuckDB + SQLx | Embedded analytical database |
| Logging | Tracing + OpenTelemetry | Structured logging and distributed tracing |
| Profiling | Pyroscope | Continuous performance profiling |
| CLI | clap | Modern command-line argument parsing |
Getting Started
Prerequisites
Rust 1.85+ (the 2024 Edition requires rustc 1.85 or newer)
Docker (for containerized deployment)
Optional: Claude Code CLI (for Claude agent)
Installation & Running
Local Development
```shell
# Clone the repository
git clone https://github.com/your-org/rcoder.git
cd rcoder

# Build all crates
cargo build --workspace

# Run the main service
cargo run --bin rcoder

# Specify port and projects directory
cargo run --bin rcoder -- --port 8087 --projects-dir ./my-projects
```
Docker Development Mode (Recommended)
```shell
# Build images and start containers
make dev-build      # Build Docker images
make dev-up         # Start development containers

# Restart after code changes
make dev-restart    # Rebuild and restart

# View logs
make dev-logs

# Stop containers
make dev-down
```
Enable Reverse Proxy
```shell
# Enable Pingora reverse proxy
cargo run --bin rcoder -- --enable-proxy --proxy-port 8080

# Specify default backend port
cargo run --bin rcoder -- --enable-proxy --proxy-port 8080 --backend-port 3000
```
CLI Arguments
| Argument | Short | Description | Example |
|----------|-------|-------------|---------|
| `--port` | `-p` | Set main service port | `--port 8087` |
| `--projects-dir` | `-d` | Set project workspace directory | `--projects-dir ./projects` |
| `--enable-proxy` | - | Enable Pingora reverse proxy | `--enable-proxy` |
| `--proxy-port` | - | Set Pingora listen port | `--proxy-port 8080` |
| `--backend-port` | - | Default backend port | `--backend-port 3000` |
```shell
# View all arguments
cargo run --bin rcoder -- --help
```
API Reference
Pingora Reverse Proxy
Pingora is the built-in high-performance reverse proxy.
```shell
# Enable proxy
cargo run --bin rcoder -- --enable-proxy --proxy-port 8080

# Proxy request example (forward to port 5173)
curl "http://127.0.0.1:8080/proxy/5173/page/123/"
```
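The `/proxy/{port}/{path}` route strips the first two path segments and forwards the remainder to the given local port, as the curl example suggests. A minimal sketch of that rewrite rule (an illustration only, not RCoder's actual implementation):

```shell
# Split "/proxy/{port}/{rest}" into its target port and forwarded path,
# mirroring the curl example above.
path="/proxy/5173/page/123/"
port=$(echo "$path" | cut -d/ -f3)
rest="/$(echo "$path" | cut -d/ -f4-)"
echo "forwarding to http://127.0.0.1:${port}${rest}"
```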
Core Endpoints
| Endpoint | Method | Description |
|----------|--------|-------------|
| `/health` | GET | Health check |
| `/chat` | POST | Send chat message to AI agent |
| `/agent/progress/{session_id}` | GET (SSE) | Real-time progress stream |
| `/agent/session/cancel` | POST | Cancel an active task |
| `/agent/stop` | POST | Stop the Agent |
| `/agent/status/{project_id}` | GET | Query Agent status |
| `/api/docs` | GET | Swagger UI API documentation |
gRPC Services (Agent Runner)
| Method | Type | Description |
|--------|------|-------------|
| `Chat` | Unary | Send chat request |
| `SubscribeProgress` | Server Streaming | Subscribe to progress event stream |
| `CancelSession` | Unary | Cancel a session task |
| `GetStatus` | Unary | Query Agent status |
| `StopAgent` | Unary | Stop the Agent |
| `GetContainerStatus` | Unary | Query container status |
| `GetVncStatus` | Unary | Query VNC service status |
Computer Agent Endpoints
Computer Agent provides a containerized AI agent environment with VNC remote desktop, audio streaming, and IME input support. Each user gets an isolated Docker container, and multiple projects can share the same container.
Core Interfaces
| Endpoint | Method | Description |
|----------|--------|-------------|
| `/computer/chat` | POST | Send chat message to Computer Agent |
| `/computer/progress/{session_id}` | GET (SSE) | Real-time progress stream |
| `/computer/agent/stop` | POST | Stop Agent for a specific project (container stays alive) |
| `/computer/agent/status` | POST | Query Agent status (alive/idle/busy) |
| `/computer/agent/session/cancel` | POST | Cancel an active session |
Desktop & Media Proxy (via Pingora)
| Endpoint | Method | Description |
|----------|--------|-------------|
| `/computer/desktop/{user_id}/{project_id}` | GET | Get VNC desktop access URLs |
| `/computer/vnc/{user_id}/{project_id}/{*path}` | GET | VNC/noVNC proxy (port 6080) |
| `/computer/audio/{user_id}/{project_id}/{*path}` | GET | Audio stream proxy (ports 6089/6090) |
| `/computer/ime/{user_id}/{project_id}/{*path}` | GET | IME input method proxy (port 6091) |
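The per-service container ports can be captured in a small lookup; the mapping below simply restates the ports from the table (VNC 6080, audio 6089, IME 6091) and is purely illustrative, since RCoder performs this routing internally.

```shell
# Map a Computer Agent media service to its container port, as listed in
# the table above. Illustrative only.
service_port() {
  case "$1" in
    vnc)   echo 6080 ;;
    audio) echo 6089 ;;  # primary audio port; 6090 is also listed
    ime)   echo 6091 ;;
    *)     return 1 ;;
  esac
}

service_port vnc   # prints 6080
```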
Pod/Container Management
| Endpoint | Method | Description |
|----------|--------|-------------|
| `/computer/pod/count` | GET | Container count statistics (grouped by service type) |
| `/computer/pod/list` | GET | List all container details (pagination: `?limit=100`) |
| `/computer/pod/ensure` | POST | Ensure the container exists (idempotent; does not start the Agent) |
Chat

```shell
curl -X POST http://localhost:8087/chat \
  -H "Content-Type: application/json" \
  -d '{ "prompt": "Help me create a Rust Web API project", "project_id": "my-project", "session_id": "optional-session-id" }'
```
Computer Agent Chat
```shell
curl -X POST http://localhost:8087/computer/chat \
  -H "Content-Type: application/json" \
  -d '{ "user_id": "user-123", "project_id": "my-project", "prompt": "Help me create a Python web application" }'
```
Real-time Progress Stream
```shell
curl -X GET http://localhost:8087/agent/progress/your-session-id \
  -H "Accept: text/event-stream"
```
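On the client side, the JSON payloads can be peeled out of the SSE stream by filtering the `data:` lines. The two events below are simulated placeholders (the real event schema is not documented here); in practice, pipe curl's output instead of the `printf`.

```shell
# Extract the JSON payloads from SSE "data:" lines. The events are
# simulated placeholders; in practice, pipe curl's output instead.
payloads=$(printf 'data: {"event":"progress"}\n\ndata: {"event":"done"}\n\n' \
  | grep '^data: ' | sed 's/^data: //')
echo "$payloads"
```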
```shell
# Start development server
RUST_LOG=debug cargo run --bin rcoder -- --port 8087

# Watch for file changes
cargo install cargo-watch
cargo watch -x "run --bin rcoder"
```
Deployment
Docker
```shell
# Build images
make docker-build

# Or build separately
make docker-build-master            # Main service image
make docker-build-agent-runner      # Agent Runner image

# Production image (no debug tools)
make docker-build-agent-production
```
Docker Compose
```shell
# Start services
make dev-up

# Check status
docker-compose -f docker/docker-compose.yml ps

# Stop services
make dev-down
```
Pyroscope Profiling
```shell
# Start Pyroscope Server
make pyroscope-up

# Open Web UI
open http://localhost:4040

# Stop service
make pyroscope-down
```
Troubleshooting
Common Issues
- Port already in use - use `--port` to specify a different port
- Container startup failure - check Docker service status and network configuration
- gRPC connection failure - verify container network and port configuration
- API key error - check the `api_key_auth` configuration
Debug Mode
```shell
# Enable verbose logging
RUST_LOG=debug cargo run --bin rcoder

# View container logs
make dev-logs

# Enter container for debugging
docker exec -it <container_id> /bin/bash
```