Getting Started with VEX

Get up and running with VEX in under 5 minutes.

Prerequisites

  • Rust 1.75+ (stable toolchain)
  • Git

Installation

Add to Your Project

Code
# Cargo.toml
[dependencies]
vex-core = { git = "https://github.com/provnai/vex" }
vex-adversarial = { git = "https://github.com/provnai/vex" }
vex-llm = { git = "https://github.com/provnai/vex" }

Clone and Build

Code
git clone https://github.com/provnai/vex.git
cd vex
cargo build --workspace --release

Project Structure

Understanding the workspace layout is crucial for effective development:

Code
vex/
├── crates/
│   ├── vex-core/        # Core types (Agent, Genome, Merkle)
│   ├── vex-adversarial/ # Debate engine & consensus
│   ├── vex-temporal/    # Memory horizons & decay
│   ├── vex-llm/         # LLM provider traits & impls
│   ├── vex-api/         # Axum web server
│   ├── vex-persist/     # SQLite storage & migrations
│   └── ...
├── examples/            # Standalone examples
├── Cargo.toml           # Workspace configuration
└── .env.example         # Template for environment variables
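
The root Cargo.toml declares all of these crates as members of a single workspace, so one cargo build --workspace compiles everything. A hypothetical sketch of what it contains (member names taken from the tree above; exact contents may differ):

```toml
# Cargo.toml (workspace root) - illustrative sketch, not the actual file
[workspace]
resolver = "2"
members = [
    "crates/vex-core",
    "crates/vex-adversarial",
    "crates/vex-temporal",
    "crates/vex-llm",
    "crates/vex-api",
    "crates/vex-persist",
]

[workspace.dependencies]
# Shared dependency versions would be pinned here so every crate
# agrees on them, e.g.:
# tokio = { version = "1", features = ["full"] }
```

Individual crates then inherit shared versions with `tokio = { workspace = true }`, keeping the workspace on a single dependency tree.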

Quick Example

Code
use vex_core::{Agent, AgentConfig};
use vex_llm::MockProvider;

#[tokio::main]
async fn main() {
    // Configure an agent (the leading underscore silences the
    // unused-variable warning in this minimal snippet)
    let _agent = Agent::new(AgentConfig {
        name: "Researcher".to_string(),
        role: "You are a helpful research assistant".to_string(),
        max_depth: 3,
        spawn_shadow: true,
    });

    // Use with an LLM provider
    let llm = MockProvider::smart();
    let response = llm.ask("What is quantum computing?").await.unwrap();
    
    println!("Response: {}", response);
}

Running the Demo

Code
# Set up an API key (optional; falls back to the mock provider if unset)
export DEEPSEEK_API_KEY="sk-..."

# Run the research agent demo
cargo run -p vex-demo

# Run fraud detection demo
cargo run -p vex-demo --bin fraud-detector

# Interactive chat
cargo run -p vex-demo --bin interactive

Running the API Server

Code
export VEX_JWT_SECRET="your-32-char-secret-here"
cargo run -p vex-api
# Server starts on 0.0.0.0:3000
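
The JWT secret must be at least 32 characters. Rather than hand-typing one, you can generate a random value (assuming `openssl` is installed):

```shell
# Generate a 32-byte random secret (64 hex characters)
VEX_JWT_SECRET="$(openssl rand -hex 32)"
export VEX_JWT_SECRET
echo "${#VEX_JWT_SECRET}"  # prints 64
```

Store the generated value in your .env file so it survives across shells; rotating it invalidates any previously issued tokens.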

Configuration & Environment

VEX uses dotenv to load configuration. Create a .env file in your project root:

Code
# LLM Providers (configure at least one for real model access; the demos fall back to the mock provider)
DEEPSEEK_API_KEY="sk-..."       # For DeepSeek (default, recommended)
OPENAI_API_KEY="sk-..."         # For OpenAI
MISTRAL_API_KEY="sk-..."        # For Mistral
ANTHROPIC_API_KEY="sk-..."      # For Anthropic/Claude
OLLAMA_URL="http://localhost:11434" # For local LLMs

# Agent Configuration (Optional defaults)
VEX_DEFAULT_PROVIDER="deepseek"  # deepseek, openai, mistral, anthropic, ollama
VEX_DEFAULT_MODEL="deepseek-chat"
VEX_MAX_DEPTH="5"                # Max recursion depth for fractal agents
VEX_ADVERSARIAL="true"           # Enable/disable ShadowAgent verification
VEX_DEBUG="false"                # Enable verbose agent thought process logs

# Security & API
VEX_JWT_SECRET="your-super-secret-32-char-key-must-be-long"
VEX_ENV="development"            # development | production

# Database & Persistence
DATABASE_URL="sqlite://vex.db"   # Main storage
REDIS_URL="redis://localhost:6379" # Optional: For distributed job queue

# Rate Limiting (vex-api)
RATE_LIMIT_REQUESTS=100
RATE_LIMIT_WINDOW=60             # Window size in seconds

# Telemetry & Observability
RUST_LOG="info,vex_core=debug"   # Log levels
OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4317" # OTLP collector endpoint (e.g. Jaeger)
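
Under the hood, dotenv-style loading just parses KEY="value" lines and exports them into the process environment. A minimal std-only sketch of that parsing (it skips the inline comments, escapes, and variable expansion that a real dotenv crate handles):

```rust
use std::collections::HashMap;

/// Parse .env-style contents into a key/value map.
/// Blank lines and full-line # comments are skipped;
/// surrounding double quotes are stripped from values.
fn parse_env(contents: &str) -> HashMap<String, String> {
    let mut vars = HashMap::new();
    for line in contents.lines() {
        let line = line.trim();
        if line.is_empty() || line.starts_with('#') {
            continue;
        }
        if let Some((key, value)) = line.split_once('=') {
            vars.insert(
                key.trim().to_string(),
                value.trim().trim_matches('"').to_string(),
            );
        }
    }
    vars
}

fn main() {
    let contents = "# comment line\nDATABASE_URL=\"sqlite://vex.db\"\nVEX_MAX_DEPTH=\"5\"\n";
    let vars = parse_env(contents);
    assert_eq!(vars["DATABASE_URL"], "sqlite://vex.db");
    assert_eq!(vars["VEX_MAX_DEPTH"], "5");
}
```

In the real flow, each parsed pair would be passed to `std::env::set_var`, after which the application reads it back with `std::env::var("DATABASE_URL")`.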

Step-by-Step Setup

  1. Install Rust:

    Code
    curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
    
  2. Clone Repository:

    Code
    git clone https://github.com/provnai/vex.git
    cd vex
    
  3. Install sqlx-cli: VEX uses it for database migrations.

    Code
    cargo install sqlx-cli
    
  4. Initialize Database:

    Code
    # Create .env file first
    echo 'DATABASE_URL="sqlite://vex.db"' > .env
    sqlx database create
    sqlx migrate run --source crates/vex-persist/migrations
    
  5. Run Tests: Ensure everything is working correctly.

    Code
    cargo test --workspace
    

Next Steps