Installation

This guide covers the installation of vLLM Judge and its prerequisites.

Prerequisites

Python Version

vLLM Judge requires Python 3.8 or higher. You can check your Python version:

python --version
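
If you want the same check from code, for example in a setup script, here is a minimal sketch using only the standard library:

import sys

# Abort early if the interpreter is older than the minimum supported version.
if sys.version_info < (3, 8):
    raise RuntimeError(f"vLLM Judge requires Python 3.8+, found {sys.version.split()[0]}")
print("Python version OK:", sys.version.split()[0])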

vLLM Server

You need access to a vLLM server running your preferred model. If you don’t have one:

# Install vLLM
pip install vllm

# Start a model server
python -m vllm.entrypoints.openai.api_server \
    --model meta-llama/Meta-Llama-3-8B-Instruct \
    --port 8000
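
Once the server is up, you can confirm it is serving your model via its OpenAI-compatible /v1/models endpoint. A quick check with httpx (installed alongside vLLM Judge), assuming the server runs on localhost:8000 as above:

import httpx

# vLLM's OpenAI-compatible server lists the models it serves at /v1/models.
response = httpx.get("http://localhost:8000/v1/models", timeout=10.0)
response.raise_for_status()
for model in response.json().get("data", []):
    print("Serving:", model["id"])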

Installing vLLM Judge

Basic Installation

Install the core library with pip:

pip install vllm-judge

This installs the essential dependencies:

  • httpx - Async HTTP client

  • pydantic - Data validation

  • tenacity - Retry logic

  • click - CLI interface
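
For a rough sense of how two of these fit together, the sketch below wraps an httpx call in a tenacity retry decorator. This illustrates the libraries themselves, not vLLM Judge’s internals:

import httpx
from tenacity import retry, stop_after_attempt, wait_exponential

# Retry a flaky HTTP call up to 3 times with exponential backoff.
@retry(stop=stop_after_attempt(3), wait=wait_exponential(min=1, max=10))
def check_server(url: str) -> int:
    response = httpx.get(url, timeout=5.0)
    response.raise_for_status()
    return response.status_code

print(check_server("http://localhost:8000/health"))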

Optional Features

API Server

To run vLLM Judge as an API server:

pip install "vllm-judge[api]"

This adds:

  • fastapi - Web framework

  • uvicorn - ASGI server

  • websockets - WebSocket support

Jinja2 Templates

For advanced template support:

pip install "vllm-judge[jinja2]"

This enables Jinja2 template engine for complex template logic.
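
For a taste of what this enables, the snippet below uses the jinja2 library directly (it is not vLLM Judge’s own template API): templates can carry loops and conditionals that plain format strings cannot express.

from jinja2 import Template

# A template with a loop and a conditional.
template = Template(
    "Evaluate the following {{ kind }}:\n"
    "{% for criterion in criteria %}- {{ criterion }}\n{% endfor %}"
    "{% if strict %}Apply the criteria strictly.{% endif %}"
)
print(template.render(kind="answer", criteria=["accuracy", "clarity"], strict=True))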

Everything

Install all optional features:

pip install "vllm-judge[dev]"

Installation from Source

To install the latest development version:

# Clone the repository
git clone https://github.com/trustyai-explainability/vllm_judge.git
cd vllm_judge

# Install in development mode
pip install -e .

# With all extras
pip install -e ".[dev]"

Verifying Installation

Basic Check

# In Python
from vllm_judge import Judge
print("vLLM Judge installed successfully!")

CLI Check

# Check CLI installation
vllm-judge --help

Version Check

import vllm_judge
print(f"vLLM Judge version: {vllm_judge.__version__}")

Environment Setup

It’s recommended to use a virtual environment:

# Create virtual environment
python -m venv vllm-judge-env

# Activate it
# On Linux/Mac:
source vllm-judge-env/bin/activate
# On Windows:
vllm-judge-env\Scripts\activate

# Install vLLM Judge
pip install vllm-judge

Troubleshooting

Common Issues

ImportError: No module named 'vllm_judge'

Make sure you’ve activated your virtual environment and installed the package:

pip install vllm-judge
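
If the install succeeded but the import still fails, the package was likely installed into a different interpreter than the one you are running. A quick diagnostic:

import sys

print("Interpreter:", sys.executable)
try:
    import vllm_judge
    print("Found vllm_judge at:", vllm_judge.__file__)
except ImportError:
    print("vllm_judge is not visible to this interpreter")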

Connection errors to vLLM server

Verify your vLLM server is running and accessible:

curl http://localhost:8000/health
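
The same check from Python with httpx, assuming the default localhost:8000 address:

import httpx

# A healthy vLLM server answers /health with HTTP 200.
try:
    response = httpx.get("http://localhost:8000/health", timeout=5.0)
    print("Server status:", response.status_code)
except httpx.ConnectError:
    print("Could not reach the server -- is it running on port 8000?")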

Permission errors during installation

Try installing with user permissions:

pip install --user vllm-judge

Getting Help

If you encounter issues, please report them on GitHub Issues: https://github.com/trustyai-explainability/vllm_judge/issues

🎉 Next Steps

Congratulations! You’ve successfully installed vLLM Judge. You’re now ready to connect to your vLLM server and run your first evaluation.