Quick Start

Get SourceMeld up and running in minutes with Docker.

# Pull the SourceMeld Docker image
docker pull sourcemeld/sourcemeld:latest

# Run SourceMeld
docker run -d \
  -p 8080:8080 \
  -v sourcemeld-data:/data \
  --name sourcemeld \
  sourcemeld/sourcemeld:latest

Once the container starts, SourceMeld is available at http://localhost:8080.
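To confirm the container came up cleanly, you can check its status and follow its startup logs with standard Docker commands:

```shell
# Confirm the container is running
docker ps --filter name=sourcemeld

# Follow the startup logs (Ctrl+C to stop following)
docker logs -f sourcemeld
```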

Installation

Prerequisites

  • Docker 20.10 or higher
  • 4GB RAM minimum (8GB recommended)
  • 10GB disk space
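You can verify the Docker prerequisite from a terminal:

```shell
# Print the Docker engine version; it should report 20.10 or higher
docker version --format '{{.Server.Version}}'
```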

Docker Compose

For production deployments, use Docker Compose:

services:
  sourcemeld:
    image: sourcemeld/sourcemeld:latest
    ports:
      - "8080:8080"
    volumes:
      - sourcemeld-data:/data
      - ./config:/config
    environment:
      - SOURCEMELD_LLM_PROVIDER=openai
      - SOURCEMELD_API_KEY=your-api-key
    restart: unless-stopped

volumes:
  sourcemeld-data:
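With the file above saved as docker-compose.yml, bring the stack up in the background and follow its logs:

```shell
# Start the stack detached
docker compose up -d

# Follow the service logs
docker compose logs -f sourcemeld
```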

Configuration

Configure SourceMeld using environment variables or a configuration file.

Environment Variables

  • SOURCEMELD_LLM_PROVIDER - LLM provider (openai, anthropic, google)
  • SOURCEMELD_API_KEY - Your LLM API key
  • SOURCEMELD_PORT - Server port (default: 8080)
  • SOURCEMELD_DATA_DIR - Data directory path
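The same variables can be passed directly to docker run with -e flags; the values below are placeholders:

```shell
# Configure SourceMeld at container start via environment variables
docker run -d \
  -p 8080:8080 \
  -e SOURCEMELD_LLM_PROVIDER=openai \
  -e SOURCEMELD_API_KEY=your-api-key \
  -e SOURCEMELD_PORT=8080 \
  -e SOURCEMELD_DATA_DIR=/data \
  -v sourcemeld-data:/data \
  --name sourcemeld \
  sourcemeld/sourcemeld:latest
```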

Code Migration

Migrate code between languages with AI-powered assistance.

Supported Languages

  • Python
  • JavaScript
  • TypeScript
  • Java
  • Go
  • Rust
  • C#
  • Ruby

Repository Integrations

Connect your code repositories from various platforms.

GitHub Integration

  1. Navigate to Settings → Integrations
  2. Click "Add GitHub Integration"
  3. Authorize SourceMeld with your GitHub account
  4. Select repositories to index

GitLab, Bitbucket, and Gerrit

GitLab, Bitbucket, and Gerrit follow the same pattern: add the integration under Settings → Integrations, authorize SourceMeld with your account, and select the repositories to index.

API Reference

SourceMeld provides a REST API for programmatic access.

Authentication

Authenticate API requests by sending your API token in the Authorization header:

curl -H "Authorization: Bearer YOUR_API_TOKEN" \
  https://your-instance.com/api/v1/search
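As a sketch, a search call might look like the following. Only the /api/v1/search path appears in this documentation; the q query parameter name is an assumption, not documented API behavior:

```shell
# Hypothetical search request; "q" is an assumed parameter name,
# so adjust it to match your instance's actual API.
curl -s \
  -H "Authorization: Bearer YOUR_API_TOKEN" \
  "https://your-instance.com/api/v1/search?q=parse_config"
```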

Security

SourceMeld is designed with security as a top priority.

  • All data stays within your infrastructure
  • End-to-end encryption for data in transit
  • Role-based access control (RBAC)
  • Audit logging for compliance

Scaling

Scale SourceMeld horizontally for large deployments.

  • Deploy multiple instances behind a load balancer
  • Use shared storage for data persistence
  • Configure Redis for distributed caching
  • Monitor with Prometheus and Grafana
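The points above could be sketched as a Compose file for a small scale-out. This is only an illustration: the replica count, the Redis wiring, and the SOURCEMELD_REDIS_URL variable are assumptions, not documented settings, and the load balancer would still need its own routing configuration.

```yaml
# Hypothetical scale-out sketch: several SourceMeld replicas sharing
# one volume, a Redis cache, and an nginx load balancer in front.
services:
  sourcemeld:
    image: sourcemeld/sourcemeld:latest
    deploy:
      replicas: 3                    # multiple instances
    volumes:
      - sourcemeld-data:/data        # shared storage for persistence
    environment:
      - SOURCEMELD_REDIS_URL=redis://redis:6379  # assumed variable name

  redis:
    image: redis:7                   # distributed caching

  lb:
    image: nginx:stable              # load balancer; needs its own nginx.conf
    ports:
      - "80:80"

volumes:
  sourcemeld-data:
```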