Enterprise-grade monorepo for analyzing GitHub repositories using Large Language Models
A comprehensive AI-powered analysis platform that evaluates GitHub repositories across multiple dimensions including code quality, architecture, security, and project health using Google Gemini and other LLMs.
This project is organized as a modern monorepo with four main packages:
ai-project-analyzer/
├── packages/
│   ├── core/         # Business logic & AI analysis
│   ├── cli/          # Command-line interface
│   ├── api/          # REST API server
│   └── frontend/     # React web application
├── config/           # Shared configuration
├── docker/           # Container configurations
├── scripts/          # Build & development scripts
└── docs/             # Documentation
Package | Description | Technology | Port/Usage |
---|---|---|---|
Core | AI analysis engine, GitHub integration, metrics | Python 3.11+, LangChain, Gemini | Library |
CLI | Command-line tool for repository analysis | Python, Typer, Rich | `analyze` command |
API | REST API with authentication, job queues | FastAPI, PostgreSQL, Redis | Port 8000 |
Frontend | Web interface with reports dashboard | React, TypeScript, Tailwind | Port 3000 |
- Python 3.11+ (recommend using uv)
- Node.js 18+ (for frontend)
- Docker & Docker Compose (for full stack)
# Example: batch analysis from a CSV input file
uv run packages/cli/src/main.py --input-file "./pos.csv" --output "./celo-reports-pos"
git clone <repository-url>
cd ai-project-analyzer
# Install Python dependencies (workspace-level)
uv sync --dev
# Install frontend dependencies
cd packages/frontend && npm install
# Copy environment template
cp config/env.template .env
# Edit .env with your configuration:
# GOOGLE_API_KEY=your_gemini_api_key_here
# DATABASE_URL=postgresql://localhost/ai_analyzer
# REDIS_URL=redis://localhost:6379/0
# Analyze a single repository
uv run python packages/cli/src/main.py analyze https://github.com/user/repo
# Analyze multiple repositories
uv run python packages/cli/src/main.py batch-analyze repos.txt
# Use different models
uv run python packages/cli/src/main.py analyze --model gemini-2.5-pro-preview-03-25 <repo-url>
# Start all services with Docker
docker-compose up -d
# Or run services individually:
# 1. Start API server (Terminal 1)
cd packages/api && uvicorn app.main:app --reload
# 2. Start frontend (Terminal 2)
cd packages/frontend && npm run dev
# 3. Start background workers (Terminal 3)
cd packages/api && python -m app.worker
This monorepo includes complete VS Code configuration:
- IntelliSense for all packages
- Debug configurations for CLI, API, and tests
- Task runners for linting, testing, building
- Recommended extensions for Python, React, Docker
Just open the workspace in VS Code and install recommended extensions!
# Install all dependencies
uv sync --dev
# Run tests across all packages
pytest packages/ -v
# Lint and format Python code
ruff check packages/
ruff format packages/
# Build frontend
cd packages/frontend && npm run build
# Run with Docker (recommended for API development)
docker-compose up -d postgres redis # Start dependencies
cd packages/api && uvicorn app.main:app --reload
# Basic analysis
uv run python packages/cli/src/main.py analyze https://github.com/microsoft/vscode
# Celo-specific analysis
uv run python packages/cli/src/main.py analyze --prompt celo https://github.com/celo-org/celo-monorepo
# Batch analysis with custom output
uv run python packages/cli/src/main.py batch-analyze \
--input repos.txt \
--output ./reports \
--model gemini-2.5-flash
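The batch command above reads a plain list of repository URLs. A minimal sketch of a parser for such a file (the exact format `batch-analyze` accepts is an assumption; blank lines and `#` comments are skipped here):

```python
def parse_repo_list(text: str) -> list[str]:
    """Return repository URLs from a repos.txt-style listing.

    Skips blank lines and '#' comments. This mirrors a common convention;
    adjust to match the CLI's actual input parser.
    """
    urls = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        urls.append(line)
    return urls
```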
# Start analysis job
curl -X POST "http://localhost:8000/api/v1/analysis" \
-H "Content-Type: application/json" \
-d '{"github_url": "https://github.com/user/repo", "analysis_type": "deep"}'
# Get report
curl "http://localhost:8000/api/v1/reports/{report_id}"
# List user reports
curl "http://localhost:8000/api/v1/reports" \
-H "Authorization: Bearer <your_token>"
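The same calls can be scripted from Python. A minimal sketch that builds the request the first curl example sends; `API_BASE` and the payload shape simply mirror the examples above, and actually sending the request (e.g. with `urllib.request`) is left to the caller:

```python
import json

API_BASE = "http://localhost:8000/api/v1"  # assumption: local dev server


def build_analysis_request(github_url: str, analysis_type: str = "deep"):
    """Build the (url, body, headers) triple for POST /analysis.

    Mirrors the curl example above; pass the result to your HTTP client
    of choice.
    """
    url = f"{API_BASE}/analysis"
    body = json.dumps({"github_url": github_url, "analysis_type": analysis_type}).encode()
    headers = {"Content-Type": "application/json"}
    return url, body, headers
```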
from core.src.analyzer import analyze_single_repository
from core.src.fetcher import fetch_single_repository
# Fetch repository content
repo_name, digest = fetch_single_repository("https://github.com/user/repo")
# Analyze with AI
analysis = analyze_single_repository(
repo_name=repo_name,
code_digest=digest,
prompt_path="config/prompts/default.txt",
model_name="gemini-2.5-flash"
)
print(analysis)
Variable | Description | Default | Required |
---|---|---|---|
`GOOGLE_API_KEY` | Gemini API key | - | Yes |
`DATABASE_URL` | PostgreSQL connection string | `postgresql://localhost/ai_analyzer` | API only |
`REDIS_URL` | Redis connection string | `redis://localhost:6379/0` | API only |
`LOG_LEVEL` | Logging level | `INFO` | No |
- `config/prompts/default.txt` - General repository analysis
- `config/prompts/celo.txt` - Celo blockchain project analysis
- `config/prompts/report-template.txt` - Report formatting template
Model | Best For | Speed | Quality |
---|---|---|---|
`gemini-2.5-flash` | General analysis | Fast | ★★★ |
`gemini-2.5-pro-preview-03-25` | Deep analysis | Slow | ★★★★★ |
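A hypothetical helper mapping the API's `analysis_type` to this table (the project may pick models differently):

```python
# Hypothetical mapping from analysis type to the models in the table
# above; names come from the CLI examples in this README.
MODEL_BY_TYPE = {
    "fast": "gemini-2.5-flash",
    "deep": "gemini-2.5-pro-preview-03-25",
}


def pick_model(analysis_type: str) -> str:
    """Default to the fast model for unknown analysis types."""
    return MODEL_BY_TYPE.get(analysis_type, "gemini-2.5-flash")
```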
# Run all tests
pytest packages/ -v
# Test specific package
pytest packages/core/ -v
# Test with coverage
pytest packages/ --cov=packages --cov-report=html
# Integration tests (requires Docker)
docker-compose up -d postgres redis
pytest packages/api/tests/integration/ -v
The production setup includes comprehensive monitoring:
- Prometheus metrics collection
- Grafana dashboards and alerting
- Loki log aggregation with Promtail
- Health checks for all services
# Start monitoring stack
docker-compose -f docker-compose.prod.yml up -d
# Access dashboards
open http://localhost:3000 # Grafana (admin/admin)
open http://localhost:9090 # Prometheus
# Build production images
docker-compose -f docker-compose.prod.yml build
# Deploy with monitoring
docker-compose -f docker-compose.prod.yml up -d
# Check deployment health
curl http://localhost/health
See docs/DEPLOYMENT.md for detailed cloud deployment guides:
- AWS ECS/EKS deployment
- Google Cloud Run deployment
- Azure Container Instances
- Contributing Guide - Development workflow, coding standards
- Deployment Guide - Production deployment instructions
- API Documentation - Interactive OpenAPI docs (when API is running)
- CHANGELOG - Version history and changes
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Make your changes with tests
- Run quality checks: `ruff check packages/ && pytest packages/`
- Commit using conventional commits (`git commit -m 'feat: add amazing feature'`)
- Push and create a Pull Request
See CONTRIBUTING.md for detailed guidelines.
This project is licensed under the MIT License - see the LICENSE file for details.
- Google Gemini for advanced language model capabilities
- LangChain for LLM integration framework
- FastAPI for high-performance API development
- React & shadcn/ui for modern frontend experience
Built with ❤️ for the developer community