This repo contains code demos that were discussed in "Open WebUI + LangGraph AI Agents with Python Executor and Postgres Database-A Walkthrough".
Disclaimer: This repository is just a demonstration of what is possible with LangGraph agents and Open WebUI integration. It will not be actively maintained.
This repo will teach you how to:
- Create an AI agent (agentic AI) in LangGraph that can use a Python executor and a Postgres database
- Integrate the LangGraph agents with Open WebUI via Pipelines
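For orientation, below is a minimal sketch of what such an agent can look like. This is an illustrative example only, not the code in this repo: the tool bodies, model name, and connection string are placeholders.

```python
# Minimal sketch (assumptions, not the repo's implementation): a LangGraph
# ReAct agent with a Python-executor tool and a Postgres query tool.
from langchain_core.tools import tool
from langchain_experimental.utilities import PythonREPL
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

python_repl = PythonREPL()

@tool
def run_python(code: str) -> str:
    """Execute Python code and return whatever it prints."""
    return python_repl.run(code)

@tool
def query_postgres(sql: str) -> str:
    """Run a SQL query against Postgres and return the rows as a string."""
    import psycopg  # assumes psycopg (v3) is installed
    # Placeholder connection string; in practice it would come from the .env file
    with psycopg.connect("postgresql://user:password@localhost:5432/mydb") as conn:
        return str(conn.execute(sql).fetchall())

# Placeholder model; the repo may use a different one
agent = create_react_agent(ChatOpenAI(model="gpt-4o-mini"), tools=[run_python, query_postgres])

result = agent.invoke({"messages": [("user", "How many rows are in the orders table?")]})
print(result["messages"][-1].content)
```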
Note that we use OpenAI API models for fast prototyping. You can use Ollama to serve a local LLM API instead; however, instructions for that are not provided in this repo.
- Create a conda environment with `python=3.12.9`, for example:
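The environment name below is just a placeholder, not required by the repo:

```bash
conda create -n langgraph-agents python=3.12.9
```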
- Install dependencies with poetry:

```bash
poetry install
```
- Set up your OpenAI API key via a `.env` file (which should be located in `langgraph_agents/config/`):

```
OPENAI_API_KEY=your_openai_api_key
```
- Set up the Postgres connection settings in the same `.env` file (in `langgraph_agents/config/`). This step is optional if you do not want to access a Postgres database. A sketch of how these variables can be loaded follows the block below.

```
POSTGRES_USER=your_postgres_user
POSTGRES_PASSWORD=your_postgres_password
POSTGRES_HOST=your_postgres_host
POSTGRES_PORT=your_postgres_port
POSTGRES_DB=your_postgres_db
```
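As a rough sketch (not the repo's actual code), these variables might be loaded with `python-dotenv` and assembled into a standard Postgres connection string; the `.env` path below is an assumption:

```python
import os

from dotenv import load_dotenv

# Load the variables from the .env file (path assumed for illustration)
load_dotenv("langgraph_agents/config/.env")

# Assemble a standard postgresql:// connection string from the variables
conn_str = (
    f"postgresql://{os.environ['POSTGRES_USER']}:{os.environ['POSTGRES_PASSWORD']}"
    f"@{os.environ['POSTGRES_HOST']}:{os.environ['POSTGRES_PORT']}"
    f"/{os.environ['POSTGRES_DB']}"
)
print(conn_str)
```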
- Activate your conda environment
- Simply run:

```bash
sh start.sh
```
- Open Open WebUI and make sure your pipeline is connected to it.
- Use the 'Data Analyst' agent as the model to chat with.
Feel free to copy and distribute, but we would appreciate you giving us credit.
👍 Like | 🔗 Share | 📢 Subscribe | 💬 Comments | ❓ Questions