Commit 5c26db0

Merge pull request #32 from severian42/dev
Refactor/Update

Here's a summary of the key changes in this release:

1. Major UI and functionality updates:
   - New interactive user interface with comprehensive features
   - Support for local models and OpenAI-compatible APIs
   - Real-time 2D/3D graph visualization
   - Enhanced file and settings management
   - Improved querying capabilities
2. Expanded compatibility:
   - LLM agnostic: support for Ollama and custom base URLs/models
   - Flexible embedding options
3. New features:
   - Custom configurable graph visuals
   - Preset query/indexing library options
   - Improved error handling and logging
4. Documentation updates:
   - Revised installation and setup instructions
   - Updated usage guide for new UI features
   - Expanded troubleshooting section
5. Backend improvements:
   - Numerous bug fixes and optimizations
   - Updated dependencies and requirements
6. Roadmap updates:
   - Added recently completed features
   - Outlined upcoming planned improvements
7. File changes:
   - Major updates to app.py and other core files
   - New cache and output files from testing
   - Updated settings.yaml and requirements.txt
8. Removed visualize-graphml.py, as its functionality is now integrated into the main app
2 parents 5f2e3ba + 45170bb commit 5c26db0

File tree: 603 files changed (+9705 / -448 lines)


.DS_Store (0 bytes, binary file not shown)

README.md: 131 additions & 42 deletions
````diff
@@ -1,92 +1,181 @@
-# 🕸️ GraphRAG Local with Ollama and Gradio UI
+# 🕸️ GraphRAG Local with Interactive UI
 
-Welcome to **GraphRAG Local with Ollama and Interactive UI**! This is an adaptation of Microsoft's [GraphRAG](https://github.com/microsoft/graphrag), tailored to support local models using Ollama and featuring a new interactive user interface.
-
-*NOTE: The app gained traction much quicker than I anticipated so I am working to get any found bugs fixed and suggested improvements integrated. Right now it is fully functional, tested only on my Mac Studio M2 though.
-
-The next update is a major refactor and overall improvement in all the areas so it has unfortunately taken a bit longer than expected. I am trying to be fluid with the adjustments and so will try to update as often as possible.
-
-Changes being made right now and tested:
-
-- LLM agnostic: Use Ollama or set your own base URL and local model for LLM and Embedder
-- Launch your own GraphRAG API server so you can use the functions in your own external app
-- Dockerfile for easier deployment
-- Experimental: Mixture of Agents for Indexing/Query of knowledge graph
-- Custom configurable graph visuals
-- Preset Query/Indexing library options to quickly and easily harness all the GraphRAG args
-- More file formats (CSV, PDF, etc)
-- Web search/Scraping
-
-Feel free to open an Issue if you run into an error and I will try to address it ASAP so you don't run into any downtime*
+Welcome to **GraphRAG Local with Interactive UI**! This is an adaptation of Microsoft's [GraphRAG](https://github.com/microsoft/graphrag), tailored to support local models and featuring a comprehensive interactive user interface.
 
 ## 📄 Research Paper
 
 For more details on the original GraphRAG implementation, please refer to the [GraphRAG paper](https://arxiv.org/pdf/2404.16130).
 
 ## 🌟 Features
 
-- **Local Model Support:** Leverage local models with Ollama for LLM and embeddings. (I highly recommend using the 'sciphi/triplex' Ollama model for indexing your data: 'ollama pull sciphi/triplex')
-- **Cost-Effective:** Eliminate dependency on costly OpenAI models.
+- **Local Model Support:** Leverage local models for LLM and embeddings, including compatibility with Ollama and OpenAI-compatible APIs.
+- **Cost-Effective:** Eliminate dependency on costly cloud-based models by using your own local models.
 - **Interactive UI:** User-friendly interface for managing data, running queries, and visualizing results.
-- **Real-time Graph Visualization:** Visualize your knowledge graph in 3D using Plotly.
+- **Real-time Graph Visualization:** Visualize your knowledge graph in 2D or 3D using Plotly.
 - **File Management:** Upload, view, edit, and delete input files directly from the UI.
 - **Settings Management:** Easily update and manage your GraphRAG settings through the UI.
 - **Output Exploration:** Browse and view indexing outputs and artifacts.
 - **Logging:** Real-time logging for better debugging and monitoring.
+- **Flexible Querying:** Support for global, local, and direct chat queries with customizable parameters.
 
 ![GraphRAG UI](ui.png)
 
+## 🗺️ Roadmap
+
+**Important Note:** GraphRAG Local UI is currently a major work in progress. As we strive to make the application more stable with local LLMs, users should expect to encounter some bugs. We appreciate your patience and feedback during this development phase.
+
+*The app gained traction much more quickly than anticipated, so we are actively working to fix bugs and integrate suggested improvements. While it is currently functional, it has been primarily tested on a Mac Studio M2.*
+
+My vision for GraphRAG Local UI is to become the ultimate GraphRAG app for local LLMs, incorporating as many cool features and knowledge graph tools as possible. I am continuously working on improvements and new features.
+
+### Recent Updates
+- [x] LLM agnostic: Use Ollama or set your own base URL and local model for LLM and Embedder
+- [x] Custom configurable graph visuals
+- [x] Preset Query/Indexing library options to quickly and easily harness all the GraphRAG args
+
+### Upcoming Features
+- [ ] Dockerfile for easier deployment
+- [ ] Launch your own GraphRAG API server for use in external applications
+- [ ] Experimental: Mixture of Agents for Indexing/Query of knowledge graph
+- [ ] Support for more file formats (CSV, PDF, etc.)
+- [ ] Web search/scraping capabilities
+- [ ] Enhanced error handling and user feedback
+- [ ] Improved performance and scalability
+- [ ] Advanced graph analysis tools
+- [ ] Integration with popular knowledge management tools
+- [ ] Collaborative features for team-based knowledge graph building
+
+I am committed to making GraphRAG Local UI the most comprehensive and user-friendly tool for working with knowledge graphs and local language models. Your feedback and suggestions are invaluable in shaping the future of this project.
+
+Feel free to open an Issue if you run into an error, and we will try to address it as soon as possible to minimize any downtime you might experience.
+
````
````diff
 ## 📦 Installation and Setup
 
-Follow these steps to set up and run GraphRAG Local with Ollama and Interactive UI:
+Follow these steps to set up and run GraphRAG Local with Interactive UI:
 
 1. **Create and activate a new conda environment:**
    ```bash
-   conda create -n graphrag-ollama -y
-   conda activate graphrag-ollama
+   conda create -n graphrag-local -y
+   conda activate graphrag-local
    ```
 
-2. **Install Ollama:**
-   Visit [Ollama's website](https://ollama.com/) for installation instructions.
-
-4. **Install the required packages:**
+2. **Install the required packages:**
    ```bash
   pip install -r requirements.txt
    ```
 
-4. **Launch the interactive UI:**
+3. **Launch the interactive UI:**
    ```bash
    gradio app.py
    ```
+
    or
 
    ```bash
    python app.py
    ```
 
-6. **Using the UI:**
-   - Once the UI is launched, you can perform all necessary operations through the interface.
-   - This includes initializing the project, managing settings, uploading files, running indexing, and executing queries.
-   - The UI provides a user-friendly way to interact with GraphRAG without needing to run command-line operations.
+4. **Access the UI:**
+   Open your web browser and navigate to `http://localhost:7860` to access the GraphRAG Local UI.
+
````
````diff
+## 🖥️ Using the GraphRAG Local UI
+
+### Data Management
+
+1. **File Upload:**
+   - Navigate to the "Data Management" tab.
+   - Use the "File Upload" section to upload .txt files to the input directory.
+
+2. **File Management:**
+   - View, edit, and delete uploaded files in the "File Management" section.
+   - Use the "Refresh File List" button to update the list of available files.
+
+### Indexing
+
+1. **Configure Indexing:**
+   - Go to the "Indexing" tab.
+   - Set the root directory (default is "./ragtest").
+   - Optionally upload a config file.
+   - Adjust other parameters like verbosity, caching, and output formats.
 
-Note: The UI now handles all the operations that were previously done through command-line instructions, making the process more streamlined and user-friendly.
+2. **Run Indexing:**
+   - Click "Run Indexing" to start the indexing process.
+   - Monitor progress in real-time through the output box and progress bar.
+   - Use "Stop Indexing" if you need to halt the process.
+
+### KG Chat/Outputs
+
+1. **Explore Indexed Data:**
+   - Select an output folder from the dropdown.
+   - Browse through the folder contents and view file information and content.
+
+2. **Visualize Graph:**
+   - Select a GraphML file from the output folder.
+   - Click "Visualize Graph" to generate a 2D or 3D visualization of your knowledge graph.
+   - Customize the visualization using the "Visualization Settings" accordion.
+
+### LLM Settings
+
+1. **Configure LLM and Embeddings:**
+   - Set API base URLs and keys for both LLM and embeddings.
+   - Choose the service type (OpenAI-compatible or Ollama).
+   - Select models from the dropdown or refresh the list.
+
+2. **Adjust Parameters:**
+   - Set the system message, context window, temperature, and max tokens.
+   - Click "Update LLM Settings" to save your changes.
+
+### Querying
+
+1. **Choose Query Type:**
+   - Select between global, local, or direct chat queries.
+
+2. **Select Preset or Custom Options:**
+   - Choose a preset query option or customize your query parameters.
+
+3. **Enter Your Query:**
+   - Type your query in the input box and click "Send Query" or press Shift+Enter.
+
+4. **View Results:**
+   - See the chat history and responses in the chat interface.
+
+### Other Settings
+
+- Adjust additional GraphRAG settings in the "YAML Settings" tab as needed.
````
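The LLM settings described in the diff above (system message, temperature, max tokens, API base URL) map directly onto a standard OpenAI-style chat-completion request, which is what both OpenAI-compatible servers and Ollama's `/v1` endpoint accept. The sketch below is illustrative only; the helper name and default values are hypothetical and are not code from `app.py`:

```python
import json
from urllib import request

def build_chat_request(api_base, model, system_message, query,
                       temperature=0.5, max_tokens=1024):
    """Build an OpenAI-style chat completion request (illustrative only)."""
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": system_message},
            {"role": "user", "content": query},
        ],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }
    # Ollama serves the same request shape at http://localhost:11434/v1
    return request.Request(
        f"{api_base}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("http://localhost:11434/v1", "llama3",
                         "You are a helpful assistant.", "What is GraphRAG?")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) is left out, since it requires a running local server.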
````diff
 
 ## 🛠️ Customization
 
-Users can experiment by changing the models in the `settings.yaml` file. The LLM model expects language models like llama3, mistral, phi3, etc., and the embedding model section expects embedding models like mxbai-embed-large, nomic-embed-text, etc., which are provided by Ollama. You can find the complete list of models provided by Ollama [here](https://ollama.com/library).
+Users can experiment with different models and settings:
+
+- For OpenAI-compatible APIs: Use any model compatible with the OpenAI API format.
+- For Ollama: Use models like llama2, mistral, phi-2, etc. Find the complete list of Ollama models [here](https://ollama.com/library).
````
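For reference, the model swap described above happens in `settings.yaml`. The fragment below is a sketch assuming Ollama's OpenAI-compatible endpoint at its default port; the exact keys and layout depend on your GraphRAG version, so treat every value here as illustrative rather than authoritative:

```yaml
# Illustrative settings.yaml fragment -- adapt to your GraphRAG version
llm:
  type: openai_chat                       # OpenAI-compatible chat endpoint
  model: mistral                          # any model your local server exposes
  api_base: http://localhost:11434/v1     # Ollama's OpenAI-compatible URL
  api_key: ${GRAPHRAG_API_KEY}            # a dummy value works for local servers

embeddings:
  llm:
    type: openai_embedding
    model: nomic-embed-text               # embedding model served locally
    api_base: http://localhost:11434/v1
```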
````diff
 
 ## 📊 Visualization
 
-The UI now includes a 3D graph visualization feature. To use it:
+The UI includes a 2D/3D graph visualization feature:
+
+1. Run indexing on your data.
+2. Go to the "KG Chat/Outputs" tab.
+3. Select the latest output folder and navigate to the GraphML file.
+4. Click the "Visualize Graph" button.
+5. Customize the visualization using the settings provided.
````
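Conceptually, the "Visualize Graph" step amounts to loading the GraphML file and computing a layout whose coordinates Plotly can draw as a scatter trace. A minimal standalone sketch with networkx, which is not the app's actual code (the commented GraphML file name is hypothetical):

```python
import networkx as nx

# Tiny stand-in graph; in practice you would load the indexing output,
# e.g. G = nx.read_graphml("summarized_graph.graphml")
G = nx.karate_club_graph()

# Compute a 3D force-directed layout (the coordinates a Plotly
# Scatter3d trace would plot); seed makes the layout reproducible
pos = nx.spring_layout(G, dim=3, seed=42)

# Each node maps to an (x, y, z) coordinate
xs, ys, zs = zip(*(pos[n] for n in G.nodes))
print(len(xs))  # one coordinate per node
```

For a 2D view the same idea applies with `dim=2` and a flat scatter trace.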
````diff
+
+## 🚀 Advanced Usage
 
-1. Run indexing on your data
-2. Go to the "Indexing Outputs" tab
-3. Select the latest output folder and navigate to the GraphML file
-4. Click the "Visualize Graph" button
+- **Custom CLI Arguments:** In the query interface, advanced users can add custom CLI arguments for more granular control over the query process.
 
 ## 📚 Citations
 
 - Original GraphRAG repository by Microsoft: [GraphRAG](https://github.com/microsoft/graphrag)
-- Ollama: [Ollama](https://ollama.com/)
 
 ---
+
+## Troubleshooting
+
+- If you can't run `gradio app.py`, try running `pip install --upgrade gradio`, then open a new terminal session; the app should then load and launch properly as a Gradio app.
+
+- On Windows, if you run into an encoding/UTF error, you can set the correct file encoding in the YAML Settings menu.
+
+- Indexing errors: These are still tough to debug and track down, as they depend on your specific pipeline of LLMs and embedders. Right now the indexing workflow seems to call /v1/embeddings no matter what, but I think I found a workaround that allows Ollama and other local options. I'll keep working to reinforce the indexing process to make it more stable and robust.
+
+For any issues or feature requests, please open an issue on the GitHub repository. Happy knowledge graphing!
````
