A complete AI tools suite with a React interface and Flask backend for testing Large Language Models (LLMs) using llama.cpp.
- Features
- Architecture
- Installation
- Configuration
- Usage
- Available Tools
- Project Structure
- API
- Development
- Contributing
- License
## Features

- Total Privacy: Works completely offline, without sending data to third parties
- Modern Interface: Responsive React frontend with modern design
- Modular Architecture: Auto-discoverable tool system with dynamic registration
- Real Time: Bidirectional communication with WebSockets
- Multiple Models: Support for any GGUF model
- Integrated RAG: Retrieval Augmented Generation system included
- AI Tools: Complete set of tools for the assistant with automatic discovery
- Internet Search: Serper API integration
- Advanced Search: Specialized searches with Google dorks
- Video Search: YouTube API integration
- Cryptocurrency Prices: Real-time information
- Image Generation: AI image generation tool with Gradio API integration
- IP Information: Geographic data retrieval
## Architecture

```
IALab-Suite/
├── Frontend-React/   # Modern user interface
├── Backend-API/      # Flask API with WebSockets
├── tools/            # Modular tools
└── chats/            # Conversation storage
```
- Cortex: AI-powered tool processing engine
- Assistant: Main AI assistant logic
- RAG: Retrieval Augmented Generation system
- API React: Flask server serving the React frontend
- SocketIO: Real-time communication
- ToolRegistry: Automatic tool discovery and registration system
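The registry's internals are not shown in this README; a minimal sketch of how a `BaseTool`/`ToolRegistry` pair *could* implement automatic registration (the method names and the `EchoTool` example are illustrative assumptions, not the project's actual API):

```python
# Sketch: subclasses register themselves the moment they are defined,
# so importing a tool module is enough to make the tool available.
class ToolRegistry:
    _tools = {}

    @classmethod
    def register(cls, tool_cls):
        cls._tools[tool_cls.get_tool_name()] = tool_cls

    @classmethod
    def list_tools(cls):
        return sorted(cls._tools)


class BaseTool:
    def __init_subclass__(cls, **kwargs):
        # Runs for every subclass definition: no manual wiring needed
        super().__init_subclass__(**kwargs)
        ToolRegistry.register(cls)

    @classmethod
    def get_tool_name(cls):
        return cls.__name__.lower()


class EchoTool(BaseTool):  # hypothetical tool for demonstration
    def execute(self, query):
        return f"Echo: {query}"
```

With this pattern, `ToolRegistry.list_tools()` reflects every `BaseTool` subclass that has been imported, which matches the "drop a file in `tools/` and it just works" behavior described below.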
## Installation

### Prerequisites

- Python 3.8+
- Node.js 16+
- npm or yarn
- Git
```bash
git clone https://github.com/your-username/IALab-Suite.git
cd IALab-Suite

# Create a virtual environment
python -m venv venv

# Activate the virtual environment
# Windows
venv\Scripts\activate

# macOS/Linux
source venv/bin/activate
```
```bash
# Install Python dependencies
pip install -r requirements.txt
```

For CPU only:

```bash
pip install --upgrade --quiet llama-cpp-python
```

For GPU with CUDA:

```bash
CMAKE_ARGS="-DGGML_CUDA=ON" FORCE_CMAKE=1 pip install --upgrade --force-reinstall llama-cpp-python --no-cache-dir
```

For macOS with Metal (M1/M2/M3):
```bash
CMAKE_ARGS="-DLLAMA_METAL=on" FORCE_CMAKE=1 pip install --upgrade --force-reinstall llama-cpp-python --no-cache-dir
```

Build the React frontend:

```bash
cd Frontend-React
npm install
npm run build
cd ..
```

Create a `.env` file in `Backend-API/` and add your API keys:

```bash
cp Backend-API/.env.example Backend-API/.env
# Edit Backend-API/.env with your API keys
```
```
YOUTUBE_API_KEY=your_youtube_api_key
SERPER_API_KEY=your_serper_api_key
```

Download a model:

```bash
# Create the models directory
mkdir -p Backend-API/models/llama

# Download an example model (Llama 2 7B Chat, Q8_0 quantization)
wget -O Backend-API/models/llama/llama-2-7b-chat.Q8_0.gguf \
  "https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGUF/resolve/main/llama-2-7b-chat.Q8_0.gguf"
```
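After downloading, you can sanity-check that the file really is a GGUF model: GGUF files begin with the 4-byte ASCII magic `GGUF`. A small stdlib-only helper (not part of the project, just a convenience sketch):

```python
# Quick sanity check: valid GGUF files start with the ASCII magic "GGUF".
from pathlib import Path


def looks_like_gguf(path: str) -> bool:
    """Return True if the file exists and begins with the GGUF magic bytes."""
    p = Path(path)
    if not p.is_file():
        return False
    with p.open("rb") as f:
        return f.read(4) == b"GGUF"


# Example, using the path from the download step above:
# looks_like_gguf("Backend-API/models/llama/llama-2-7b-chat.Q8_0.gguf")
```

A truncated or HTML-error download (a common failure mode with `wget` and redirecting model hosts) will fail this check immediately, before you waste time loading it into llama.cpp.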
## Configuration

- YouTube API Key:
  1. Go to the Google Cloud Console
  2. Create a project and enable the YouTube Data API v3
  3. Create an API key
- Serper API Key:
  1. Sign up at Serper.dev
  2. Get your free API key

Edit `Backend-API/.env`:
```
YOUTUBE_API_KEY=your_youtube_api_key_here
SERPER_API_KEY=your_serper_api_key_here
```

## Usage

From the project root:

```bash
python build_frontend.py
```

Then start both servers:

```bash
# Terminal 1: Start the backend server
cd Backend-API
python start_server.py

# Terminal 2: Start the React development server
cd Frontend-React
npm start
```

Development Mode: open your browser and go to `http://localhost:3000` (React dev server).
- Select a model from the dropdown menu
- Type your question in the chat field
- Press Enter or click send
- Watch the tools execute in real-time
- Receive the complete response with enriched information
```
# Example of automatic usage
"What's the weather in Madrid today?"

# The assistant will search for videos automatically
"Show me videos about Python programming"

# Query real-time prices
"What's the current price of Bitcoin and Ethereum?"

# Generate images with AI
"Generate an image of a space cat"

# Get geographic information
"Where is the IP 8.8.8.8 from?"
```
```
# Specialized searches
"Search for technical information about vulnerabilities"
```

## Project Structure

```
IALab-Suite/
├── Backend-API/
│   ├── Cortex.py                 # Tool processing engine
│   ├── Assistant.py              # Main assistant logic
│   ├── Rag.py                    # RAG system
│   ├── Api_react.py              # Flask API for React
│   ├── SocketResponseHandler.py  # WebSocket handling
│   ├── start_server.py           # Main server
│   ├── tools/                    # Modular auto-discoverable tools
│   │   ├── base_tool.py          # Base tool interface
│   │   ├── tool_registry.py      # Auto-discovery registry
│   │   ├── search_tools.py       # Internet search
│   │   ├── video_search_tool.py  # Video search
│   │   ├── cripto_price.py       # Cryptocurrency prices
│   │   ├── generate_image.py     # Image generation
│   │   ├── ip_info_tool.py       # IP information
│   │   └── advanced_search.py    # Advanced search
│   ├── templates/                # HTML templates
│   ├── static/                   # Static files
│   ├── models/                   # LLM models
│   ├── documents/                # Documents for RAG
│   └── chats/                    # Conversation history
├── Frontend-React/
│   ├── src/
│   │   ├── App.js                # Main component
│   │   ├── components/           # React components
│   │   ├── context/              # State contexts
│   │   ├── hooks/                # Custom hooks
│   │   └── services/             # API services
│   ├── public/                   # Public files
│   ├── build/                    # Production build
│   └── package.json              # Node.js dependencies
├── README.md                     # This file
├── requirements.txt              # Python dependencies
└── LICENSE                       # MIT License
```
## API

### WebSocket Events

Client → server:

- `message`: Send message to assistant
- `disconnect`: Disconnect from server

Server → client:

- `response`: Assistant response (streaming)
- `console_output`: Console output (tools)
- `utilities_data`: Utilities data (videos, images)
- `finalization_signal`: Completion signal

### REST Endpoints

- `GET /api/system-info`
- `GET /api/models`
- `POST /api/load-model`
- `GET /api/chats`
- `POST /api/save-chat`
- `DELETE /api/delete-chat`

## Development

Start the React development server:

```bash
cd Frontend-React
npm start
```

This will start the React development server at `http://localhost:3000`.

Start the backend:

```bash
cd Backend-API
python start_server.py
```

This will start the Flask API server at `http://localhost:8081`.
For development, you'll need both servers running:
- Backend: `http://localhost:8081` (Flask API)
- Frontend: `http://localhost:3000` (React dev server)
The React development server will proxy API requests to the Flask backend automatically.
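For reference, this proxying is typically enabled by a `proxy` field in `Frontend-React/package.json` (a Create React App convention; the exact file contents here are an assumption, check the actual `package.json`):

```json
{
  "name": "frontend-react",
  "proxy": "http://localhost:8081"
}
```

With this in place, `fetch("/api/models")` from the React dev server at port 3000 is forwarded to the Flask backend at port 8081, avoiding CORS issues during development.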
The system uses automatic tool discovery and registration: all tools inherit from `BaseTool` and are discovered and registered automatically.
1. Create the tool file in `Backend-API/tools/my_new_tool.py`
2. Inherit from `BaseTool` and implement the required methods
3. The tool will be automatically discovered and registered
Every tool must:

- Inherit from `BaseTool`
- Implement the `metadata` property
- Implement the `get_tool_name()` class method
- Implement the `execute()` method
```python
# Backend-API/tools/my_new_tool.py
from .base_tool import BaseTool, ToolMetadata, ToolCategory


class MyNewTool(BaseTool):
    @property
    def metadata(self) -> ToolMetadata:
        return ToolMetadata(
            name="my_new_tool",
            description="Description of what my tool does",
            category=ToolCategory.UTILITY,  # SEARCH, FINANCE, IMAGE, UTILITY, MEDIA
            requires_api_key=False,         # Set to True if an API key is needed
            api_key_env_var="MY_API_KEY"    # Only if requires_api_key=True
        )

    @classmethod
    def get_tool_name(cls) -> str:
        return "my_new_tool"

    def execute(self, query: str, **kwargs):
        """
        Execute the tool with the given query.

        Args:
            query (str): Input query from the user
            **kwargs: Additional parameters

        Returns:
            Any: Tool result (string, dict, tuple, etc.)
        """
        # Your tool logic here
        result = f"Processing: {query}"

        # You can return different types:
        # - String: Simple text result
        # - Dict: Structured data
        # - Tuple: (result, additional_data) for special handling
        return result

    def is_available(self) -> bool:
        """
        Check if the tool is available (has required API keys, etc.)

        Returns:
            bool: True if the tool can be used
        """
        # The default implementation checks the API key if required.
        # Override if you need custom availability logic.
        return super().is_available()
```

Available tool categories:

```python
from .base_tool import ToolCategory

# Available categories:
ToolCategory.SEARCH   # Search and information retrieval
ToolCategory.FINANCE  # Financial data and cryptocurrency
ToolCategory.IMAGE    # Image generation and processing
ToolCategory.UTILITY  # General utilities (IP info, etc.)
ToolCategory.MEDIA    # Video, audio, media search
```

For tools that need special result handling:
```python
def execute(self, query: str, **kwargs):
    # For tools that return additional data (like video IDs)
    result_text = f"Found videos for: {query}"
    video_ids = ["abc123", "def456"]

    # Return a tuple for special handling in Cortex
    return (result_text, video_ids)
```

For tools with API key validation:
```python
def is_available(self) -> bool:
    """Custom availability check (assumes `import os` at module top)."""
    api_key = os.getenv(self.metadata.api_key_env_var)
    if not api_key:
        return False

    # Additional validation logic
    return self._validate_api_key(api_key)
```

Once you create your tool file:
- Automatic Registration: The `ToolRegistry` will automatically find and register your tool
- Immediate Availability: Your tool becomes available to the AI assistant
- No Manual Configuration: No need to modify `Cortex.py` or other files
```python
# Test your tool directly
from tools.my_new_tool import MyNewTool

tool = MyNewTool()
result = tool.execute("test query")
print(result)
```

The system provides runtime information about all tools:
```python
from tools.tool_registry import ToolRegistry

registry = ToolRegistry.get_instance()

# List all registered tools
print("Available tools:", registry.list_tools())

# Get tool information
tool_info = registry.get_tool_info("my_new_tool")
print("Tool info:", tool_info)

# Execute a tool through the registry
result = registry.execute_tool("my_new_tool", "test query")
print("Execution result:", result.data if result.success else result.error)
```

## Contributing

Contributions are welcome!
1. Fork the project
2. Create a branch for your feature: `git checkout -b feature/AmazingFeature`
3. Commit your changes: `git commit -m 'Add some AmazingFeature'`
4. Push to the branch: `git push origin feature/AmazingFeature`
5. Open a Pull Request
Open an issue describing:
- Bug description
- Steps to reproduce
- Environment (OS, Python version, etc.)
- Screenshots if relevant
Open an issue with:
- Detailed description
- Use cases
- Expected benefits
- macOS: MacBook Pro M3 Pro (11 CPU cores, 14 GPU cores, 18 GB RAM)
- Linux: AMD Ryzen 5600X, NVIDIA RTX 3060 (12 GB VRAM), 32 GB RAM
- Windows: Various configurations
- Python: 3.8 - 3.12
- Node.js: 16+
- React: 18.2+
- Flask: 2.0+
- llama.cpp Documentation
- GGUF Models on Hugging Face
- YouTube Data API
- Serper API
If this project has been useful to you, consider:

- Giving the repository a star
- Reporting bugs or suggesting improvements
- Contributing code
- Buying me a coffee
## License

This project is licensed under the MIT License - see the LICENSE file for more details.
Made with ❤️ for the AI community

⭐ If you like this project, give it a star!

