A pure Rust + WASM implementation for BERT inference with minimal dependencies
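MiniLM-style sentence-embedding models (the topic this page covers) typically derive one sentence vector from BERT's per-token outputs via attention-mask-weighted mean pooling. As a minimal, dependency-free sketch of that pooling step — the token vectors and mask below are illustrative stand-ins, not outputs of any particular model:

```python
# Sketch: masked mean pooling over per-token embeddings, the step
# MiniLM-L6-v2-style models use to produce a single sentence vector.
# Padding positions (mask == 0) are excluded from the average.

def mean_pool(token_embeddings, attention_mask):
    """Average token vectors, ignoring padded positions."""
    dim = len(token_embeddings[0])
    sums = [0.0] * dim
    count = 0
    for vec, m in zip(token_embeddings, attention_mask):
        if m:  # only real tokens contribute
            count += 1
            for i, v in enumerate(vec):
                sums[i] += v
    return [s / count for s in sums]

tokens = [[1.0, 2.0], [3.0, 4.0], [0.0, 0.0]]  # last row is padding
mask = [1, 1, 0]
print(mean_pool(tokens, mask))  # [2.0, 3.0]
```

In real pipelines this runs over the model's last hidden state and is usually followed by L2 normalization before similarity search.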
Updated Oct 23, 2025 - Rust
A comprehensive comparative analysis system that implements and evaluates two approaches for answering questions based on company financial statements
The Sankalpiq Foundation AI Automation Suite is a microservices-based multi-agent system that automates critical NGO operations. It addresses operational bottlenecks with specialized micro-agents, each handling a specific organizational function while remaining seamlessly integrated with the rest of the suite.
A minimalistic Android app showcasing semantic search using ObjectBox and Lucene KNN, leveraging the MiniLM-L6-V2 embedding model and bert_vocab.txt for efficient retrieval.
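The KNN retrieval underlying embedding-based semantic search (as in the ObjectBox/Lucene app above) reduces to ranking corpus vectors by cosine similarity to a query vector. A minimal sketch, with small made-up 2-D vectors standing in for MiniLM-L6-v2 embeddings:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def knn(query, corpus, k=2):
    """Indices of the k corpus vectors most similar to the query."""
    order = sorted(range(len(corpus)),
                   key=lambda i: cosine(query, corpus[i]),
                   reverse=True)
    return order[:k]

docs = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]]  # toy document embeddings
print(knn([1.0, 0.05], docs, k=2))  # [0, 1]
```

Production systems replace the linear scan with an index (e.g. HNSW), but the scoring is the same.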
VALORA AI is a multimodal pricing prediction model that combines textual and visual data to predict product prices.
AI-powered academic planner orchestrating LangChain workflows for task analysis, calendar integration, and natural language planning, with multi-objective optimization for adaptive schedules.
Chat with your PDFs using FastAPI, LangChain, Qdrant, and Gemini — an async RAG pipeline built with love
Detect and analyze resource leaks in Android apps using method signature matching and NLP-based similarity detection.
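Matching method signatures by similarity (as the resource-leak detector above does) can be approximated with token-level set overlap. A hedged sketch, assuming a simple camelCase-aware tokenizer and Jaccard similarity rather than the project's actual NLP model:

```python
import re

def sig_tokens(signature):
    """Split a method signature into lowercase word tokens,
    breaking camelCase identifiers apart (e.g. closeCursor -> close, cursor)."""
    return {t.lower() for t in re.findall(r"[A-Z]?[a-z]+", signature)}

def jaccard(sig_a, sig_b):
    """Set-overlap similarity between two signatures' token sets."""
    ta, tb = sig_tokens(sig_a), sig_tokens(sig_b)
    return len(ta & tb) / len(ta | tb)

s1 = "void close(Cursor cursor)"
s2 = "void closeCursor(Cursor c)"
print(jaccard(s1, s2))  # 0.75
```

Embedding-based similarity (e.g. MiniLM-L6-v2 over signature text) generalizes this beyond exact token overlap to semantically related names.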
ChefMate is an AI-driven recipe assistant that uses Retrieval-Augmented Generation and a local LLM to provide context-aware cooking suggestions. It supports ingredient queries, recipe generation, and dietary filters through semantic search and real-time chat.
An AI-powered study companion that helps students understand lecture material through intelligent question answering, slide summarization, PDF summaries, and flashcard generation. Built with LangChain, Hugging Face Transformers, and Gradio — and fully powered by open-source LLMs running on your local GPU.