- Overview
- Features
- Structure
- Installation
- Usage
- Hosting
- License
- Authors
## Overview

This repository hosts the backend for the AI Powered OpenAI Request Wrapper MVP, a Python application that simplifies interactions with OpenAI's language models. The MVP acts as a bridge between users and the OpenAI API, making it easier to leverage AI for diverse tasks.
## Features

| Feature | Description |
|---|---|
| Request Handling | Accepts user requests for OpenAI API interactions, including model selection, prompt, and parameters. |
| API Call Generation | Translates user requests into properly formatted OpenAI API calls, ensuring accurate encoding of model selection, prompt, parameters, and authentication details. |
| Response Processing | Parses and formats responses from the OpenAI API for easy understanding by the user, handling various response formats and extracting relevant information. |
| Database Integration | Stores API keys and user preferences in a PostgreSQL database for efficient and personalized interactions. |
| Authentication | Uses JWTs for secure user authentication and authorization, ensuring safe access to the application and its resources. |
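A minimal sketch of how the request flow described above could look, assuming a FastAPI app and the official `openai` Python SDK (v1+). The schema fields, defaults, and handler body are illustrative, not this repository's actual code; only the `/request` path is taken from the project structure below.

```python
# Illustrative only: a hypothetical /request endpoint that validates a payload,
# forwards it to the OpenAI API, and returns the extracted completion text.
from fastapi import FastAPI
from openai import OpenAI  # official OpenAI Python SDK (>= 1.0)
from pydantic import BaseModel

app = FastAPI()
client = OpenAI()  # reads OPENAI_API_KEY from the environment


class CompletionRequest(BaseModel):
    model: str = "gpt-3.5-turbo"   # model selection
    prompt: str                    # user prompt
    max_tokens: int = 256          # generation parameter


@app.post("/request")
def create_request(payload: CompletionRequest) -> dict:
    # Translate the validated request into an OpenAI chat completion call.
    response = client.chat.completions.create(
        model=payload.model,
        messages=[{"role": "user", "content": payload.prompt}],
        max_tokens=payload.max_tokens,
    )
    # Parse the response and return only the generated text to the caller.
    return {"completion": response.choices[0].message.content}
```

In the actual service, the JWT authentication and database persistence listed in the table would sit around a handler like this.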
## Structure

```
├── commands.json        # Defines available commands for the application
├── .env                 # Stores environment variables securely
├── README.md            # This file
├── requirements.txt     # Lists all Python dependencies
├── startup.sh           # Script for setting up the application environment
├── main.py              # Main application entry point
└── src
    ├── __init__.py      # Application initialization
    ├── routers
    │   └── request_router.py     # Handles requests to the /request endpoint
    ├── services
    │   └── openai_service.py     # Manages interactions with the OpenAI API
    ├── models
    │   └── request.py            # Database model for storing requests and responses
    ├── schemas
    │   └── request_schema.py     # Pydantic schemas for validating API requests and responses
    ├── database
    │   ├── database.py           # Manages the database connection
    │   └── models.py             # Defines the database models for the application
    ├── utils
    │   └── logger.py             # Provides a centralized logging system
    └── tests
        ├── test_openai_service.py    # Unit tests for the openai_service
        └── test_request_router.py    # Unit tests for the request_router
```
## Installation

Prerequisites:

- Python 3.9+
- PostgreSQL 14+
- Docker
- Clone the repository:
  ```bash
  git clone https://github.com/coslynx/OpenAI-Request-Wrapper-MVP.git
  cd OpenAI-Request-Wrapper-MVP
  ```
- Install dependencies:
  ```bash
  pip install -r requirements.txt
  ```
- Create the database:
  ```bash
  createdb openai_request_wrapper
  ```
- Create the `pgcrypto` extension in the database:
  ```bash
  psql -U postgres -d openai_request_wrapper -c "CREATE EXTENSION IF NOT EXISTS pgcrypto"
  ```
- Set up environment variables:
  ```bash
  cp .env.example .env
  ```
  Replace the placeholders in `.env` with your own values:
  - `OPENAI_API_KEY`: your OpenAI API key
  - `DATABASE_URL`: your PostgreSQL database connection string
  - `JWT_SECRET`: your secret key for JWT token generation
- Start the application, then try the example request shown after these steps:
  ```bash
  python main.py
  ```
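Once the server is running, you can exercise the API with a short client script. This is a hypothetical example that assumes the service listens on `localhost:8000` and exposes the `/request` endpoint sketched earlier; adjust the URL and payload fields to match the actual API.

```python
# Hypothetical client call; the port, endpoint, and payload fields are assumptions.
import requests

resp = requests.post(
    "http://localhost:8000/request",
    json={
        "model": "gpt-3.5-turbo",
        "prompt": "Summarize the benefits of API wrappers in one sentence.",
        "max_tokens": 128,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```

If the deployed service enforces JWT authentication, an `Authorization: Bearer <token>` header would also be required.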
## Hosting

For deployment, consider using a platform like Heroku or AWS.

Heroku deployment:
- Install the Heroku CLI:
  ```bash
  npm install -g heroku
  ```
- Log in to Heroku:
  ```bash
  heroku login
  ```
- Create a new Heroku app:
  ```bash
  heroku create openai-request-wrapper-production
  ```
- Set the environment variables:
  ```bash
  heroku config:set OPENAI_API_KEY=YOUR_OPENAI_API_KEY
  heroku config:set DATABASE_URL=your_database_url_here
  heroku config:set JWT_SECRET=your_secret_key
  ```
- Deploy the code:
  ```bash
  git push heroku main
  ```
## License

This Minimum Viable Product (MVP) is licensed under the GNU AGPLv3 license.
## Authors

- Drix10
- Kais Radwan
Create Your Custom MVP in Minutes With CosLynxAI!