not-again-ai is a collection of various building blocks that come up over and over again when developing AI products. The key goals of this package are simple yet flexible interfaces and minimal dependencies. You are encouraged to either a) use this as a template for your own Python package, or b) copy and paste functions into your own projects instead of installing the package. We make this easier by limiting the number of dependencies and using an MIT license.
Documentation is available within individual notebooks or in docstrings within the source code.
Requires: Python 3.11 or 3.12, which can be installed with uv by running `uv python install 3.12`.
Install the entire package from PyPI with:
$ pip install not_again_ai[data,llm,statistics,viz]
The package is split into subpackages, so you can install only the parts you need.
pip install not_again_ai
pip install not_again_ai[data]
- Run `crawl4ai-setup` to complete the crawl4ai post-installation setup.
- Set the `BRAVE_SEARCH_API_KEY` environment variable to use the Brave Search API for web data extraction. Get the API key from https://api-dashboard.search.brave.com/app/keys. You must have at least the Free "Data for Search" subscription.
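As a minimal, hedged sketch (plain standard-library Python, not an interface from this package), you can verify the key is present before using any web extraction that depends on it:

```python
import os

# Variable name from the setup note above; fail early with a clear message if it is missing.
if "BRAVE_SEARCH_API_KEY" not in os.environ:
    raise RuntimeError("Set the BRAVE_SEARCH_API_KEY environment variable before using Brave Search.")
```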
pip install not_again_ai[llm]
- Setup OpenAI API (see the client sketch after this list)
  - Go to https://platform.openai.com/settings/profile?tab=api-keys to get your API key.
  - (Optional) Set the `OPENAI_API_KEY` and the `OPENAI_ORG_ID` environment variables.
- Setup Azure OpenAI (AOAI)
  - Using AOAI requires Entra ID authentication. See https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/managed-identity for how to set this up for your AOAI deployment.
    - Requires the correct role assigned to your user account and being signed into the Azure CLI.
  - (Optional) Set the `AZURE_OPENAI_ENDPOINT` environment variable.
- If you wish to use Ollama:
  - Follow the instructions at https://github.com/ollama/ollama to install Ollama for your system.
  - (Optional) Add Ollama as a startup service (recommended).
  - (Optional) To make the Ollama service accessible on your local network from a Linux server, add the following to the `/etc/systemd/system/ollama.service` file, which will make Ollama available at `http://<local_address>:11434`:

    [Service]
    ...
    Environment="OLLAMA_HOST=0.0.0.0"

  - It is recommended to always have the latest version of Ollama. To update Ollama, check the docs. The command for Linux is:

    curl -fsSL https://ollama.com/install.sh | sh
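Putting the above together, here is a minimal, hedged sketch of constructing clients for each provider. It uses the official `openai` and `azure-identity` packages directly (not necessarily the interfaces exposed by not_again_ai), and the API version and local Ollama URL are assumptions to adjust for your setup:

```python
import os

from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI, OpenAI

# OpenAI: with no arguments, the client reads OPENAI_API_KEY (and optionally
# OPENAI_ORG_ID) from the environment.
openai_client = OpenAI()

# Azure OpenAI with Entra ID: exchange your Azure CLI or managed identity
# credentials for a bearer token instead of using an API key.
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)
aoai_client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    azure_ad_token_provider=token_provider,
    api_version="2024-06-01",  # assumption: use the API version your deployment supports
)

# Ollama exposes an OpenAI-compatible endpoint, so the same client class works
# against a local server (the api_key value is required but ignored by Ollama).
ollama_client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
```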
pip install not_again_ai[statistics]
pip install not_again_ai[viz]
This package uses uv to manage dependencies and isolated Python virtual environments.
To proceed, install uv globally onto your system.
To install a specific version of Python:
uv python install 3.12
Dependencies are defined in `pyproject.toml` and specific versions are locked into `uv.lock`. This allows for exact reproducible environments across all machines that use the project, both during development and in production.
To install all dependencies into an isolated virtual environment:
uv sync --all-extras --all-groups
To upgrade all dependencies to their latest versions:
uv lock --upgrade
This project is designed as a Python package, meaning that it can be bundled up and redistributed as a single compressed file.
Packaging is configured by `pyproject.toml`.
To package the project as both a source distribution and a wheel:
$ uv build
This will generate `dist/not-again-ai-<version>.tar.gz` and `dist/not_again_ai-<version>-py3-none-any.whl`.
Source and wheel redistributable packages can be published to PyPI or installed directly from the filesystem using pip.

To publish the package to PyPI:

uv publish
Automated code quality checks are performed using Nox. Nox will automatically create virtual environments and run commands based on `noxfile.py` for unit testing, PEP 8 style guide checking, type checking, and documentation generation.
To run all default sessions:
uv run nox
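For orientation, a Nox session is just a decorated Python function in `noxfile.py`. The sketch below is illustrative only; it is not copied from this project's `noxfile.py`, and the session name, installed packages, and commands are assumptions:

```python
import nox


@nox.session
def test(session: nox.Session) -> None:
    """Run the unit tests with coverage (illustrative session)."""
    session.install(".", "pytest", "pytest-cov")
    session.run("pytest", "--cov=not_again_ai", "tests")
```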
Unit testing is performed with pytest. pytest has become the de facto Python unit testing framework. Some key advantages over the built-in unittest module are:
- Significantly less boilerplate needed for tests.
- PEP 8 compliant names (e.g. `pytest.raises()` instead of `self.assertRaises()`).
- Vibrant ecosystem of plugins.
pytest will automatically discover and run tests by recursively searching for folders and `.py` files prefixed with `test` for any functions prefixed by `test`.
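As a small hypothetical example of the discovery convention and the `pytest.raises()` style (the file and test names below are illustrative, not tests from this repository), a file such as `tests/test_example.py` could contain:

```python
import pytest


def test_addition() -> None:
    # Discovered automatically: the file and the function are prefixed with "test".
    assert 1 + 1 == 2


def test_invalid_literal_raises() -> None:
    # pytest.raises() replaces unittest's self.assertRaises().
    with pytest.raises(ValueError):
        int("not a number")
```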
The `tests` folder is created as a Python package (i.e. there is an `__init__.py` file within it) because this helps pytest uniquely namespace the test files. Without this, two test files cannot be named the same, even if they are in different subdirectories.
Code coverage is provided by the pytest-cov plugin.
When running a unit test Nox session (e.g. `nox -s test`), an HTML report is generated in the `htmlcov` folder showing each source file and which lines were executed during unit testing.
Open `htmlcov/index.html` in a web browser to view the report. Code coverage reports help identify areas of the project that are currently not tested.
pytest and code coverage are configured in `pyproject.toml`.
To run selected tests:
(.venv) $ uv run nox -s test -- -k "test_web"
PEP 8 is the universally accepted style guide for Python code. PEP 8 code compliance is verified using Ruff. Ruff is configured in the `[tool.ruff]` section of `pyproject.toml`.
To lint code, run:
(.venv) $ uv run nox -s lint
To automatically fix fixable lint errors, run:
(.venv) $ uv run nox -s lint_fix
Ruff is used to automatically format code and group and sort imports.
To automatically format code, run:
(.venv) $ uv run nox -s fmt
To verify code has been formatted, such as in a CI job:
(.venv) $ uv run nox -s fmt_check
Type annotations allow developers to include optional static typing information in Python source code. This allows static analyzers such as mypy, PyCharm, or Pyright to check that functions are used with the correct types before runtime.
def factorial(n: int) -> int:
    return 1 if n <= 1 else n * factorial(n - 1)
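For instance, mypy will reject a call to the function above that passes the wrong argument type, catching the mistake before the code ever runs (the exact error wording varies by mypy version):

```python
factorial("5")  # flagged by mypy: "str" is not compatible with the declared "int" parameter
```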
mypy is configured in `pyproject.toml`. To type check code, run:
(.venv) $ uv run nox -s type_check
PEP 561 defines how a Python package should communicate the presence of inline type annotations to static type checkers. mypy's documentation provides further examples on how to do this.
Mypy looks for the existence of a file named `py.typed` in the root of the installed package to indicate that inline type annotations should be checked.
Check for typos using typos
(.venv) $ uv run nox -s typos
Continuous integration is provided by GitHub Actions. This runs all tests, lints, and type checking for every commit and pull request to the repository.
GitHub Actions is configured in `.github/workflows/python.yml`.
Install the Python extension for VSCode.
Install the Ruff extension for VSCode.
Default settings are configured in `.vscode/settings.json`, which will enable Ruff with consistent settings.
python-blueprint for the Python package skeleton.
This project uses Crawl4AI (https://github.com/unclecode/crawl4ai) for web data extraction.