The Eligibility Signposting API and Eligibility Data Product are designed to be the single source of truth for trustworthy assessments of an individual's eligibility for one or more vaccinations, based upon clinically assured data sources. Where relevant, they will also provide details about whether the individual can book or get a vaccine, and how they can get vaccinated.
Initially this will support inclusion of eligibility for Respiratory Syncytial Virus (RSV) vaccination for older adults within the NHS App (Vaccinations in the App).
The software will only be used for signposting an individual to an appropriate service. Ultimately the eligibility for a particular vaccination will be decided by a healthcare professional at the point of care prior to giving the vaccination.
First, ensure the Prerequisites below are met. Then clone the repository and install dependencies:

```shell
git clone https://github.com/NHSDigital/eligibility-signposting-api.git
cd eligibility-signposting-api
make dependencies install-python
```

The following software packages, or their equivalents, are expected to be installed and configured:
- Python version 3.13. (It may be easiest to use pyenv to manage Python versions.)
- Docker container runtime or a compatible tool, e.g. colima or Podman,
- asdf version manager,
- GNU make 3.82 or later,
Note
The version of GNU make available by default on macOS is earlier than 3.82. You will need to upgrade it, or certain make tasks will fail. On macOS, you will need Homebrew installed; then install make like so:

```shell
brew install make
```

You will then see instructions to fix your `$PATH` variable so the newly installed version is available. If you are using dotfiles, this is all done for you.
- GNU sed and GNU grep are required for the scripted command-line output processing,
- GNU coreutils and GNU binutils may be required to build dependencies like Python, which may need to be compiled during installation,
Note
For macOS users, installation of the GNU toolchain has been scripted and automated as part of the dotfiles project. Please see this script for details.
When running the service locally, the following environment variables are used (the defaults point at a local localstack instance):

| Variable | Default | Description |
|---|---|---|
| `AWS_ACCESS_KEY_ID` | `dummy_key` | AWS Access Key |
| `AWS_DEFAULT_REGION` | `eu-west-1` | AWS Region |
| `AWS_SECRET_ACCESS_KEY` | `dummy_secret` | AWS Secret Access Key |
| `DYNAMODB_ENDPOINT` | `http://localhost:4566` | Endpoint for the app to access DynamoDB |
| `S3_ENDPOINT` | `http://localhost:4566` | Endpoint for the app to access S3 |
| `PERSON_TABLE_NAME` | `test_eligibility_datastore` | AWS DynamoDB table for person data. |
| `LOG_LEVEL` | `WARNING` | Logging level. Must be one of `DEBUG`, `INFO`, `WARNING`, `ERROR` or `CRITICAL`, as per Logging Levels |
| `RULES_BUCKET_NAME` | `test-rules-bucket` | AWS S3 bucket from which to read rules. |
When the service is deployed to AWS, the following environment variables apply:

| Variable | Default | Description | Comments |
|---|---|---|---|
| `AWS_DEFAULT_REGION` | `eu-west-1` | AWS Region | |
| `AWS_ACCESS_KEY_ID` | None | AWS Access Key | Set to None because it is provided by the AWS environment automatically. |
| `AWS_SECRET_ACCESS_KEY` | None | AWS Secret Access Key | Set to None because it is provided by the AWS environment automatically. |
| `DYNAMODB_ENDPOINT` | None | Endpoint for the app to access DynamoDB | Set to None because the AWS service default endpoint is used automatically. |
| `S3_ENDPOINT` | None | Endpoint for the app to access S3 | Set to None because the AWS service default endpoint is used automatically. |
| `PERSON_TABLE_NAME` | `test_eligibility_datastore` | AWS DynamoDB table for person data. | |
| `LOG_LEVEL` | `WARNING` | Logging level. Must be one of `DEBUG`, `INFO`, `WARNING`, `ERROR` or `CRITICAL`, as per Logging Levels | |
| `RULES_BUCKET_NAME` | `test-rules-bucket` | AWS S3 bucket from which to read rules. | |
After a successful installation, you can exercise the project by calling the API. See the OpenAPI specification in specification/ and the other documentation resources, e.g. the User Guide, for more use cases and features.
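The sketch below is illustrative only: the path, method, payload and base URL are assumptions loosely mirroring the container diagram further down (an HTTPS POST to /eligibility via API Gateway), not the authoritative contract, which is defined by the OpenAPI specification in specification/.

```python
# Illustrative only: the real path, method, payload and auth are defined by the
# OpenAPI specification in specification/; this simply mirrors the container
# diagram's "HTTPS POST /eligibility" flow against a hypothetical base URL.
import requests

BASE_URL = "https://example.api.service.nhs.uk"  # hypothetical

response = requests.post(
    f"{BASE_URL}/eligibility",
    json={"nhsNumber": "9000000009"},  # hypothetical request body
    timeout=10,
)
response.raise_for_status()
print(response.json())
```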
Run all tests and linting:
```shell
make precommit
```

There are make tasks for running the tests; run `make test` to see how they work. You should be able to use the same entry points for local development as in your CI pipeline.
See the specification repository for details on how to publish both the specification and sandbox.
If you have previously built yanai (the platform we use to supply data to this project), note that it uses an older version of localstack that does not support our Python version. We have pinned the correct version here, and yanai pins its own version, so this should normally work. If issues do arise, removing the Docker image before rebuilding can resolve them:
```shell
docker rmi localstack/localstack
```

A Postman collection can be generated from the OpenAPI specification in specification/ by running the following make command:

```shell
make convert-postman
```

The conversion is done using the Portman CLI. The resulting Postman collection is saved to specification/postman/.
We'll be separating our presentation layer (where API logic lives, in views/), business services layer (where business logic lives, in services/) and repository layer (where database logic lives, in repos/). We will be using wireup for dependency injection, so services are given their dependencies ("injection") and wireup takes care of the wiring. (We'll usually use the @service annotation, but factory functions will be used where necessary, typically for creating resources from 3rd party libraries.) We'll be using Pydantic for both response models and database models.
app.py is the best place to start exploring the code.
Local tests will use localstack, started & stopped using pytest-docker. We'll make extensive use of pytest fixtures, builders and matchers to keep our tests clean.
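As a minimal sketch of this layering (assuming wireup's @service decorator and constructor injection by type annotation, as described above; all names here are illustrative, not the actual classes under views/, services/ and repos/):

```python
# Minimal sketch of the layering described above. Illustrative names only.
from pydantic import BaseModel
from wireup import service


class EligibilityResponse(BaseModel):
    """Response model (presentation layer), validated by Pydantic."""

    nhs_number: str
    eligible: bool


@service
class PersonRepo:
    """Repository layer: database access would live here (DynamoDB in this project)."""

    def get_person(self, nhs_number: str) -> dict:
        # Hypothetical stand-in for a DynamoDB lookup.
        return {"nhs_number": nhs_number, "age": 78}


@service
class EligibilityService:
    """Business services layer: wireup injects PersonRepo via the type annotation."""

    def __init__(self, person_repo: PersonRepo) -> None:
        self.person_repo = person_repo

    def check(self, nhs_number: str) -> EligibilityResponse:
        person = self.person_repo.get_person(nhs_number)
        return EligibilityResponse(nhs_number=nhs_number, eligible=person["age"] >= 75)
```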
```mermaid
graph TB
subgraph "System Context"
direction TB
Client["NHS App / Client"]
Consumer["Postman / Consumer"]
API["Eligibility Signposting API"]
AWS["AWS"]
end
Client -->|"HTTP Request"| API
Consumer -->|"HTTP Request"| API
API -->|"Deployed on"| AWS
subgraph "Container Diagram"
direction TB
subgraph "AWS Infrastructure"
direction TB
APIGW["API Gateway"]
Lambda["Python Lambda (app.py)"]
DynamoDB["DynamoDB Table"]
S3Bucket["S3 Bucket (rules)"]
IAM["IAM Roles & Policies"]
end
subgraph "CI/CD Pipeline"
direction TB
GH["GitHub Actions"]
TF["Terraform"]
end
end
Client -->|"HTTPS POST /eligibility"| APIGW
APIGW -->|"Invoke"| Lambda
Lambda -->|"GetItem/PutItem"| DynamoDB
Lambda -->|"GetObject"| S3Bucket
Lambda -->|"Uses"| IAM
GH -->|"runs pipelines"| TF
TF -->|"provisions"| APIGW
TF -->|"provisions"| DynamoDB
TF -->|"provisions"| S3Bucket
TF -->|"provisions"| IAM
subgraph "Eligibility Lambda Function - Components"
direction TB
App["app.py (WireUp DI)"]
Config["config.py, error_handler.py"]
subgraph "Audit Layer"
direction TB
Audit["audit/audit_service.py"]
AuditModels["audit/audit_models.py"]
end
subgraph "Validation Layer"
direction TB
Validator["common/request_validator.py"]
ApiErrResp["common/api_error_response.py"]
end
subgraph "Presentation Layer"
direction TB
View["views/eligibility.py"]
ResponseModel["views/response_model/eligibility_response.py"]
end
subgraph "Business Logic Layer"
direction TB
Service["services/eligibility_services.py"]
Operators["services/operators/operators.py"]
end
subgraph "Data Access Layer"
direction TB
PersonRepo["repos/person_repo.py"]
CampaignRepo["repos/campaign_repo.py"]
Factory["repos/factory.py, exceptions.py"]
end
subgraph "Models"
direction TB
ModelElig["model/eligibility_status.py"]
ModelRules["model/campaign_config.py"]
end
end
Lambda -->|"loads"| App
App -->|injects| View
View -->|calls| Service
View -->|validates via| Validator
View -->|audits via| Audit
View -->|uses| ResponseModel
Audit -->|uses| AuditModels
Validator -->|uses| ApiErrResp
Service -->|calls| Operators
Service -->|calls| PersonRepo
Service -->|calls| CampaignRepo
PersonRepo -->|uses| DynamoDB
CampaignRepo -->|uses| S3Bucket
App -->|reads| Config
App -->|wires| Factory
Service -->|uses| ModelElig
Operators -->|uses| ModelRules
```
Describe or link templates on how to raise an issue, feature request or make a contribution to the codebase. Reference the other documentation files, like
- Environment setup for contribution, i.e. CONTRIBUTING.md
- Coding standards, branching, linting, practices for development and testing
- Release process, versioning, changelog
- Backlog, board, roadmap, ways of working
- High-level requirements, guiding principles, decision records, etc.
See DEPLOYMENT.md for details on how to cut release candidates and promote them to production.
Please contact the team on Slack.
The LICENCE.md file will need to be updated with the correct year and owner.
Unless stated otherwise, the codebase is released under the MIT License. This covers both the codebase and any sample code in the documentation.
Any HTML or Markdown documentation is © Crown Copyright and available under the terms of the Open Government Licence v3.0.