Now that PyTorch offers a decent subset of stable APIs, it should be relatively straightforward to make mamba ABI-stable with libtorch by completing the following steps:
- Switch from pybind11 to TORCH_LIBRARY in https://github.com/state-spaces/mamba/blob/main/csrc/selective_scan/selective_scan.cpp (this may also allow mamba to be CPython agnostic). For an example of how to use TORCH_LIBRARY, see https://docs.pytorch.org/tutorials/advanced/cpp_custom_ops.html#defining-an-operator and the first sketch after this list.
- Go through [selective_scan.cpp](https://github.com/state-spaces/mamba/blob/main/csrc/selective_scan/selective_scan.cpp) and replace its torch APIs with the versions in `torch/headeronly` and `torch/csrc/stable`, similar to what is done for the ABI-stable FA3 work in Dao-AILab/flash-attention#1791; see the second sketch after this list.
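For reference, here is a minimal sketch of what the TORCH_LIBRARY registration for the forward entry point could look like. The operator namespace (`mamba_ssm`) and the three-argument signature are simplified assumptions for illustration; the real `selective_scan_fwd` takes more arguments and returns multiple tensors.

```cpp
// csrc/selective_scan/selective_scan.cpp (sketch, replacing the PYBIND11_MODULE block)
#include <ATen/ATen.h>
#include <torch/library.h>

// Simplified stand-in for the existing forward function.
at::Tensor selective_scan_fwd(const at::Tensor& u,
                              const at::Tensor& delta,
                              const at::Tensor& A) {
  // ... existing CUDA kernel launch logic goes here ...
  return at::empty_like(u);
}

// Declare the operator schema once.
TORCH_LIBRARY(mamba_ssm, m) {
  m.def("selective_scan_fwd(Tensor u, Tensor delta, Tensor A) -> Tensor");
}

// Register the CUDA implementation for that schema.
TORCH_LIBRARY_IMPL(mamba_ssm, CUDA, m) {
  m.impl("selective_scan_fwd", &selective_scan_fwd);
}
```

On the Python side the op would then be reachable through the dispatcher as `torch.ops.mamba_ssm.selective_scan_fwd(u, delta, A)` instead of being imported from a pybind11 extension module.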
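And a rough sketch of the same registration rewritten against `torch/csrc/stable`, following the pattern used in the flash-attention PR above. The stable ABI surface is still evolving, so the exact header names, the `StableIValue` boxed calling convention, and the `to<T>`/`from` conversion helpers shown here should be treated as assumptions to verify against the PyTorch version being targeted:

```cpp
// ABI-stable variant (sketch; simplified argument list as above).
#include <torch/csrc/stable/library.h>
#include <torch/csrc/stable/ops.h>
#include <torch/csrc/stable/tensor.h>

using torch::stable::Tensor;

// Unboxed kernel working on the stable Tensor wrapper instead of at::Tensor.
Tensor selective_scan_fwd(Tensor u, Tensor delta, Tensor A) {
  // ... launch the existing CUDA kernels via raw data pointers / strides ...
  return u;
}

// Boxed shim: the stable dispatcher passes arguments as StableIValues.
void boxed_selective_scan_fwd(StableIValue* stack, uint64_t num_args,
                              uint64_t num_outputs) {
  Tensor out = selective_scan_fwd(
      to<Tensor>(stack[0]), to<Tensor>(stack[1]), to<Tensor>(stack[2]));
  stack[0] = from(out);
}

STABLE_TORCH_LIBRARY(mamba_ssm, m) {
  m.def("selective_scan_fwd(Tensor u, Tensor delta, Tensor A) -> Tensor");
}

STABLE_TORCH_LIBRARY_IMPL(mamba_ssm, CUDA, m) {
  m.impl("selective_scan_fwd", &boxed_selective_scan_fwd);
}
```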
I'm opening this issue to see if this would be of interest to the community + to start a discussion!