Currently, a great deal of work goes into adding new models and architectures to TransformerLens: each one requires a substantial amount of reimplementation and verification.
The project ("TransformerBridge") will allow loading any nn.Module, including current transformers/HuggingFace models, into TransformerLens in a simple way. By configuring a single file, people will be able to use any architecture with TransformerLens, regardless of whether the model exists on HuggingFace. This project is designed not only to support and enhance all current TransformerLens usage, but also to open the door for interpretability research in closed environments where HuggingFace-hosted models may not be the target.
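To make the "configure one file" idea concrete, here is a minimal sketch in plain Python of how a bridge configuration could map a model's own submodule names onto standardized hook names, so that instrumentation code stays model-agnostic. All names here (`COMPONENT_MAP`, `attach_hooks`, the toy classes) are hypothetical illustrations, not the actual TransformerBridge API.

```python
# Illustrative sketch only: COMPONENT_MAP and attach_hooks are hypothetical
# names, not the real TransformerBridge API.

class Linear:
    """Toy stand-in for an nn.Module submodule that supports hooks."""
    def __init__(self, name):
        self.name = name
        self.hooks = []

    def __call__(self, x):
        out = x + 1  # placeholder computation
        for hook in self.hooks:
            hook(self.name, out)
        return out


class CustomModel:
    """A model with its own, non-standard submodule names."""
    def __init__(self):
        self.embed_tokens = Linear("embed_tokens")
        self.final_proj = Linear("final_proj")

    def forward(self, x):
        return self.final_proj(self.embed_tokens(x))


# The single configuration file would amount to a mapping from the model's
# own submodule names to standardized TransformerLens-style hook names
# (hypothetical format):
COMPONENT_MAP = {
    "embed_tokens": "hook_embed",
    "final_proj": "hook_unembed",
}


def attach_hooks(model, component_map, record):
    """Register a recording hook on each mapped submodule."""
    for attr, hook_name in component_map.items():
        module = getattr(model, attr)
        module.hooks.append(
            lambda _name, out, hn=hook_name: record.setdefault(hn, out)
        )


cache = {}
model = CustomModel()
attach_hooks(model, COMPONENT_MAP, cache)
model.forward(0)
print(sorted(cache))        # ['hook_embed', 'hook_unembed']
print(cache["hook_embed"])  # 1
```

The key design point is that researcher-facing code only ever sees the standardized hook names, so the same analysis can run against any architecture once its mapping file exists.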
A proof of concept is already done, but it will take more time to complete, polish, and test so that it can be rolled out for real-world interpretability research. The next two months will allow us to enter beta and begin helping people transition to the new module from the existing HookedRootModules. Once all of the reasonable use cases of TransformerLens have been tested, we will release this new module in TransformerLens 3.0.
$10,000 USD for Bryce to work on this for the next two months
$3,000 USD for Fabian, Bryce's mentee, who has already been making great contributions to TransformerLens over the past year
Any additional funds will be used to continue supporting TransformerLens. If the full funding goal of $30,000 is met, that will be enough for Bryce to maintain TransformerLens through the rest of 2025 with no issue.