# Transformer Lens + NNsight
```
TypeError: HookedTransformer.forward() got an unexpected \
keyword argument 'input_ids'
```
- TransformerLens's `HookedTransformer.forward()` takes tokens via the `input` kwarg, while NNsight's `LanguageModel` prepares its inputs under the key `input_ids`.
### Fix
- Check the kwarg names against the forward signature in the NNsight model wrapper, or open a pull request to align TransformerLens's naming conventions.
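One workaround is to remap the kwarg before calling forward. A minimal sketch (the helper `adapt_kwargs` and the toy forwards are hypothetical, not NNsight's actual API):

```python
import inspect


def adapt_kwargs(forward, kwargs):
    """Rename `input_ids` to `input` when the target forward only accepts `input`."""
    params = inspect.signature(forward).parameters
    if "input_ids" in kwargs and "input_ids" not in params and "input" in params:
        kwargs = dict(kwargs)  # avoid mutating the caller's dict
        kwargs["input"] = kwargs.pop("input_ids")
    return kwargs


# Toy stand-in for HookedTransformer.forward (hypothetical):
def tl_forward(input=None):
    return input
```

With this in place, `adapt_kwargs(tl_forward, {"input_ids": tokens})` passes the tokens through as `input`, and kwargs for models that already accept `input_ids` are left untouched.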
<br/><br/>
```
TypeError: HookedTransformer.forward() got an unexpected \
keyword argument 'labels'
```
- Some HuggingFace auto models accept a `labels` kwarg to compute the loss inside `forward()`; `HookedTransformer.forward()` does not support it.
### Fix
```python
import inspect
from typing import Dict

import torch
from transformers import BatchEncoding


def _example_input(self) -> Dict[str, torch.Tensor]:
    # Inspect the wrapped model's forward signature to see whether it
    # expects a `labels` kwarg (as HF models that compute loss do).
    meta_signature = inspect.signature(self.meta_model.forward)
    if "labels" in meta_signature.parameters:
        return BatchEncoding(
            {"input": torch.tensor([[0]]), "labels": torch.tensor([[0]])}
        )
    return BatchEncoding({"input": torch.tensor([[0]])})
```
- Use `inspect.signature()` to check whether the model's forward method accepts `labels` before building the example input.
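The same signature check can be exercised on toy forwards (a minimal sketch; `hf_forward` and `tl_forward` are hypothetical stand-ins for the two model families):

```python
import inspect


def accepts_labels(forward) -> bool:
    # Membership test works directly on the parameters mapping; no .keys() needed.
    return "labels" in inspect.signature(forward).parameters


# Toy forwards standing in for the two model families (hypothetical):
def hf_forward(input_ids=None, labels=None): ...


def tl_forward(input=None): ...
```

`accepts_labels(hf_forward)` is true and `accepts_labels(tl_forward)` is false, which is exactly the branch `_example_input` takes above.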
# Summary
- It might be simpler to ship a dedicated TransformerLens wrapper with NNsight rather than patching kwarg mismatches case by case.