❓ Question
Is it possible to convert only part of a model to TRT? I have a model that cannot be converted to TRT directly because it uses custom classes. I wanted to convert only the modules that can be converted, but when I tried, torch could not save the result.
What you have already tried
I tried the following:
import torch.nn
import torch_tensorrt


class MySubmodule(torch.nn.Module):
    def __init__(self):
        super(MySubmodule, self).__init__()
        self.layer = torch.nn.Linear(10, 10)

    def forward(self, x):
        return self.layer(x)


class MyMod(torch.nn.Module):
    def __init__(self):
        super(MyMod, self).__init__()
        self.submod = MySubmodule()
        self.submod = torch_tensorrt.compile(self.submod, inputs=[
            torch_tensorrt.Input(shape=(1, 10))
        ])

    def forward(self, x):
        return self.submod(x)


if __name__ == "__main__":
    model = MyMod()
    scripted = torch.jit.script(model)
    scripted(torch.zeros(1, 10).cuda())
    scripted.save("test.pt")
But it raises an exception:

RuntimeError: method.qualname() == QualifiedName(selfClass->name()->qualifiedName(), methodName)INTERNAL ASSERT FAILED at "../torch/csrc/jit/serialization/python_print.cpp":1105, please report a bug to PyTorch.
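For completeness, here is a minimal sketch of what I was hoping partial compilation would look like: compiling the whole model and letting the unsupported parts fall back to PyTorch. This assumes the torch_tensorrt.compile keyword arguments require_full_compilation and torch_executed_modules behave as documented; CustomPart is a hypothetical stand-in for my unconvertible code, and I have not verified that this avoids the error above when the custom module itself cannot be scripted.

import torch
import torch_tensorrt


class CustomPart(torch.nn.Module):
    # Hypothetical stand-in for the custom class that TRT cannot convert.
    def forward(self, x):
        return x + 1


class MyMod(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.supported = torch.nn.Linear(10, 10)  # convertible part
        self.custom = CustomPart()                # unconvertible part

    def forward(self, x):
        return self.custom(self.supported(x))


if __name__ == "__main__":
    model = MyMod().eval().cuda()
    # Compile the whole model; modules listed in torch_executed_modules
    # (assumed to take module type names) should stay in PyTorch.
    trt_model = torch_tensorrt.compile(
        model,
        inputs=[torch_tensorrt.Input(shape=(1, 10))],
        require_full_compilation=False,         # allow a mixed TRT/PyTorch graph
        torch_executed_modules=["CustomPart"],  # assumed: keep this module in PyTorch
    )
    torch.jit.save(trt_model, "partial_trt.ts")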
Environment
- PyTorch Version (e.g., 1.0): 1.10.2
- CPU Architecture: Intel
- OS (e.g., Linux): Linux
- How you installed PyTorch (conda, pip, libtorch, source): pip
- Build command you used (if compiling from source):
- Are you using local sources or building from archives: from archives
- Python version: 3.8
- CUDA version: 11.3
- GPU models and configuration: RTX 3090
- Any other relevant information: