I built and trained a PyTorch v1.4 model that predicts sin() values (based on an example from the web). Inference works. I then tried to compile it with TVM v0.8dev0 and LLVM 10 on Ubuntu for an x86 CPU. I followed the TVM setup guide and ran a few of the ONNX tutorials, which worked. I pieced the procedure below together mostly from the existing TVM tutorials. Note that I am neither an ML nor a data-science engineer. These are my steps:
import tvm, torch, os
from tvm import relay
state = torch.load("/home/dude/tvm/tst_state.pt")  # load the trained PyTorch state
import tst
m = tst.Net()
m.load_state_dict(state)  # initialize the model with its trained state
m.eval()
sm = torch.jit.trace(m, torch.tensor([3.1415 / 4]))  # trace into a TorchScript module
# the model only takes 1 input for inference, hence [("input0", (1,))]
mod, params = tvm.relay.frontend.from_pytorch(sm, [("input0", (1,))])
mod.astext()  # returns a small Relay module as text
with tvm.transform.PassContext(opt_level=1):
    lib = relay.build(mod, target="llvm", target_host="llvm", params=params)
The last line gives me the error below, and I don't know how to fix it or where I went wrong. I hope someone can point out my mistake...
... removed some lines here ...
[bt] (3) /home/dude/tvm/build/libtvm.so(TVMFuncCall+0x5f) [0x7f5cd65660af]
[bt] (2) /home/dude/tvm/build/libtvm.so(+0xb4f8a7) [0x7f5cd5f318a7]
[bt] (1) /home/dude/tvm/build/libtvm.so(tvm::GenericFunc::CallPacked(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) const+0x1ab) [0x7f5cd5f315cb]
[bt] (0) /home/tvm/build/libtvm.so(+0x1180cab) [0x7f5cd6562cab]
File "/home/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 81, in cfun
rv = local_pyfunc(*pyargs)
File "/home/tvm/python/tvm/relay/op/strategy/x86.py", line 311, in dense_strategy_cpu
m, _ = inputs[0].shape
ValueError: not enough values to unpack (expected 2, got 1)
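For what it's worth, the last traceback frame shows `dense_strategy_cpu` doing `m, _ = inputs[0].shape`, i.e. the dense (Linear) op assumes a 2-D (batch, features) input, while my traced model only has a 1-D input of shape (1,). A minimal pure-Python reproduction of just the unpacking step (the shape tuples here are illustrative, not taken from TVM):

```python
# dense_strategy_cpu unpacks the input shape as (m, k), which requires
# a 2-D shape. A 1-element shape tuple cannot provide two values.
shape_1d = (1,)    # what tracing with torch.tensor([0.785]) would yield
shape_2d = (1, 1)  # what tracing with torch.tensor([[0.785]]) would yield

try:
    m, _ = shape_1d
except ValueError as e:
    print(e)  # not enough values to unpack (expected 2, got 1)

m, k = shape_2d  # unpacks cleanly: batch size 1, feature size 1
print(m, k)
```

So one possible fix (an assumption on my part, not something I have verified against TVM) would be to trace with a 2-D example input, e.g. `torch.jit.trace(m, torch.tensor([[3.1415 / 4]]))`, and pass `[("input0", (1, 1))]` to `from_pytorch`, since `nn.Linear` treats the leading dimension as the batch dimension.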