EADST

ONNX Inference with ONNX Runtime

import torch
import onnxruntime
import numpy as np

# If the input is a PyTorch tensor, convert it to a NumPy array first:
# input_data = tensor.cpu().numpy()
# Here the input is loaded from a saved NumPy array
input_data = np.load("test_data.npy")
# Create an inference session from the exported ONNX model
session = onnxruntime.InferenceSession("model_name.onnx")
# Feed the array under the model's first input name
ort_inputs = {session.get_inputs()[0].name: input_data}
# run() returns a list with one NumPy array per model output
ort_outs = session.run(None, ort_inputs)
# To continue in PyTorch, wrap the first output back into a tensor:
# out = torch.from_numpy(ort_outs[0])
np.save("output.npy", ort_outs[0])
print("save output.npy done")
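The script above expects a pre-saved test_data.npy whose dtype and shape match the model's input. A minimal sketch of preparing such a file with NumPy, assuming a hypothetical image model that takes a float32 (batch, channels, height, width) input:

```python
import numpy as np

# Hypothetical input shape for an image model: (batch, channels, height, width)
# -- adjust to match session.get_inputs()[0].shape of your actual model
input_data = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Save it so the inference script can load it with np.load("test_data.npy")
np.save("test_data.npy", input_data)

# Verify the round trip preserves dtype and shape
loaded = np.load("test_data.npy")
print(loaded.dtype, loaded.shape)
```

Note that ONNX Runtime raises an error if the dtype does not match the model's declared input type (most exported models expect float32, so the explicit astype above matters).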