I was running a wasmedge server with inference support cloned from here, but with newer dependencies, and hit an error in `GraphExecutionContext`'s `set_input` function. The problem is the `TensorType` parameter: I was passing `U8`, but the call failed with:

```
[error] [WASI-NN] Expect tensor type U8, but got I32
```
It works with wasi-nn 0.4 and 0.5, but breaks in 0.6.
I think I've found the change responsible: this commit changes the order of the generated `TensorType` enum (the new position of `U8` is the previous position of `I32`).
So I tried setting the input with `TensorType::F64` instead, and it worked.
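To illustrate what I think is going on, here is a minimal sketch. The exact variant orders below are my assumption for illustration (the only confirmed facts are that `U8` moved into `I32`'s old slot and that `F64` happens to land on `U8`'s old value); inserting a variant into a Rust enum shifts the discriminants of every variant after it, so the raw integer crossing the wasm boundary changes even though the source still says `U8`:

```rust
// Hypothetical layouts, NOT copied from the real bindings.
#[allow(dead_code)]
#[derive(Debug, Clone, Copy, PartialEq)]
enum TensorTypeOld {
    F16 = 0,
    F32 = 1,
    U8 = 2,
    I32 = 3,
}

#[allow(dead_code)]
#[derive(Debug, Clone, Copy, PartialEq)]
enum TensorTypeNew {
    F16 = 0,
    F32 = 1,
    F64 = 2, // newly inserted variant pushes U8 and I32 down by one
    U8 = 3,
    I32 = 4,
}

fn main() {
    // A host still built against the old layout decodes the new `U8` (3)
    // as the old `I32` (3) -- matching "Expect tensor type U8, but got I32".
    assert_eq!(TensorTypeNew::U8 as u8, TensorTypeOld::I32 as u8);

    // The new `F64` (2) decodes as the old `U8` (2), which would explain
    // why passing `TensorType::F64` happens to work.
    assert_eq!(TensorTypeNew::F64 as u8, TensorTypeOld::U8 as u8);

    println!("discriminant shift reproduced");
}
```

If this is right, the fix is on whichever side (bindings or host plugin) is still using the old numeric mapping, not in the calling code.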
I couldn't figure out exactly where it breaks, though. Since these are just the Rust bindings, I'd like to know where the actual implementation lives.
I'd be happy to help if someone can point me in the right direction.