Array input with integers results in "value type not convertible"
Repro steps:
import coremltools
import numpy as np
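# Build a minimal identity-model spec whose input and output are 3-element INT32 arrays.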
spec = coremltools.proto.Model_pb2.Model()
spec.specificationVersion = 1
spec.identity.MergeFromString(b'')
input = spec.description.input.add()
input.type.multiArrayType.shape.append(3)
input.type.multiArrayType.dataType = coremltools.proto.FeatureTypes_pb2.ArrayFeatureType.INT32
input.name = "input"
output = spec.description.output.add()
output.type.multiArrayType.shape.append(3)
output.type.multiArrayType.dataType = coremltools.proto.FeatureTypes_pb2.ArrayFeatureType.INT32
output.name = "output"
model = coremltools.models.MLModel(spec)
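# Predicting with plain Python ints triggers the "value type not convertible" error below.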
model.predict({'input': [1,2,3]})
Expected: something like
{'output': np.array([1., 2., 3.])}
Actual:
/Users/zach/venv/lib/python2.7/site-packages/coremltools/models/model.pyc in predict(self, data, useCPUOnly, **kwargs)
318
319 if self.__proxy__:
--> 320 return self.__proxy__.predict(data,useCPUOnly)
321 else:
322 if _macos_version() < (10, 13):
RuntimeError: value type not convertible
Note that changing the input to [1.0, 2, 3] seems to fix the issue; so despite the multiArrayType being INT32, it only seems to allow float input (at least in some cases).
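For anyone hitting this, a minimal sketch of that workaround (it assumes the model built in the repro above; the explicit float32 cast is my addition, any float dtype should behave the same way):
import numpy as np
# Workaround sketch: cast the integer data to float before calling predict(),
# even though the multiArrayType in the spec is declared INT32.
int_values = [1, 2, 3]
result = model.predict({'input': np.array(int_values, dtype=np.float32)})
print(result)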
I see this too. Thanks for the workaround!
Faced the same error. I was able to resolve it by using a float64 numpy array:
input_array.astype(np.float64)
I'm fairly certain I read that CoreML has a max precision of 32 bits. I had a PyTorch model whose inputs were long (torch.int64), and calling predict() on my mlmodel raised this error. My solution was to cast to np.int32 when calling predict:
y = model.predict({'one_hot': one_hot_array.astype(np.int32)})
Doing so solved my issue
With coremltools 6.0, the predict call from the original code now hangs.
Note that casting to float as a workaround can also be lossy: float32 arithmetic can give wrong results for large integer values (e.g. time series with large timestamps), since not every large integer is exactly representable as a float32.
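For illustration, that precision limit is easy to demonstrate with plain numpy (no coremltools involved):
import numpy as np
# float32 has a 24-bit significand, so integers above 2**24 cannot all be represented exactly.
original = np.int64(2**24 + 1)          # 16777217
as_float = np.float32(original)         # rounds to 16777216.0
print(int(as_float) == int(original))   # False: the value silently changed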
+1 seems like this is still an issue?