Nadia Yakimakha
Please refer to our documentation for Spark support: https://docs.aws.amazon.com/sagemaker/latest/dg/apache-spark.html We are evaluating our roadmap and plan to add support for the latest version in the future.
```
Driver stacktrace: Cause : com.amazonaws.services.sagemakerruntime.model.ModelErrorException: Received client error (400) from hd-kmeans-Model-20191031-184713 with message "unable to evaluate payload provided". See https://us-east-1.console.aws.amazon.com/cloudwatch/home?region=us-east-1#logEventViewer:group=/aws/sagemaker/Endpoints/hd-kmeans-endpoint-20191031-184713 in account 820784505615 for more information. (Service: AmazonSageMakerRuntime; Status...
```
Can you post the error message you got? Also, the currently supported Spark version is 2.2.
Thank you for reporting that!
You get `train` printed out because this is how SageMaker launches the training container (https://docs.aws.amazon.com/sagemaker/latest/dg/your-algorithms-training-algo-dockerfile.html): `docker run image train`. As for the hyperparameters, those are by default available...
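For context, here is a minimal sketch of how a training script could read those hyperparameters inside the container. It assumes the standard SageMaker path `/opt/ml/input/config/hyperparameters.json`; the `epochs` hyperparameter is just an illustrative name:

```python
import json

# SageMaker mounts the hyperparameters passed to the training job at this path.
HYPERPARAMS_PATH = "/opt/ml/input/config/hyperparameters.json"

def load_hyperparameters(path=HYPERPARAMS_PATH):
    """Read the hyperparameters JSON that SageMaker injects into the container."""
    with open(path) as f:
        # SageMaker serializes all values as strings, so cast them as needed.
        return json.load(f)

if __name__ == "__main__":
    hps = load_hyperparameters()
    # 'epochs' is only an example hyperparameter name.
    epochs = int(hps.get("epochs", "10"))
    print(f"Training for {epochs} epochs")
```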
You can overwrite MMS configuration through environment variables: https://github.com/awslabs/multi-model-server/blob/master/docs/configuration.md#environment-variables Though I am not sure whether it's possible for WorkerLifeCycle and going to delegate this question to MMS team.
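As a rough sketch, one way to get such environment variables into the serving container through the SageMaker Python SDK is the `env` argument of `Model`. The image URI, model artifact, role, and the specific MMS variable name/value below are placeholders, not confirmed settings; check the MMS configuration doc for the variables it actually honors:

```python
from sagemaker.model import Model

# Environment variables set here are exported inside the serving container,
# where MMS can pick them up per the configuration doc linked above.
model = Model(
    image_uri="<your-inference-image>",        # placeholder image URI
    model_data="s3://my-bucket/model.tar.gz",  # placeholder model artifact
    role="<execution-role-arn>",               # placeholder IAM role
    env={
        # Placeholder name/value for illustration only.
        "MMS_DEFAULT_WORKERS_PER_MODEL": "2",
    },
)
```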
Thank you for the suggestion!
Hi, we are still working through the migration/separation plan for the DLC and SageMaker toolkits that has been in progress over the last few months. We have already cleaned up tests 10...
Did you give it a try? Hostname + port should work.
Here is documentation on how to do this using sagemaker-python-sdk: https://github.com/aws/sagemaker-python-sdk/blob/master/src/sagemaker/tensorflow/deploying_python.rst#overriding-input-preprocessing-with-an-input_fn
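For quick reference, a minimal sketch of such an override might look like the following. The exact signature and file layout are defined in the linked doc; the JSON handling and the `{"inputs": ...}` payload shape here are only illustrative assumptions:

```python
import json

def input_fn(serialized_data, content_type):
    """Deserialize the request body before it is passed to the model.

    Assumes JSON input; see the linked doc for the exact contract the
    container expects.
    """
    if content_type == "application/json":
        data = json.loads(serialized_data)
        # Illustrative preprocessing: the model is assumed to expect a
        # {"inputs": [...]} payload.
        return {"inputs": data}
    raise ValueError(f"Unsupported content type: {content_type}")
```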