Can't see evaluation metrics in new SageMaker Studio UI after calling model.register

Open francisco-parrilla-lner opened this issue 1 year ago • 4 comments

Describe the bug
I have the following code, inside a SageMaker pipeline, for registering a trained model to the model registry.

from sagemaker.model_metrics import MetricsSource, ModelMetrics
from sagemaker.tensorflow import TensorFlowModel
from sagemaker.workflow.functions import Join
from sagemaker.workflow.model_step import ModelStep

# Regression: point ModelMetrics at the evaluation.json written by the evaluation step
model_metrics = ModelMetrics(
    model_statistics=MetricsSource(
        s3_uri=Join(
            on="/",
            values=[
                step_evaluate_regressor.properties.ProcessingOutputConfig.Outputs["evaluation"].S3Output.S3Uri,
                "evaluation.json",
            ],
        ),
        content_type="application/json",
    )
)

# Model built from the training step's artifacts, using the custom training image
regressor_model = TensorFlowModel(
    model_data=step_train_regressor.properties.ModelArtifacts.S3ModelArtifacts,
    sagemaker_session=pipeline_session,
    role=role_pipeline,
    image_uri=training_image_uri,
)

# Register the model package with the metrics attached
regressor_register_args = regressor_model.register(
    inference_instances=["ml.m5.xlarge"],
    transform_instances=["ml.m5.xlarge"],
    model_package_group_name=regressor_model_package_name,
    model_metrics=model_metrics,
)

step_register_regressor = ModelStep(
    name="RegressorRegisterModel", step_args=regressor_register_args
)

However, when looking at the new SageMaker Studio UI (not the classic one), I cannot see the model metrics associated with the model package group version:

(screenshot)

My evaluation.json file looks like this:

{"metrics": {"acc": {"value": 0.2973}, "precision": {"value": 0.8219}, "recall": {"value": 0.3648}, "f1": {"value": 0.4735}, "crossentropy": {"value": 1.3566}, "acc_mc": {"value": 0.3109}, "precision_mc": {"value": 0.8253}, "recall_mc": {"value": 0.3832}, "f1_mc": {"value": 0.492}, "crossentropy_mc": {"value": 1.5692}}}

If I switch to classic SageMaker Studio, I can see the metrics in the UI:

(screenshot)

Has there been a change in how .register() works with respect to model metrics?

To reproduce
A clear, step-by-step set of instructions to reproduce the bug. The provided code needs to be complete and runnable; if additional data is needed, please include it in the issue.

Expected behavior
I want the metrics from my evaluation file to appear in the new SageMaker Studio UI as they do in the classic one.

Screenshots or logs
If applicable, add screenshots or logs to help explain your problem.

System information
A description of your system. Please provide:

  • SageMaker Python SDK version: Latest (2.222.1)
  • Framework name (e.g. PyTorch) or algorithm (e.g. KMeans): training and evaluation use the TensorFlow framework with a custom image.
  • Framework version: for the processor, framework_version="0.0"; if I don't supply one, it fails.
  • Python version: 3.10.14
  • CPU or GPU: CPU
  • Custom Docker image (Y/N): Y

Additional context
Add any other context about the problem here.

francisco-parrilla-lner avatar Jun 12 '24 18:06 francisco-parrilla-lner

I am facing a similar issue. If I register the model manually from the UI, I can see the metrics, but not if I pass the model metrics to the .register() function.

karppmik avatar Jun 17 '24 10:06 karppmik

I'm also seeing this issue. The metrics are there if I flip back to the classic UI, but not in the new one. I'm not sure if I'm doing something wrong or if it's just a bug.

nbeshouri avatar Jun 17 '24 18:06 nbeshouri

To be honest, I think the new UI expects a different structure for the evaluation JSON file. I tried to manually upload a JSON file in the new UI, but I got an error saying the schema is not correct. So I'm not sure if the model.register API call is "failing" to log the metrics in the new UI because of this. There is very little documentation on this: https://docs.aws.amazon.com/sagemaker/latest/dg/model-registry-details.html - under Model Package model card schema.

It would be good if there were an example from the SageMaker SDK perspective.
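For comparison, here is a minimal sketch of the model quality report structure that SageMaker's own evaluation tooling (Model Monitor / Clarify model quality jobs) writes, in case the new UI expects that format rather than a flat "metrics" object. The metric names and the standard_deviation field are assumptions taken from that report format, not something confirmed against the new UI:

import json

# Hedged sketch: a model-quality-report-style evaluation.json (assumption, not
# confirmed to be what the new Studio UI parses). Metric names and values are
# illustrative only.
report = {
    "multiclass_classification_metrics": {
        "accuracy": {"value": 0.2973, "standard_deviation": "NaN"},
        "weighted_f1": {"value": 0.4735, "standard_deviation": "NaN"},
    }
}

with open("evaluation.json", "w") as f:
    json.dump(report, f)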

francisco-parrilla-lner avatar Jun 17 '24 19:06 francisco-parrilla-lner

Yeah, it feels like they want something different with the new UI.

I reached out to AWS support and they basically told me ModelMetrics was still the correct way to do it and that the new UI just didn't support the model quality stuff yet, but I still feel like I'm doing something wrong. It seems like such a fundamental and simple thing that they wouldn't have released the new UI without it, and there's a whole "Evaluate" section in the new UI.

nbeshouri avatar Jun 18 '24 16:06 nbeshouri

@nbeshouri it seems the new UI now supports metrics, and they are being shown under the Evaluate section. See attached image:

(screenshot)

I didn't change anything in my code.

francisco-parrilla-lner avatar Jul 05 '24 11:07 francisco-parrilla-lner

I get "The specified key does not exist." in the Evaluate panel after a fresh model retraining. The evaluation.json file, of course, exists on S3.

j-adamczyk avatar Jul 08 '24 08:07 j-adamczyk

Have you included the name of the file like this?

s3_uri = Join(
    on="/",
    values=[
        step_evaluate_regressor.properties.ProcessingOutputConfig.Outputs["evaluation"].S3Output.S3Uri,
        "evaluation.json",
    ],
),

I previously had it like this, but the step_evaluate_regressor... URI doesn't include the evaluation.json bit:

s3_uri = step_evaluate_regressor.properties.ProcessingOutputConfig.Outputs["evaluation"].S3Output.S3Uri

francisco-parrilla-lner avatar Jul 08 '24 09:07 francisco-parrilla-lner

@francisco-parrilla-lner Thanks for the heads up! I'm able to see model metrics now too and didn't change anything either.

nbeshouri avatar Jul 08 '24 17:07 nbeshouri

The new Studio model UI now supports showing registered metrics, along with metrics added to the ModelCardContent field.
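For anyone who wants to try the ModelCardContent route, here is a rough boto3 sketch of attaching metrics via the model package's ModelCard field. The evaluation_details / metric_groups layout is an assumption based on the SageMaker model card schema, and the model package ARN is a placeholder, so treat this as a starting point rather than a confirmed recipe:

import json
import boto3

sm = boto3.client("sagemaker")

# Hedged sketch: metric layout follows the SageMaker model card schema as I
# understand it (evaluation_details -> metric_groups -> metric_data); the
# model package ARN below is a placeholder.
model_card_content = {
    "evaluation_details": [
        {
            "name": "regressor-evaluation",
            "metric_groups": [
                {
                    "name": "metrics",
                    "metric_data": [
                        {"name": "acc", "type": "number", "value": 0.2973},
                        {"name": "f1", "type": "number", "value": 0.4735},
                    ],
                }
            ],
        }
    ]
}

sm.update_model_package(
    ModelPackageArn="arn:aws:sagemaker:<region>:<account>:model-package/<group>/<version>",
    ModelCard={
        "ModelCardContent": json.dumps(model_card_content),
        "ModelCardStatus": "Draft",
    },
)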

saumitravikram avatar Jul 12 '24 04:07 saumitravikram