diffusers
Get error 'Can only serialize PIL.Image.Image, got <class 'dict'>'
Describe the bug
I am trying to use a model hosted on Hugging Face and have created an Inference Endpoint for it. I have attached the code I use to call the endpoint and generate images. However, after the image is generated I get back a base64 string which, when decoded, reads: 'Can only serialize PIL.Image.Image, got <class 'dict'>'
I get the following on the endpoint console as well:
INFO | POST / | Duration: 43398.34 ms
2024/04/25 23:29:01 ~ 2024-04-25 17:59:01,559 | ERROR | Can only serialize PIL.Image.Image, got <class 'dict'>
I'm new to hosting and using models so any help is appreciated. Model I'm trying to use is: https://huggingface.co/Linaqruf/animagine-xl
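The decode step that revealed the error message looks like this (the payload string below is a stand-in for the actual response body, which I can't reproduce here):

```python
import base64

# Stand-in for the string returned in the endpoint's "image" field
payload = base64.b64encode(
    b"Can only serialize PIL.Image.Image, got <class 'dict'>"
).decode()

# Decoding yields the error text instead of JPEG bytes
decoded = base64.b64decode(payload).decode()
print(decoded)
```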
Reproduction
from typing import Any, Dict

import torch
from torch import autocast
from diffusers import DiffusionPipeline
import base64
from io import BytesIO

# set device
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
if device.type != 'cuda':
    raise ValueError("need to run on GPU")

class EndpointHandler():
    def __init__(self, path=""):
        # load the optimized model
        self.pipe = DiffusionPipeline.from_pretrained(path, torch_dtype=torch.float16, use_safetensors=True)
        self.pipe = self.pipe.to(device)

    def __call__(self, data: Any) -> Dict[str, str]:
        """
        Args:
            data (:obj:`dict`):
                includes the input data and the parameters for the inference.
        Return:
            A :obj:`dict` with the base64-encoded image.
        """
        inputs = data.pop("inputs", data)
        # run inference pipeline
        with autocast(device.type):
            image = self.pipe(inputs, guidance_scale=7.5).images[0]
        # encode image as base64
        buffered = BytesIO()
        image.save(buffered, format="JPEG")
        img_str = base64.b64encode(buffered.getvalue())
        print(img_str)
        # postprocess the prediction
        return {"image": img_str.decode()}
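The encode/decode round trip in `__call__` can be checked locally without the pipeline, using a plain Pillow image in place of the model output (assumption: Pillow is installed):

```python
import base64
from io import BytesIO
from PIL import Image

# Solid test image standing in for self.pipe(...).images[0]
image = Image.new("RGB", (64, 64), color="red")

# Same encoding steps as in the handler
buffered = BytesIO()
image.save(buffered, format="JPEG")
img_str = base64.b64encode(buffered.getvalue()).decode()

# Decoding the string should recover a JPEG of the same size
restored = Image.open(BytesIO(base64.b64decode(img_str)))
print(restored.size)
```

This confirms the base64 handling itself is fine; the error seems to come from the endpoint side rather than from this encoding step.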
Logs
INFO | POST / | Duration: 43398.34 ms
2024/04/25 23:29:01 ~ 2024-04-25 17:59:01,559 | ERROR | Can only serialize PIL.Image.Image, got <class 'dict'>
System Info
Hugging Face Inference Endpoint
Who can help?
No response