write_image (via kaleido) visually distorts large square heatmaps through aliasing artifacts
I have a relatively large matrix that I want to plot using graph_objects.Heatmap. The matrix has some structure that should be immediately apparent to the viewer (i.e. it has a constant diagonal and is symmetric). However, due to visual artifacts, these properties appear to be violated in the image produced by write_image (with the kaleido backend; I haven't tested orca).
It was difficult to write a simple example script that reproduces these distortions, as variations in both the matrix size and the output image size affect the issue significantly (in both intensity and character). In any case, it looks like an aliasing artifact in the selection of heatmap cell data that results in a non-bijective map between the heatmap cells and regions within the (vector) output image.
In the reproduction script below, one output format is PDF to rule out image compression as a culprit (there the heatmap appears to be stored as a vector image, since its cells remain sharp when zoomed in). However, the issue can also be reproduced when the output format is PNG. I assume that in the vector-output case, the heatmap data is (badly) rescaled like a bitmap image whose pixels are then re-interpreted as an array of square cells, even though those cells no longer match the heatmap/data cells.
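To make the suspected mechanism concrete, here is a toy model of nearest-neighbour resampling (my guess at the mechanism, not kaleido's actual implementation; the ~370 px plot width is an assumption accounting for the colorbar):

import numpy as np

# Assign each output pixel column the heatmap cell whose horizontal extent
# contains the pixel centre (nearest-neighbour sampling).
def cell_indices(n_cells, n_pixels):
    return np.floor((np.arange(n_pixels) + 0.5) * n_cells / n_pixels).astype(int)

# Upscaling case (200 cells into ~370 px of plot width): cells end up
# 1 or 2 pixels wide, which reads as a mosaic.
widths = np.bincount(cell_indices(200, 370))
print(np.unique(widths))  # -> [1 2]

# Downscaling case (400 cells into ~370 px): some cells are never sampled
# at all, which is how individual diagonal cells can go missing.
sampled = np.unique(cell_indices(400, 370))
print(400 - sampled.size)  # -> number of cells skipped entirely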
#!/usr/bin/env python3
import numpy as np
import plotly.graph_objects as go
import plotly.io as pio

resolutions = [
    # Gives very unequal cell sizes, making the result look like a mosaic:
    (200, 400, 400),
    # Looks almost correct, but some diagonal elements show bad values:
    (400, 400, 400),
]

for n, width, height in resolutions:
    # Produce a random matrix with key properties (symmetric, constant
    # diagonal) matching my actual data.
    np.random.seed(1)
    A = np.random.random(size=(n, n))
    A += A.T
    np.fill_diagonal(A, -1)

    # Plot the matrix as a square heatmap.
    fig = go.Figure()
    fig.add_trace(go.Heatmap(z=A))
    fig.update_layout(
        autosize=False,
        width=width,
        height=height,
        margin=dict(b=0, l=0, r=0, t=0),
        yaxis=dict(scaleanchor="x", constrain="domain"),
    )

    for ext in ("pdf", "png"):
        pio.write_image(fig, f"n={n}_{width}x{height}.{ext}")
The outcomes are:

PDF: [attached: n=200_400x400.pdf, n=400_400x400.pdf]
PNG (note in the first image that only the heatmap, not the text, looks blurred): [images attached]
- For n = 200 and a layout size of 400x400, many adjacent rows and columns appear to use the same heatmap data. As a result, the diagonal appears to be composed of alternating small and large blue cells. On closer inspection, there are two locations where a doubled row and a doubled column meet off the diagonal, so the result looks asymmetric (a programmatic check is sketched after this list).
- For n = 400 and a layout size of 400x400, the result looks mostly correct, but some main-diagonal cells are not blue, i.e. they were not sampled from the data's main diagonal.
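The duplicated rows and columns in the first case can also be verified programmatically rather than by eye. A quick check (a sketch; it requires Pillow and the PNG written by the script above, and the crop bound for the colorbar is approximate):

import numpy as np
from PIL import Image

# Load the rendered PNG and crop away the colorbar on the right (the crop
# bound is approximate) so only the heatmap area is compared.
img = np.asarray(Image.open("n=200_400x400.png"))[:, :340]

# Count pairs of adjacent pixel columns that are byte-identical. With random
# data and a bijective cell-to-pixel map there should be (almost) none;
# every duplicate indicates a heatmap column that was sampled twice.
dup_cols = np.sum(np.all(img[:, 1:] == img[:, :-1], axis=(0, 2)))
print(f"{dup_cols} duplicated adjacent pixel columns")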
Assuming my understanding of the problem is correct, I would suggest:
- For bitmap output, use a better interpolation method to scale the heatmap image to its output size.
- For vector output, if possible, do not take the middle route of rescaling a bitmap image; instead, rescale the vector data itself. (Until then, a possible user-side workaround is sketched below.)
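In the meantime, a possible user-side workaround (a sketch, under the assumption that the artifact comes from nearest-neighbour resampling): replicate every cell into a k x k block with np.kron before plotting. This cannot make all cells exactly the same pixel width unless the pixel width divides evenly, but it should guarantee that every data cell is sampled at least once, so the diagonal survives:

import numpy as np
import plotly.graph_objects as go
import plotly.io as pio

n, width, height = 200, 400, 400
np.random.seed(1)
A = np.random.random(size=(n, n))
A += A.T
np.fill_diagonal(A, -1)

# Replicate every cell into a k x k block so that, after resampling, each
# original cell covers several candidate pixels and can no longer be
# skipped; k is chosen so n * k comfortably exceeds the plot's pixel width.
k = 4
A_up = np.kron(A, np.ones((k, k)))

fig = go.Figure(go.Heatmap(z=A_up))
fig.update_layout(
    autosize=False,
    width=width,
    height=height,
    margin=dict(b=0, l=0, r=0, t=0),
    yaxis=dict(scaleanchor="x", constrain="domain"),
)
pio.write_image(fig, f"n={n}_upsampled_{width}x{height}.pdf")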
Here is a simpler example that I think is related to the problem.
The image shown in the notebook and the exported .png version are perfectly crisp. However, the .svg and .pdf versions are heavily blurred (an antialiasing filter?). I tried with both the orca and the kaleido engines; the problem is the same.
import plotly.express as px
fig = px.imshow([[0, 1], [2, 3]])
fig.write_image("heatmap.pdf")
fig.show()
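Not a fix, but something that may be worth trying: write_image accepts a scale argument that multiplies the rendering resolution, so the raster that kaleido embeds in the PDF has more pixels per heatmap cell and any smoothing blurs a much smaller fraction of each cell (untested against this exact case):

import plotly.express as px

fig = px.imshow([[0, 1], [2, 3]])
# scale multiplies the export resolution; whether it helps here depends on
# how kaleido embeds the raster into the vector file.
fig.write_image("heatmap_scaled.pdf", scale=8)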
Versions:
macOS 13.4.1
Python 3.10.12
plotly 5.15.0
plotly-orca 4.1.4
python-kaleido 0.2.1
I am facing similar issues: when saving a heatmap plotted with px.imshow as PNG, the figure is fine, but when saving it as PDF, the heatmap (not the text) is blurred.
Versions: macOS 13.2, Python 3.11.5, plotly 5.9.0, kaleido 0.2.1
As PNG: [image attached]
As PDF: [image attached]
I also encountered the same problem. Is there any follow-up on this issue?