Google Colab crashes for large datasets
Hi,
I would like to use 'lonboard' in Google Colab to visualize a GeoDataFrame ('gdf'). The code runs without any errors, but no output is displayed, even though it works on my local PC. What could be the issue?
Can you ensure that you don't have version 0.9.0 installed, which was broken and yanked?
Otherwise, everything is expected to work in Colab.
Can you look in your browser console for any errors? (e.g. Cmd+Option+I on a Mac)
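To rule out the yanked 0.9.0 release, one quick check is to print the installed version from inside the Colab session. This is just a generic sketch using the standard library; `installed_version` is a hypothetical helper, not part of lonboard:

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg: str):
    """Return the installed version string for pkg, or None if it isn't installed."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

# In a Colab cell: confirm you are not on the broken 0.9.0 release.
print(installed_version("lonboard"))
```

If this prints `0.9.0`, upgrading with `pip install -U lonboard` and restarting the runtime should be the first thing to try.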
I'm facing a similar issue. No error, but the map doesn't show. Here's a link to the Colab notebook: https://colab.research.google.com/drive/1A-6eopcEH8XlXH2IVWzmitSjxALxe8iz?usp=sharing
The code I'm trying to run is from this Overture demo https://docs.overturemaps.org/blog/2024/05/16/land-cover/
I know from when I originally tested Colab that there were issues with rendering large datasets. Can you see any map at all, even one without data? If so, it's likely an issue on Colab's side, where the connection gets dropped for large datasets.
@Melda-s looking at your notebook, it also crashes for me during the cell that constructs the layer. That looks like the same issue I referenced before, where Colab crashes when trying to send large amounts of data out of the Python session. At its core, this is a Colab issue that's hard to work around. We could potentially expose a user setting for how big each data chunk sent from Python to the frontend should be, but that's a less-than-ideal workaround.
This isn't really something we can reliably work around. The core issue is on Colab's side: it can't maintain a stable connection when moving large amounts of data from Python to JS.
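For illustration only, the chunked-transfer idea mentioned above might look roughly like this sketch. `chunk_payload` is a hypothetical helper, not part of lonboard's API; the point is just that sending many small messages is less likely to trip Colab's connection limits than one huge one:

```python
def chunk_payload(payload: bytes, chunk_size: int) -> list:
    """Split a serialized payload into fixed-size chunks.

    Streaming smaller chunks from Python to the browser frontend,
    rather than one large buffer, is the kind of workaround the
    chunk-size setting would control (hypothetical sketch).
    """
    if chunk_size <= 0:
        raise ValueError("chunk_size must be positive")
    return [payload[i:i + chunk_size] for i in range(0, len(payload), chunk_size)]

# e.g. a 10 MB payload split into 1 MB chunks for incremental transfer
chunks = chunk_payload(b"\x00" * 10_000_000, 1_000_000)
```

Even with a knob like this, it would only mitigate the symptom; the dropped connection itself is still Colab's behavior.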