
73 LWM issues, sorted by recently updated

It would be worthwhile to provide the measured memory requirements for Text Model inference at 32K, 128K, 256K, 512K, and 1M token context windows, in both PyTorch and JAX.
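Until measured numbers are published, a rough lower bound can be sketched from the KV cache alone. This is a back-of-the-envelope estimate, not a measurement: it assumes a Llama-2-7B-like configuration (32 layers, 32 KV heads, head dim 128, bf16, batch size 1) and ignores weights, activations, and framework overhead, all of which add more.

```python
# Hedged KV-cache estimate; the config constants are assumptions,
# not confirmed LWM inference settings.
LAYERS, KV_HEADS, HEAD_DIM, BYTES_PER_PARAM = 32, 32, 128, 2  # bf16

def kv_cache_gib(context_tokens: int) -> float:
    # Keys and values (factor 2), one vector per layer per head per token.
    total_bytes = 2 * LAYERS * KV_HEADS * HEAD_DIM * BYTES_PER_PARAM * context_tokens
    return total_bytes / 1024**3

for ctx in (32_768, 131_072, 262_144, 524_288, 1_048_576):
    print(f"{ctx:>9} tokens ≈ {kv_cache_gib(ctx):6.1f} GiB KV cache")
```

Under these assumptions the cache alone grows linearly from 16 GiB at 32K tokens to 512 GiB at 1M tokens, which is why measured per-framework numbers (and any sharding/offload strategy used) would be so useful.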

Hi, I'm trying to run `run_vision_chat.sh` but I'm getting the following error: ``` (lwm) minyoung@claw2:~/Projects/LWM$ bash scripts/run_vision_chat.sh I0215 18:19:20.605390 140230836105600 xla_bridge.py:689] Unable to initialize backend 'rocm': NOT_FOUND: Could not find registered...

I only got the vision JAX model to work with images, and even then I had to remove the mesh_grid arg. Everything else has failed. E.g., needle fails like: ``` #! /bin/bash export...

Do you have any plans on creating safetensors for the models?

Thank you for your contribution! Amazing performance! I'm just wondering about the computational requirements for training such world models, e.g., how many GPUs did you need and how long did training take?

Hi, great work on LWM! I noticed the weights are licensed under the Apache license but are derived from Llama 2; do both the Llama 2 license and the Apache license...

Hi, LWM is incredible! Any plans to release a Mistral version? Thanks!

I am trying to run the `run_sample_video.sh` file from the scripts folder, but I am running into a lot of dependency issues on a Mac M1. Has anyone...

Hi, thanks for releasing the code. Looks pretty interesting! I noticed that the LWM-Chat (multimodal) model checkpoint is only released in JAX. It would be great if you could release...

Hello there! Architecture innovator here! Everything preceding my model seems very inefficient.