zhixin
Thanks!

> This is not a bug, but go to your snippet file (for example through settings).
>
> Here you can remove the line: `{trigger: "set", replacement: "\\{ $0...
If you delete these two properties manually, nothing seems to go wrong. So it is really strange to me that these completely redundant items render by default.
Same issue here, on a ThinkPad X1C 2020. Maybe it is specific to the ThinkPad.
The latest version, 0.8.8, also has this problem.
@Kallinteris-Andreas Thanks for the reply, but that is not quite satisfying. I am sorry if I didn't explain it clearly. My episode length is not a constant. What I want it...
@pseudo-rnd-thoughts Hi, I have implemented a workaround. In short, if you have 16 vectorized envs and know the average episode length, say 20, then: 1. Choose a proper larger batch...
The implementation is not complicated and fairly decoupled. Say you have collected a batch of data with shape `(num_steps, num_env, single_data_len)`:

```python
obs = torch.zeros((args.num_steps, args.num_envs) + envs.single_observation_space.shape).to(device)
actions...
```
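For context, here is a minimal, self-contained sketch of that buffer layout, assuming CleanRL-style PPO variable names (`args`, `envs`, `device`) and a Gymnasium vector env; the specific environment, hyperparameters, and buffer fields are illustrative, not the original code:

```python
import gymnasium as gym
import torch
from types import SimpleNamespace

# Hypothetical hyperparameters; the original code would take these from argparse as `args`.
args = SimpleNamespace(num_steps=128, num_envs=16)
envs = gym.vector.SyncVectorEnv([lambda: gym.make("CartPole-v1") for _ in range(args.num_envs)])
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Rollout buffers shaped (num_steps, num_envs, *feature_dims).
obs = torch.zeros((args.num_steps, args.num_envs) + envs.single_observation_space.shape).to(device)
actions = torch.zeros((args.num_steps, args.num_envs) + envs.single_action_space.shape).to(device)
logprobs = torch.zeros((args.num_steps, args.num_envs)).to(device)
rewards = torch.zeros((args.num_steps, args.num_envs)).to(device)
dones = torch.zeros((args.num_steps, args.num_envs)).to(device)
values = torch.zeros((args.num_steps, args.num_envs)).to(device)

# After the rollout, the (num_steps, num_envs) leading dims are typically
# flattened into one batch dimension before the update step:
b_obs = obs.reshape((-1,) + envs.single_observation_space.shape)
b_actions = actions.reshape((-1,) + envs.single_action_space.shape)
```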
> Sounds like a rollover issue (i.e. a hardware issue). Does caps+shift+x get detected when keyd is not running? You can also check this with keyd monitor.
> […]

Thanks for...
> Are you looking for something like [this](https://docs.kidger.site/jaxtyping/api/pytree/#path-dependent-shapes)?
>
> This also works by creating a type alias for `T` in the example above, with
>
> from typing import...
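For reference, a minimal sketch of the kind of annotation the linked path-dependent-shapes docs describe, assuming jaxtyping's `PyTree`/`Float` annotations and its `?`-prefixed axis syntax; the function name and axis names below are illustrative, not taken from this thread:

```python
from jaxtyping import Array, Float, PyTree

# Hypothetical example: both arguments must share the same pytree structure "T".
# The "?"-prefixed "batch" axis may differ from leaf to leaf, but corresponding
# leaves of x and y must agree on it; the remaining named axes are checked as usual.
def f(
    x: PyTree[Float[Array, "?batch foo"], "T"],
    y: PyTree[Float[Array, "?batch bar"], "T"],
) -> None:
    ...
```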