tools/dataset_generator.py
Hi, I want to produce some RLBench data for my current research. When I run this file I only get some PNG images, with no action data. Most imitation-learning tasks require ground-truth actions, so could you add saving of the actions?
Actions depend on the arm_action_mode. The actions for the JointVelocity and JointPosition arm modes can be found in the observations. For other arm modes, you would have to work out how those actions are generated; I think most of them can be computed from the observations the environment provides.
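As a rough illustration of recovering actions from recorded observations, here is a minimal sketch. The `Observation` class below is a hypothetical stand-in for RLBench's observation objects (which expose fields like `joint_velocities` and `joint_positions` alongside the images); the choice of which timestep's values count as "the action" is an assumption based on the comment above, not guaranteed RLBench behavior.

```python
import numpy as np

# Hypothetical stand-in for an RLBench observation; in a real demo each
# step records joint state alongside the camera images.
class Observation:
    def __init__(self, joint_velocities, joint_positions, gripper_open):
        self.joint_velocities = joint_velocities
        self.joint_positions = joint_positions
        self.gripper_open = gripper_open

def extract_actions(demo, arm_action_mode="joint_velocity"):
    """Recover per-step actions from a list of observations.

    Assumption: the action executed between step t and t+1 is reflected
    in the observation at t+1 (the velocity reached, or the joint
    position the controller was driven toward). This mirrors the note
    above that JointVelocity/JointPosition actions live in the
    observations; other modes would need their own reconstruction.
    """
    if arm_action_mode == "joint_velocity":
        arm = np.stack([o.joint_velocities for o in demo[1:]])
    elif arm_action_mode == "joint_position":
        arm = np.stack([o.joint_positions for o in demo[1:]])
    else:
        raise ValueError(f"unsupported arm_action_mode: {arm_action_mode}")
    # RLBench-style actions append the gripper open/close signal.
    grip = np.array([[o.gripper_open] for o in demo[1:]])
    return np.concatenate([arm, grip], axis=1)
```

For a demo of N observations this yields N-1 actions, one per transition.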
Thanks for your reply. After reading the source code, I think I understand the meaning of the action. One more question: does RLBench provide a range for the data? I want to use the minimum and maximum values of the data to do some discretization.
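For the discretization idea, here is a minimal sketch. The assumption is that no fixed range is published for the actions (for most modes the bounds depend on the robot's joint limits), so the bounds are either supplied explicitly or estimated empirically from the collected demos; the function below and its parameters are illustrative, not part of RLBench.

```python
import numpy as np

def discretize(actions, num_bins=256, low=None, high=None):
    """Map continuous action vectors to per-dimension integer bins.

    If low/high are not given, they are estimated from the data itself,
    which only works if the dataset covers the true action range.
    """
    actions = np.asarray(actions, dtype=np.float64)
    if low is None:
        low = actions.min(axis=0)
    if high is None:
        high = actions.max(axis=0)
    # Avoid division by zero for constant dimensions.
    span = np.where(high > low, high - low, 1.0)
    bins = np.floor((actions - low) / span * num_bins).astype(int)
    return np.clip(bins, 0, num_bins - 1)
```

The inverse mapping (bin index back to a continuous value at the bin center) follows the same bookkeeping with `low + (bins + 0.5) * span / num_bins`.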