[Feature]: Full attention block support
Prerequisites
- [X] I have tried updating UtilsRL to the newest version.
- [X] I have checked both open and closed issues but found nothing related to my request.
UtilsRL version when proposing this request
0.3.13
What I am expecting
Add an attention block as a basic network structure to UtilsRL. I am not sure about its design yet; a rough sketch of what such a block could look like is below.
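For reference, here is a minimal sketch of a standard pre-LayerNorm residual self-attention block in plain PyTorch. This is only an illustration of the kind of module being requested; the class name, defaults, and overall design are my assumptions, not an existing UtilsRL API.

```python
import torch
import torch.nn as nn

class AttentionBlock(nn.Module):
    """Illustrative pre-LN self-attention block (not UtilsRL's actual design)."""

    def __init__(self, embed_dim: int, num_heads: int = 4, dropout: float = 0.0):
        super().__init__()
        self.ln1 = nn.LayerNorm(embed_dim)
        self.attn = nn.MultiheadAttention(
            embed_dim, num_heads, dropout=dropout, batch_first=True
        )
        self.ln2 = nn.LayerNorm(embed_dim)
        self.ffn = nn.Sequential(
            nn.Linear(embed_dim, 4 * embed_dim),
            nn.GELU(),
            nn.Linear(4 * embed_dim, embed_dim),
        )

    def forward(self, x: torch.Tensor, attn_mask=None) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim); residual attention, then feed-forward
        h = self.ln1(x)
        attn_out, _ = self.attn(h, h, h, attn_mask=attn_mask, need_weights=False)
        x = x + attn_out
        x = x + self.ffn(self.ln2(x))
        return x
```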
Possible solutions
Maybe implement the attention block as some sort of linear layer, so that existing code can be reused by passing the attention block as linear_type? See the adapter sketch below.
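To make that idea concrete, one possible shape is an adapter that exposes an nn.Linear-like (in_features, out_features) constructor, so the attention block could be passed wherever a linear class is accepted. I have not checked this against UtilsRL's actual MLP/linear_type signature; AttentionLinear, the num_heads default, and the length-1-sequence trick are all hypothetical.

```python
import torch
import torch.nn as nn

class AttentionLinear(nn.Module):
    """Hypothetical adapter: mimics nn.Linear's (in_features, out_features)
    constructor so self-attention can be plugged in as a linear_type."""

    def __init__(self, in_features: int, out_features: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(in_features, num_heads, batch_first=True)
        self.proj = nn.Linear(in_features, out_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Accept (batch, dim) inputs by treating them as length-1 sequences,
        # keeping the module drop-in compatible with nn.Linear call sites.
        squeeze = x.dim() == 2
        if squeeze:
            x = x.unsqueeze(1)
        h, _ = self.attn(x, x, x, need_weights=False)
        h = self.proj(h)
        return h.squeeze(1) if squeeze else h

# Usage: interchangeable with nn.Linear at the call site.
layer = AttentionLinear(64, 32)
y = layer(torch.randn(8, 64))  # -> shape (8, 32)
```

One caveat with this approach: attention over a length-1 sequence degenerates to a gated projection, so the adapter only adds value when the surrounding network feeds it real sequences.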
Any additional messages which might help
No response
Urgency
Urgent; this will bring a significant improvement and should be considered for the next major version.