Anas Awadalla
Use transformers instead of LSTMs to boost text classification performance
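A minimal sketch of what this could look like in tf.keras: a single self-attention block replacing the LSTM encoder in a text classifier. The function name and all hyperparameters are illustrative assumptions, not settings from this project.

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_transformer_classifier(vocab_size=20000, maxlen=128,
                                 embed_dim=64, num_heads=2, num_classes=2):
    # Integer token ids in, class probabilities out.
    inputs = layers.Input(shape=(maxlen,), dtype="int32")
    # Positional embeddings are omitted here for brevity.
    x = layers.Embedding(vocab_size, embed_dim)(inputs)
    # One self-attention block with residual connections, in place of an LSTM.
    attn = layers.MultiHeadAttention(num_heads=num_heads, key_dim=embed_dim)(x, x)
    x = layers.LayerNormalization()(x + attn)
    ff = layers.Dense(embed_dim * 2, activation="relu")(x)
    ff = layers.Dense(embed_dim)(ff)
    x = layers.LayerNormalization()(x + ff)
    # Pool over the sequence dimension, then classify.
    x = layers.GlobalAveragePooling1D()(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Stacking a few of these blocks (and adding positional embeddings) typically closes most of the gap to full transformer encoders on classification tasks.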
Add an information retrieval (IR) query to the NLP queries. Stick to TF/Keras.
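To make the scope of an IR query concrete, here is a minimal sketch of ranking documents against a query with TF-IDF cosine similarity. It is pure Python for illustration only; the actual feature would presumably use the TF/Keras stack, and all function names here are hypothetical.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Return one sparse TF-IDF vector (a dict) per whitespace-tokenized doc."""
    tokenized = [d.lower().split() for d in docs]
    n = len(tokenized)
    # Document frequency: in how many docs does each term appear?
    df = Counter(t for doc in tokenized for t in set(doc))
    vecs = []
    for doc in tokenized:
        tf = Counter(doc)
        vecs.append({t: (c / len(doc)) * math.log(n / df[t])
                     for t, c in tf.items()})
    return vecs

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs):
    """Return document indices ranked by TF-IDF similarity to the query."""
    vecs = tfidf_vectors(docs + [query])  # vectorize query in the same space
    qvec, dvecs = vecs[-1], vecs[:-1]
    return sorted(range(len(docs)),
                  key=lambda i: cosine(qvec, dvecs[i]), reverse=True)
```

A neural version of the same interface would swap the TF-IDF vectors for learned sentence embeddings.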
Add a Q&A query for NLP. Try to stick to Keras + TF for consistency and use a fine-tuning approach (Hugging Face could be a great place to go)
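A hedged sketch of the fine-tuning approach this suggests, using Hugging Face's TF model classes inside Keras. The model name, learning rate, and dataset format are assumptions, not choices from this project; nothing is downloaded until the function is actually called.

```python
def finetune_qa(train_dataset, model_name="distilbert-base-uncased",
                epochs=1, lr=3e-5):
    """Fine-tune an extractive Q&A model.

    train_dataset: a batched tf.data.Dataset of tokenized features that
    include 'start_positions' and 'end_positions' labels (assumed format).
    """
    # Imports are deferred so merely defining this sketch pulls in nothing.
    import tensorflow as tf
    from transformers import AutoTokenizer, TFAutoModelForQuestionAnswering

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = TFAutoModelForQuestionAnswering.from_pretrained(model_name)
    # Hugging Face TF models compute their own loss when labels are passed,
    # so the model can be compiled without an explicit loss.
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr))
    model.fit(train_dataset, epochs=epochs)
    return model, tokenizer
```

Since the returned model is a Keras model, it slots into the same save/restore and training conventions as the rest of the TF-based queries.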
Allow the user to directly use pre-trained models without fine-tuning.
We need to add a way for users to visualize their data depending on the task at hand.
Add the ability to restore saved models for all NLP queries by passing in the saved model path
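A minimal sketch of what the restore path could look like, assuming models are saved with Keras's standard `model.save()`. The helper name `load_saved_model` is hypothetical.

```python
import os
import tensorflow as tf

def load_saved_model(model_path):
    """Restore a previously saved Keras model from a user-supplied path."""
    if not os.path.exists(model_path):
        raise FileNotFoundError(f"No saved model found at: {model_path}")
    # load_model handles both the SavedModel directory format and
    # single-file .h5 / .keras checkpoints.
    return tf.keras.models.load_model(model_path)
```

Each NLP query could then accept an optional `model_path` argument and call this helper instead of training from scratch when the path is provided.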
@anas-awadalla Thanks for the quick reply. Taking your training command as an example, how can I change it so that it trains only on LAION-2B, starting from a pre-trained OPT-1.3B?...