Peter Wildeford
See my gist here: https://gist.github.com/peterhurford/0f93bd1ece78457b9a99
Reference implementations (GitHub):
- https://github.com/neptune-ml/kaggle-toxic-starter
- https://github.com/ChenglongChen/Kaggle_HomeDepot
- https://github.com/PavelOstyakov/toxic/tree/master/toxic
- https://github.com/AloneGu/kaggle_spooky/blob/master/prepare_nn_feat_2gram.ipynb

Kaggle kernels:
- https://www.kaggle.com/yekenot/pooled-gru/code
- https://www.kaggle.com/jhoward/improved-lstm-baseline-glove-dropout/notebook
- https://www.kaggle.com/mschumacher/using-fasttext-models-for-robust-embeddings
- https://www.kaggle.com/jhoward/minimal-lstm-nb-svm-baseline-ensemble/notebook
- https://www.kaggle.com/sterby/fasttext-like-baseline-with-keras-lb-0-053
- https://www.kaggle.com/umbertogriffo/combined-cnn-and-pooled-lstm-fasttext/code
- https://www.kaggle.com/knowledgegrappler/embeddings-features-tdf-idf-let-s-party
- https://www.kaggle.com/eashish/bidirectional-lstm-with-convolution
- https://www.kaggle.com/konohayui/bi-gru-cnn-poolings
- https://www.kaggle.com/umbertogriffo/combined-pooled-gru-and-cnn-fasttext/code
- https://www.kaggle.com/rohitanil/lemmatization-and-pooled-gru/code
- https://www.kaggle.com/tunguz/cnn-glove300-3-oof-4-epochs/code
- https://www.kaggle.com/andrewmatte/w2v-into-3-layer-gru-with-relu-output/code
- https://www.kaggle.com/zhbain/pooled-gru-fasttext-6c07c9/code
- https://www.kaggle.com/johnfarrell/tfidf-3layers-mlp-from-mercari/output

Competition discussion threads:
- https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge/discussion/51010#290521
- https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge/discussion/50913#290051
- https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge/discussion/48836#280115
- https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge/discussion/47964
- https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge/discussion/48038
- https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge/discussion/48676
- https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge/discussion/49950#288921
- https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge/discussion/50888#290478
- https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge/discussion/50350#287297

Pretrained embeddings:
- https://github.com/facebookresearch/fastText/blob/master/pretrained-vectors.md
- https://github.com/epfml/sent2vec...
V2.2
http://hyperopt.github.io/hyperopt/ https://github.com/eignatenkov/vowpal_wabbit/blob/f2faf9392933681bf5c8722f8190c624e3ecf40f/utl/vw-hyperopt.py
Use `deploy=True` to automatically deploy a model:

```
run(logistic_regression(name='Titanic',   # Gives a name to the model file.
                        passes=3,         # How many online passes to do.
                        quadratic='ff',   # Generates automatic quadratic...
```
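For context, here is a hedged sketch of what a full call with the deploy flag might look like. The import path and the placement of `deploy=True` are assumptions, not taken from the snippet above:

```python
# Hypothetical sketch only: the import path and option placement are assumptions.
from vowpal_platypus import run, logistic_regression

run(logistic_regression(name='Titanic',   # Gives a name to the model file.
                        passes=3,         # How many online passes to do.
                        quadratic='ff',   # Quadratic interactions within the 'f' namespace.
                        deploy=True))     # Automatically deploy the trained model.
```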
If the model name contains a space character, the underlying VW process will crash with `option '--data' cannot be specified more than once`. This is because the command ends up...
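A minimal sketch (not the library's actual command builder) of how an unquoted space in the model name splits the command line into extra tokens:

```python
import shlex

# Hypothetical command string, assuming the model name is interpolated
# unquoted into the VW invocation.
cmd = "vw --data train.dat -f My Titanic.model --passes 3"

print(shlex.split(cmd))
# ['vw', '--data', 'train.dat', '-f', 'My', 'Titanic.model', '--passes', '3']
# 'Titanic.model' becomes a stray positional argument, which VW presumably
# treats as another data file -- hence "option '--data' cannot be specified
# more than once". Quoting the path or stripping spaces from the name avoids it.
```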
This happens because when VP crashes, the VW spanning tree cluster never spins down, so future clusters are initialized against the pre-existing spanning tree, which causes problems....
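One workaround, sketched below under the assumption that the stale process is the standard VW `spanning_tree` daemon, is to kill any leftover daemon before starting a new cluster:

```python
import subprocess

def kill_stale_spanning_tree():
    """Kill any leftover VW spanning_tree daemon from a crashed run.

    Assumes the cluster was started with the standard `spanning_tree`
    binary that ships with Vowpal Wabbit; adjust the process name if
    your setup launches it differently.
    """
    # pkill returns 0 if it killed something, 1 if no process matched.
    result = subprocess.run(["pkill", "-f", "spanning_tree"])
    if result.returncode == 0:
        print("Killed a stale spanning_tree process.")
    else:
        print("No stale spanning_tree process found.")

kill_stale_spanning_tree()
```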