Travis Bartley

64 comments

Still catching up on the PR; useful code may be how we implement transducers for ASR. (We need to mini-batch for the grad accumulation since it's such a memory hog.)...
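
A minimal sketch of what the mini-batched gradient accumulation amounts to, not the PR's code; `model(chunk)` returning a scalar loss is an assumption made here for illustration:

```python
import torch

def accumulated_step(model, optimizer, batch, num_chunks=4):
    """Accumulate gradients over smaller chunks of one large batch.

    Illustrative only: assumes `model(chunk)` returns a scalar loss.
    """
    optimizer.zero_grad()
    for chunk in torch.chunk(batch, num_chunks):
        # Rescale so the accumulated gradient matches the full-batch loss.
        loss = model(chunk) / num_chunks
        # Backward frees this chunk's graph before the next forward pass,
        # which is what keeps peak memory down.
        loss.backward()
    optimizer.step()
```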

Hmm, thinking about this. I think this may be a case of needing to add a wrapper around the default PTL trainer instead of stashing it in `train.py` or subclassing. This...
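
A rough sketch of the wrapper idea, assuming PTL here is PyTorch Lightning; `get_trainer` and its defaults are made-up names, not the actual implementation:

```python
import pytorch_lightning as pl

def get_trainer(accumulate_grad_batches: int = 1, **kwargs) -> pl.Trainer:
    """Thin wrapper over the default PTL trainer.

    Keeps project-wide trainer defaults in one place instead of stashing
    them in train.py or subclassing pl.Trainer. Names are illustrative.
    """
    return pl.Trainer(
        accumulate_grad_batches=accumulate_grad_batches,
        **kwargs,
    )
```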

> This is pretty cool! Right now my comments are mostly about naming and style, and all quite minor. I would like Adam to weigh in on how it interacts with...

> Broader question: given that Wu & Cotterell didn't find an improvement for going from zeroth to first order, would we want to consider getting rid of the contextual form?...

> > I guess since Travis implemented the refactor for distinguishing models and modules, I can defer to where he thinks things should go. That being said, I do not...

> This all basically looks fine to me, but I'll defer to Adam for final approval.
>
> I would recommend defining `1e7` as a constant and explaining what it...

> > Right, agreed. And tbh we already have more modularity than I like :D, I think it is an issue in almost every modern NLP library/toolkit I have used...

@kylebgorman Replaced the `math` with `log`. Found a few redundant uses of `inf`, so I just created a `defaults.INF` and `defaults.NEG_INF` pair of constants. While I was at it, I also offloaded epsilon...
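
For illustration only, the `defaults` module could look roughly like this; the epsilon name and the exact values are guesses, not necessarily what the PR does. Call sites would then read `defaults.NEG_INF` instead of a literal:

```python
# defaults.py (sketch): shared numeric constants so magic values aren't
# scattered across modules; actual names and values in the PR may differ.
INF = float("inf")
NEG_INF = float("-inf")
EPSILON = 1e-7  # e.g., to keep log() away from zero
```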

@Adamits Anything major that I haven't commented on yet?

@kylebgorman Reverted fly-by comment to `width`. `size` should only refer to `tensor.dim`.
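
For context on the naming convention, using standard PyTorch calls (illustrative, not from the PR):

```python
import torch

t = torch.zeros(2, 3, 5)
t.dim()     # 3: number of dimensions
t.size()    # torch.Size([2, 3, 5])
t.size(-1)  # 5: extent along the last dimension
# Anything that is not one of these tensor attributes (e.g., a model's
# hidden width) would be named "width" rather than "size".
```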