
String interning

Open vzhong opened this issue 7 years ago • 0 comments

Hey @arunchaganty ,

@jekbradbury and @bmccann recently discovered a huge performance oversight in another tokenization library by @jekbradbury: string interning improved DecaNLP performance by something like 100x. It dawned on me that we don't seem to do this in this Python client, so the output annotations may be storing a bazillion copies of words, glosses, tags, whitespace, etc. Can you confirm or deny this?

For reference the issue in question is here: https://github.com/jekbradbury/revtok/pull/4
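For readers unfamiliar with the idea, here is a minimal sketch of what string interning buys you in CPython. It is not code from this client, just an illustration: tokenizers typically build each token as a fresh `str` object, so repeated words become duplicate objects; `sys.intern` maps equal strings to one canonical object, saving memory and turning equality checks into pointer comparisons.

```python
import sys

# Splitting text creates a new str object per token, even for repeats,
# so the two occurrences of "the" below are equal but distinct objects.
words = "the cat chased the dog".split()
raw_the = [w for w in words if w == "the"]
assert raw_the[0] == raw_the[1]
assert raw_the[0] is not raw_the[1]  # duplicate objects in memory

# Interning collapses equal strings into one shared object.
interned = [sys.intern(w) for w in words]
int_the = [w for w in interned if w == "the"]
assert int_the[0] is int_the[1]  # now the same object
```

In an annotation client this would mean calling `sys.intern` once on each word, tag, gloss, etc. as the protobuf response is deserialized, so that a corpus with millions of token occurrences holds only one object per distinct string.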

vzhong avatar Aug 22 '18 21:08 vzhong