gailweiss

4 comments by gailweiss

Hi, thanks @WolfLo! One thing is confusing me: does this also take into account the probability of the first token in the sentence? (i.e., the probability the model...

Hi @WolfLo, thanks for the quick response! I guess what I'm not clear on is: isn't `logProba[i]` the (log) next-token distribution *after* step i? i.e., if the input is...
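
To make the indexing question concrete, here is a minimal sketch of what I mean, assuming a PyTorch-style causal LM whose forward pass returns per-position logits; the names `model`, `tokens`, and `bos_id` are just illustrative, not the repo's actual API:

```python
# Minimal sketch (illustrative names, not the repo's API): scoring a sentence
# with a causal LM, showing the off-by-one pairing between positions and
# targets, and how the first token's probability only enters via a BOS prefix.
import torch
import torch.nn.functional as F

def sentence_log_prob(model, tokens, bos_id=None):
    """Sum of log P(tokens[i] | tokens[<i]) under a causal LM.

    If `bos_id` is given, the sequence is prefixed with it so that the
    probability of the *first* real token is also included; otherwise the
    first token acts only as context and contributes no probability.
    """
    if bos_id is not None:
        tokens = torch.cat([torch.tensor([bos_id]), tokens])
    logits = model(tokens.unsqueeze(0)).squeeze(0)   # (seq_len, vocab)
    log_proba = F.log_softmax(logits, dim=-1)        # log next-token dists
    # log_proba[i] is the distribution *after* reading tokens[:i+1],
    # i.e. it scores tokens[i+1] -- hence the shifted pairing below.
    targets = tokens[1:]                             # tokens being predicted
    return log_proba[:-1].gather(1, targets.unsqueeze(1)).sum()
```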

This seems to make sense :) Thank you for taking the time to get into this! I assume/hope that, given the way the models here are trained, one sequence begins after the...

Hi, thanks for telling me about this! I will be afk for about a month now, but I’ll try to handle this (and similar issues) on my return!