Jasper Dekoninck
In the following query, I still have to write "\\n" instead of "\n" for the query to compile in the Python package (note that this...
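A minimal sketch of why the doubled backslash is needed when the query is embedded in a Python string: Python's own string parser consumes "\n" into a real newline before the query compiler ever sees the source, so a literal backslash-n must be written as "\\n". (The variable names here are illustrative, not from the issue.)

```python
# "\n" is a single newline character after Python parses the literal;
# "\\n" is two characters: a backslash followed by the letter 'n'.
escaped_once = "\n"
escaped_twice = "\\n"

print(len(escaped_once))   # 1: one newline character
print(len(escaped_twice))  # 2: backslash + 'n'

# A raw string is another way to pass the two-character sequence through:
raw = r"\n"
print(raw == escaped_twice)  # True
```

Writing the query body as a raw string (`r"..."`) sidesteps the double-escaping, at the cost of disabling all Python escape sequences in that literal.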
The following query returns "What did the fish say when" instead of the expected "What did the fish say". ```python argmax(max_len=80) """A list of good dad jokes. A indicates the...
I have the following query: ```python sample(temperature=0.8, openai_chunksize=32, max_len=64, chatty_openai=True) "The movie review in positive sentiment is: '[OUTPUT]" FROM "openai/text-ada-001" ``` The OUTPUT variable in this case is thus constrained...
It would be very helpful to support more advanced stopping conditions in STOPS_AT/STOPS_BEFORE. One use case for stopping conditions with lists instead of strings is that: ```python...
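A hedged sketch of the requested behavior (not LMQL's actual implementation): a decoding loop whose stopping condition is a list of strings rather than a single string, halting as soon as the generated text ends with any of them. The function name and token list are made up for illustration.

```python
def generate(tokens, stop_on):
    """Append tokens until the output ends with any of the stop strings."""
    out = ""
    for tok in tokens:
        out += tok
        if any(out.endswith(s) for s in stop_on):
            break
    return out

result = generate(
    ["What", " did", " the", " fish", " say", "?", " It"],
    stop_on=["?", "!"],
)
print(result)  # → "What did the fish say?"
```

With a single-string condition, supporting several terminators requires duplicating the whole clause once per string; a list form collapses that into one condition.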
LMQL automatically prepends the token " to the beginning of the prompt (as BOS token). However, when querying openai models through playground/api this doesn't happen in the prompt (they might...
In the following query, we would not expect the query to end on the first occurrence of "wall" (output sentence: "what did the fish say when it hit the wall?"),...
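A hedged sketch of the behavior being reported, assuming the stop condition is implemented as naive truncation at the first match (the helper below is hypothetical, not LMQL's code): a stop word like "wall" can occur mid-sentence, and cutting at its first occurrence truncates the output early.

```python
def stop_at_first(text, stop):
    """Truncate text right after the first occurrence of the stop string."""
    i = text.find(stop)
    return text if i == -1 else text[: i + len(stop)]

sentence = "what did the fish say when it hit the wall? It said dam."
print(stop_at_first(sentence, "wall"))
# cuts immediately after the first "wall", dropping the rest of the joke
```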