Shangyin Tan
> dsp.settings.trace

Yeah, I believe we should set it to `[]` going forward, since there are [checks in assertion.py](https://github.com/stanfordnlp/dspy/issues/432). It seems the optimization logic also set...
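For context, a minimal sketch of what resetting the trace could look like (assuming `trace` is accepted as a settings key, which the checks in assertion.py rely on):

```python
import dspy

# Reset the trace to an empty list rather than None, so assertion handling
# sees "no steps recorded yet" instead of a missing trace.
dspy.settings.configure(trace=[])
```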
The issue might be due to multithreading during compilation. This needs further investigation, but for the cached notebook it is fine to use one thread for `compile`, which solves...
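For example, a minimal sketch of forcing a single-threaded compile (assuming the teleprompter exposes a `num_threads` argument, as `BootstrapFewShotWithRandomSearch` does; `Program` and `trainset` stand in for your own module and data):

```python
from dspy.teleprompt import BootstrapFewShotWithRandomSearch

def exact_match(example, prediction, trace=None):
    # Illustrative metric: compare the predicted answer to the gold answer.
    return example.answer == prediction.answer

# num_threads=1 avoids the multithreading interaction described above
# when compiling inside a notebook.
teleprompter = BootstrapFewShotWithRandomSearch(metric=exact_match, num_threads=1)
compiled_program = teleprompter.compile(student=Program(), trainset=trainset)
```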
Great advice! Now we can do:

```python
a = Program()
a.activate_assertions()  # now a can backtrack and self-refine
```

or

```python
teleprompter.compile(student=Program().activate_assertions())
```
Yes. And it shouldn't break any existing code! Could you double-check that both the old and new APIs work, @arnavsinghvi11?
Thanks! Pushed a fix. Both should work now
I am having the same issue here. I assume the bits occur in the same order as your query defines them. It would be great if this could be confirmed by...
@arnavsinghvi11 Thanks! Fixed both issues.
@arnavsinghvi11 Would there be an issue if we changed `target_module` to be the actual module in Assertions ([854a259](https://github.com/stanfordnlp/dspy/pull/1372/commits/854a259dd23c1c985054f8aa612e915fa2461ab9))? It seems `assert_transform` will re-assign the wrapped `Retry` module to all modules defined...
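To illustrate the concern, here is a hypothetical sketch (not dspy's actual code) of the difference between wrapping every module of a given class versus only the specific `target_module` instance:

```python
class Retry:
    """Illustrative wrapper that retries a module when its assertions fail."""
    def __init__(self, module):
        self.module = module

def wrap_by_class(program_modules, target_cls):
    # Wraps *every* module of the target class -- the re-assignment
    # behavior the comment above worries about.
    return [Retry(m) if isinstance(m, target_cls) else m for m in program_modules]

def wrap_by_instance(program_modules, target_module):
    # Wraps only the exact module instance that was passed in.
    return [Retry(m) if m is target_module else m for m in program_modules]
```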
I've had a similar experience with caching for other web-search APIs, where bad (but valid) HTTP responses got cached too. Is there a way to wrap the caching function...
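Something like the following wrapper is what I have in mind (a minimal in-memory sketch with hypothetical names; a real version would hook into the library's own caching layer):

```python
import functools

def looks_valid(response):
    # Hypothetical sanity check: a "bad but valid" response is an HTTP 200
    # with no usable results, which we do not want to cache.
    return bool(response.get("results"))

def cache_if_valid(search_fn):
    """Memoize search results, but only when the response passes looks_valid."""
    cache = {}

    @functools.wraps(search_fn)
    def wrapper(query):
        if query in cache:
            return cache[query]
        response = search_fn(query)
        if looks_valid(response):
            cache[query] = response
        return response
    return wrapper
```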