Quitting a program mid-stream while streaming from ollama occasionally causes a panic
I wrote a little program in Go where the code is rather simple:
- Give it a prompt
- Stream the response from ollama
Now, the program has an os.Interrupt handler that cancels the context (I don't think this is relevant) and exits the program.
Here's a small code sample where the panic happens:
_, err := llms.GenerateFromSinglePrompt(ctx, llm, "some prompt", // panics here
	llms.WithStreamingFunc(func(_ context.Context, chunk []byte) error {
		select {
		case <-ctx.Done():
			return nil
		default:
			return nil
		}
	}))
if err != nil {
	return err
}
Here's the signal handler (abbreviated for brevity):
ctx := context.Background()
ctx, cancel := context.WithCancel(ctx)

c := make(chan os.Signal, 1)
signal.Notify(c, os.Interrupt)
defer func() {
	signal.Stop(c)
	cancel()
}()
Here's the panic I get occasionally when I quit the program:
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x2 addr=0x18 pc=0x1007de2e0]
goroutine 24 [running]:
github.com/tmc/langchaingo/llms/ollama.(*LLM).GenerateContent(0x140000bc240, {0x10091a130, 0x1400007aa50}, {0x14000331230, 0x1, 0x100bcc5b8?}, {0x1400020e000, 0x1, 0x1400020e000?})
/Users/milosgajdos/go/pkg/mod/github.com/tmc/[email protected]/llms/ollama/ollamallm.go:148 +0x800
github.com/tmc/langchaingo/llms.GenerateFromSinglePrompt({0x10091a130, 0x1400007aa50}, {0x100918b80, 0x140000bc240}, {0x14000124900, 0x66d}, {0x1400020e000, 0x1, 0x1})
/Users/milosgajdos/go/pkg/mod/github.com/tmc/[email protected]/llms/llms.go:43 +0x148
main.LLMStream({0x10091a130, 0x1400007aa50}, 0x140000bc240, 0x140000ae120, 0x140000ae0c0, 0x1400009e120)
/Users/milosgajdos/go/src/foo/main.go:140 +0x270
created by main.main in goroutine 1
/Users/milosgajdos/go/src/foo/main.go:214 +0x48c
exit status 2
Looks like we're not checking if the Message is nil here: https://github.com/tmc/langchaingo/blob/8b67ef320cc5f19a10e650ca02f468a5bf8c1bf4/llms/ollama/ollamallm.go#L148
The problem is kinda hard to repro locally because it does not happen every time I quit the tiny program, but I figured I'd open an issue anyway.
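For illustration, a defensive guard along these lines would turn the nil dereference into an error. This is just a sketch with hypothetical stand-in types that mirror the shape of the response, not the actual langchaingo/ollama code:

```go
package main

import (
	"errors"
	"fmt"
)

// Hypothetical stand-ins for the ollama client response types,
// not the real API.
type Message struct {
	Content string
}

type ChatResponse struct {
	Message *Message // may be nil when the request is cancelled mid-stream
}

// extractContent checks for a nil Message instead of dereferencing
// it unconditionally, which is what appears to panic at ollamallm.go:148.
func extractContent(resp *ChatResponse) (string, error) {
	if resp == nil || resp.Message == nil {
		return "", errors.New("ollama: empty response message")
	}
	return resp.Message.Content, nil
}

func main() {
	// Simulates the mid-stream cancellation case: no message came back.
	if _, err := extractContent(&ChatResponse{}); err != nil {
		fmt.Println("handled gracefully:", err)
	}
}
```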
You don't need to use select:
msgChan := make(chan string)

_, err := llms.GenerateFromSinglePrompt(ctx, llm, "some prompt",
	llms.WithStreamingFunc(func(_ context.Context, chunk []byte) error {
		msgChan <- string(chunk)
		return nil
	}))
if err != nil {
	return err
}
you don't need to use select
The select is there to make sure the function in question (which runs in a goroutine) unblocks when the context gets cancelled, say by pressing Ctrl+C. If you omit it, the function might block until the msgChan consumer consumes the chunk sent to the channel. So I don't really understand what you mean, to be honest.
Besides, I don't quite follow what this has to do with the reported issue.
I encountered the same issue, here is my code.
var sbd strings.Builder

appCtx.Response.Header.Set("mime-type", "text/event-stream")
_, err = c.llm.GenerateContent(timeout, content,
	llms.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
		sbd.Write(chunk)
		_, err2 := appCtx.Write(chunk)
		if err2 != nil {
			return err2
		}
		return appCtx.Flush()
	}))
if err != nil {
	return err
}
Pass in a context created with context.WithTimeout(context.Background(), 3*time.Second), and you can reproduce the problem.