langchaingo
Support for custom http client for googleai/vertex llm
Hello, a generative AI model often needs to be accessed via a proxy service URL or with additional HTTP headers. Would it be possible to add an option to supply a custom HTTP client for the googleai/vertex LLMs, like the one already available for the PaLM or OpenAI models?
I see that the underlying client package already provides the necessary functionality: https://github.com/googleapis/google-api-go-client/blob/v0.180.0/option/option.go#L115
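
For reference, here is a minimal sketch of what this could enable, assuming the langchaingo googleai/vertex wrappers were to forward a caller-supplied `*http.Client` (or `option.ClientOption`) to the underlying genai client via `option.WithHTTPClient`. The proxy address, header names, and API key below are placeholders; note that when `WithHTTPClient` is used, the supplied client is responsible for transport concerns such as credentials.

```go
package main

import (
	"context"
	"log"
	"net/http"
	"net/url"

	"github.com/google/generative-ai-go/genai"
	"google.golang.org/api/option"
)

// headerTransport injects extra headers on every outgoing request.
type headerTransport struct {
	base    http.RoundTripper
	headers map[string]string
}

func (t *headerTransport) RoundTrip(req *http.Request) (*http.Response, error) {
	clone := req.Clone(req.Context())
	for k, v := range t.headers {
		clone.Header.Set(k, v)
	}
	return t.base.RoundTrip(clone)
}

func main() {
	ctx := context.Background()

	// Route traffic through a proxy and attach additional headers.
	proxyURL, err := url.Parse("http://proxy.internal:8080") // placeholder proxy URL
	if err != nil {
		log.Fatal(err)
	}
	httpClient := &http.Client{
		Transport: &headerTransport{
			base: &http.Transport{Proxy: http.ProxyURL(proxyURL)},
			headers: map[string]string{
				"X-Team": "ml-platform", // placeholder custom header
			},
		},
	}

	// The underlying client already accepts option.WithHTTPClient; the request
	// here is for langchaingo's googleai/vertex LLM constructors to expose a
	// way to pass such a client through.
	client, err := genai.NewClient(ctx, option.WithHTTPClient(httpClient))
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()
}
```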