Allow for sourcing of Fluent Bit Kafka Rdkafka values from secrets
Is your feature request related to a problem? Please describe.
Currently, the Kafka struct declares the Rdkafka map as map[string]string, which means that values must be stored in plaintext in the CRD. However, when connecting to a Kafka broker using SASL_SSL (i.e. with a username/password corresponding to the API key/secret, respectively), it would be better not to store those values in plaintext.
Describe the solution you'd like
Modify the Rdkafka map to allow values to be either a string or a *plugins.Secret. That way, users can designate a secret ref for particular values (such as sasl.username and sasl.password), so those don't need to be provided in plaintext at the CRD level.
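A minimal sketch of what such a union-like value could look like. All names here (`mapValue`, `secretRef`, `render`) are illustrative stand-ins, not the operator's actual API; `secretRef` substitutes for `plugins.Secret` to keep the sketch self-contained:

```go
package main

import "fmt"

// secretRef stands in for plugins.Secret to keep this sketch self-contained.
type secretRef struct {
	Name string // name of the Kubernetes Secret
	Key  string // key within the Secret's data
}

// mapValue holds exactly one of an inline string or a secret reference, so
// sensitive rdkafka settings such as sasl.password can point at a Secret.
type mapValue struct {
	Value     string     // plaintext value, used when ValueFrom is nil
	ValueFrom *secretRef // reference to a Secret holding the value
}

// render shows how a config generator might emit a key: inline values are
// written as-is, secret-backed values are marked for later resolution.
func render(key string, v mapValue) string {
	if v.ValueFrom != nil {
		return fmt.Sprintf("%s <from secret %s/%s>", key, v.ValueFrom.Name, v.ValueFrom.Key)
	}
	return fmt.Sprintf("%s %s", key, v.Value)
}

func main() {
	rdkafka := map[string]mapValue{
		"client.id":     {Value: "fluent-bit"},
		"sasl.password": {ValueFrom: &secretRef{Name: "kafka-creds", Key: "password"}},
	}
	for k, v := range rdkafka {
		fmt.Println(render(k, v))
	}
}
```

The point is that non-sensitive settings keep working inline while sensitive ones resolve through a secret reference at reconcile time.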
Additional context
This is useful in multi-tenant scenarios.
Hi is this issue still open? I would like to try it out
That will be great! @TheJadeLion2004
Hi, I've run into a roadblock. I've defined a new struct, mapvalue, in the output package that can hold either a string or a secret, and modified the Kafka struct to use it. In kvs.go (the params package) I've added a new function, InsertMapValMap, that works with the mapvalue struct. However, mapvalue uses *plugins.Secret, which is defined in the plugins package, and I cannot import plugins from params because plugins already imports params.
Could you help me make InsertMapValMap work via an interface, or otherwise resolve the circular dependency?
https://github.com/fluent/fluent-operator/pull/1328
@benjaminhuo , I have raised a PR with the changes that I have made. Overall most of the changes were to avoid circular dependencies. However, the code is not compiling, and raises an error related to the DeepCopy function. Would appreciate your help on this.
@wenchajun @cw-Guo Can you help on this?
The compile issue is that there is no generated code for the Kafka object in apis/fluentbit/v1alpha2/plugins/output/zz_generated.deepcopy.go.
To add it, you need to add the following line before the object:
// +kubebuilder:object:generate:=true
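For illustration, the marker sits directly above the type declaration (the field shown here is a placeholder, not the real Kafka schema):

```go
// +kubebuilder:object:generate:=true

// Kafka defines the parameters for the Kafka output plugin.
type Kafka struct {
	Rdkafka map[string]string `json:"rdkafka,omitempty"`
}
```

controller-gen then emits the DeepCopy methods for Kafka into zz_generated.deepcopy.go on the next code-generation run.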
~But after all, I would suggest you use the Kubernetes API to get the secret value.~
~For example, you can keep the old Rdkafka map[string]string and add a new field called rdkafkaSecret that holds the secret name. In the controller, you then use the Kubernetes API to get the secret content and update the config file accordingly. You can define how you want to use Rdkafka; for instance, the values could be keys in the secret. This way, you can avoid breaking changes too.~
Noticed Fluent Bit has already implemented an interface to do that. But after all, I would still suggest you use the Kubernetes API to get the secret value.
@TheJadeLion2004 you can refer to the following example
https://github.com/fluent/fluent-operator/blob/master/apis/fluentbit/v1alpha2/plugins/output/datadog_types.go#L24 https://github.com/fluent/fluent-operator/blob/master/apis/fluentbit/v1alpha2/plugins/output/datadog_types.go#L64-L70
I would suggest you follow the above example and add a new field instead of changing the Rdkafka type.
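A sketch of that suggestion: keep Rdkafka as map[string]string (no breaking change) and add dedicated secret-backed fields for the sensitive keys. The `Secret`, `SecretLoader`, and `fakeLoader` types below are illustrative stand-ins for the operator's plugins.Secret and its secret-loading machinery, not the actual API:

```go
package main

import "fmt"

// Secret mimics plugins.Secret: a reference the operator resolves at reconcile time.
type Secret struct {
	Name string
	Key  string
}

// SecretLoader mimics the operator's loader that reads Kubernetes Secrets.
type SecretLoader interface {
	LoadSecret(s Secret) (string, error)
}

// Kafka keeps Rdkafka as a plain string map and adds dedicated
// secret-backed fields for the sensitive SASL settings.
type Kafka struct {
	Rdkafka      map[string]string
	SASLUsername *Secret
	SASLPassword *Secret
}

// Params merges the plaintext settings with the resolved secret values.
func (k *Kafka) Params(sl SecretLoader) (map[string]string, error) {
	out := map[string]string{}
	for key, v := range k.Rdkafka {
		out[key] = v
	}
	if k.SASLUsername != nil {
		u, err := sl.LoadSecret(*k.SASLUsername)
		if err != nil {
			return nil, err
		}
		out["sasl.username"] = u
	}
	if k.SASLPassword != nil {
		p, err := sl.LoadSecret(*k.SASLPassword)
		if err != nil {
			return nil, err
		}
		out["sasl.password"] = p
	}
	return out, nil
}

// fakeLoader resolves secrets from an in-memory map, for demonstration only.
type fakeLoader map[string]string

func (f fakeLoader) LoadSecret(s Secret) (string, error) {
	return f[s.Name+"/"+s.Key], nil
}

func main() {
	k := &Kafka{
		Rdkafka:      map[string]string{"client.id": "fluent-bit"},
		SASLUsername: &Secret{Name: "kafka-creds", Key: "username"},
	}
	params, _ := k.Params(fakeLoader{"kafka-creds/username": "svc-user"})
	fmt.Println(params["sasl.username"], params["client.id"])
}
```

Existing CRDs keep working untouched, and only the new optional fields gain secret support.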
@TheJadeLion2004 you can refer to this PR https://github.com/fluent/fluent-operator/pull/1338