spring-cloud-stream

Instrumentation of the Kafka consumers to allow the configuration of MDC logging

codependent opened this issue 3 years ago • 6 comments

This Stackoverflow question addresses how to configure SCS producers and consumers to automatically propagate the tracing information (traceId, spanId).

There's no information about the consumer side. At the moment nothing automatically processes the traceparent header, so consumer logs don't show any correlation information.

Framework versions: Boot 3.0.0, Cloud 2022.0.0-RC3
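Until the binder does this automatically, one consumer-side workaround is to read the traceparent header from the incoming message and copy its ids into SLF4J's MDC by hand (e.g. `MDC.put("traceId", ids[0])`) so log patterns such as `[%X{traceId},%X{spanId}]` pick them up. A minimal sketch of the parsing step, assuming the header arrives as the plain W3C string; the class and method names here are hypothetical, not part of Spring Cloud Stream:

```java
// Hypothetical helper: splits a W3C traceparent value of the form
// version-traceId-parentId-flags, e.g.
// 00-4bf92f3577b34da6a3ce929d0e0e4736-00f067aa0ba902b7-01,
// and returns { traceId, spanId } for use with MDC.put(...).
public class Traceparent {

    public static String[] parse(String traceparent) {
        String[] parts = traceparent.split("-");
        if (parts.length != 4) {
            throw new IllegalArgumentException("malformed traceparent: " + traceparent);
        }
        // parts[0] is the version, parts[3] the trace flags; the middle
        // two fields are the correlation ids that belong in the logs.
        return new String[] { parts[1], parts[2] };
    }
}
```

Remember to clear the MDC keys in a finally block after processing, since consumer threads are reused.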

codependent avatar Dec 05 '22 16:12 codependent

@marcingrzejszczak @jonatan-ivanov I believe this is now an Observability concern; as such, is it just a matter of a missing dependency or a documentation link on our part? If so, please let me know.

olegz avatar Jan 11 '23 19:01 olegz

@olegz Based on Gary's response it seems the issue is with Spring Cloud Stream (and its instrumentation).

jonatan-ivanov avatar Jan 11 '23 23:01 jonatan-ivanov

@jonatan-ivanov That response was resolved by #2576; this issue is about KStream instrumentation. Spring is not involved with Kafka Streams at runtime; it just sets up the stream topology.

With normal consumers and producers, Spring (for Apache Kafka) observes the consume and produce operations; Kafka Streams does not provide access to the underlying components, so Spring can't do any instrumentation there.

I don't know if/how Sleuth instrumented those components, but if it did, it remains outside the scope of Spring. Perhaps this is a gap that was missed in the migration from Sleuth to Micrometer Tracing.

garyrussell avatar Jan 12 '23 14:01 garyrussell

This is what we used to recommend in the past for tracing with Kafka Streams in the binder. It used to work, although not ideally, since people preferred more native support, which we couldn't provide for the reasons Gary mentioned above. I have yet to see how things have changed in this area, but the general idea could still be used as a workaround.

sobychacko avatar Jan 12 '23 15:01 sobychacko

I've been having a look at our projects, and we manually instrumented the KStream processing. To do so, we injected a brave.kafka.streams.KafkaStreamsTracing bean and applied it in a KStream transformation:

@Configuration
class EventKStreamConfiguration(...,
                                private val kafkaStreamsTracing: KafkaStreamsTracing) {
    ...
    @Bean
    fun processEventSync() =
        Function<KStream<Key, Value>, KStream<Key, Value2?>> { x ->
            processKStream("x-sync", x)
        }

    private fun processKStream(processor: String, x: KStream<Key, Value>) =
        x.transform(
            kafkaStreamsTracing.transformer(processor) {
                ...
            }
        )
}

With Boot 3.0, KafkaStreamsTracing is no longer defined as a bean. It used to be created in a Sleuth autoconfiguration, BraveKafkaStreamsAutoConfiguration:

package org.springframework.cloud.sleuth.autoconfig.brave.instrument.messaging;
...
@Configuration(proxyBeanMethods = false)
@ConditionalOnMessagingEnabled
@ConditionalOnBean(Tracing.class)
@ConditionalOnProperty(value = "spring.sleuth.messaging.kafka.streams.enabled", matchIfMissing = true)
@ConditionalOnClass({ KafkaStreams.class, KafkaTracing.class, StreamsBuilderFactoryBean.class })
class BraveKafkaStreamsAutoConfiguration {

	@Bean
	@ConditionalOnMissingBean
	static KafkaStreamsTracing kafkaStreamsTracing(Tracing tracing) {
		return KafkaStreamsTracing.create(tracing);
	}
}

It'd be great to have a similar autoconfiguration so that the KafkaStreamsTracing bean is available out of the box again.
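Until such an autoconfiguration exists, the bean can be declared manually. A minimal sketch, assuming brave-instrumentation-kafka-streams is on the classpath and a brave.Tracing bean is available (for example via Micrometer Tracing's Brave bridge); the configuration class name below is arbitrary:

```java
import brave.Tracing;
import brave.kafka.streams.KafkaStreamsTracing;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Manual replacement for the removed Sleuth autoconfiguration: a plain
// user-provided @Configuration class exposing the same bean.
@Configuration(proxyBeanMethods = false)
class ManualKafkaStreamsTracingConfiguration {

	@Bean
	KafkaStreamsTracing kafkaStreamsTracing(Tracing tracing) {
		// Same factory call the old Sleuth autoconfiguration used.
		return KafkaStreamsTracing.create(tracing);
	}
}
```

The resulting bean can then be injected and applied in a KStream transformation as before.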

codependent avatar Jan 12 '23 15:01 codependent

@garyrussell Thank you! As far as I know, we don't have Kafka Streams instrumentation in Sleuth.

jonatan-ivanov avatar Jan 12 '23 19:01 jonatan-ivanov