logstash-codec-avro

A logstash codec plugin for decoding and encoding Avro records

15 logstash-codec-avro issues

New description: I was not able to get the codec to work with a Kafka input, so I created a minimal example using the file input plugin in...

bug
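For the file-input repro described in the issue above, here is a minimal sketch of the kind of pipeline involved. The file paths, schema name, and the assumption that each line carries one Avro datum in the encoding the codec expects are placeholders of mine, not details from the issue.

```
# Minimal sketch only: file paths and the schema file are placeholders.
# The file input reads line-delimited data, so each line is assumed to hold
# one Avro datum in the encoding the codec expects (base64 by default in
# older releases of this codec).
input {
  file {
    path => "/tmp/avro-repro/events.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => avro {
      schema_uri => "/tmp/avro-repro/event.avsc"
    }
  }
}

output {
  stdout { codec => rubydebug }
}
```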

Issue description: Currently, the plugin doesn't have any options to set a CA, an SSL certificate, or any other SSL-related configuration. There are cases where corporations sign their own CAs to...

enhancement
status:needs-triage
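None of the SSL settings requested above exist in the codec today; the block below is only a sketch of what such a configuration might look like, and every SSL option name in it is hypothetical.

```
# Hypothetical sketch only: the SSL options below are not implemented in the
# codec; the names are invented to illustrate the request.
input {
  kafka {
    bootstrap_servers => "kafka:9092"
    topics => ["events"]
    codec => avro {
      schema_uri => "https://schema-host.internal/event.avsc"
      # proposed, not implemented:
      ssl_certificate_authorities => ["/etc/pki/ca/internal-ca.pem"]
    }
  }
}
```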

I am consuming Kafka messages. The messages are in Avro format, and errors are reported when running. ``` avro-logstash_1 | [2019-12-13T15:38:13,239][INFO ][org.apache.kafka.clients.consumer.internals.ConsumerCoordinator] [Consumer clientId=kafka_avro_dev-2, groupId=kafka_avro_dev] Setting newly assigned partitions: avro-logstash_1...

int-shortlist
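A sketch of a Kafka-input pipeline for this kind of setup. The broker address, topic, and schema path are placeholders, and switching the value deserializer is a commonly suggested check for binary Avro rather than a confirmed fix for this particular report.

```
input {
  kafka {
    bootstrap_servers => "kafka:9092"
    topics => ["events"]
    group_id => "kafka_avro_dev"
    # The kafka input defaults to a String deserializer, which can mangle
    # binary Avro before it ever reaches the codec; a byte-array deserializer
    # is a common first thing to try.
    value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
    codec => avro {
      schema_uri => "/etc/logstash/event.avsc"
    }
  }
}
```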

We’re using Logstash to write event data to a Kafka queue. The consumer of this queue expects an Avro binary blob (or fragment, the terminology seems to differ sometimes). It...

enhancement
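For the issue above: on the encode path this codec has historically emitted base64-encoded Avro rather than a raw binary blob, which appears to be the mismatch being described. A sketch of the output side, with placeholder broker, topic, and schema values:

```
# Sketch only: broker, topic, and schema path are placeholders.
# The avro codec has historically base64-encoded the serialized datum on
# output, so a consumer expecting a raw Avro binary blob will not accept it
# as-is.
output {
  kafka {
    bootstrap_servers => "kafka:9092"
    topic_id => "events"
    codec => avro {
      schema_uri => "/etc/logstash/event.avsc"
    }
  }
}
```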

Please post all product and debugging questions on our [forum](https://discuss.elastic.co/c/logstash). Your questions will reach our wider community members there, and if we confirm that there is a bug, then we...

For all general issues, please provide the following details for fast resolution: - Version: 5.4.1 - Operating System: linux uname -a Linux dssubuntu02 4.4.0-62-generic #83-Ubuntu SMP Wed Jan 18 14:10:15...

We keep our Avro schemas in Confluent's [Schema Registry](http://docs.confluent.io/2.0.0/schema-registry/docs/index.html). It would be great if we could point the `schema_uri` to the registry's API, but the registry returns the Avro schema...

P2
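For context on the issue above: `schema_uri` expects a bare Avro schema document, while the Schema Registry's REST API wraps the schema as an escaped string inside a JSON envelope (roughly `{"subject": ..., "version": ..., "id": ..., "schema": "<escaped schema>"}`), so the codec would need to unwrap that field first. A sketch of the requested, currently unsupported usage, with placeholder registry host and subject name:

```
# Sketch of the requested behaviour, not something the codec supports today:
# the URI below returns the registry's JSON envelope, whereas schema_uri
# currently must resolve to a bare .avsc document.
input {
  kafka {
    bootstrap_servers => "kafka:9092"
    topics => ["events"]
    codec => avro {
      schema_uri => "http://schema-registry:8081/subjects/events-value/versions/latest"
    }
  }
}
```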

Running Avro through JRuby instead of Java comes with significant overhead. Let's investigate moving the main data path of this plugin to pure Java and Java's Avro library.

enhancement
performance-improvements

All float values from Kafka printed by Logstash are wrong. I'm sending the following record to Kafka: ProducerRecord(topic=weather, partition=0, headers=RecordHeaders(headers = [], isReadOnly = true), key=BN, value={"Temperature": 20.49, "Humidity": 56.0, "Pressure":...

Hi, guys. I have a requirement to handle data in batches through a Storm bolt. But when I integrated with Logstash, I found that it pushes data line by line. So...