
Correctly handle error cases around metadata fetching and message sending

Open drcapulet opened this issue 10 years ago • 5 comments

This expands on the work from #88.

  • Available partitions now include ones that are missing a replica
  • Topic metadata is correctly refreshed when a replica is missing
  • Topic metadata is fully refreshed when sending a message fails, since by default topic metadata is only updated when each piece succeeds. This also helps surface better errors when the reason a message fails to send is a metadata problem.
  • Updated some logging calls to make it clearer what is happening
  • Refactored the RSpec before/each blocks into shared contexts to prevent cross-contamination between examples
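The refresh-on-failure behavior from the third bullet can be sketched as follows. This is a minimal illustration, not Poseidon's actual code: `FakeClient`, `refresh_metadata!`, and `send_with_refresh` are all hypothetical names, with a stub standing in for a real broker connection.

```ruby
# Stub connection that fails its first send to simulate stale metadata.
class FakeClient
  attr_reader :metadata_refreshes

  def initialize
    @metadata_refreshes = 0
    @sends = 0
  end

  def refresh_metadata!
    @metadata_refreshes += 1
  end

  def send_message(_msg)
    @sends += 1
    raise "stale metadata" if @sends == 1
    :ok
  end
end

# Fully refresh topic metadata whenever a send fails, then retry once,
# so that metadata problems surface instead of silently persisting.
def send_with_refresh(client, msg)
  client.send_message(msg)
rescue
  client.refresh_metadata!
  client.send_message(msg)
end
```

The point of the pattern is that a failed send is treated as evidence the cached metadata is wrong, so the cache is rebuilt before retrying rather than only on success.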

drcapulet avatar Aug 04 '15 16:08 drcapulet

Hi @bpot this would be a huge help at Square, do you have any time to look at this?

zachmargolis avatar Aug 04 '15 17:08 zachmargolis

@drcapulet @zachmargolis I hope @bpot doesn't mind – he doesn't seem to have been active here for a while.

After running into various walls with Poseidon, I've started work on a new Ruby client for Kafka. We'll be testing it out at Zendesk in the coming weeks, but it's still in very early beta and only supports the Producer API. The idea is to eventually support the new Consumer API from Kafka 0.9.

I would love to get some additional feedback on the API and design of this library, and if you could test it out it would be even better. I want the library to be easy to operate, with plenty of logging and eventually performance metrics.

Please check out https://github.com/zendesk/ruby-kafka and let me know what you think.
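For anyone curious, producing with ruby-kafka looks roughly like this. This sketch assumes a broker running at `localhost:9092` and follows the API of later ruby-kafka releases, which may differ from the early beta discussed in this thread; it is not runnable without a live cluster.

```ruby
require "kafka"

# Broker address and client_id are example values.
kafka = Kafka.new(seed_brokers: ["localhost:9092"], client_id: "example-app")
producer = kafka.producer

# Messages are buffered locally until deliver_messages is called.
producer.produce("hello", topic: "greetings")
producer.deliver_messages
producer.shutdown
```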

@bpot: sorry if you feel like this is spam, but this repo feels dead and a lot of people are using the code in production.

dasch avatar Jan 25 '16 14:01 dasch

@dasch thanks, neat gem. On our side, we mostly consume in Ruby, so please tag me when you guys add consuming to your gem, I'd love to check it out

zachmargolis avatar Jan 28 '16 22:01 zachmargolis

@zachmargolis will do – I hope to start work on a Consumer implementation in the next few weeks, but it's much more complex than producing.

dasch avatar Jan 29 '16 13:01 dasch

@zachmargolis ruby-kafka now implements the Kafka 0.9 Consumer Groups API: https://github.com/zendesk/ruby-kafka#consuming-messages-from-kafka
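A minimal consumer-group example with that API, assuming a broker at `localhost:9092` and a `greetings` topic (both placeholder values); as above, this reflects later ruby-kafka releases and needs a live cluster to run.

```ruby
require "kafka"

kafka = Kafka.new(seed_brokers: ["localhost:9092"], client_id: "example-app")

# Joining a consumer group lets Kafka balance partitions across members.
consumer = kafka.consumer(group_id: "greetings-group")
consumer.subscribe("greetings")

# Blocks, yielding each message as it arrives.
consumer.each_message do |message|
  puts "#{message.topic}/#{message.partition}@#{message.offset}: #{message.value}"
end
```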

dasch avatar Mar 08 '16 15:03 dasch