Stanislav Malyshev
If I add something like this after resume():

```python
offsets = {p: consumer.position(p) for p in paused_tp}
for p, off in offsets.items():
    consumer.seek(p, off)
```

then everything works as expected....
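For context, here is a minimal sketch of that workaround placed in a full consumer loop. It assumes kafka-python's `KafkaConsumer`; the topic, group, and poll timing are illustrative placeholders, not taken from the original report:

```python
from kafka import KafkaConsumer

# Illustrative setup; topic/group names are placeholders.
consumer = KafkaConsumer('my-topic', group_id='my-group',
                         bootstrap_servers='localhost:9092',
                         enable_auto_commit=False)

consumer.poll(timeout_ms=1000)          # ensure partitions are assigned
paused_tp = list(consumer.assignment())

consumer.pause(*paused_tp)              # stop fetching on these partitions
# ... do other work while paused ...
consumer.resume(*paused_tp)

# Workaround described above: re-seek each partition to its current position
# so the next poll() actually returns records instead of stalling.
offsets = {p: consumer.position(p) for p in paused_tp}
for p, off in offsets.items():
    consumer.seek(p, off)

records = consumer.poll(timeout_ms=1000)
```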
> in https://github.com/dpkp/kafka-python/blob/master/kafka/consumer/fetcher.py#L458, the batch should be skipped if it's a control batch:
>
> ```python
> while batch is not None:
>     if getattr(batch, 'is_control_batch', False):
>         continue
>...
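As a side note, a skip like the one quoted only works if the batch iterator is advanced before the `continue`, otherwise the loop would spin on the same control batch forever. A minimal sketch of the intended pattern, using an illustrative `records.next_batch()` iterator as a stand-in rather than the library's actual internals:

```python
# Sketch of the skip-control-batch pattern; `records` and `next_batch()` are
# stand-ins for whatever the fetcher iterates over, not kafka-python's exact API.
def unpack_batches(records):
    batch = records.next_batch()
    while batch is not None:
        if getattr(batch, 'is_control_batch', False):
            # Advance first, otherwise `continue` would revisit the same batch.
            batch = records.next_batch()
            continue
        for record in batch:
            yield record
        batch = records.next_batch()
```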
Digging a bit more into it, in JVMPipelinedHashJoinUtility/acceptAndOutputSolutions this loop:

```java
for (IBindingSet solution : solutions) {
    // add solutions to the subquery into the hash index.
    rightSolutions.add(solution);
    /*
     *...
```
Any thoughts on this one?
I am sorry, I understand it's hard to debug it this way. I can reproduce it with a clean install of WDQS and a small data set - so if you'd...
Tracing into acceptAndOutputSolutions, I see this:

* joinVars = []
* projectInVars = [item]
* solutions = a bunch of items (99 in total for the first iteration), e.g. Q22241243, Q22252392,...
Tried to debug further and I see something strange - when I watch how the execution proceeds, the first chunk is processed OK. It forms one bucket, since the...
Also interesting - there's this code in JVMHashIndex:

```java
if (keyVars == null) {
    /*
     * A ZERO LENGTH joinVars[] means that all solutions will be in the
     * same...
```
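To make the consequence concrete, here is a toy model (in Python, purely illustrative, not Blazegraph's code) of how an empty joinVars list collapses every solution into a single bucket keyed by one constant key:

```python
from collections import defaultdict

def bucket_solutions(solutions, join_vars):
    """Toy model of hash-index bucketing: group solutions by their values
    for join_vars; with no join vars, every solution shares one key."""
    buckets = defaultdict(list)
    for solution in solutions:
        # Empty join_vars -> single constant key -> everything in one bucket.
        key = tuple(solution.get(v) for v in join_vars) if join_vars else ()
        buckets[key].append(solution)
    return buckets

# With joinVars = [], all solutions from the first chunk land in one bucket.
solutions = [{'item': 'Q22241243'}, {'item': 'Q22252392'}]
print(bucket_solutions(solutions, []))   # {(): [both solutions]}
```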