Will Benton
> Another way to do it is to use Spark's native key-based partitioning, and manipulate the keys themselves strategically. This is what I've suggested in the past as well.
@erikerlandson Agreed; the issue, as I understand it, is that we'd need to guarantee (at least for the use cases @rnowling immediately cares about) that each partition would have exactly one key....
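For concreteness, here is a minimal Scala sketch (not from the original thread) of one way to get that one-key-per-partition guarantee with a custom `Partitioner`; the `ExactKeyPartitioner` name and the collect-the-distinct-keys step are illustrative assumptions, not anything the comment above proposes.

```scala
import scala.reflect.ClassTag

import org.apache.spark.Partitioner
import org.apache.spark.rdd.RDD

// Illustrative sketch only: a Partitioner that assigns every distinct key its
// own partition, so downstream code can rely on exactly one key per partition.
class ExactKeyPartitioner(keyIndex: Map[Any, Int]) extends Partitioner {
  override def numPartitions: Int = keyIndex.size
  override def getPartition(key: Any): Int = keyIndex(key)
}

object ExactKeyPartitioner {
  // Builds the key-to-partition index by collecting the distinct keys on the
  // driver; this assumes the number of distinct keys is small.
  def apply[K: ClassTag, V: ClassTag](pairs: RDD[(K, V)]): ExactKeyPartitioner = {
    val index: Map[Any, Int] = pairs.keys.distinct().collect().zipWithIndex.toMap
    new ExactKeyPartitioner(index)
  }
}

// usage sketch: val oneKeyPerPartition = pairs.partitionBy(ExactKeyPartitioner(pairs))
```

Note that `getPartition` fails for any key that wasn't present when the index was built, so the index would have to be rebuilt whenever the key set changes.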
@mattf, sorry, I didn't notice your PR. PTAL at this alternative approach.
@rnowling I'm going to propose instead that we drop the bundled `sbt`. Bundling `sbt` is an antipattern that a lot of projects follow simply because a lot of other projects...
Thanks! To go upstream to agnosticd, the role needs to be named `ocp4-workload-...`.
I'm seeing the same behavior in 0.13.7.
@ssimeonov This issue was a problem for older (pre-1.4) versions of Spark because of how they used type tags. The problem in Spark has been fixed (you can now use...
Hi, can you post the original patch in this issue? It's no longer available from Google Code.