sbt-spark-package

Sbt plugin for Spark packages

Results: 20 sbt-spark-package issues (sorted by recently updated)

The URL https://dl.bintray.com/spark-packages/maven is now forbidden.
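
One possible workaround (a minimal sketch; it assumes the Spark Packages artifacts are now served from repos.spark-packages.org, which the issue itself does not confirm) is to point the resolver at the replacement host:
```scala
// build.sbt -- sketch; assumes the Spark Packages artifacts are now served
// from repos.spark-packages.org instead of the retired Bintray URL.
resolvers += "spark-packages" at "https://repos.spark-packages.org/"
```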

As far as I can see, this package has not been updated for about 2 years, many packages are not included as provided dependencies (such as `sql-kafka`), and there are lots of other issues. Does databricks have...
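
For modules the plugin does not cover, one workaround (a sketch; the Spark version 2.4.0 and the `spark-sql-kafka-0-10` artifact are assumptions, not taken from the issue) is to declare the missing module directly instead of relying on `sparkComponents`:
```scala
// build.sbt -- sketch: pull the Kafka SQL connector in directly, marked
// "provided" so it is not bundled into the assembly jar. Version is an assumption.
libraryDependencies += "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.4.0" % "provided"
```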

Hi, when adding:
```
resolvers += "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven/"

addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")
```
to my project/plugins.sbt file, I get this error: `[error] (update) sbt.librarymanagement.ResolveException: unresolved dependency: org.spark-packages#sbt-spark-package;0.2.6: not...`
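
Since the Bintray URL no longer serves artifacts, a possible fix (a sketch; it assumes the plugin has been mirrored to repos.spark-packages.org, which is not stated in the issue) is to change the resolver in project/plugins.sbt:
```scala
// project/plugins.sbt -- sketch; assumes the plugin is mirrored at
// repos.spark-packages.org, since the Bintray URL above no longer resolves.
resolvers += "spark-packages" at "https://repos.spark-packages.org/"

addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")
```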

v0.2.6, running `sbt package`:
```
java.lang.NoSuchMethodError: sbt.UpdateConfiguration.copy$default$1()Lscala/Option;
	at sbt.dependencygraph.DependencyGraphSbtCompat$Implicits$RichUpdateConfiguration$.withMissingOk$extension(DependencyGraphSbtCompat.scala:12)
	at net.virtualvoid.sbt.graph.DependencyGraphSettings$$anonfun$baseSettings$3.apply(DependencyGraphSettings.scala:40)
	at net.virtualvoid.sbt.graph.DependencyGraphSettings$$anonfun$baseSettings$3.apply(DependencyGraphSettings.scala:40)
	at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
	at sbt.EvaluateSettings$MixedNode.evaluate0(INode.scala:175)
	at sbt.EvaluateSettings$INode.evaluate(INode.scala:135)
	at sbt.EvaluateSettings$$anonfun$sbt$EvaluateSettings$$submitEvaluate$1.apply$mcV$sp(INode.scala:69)
	at sbt.EvaluateSettings.sbt$EvaluateSettings$$run0(INode.scala:78)
	at sbt.EvaluateSettings$$anon$3.run(INode.scala:74)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
	...
```
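
One possible workaround (a sketch; it assumes the error comes from running this sbt 0.13-era plugin under a newer sbt whose UpdateConfiguration API has changed, and the issue does not state which sbt version is in use) is to pin the build to an older sbt release:
```
# project/build.properties -- sketch; pins the build to sbt 0.13.x on the
# assumption that the NoSuchMethodError is an sbt API incompatibility.
sbt.version=0.13.18
```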

When trying to cross-compile a Spark package for Scala 2.12 against Spark 2.4.0, I am getting
```
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] ::          UNRESOLVED DEPENDENCIES         ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-core_2.12;1.4.0: not found...
```
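
The unresolved `spark-core_2.12;1.4.0` suggests the build is still picking up the plugin's default Spark version. A sketch (assuming the project uses the plugin's `sparkVersion` key; the Scala patch versions below are placeholders):
```scala
// build.sbt -- sketch: set the Spark version explicitly so the plugin does not
// fall back to its 1.4.0 default, and cross-build for Scala 2.12.
sparkVersion := "2.4.0"
crossScalaVersions := Seq("2.11.12", "2.12.8")
```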

Up until now I've used `sbt assembly`, but now I'm trying to work with `spPublishLocal` for packaging before performing (automated) integration tests (e.g. on Travis) for http://spark-packages.org/package/TargetHolding/pyspark-cassandra. I've built pyspark-cassandra...
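
For reference, `spPublishLocal` publishes the package locally (analogous to `publishLocal`) using the plugin's sp* settings; a sketch of the build definition it expects (the Spark version, components, and version number below are placeholders, not taken from pyspark-cassandra):
```scala
// build.sbt -- sketch of the settings spPublishLocal relies on; values are placeholders.
spName := "TargetHolding/pyspark-cassandra"   // "<github-org>/<repo>" convention
sparkVersion := "1.6.0"
sparkComponents ++= Seq("sql", "streaming")
version := "0.1.0-SNAPSHOT"
```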

bug

I have given Spark Packages access to my GitHub account and pushed all the local commits to the remote repository. I am still getting the following error. Please help to resolve...

As per the documentation, I should be able to add Spark package dependencies via: `spDependencies += "databricks/spark-avro:3.2.0"` But I get the following error:
```
ML5886:job-cerebro-spark XXX$ sbt compile
[info] Loading global plugins from...
```
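
If `spDependencies` fails to resolve, one workaround (a sketch; it assumes spark-avro 3.2.0 is also published under the `com.databricks` organization on a reachable repository, which the issue itself does not confirm) is to declare the dependency directly:
```scala
// build.sbt -- sketch: depend on spark-avro as a plain library dependency
// instead of going through spDependencies.
libraryDependencies += "com.databricks" %% "spark-avro" % "3.2.0"
```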