You can run sbt/sbt assembly/assembly to assemble only the main package.
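For reference, a minimal sketch of that invocation, assuming you are at the top of a Spark 1.1.x source checkout (the sbt/sbt launcher script ships with the source tree; the exact output path depends on your Scala and Hadoop versions):

```shell
# From the root of the Spark source tree:
# build only the assembly subproject's jar, skipping the examples assembly
sbt/sbt assembly/assembly

# The main assembly jar is then produced under assembly/target/scala-*/,
# e.g. spark-assembly-1.1.0-hadoop2.3.0.jar
```

Scoping the task as assembly/assembly runs it only for the assembly subproject, instead of the bare "assembly" task, which would also build the examples assembly.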


On Nov 25, 2014, at 7:50 PM, lihu <> wrote:

    The Spark assembly is time-consuming to build. I only need spark-assembly-1.1.0-hadoop2.3.0.jar, not spark-examples-1.1.0-hadoop2.3.0.jar. How can I configure Spark so that the examples jar is not assembled? I know that the export SPARK_PREPEND_CLASSES=true method can cut down assembly time, but I do not develop locally. Any advice?

Best Wishes!