spark-user mailing list archives

From Burak Yavuz <brk...@gmail.com>
Subject Re: Cannot Import Package (spark-csv)
Date Mon, 03 Aug 2015 16:01:46 GMT
In addition, you do not need to use --jars with --packages. --packages will
get the jar for you.
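For example, the shell can be launched with the --packages flag alone (coordinates below are the Scala 2.10 ones from later in this thread; adjust the artifact suffix to match your Scala build):

```shell
# --packages resolves the artifact (and its dependencies) from Maven Central
# and puts it on the driver and executor classpaths, so no --jars is needed.
bin/spark-shell --packages com.databricks:spark-csv_2.10:1.0.3 --master local
```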

Best,
Burak

On Mon, Aug 3, 2015 at 9:01 AM, Burak Yavuz <brkyvz@gmail.com> wrote:

> Hi, there was this issue for Scala 2.11.
> https://issues.apache.org/jira/browse/SPARK-7944
> It should be fixed on master branch. You may be hitting that.
>
> Best,
> Burak
>
> On Sun, Aug 2, 2015 at 9:06 PM, Ted Yu <yuzhihong@gmail.com> wrote:
>
>> I tried the following command on master branch:
>> bin/spark-shell --packages com.databricks:spark-csv_2.10:1.0.3 --jars
>> ../spark-csv_2.10-1.0.3.jar --master local
>>
>> I didn't reproduce the error with your command.
>>
>> FYI
>>
>> On Sun, Aug 2, 2015 at 8:57 PM, Bill Chambers <wchambers@ischool.berkeley.edu> wrote:
>>
>>> Sure, the commands are:
>>>
>>> scala> val df =
>>> sqlContext.read.format("com.databricks.spark.csv").option("header",
>>> "true").load("cars.csv")
>>>
>>> and I get the following error:
>>>
>>> java.lang.RuntimeException: Failed to load class for data source:
>>> com.databricks.spark.csv
>>>   at scala.sys.package$.error(package.scala:27)
>>>   at
>>> org.apache.spark.sql.sources.ResolvedDataSource$.lookupDataSource(ddl.scala:220)
>>>   at
>>> org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:233)
>>>   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
>>>   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:104)
>>>   ... 49 elided
>>>
>>> On Sun, Aug 2, 2015 at 8:56 PM, Ted Yu <yuzhihong@gmail.com> wrote:
>>>
>>>> The command you ran and the error you got were not visible.
>>>>
>>>> Mind sending them again?
>>>>
>>>> Cheers
>>>>
>>>> On Sun, Aug 2, 2015 at 8:33 PM, billchambers <wchambers@ischool.berkeley.edu> wrote:
>>>>
>>>>> I am trying to import the spark-csv package while using the Scala Spark
>>>>> shell (Spark 1.4.1, Scala 2.11).
>>>>>
>>>>> I am starting the shell with:
>>>>>
>>>>> bin/spark-shell --packages com.databricks:spark-csv_2.11:1.1.0 --jars
>>>>> ../sjars/spark-csv_2.11-1.1.0.jar --master local
>>>>>
>>>>>
>>>>> I then try and run
>>>>>
>>>>>
>>>>>
>>>>> and get the following error:
>>>>>
>>>>>
>>>>>
>>>>> What am I doing wrong?
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> View this message in context:
>>>>> http://apache-spark-user-list.1001560.n3.nabble.com/Cannot-Import-Package-spark-csv-tp24109.html
>>>>> Sent from the Apache Spark User List mailing list archive at
>>>>> Nabble.com.
>>>>>
>>>>> ---------------------------------------------------------------------
>>>>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>>>>> For additional commands, e-mail: user-help@spark.apache.org
>>>>>
>>>>>
>>>>
>>>
>>>
>>> --
>>> Bill Chambers
>>> http://billchambers.me/
>>> Email <wchambers@ischool.berkeley.edu> | LinkedIn
>>> <http://linkedin.com/in/wachambers> | Twitter
>>> <https://twitter.com/b_a_chambers> | Github
>>> <https://github.com/anabranch>
>>>
>>
>>
>
